Washington, D.C.’s first installment of its minimum wage increase, from $8.25 to $9.50, which took effect in July 2014, came at a cost of approximately 2,400 service-sector jobs in the following twelve months. If Washington’s voters pass the $15 minimum wage that is on the ballot in November, a total of up to 13,300 jobs would likely be lost.
Two years ago, then-Mayor Vincent Gray signed into law a minimum wage increase in the District of Columbia, raising the city’s wage floor by 39 percent in three installments. The first took effect in July 2014, while the second (to $10.50) occurred in 2015. The third, to $11.50, is slated for this summer. Bureau of Labor Statistics data has already demonstrated the negative consequences of the first stage.
While higher minimum wages raise the incomes of some workers, they also lower employment by making it more expensive for businesses to hire people, particularly the young and unskilled. This is exactly what happened in the nation’s capital.
In the twelve-month period following the first D.C. minimum wage increase, employment in the District’s leisure and hospitality sector shrank by 0.1 percent, or approximately 100 jobs. The leisure and hospitality sector includes hotel and restaurant workers, who are particularly susceptible to minimum wage increases.
This 100-job loss, however, does not tell the whole story. Research on the minimum wage suggests that most of the harmful effects happen because of a reduction in job growth, not outright job losses. In other words, instead of laying off workers, businesses will open slower and close faster.
This phenomenon disguises many of the minimum wage’s effects, since many of the jobs that are “lost” are jobs that never were. Therefore, a complete examination of the D.C. minimum wage increase requires us to make comparisons.
In the year-long period after the first wage increase, employment in all other District industries grew by 1.9 percent. Jobs in higher-wage industries, where the minimum wage is less relevant, thus dramatically outpaced those in the leisure and hospitality sector, where growth was negative 0.1 percent.
That is not the only comparison worth considering. In the 12 months before D.C. raised its minimum wage, jobs in the leisure and hospitality sector grew at a healthy rate of 2.2 percent. After the city government raised the minimum wage, employment growth in this sector turned negative.
By comparing the change in rates of job growth before and after the wage increase in all other industries to the change in the affected leisure and hospitality sector, we can estimate how many leisure and hospitality jobs were eliminated by the minimum wage increase.
The higher minimum wage reduced employment growth in the District’s leisure and hospitality sector by 3.5 percentage points, according to this comparison. This represents a loss of 2,400 jobs. This is only for the first of three wage increases. Had the District’s minimum wage gone up to $15, a measure that is on the ballot in November, the loss would have been 13,300 jobs.
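For readers who want to see the moving parts, the comparison described above is a simple difference-in-differences calculation. The Python sketch below uses the growth rates quoted in this article; the pre-hike growth rate for all other industries, the sector's employment base, and the $15 extrapolation are not given in the excerpt, so they appear here as labeled assumptions chosen only to reproduce the article's rough figures.

```python
# Back-of-the-envelope difference-in-differences, mirroring the comparison
# described above. Values marked ASSUMED are not quoted in the article and
# are placeholders chosen only to illustrate the arithmetic.

leisure_before = 2.2    # leisure & hospitality growth (%), 12 months pre-hike
leisure_after = -0.1    # leisure & hospitality growth (%), 12 months post-hike
other_before = 0.7      # ASSUMED: all-other-industries growth (%) pre-hike
other_after = 1.9       # all-other-industries growth (%) post-hike

# Change in growth for each group, then the gap between those changes.
did_effect = (leisure_after - leisure_before) - (other_after - other_before)
print(f"Estimated effect: {did_effect:.1f} percentage points")  # about -3.5

# Applying that shortfall to the sector's employment base gives a job count.
sector_jobs = 68_000    # ASSUMED base, consistent with the article's ~2,400
jobs_lost = abs(did_effect) / 100 * sector_jobs
print(f"Implied jobs lost: ~{jobs_lost:,.0f}")

# The article's $15 figure is roughly what you get by scaling the effect with
# the size of the hike ($1.25 versus $6.75 above the old $8.25 floor).
scaled = jobs_lost * (15.00 - 8.25) / (9.50 - 8.25)
print(f"Linear extrapolation to $15: ~{scaled:,.0f} jobs")      # ~13,000
```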
While this method only produces an estimate — there are many reasons why the actual number could be higher or lower — it should make policymakers pause before they consider raising the minimum wage again.
These estimated job losses are 24 times what we see by just looking at the raw numbers, which underscores the importance of comparing what is to what might have been. The minimum wage is no exception when it comes to the unintended consequences of policy. Lawmakers — especially those who want to raise the minimum wage to $15 — should remember this.
Friday, January 29, 2016
DC Minimum Wage Goes Up, Jobs Vanish | Foundation for Economic Education
Tuesday, January 26, 2016
5 Gunfighting Myths Debunked By Massad Ayoob
“IF YOU CAN’T DO IT WITH SIX, YOU CAN’T DO IT AT ALL!”
Alas, that’s not always the case. Sometimes you can’t do it with six, but you can end the deadly threat with, oh, seven…or eight…or 19…or maybe 33.
“MY CAR IS NEVER FAR AWAY, SO I’LL JUST KEEP MY HANDGUN/LONG GUN/SPARE AMMUNITION THERE.”
That’s a convenient excuse for not carrying those things, but it’s unrealistic. In the case just mentioned, Sergeant Gramins began in his patrol car with a 12 gauge Remington 870 pump shotgun in an overhead rack and an AR-15 patrol rifle in the trunk, and it happened so fast that he was never able to deploy anything but the pistol on his hip and the magazines in his belt pouches.
The history of gunfighting teaches that, when the fast and furious shooting starts, what we have on our person is all that we’re likely to have to fight with.
“YOU MUST PRACTICE ONLY POINT SHOOTING, BECAUSE YOU’LL NEVER BE ABLE TO SEE YOUR SIGHTS IN A GUNFIGHT!”
However sincerely some seem to believe that, it’s simply untrue. I’ve lost count of how many gunfights I’ve studied where the survivor said something like, “I was pointing the gun and firing as best I could and nothing was happening. Then I remembered to aim with my sights, and the other guy went down and it was over.” If you study the history of Wyatt Earp, you’ll find that he may well have killed 10 men with gunfire. He told his biographer Stuart Lake that—with one exception—he was always careful to align his “foresight” with his “back sight” and to squeeze, not jerk, the trigger. Wyatt Earp died at a ripe old age, never having sustained a gunshot wound himself.
“YOU MUST PRACTICE ONLY AIMED FIRE, BECAUSE YOU’LL NEVER BE ABLE TO HIT ANYTHING POINTING!”
This is also over-simplistic and untrue. I just quoted two great gunfighters, Wyatt Earp and Bill Allard, who won many shootouts carefully aiming their guns. But notice that each had “one exception.”
Wyatt Earp confronted Frank Stilwell, believed to be the murderer of his brother Morgan Earp, at the train station in Tucson, Arizona. Stilwell grabbed the barrel of Earp’s shotgun in an apparent attempt to disarm him. Earp levered the butt of the shotgun down and the muzzles up, jammed the twin barrels into Stilwell’s midsection and cut loose. The battle was over.
Saturday, January 23, 2016
6 Big Cities See Hiring Fade After Minimum Wage Hikes | Stock News & Stock Market Analysis - IBD
U.S. cities that implemented big minimum-wage hikes to $10 an hour or more in 2015 have seen a strikingly similar aftermath: Job gains have fallen to multiyear lows at restaurants, hotels and other leisure and hospitality venues.
The data aren’t, for the most part, stark and reliable enough to amount to smoking-gun proof.
But Chicago, Oakland, San Francisco, Seattle, Los Angeles and Washington, D.C. — all on the leading edge of the push for big minimum wage hikes — all show worrisome job trends.
D.C. Jobs Decline Linked To Wages
The strongest evidence comes from the nation’s capital, where leisure and hospitality employment, which rose at least 3% annually over the prior four years, fell an average of 1% from a year ago in the three months through November. So instead of adding 2,000 or more jobs per year, restaurants, hotels and the rest of the leisure and hospitality sector have lost about 700 jobs.
The timing coincides with the $1 minimum-wage hike to $10.50 an hour last July. That jump followed a boost from $8.25 to $9.50 an hour that took effect in mid-2014. Another jump to $11.50 is set for this July.
The D.C. data are key because they reveal outright job losses confined to the city limits. Researchers studying the latest round of citywide minimum-wage hikes generally have had to rely on data for a big chunk of the broader metropolitan area, making the analysis more speculative. More reliable data through 2015 will be available in June via the Quarterly Census of Employment and Wages.
Still, the available data provide plenty of reason to be wary of the big minimum wage hikes in the pipeline, as well as the push for a national minimum wage of $12 an hour by Democratic front-runner Hillary Clinton and the fight for a $15 wage by those further to the left.
Chicago Hospitality Hiring Stalls
Job gains in the Chicago-area leisure and hospitality sector slumped to a five-year low after the Windy City’s $1.75-an-hour minimum-wage hike to $10 an hour took effect in July, Labor Department data show. Annual employment gains at restaurants, hotels and other leisure-sector venues averaged just 1.1% from September through November, about half the pace seen in 2014 and the weakest since the summer of 2010.
Chicago’s minimum wage will get another bump to $10.50 an hour on July 1, another stop on the way to $13 by 2019.
The Chicago data cover the Chicago-Naperville-Arlington Heights area, of which Chicago represents only about 40% of the population.
Bay Area Job Blues
Bay Area job growth in the leisure and hospitality sector slumped to a five-year low after San Francisco and Oakland adopted the country’s highest citywide minimum wage, $12.25 an hour, last spring.
After rising close to 5% a year, leisure and hospitality industry hiring slowed to just 2.2% from a year ago in November in the Bay Area. Meanwhile, such employment rose 4.9% in the rest of California, where the minimum wage was generally $3.25 lower — before the $1 statewide hike to $10 on Jan. 1.
Oakland’s minimum wage got an inflation-related bump to $12.55 with the start of 2016, with San Francisco’s jumping to $13 in July.
The Bay Area data cover the entire San Francisco-Oakland-Hayward metro area, of which the two cities account for about one-third of the population.
L.A. Hotel Jobs Hit By $15.37 Wage
Los Angeles-area hotels saw job growth fizzle after the L.A. City Council mandated that hotels with at least 300 rooms start paying workers a minimum of $15.37 an hour, the highest minimum in the nation, starting last July. The same wage will apply to workers at 150-room locations this coming July.
After growing by 3% or more the prior three years, Los Angeles County accommodation industry employment fell by an average of 3%, or 1,300 jobs, vs. a year earlier in the first 10 months of 2015. Meanwhile, hotel and motel jobs in the rest of California saw steady growth. Preliminary November data show employment bounced back just above year-earlier levels.
Seattle Restaurants Curb Hiring
Seattle-area restaurant job gains have fallen below 2% from a year ago for the first time since 2010 following a minimum wage hike in April from $9.47 to $11 an hour for companies with more than 500 employees.
For smaller employers, the minimum got a smaller bump to $10. That rose to $10.50 at the start of 2016, or $12 for employees who don’t get employer health insurance.
Job growth at Seattle-area restaurants is now less than half the pace in the prior three years. Meanwhile, as highlighted by American Enterprise Institute economist Mark Perry, restaurant job growth in the rest of Washington state hasn’t seen a similar slowdown.
However, these seasonally unadjusted data cover the entire Seattle-Bellevue-Everett metro, of which Seattle is one-fourth of the population.
ObamaCare Impact
Another challenge in looking for the minimum-wage hikes’ employment impact is that employers might alter total hours worked, rather than payrolls. That response might be likelier in the ObamaCare era, when employers with at least 50 full-time-equivalent workers can face large fines if they fail to offer health benefits to employees who clock at least 30 hours a week.
Employers seem to have figured out ways of avoiding the ObamaCare mandate, but Seattle’s linking of the minimum wage level to health benefits appears designed to avoid such gaming of the system.
Wednesday, January 20, 2016
The Real Victims of Victimhood - The New York Times
BACK in 1993, the misanthropic art critic Robert Hughes published a grumpy, entertaining book called “Culture of Complaint,” in which he predicted that America was doomed to become increasingly an “infantilized culture” of victimhood. It was a rant against what he saw as a grievance industry appearing all across the political spectrum.
I enjoyed the book, but, as a lifelong optimist about America, I was unpersuaded by Mr. Hughes’s argument. I dismissed it as just another apocalyptic prediction about our culture.
Unfortunately, the intervening two decades have made Mr. Hughes look prophetic and me look naïve.
“Victimhood culture” has now been identified as a widening phenomenon by mainstream sociologists. And it is impossible to miss the obvious examples all around us. We can laugh off some of them, for example, the argument that the design of a Starbucks cup is evidence of a secularist war on Christmas. Others, however, are more ominous.
On campuses, activists interpret ordinary interactions as “microaggressions” and set up “safe spaces” to protect students from certain forms of speech. And presidential candidates on both the left and the right routinely motivate supporters by declaring that they are under attack by immigrants or wealthy people.
So who cares if we are becoming a culture of victimhood? We all should. To begin with, victimhood makes it more and more difficult for us to resolve political and social conflicts. The culture feeds a mentality that crowds out a necessary give and take — the very concept of good-faith disagreement — turning every policy difference into a pitched battle between good (us) and evil (them).
Consider a 2014 study in the Proceedings of the National Academy of Sciences, which examined why opposing groups, including Democrats and Republicans, found compromise so difficult. The researchers concluded that there was a widespread political “motive attribution asymmetry,” in which both sides attributed their own group’s aggressive behavior to love, but the opposite side’s to hatred. Today, millions of Americans believe that their side is basically benevolent while the other side is evil and out to get them.
Second, victimhood culture makes for worse citizens — people who are less helpful, more entitled, and more selfish. In 2010, four social psychologists from Stanford University published an article titled “Victim Entitlement to Behave Selfishly” in the Journal of Personality and Social Psychology. The researchers randomly assigned 104 human subjects to two groups.
Members of one group were prompted to write a short essay about a time when they felt bored; the other to write about “a time when your life seemed unfair. Perhaps you felt wronged or slighted by someone.” After writing the essay, the participants were interviewed and asked if they wanted to help the scholars in a simple, easy task.
The results were stark. Those who wrote the essays about being wronged were 26 percent less likely to help the researchers, and were rated by the researchers as feeling 13 percent more entitled.
In a separate experiment, the researchers found that members of the unfairness group were 11 percent more likely to express selfish attitudes. In a comical and telling aside, the researchers noted that the victims were more likely than the nonvictims to leave trash behind on the desks and to steal the experimenters’ pens.
Does this mean that we should reject all claims that people are victims? Of course not. Some people are indeed victims in America — of crime, discrimination or deprivation. They deserve our empathy and require justice.
The problem is that the line is fuzzy between fighting for victimized people and promoting a victimhood culture. Where does the former stop and the latter start? I offer two signposts for your consideration.
First, look at the role of free speech in the debate. Victims and their advocates always rely on free speech and open dialogue to articulate unpopular truths. They rely on free speech to assert their right to speak. Victimhood culture, by contrast, generally seeks to restrict expression in order to protect the sensibilities of its advocates. Victimhood claims the right to say who is and is not allowed to speak.
What about speech that endangers others? Fair-minded people can discriminate between expression that puts people at risk and that which merely rubs some the wrong way. Speaking up for the powerless is often “offensive” to conventional ears.
Second, look at a movement’s leadership. The fight for victims is led by aspirational leaders who challenge us to cultivate higher values. They insist that everyone is capable of — and has a right to — earned success. They articulate visions of human dignity. But the organizations and people who ascend in a victimhood culture are very different. Some set themselves up as saviors; others focus on a common enemy. In all cases, they treat people less as individuals and more as aggrieved masses.
Robert Hughes turned out to be pretty accurate in his vision, I’m afraid. It is still in our hands to prove him wrong, however, and cultivate a nation of strong individuals motivated by hope and opportunity, not one dominated by victimhood. But we have a long way to go. Until then, I suggest keeping a close eye on your pen.
Sunday, January 17, 2016
On Sumner on Card & Krueger
I’m surprised that the famous 1994 Card-Krueger study of the minimum wage caused Scott Sumner to shift his thinking about the consequences of that policy. (Scott says in the comments that his thinking about the minimum wage has not shifted so much that he favors it.) I reveal below just why I’m surprised, but first I must saddle you with a longish preface.
The principal practical economic argument against the minimum wage is that it puts some low-skilled workers out of jobs. The core of economic theory informs us of this result, in the same way that the core of physics tells us that a dime dropped into an Olympic-sized swimming pool causes the water level of that pool to be higher than it would be had the dime not been dropped into it – and no amount of failure to detect empirically the resulting rise in the water level will cause physicists to doubt that a dime dropped into a body of water causes the water level to rise. (Howls would greet any physicist who said “Well, physiometricians keep studying the dropping of dimes into big pools – real-world pools that have swimmers and divers constantly going into and out of them – and these physiometricians very often, although admittedly not always, find empirically that none of these dropped dimes has any effect on the water level of the pools. So, being a data-driven physicist, I conclude that a dime dropped into a real-world big swimming pool does not displace water in those pools. I am, I repeat, driven by the facts and not dogmatically by theory!”)
Unlike with physics (because the economy is a vastly more complex phenomenon than is the physical universe), what the core of economic theory does not reveal is the magnitude of the job losses. Will the implementation today in Someplace, USA, of a minimum wage of $X.YZ destroy 1% of the current number of jobs in Someplace? Ten percent of these jobs? Eighty percent? One-twentieth of one percent? That’s an empirical question that no amount of a priori theorizing can answer. Empirical investigation is necessary – although keep in mind three facts:
(1) Even the best such investigation can never reveal the correct answer with complete certainty, because no empirical investigation of reality can adequately control for the many factors in addition to the implementation of the minimum wage that affect the labor market in Someplace; not only does the demand for, and supply of, labor change for reasons having nothing to do with Someplace’s minimum wage, but, also, employers and workers have many practically unobservable margins in addition to the job-vs.-no-job margin on which to adjust to a minimum wage (such as reducing the amount of leisure time de facto allowed to workers while workers are formally on the job);
(2) the correct answer is time-dependent; the number of job losses caused today in Someplace by the implementation today in Someplace of a minimum wage will almost certainly be fewer than are the number of job losses caused by this minimum wage in Someplace over the course of the two months following its implementation – a number which itself will be different (likely smaller) than is the number of job losses caused in Someplace by this minimum wage over the course of the year following its implementation; the full effects of a minimum wage take time to play out; and
(3) whatever is the correct answer for the job-losses caused in Someplace, USA, by the implementation of a minimum wage (over whatever particular span of time is considered), that answer is unique to Someplace, USA, and to that particular time in Someplace, USA; the correct answer for the job-losses effect of the minimum wage imposed today in Someplace, USA, – even if that answer were given to us by all the angels in heaven and confirmed by god himself – tells us very little about the magnitude of the job losses caused by minimum-wage implementations in Elsewhere, USA, Somewhere, Canada, and Ourtown, UK; indeed, this correct answer doesn’t even tell us very much about what the job losses would be in Someplace, USA, had the minimum wage been implemented there at a different time (say, last year, or three years from today).
So, now to Scott’s reaction to Card-Krueger – namely, to Scott’s being persuaded by Card and Krueger that the magnitude of job losses caused by minimum wages is less than he’d previously believed: My main problem with Card-Krueger is, as it has always been, that that paper is very short-run. As explained above in point number (2), the impact of the minimum wage takes time, and nothing in economic theory says that that time will be ‘short’ when measured on a calendar. This fact, combined with the reality that minimum wages have been around in the U.S. since before WWII (meaning that by the early 1990s employment and production practices were already adjusted to their existence, and employers by then plausibly expected continuing, occasional hikes in minimum wage), greatly diminishes the information content of a measured change in employment following a minimum-wage hike, even when that change occurs in one particular jurisdiction (New Jersey) that adjoins another (Pennsylvania).
I’ve always believed that what Gertrude Stein said of Oakland, CA, applies to the Card-Krueger study: “There is no there there.”
……
* By the way, this truth applies even if empirical researchers discover that implementing a minimum wage caused the employment of low-skilled workers to increase because of monopsony power in Someplace. Such a finding, even if it is flawless, does not tell us that Elsewhere, USA, also has such monopsony power that a minimum wage in Elsewhere will result in no job losses there. Nor does this finding tell us how long the minimum wage in Someplace can remain in place before it starts to cause jobs to be fewer than otherwise. Most importantly, this finding does not tell us even that the minimum wage is a sound policy in Someplace: it might well have been the case that, had no minimum wage been imposed in Someplace, market forces would have responded to the monopsony power and created better jobs than the ones that were created in response to the minimum wage.
Saturday, January 16, 2016
The Evidence Is Piling Up That Higher Minimum Wages Kill Jobs - WSJ
Economists have written scores of papers on the topic dating back 100 years, and the vast majority of these studies point to job losses for the least-skilled. They are based on fundamental economic reasoning—that when you raise the price of something, in this case labor, less of it will be demanded, or in this case hired.
Among the many studies supporting this conclusion is one completed earlier this year by Texas A&M’s Jonathan Meer and MIT’s Jeremy West, which reaffirmed that “the minimum wage reduces job growth over a period of several years” and that “industries that tend to have a higher concentration of low-wage jobs show more deleterious effects on job growth from higher minimum wages.”
The broader research confirms this. An extensive survey of decades of minimum-wage research, published by William Wascher of the Federal Reserve Board and me in a 2008 book titled “Minimum Wages,” generally found a 1% or 2% reduction in teenage or very low-skill employment for each 10% minimum-wage increase.
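To put that rule of thumb in concrete terms, here is a purely illustrative back-of-the-envelope calculation, not anything from the book itself. It applies the cited 1% to 2% reduction per 10% wage increase to the proposed jump from the current $7.25 federal minimum to $15, and it assumes, questionably for hikes this large, that the relationship extrapolates linearly.

```python
# Illustrative application of the cited rule of thumb: a 1% to 2% reduction in
# teen/low-skill employment for each 10% increase in the minimum wage.
# Back-of-the-envelope only; linear extrapolation to very large hikes is an
# assumption, not a claim made by the studies surveyed.

def implied_employment_change(old_wage: float, new_wage: float,
                              reduction_per_10pct: float) -> float:
    """Percent change in low-skill employment implied by the rule of thumb."""
    wage_increase_pct = (new_wage - old_wage) / old_wage * 100
    return -reduction_per_10pct * (wage_increase_pct / 10)

for reduction in (1.0, 2.0):  # the 1% to 2% range cited above
    change = implied_employment_change(7.25, 15.00, reduction)
    print(f"$7.25 -> $15.00 at {reduction}% per 10% hike: {change:.1f}%")
```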
That has long been the view of most economists, although there are some outliers. In 1994 two Princeton economists, David Card (now at Berkeley) and Alan Krueger, published a study of changes in employment in fast-food restaurants in New Jersey and Pennsylvania after the minimum wage went up in New Jersey. The study not only failed to find employment losses in New Jersey, it reported sharp employment gains. The study has been widely cited by proponents of a higher minimum wage, even though further scrutiny showed that it was flawed. My work with William Wascher showed that the survey data collected were so inaccurate that they badly skewed the study’s findings.
More recently, a 2010 study by Arindrajit Dube of the University of Massachusetts-Amherst, T. William Lester of the University of North Carolina at Chapel Hill, and Michael Reich of the University of California, Berkeley, found “no detectable employment losses from the kind of minimum wage increases we have seen in the United States.”
This study and others by the same research team, all of whom support a higher minimum wage, strongly contest the conclusion that minimum wages reduce low-skill employment. The problem, they say, is that state policy makers raise minimum wages in periods that happen to coincide with other negative shocks to low-skill labor markets like, for instance, an economic downturn.
They argue that the only way to accurately discover whether minimum wages cause job losses is by limiting control groups to bordering states and counties because they’re most likely to have experienced similar economic conditions. This approach led to estimates of job losses from minimum wages that are effectively zero.
But as Ian Salas of Johns Hopkins, William Wascher and I pointed out in a 2014 paper, there are serious problems with the research designs and control groups of the Dube et al. study. When we let the data determine the appropriate control states, rather than just assuming—as Dube et al. do—that the bordering states are the best controls, the estimates again point to minimum wages lowering teen employment. A new study by David Powell of Rand, taking the same approach but with more elegant solutions to some of the statistical challenges, yields similar results.
Another recent study by Shanshan Liu and Thomas Hyclak of Lehigh University, and Krishna Regmi of Georgia College & State University most directly mimics the Dube et al. approach. But crucially it only uses as control areas parts of states that are classified by the Bureau of Economic Analysis as subject to the same economic shocks as the areas where minimum wages have increased. The resulting estimates point to job loss for the least-skilled workers studied, as do a number of other recent studies that address the Dube et al. criticisms.
Some proponents defend a higher wage on other grounds, such as fairness, or compensating for the low bargaining power of low-skill workers. But let’s not pretend that a higher minimum wage doesn’t come with costs, and let’s not ignore that some of the low-skill workers the policy is intended to help will bear some of these costs.
Thursday, January 14, 2016
Minimum Wage Dishonesty - Walter E. Williams
Michael Hiltzik, a columnist and Los Angeles Times reporter, wrote an article titled "Does a minimum wage raise hurt workers? Economists say: We don't know." His uncertain conclusion came from a poll, conducted by the Initiative on Global Markets at the University of Chicago's Booth School of Business, of 42 nationally ranked economists on whether raising the federal minimum wage to $15 over the next five years would reduce employment opportunities for low-wage workers.
The Senate Budget Committee's blog says, "Top Economists Are Backing Sen. Bernie Sanders on Establishing a $15 an Hour Minimum Wage." It lists the names of 210 economists who call for increasing the federal minimum wage. The petition starts off, "We, the undersigned professional economists, favor an increase in the federal minimum wage to $15 an hour as of 2020." The petition ends with this: "In short, raising the federal minimum to $15 an hour by 2020 will be an effective means of improving living standards for low-wage workers and their families and will help stabilize the economy. The costs to other groups in society will be modest and readily absorbed."
The people who are harmed by an increase in the minimum wage are low-skilled workers. Try this question to economists who argue against the unemployment effect of raising the minimum wage: Is it likely that an employer would find it in his interests to pay a worker $15 an hour when that worker has skills that enable him to produce only $5 worth of value an hour to the employer's output? Unlike my fellow economists who might argue to the contrary, I would say that most employers would view hiring such a worker as a losing economic proposition, but they might hire him at $5 an hour. Thus, one effect of the minimum wage law is that of discrimination against the employment of low-skilled workers.
In our society, the least skilled people are youths, who lack the skills, maturity and experience of adults. Black youths not only share these handicaps but have attended grossly inferior schools and live in unstable household environments. That means higher minimum wages will have the greatest unemployment effect on youths, particularly black youths.
A minimum wage not only discriminates against low-skilled workers but also is one of the most effective tools in the arsenal of racists. Our nation's first minimum wage came in the form of the Davis-Bacon Act of 1931, which sets minimum wages on federally financed or assisted construction projects. During the legislative debates, racist intents were obvious. Rep. John Cochran, D-Mo., said he had "received numerous complaints in recent months about Southern contractors employing low-paid colored mechanics getting work and bringing the employees from the South." Rep. Miles Allgood, D-Ala., complained: "That contractor has cheap colored labor that he transports, and he puts them in cabins, and it is labor of that sort that is in competition with white labor throughout the country." Rep. William Upshaw, D-Ga., complained of the "superabundance or large aggregation of Negro labor."
During South Africa's apartheid era, the secretary of its avowedly racist Building Workers' Union, Gert Beetge, said, "There is no job reservation left in the building industry, and in the circumstances, I support the rate for the job (minimum wage) as the second-best way of protecting our white artisans." The South African Economic and Wage Commission of 1925 reported that "while definite exclusion of the Natives from the more remunerative fields of employment by law has not been urged upon us, the same result would follow a certain use of the powers of the Wage Board under the Wage Act of 1925, or of other wage-fixing legislation. The method would be to fix a minimum rate for an occupation or craft so high that no Native would be likely to be employed."
It is incompetence or dishonesty for my fellow economists to deny these two effects of minimum wages: discrimination against employment of low-skilled labor and the lowering of the cost of racial discrimination.
Monday, January 11, 2016
Meet the Cry-Bully: a hideous hybrid of victim and victor - Spectator Blogs
In the 1970s, there was a big difference between bullies and cry-babies. Your mum would have preferred you to hang around with the latter, but sometimes the former had a twisted charisma so strong that you found yourself joining in the taunts of ‘Onion Head!’ at some poor unfortunate creature sporting a cranium of a somewhat allium cast. After a bit, of course, if you had anything about you, you realized what a knob you were being and went off to sample the more solitary, civilized pleasures of shoplifting and reading Oscar Wilde with the bedroom curtains closed. But you could be certain, as you festered in your pilfered Chelsea Girl vest, that bullies were bullies and cry-babies were cry-babies and never the twain would meet.
Fast forward some four decades and things are not so simple. This is the age of the Cry-Bully, a hideous hybrid of victim and victor, weeper and walloper. They are everywhere, these duplicitous Pushmi-Pullyus of the personal and the political, from Celebrity Big Brother to the frontline of Islamism. Jeremy Clarkson is a prime Cry-Bully, punching a producer and then whining in The Sunday Times about ‘losing my baby’ (the baby being Top Gear). Perez Hilton, recently of the CBB house, is a good example too, screaming abuse at his wretched room-mates until they snapped and hit back, at which point he would dissolve in floods of tears and flee to the Diary Room to claim that he felt ‘unsafe’. Stephen Fry is one, forever banging on about his own mental fragility yet mocking Stephen Hawking’s voice at a recent awards ceremony.
Esther Rantzen — an anti-bullying campaigner — strikes me as another. An otherwise anodyne interviewer recently felt moved to ask her if she was a bully, due to her reputation as an over-bossy boss during her That’s Life heyday and the rather vile reports of her yelling out ‘My husband’s ex-wife’ during a trivia quiz at a TV bash – the question being ‘What burnt in Richmond?’, the locality in which the woman whose husband she took had recently been cremated. Rantzen has re-created herself as a pathetic widow, complaining about having no one to go on holiday with, so desperately lonely for human companionship that she once rang her daughter to say that the Almighty would want them to live together. (Not as much as He’d want people not to gloat over the deaths of people whose lives they had ruined, surely.)
Even social media – the source of so much fun and friendship for most of us – becomes a double-edged sword in the hands of the Cry-Bully. They will threaten women with rape on Twitter then boo-hoo about the invasion of their privacy when called to account. It’s a sort of Munchausen’s syndrome – causing one’s own misery then complaining about it – seen most sadly in the case of Hannah Smith, the 14-year-old girl who took her own life in 2013 after allegedly being cyber-bullied on the teen website Ask.fm. It turned out that some 98 per cent of the abusive messages came from poor Hannah herself, with only four posts being contributed by actual trolls.
Cry-Bullies do end up isolated, as their determination to be victim and victor eventually wears out the patience of the most forbearing friend. But they can also be found hanging around in gangs; then, Cry-Bullies really come into their own and are not just irritating but dangerous. Islamism is the ultimate Cry-Bully cause; on one hand stamping around murdering anyone who doesn’t agree with you, on the other hand yelling ‘ISLAMOPHOBIA’ in lieu of having a real adult debate about the merits of your case. Their ‘helpline’ is even called Tell Mama – bless. The British-born Islamist recently sentenced to twelve years had no problem posing with severed heads (‘Heads, kaffirs, disgusting’) and asking friends back home to send him condoms which he planned to use raping women captured as ‘war booty’, but then claimed to be having nightmares and suffering from depression in order to escape jail.
The transsexual and pimp lobbies bring classic Cry-Bully tactics into play whenever they come across someone who doesn’t – shock, horror! – think the same as them, as unashamed feminists from the activist Julie Bindel to the comedian Kate Smurthwaite have discovered. In these cases, the claim is that ‘safe spaces’ might be violated by the presence of someone who thinks differently to them; but born women, mysteriously, are expected to surrender the ultimate Safe Space – the female toilets – to pre-op chicks with dicks if they are not to be accused of violent bigotry.
I don’t like much about monarchy, but the old saw they are said to live by – ‘Never complain, never explain’ (if only Prince Charles could do this!) – is a good one. The Cry-Bully always explains to the point of demanding that one agrees with them and always complains to the point of insisting that one is persecuting them. They really are the very worst sort of modern moaner.
Saturday, January 09, 2016
A Fascinating Survey on Climate Change of Scientists (and Engineers) in Alberta, Canada | The Lukewarmer's Way
From the abstract: “This paper examines the framings and identity work associated with professionals’ discursive construction of climate change science, their legitimation of themselves as experts on ‘the truth’, and their attitudes towards regulatory measures. Drawing from survey responses of 1077 professional engineers and geoscientists, we reconstruct their framings of the issue and knowledge claims to position themselves within their organizational and their professional institutions.”
From the paper: “A survey of scientists in Alberta Canada produced 5 broad groupings of opinion.
The largest group of APEGA respondents (36%) draws on a frame that the researchers label ‘comply with Kyoto’. In their diagnostic framing, they express the strong belief that climate change is happening, that it is not a normal cycle of nature, and humans are the main or central cause. They are the only group to see the scientific debate as mostly settled and the IPCC modeling to be accurate, e.g., ‘I believe that the consensus that climate change is occurring is settled.’
The second largest group (24%) express a ‘nature is overwhelming’ frame. In their diagnostic framing, they believe that changes to the climate are natural, normal cycles of the Earth. Their focus is on the past: ‘If you think about it, global warming is what brought us out of the Ice Age.’ Humans are too insignificant to have an impact on nature.
Ten percent of respondents draw on an ‘economic responsibility’ frame. They diagnose climate change as being natural or human caused. More than any other group, they underscore that the ‘real’ cause of climate change is unknown as nature is forever changing and uncontrollable. Similar to the ‘nature is overwhelming’ adherents, they disagree that climate change poses any significant public risk and see no impact on their personal life. They are also less likely to believe that the scientific debate is settled and that the IPCC modeling is accurate.
‘Fatalists’, a surprisingly large group (17%), diagnose climate change as both human- and naturally caused. ‘Fatalists’ consider climate change to be a smaller public risk with little impact on their personal life. They are sceptical that the scientific debate is settled regarding the IPCC modeling: ‘The number of variables and their interrelationships are almost unlimited – if anyone thinks they have all the answers, they have failed to ask all of the questions.’
The last group (5%) expresses a frame the researchers call ‘regulation activists’. This frame has the smallest number of adherents, expresses the most paradoxical framing, and yet is more agentic than ‘comply with Kyoto’. Advocates of this frame diagnose climate change as being both human- and naturally caused, posing a moderate public risk, with only slight impact on their personal life. They are also sceptical with regard to the scientific debate being settled and are the most indecisive about whether IPCC modeling is accurate: ‘the largest challenge is to find out what the real truth is… I don’t know what the impact really is. I suspect it is not good.’
They believe that the Kyoto Protocol is doomed to failure (‘can’t do it, even though we should’), yet they motivate others most of all to create regulation: ‘Canada should implement aggressive policies to reduce GHG emissions in the spirit of the Kyoto Accord.’ They also recommend that we define and enact sustainability/stewardship, reduce GHGs, and create incentives.”
Common SJW Phrases Translated into English
1. “Let’s have a Conversation.” The direct translation for this is “your opinion hurts my feelings and you need to change it.” But this fails to capture the essence of the phrase. When an SJW says this, what they really mean is that you have departed from the traditional narrative, and you are being warned that your non-conforming opinion needs to be changed immediately. Failure to do so will result in denunciation and accusations of racism, sexism, etc… Having a conversation means agreeing with the SJW on all particulars.
2. “Educate yourself!” Directly translated, this is “I can’t understand why you don’t agree with the accepted narrative.” But again, there are nuances here. This can be considered a final warning before denunciation and attempts to attack your character. The SJW is warning you that there will be consequences if you don’t agree with the politically correct narrative. Perhaps they will try to get you fired, or dox you, or some other form of unpleasantness. The subtle translation is “obey the dictates of Social Justice or else!”
3. “You are a Racist!” You’re white. And probably male, cis-gendered, and straight. Remember, all white men are racists, and any accusation of racism is prima facie evidence of guilt. Progressive white men can be exempted by proclamation by the SocJus community. But this exemption can be revoked at any time, in which case you revert to being a racist.
4. “You’re a Misogynist!” You’re white. And probably male, cis-gendered, and straight. Remember, all white men are sexists, and any accusation of sexism is prima facie evidence of guilt. Progressive white men can be exempted by proclamation by the SocJus community. But this exemption can be revoked at any time, in which case you revert to being a sexist.
5. “You’re an Islamophobe!” You have common sense, which is, of course, a violation of accepted SocJus norms. The only sense you are allowed to have is fed to you by the media. When the media tells you Islam is good and Christianity is evil, you must accept this with no further discussion or demands for evidence.
6. “I can’t be a racist because racism equals privilege plus discrimination.” This means the SJW is a racist and hates white people. Bahar Mustafa, a woman of Turkish ethnicity famously proclaimed that her Turkish heritage meant that she could never be a racist. My Armenian ancestors would disagree, of course. But college feminists are far more oppressed than people thrown into rail cars, shot, beaten to death, and/or crucified.
7. “That’s Triggering!” Directly translated, this statement means roughly: “waaaaaaaah waaah waaaaaaaaaah!” A more nuanced translation would be “this made me cry, and I will throw a temper tantrum unless you make it go away.”
8. “We need a Safe Space!” The SJW wishes to re-institute segregation along racial and ethnic lines. Members of the KKK are currently kicking themselves for not thinking of this idea first. “Damnit,” says the Grand Wizard Dragon of Podunk, “if only we called Jim Crow ‘Safe Spaces for People of Color’ we could have pulled it off…”
....
Wednesday, January 06, 2016
Finding the Real Conservative | FreedomWorks
As yet another primary election season heats up, how do we cut through the rhetoric and evaluate candidates? One sure way is to have a measuring stick based on more than personal opinion.
One such standard is the word “conservative.” I hear candidates and elected officials use it all the time. What does it really mean? In my opinion, the late Russell Kirk spelled it out better than just about anyone. This all-but-forgotten man laid out ten principles of conservative thought many seem to have forgotten. See how many you recognize.
First, conservatives believe in an enduring moral order. This concept is much broader than religious dogma. Kirk said that human nature was a constant, and moral truths were permanent. That’s not surprising considering that 94% of Americans believe in God, according to pollster George Barna. Surprisingly, Kirk said that a society in which men and women are governed by an enduring belief in moral order—by a strong sense of right and wrong—and by personal convictions about justice and honor—would be a good society, regardless of the political machinery. Politics do not determine the trajectory of a nation—the people do. Nancy Pearcey put it well when she said that politics is downstream from culture.
Second, tradition in a culture is important and should not be tossed out on a whim. Kirk actually calls this “continuity.” What he meant is that order and justice and freedom are the result of centuries of trials and reflections and sacrifice. Change should be gradual and calculated—never undoing traditions as a knee-jerk reaction. Oftentimes, an election cycle brings cries for “change,” but true conservatives should always be wary of change. Wary doesn’t mean completely closed to some change, though. It just means “slow change.” If you look at how our bi-cameral system of government loaded with checks and balances was designed, clearly our founders thought “slow” was good. For this reason, Presidential Executive Orders should be used sparingly.
Third, conservatives adhere to Edmund Burke’s mantra that the individual is foolish, but the species is wise. Using that advice, real conservatives stand on the shoulders of those who have gone before them and look to enduring wisdom. That means not only the Ronald Reagans, but other great thinkers and statesmen beyond our lifetime like T.S. Eliot, Adam Smith, Sir Walter Scott, and of course, Burke himself. I’m not sure you will see any of these authors on display as you walk in your local library.
Fourth, true conservatives look at the long-term consequences of laws and policies. I fear this principle frequently gets tossed in favor of reelection. Kirk said that rushing into legislation or policies without weighing the long-term consequences will actually create new abuses in the future. We should slow down and look as far as we can into the future.
Fifth, conservatives know good and well that you can’t totally level the economic playing field, and in fact, we should not aspire to it. Robbing one taxpayer to pay another truly violates conservative thought because it is not sustainable. In our society, we have tried to make charity the government’s job, and true conservatives have to take issue with that practice. Churches and non-profits should take seriously their role in culture.
Sixth, mankind is messed up. Kirk didn’t exactly quote the Bible, but conservatives believe that because man is flawed from birth, no perfect social order can ever be created. All that we can reasonably expect, Kirk said, is a tolerably ordered, just and free society, in which evil and suffering continue to lurk. Can morality be legislated? Kirk would say that all laws are an effort to legislate morality, and that is okay.
Seventh, conservatives know that great societies are built upon the foundation of private property. We see it in the Ten Commandments. Policies that seek to redistribute wealth and property should be anathema to the real conservative. That is one of my issues with COP21, the Paris Agreement on the reduction of climate change, and the EPA’s Clean Power Plan. Both are a form of wealth redistribution. While getting rich should not be the conservative’s chief aim, the institution of private property has been a powerful instrument for teaching responsibility, shaping integrity, creating prosperity, and providing the opportunities for people to think and act. It is the opportunity to go from rags to riches. This opportunity has given us the Truett Cathys (of Chick-fil-A fame) and others who worked their way up from nothing.
Eighth, conservatives favor smaller government at a federal level, and champion small governments such as county commissions and city councils. Decisions most affecting the lives of citizens should be made locally, and as Kirk would say, voluntarily. That is how I got started. I ran a city council race for a friend. A strong, centralized, and distant federal government tends to be more hostile to human freedom and dignity.
Ninth, the conservative believes in flattening the power—or limiting government. Real conservatives know the danger of power being vested in just a few, even if it is called benevolent. Constitutional restrictions are necessary, political checks and balances a must, and enforcement of the law a must—all the while balancing the claims of authority with the claims of liberty.
Finally, conservatives should be slow to change. Any thinking conservative would be resistant to hastily throwing out the old way of doing something in favor of something completely new—even in the name of “positive change.” Progress, or change, is important—for Kirk said a society would stagnate without it. Change has to be reconciled with the permanent though, and both are important.
When Kirk revised these ten principles in 1993 before his death in 1994, he said that the word “conservative” was being abused. If alive today, he probably wouldn’t be surprised that the distortion has not stopped.
You Know Less Than You Think About Guns - Reason.com
Do More Guns Mean More Homicides?
This simple point—that America is awash with more guns than ever before, yet we are killing each other with guns at a far lower rate than when we had far fewer guns—undermines the narrative that there is a straightforward, causal relationship between increased gun prevalence and gun homicide. Even if you fall back on the conclusion that it's just a small number of owners stockpiling more and more guns, it's hard to escape noticing that even these hoarders seem to be harming fewer and fewer people with their weapons, casting doubt on the proposition that gun ownership is a political crisis demanding action.
In the face of these trend lines—way more guns, way fewer gun murders—how can politicians such as Obama and Hillary Clinton so successfully capitalize on the panic that follows each high profile shooting? Partly because Americans haven't caught on to the crime drop. A 2013 Pew Research Poll found 56 percent of respondents thought that gun crime had gone up over the past 20 years, and only 12 percent were aware it had declined.
Do Gun Laws Stop Gun Crimes?
Another of National Journal's mistakes is a common one in gun science: The paper didn't look at gun statistics in the context of overall violent crime, a much more relevant measure to the policy debate. After all, if less gun crime doesn't mean less crime overall—if criminals simply substitute other weapons or means when guns are less available—the benefit of the relevant gun laws is thrown into doubt. When Thomas Firey of the Cato Institute ran regressions of Isenstein's study with slightly different specifications and considering all violent crime, each of her effects either disappeared or reversed.
Is Having a Gun in the Home Inherently Deadly?
Stroebe notes that the two major post-Kellermann studies most often used to demonstrate an association between gun ownership and risk of homicide shared one of Kellermann's fatal flaws: They offer no information about whether the gun used to kill the gun owners was their own. And despite Kellermann's finding that living alone was very risky, one of the follow-ups, a 2004 study by Linda Dahlberg and colleagues, found that it was only those with roommates who faced a higher risk of a specifically gun-related homicide.
While most of the articles in the Preventive Medicine issue were standard anti-gun material, one piece perhaps inadvertently undermined a popular argument for expanding background checks. "Sources of Guns to Dangerous People: What We Learn By Asking Them," by Philip Cook and colleagues, surveyed a set of jailed criminals in Cook County, Illinois. It found that they "obtain most of their guns from their social network of personal connections. Rarely is the proximate source either direct purchase from a gun store, or theft." So the go-to remedy for gun control advocates seeking to limit homicides might not have much impact on actual gun criminals.
How Often Are Guns Used Defensively?
The survey work most famous for establishing a large number of DGUs—as many as 2.5 million a year—was conducted in 1993 by the Florida State University criminologists Gary Kleck and Marc Gertz. Kleck says they found 222 bona fide DGUs directly via a randomized anonymous nationwide telephone survey of around 5,000 people. The defender had to "state a specific crime they thought was being committed" and to have actually made use of the weapon, even if just threateningly or by "verbally referring to the gun." Kleck insists the surveyors were scrupulous about eliminating any responses that seemed sketchy or questionable or didn't hold up under scrutiny.
Extrapolating from their results, Kleck and Gertz concluded that 2.2 to 2.5 million DGUs happened in the U.S. each year. In a 2001 edition of his book Armed, Kleck wrote that "there are now at least nineteen professional surveys, seventeen of them national in scope, that indicate huge numbers of defensive gun uses in the U.S." The one that most closely matched Kleck's methods, though the sample size was only half and the surveyors were not experienced with crime surveys, was 1994's National Survey of the Private Ownership of Firearms. It was sponsored by the U.S. Justice Department and found even more DGUs—4.7 million—when explicitly limiting them to ones that met the same criteria as Kleck's study (though the research write-up contains some details that may make you wonder about the accuracy of the reports, including one woman who reported 52 separate DGUs in a year).
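The extrapolation step itself is simple arithmetic, which is partly why the estimates are so sensitive to survey error. The sketch below shows the basic survey-to-population calculation with round, assumed inputs; the adult-population figure and the weighted reporting share are illustrative stand-ins, since the excerpt does not give Kleck and Gertz's actual weights or recall-window adjustments.

```python
# Basic survey-to-population extrapolation behind DGU estimates. The inputs
# are illustrative assumptions, not Kleck and Gertz's actual weighted figures.

adult_population = 190_000_000   # ASSUMED U.S. adult population, early 1990s
reporting_share = 0.0125         # ASSUMED weighted past-year DGU share (~1.25%)

print(f"Point estimate: ~{reporting_share * adult_population:,.0f} DGUs/year")

# Why the debate is so heated: tiny shifts in the reported share (whether from
# false positives, false negatives, or memory telescoping) move the national
# estimate by hundreds of thousands.
for delta in (-0.0025, 0.0025):
    share = reporting_share + delta
    print(f"Share {share:.2%}: ~{share * adult_population:,.0f} DGUs/year")
```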
The major outlier in the other direction, nearly always relied on by those downplaying the defensive benefits of guns, is the Bureau of Justice Statistics' National Crime Victimization Survey (NCVS), a nationally representative telephone survey, which tends to find fewer than 70,000 DGUs per year.
In the October 2015 special issue on "gun violence prevention," Preventive Medicine featured the latest and most thorough attempt to treat the NCVS as the gold standard for measuring defensive gun usage. The study, by Harvard's Hemenway and Sara J. Solnick of the University of Vermont, broke down the characteristics of the small number of DGUs recorded by the NCVS from 2007 to 2011. The authors found, among other things, that "Of the 127 incidents in which victims used a gun in self-defense, they were injured after they used a gun in 4.1% of the incidents. Running away and calling the police were associated with a reduced likelihood of injury after taking action; self-defense gun use was not." That sounds not so great, but Hemenway went on to explain that "attacking or threatening the perpetrator with a gun had no significant effect on the likelihood of the victim being injured after taking self-protective action," since slightly more people who tried non-firearm means of defending themselves were injured. Thus, for those who place value on self-defense and resistance over running, the use of a weapon doesn't seem too bad comparatively; Hemenway found that 55.9 percent of victims who took any kind of protective action lost property, but only 38.5 percent of people who used a gun in self-defense did.
Could More Guns Mean Less Crime?
Do 'Common-Sense Gun Laws' Work?
Elusive Knowledge
Tuesday, January 05, 2016
The Perils of Leuchtenburg | Power Line
When I heard a few weeks ago that there was a new history of the presidency, The American President, by William Leuchtenburg, my first thought was—Leuchtenburg is still alive?? Indeed he is, 92 years old now. It was over 30 years ago that I read one of his best-known books, The Perils of Prosperity: 1914-1932, published in 1958! It was a smug and lazy liberal narrative, entirely typical of the historiography of what might be called “the Age of Schlesinger.” It was not as bad as John Hicks’s Republican Ascendency, 1921-1933, which managed to misquote Calvin Coolidge along the way to delivering what amounted to a partisan Democratic Party pamphlet, but Leuchtenburg’s account is contradictory, superficial, and generally forgettable.
Looks like he has repeated this method in The American President. Lewis L. Gould’s generally kind review of the book in the Wall Street Journal suggests at one point the kind of breezy superficiality typical of Leuchtenburg:
One sentence seems odd. Writing of Nixon’s infrequent press conferences, the author says: “Eisenhower noted disapprovingly that, while FDR averaged eighty press conferences a year, Nixon held only four in all of 1970.” One wonders how the former president conveyed his disapproval of events in 1970 since he had died in 1969.
So naturally we shouldn’t be surprised that Salon (yes, I know) published an excerpt from Leuchtenburg that revived all of the liberal clichés of the 1980s that were long ago debunked, under the headline “Behind the Ronald Reagan myth: ‘No one had ever entered the White House so grossly ill informed.’”
Where to begin? Well, might as well begin at the beginning, with short comments interspersed in [brackets]:
No one had ever entered the White House so grossly ill informed. [Wrong.] At presidential news conferences, especially in his first year, Ronald Reagan embarrassed himself. [Did Leuchtenburg actually read any of the transcripts? If so he might have quoted something. Or he might have noticed Reagan’s startling economic literacy on display.] On one occasion, asked why he advocated putting missiles in vulnerable places, he responded, his face registering bewilderment, “I don’t know but what maybe you haven’t gotten into the area that I’m going to turn over to the secretary of defense.” [Details, please? Would this have been the 1981 press conference where Reagan deliberately emulated Eisenhower’s press-conference practice of dissembling incoherently, in this case to clean up after an Al Haig mess? Or was it the June press conference where Reagan dissembled in service of diplomatic ambiguity over our stance on Israel’s bombing of Iraq’s nuclear reactor, where clarity was exactly the thing to be avoided? Leuchtenburg is consistently uninterested in exploring or providing any such contextual details.] Frequently, he knew nothing about events that had been headlined in the morning newspaper. . . [This is not just wrong but stupid.] In all fields of public affairs—from diplomacy to the economy—the president stunned Washington policymakers by how little basic information he commanded. His mind, said the well-disposed Peggy Noonan, was “barren terrain.” Speaking of one far-ranging discussion on the MX missile, the Indiana congressman Lee Hamilton, an authority on national defense, reported, “Reagan’s only contribution throughout the entire hour and a half was to interrupt somewhere at midpoint to tell us he’d watched a movie the night before, and he gave us the plot from War Games.”
Would this have been the legendary 1983 meeting where Reagan’s apparent ignorance was long ago debunked as a deception of the congresscritters present—Reagan knew the MX missile details cold—who only years later figured out that Reagan was having them on at virtually every such White House meeting? Leuchtenburg doesn’t give enough details to know for sure. Reagan explained several times that he didn’t speak up in meetings because he knew whatever he said would be in the paper the next day, and it served his purposes to be underestimated and to deflect the preening congresscritters ruining his afternoon.
The president “cut ribbons and made speeches. He did these things beautifully,” Congressman Jim Wright of Texas acknowledged. “But he never knew frijoles from pralines about the substantive facts of issues.” Some thought him to be not only ignorant but, in the word of a former CIA director, “stupid.” Clark Clifford called the president an “amiable dunce,” [Would this be the same Clark Clifford who was later indicted for bank fraud? Yes; who’s the dunce now?] and the usually restrained columnist David Broder wrote, “The task of watering the arid desert between Reagan’s ears is a challenging one for his aides.” [No mention that Broder later recanted this judgment.] His White House staff found it difficult, often impossible, to get him to stir himself to follow even this rudimentary routine. When he was expected to read briefing papers, he lazed on a couch watching old movies. [Not true.] On the day before a summit meeting with world leaders about the future of the economy, he was given a briefing book. The next morning, his chief of staff asked him why he had not even opened it. “Well, Jim,” the president explained, “The Sound of Music was on last night.” [So how did Reagan perform the next day at that summit, Mr. Leuchtenburg? It turned out to be a triumph for him by all accounts, including a nice smackdown of Pierre Trudeau. Too much trouble to mention that, I guess. More generally, any mention of when and why Reagan wrote his own talking points for his summits with Gorbachev, not to mention how Reagan actually performed in those high-stakes meetings? No mention in this excerpt, at least.]
Leuchtenburg does include this:
He was able to forge agreements with Democrats in the capital because he had the advantage, as a veteran of Screen Actors Guild battles, of being an experienced negotiator. (In later years, he said of his haggling with Mikhail Gorbachev: “It was easier than dealing with Jack Warner.”) His chief Democratic opponent in the legislature, who started out viewing Reagan with contempt, wound up concluding that he had been a pretty good governor, “better than Pat Brown, miles and planets and universes better than Jerry Brown”—the two most conspicuous Democratic leaders of the period.
You would think these observations would prompt some reflection on Leuchtenburg’s part about whether there wasn’t some artfulness or depth to Reagan not visible on the surface, but apparently not.
Then there’s this:
When he announced that he was planning to run for governor of California, he encountered ridicule. At a time when Robert Cummings was a prominent film star, the Hollywood mogul Jack Warner responded, “No, Bob Cummings for governor, Ronald Reagan as his best friend.”
Every other account of this anecdote I’ve ever seen has Warner saying, “No no, Jimmy Stewart for governor; Ronald Reagan for best friend,” rather than Cummings. It certainly makes more sense than Cummings, who was no longer very prominent in the mid-1960s. Maybe the book offers a source for this?
I could go on, but I guess I’ll just take this to the bank:
Yet he was to leave office regarded as a consequential president, and a number of scholars were even to write of an “Age of Reagan.”
:):)
A Higher Minimum Wage Doesn't Reduce Welfare Costs To Taxpayers - Forbes
This might be one of those things about the minimum wage that people will have a hard time quite grasping. For it seems so obvious: if people working for low wages are getting welfare payments that the rest of us taxpayers have to cough up for, then if we raise the minimum wage we’ll have to cough up less in taxes, because welfare payments will go down. And it’s undoubtedly true that that could happen. The important question, though, is whether it does happen. And the truth of the matter seems to be that no, it doesn’t.
What’s missing is that, sure, those people who keep their jobs and their hours at the higher minimum wage earn more and draw less in welfare payments. But some people don’t keep their jobs and their hours as a result of that higher minimum wage. So what matters for the welfare bill is whether the incomes gained (and the welfare thereby saved) outweigh the incomes lost (and the welfare that replaces them). The answer seems to be that it’s much of a muchness: no great effect either way.
The paper is discussed here:
For years, the union-backed Fight for $15 campaign has argued that raising minimum wages will curb low-wage workers’ reliance on government assistance programs—saving taxpayers money.
Now, that EPI, the Employment PI, is rather more nakedly political than we might be happy accepting the unsupported word of. As opposed to the other EPI, the Economic PI, whose every utterance is taken as gospel over on the left but which does still manage to produce respectable and reasonable research from time to time. However, let us be fair to this EPI: their basic underlying approach does seem to be markedly better than that of an earlier paper which contradicts their results:
Not so, says a new study that found federal and state minimum-wage boosts have had no statistically significant impact on working-age adults’ net use of several such programs, including Medicaid and the Supplemental Nutrition Assistance Program formerly known as the food-stamp plan.
The study, partly funded by the right-leaning Employment Policies Institute, contends a $15 minimum wage is poorly targeted to recipients of these programs. Among those who would be affected by a $15 minimum wage, just 12% are SNAP recipients and just 10% are Medicaid recipients.
One previous study Mr. Sabia is challenging is a 2014 paper updated last year by Rachel West, now a senior policy analyst at left-leaning Washington think-tank Center for American Progress, and Michael Reich, an economics professor at the University of California, Berkeley.
That study examined the effects of minimum wages on SNAP enrollments and expenditures. It concluded that a 10% increase in the minimum wage does reduce SNAP enrollments and expenditures by levels that would have saved taxpayers nearly $4.6 billion a year—6.1% of SNAP expenditures in 2012—if a federal bill seeking a $10.10 minimum wage had been approved.
Mr. Sabia said that study’s model was flawed in part because it didn’t examine the impact among workers and nonworkers. A portion of his study set out to examine that using the same model and produced results that made the prior study’s findings hard to believe, Mr. Sabia said.
Ms. West acknowledged that her study didn’t examine the effects on workers versus nonworkers, but she said she stands by her results.
Let’s just think that through, starting with our basic logic.
We’re going to raise the minimum wage. Our normal theoretical result is that some workers will lose hours or even their jobs as a result. When we then look at the effects upon the welfare bill, what we want to know is the saving from people who keep their jobs and earn more, minus the cost of people who earn less or nothing and thus require more welfare.
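As a purely illustrative sketch of that netting, here is the arithmetic laid out in code. Every number in it is a made-up assumption for illustration, not a figure from the Sabia or West/Reich papers; only the structure of the calculation is the point.

```python
# Purely illustrative netting of a minimum wage rise's effect on the welfare bill.
# All inputs are made-up assumptions, not figures from the studies discussed here.

def net_welfare_change(keepers, wage_gain_each, phase_out_rate,
                       losers, income_lost_each, replacement_rate):
    """Change in annual welfare spending; negative means the bill falls."""
    # Workers who keep their jobs and hours earn more, so benefits phase out.
    savings = keepers * wage_gain_each * phase_out_rate
    # Workers who lose hours or jobs earn less, and welfare replaces part of that.
    extra_costs = losers * income_lost_each * replacement_rate
    return extra_costs - savings

# Hypothetical: 900,000 workers gain $2,000 a year (benefits fall $0.30 per extra
# dollar earned) while 100,000 workers lose $12,000 a year (welfare replaces half).
change = net_welfare_change(keepers=900_000, wage_gain_each=2_000, phase_out_rate=0.30,
                            losers=100_000, income_lost_each=12_000, replacement_rate=0.50)
print(f"Net change in the welfare bill: ${change:,.0f}")  # about +$60 million: roughly a wash
```

Under those assumed numbers the two terms nearly cancel, which is the "no great effect either way" result; a model that counted only the first term would, of course, report large savings.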
So, the paper that says “Hallelujah! The welfare bill is reduced” manages not to look at all at the increased welfare costs of people becoming unemployed. That’s not convincing, is it? And then we’ve got the new paper that looks at both effects, nets them off, and finds pretty much no effect on the welfare bill. I’ll run with that second one if you don’t mind.
However, this then leads us to an insight about the true effects of a rise in the minimum wage. If more poor people get more money as a result of the rise, then welfare bills should go down. If welfare bills don’t go down in aggregate, then poor people aren’t getting more money in aggregate as a result of the rise. And this speaks to the great empirical debate about the minimum wage at present. Because we do still have that theoretical result: higher wages should increase unemployment. Some studies say this isn’t so in the detailed data; many others say it is. But here we’ve a study approaching the point from a completely different angle, using an entirely different set of numbers, yet conforming with our theoretical model, not the disputed empirics of recent years.
Because the welfare bill isn’t falling, the poor on welfare cannot in aggregate be net gainers from a rise in the minimum wage. There must be reduced hours and job losses offsetting the higher incomes going to those who don’t lose their jobs or shifts.
Or, as many people have been saying for a long time now, minimum wage rises don’t help the poor. So let’s not have them; let’s have the only minimum wage that makes any sense at all, $0. As once even the New York Times knew was correct. After all, 96% of employed Americans earn more than the minimum wage, meaning that we know very well that there’s something else determining wages.
Saturday, January 02, 2016
15 Stats That Destroy Liberal Narratives - John Hawkins - Page full
1) “Muslims account for only about 1 percent of the U.S. population but account for about half of terrorist attacks since 9/11. That means Muslims in the United States are about 5,000 percent more likely to commit terrorist attacks than non-Muslims.” -- Mark Krikorian
2) “Consider, for example, that in 1958 a mere 4 percent of Americans approved of interracial marriage. By 2013, that number had grown to 87 percent. In 2012 these once-taboo unions hit an all-time high.
Ku Klux Klan membership has shrunk drastically from millions a century ago to fewer than 5,000 today. The Black Panthers are essentially extinct. While plenty of other hate groups have attempted to fill the void, they have always operated on the margins of society. Black politicians are now common—President Obama’s percentage of the white vote was almost perfectly in line with that received by other recent Democrats, all of whom were white.
Granted, these statistics offer but a snapshot of American society, but the more one looks, the more a trend emerges. America is a lot of things; racist isn’t one of them.” -- Greg Jones
3) “The harsh reality awaiting these low-income Americans is undeniable: according to 2013 data from a 2014 Merritt Hawkins study, 55% of doctors already refuse new Medicaid patients. According to the HSC Health Tracking Physician Survey, 2008, the percentage of doctors that refuse new Medicaid patients dwarf by about 8 to 10 times the percentage that refuses new private insurance patients.
Such ‘insurance’ from Obamacare not only fails to provide access to doctors, but research in the top medical journals such as Cancer, American Journal of Cardiology, Journal of Heart and Lung Transplantation and Annals of Surgery, show that Medicaid beneficiaries suffer worse outcomes than similar patients with private insurance ... all at an added cost of another $800 billion by CBO estimates to taxpayers after the decade.
It is not hyperbole to call Medicaid a disgrace at its annual cost of about $450 billion, and expanding it rather than helping poor people buy private insurance is simply inexplicable.” -- Scott Atlas
4) “In other words, all of the disruption, spending, taxation, and premium hikes in Obamacare has only reduced the percentage of U.S. residents without health insurance by 2.7 percentage points, from 13.9% to 11.1%: a remarkably small reduction, and far lower than what the law was supposed to achieve.” -- Avik Roy
5) “Bernie Sanders thinks you can pay for an 18 trillion dollar expansion of the welfare state — to make it align with a Denmark that doesn’t actually exist — simply by taxing ‘the billionaire class.’ There are 536 billionaires in America. Even if you confiscated everything they had — which, by the way, would surely destroy the American economy by triggering the greatest round of capital flight in human history and amount to government seizure of countless businesses — it wouldn’t come close to covering the tab of Sanders’s proposals.” -- Jonah Goldberg
6) “In 2010, 38,329 people died from drug overdoses, twice the number a decade earlier. More people died of drug overdoses than from automobile accidents (30,196), murders (13,000) or gun accidents (700).” -- Ann Coulter
7) “Between 1979 and 2010, for instance, the average after-tax income for the poorest quintile of American households rose from $14,800 to $19,200; for the second-poorest quintile, it rose from $29,900 to $39,100. Meanwhile, per-person antipoverty spending at the state and federal level increased sixfold between 1968 and 2008 — and that’s excluding Medicare, unemployment benefits and Social Security.” -- Ross Douthat
8) “Just last month, the Senate Judiciary Committee received a report that in just four years, 121 illegal aliens who had been released by ICE went on to murder Americans.” -- Mark Krikorian
9) “Officially known as the Supplemental Nutrition Assistance Program, or SNAP, the food-stamp program has become the country’s fastest-growing means-tested social-welfare program. Only Medicaid is more expensive. Between 2000 and 2013, SNAP caseloads grew to 47.6 million from 17.2 million, and spending grew to $80 billion from $20.6 billion, according to the Agriculture Department. SNAP participation fell slightly last year, to 46.5 million individuals, as the economy improved, but that still leaves a population the size of Spain’s living in the U.S. on food stamps.” -- Jason Riley
10) “Pace Mr. Obama, the state-prison population (which accounts for 87% of the nation’s prisoners) is dominated by violent criminals and serial thieves. In 2013 drug offenders made up less than 16% of the state-prison population; violent felons were 54% and property offenders 19%. Reducing drug-related admissions to 15 large state penitentiaries by half would lower those states’ prison count by only 7%, according to the Urban Institute.
In federal prisons—which hold only 13% of the nation’s prisoners—drug offenders make up half of the inmate population. But these offenders aren’t casual drug users; overwhelmingly, they are serious traffickers. Fewer than 1% of drug offenders sentenced in federal court in 2014 were convicted of simple drug possession, according to the U.S. Sentencing Commission. Most of those possession convictions were plea-bargained down from trafficking charges.” -- Heather Mac Donald
11) “The conservative Heritage Foundation estimated unlawful immigrant households paid $39.2 billion in 2010, but received $93.7 billion in government services.” -- Oliver Darcy
12) “On Wednesday, a Washington Post article announced that ‘The San Bernardino shooting is the second mass shooting today and the 355th this year.’ Vox, MSNBC’s Rachel Maddow, this newspaper and others reported similar statistics. Grim details from the church in Charleston, a college classroom in Oregon and a Planned Parenthood clinic in Colorado are still fresh, but you could be forgiven for wondering how you missed more than 300 other such attacks in 2015. At Mother Jones, where I work as an editor, we have compiled an in-depth, open-source database covering more than three decades of public mass shootings. By our measure, there have been four ‘mass shootings’ this year, including the one in San Bernardino, and at least 73 such attacks since 1982.” -- Mark Follman
13) “As Pew Research cheerfully reports, previous immigrants were ‘almost entirely’ European. But since Kennedy's immigration act, a majority of immigrants have been from Latin America. One-quarter are from Asia. Only 12 percent of post-1965-act immigrants have been from Europe -- and they're probably Muslims.
Apparently, the ‘American experiment’ is actually some kind of sociological trial in which we see if people who have no history of Western government can run a constitutional republic.
As of 1970, there were only 9 million Hispanics in the entire country, according to the Pew Research Center. Today, there are well more than 60 million.” -- Ann Coulter
14) “No fewer than eight major studies from around the world have found homosexuality is not a genetic condition.
Peter Sprigg of the Family Research Council says that these numerous, rigorous studies of identical twins have now made it impossible to argue that there is a ‘gay gene.’ If homosexuality were inborn and predetermined, then when one identical twin is homosexual, the other should be, as well.
Yet one study from Yale and Columbia Universities found homosexuality common to only 6.7 percent of male identical twins and 5.3 percent of female identical twins.
The low rate of common homosexuality in identical twins – around six percent – is easily explained by nurture, not nature.
Researchers Peter Bearman and Hannah Brueckner concluded that environment was the determining factor. They rejected outright that ‘genetic influence independent of social context’ as the reason for homosexuality. ‘(O)ur results support the hypothesis that less gendered socialization in early childhood and preadolescence shapes subsequent same-sex romantic preferences.’
‘Less gendered socialization’ means, a boy was without a positive father figure, or a girl was without a positive mother figure.
In light of the evidence, Sprigg said simply, ‘No one is born gay.’” -- Mark Hodges
15) “Over the last year, only 1.3 million Americans of working age have entered the workforce, even as the population of this same demographic increased by more than 2.8 million. Just over 1 million members of this group found jobs. That's right -- of the new additions to the working age population, less than four in 10 found jobs.
The newspapers touted the reduction in the unemployment rate to 5.3 percent as a cause for celebration. Yet for every three Americans added to the working age population (16 and older), only around one new job (1.07) has been created under Obama. At this pace, America will soon officially have a zero unemployment rate. But that will only be because no one will be looking for work.” -- Stephen Moore