Thursday, February 22, 2018

Baseballot: Sentence-Diagramming the Second Amendment

For public information, here is what the Second Amendment actually says: "A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed."
Intelligent people can have intelligent disagreements about how to interpret this sentence, especially since it is a particularly confusing piece of the law of the land. There are too many nouns and not enough verbs, for one thing; for another, what's up with all the commas?
Well, first off, don't worry so much about all the commas. Commas were used a lot more freely—and haphazardly—in colonial times than they are today. Different copies of the Constitution, transcribed by different printers, have different numbers of commas. Back then, the comma was simply a punctuation mark inserted to give speakers (remember, the Declaration of Independence and the Constitution were read aloud in public places across the colonies) cues on where to pause and take a breath. Apparently it was even the British legal tradition at the time to disregard commas when interpreting statutes—they were considered annotations more than parts of the text.
Now that we have codified grammatical rules to an extent that was unimaginable—even impossible—in the 18th century, we have also abandoned the once-common practice of inserting commas between subjects and their predicates. That accounts for the two Second Amendment commas that look weirdest to the modern eye: the first ("Militia, being") and the third ("Arms, shall"). Take those out, and you're left with two clear clauses regardless of what you do with the second, middle comma.
The second half ("the right of the people to keep and bear Arms shall not be infringed") is the money phrase and the independent clause of the amendment—in other words, it's the main idea. This seems great for gun advocates—except it's not the only idea. We also have to figure out what to do with the dependent, participial clause that begins the sentence: "A well regulated militia being necessary to the security of a free State."
Those who have studied Latin will recognize this immediately as an ablative absolute clause. Under this Latin device, an entire string of words before the main idea of a sentence would be put in the ablative case, which is used to express means or accompaniment (i.e., it's used after the word "with"). Here's an example:
Omnibus paratis, familia discessit ad urbem. With everything prepared, the family departed for the city.
It's a construction we still use in English sometimes, as the non-awkwardness of the translation suggests. However, in Latin, the ablative absolute is used for a special reason: to express cause or circumstance. Therefore, a less literal, but more colloquial, translation would be, "Since everything was ready, the family departed for the city."
If you were to translate the Second Amendment into Latin and then back into English, the best translation would read something like, "Since a well-regulated militia is necessary to the security of a free state, the right of the people to keep and bear arms shall not be infringed." That establishes a much clearer causal link between the right to bear arms and the necessity of a "well-regulated militia." This gives rise to the liberal interpretation of the amendment.
However, since the Bill of Rights is, in fact, not written in Latin, the true meaning is open for debate. Progressive legal scholars like Jeffrey Toobin observe that, for much of American history, the right to bear arms was only understood in the context of protecting state militias. Only in the late 1970s, Toobin argues, when the Republican Party was making its hard turn to the right, did the NRA campaign successfully to alter the national perception of the amendment to cover individual citizens' gun rights. To Toobin, that makes the individual-rights interpretation wrong. Yet it can also be convincingly argued that the organized-militia-only interpretation flew in the face of the actual, original intent of the Framers. Colonial-era writing samples suggest that the phrase "bear arms" was a deliberate choice and specifically refers, then as now, to individual possession of weapons, not the military use of them.
What is clear is that the relationship between the first clause and the second clause is the key to understanding the amendment's meaning: does the dependent clause qualify or restrict the independent one or not? On one hand, the first half could be a specific and exclusive raison d'ĂȘtre for the entire main clause, in the full spirit of the ablative absolute, as if the Bill of Rights had been written in Latin itself. Or the first clause could just be irrelevant fluff, a throwaway statement that may be tangentially true but does not affect the amendment's main point; it might as well not even be there. The middle ground is that the first clause is neither meaningless nor decisive; it provides a context, and perhaps an explanation, for the main thrust of the amendment, but it doesn't take anything away from the core meaning.
So are background checks unconstitutional? You can argue that they do take away certain citizens' rights to bear arms, or that they obstruct law-abiding citizens' rights even if they do get their guns in the end. But background checks also fit perfectly with the idea of a "well-regulated" gun-toting population, should you accept the liberal or the moderate interpretation of the Second Amendment. Anything short of denying the relevance of the first clause, and you acknowledge that the Constitution believes regulation of guns is important. (Ironically, conservatives who deny that are thus also denying any similarities between the Second Amendment's structure and the ablative absolute of Latin—many conservatives' favorite language.) However, many of the senators who claimed that they believed background checks violated the Second Amendment were moderates, even Democrats—not conservative ideologues.
If you believe in the hard-right interpretation of the Second Amendment, that's fine. But I don't think these swing-vote senators on the background-checks bill do. Given their overall philosophies and temperaments, a much more nuanced view of Second Amendment law seems likely. Maybe they too, like the rest of us, need a grammatical lesson and a refresher on what the Constitution actually says.

Saturday, November 11, 2017

Gibson's Bakery sues Oberlin College over racial profiling accusations, Oberlin cuts business ties

The folks at Gibson's Bakery caught some Oberlin College students shoplifting. They turned them over to the police, and all three subsequently pleaded guilty.

Because of this, they're accused of racial profiling.
So I'm guessing the students were not white. Or Asian, for that matter.

Friday, November 10, 2017

Freeman Dyson on ‘heretical’ thoughts about global warming | Watts Up With That?


My first heresy says that all the fuss about global warming is grossly exaggerated. Here I am opposing the holy brotherhood of climate model experts and the crowd of deluded citizens who believe the numbers predicted by the computer models. Of course, they say, I have no degree in meteorology and I am therefore not qualified to speak.
But I have studied the climate models and I know what they can do. The models solve the equations of fluid dynamics, and they do a very good job of describing the fluid motions of the atmosphere and the oceans. They do a very poor job of describing the clouds, the dust, the chemistry and the biology of fields and farms and forests. They do not begin to describe the real world that we live in.
The real world is muddy and messy and full of things that we do not yet understand. It is much easier for a scientist to sit in an air-conditioned building and run computer models, than to put on winter clothes and measure what is really happening outside in the swamps and the clouds. That is why the climate model experts end up believing their own models.

Monday, October 09, 2017

Fascism’s Karl Marx: Man the left doesn’t want you to meet

For fascism, the State and the individual are one. – Giovanni Gentile, “Origin and Doctrine of Fascism”
The myth that fascism and Nazism are phenomena of the right relies heavily on Americans not knowing what fascism and Nazism really mean, what those ideologies stand for. Leftists in academia and the media have worked hard to portray fascism and Nazism in terms of sheer demagoguery and generic authoritarianism, carefully concealing the ideological roots that would reveal fascism and Nazism’s true political colors.

Think about this: We know the name of the philosopher of capitalism, Adam Smith. We also know the name of the philosopher of Marxism, Karl Marx. So, quick: What is the name of the philosopher of fascism? Yes, exactly. You don’t know. Virtually no one knows. This is not because he doesn’t exist, but because the political left – which dominates academia, the media and Hollywood – had to get rid of him to avoid confronting fascism and Nazism’s unavoidable leftist orientation.

So let’s meet the man himself, Giovanni Gentile, who may be termed fascism’s Karl Marx. Gentile was, in his day (the first half of the 20th century), considered one of Europe’s leading philosophers. A student of Hegel and Bergson and director of the Enciclopedia Italiana, Gentile was not merely a widely published and widely influential thinker; he was also a political statesman who served in a variety of important government posts. How, then, has such a prominent and influential figure vanished into the mists of history?

Let’s consider some key aspects of Gentile’s philosophy. Following Aristotle and Marx, Gentile argues that man is a social animal. This means that we are not simply individuals in the world. Rather, our individuality is expressed through our relationships: we are students or workers, husbands or wives, parents and grandparents, members of this or that association or group, and also citizens of a community or nation. To speak of man alone in the state of nature is a complete fiction; man is naturally at home in community, in society.

Right away, we see that Gentile is a communitarian as opposed to a radical individualist. This distinguishes him from some libertarians and classical liberals, who emphasize individuality in contradistinction to society. But Gentile so far has said nothing with which conservatives – let’s say Reaganite conservatives – would disagree. Reagan in 1980 emphasized the importance of five themes: the individual, the family, the church, the community and the country. He accused the centralized state – big government – of undermining not merely our individuality but also these other associations.

Gentile now contrasts two types of democracy that he says are “diametrically opposed.” The first is liberal democracy, which envisions society made up of individuals who form communities to protect and advance their rights and interests, specifically their economic interests in property and trade. Gentile regards this as selfish or bourgeois democracy, by which he means capitalist democracy, the democracy of the American founding. In its place, Gentile recommends a different type of democracy, “true democracy,” in which individuals willingly subordinate themselves to society and to the state.

Gentile recognizes that his critique of bourgeois democracy echoes that of Marx, and Marx is his takeoff point. Like Marx, Gentile wants the unified community, a community that resembles the family, a community where we’re all in this together. I’m reminded here of New York Gov. Mario Cuomo’s keynote address at the 1984 Democratic Convention. Cuomo likened America to an extended family where, through the agency of government, we take care of each other in much the same manner that families look out for all their members.

While Marx and Cuomo seem to view political communities as natural, inevitable associations, Gentile emphasized that such communities must be created voluntarily, through human action, operating as a consequence of human will. They are, in Gentile’s words, an idealistic or “spiritual creation.” For Gentile, people are too slothful and inert to form genuine communities by themselves; they have to be mobilized. Here, too, many modern progressives would agree. Speaking in terms with which both Obama and Hillary would sympathize, Gentile emphasized that leaders and organizers are needed to direct and channel the will of the people.

Despite Gentile’s disagreement with Marx about historical inevitability, he has at this point clearly broken with modern conservatism and classical liberalism and revealed himself to be a man of the left. Gentile was, in fact, a lifelong socialist. Like Marx, he viewed socialism as the sine qua non of social justice, the ultimate formula for everyone paying their “fair share.” For Gentile, fascism is nothing more than a modified form of socialism, a socialism arising not merely from material deprivation but also from an aroused national consciousness, a socialism that unites rather than divides communities.

Gentile also perceived socialism emerging out of revolutionary struggle, what the media today terms “protest” or “activism.” Revolutionaries, Gentile says, must be ready to disregard conventional rules and willing to use violence. Gentile seems to be the unacknowledged ancestor of the street activism of Antifa and other leftist groups. “One of the major virtues of fascism,” he writes, “is that it obliged those who watched from the windows to come down into the street.”

For Gentile, private action should be mobilized to serve the public interest, and there is no distinction between the private interest and the public interest. Correctly understood, the two are identical. Gentile argued that society represents “the very personality of the individual divested of accidental differences … where the individual feels the general interest as his own and wills therefore as might the general will.” In the same vein, Gentile argued that corporations too should serve the public welfare and not just the welfare of their owners and shareholders.

Society and the state – for Gentile, the two were one and the same. Gentile saw the centralized state as the necessary administrative arm of society. Consequently, to submit to society is to submit to the state, not just in economic matters, but in all matters. Since everything is political, the state gets to tell everyone how to think and also what to do – there is no private sphere unregulated by the state. And to forestall resistance to the state, Gentile argued that the government should act not merely as a lawmaker but also a teacher, using the schools to promulgate its values and priorities.

“All is in the state and nothing human exists or has value outside the state.” Mussolini said that in the Dottrina del fascismo, one of the doctrinal statements of early fascism, but Gentile wrote it – or, as we may say today, ghostwrote it. Gentile was, as you have probably figured out by now, the leading philosopher of fascism. “It was Gentile,” Mussolini confessed, “who prepared the road for those like me who wished to take it.”

Gentile served as a member of the Fascist Grand Council, a senator in the Upper House of Parliament, and also as Mussolini’s minister of education. Later, after Mussolini was deposed and established himself in the northern Italian town of SalĂČ, Gentile became, at il Duce’s request, the president of the Italian Academy. In 1944, Gentile was accosted in his apartment by members of a rival leftist faction who shot him at point-blank range.

Gentile’s philosophy closely parallels that of the modern American left. Consider the slogan unveiled by Obama at the 2012 Democratic Convention: “We belong to the government.” That apotheosis of the centralized state is utterly congruent with Gentile’s thinking. The difference is that Gentile provided the comprehensive philosophical defense the Democrats didn’t even attempt. In many respects, Gentile provides a deeper and firmer grounding for modern American progressivism than anyone writing today.

John Rawls, widely considered a philosophical guru of modern progressivism, seems like thin gruel compared to Gentile in offering an intellectual rationale for ever-expanding government control over the economy and our lives. While Rawls feels abstract and dated now, Gentile seems to speak directly to leftist activists in the Democratic Party, in the media, and on campus.

One might naively expect the left, then, to embrace and celebrate Gentile. This, of course, will never happen. The left desperately needs to conceal fascism’s deep association with contemporary leftism. Even when the left uses Gentile’s rhetoric, its source can never be publicly acknowledged. That’s why the progressives intend to keep Gentile where they’ve got him: dead, buried and forgotten.


Thursday, July 20, 2017

The Latest ‘Hate’ Smear Target Is a Civil-Rights Group - WSJ


SPLC Watch

Ed Meese III:

The headlines were both inflammatory and untrue: “Attorney General Jeff Sessions Criticized for Speaking to ‘Hate Group,’” reported NBC. Reports from ABC and other major news outlets used similar language. Readers might be surprised to learn that the group in question is the Alliance Defending Freedom, a respected civil-rights law firm.

So where did this scurrilous charge originate? With the Southern Poverty Law Center, which labels the ADF a “hate group.” The designation had nothing to do with the law firm’s policies or behavior. It’s just that the SPLC objects to its traditional views on the Constitution, the First Amendment and the meaning of marriage. No responsible media outlet should parrot the SPLC’s hate list without seeking to understand not only its motives but also the consequences of spreading false charges.

Sunday, July 16, 2017

Ranking the States by Fiscal Condition 2017 Edition | Mercatus Center

I'm in the 43rd-ranked state.


The Inconvenient Truth about Ghetto Communities’ Social Breakdown | National Review


Among the many painful ironies in the current racial turmoil is that communities scattered across the country were disrupted by riots and looting because of the demonstrable lie that Michael Brown was shot in the back by a white policeman in Missouri — but there was not nearly as much turmoil created by the demonstrable fact that a fleeing black man was shot dead by a white policeman in South Carolina.

Totally ignored was the fact that a black policeman in Alabama fatally shot an unarmed white teenager, and was cleared of any charges, at about the same time that a white policeman was cleared of charges in the fatal shooting of Michael Brown.

In a world where the truth means so little, and headstrong preconceptions seem to be all that matter, what hope is there for rational words or rational behavior, much less mutual understanding across racial lines?

When the recorded fatal shooting of a fleeing man in South Carolina brought instant condemnation by whites and blacks alike, and by the most conservative as well as the most liberal commentators, that moment of mutual understanding was very fleeting, as if mutual understanding were something to be avoided, as a threat to a vision of “us against them” that was more popular.

That vision is nowhere more clearly expressed than in attempts to automatically depict whatever social problems exist in ghetto communities as being caused by the sins or negligence of whites, whether racism in general or a “legacy of slavery” in particular. Like most emotionally powerful visions, it is seldom, if ever, subjected to the test of evidence.

The “legacy of slavery” argument is not just an excuse for inexcusable behavior in the ghettos. In a larger sense, it is an evasion of responsibility for the disastrous consequences of the prevailing social vision of our times, and the political policies based on that vision, over the past half century.

Anyone who is serious about evidence need only compare black communities as they evolved in the first 100 years after slavery with black communities as they evolved in the first 50 years after the explosive growth of the welfare state, beginning in the 1960s.

You would be hard-pressed to find as many ghetto riots prior to the 1960s as we have seen just in the past year, much less in the 50 years since a wave of such riots swept across the country in 1965.

We are told that such riots are a result of black poverty and white racism. But in fact — for those who still have some respect for facts — black poverty was far worse, and white racism was far worse, prior to 1960. But violent crime within black ghettos was far less.

Murder rates among black males were going down — repeat, down — during the much-lamented 1950s, while they went up after the much-celebrated 1960s, reaching levels more than double what they had been before. Most black children were raised in two-parent families prior to the 1960s. But today the great majority of black children are raised in one-parent families.

Such trends are not unique to blacks, nor even to the United States. The welfare state has led to remarkably similar trends among the white underclass in England over the same period. Just read Life at the Bottom, by Theodore Dalrymple, a British physician who worked in a hospital in a white slum neighborhood.

You cannot take any people, of any color, and exempt them from the requirements of civilization — including work, behavioral standards, personal responsibility, and all the other basic things that the clever intelligentsia disdain — without ruinous consequences to them and to society at large.

Non-judgmental subsidies of counterproductive lifestyles are treating people as if they were livestock, to be fed and tended by others in a welfare state — and yet expecting them to develop as human beings have developed when facing the challenges of life themselves.

One key fact that keeps getting ignored is that the poverty rate among black married couples has been in single digits every year since 1994. Behavior matters and facts matter, more than the prevailing social visions or political empires built on those visions.

Saturday, July 15, 2017

SPLC Watch

This is a post where I'm going to collect bits about the Southern Poverty Law Center as I find them, because in many cases, I'm having trouble finding them again.

http://www.nationalreview.com/article/449476/splc-dangerous-lies-alliance-defending-freedom-no-hate-group

Attorney General Jeff Sessions delivered a speech to an alleged hate group at an event closed to reporters on Tuesday night, but the Department of Justice is refusing to reveal what he said.

Sessions addressed members of the Alliance Defending Freedom, which was designated an “anti-LGBT hate group” by the Southern Poverty Law Center in 2016, at the Summit on Religious Liberty at the Ritz-Carlton, Laguna Niguel, in Dana Point, California.

I’m at that summit right now. I heard the attorney general’s speech, I delivered a speech myself, and I’m even now sitting right next to friends and former colleagues at ADF listening to a lecture on censorship in the European Union. Let me give you a peek under the curtain. I’ll let you in on the major themes of the week.

We heard from men and women who’ve long served gay customers and formed lasting friendships with gay neighbors who now face death threats because they simply refused to lend their artistic talents to celebrate a gay wedding. We heard one man’s voice break as he told the story of how his father fought across Europe and helped liberate a concentration camp from Nazi control — and now his son is called a “Nazi” in part because he wants all people to enjoy the same rights of conscience and wants no man or woman to be coerced into supporting events they find immoral.

....

Let’s be clear. The Southern Poverty Law Center, the “civil rights watchdog group” that ABC and NBC so prominently cite, has become a dangerous joke. It’s a joke because the very idea that Christians are members of a “hate group” merely because they advocate for orthodox Christian principles and the liberty to live those principles is so intellectually and ideologically bankrupt that it’s barely worth addressing.

Indeed, I’d encourage you to read the SPLC’s information page on the Alliance Defending Freedom. It consists of a collection of quotes where ADF attorneys explain the implications of an unrestrained sexual revolution on religious liberty, and it details how ADF files cases to protect the First Amendment rights of its clients. That’s it. No violence. No hate. Mere Christianity.

And this will be dismissed out of hand because it's Accuracy In Media:
http://www.aim.org/aim-column/southern-poverty-law-center-belongs-on-southern-policy-law-centers-hate-list/

Alliance Defending Freedom is representing Jack Phillips, the cake baker who declined to create a cake for a gay wedding and whose case the Supreme Court recently agreed to hear.

It pushed the case that forced public schools to give after-school Bible clubs the same access they give other organizations. And it convinced the Supreme Court to affirm that an Arizona tuition tax credit program could be used for any schools parents choose, including those run by churches.

It got the court to affirm the rights of communities to restrict where sexually oriented businesses could locate. And it was the group behind the Supreme Court ruling upholding the ban on partial-birth abortions.

One can take issue with any or all of these positions. But they are the positions of a Christian group, founded by preachers, that makes grants or provides pro bono attorneys to groups and individuals whose religious rights and rights to free association are threatened. They are not the positions of a hate group.

So how did this group end up in a thousand headlines proclaiming it a “hate group”? Because the Southern Poverty Law Center said so. And for the left and its handmaidens in the media, that’s all it takes.

Wednesday, July 12, 2017

Slavery and Islam - Part 1: The Problem of Slavery | Yaqeen Institute for Islamic Research

A lecture that has evoked some controversy because of this article:
Is there slavery in Islam? When people pose this question they usually assume it’s the Islam part that needs clarification. Everyone already knows what slavery is. Actually, it’s quite the opposite. The Islam part is relatively straightforward. The real problem is trying to pin down what we mean by slavery. The more we scratch the surface of that word and try to define its reality, the more we find that our assumptions and even our words fail us. What we think we mean by slavery means little outside our own American experience, and the moment we try to fix what slavery is as a human phenomenon we find a hall of mirrors reflecting our own assumptions back at us. We all think we know what slavery is, but would we really know slavery if we saw it?

Imagine we could explore the phenomenon of slavery throughout history. Imagine that, as huge Doctor Who fans, we hitch a ride in the TARDIS, which allows us to travel across space and time. Our first stop is an exotic, desert land where slavery is common. We visit a well-off home, where we find certain people performing domestic work while an older man sits drinking tea. Everyone has the same dark skin color. Suddenly the lounging tea-drinker shouts at a young man serving him and smacks him hard with a fly swatter. We are eager to know who all these people are. Fortunately, the TARDIS translates all languages directly to our brains. We ask one of the men serving tea his name, and he says his name is Saffron and that he is “one of the delicate folk” working in the household. He has worked in this house for five years, but he tells us that, in one year’s time, he’ll have saved enough money to move on and start his own teashop. We ask about the young man getting smacked. “Oh, that poor boy
 he’ll be here till the old man dies.”

Back in the TARDIS, we voyage on through time and space, this time to meet the powerful prime minister of an expansive empire. The prime minister enters the throne room surrounded by dozens of armed soldiers, and we sense the trepidation in the hushed muttering of the audience around us. One voice whispers, “The minister is worth 80 million gold ducats.” “He’s married to the king’s daughter,” responds another. The minister and his bodyguards are all light-skinned and fair-haired. Many of those there to offer petitions and seek favor have a darker, olive complexion.

After meeting the minister we voyage on, now to a colder land where we meet a man working in a clock factory. He hates his life, so we agree to take him with us. But the factory owner catches him leaving, and the man is thrown in prison.

We voyage still onward in the TARDIS to a new land where, passing down the road, we see a crew of dark-skinned youths clearing brush in the hot sun, their legs shackled and all joined by chains. A light-skinned man watches over them with a weapon in hand.

Where has the TARDIS taken us in our exploration of slavery? The first place we visited was the city of Mecca in the 1400s. The ‘soft and delicate’ (raqÄ«q) man Saffron was a slave in the wealthy man’s household who had an agreement with his master to buy back his freedom in installments (mukātaba). RaqÄ«q was the standard term for slave, and epicurean names like Saffron were typical. The younger man being smacked for bad service, who was tied to the household seemingly forever, was the wealthy man’s own son.

The second place we visited was the capital of the Ottoman Empire in 1579. The minister was Sokollu Mehmet Pasha, the grand vizier and de facto ruler of the empire during the time of three sultans. At the time of our visit, he had already been one of the empire’s richest and most powerful men for almost two decades. He was also a slave of the sultan. He was born in Bosnia, as were all his guards, who were also slaves of the sultan.[1]

The land where we met the man working in a clock factory was England in 1860. Although the worker was a free man, according to labor laws in England at the time, a worker who failed to show up for work was guilty of stealing from his employer and was tried and sentenced as a criminal. Finally, the last place we visited was a land in which slavery had long been illegal: rural Arizona in 2004, where the local sheriff was overseeing a juvenile chain gang.

The Problem of Defining \ˈslā-v(ə-)rē\

How would we know who’s a slave and who isn’t on our voyage? Most Westerners today would probably think that the young man being smacked and the chained laborers were slaves, because we associate slavery with physical degradation, harsh labor and violence. We would probably not assume the ‘soft and delicate’ man was a slave because he told us he would soon move to another job on his own terms, while we associate slavery with a total loss of agency, presumably for life. We would certainly not presume that the minister was a slave, since he clearly wielded immense wealth and the power of life and death throughout an empire.

If we are searching for the phenomenon of slavery, what are we really looking for? Is it the label ‘slave’ that matters? Or is it the reality of the condition behind it? The soldiers and administrators of China’s Manchu Qing dynasty (1644-1912) were technically slaves (aha) of the dynasty and proudly referred to themselves as such. This title of ‘slave’ was later applied to anyone of Manchu descent in Qing China. But the word had no link to the reality of any servile condition.[2] Up through the 1800s, the upper administration of the Ottoman Empire was in the hands of people technically classified as kul (a privileged sultanic slave) who had more power and esteem than their free counterparts.[3]

When we come across a word that translates as ‘slave’ in English, does that word necessarily mean what we mean by slavery? Our word slave in English comes from the Medieval Latin word for Slavic peoples, Sclavus, since they were the population in the Balkans from which European slave traders drew their cargo up through the thirteenth century.[4] A common English dictionary definition of a slave is ‘someone who is legally owned by another person and is forced to work for that person without pay.’ This notion of slavery as reducing human beings to things owned by other people has been a major theme in how the concept has been understood in the West. It was crucial to how abolitionists understood slavery in the eighteenth and nineteenth centuries, when the movement to end slavery began. But this definition’s roots go further back in Western heritage. They lie in Roman law, which divided people into two categories: the free (a free person has the ‘natural right’ to ‘do as he pleases, unless prevented by the force of law’) and slaves, who exist as the property of others.

But even defining slavery through concepts like ownership and exploitation leaves more questions than answers. What does ownership mean? In American law we think of ownership as a ‘bundle of rights’: the rights to use, exclude, destroy and sell off. Sometimes an owner has some of them, often with significant restrictions, and sometimes the owner has them all. We would probably not think of kids ‘owning’ their toys, since they are clearly not in control of them (ideally!). But children in America legally do ‘own’ the toys we give them. Yet their ownership is not complete, since their right to use them is highly restricted by their parents.

Ownership is as much about how we imagine relationships as exercising real control. As the famous social historian Orlando Patterson points out, who and what we say we own is really only a matter of our customs and manners.[5] Modern Americans would gasp at the notion of ‘owning’ their children, but from the Roman through the medieval period in Europe parents could and did sell their children off as slaves to creditors in order to pay debts. Moreover, poor parents abandoning their children was a regular source for slave markets in Europe.[6] Yet all these children started off as technically ‘free’ in the legal sense, not legally owned by anyone. In the US, wives and husbands have numerous claims on and powers over each other and their labor, as becomes clear during divorce.[7] But we would never speak about marriage as a relationship of ownership. Conventions in early imperial China were different. There, husbands regularly listed their (free!) wives as property in their wills, bequeathing them to some friend.[8] Astoundingly, between 1760 and 1880 – less than a century and a half ago – there were 218 cases of Englishmen holding auctions to sell off their wives, even advertising this in the newspaper.[9]

What would it mean to ‘own’ a person? Does it mean to have total control over them? We have full control over our young children, but, unlike a chair or a pen, we cannot seriously physically harm them without legal consequence. In fact, this distinction between ownership and control is not very helpful for defining slavery. As with our children today, it was impermissible for Muslims to kill or seriously injure their slaves, and those who did faced legal consequences under the Shariah. In some contexts, ownership might fail completely as a concept for understanding slavery. Slavery existed in imperial China, but it was not conceptualized through ownership. Slaves were not legally ‘owned’ at all for the very technical reason that Chinese law could not categorize people as ‘things.’[10]

If we think about slavery as exploitation, does slavery mean not compensating someone for their labor? Sokollu Mehmet Pasha was a slave ‘owned’ by the Ottoman sultan, but he was also paid handsomely for his work as grand vizier. Saffron was owned by his master, but only partially, since he had already bought back a portion of his freedom through wages he earned elsewhere in his time off. He received no pay from his master, but the master paid for his food, clothes and shelter. Incidentally, in this regard the slave was no different from the master’s own son. Both were his dependents, relying on his support for their basic needs.

We usually think of slavery as something that exists in a dichotomy with freedom. But what does freedom mean? As the legal scholar Vaughan Lowe jibes, inverting Rousseau’s famous line about man’s natural state of freedom, “Man is born in chains, but everywhere he thinks himself free.”[11] Almost no human being is free of dependence on others and on society as a whole. Almost everyone is forced to work in order to earn wages to buy food. The son in the household we visited in Mecca was technically free, but he depended on his father for all his support and had to obey him or face his anger. If he fled his home to get away from his nasty father, he’d be ostracized by all those he knew and loved. The man’s slave, meanwhile, had evenings off to earn his own money and would soon be free of his master. Who was free in this situation?

At a theoretical level, how we understand freedom in the West is inherited from Classical Greece and Rome, where ‘free’ was the legal category of citizens of a democratic republic. A free person is autonomous, at liberty to do whatever he or she wants unless the law prohibits it. Everyone else is a slave. But even in Classical times this legal definition of freedom was no more than a “rhetorical argument,” as one scholar puts it, since in reality few people in the Greek and Roman world were ‘free’ by this definition. Almost everyone was constrained by powerful social, economic and even legal bonds.[12] Ironically, even in theory this notion of freedom only applies in liberal democracies. In autocracies – perhaps a majority of societies in human history – almost no one is free by this definition.[13]

Nor does freedom exist on a single plane. It is often relational, expanding or contracting depending on the relationship in question. In the ancient and medieval Mediterranean world (both Europe and Islamic civilization), a slave’s intense subordination was not absolute. He or she was subordinated to his or her master, not to society as a whole. So Roman and later Byzantine masters used slaves to run their shops and to be the public faces of their businesses, negotiating and arguing with countless ‘free’ customers and contractors on a daily basis.[14] The slave was not the lowest rung on the ladder in the streets of Rome or Constantinople/Istanbul. If their master was a powerful or wealthy person, the slave enjoyed the status of that connection in public life. The status of the slave depended on the status of his or her master.

How We See Slavery – American Chattel Slavery

By now you should see that any question about slavery is very complicated. One of the biggest challenges that historians and anthropologists interested in slavery face is whether there is some single institution of slavery, existing across time and space, that they can even study.[15] It’s tempting to assume that, though the details might differ, there is something called slavery out there, popping up throughout history, and that we’d know it if we saw it. But, of course, as our hypothetical trip in the TARDIS shows, what we would recognize as slavery is determined by our own cultural memory of what the English word \ˈslā-v(ə-)rē\ means to us.

When Americans think of slavery we think of Twelve Years a Slave and Roots. The images are seared into our minds: African men, women and children being seized by ruthless slave traders, torn from their homes and each other, packed like chattel into the holds of stifling slave ships, sold like cattle at auction to white plantation owners, who worked, oppressed and lashed them mercilessly for the rest of their lives. Slavery in our cultural memory is ‘the original sin’ of America: the reduction of a person, against their will, to the status of property, owned by another person who had absolute right over their labor and who deprived them of the natural right to freedom and family.

The Spectrum of Coerced Labor

Yet as we have seen, ownership, freedom and exploitation come in shades of gray. They exist on spectrums. Historians and sociologists have attempted to delineate categories on this spectrum, in part to determine if we can really talk about slavery as something separate from other forms of forced labor or involuntary servitude. The main categories on this ‘continuum of dependency’ other than slavery are:[16]

Serfdom: In Europe, this tradition goes back to ancient Greece. Laborers, usually peasant farmers, were free in the sense that they owned their own clothes, tools and livestock, as well as the fruits of their labor. But they were bound to the land on which they lived or to their landlord wherever he might go.[17] Serfdom in Europe developed as the status of free peasants and settled Barbarian prisoners of war in the late Roman Empire collapsed into a single class of “quasi-servitude” not too different from slavery.[18] Serfdom disappeared in most of Western Europe in the wake of the Black Death in the 1300s, though it continued in the institution of villeinage in England until around 1600 and continued into the 1800s in mining areas of Scotland and German-speaking lands. Serfdom is most associated with Russia, where it came to replace slavery in the agricultural and domestic spheres in the late 1600s and early 1700s.[19]
Master/Servant Relationship: When serfdom disappeared from Western Europe, it was replaced by the relationship between the laborer and the landowner/employer. Unlike our modern notion of a worker’s contract, however, failing to live up to this contract was a criminal offense. Only in the British colonies in North America did a notion of free labor eventually appear in the 1700s, and this did not make its way back to Britain until 1875.[20]
Debt servitude: This has been one of the most widespread forms of coerced labor. When a person is unable to repay a debt, he or she becomes the slave of the creditor. This was extremely common in Southeast Asia, where our Western model of slavery was quite rare.[21]
Bonded labor/indentured servitude: This is similar to debt servitude and has been very common in history. A person willingly enters into an agreement to exchange their labor and a loss of some freedoms for a fixed period of time in return for some service or up-front payment. This differs from debt servitude because the person willingly surrenders their labor and a degree of freedom.
These categories are not fixed or hermetically sealed. They bleed into each other, making it very hard to come up with a clear line distinguishing slavery from other forms of coerced labor. Scottish mining serfs often wore collars with the names of their masters on them, for example, something we’d probably associate more with slavery.[22] Indentured servants from Britain, who made up two-thirds of the immigrants to British North America before 1776, could be sold, worked to exhaustion and beaten for misbehavior. They could not marry and, in Virginia at least, could be mutilated if they tried to escape. In Maryland the punishment was death.[23]

Slavery in colonial America was worse, but only in that it was permanent. On the other hand, as early as the 1400s in the Ottoman Empire people captured in war were sometimes settled to work lands owned by the sultan. Although technically slaves, their condition was closer to serfdom. These slaves formed families that lasted generations and passed down the land they worked to their children. Only if a head of household died without any children would his estate revert back to the imperial treasury. Later on, as Ottoman cities industrialized, factory owners preferred using slave labor because slaves would not leave for seasonal work elsewhere. By agreeing to mukātaba contracts with these slaves – in which the slaves bought their own freedom by installments – these factory owners were able to maximize the slaves’ productivity.[24] They were, in effect, more like wage laborers working for a set term in a master/servant relationship than slaves.

We might think of slavery as distinguished from other types of coerced labor by the question of choice. Indentured servants chose to enter into those contracts. Slaves would never choose to become slaves, right? But realities are much more complicated. Outside of slavery in the Americas, ‘voluntary slavery’ was not uncommon at all.[25] In Ming China many impoverished tenants sold themselves into slavery when they could not pay rent.[26] In 1724, the Russian czar abolished slavery and converted all of Russia’s slaves into serfs because serfs were offering themselves as slaves to avoid paying taxes; serfs paid taxes, slaves did not.[27] Earlier, in the fifteenth-century duchy of Muscovy, what scholars term ‘limited service contract slavery’ became common. In such a contract, a person took a one-year loan from someone wealthy, working for the creditor in the meantime instead of paying interest, with the loan due at the end of the year. If the borrower could not pay the creditor back, he became the creditor’s slave – most often, a lifetime slave. This type of slavery replaced all other forms of slavery in Russia. And yet there was also indentured servitude at the same time, differing from slavery only in that an indentured servant could not be physically harmed by their master.[28]

Unlike bonded laborers or serfs, we might think of slaves as people with little or no legal right to protection. This has often been true. In Ming China, slaves were often referred to as “not human.” Not only could they not own property, marry or have legitimate children, but killing one of them also posed no legal problem.[29] Among the Toraja people of Sulawesi (today in Indonesia), someone who had been convicted of a capital crime could have one of his slaves executed instead of himself.[30] A judge in South Carolina in 1847 declared that a slave “can invoke neither magna carta nor common law”; for the slave the law was whatever the master said.[31]

Yet not only were legal realities often quite complicated, so were the social realities behind the laws. In Roman law, slaves were conceptualized as people with no rights. Since they were, in theory, prisoners of war who had been spared execution, they were legally dead anyway.[32] And during the period of the Roman Republic (6th-1st centuries BCE), there was no legal restriction on a master’s treatment of his slaves. But such laws are not very helpful in distinguishing free from slave, since Roman heads of household at that time also enjoyed the theoretical ‘power of life and death’ over every man, woman and child in the family.[33] As the number of slaves in the expanding Roman Empire increased, however, laws were put in place to protect them. Under the emperor Hadrian (d. 138 CE) excessive punishment was forbidden, as was killing a slave without a legal ruling. The emperors Antoninus Pius (d. 161 CE) and later Constantine (d. 337 CE) made it clear that if a master killed his slave in cold blood or by excessive punishment he was guilty of homicide. And in the legal code of the emperor Justinian (d. 565 CE) it was clear that the master’s rights to do violence to his slave were limited to reasonable discipline.[34]

In early America, all thirteen colonies had laws regulating race and slavery, which were occasionally updated. Although ten states in the South had slave codes making it a crime to mistreat slaves, mistreatment was understood in relation to the severity of the disobedience or infringement that the master was punishing. Amputating limbs, castration and execution were all allowed as punishments when the alleged crime was severe. And it was almost impossible for slaves to challenge any treatment in court, since they could not even testify. Nonetheless in North Carolina and Virginia a handful of white slave-owners were executed or imprisoned for murdering or cruelly treating their slaves.[35]

Definitions that Never Seem to Work

As David Davis, a leading scholar of slavery, observed, “The more we learn about slavery, the more difficulty we have defining it.”[36] A trans-historical definition of slavery has indeed proven very hard to find. As a leading scholar of Ottoman slavery has remarked, it is difficult to treat slavery as one definable phenomenon just in the Ottoman Empire, let alone globally (though he stresses that the varieties of slavery in the Ottoman realm differed in degree, not in kind).[37] Nur Sobers-Khan has observed that slavery in Ottoman Istanbul was so diverse that it doesn’t make sense to talk about slavery as a unified phenomenon even in one city, let alone in the whole Mediterranean region.[38] Scholars don’t even agree on where to start. Many historians, proceeding from a Marxist paradigm, have sought to explain slavery as a purely economic phenomenon. Others, especially scholars of slavery in the Islamic world, have stressed that slavery is often much more of a social phenomenon.

Definitions of slavery have tended to revolve around three notions: the slave as a family-less outsider, the slave as property, and the slave as the object of violence.[39] But for a definition to fit all the things that people today commonly associate with slavery, that definition has to be so vague that it’s almost useless. So slavery is “the forced labor of one group by another,” according to some social scientists.[40] Others have suggested that the slave is always an outcast.[41] According to Davis, to apply across human history, slavery can only be defined as extreme social “debasement”; whatever the hierarchy, slaves are always at the bottom.[42]

Some scholars have proposed more specific definitions for slavery as an economic, legal and social condition. One argues that slavery is a mode of exploitation that is uniquely characterized by its means of reproducing itself, namely through political violence or captivity in war.[43]

The most influential, specific definition comes from Orlando Patterson, who defines slavery as always exhibiting three features. First, slavery involves perpetual domination ultimately enforced by violence. Second, slavery involves a state of natal alienation, “the loss of ties of birth in both ascending and descending generations” that preclude making claims of birth or passing them on to one’s children and that cuts the slave off from family and community except as allowed by the masters. They inherit no protection or privilege and can pass none on to their children. Finally, slaves are denied any honor. Slavery is thus defined as the “permanent, violent domination of natally alienated and generally dishonored persons.”[44]

But Patterson’s definition fails to apply to many instances of what we would otherwise think of as slavery. Sometimes it was the slaves who dominated free people, as in the case of the Turkish slave soldiers of the Abbasid caliphs in the ninth and early tenth centuries. Even before the Ottomans began their system of imperial slaves, Egypt and Syria were ruled by the Mamluk (literally, ‘slave’) state (c. 1260-1517). Although they were freed after they finished their military training, the Mamluk dynasty of Turkic or Caucasian warlords reproduced itself generation after generation by importing new slave soldiers into a ruling military elite that defined itself by its military slave experience.[45] Far from being dominated by anyone, they were their own masters and dominated the whole of the state and society. Patterson argues that slave elites in Islamic civilization were still effectively powerless because their fate still hung on the whim of their masters. But the frequency with which Abbasid Turkic slaves, Egyptian Mamluks, and Ottoman Janissaries summarily executed their masters when it suited them strongly suggests otherwise.

Nor have those who identify as slaves always been natally alienated. Byzantine imperial slaves could own property and bequeath it to their children.[46] The Ottoman agricultural slaves settled on imperial lands passed their estates on to their children for generations. Unlike Roman slavery, where the status of a child’s mother determined its status, the main position in the Shariah was that a slave woman who gave birth to her master’s child became free when her master died, as did her child. Until then her master could not sell her. Far from being natally alienated from her child, its status as the child of a freeman ensured the mother’s own freedom. Elite imperial slaves like Sokollu Mehmet Pasha were technically natally alienated in the sense that, according to the letter of the Shariah in Ottoman lands, their wealth reverted back to the treasury (bayt al-mal) upon death. But in reality, when an elite imperial slave like Sokollu died, what transpired was a form of negotiation between state officials and the heirs. Since many of these slaves had amassed – and stashed – immense wealth, it was more efficient for the state to negotiate for a portion of it in return for allowing the heirs to receive the remainder without legal problems.[47] Here the slave’s natal alienation functioned more as an irregular estate tax than a total deprivation of their right to pass on their property to their heirs. There were also other easy means of circumventing the natal alienation of wealth. Like many wealthy citizens of the Ottoman Empire, imperial slaves could place their wealth in endowments (Ar. waqf, pl. awqāf) and make their descendants the beneficiaries.[48]

Furthermore, the children of Ottoman imperial slaves could not pass their wealth on to their children (it reverted back to the imperial treasury upon their death), but their children retained the privileges of their fathers’ proximity to power as well as the status of their mothers. Sokollu Mehmet’s wife was the daughter of the sultan, so his sons attained high office. What is more striking is that, in many cases, Ottoman imperial slaves maintained their relationships to their original families in the Christian areas of the Balkans, using their newfound power to elevate their relatives.[49] Sokollu Mehmet appointed his brother as Orthodox Patriarch in the Balkans, and his cousin later followed him to the office of grand vizier.[50] Later, in the late eighteenth century, the Georgian slave elite in charge of administering the Ottoman province of Egypt maintained close relations to their families back in the Caucasus and even received visits from them.[51]

Sometimes exploiting family connections was one of the major purposes of enslavement. Though technically slaves, Christian Europeans captured by the Ottoman naval forces of Algiers in the eighteenth century were often more like hostages. They could send and receive mail from their families and, if their masters were lucky, their families paid ransoms to free them. In the meantime, they could own property, make money (those assigned to elite jobs like ‘cofeegi’, coffee pourer, might live better than in their home country) and mix freely.[52]

Slavery in Islam – A Political Question

Before delving into how slavery existed in Islam (see next essay), we should note that this is not a question asked in a vacuum. It hasn’t been for well over two centuries. In conversations and debates, the response ‘Well, does that mean slavery would be ok?’ is the ultimate trump card against someone arguing for indulging different values. Slavery is the ideal example to invoke because its evil is so morally clear and so widely acknowledged. Who would defend slavery? It is the Hitler of human practices. Yet despite all its power, the word slavery is rarely defined. In that sense, it is much like the word terrorism: its power lies in the assumptions behind its meaning and the moral condemnation it carries, not in any precision of definition.

Like the word terrorism, slavery is also a deeply, deeply political issue, not in the sense of politics as what we see on the nightly news, but rather in the sense that it is inherently tied to questions of power. Just as the practice of slavery is an extreme exercise of power by some human beings over others, wielding the language of slavery is a claim to moral authority over others. It is no surprise that advocates of ending brutal or unacceptably exploitative labor practices such as sweatshops, child sex trafficking, forced marriage and organ trading refer to such phenomena as ‘modern day slavery.’ The reason for invoking the word ‘slavery’ instead of narrower labels such as bonded labor or child labor is clear: slavery provokes an emotional reaction that spurs people into action and support for a cause. From students to rock stars, who wouldn’t support ending slavery?

Though such practices are indeed reprehensible, with ‘modern day’ slavery we run across some familiar problems. If we took the definitions of slavery used by activists fighting ‘modern slavery’ (the main one being that it is slavery ‘if you can’t walk away’) and applied them to Western history alone, we’d find that almost no one was free by their standards.[53] As some scholars have observed, the most prominent advocates for ending modern day slavery have not applied the label to the forced labor of criminals in the American penal system.[54] This is no doubt a very political choice, since fewer rock stars and students would be willing to accuse the US government of engaging in ongoing slavery. So even when invoked for noble causes today, ‘slavery’ is still a deeply political word, both in the emotional reaction it triggers and in the self-censorship people exercise in choosing when and where to apply it.

The political nature of slavery is particularly pronounced in the history of Islam and the West. During the eighteenth and even nineteenth centuries the fear of being captured by Muslim pirates in the Atlantic and western Mediterranean loomed large in the Western European (particularly British) imagination. And indeed thousands of Britons and Americans were taken as slaves in this way. We still see the cultural imprint of this fear in movies like Never Say Never Again (1983), where James Bond rescues Kim Basinger from a remarkably out-of-place Arab slave auction, and Taken (2008), where Liam Neeson rescues his daughter first from (Muslim) Albanian traffickers and finally from a lascivious Arab sheik. But, like the selective use of the term ‘modern day slavery,’ this conversation is selective in its claim to Western moral authority. During the same era in which Europeans and Americans were decrying capture and enslavement by Muslim pirates, the enslavement by Europeans of Muslims from the Ottoman Empire was booming.[55] And our Western cultural memories are even more selective. Western theatregoers likely felt no outrage in The Spy Who Loved Me (1977) when Bond visits the harem of his Arab sheik friend and is offered one of the women (when in the Orient, says the sheik, “one should delve deeply into its treasures”). From the British tabloids to then-private citizen Donald Trump, many in 2015 parroted the claim that Muslims in northern England were luring young white girls into sex slavery. Some Muslims were indeed doing this, but few media reports stated that the majority of offenders were actually white men.[56]

Conclusion: Focus on the Conditions, not the Word

The word slavery has been political even when it has been invoked for the best of causes. And the political forces that have shaped how slavery is understood have often hobbled the best efforts of those fighting against the extreme exploitation of fellow human beings. Abolitionists in the nineteenth century chose to define slavery as treating human beings as property in part because, if they defined slavery as harsh deprivation or exploitation, their pro-slavery opponents would just point to the factory conditions of industrial England and America and note that ‘free’ workers were being treated just as badly.[57]

Having emphasized that slavery consisted of humans being treated as property, abolitionists were left with no objection to the continued exploitation of the very people they had just freed once it became technically illegal to own them. British abolitionists succeeded in ending slavery in the Indian Ocean in the 1830s. But then they found that laborers were still being transported to East Africa from India in the same horrid conditions as slaves and with the same high mortality rate. They were just called ‘coolies’ rather than slaves.[58] Today, decades after the legal right to own other human beings was abolished globally, activists referred to as new abolitionists, seeking to mobilize public concern over exploitative labor, have redefined slavery as ‘not being able to walk away.’[59]

Ultimately, the word ‘slavery’ can mean so many things that it’s not very useful for accurate communication. It often ends up referring to things we don’t mean when we think of slavery, or it fails to match things we do associate with slavery. As such, the word slavery has limited use as a category or conceptual tool. It’s much more useful to talk about the extreme exploitation of human beings’ labor and the extreme deprivation of their rights. In any society, whether it has ‘slavery’ or not, we are likely to find such conditions. Instead of fixating on a word or ill-defined category, it is much more useful to focus on regulating conditions and protecting people’s rights in order to prevent such extreme debasement. And, as our next essay will show, this is precisely what the Shariah aimed to do.


[1] The Ottoman tradition of elite slavery may have been inherited from the late Roman and Byzantine Empires, where imperial slaves (often eunuchs) could rise to high positions in the military and the administration; Youval Rotman, Byzantine Slavery and the Mediterranean World, trans. Jane Marie Todd (Cambridge, MA: Harvard University Press, 2009), 104; Cam Grey, “Slavery in the Late Roman World,” in The Cambridge World History of Slavery: Volume I The Ancient Mediterranean World, ed. Keith Bradley and Paul Cartledge (Cambridge: Cambridge University Press, 2011), 499.

[2] Pamela Kyle Crossley, “Slavery in Early Modern China,” in The Cambridge World History of Slavery: Volume 3 AD 1420-1804, ed. David Eltis and Stanley Engerman (Cambridge: Cambridge University Press, 2011), 200.

[3] Christoph K. Neumann, “Whom did Ahmet Cevdet represent?,” in Late Ottoman Society, ed. Elisabeth Özdalga (London: Routledge, 2005), 117.

[4] David Brion Davis, Challenging the Boundaries of Slavery (Cambridge, MA: Harvard University Press, 2003), 17-18.

[5] Orlando Patterson, Slavery and Social Death (Cambridge, MA: Harvard University Press, 1982), 22.

[6] Grey, “Slavery in the Late Roman World,” 496; Rotman, Byzantine Slavery, 174-76.

[7] Patterson, Slavery and Social Death, 22.

[8] Crossley, “Slavery in Early Modern China,” 191.

[9] Julia O’Connell Davidson, Modern Slavery: The Margins of Freedom (New York: Palgrave Macmillan, 2015), 162.

[10] Crossley, “Slavery in Early Modern China,” 187.

[11] Vaughan Lowe, International Law: A Very Short Introduction (Oxford: Oxford University Press, 2015), 1.

[12] Here quoting Youval Rotman, Byzantine Slavery, 19.

[13] Rotman, Byzantine Slavery, 17-18.

[14] Rotman, Byzantine Slavery, 97-98.

[15] Joseph C. Miller, The Problem of Slavery as History (New Haven: Yale University Press, 2012), 12.

[16] David Eltis and Stanley Engerman, “Dependence, Servility, and Coerced Labor in Time and Space,” in The Cambridge World History of Slavery Volume 3, 3.

[17] Richard Hellie, “Russian Slavery and Serfdom, 1450-1804,” in The Cambridge World History of Slavery Vol. 3, 276-77.

[18] Cam Grey, “Slavery in the Late Roman World,” 484-6.

[19] Hellie, “Russian Slavery,” 284, 292-93.

[20] Eltis and Engerman, “Dependence, Servility, and Coerced Labor,” 7; Davidson, Modern Slavery, 68. In England this issue was governed by the Statute of Artificers, which the American colonies only adopted in a limited way.

[21] Kerry Ward, “Slavery in Southeast Asia, 1420-1804,” in The Cambridge World History of Slavery Volume 3, 165-66.

[22] Eltis and Engerman, “Dependence, Servility, and Coerced Labor,” 6.

[23] Kenneth Morgan, Slavery and Servitude in Colonial North America (New York: New York University Press, 2000), 8-9, 20; David Galenson, “The Rise and Fall of Indentured Servitude in the Americas: An Economic Analysis,” Journal of Economic History 44, no. 1 (1984): 4.

[24] It was in the Ottoman state’s interest to keep this agricultural system stable; Y. Hakan Erdem, Slavery in the Ottoman Empire and its Demise, 1800-1909 (New York: St. Martin’s Press, 1996), 12-13, 15.

[25] Stanley Engerman, “Slavery at Different Times and Places,” American Historical Review 105, no. 2 (2000): 481.

[26] Crossley, “Slavery in Early Modern China,” 189.

[27] Hellie, “Russian Slavery,” 284, 293.

[28] Hellie, “Russian Slavery,” 279-80. The author notes the similarity between this Russian contract and the ancient Persian custom of antichresis (as named by Greek authors).

[29] Crossley, “Slavery in Early Modern China,” 191.

[30] Ward, “Slavery in Southeast Asia,” 171.

[31] Lawrence M. Friedman, A History of American Law, 2nd ed. (New York: Simon & Schuster, 1985), 225.

[32] W.W. Buckland, The Roman Law of Slavery (New York: AMS, 1969, reprint of 1908 Cambridge U. Press edition), 2-3.

[33] Yan Thomas, “Vitae Necisque Potestas: Le Père, la Cité, la Mort,” Publications de l’École Française de Rome (1984): 499–548.

[34] Buckland, The Roman Law of Slavery, 36-8.

[35] Kenneth Morgan, Slavery and Servitude in Colonial North America, 35, 77; Ira Berlin, Many Thousands Gone: The First Two Centuries of Slavery in North America (Cambridge, MA: Belknap Press, 1998), 116; Paul Finkelman, “Slavery: United States Law,” in Oxford International Encyclopedia of Legal History, 5:258-262; Friedman, A History of American Law, 225-6.

[36] David Brion Davis, Slavery and Human Progress (Oxford: Oxford University Press, 1984), 8.

[37] Ehud Toledano, Slavery and Abolition in the Ottoman Middle East (Seattle: University of Washington Press, 1998), 164-65; Toledano, As if Silent and Absent: Bonds of Enslavement in the Islamic Middle East (New Haven: Yale University Press, 2007), 21.

[38] See also Nur Sobers-Khan, Slaves without Shackles: Forced Labour and Manumission in the Galata Court Registers, 1560-1572 (Berlin: Klaus Schwarz Verlag, 2014).

[39] Martin Klein, “Introduction,” in Breaking the Chains: Slavery, Bondage, and Emancipation in Modern Africa and Asia, ed. Martin Klein (Madison: University of Wisconsin Press, 1993), 4-5.

[40] Rodney Coates, “Slavery” in Blackwell Encyclopedia of Sociology, ed. George Ritzer (Oxford: Blackwell, 2007).

[41] A. Testart, “The Extent and Significance of Debt Slavery,” Revue Française de Sociologie 43 (2002): 176.

[42] Davis, Slavery and Human Progress, 17-19; Brenda Stevenson, What is Slavery? (Malden, MA: Polity, 2015), 8.

[43] Claude Meillassoux, The Anthropology of Slavery (London: Athlone, 1991).

[44] Orlando Patterson, Slavery and Social Death (Cambridge, MA: Harvard University Press, 1982), 7-8, 13.

[45] Nasser Rabbat, “The Changing Concept of the Mamlūk in the Mamluk Sultanate in Egypt and Syria,” in Slave Elites in the Middle East and Africa, ed. Miura Toru and John Edward Philips (London: Kegan Paul, 2000), 89, 97.

[46] Rotman, Byzantine Slavery, 104.

[47] See Ali Yaycıoğlu, “Wealth, Power and Death: Capital Accumulation and Imperial Seizures in the Ottoman Empire (1453-1839)” available at http://www.econ.yale.edu/~egcenter/Yaycioglu%20-%20Wealth%20Death%20and%20Power%20-%20November%202012.pdf.

[48] Leslie Peirce, Morality Tales: Law and Gender in the Ottoman Court of Aintab (Berkeley: University of California Press, 2003), 315; Toledano, As if Silent and Absent, 25; Ebru Boyar and Kate Fleet, A Social History of Ottoman Istanbul (Cambridge: Cambridge University Press, 2010), 147-48.

[49] Dror Ze’evi, “My Slave, My Son, My Lord: Slavery, Family and the State in the Islamic Middle East,” in Slave Elites in the Middle East and Africa, 75. See also Metin Kunt’s short article, “Ethnic-Regional (Cins) Solidarity in the Seventeenth-Century Ottoman Establishment,” International Journal of Middle East Studies 5, no. 3 (1974): 233-39.

[50] G. Veinstein, “Soḳollu Meḥmed Pasha,” in Encyclopaedia of Islam, Second Edition, ed. P. Bearman, Th. Bianquis, C.E. Bosworth, E. van Donzel, and W.P. Heinrichs; first published online 2012; consulted online 21 November 2016.

[51] Daniel Crecelius and Gotcha Djaparidze, “Relations of the Georgian Mamluks of Egypt with Their Homeland in the Last Decades of the Eighteenth Century,” Journal of the Social and Economic History of the Orient 45, no. 3 (2002): 326.

[52] Christine E. Sears, “‘In Algiers, the City of Bondage’: Urban Slavery in Comparative Context,” in New Directions in Slavery Studies, ed. Jeff Forret and Christine E. Sears (Baton Rouge: Louisiana State University Press, 2015), 203, 207, 211.

[53] Julia O’Connell Davidson, Modern Slavery: The Margins of Freedom (New York: Palgrave Macmillan, 2015), 3, 6, 22-23, 37-39, 69, 169.

[54] Davidson, Modern Slavery, 100.

[55] William Clarence-Smith and David Eltis, “White Servitude,” 139, 144.

[56] See also www.thestar.co.uk/news/majority-of-rotherham-child-exploitation-suspects-are-white-claims-new-report-1-7392637.

[57] Davidson, Modern Slavery, 31.

[58] Davidson, Modern Slavery, 33.

[59] Kevin Bales, Understanding Global Slavery (Berkeley: University of California Press, 2005), 52-54.



Disclaimer: The views, opinions, findings, and conclusions expressed in these papers and articles are strictly those of the authors. Furthermore, Yaqeen does not endorse any of the personal views of the authors on any platform. Our team is diverse on all fronts allowing for constant enriching dialogue that helps us produce only the finest research.

Monday, May 29, 2017

Don’t Blame Hillary - WSJ


Friday’s commencement address at Wellesley—an attack on the man who defeated her—is only the latest outburst from a failed candidate, who has now vowed to take a leading position in the anti-Trump “resistance.” On the right these things provoke new headlines about sore loserhood. Far more interesting is the irritation Mrs. Clinton’s refusal to fade away is causing among fellow Democrats who blame her for the loss against what should have been an easily defeatable Republican nominee.

This is supremely unfair to Mrs. Clinton. As flawed a candidate as she might have been, the truth is almost certainly the reverse. It is today’s Democratic Party that gave us Mrs. Clinton, as well as the thumping in November.

Yes, the Clintons have always been flexible about principles, a big reason for the appeal of the more purist Bernie Sanders. Back when her husband was running for president as a “New Democrat” in 1992, the idea was that the party had shed its McGovernite past and moved to the center, so that it could now be trusted on values, the economy and national security. At the time Mr. Clinton advertised his wife as “two for the price of one.”

Once they got in, Mrs. Clinton reverted to type by pushing, unsuccessfully, for universal health care. But after that belly-flop and the 1994 GOP takeover of Congress, they dialed it back, and by 1996 her husband was telling the American people “the era of big government is over.”

As New York’s junior senator, Mrs. Clinton was firmly ensconced within her party. “On the 1,390 votes she cast in which most senators from one party voted differently from most senators across the aisle,” notes an April 2016 piece from Roll Call, “Clinton went against the Democratic grain only 49 times.”

Even on the single issue that came to be used against her in last year’s Democratic presidential primary—her 2002 vote to authorize the use of force in Iraq—Mrs. Clinton was squarely with her party. We’ve forgotten it today, but more Democrats voted with Mrs. Clinton on that one than against, including Harry Reid, John Edwards, Chuck Schumer, Joe Biden and John Kerry. Only a few years later she, again like them, opposed the surge.

So which is she, hawk or dove? The truth is that she is both—and neither. In a notable section of his memoirs, Bob Gates, her fellow Obama cabinet member, relates a conversation in which she admits her opposition to the surge in Iraq “had been political because she was facing [Barack Obama] in the Iowa primary.” Again this only puts her within the mainstream of her party: Most of the other Democrats who had voted for the war in 2002 would also oppose the surge in 2007.

It has been a consistent pattern for Mrs. Clinton. On almost any issue that energizes her party—from same-sex marriage to the Trans-Pacific Partnership trade deal—Mrs. Clinton has gone where the party has pulled her even if it meant going against where she had been. This is what Hollywood actress Rosario Dawson meant last summer when she asked a group of Sanders delegates at the convention to understand that Mrs. Clinton “is not a leader, she’s a follower.”

But on what became the single overriding theme of her campaign, Mrs. Clinton was truly in sync with her party. This is the idea that she should be elected because she’s a woman, and that a coalition of millennials, minorities and women would come together to make it happen. So where Donald Trump had “Make America Great Again,” Mrs. Clinton had the identity project par excellence: “I’m with her.”
After all, who could be more deserving to succeed the first African-American president than the first woman president?

It didn’t turn out that way. And if you take the Trump blinders off for some perspective, there’s another dynamic that had little to do with Mrs. Clinton: the hemorrhaging of Democratic seats over the Obama years—from the governorships to state legislatures to Capitol Hill—to the point where the Democratic Party is now at its lowest levels in a century.

By the time Mrs. Clinton had secured the nomination for president, she had embraced everything a far more progressive party wanted her to embrace. But she also inherited a party that was losing elections all across the country.

So maybe it wasn’t only a flawed messenger that led Democrats to defeat in 2016. Maybe there’s a problem with the message, too.

Saturday, March 11, 2017

Presidential Payback For Media Hubris | Hoover Institution


Sadly, the contemporary mainstream media—the major networks (ABC, CBS, NBC, CNN), the traditional blue-chip newspapers (Washington Post, New York Times), and the public affiliates (NPR, PBS)—have lost credibility. They are no more reliable critics of President Trump’s excesses than they were believable cheerleaders for Barack Obama’s policies.

Trump may have a habit of exaggeration and gratuitous feuding that could cause problems with his presidency. But we would never quite know that from the media. In just his first month in office, reporters have already peddled dozens of fake news stories designed to discredit the President—to such a degree that little they now write or say can be taken at face value.

No, Trump did not have any plans to invade Mexico, as Buzzfeed and the Associated Press alleged.

No, Trump’s father did not run for Mayor of New York by peddling racist television ads, as reported by Sidney Blumenthal.

No, there were not mass resignations at the State Department in protest of its new leaders, as was reported by the Washington Post.

No, Trump’s attorney did not cut a deal with the Russians in Prague. Nor did Trump indulge in sexual escapades in Moscow. Buzzfeed again peddled those fake news stories.

No, a supposedly racist Trump did not remove the bust of Martin Luther King Jr. from the White House, as a Time Magazine reporter claimed.

No, election results in three states were not altered by hackers or computer criminals to give Trump the election, as implied by New York Magazine.

No, Michael Flynn did not tweet that he was a scapegoat. That was a media fantasy endorsed by Nancy Pelosi.

In fact, Daniel Payne of the Federalist has compiled a lengthy list of sensational stories about Trump’s supposed buffooneries, mistakes, and crudities that all proved either outright lies or were gross exaggerations and distortions.

We would like to believe writers for the New York Times or Washington Post when they warn us about the new president’s overreach. But how can we do so when they have lost all credibility—either by colluding with the Obama presidency and the Hillary Clinton campaign, or by creating false narratives to ensure that Trump fails?

Ezra Klein at Vox just wrote a warning about the autocratic tendencies of Donald Trump. Should we believe him? Perhaps not. Klein was the originator of Journolist, a “left-leaning” private online chat room of journalists that was designed to coordinate media narratives that would enhance Democratic politicians and in particular Barack Obama. Such past collusion raises the question of whether Klein is really disinterested now, given that he certainly was not during the Obama administration.

Recently, New York Times White House correspondent Glenn Thrush coauthored a report about initial chaos among the Trump White House staff, replete with unidentified sources. Should we believe Thrush’s largely negative story?

Perhaps. But then again, Thrush not so long ago turned up in the Wikileaks troves as sending a story to Hillary Clinton aide John Podesta for prepublication audit. Thrush was his own honest critic, admitting to Podesta: “Because I have become a hack I will send u the whole section that pertains to u. Please don’t share or tell anyone I did this Tell me if I f**ked up anything.”

Dana Milbank of the Washington Post has become a fierce critic of President Trump. Are his writs accurate? Milbank also appeared in Wikileaks, asking the Democratic National Committee to provide him with free opposition research for a negative column he was writing about candidate Trump. Are Milbank’s latest attacks his own—or once again coordinated with Democratic researchers?

The Washington Post censor Glenn Kessler posted the yarn about Trump’s father’s racist campaign for New York mayor—until he finally fact-checked his own fake news and deleted his tweet.

Sometimes the line between journalism and politicians is no line at all. Recently, former Obama deputy National Security advisor Ben Rhodes (brother of CBS news president David Rhodes) took to Twitter to blast the Trump administration’s opposition to the Iran Deal, brokered in large part by Rhodes himself. “Everything Trump says here,” Rhodes stormed, “is false.”

Should we believe Rhodes’s charges that Trump is now lying about the details of the Iran Deal?

Who knows, given that Rhodes himself not long ago bragged to the New York Times of his role in massaging reporters to reverberate an administration narrative: “We created an echo chamber
 They were saying things that validated what we had given them to say.” Rhodes also had no respect for the very journalists that he had manipulated: “The average reporter we talk to is 27 years old, and their only reporting experience consists of being around political campaigns. That’s a sea change. They literally know nothing.”

Is Rhodes now being disinterested or once again creating an “echo chamber”?

His boss, former UN Ambassador and National Security Advisor in the Obama administration, Susan Rice (married to Ian Cameron, a former producer at ABC news), likewise went on Twitter to blast the Trump administration’s decision to include presidential advisor Steven Bannon in meetings of the National Security Council: “This is stone cold crazy,” Rice asserted, “After a week of crazy.”

Is Rice (who has no military experience) correct that the former naval officer Bannon has no business participating in such high strategy meetings?

In September 2012, Rice went on television on five separate occasions to insist falsely to the nation that the attacks on the Benghazi consulate were the work of spontaneous rioters and not a preplanned hit by an al Qaeda franchise. Her own quite crazy stories proved a convenient administration reelection narrative of Al Qaeda on the run, but there were already sufficient sources available to Rice to contradict her false news talking points.

There are various explanations for the loss of media credibility.

First, the world of New York and Washington DC journalism is incestuous. Reporters share a number of social connections, marriages, and kin relationships with liberal politicians, making independence nearly culturally impossible.

More importantly, the election in 2008 of Barack Obama marked a watershed, when a traditionally liberal media abandoned prior pretenses of objectivity and actively promoted the candidacy and presidency of their preferred candidate. The media practically pronounced him god, the smartest man ever to enter the presidency, and capable of creating electric sensations down the legs of reporters. The supposedly hard-hitting press corps asked Obama questions such as, “During these first 100 days, what has …enchanted you the most from serving in this office? Humbled you the most…?”

Obama, as the first African-American president—along with his progressive politics that were to the left of traditional Democratic policies—enraptured reporters who felt disinterested coverage might endanger what otherwise was a rare and perhaps not-to-be-repeated moment.

We are now in a media arena where there are no rules. The New York Times is no longer any more credible than talk radio; CNN—whose reporters have compared Trump to Hitler and gleefully joked about his plane crashing—should be no more believed than a blogger’s website. Buzzfeed has become like the National Enquirer.

Trump now communicates, often raucously and unfiltered, directly with the American people, to ensure his message is not distorted and massaged by reporters who have a history of doing just that. Unfortunately, it is up to the American people now to audit their own president’s assertions. The problem is not just that the media is often not reliable, but that it is predictably unreliable. It has ceased to exist as an auditor of government. Ironically the media that sacrificed its reputation to glorify Obama and demonize Trump has empowered the new President in a way never quite seen before. At least for now, Trump can say or do almost anything he wishes without media scrutiny—given that reporters have far less credibility than does Trump.

Trump is the media’s Nemesis—payback for its own hubris.


16 Fake News Stories Reporters Have Run Since Trump Won


Journalists, media types, reporters, you have two choices: you can fix these problems, or you can watch your profession go down in flames.
By Daniel Payne
FEBRUARY 6, 2017
Since at least Donald Trump’s election, our media have been in the grip of an astonishing, self-inflicted crisis. Despite Trump’s constant railing against the American press, there is no greater enemy of the American media than the American media. They did this to themselves.

We are in the midst of an epidemic of fake news. There is no better word to describe it than “epidemic,” insofar as it fits the epidemiological model from the Centers for Disease Control: this phenomenon occurs when “an agent and susceptible hosts are present in adequate numbers, and the agent can be effectively conveyed from a source to the susceptible hosts.”

The “agent” in this case is hysteria over Trump’s presidency, and the “susceptible hosts” are a slipshod, reckless, and breathtakingly gullible media class that spread the hysteria around like—well, like a virus.
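
(An aside for the quantitatively minded: the CDC framing Payne borrows is the logic behind the classic SIR, susceptible-infected-recovered, model of epidemiology. Purely as an illustration of the metaphor, and with entirely invented parameters rather than any real sharing data, a toy version in Python looks like this:)

    # A toy SIR model of a viral story. Illustrative only: the population
    # size, contact rate (beta), and loss-of-interest rate (gamma) are
    # invented numbers, not measurements of anything real.
    def simulate(days=60, n=1_000_000, beta=0.6, gamma=0.2):
        s, i, r = n - 1.0, 1.0, 0.0  # susceptible, actively sharing, moved on
        sharers_by_day = []
        for _ in range(days):
            new_sharers = beta * i * s / n   # sharers reach susceptible readers
            lost_interest = gamma * i        # sharers stop sharing
            s -= new_sharers
            i += new_sharers - lost_interest
            r += lost_interest
            sharers_by_day.append(i)
        return sharers_by_day

    print(f"Peak simultaneous sharers: {max(simulate()):,.0f}")

(The only point of the exercise: once the contact rate is high enough relative to the rate at which sharers lose interest, the story saturates the population long before any correction can catch up.)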

It is difficult to adequately sum up the breadth of this epidemic, chiefly because it keeps growing: day after day, even hour after hour, the media continue to broadcast, spread, promulgate, publicize, and promote fake news on an industrial scale. It has become a regular part of our news cycle, not distinct from or extraneous to it but a part of it, embedded within the news apparatus as a spoke is embedded in a bicycle wheel.

Whenever you turn on a news station, visit a news website, or check in on a journalist or media personality on Twitter or Facebook, there is an excellent chance you will be exposed to fake news. It is rapidly becoming an accepted part of the way the American media are run.

How we will get out of this is anyone’s guess. We might not get out of it, not so long as Trump is president of these United States. We may be in for four—maybe eight!—long years of authentic fake news media hysteria. It is worth cataloging at least a small sampling of the hysteria so far. Only when we fully assess the extent of the media’s collapse into ignominious ineptitude can we truly begin to reckon with it.

Since Trump’s election, here’s just a small sampling of fake news that our media and our journalist class have propagated.

Early November: Spike in Transgender Suicide Rates
After Trump’s electoral victory on November 8, rumors began circulating that multiple transgender teenagers had killed themselves in response to the election results. There was no basis to these rumors. Nobody was able to confirm them at the time, and nobody has been able to confirm them in the three months since Trump was elected.

Nevertheless, the claim spread far and wide: Zach Stafford, a Guardian writer and editor-at-large of Out, tweeted the rumor, which was retweeted more than 13,000 times before he deleted it. He later posted a tweet explaining why he had deleted his original viral tweet; his explanatory tweet was shared a total of seven times. Meanwhile, PinkNews writer Dominic Preston wrote a report on the rumors, which garnered more than 12,000 shares on Facebook.

At Mic, Matthew Rodriguez wrote about the unsubstantiated allegations. His article was shared more than 55,000 times on Facebook. Urban legend debunker website Snopes wrote a report on the rumors and listed them as “unconfirmed” (rather than “false”). Snopes’s sources were two Facebook posts, since deleted, that offered no helpful information regarding the location, identity, or circumstances of any of the suicides. The Snopes report was shared 19,000 times.

At Reason, writer Elizabeth Nolan Brown searched multiple online databases to try to determine the identities or even the existence of the allegedly suicidal youth. She found nothing. As she put it: “[T]eenagers in 2016 don’t just die without anyone who knew them so much as mentioning their death online for days afterward.”

She is right. Just the same, the stories hyping this idea garnered nearly 100,000 shares on Facebook alone, contributing to the fear and hysteria surrounding Trump’s win.

November 22: The Tri-State Election Hacking Conspiracy Theory
On November 22, Gabriel Sherman posted a bombshell report at New York Magazine claiming that “a group of prominent computer scientists and election lawyers” were demanding a recount in three separate states because of “persuasive evidence that [the election] results in Wisconsin, Michigan, and Pennsylvania may have been manipulated or hacked.” The evidence? Apparently, “in Wisconsin, Clinton received 7 percent fewer votes in counties that relied on electronic-voting machines compared with counties that used optical scanners and paper ballots.”

The story went stratospherically viral. It was shared more than 145,000 times on Facebook alone. Sherman shared it on his Twitter feed several times, and people retweeted his links to the story nearly 9,000 times. Politico’s Eric Geller shared the story on Twitter as well. His tweet was retweeted just under 8,000 times. Dustin Volz from Reuters shared the link; he was retweeted nearly 2,000 times. MSNBC’s Joy Reid shared the story and was retweeted more than 4,000 times. New York Times opinion columnist Paul Krugman also shared the story and was retweeted about 1,600 times.

It wasn’t until the next day, November 23, that someone threw a little water on the fire. At FiveThirtyEight, Nate Silver explained that it was “demographics, not hacking” that explained the curious voting numbers. “Anyone making allegations of a possible massive electoral hack should provide proof,” he wrote, “and we can’t find any.” Additionally, Silver pointed out that the New York Magazine article had misrepresented the argument of one of the computer scientists in question.

At that point, however, the damage had already been done: Sherman, along with his credulous tweeters and retweeters, had done a great deal to delegitimize the election results. Nobody was even listening to Silver, anyway: his post was shared a mere 380 times on Facebook, or about one-quarter of 1 percent as much as Sherman’s. This is how fake news works: the fake story always goes viral, while nobody reads or even hears about the correction.
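
(Payne’s “one-quarter of 1 percent” figure is easy to verify. A minimal sketch in Python, using only the two Facebook share counts quoted above and nothing else:)

    # Compare the correction's reach with the original story's, using
    # the Facebook share counts quoted above in this article.
    original_shares = 145_000   # Sherman's New York Magazine story
    correction_shares = 380     # Nate Silver's FiveThirtyEight rebuttal
    print(f"Correction reached {correction_shares / original_shares:.2%} "
          "of the original's shares")  # -> 0.26%, about a quarter of 1%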

December 1: The 27-Cent Foreclosure
At Politico on December 1, Lorraine Woellert published a shocking essay claiming that Trump’s pick for secretary of the Treasury, Steve Mnuchin, had overseen a company that “foreclosed on a 90-year-old woman after a 27-cent payment error.” According to Woellert: “After confusion over insurance coverage, a OneWest subsidiary sent [Ossie] Lofton a bill for $423.30. She sent a check for $423. The bank sent another bill, for 30 cents. Lofton, 90, sent a check for three cents. In November 2014, the bank foreclosed.”

The story received widespread coverage, being shared nearly 17,000 times on Facebook. The New York Times’s Steven Rattner shared it on Twitter (1,300 retweets), as did NBC News’s Brad Jaffy (1,200 retweets), the AP’s David Beard (1,900 retweets) and many others.

The problem? The central scandalous claims of Woellert’s article were simply untrue. As the Competitive Enterprise Institute’s Ted Frank pointed out, the woman in question was never foreclosed on, and never lost her home. Moreover, “It wasn’t Mnuchin’s bank that brought the suit.”

Politico eventually corrected these serious and glaring errors. But the damage was done: the story had been repeated by numerous media outlets including Huffington Post (shared 25,000 times on Facebook), the New York Post, Vanity Fair, and many others.

January 20: Nancy Sinatra’s Complaints about the Inaugural Ball
On the day of Trump’s inauguration, CNN claimed Nancy Sinatra was “not happy” with the fact that the president and first lady’s inaugural dance would be to the tune of Frank Sinatra’s “My Way.” The problem? Nancy Sinatra had never said any such thing. CNN later updated the article without explaining the mistake they had made.

January 20: The Nonexistent Climate Change Website ‘Purge’
Also on the day of the inauguration, New York Times writer Coral Davenport published an article on the Times’s website whose headline claimed that the Trump administration had “purged” any “climate change references” from the White House website. Within the article, Davenport acknowledged that the “purge” (or what she also called “online deletions”) was “not unexpected” but rather part of a routine turnover of digital authority between administrations.

To call this action a “purge” was thus the height of intellectual dishonesty: Davenport was styling the whole thing as a kind of digital book-burning rather than a routine part of American government. But of course that was almost surely the point. The inflammatory headline was probably the only thing that most people read of the article, doubtless leading many readers (the article was shared nearly 50,000 times on Facebook) to believe something that simply wasn’t true.

January 20: The Great MLK Jr. Bust Controversy
On January 20, Time reporter Zeke Miller wrote that a bust of Martin Luther King Jr. had been removed from the White House. This caused a flurry of controversy on social media until Miller issued a correction. As Time put it, Miller had apparently not even asked anyone in the White House if the bust had been removed. He simply assumed it had been because “he had looked for it and had not seen it.”

January 20: Betsy DeVos, Grizzly Fighter
During her confirmation hearing, education secretary nominee Betsy DeVos was asked whether schools should be able to have guns on their campuses. As NBC News reported, DeVos felt it was “best left to locales and states to decide.” She pointed out that one school in Wyoming had a fence around it to protect the students from wildlife. “I would imagine,” she said, “that there’s probably a gun in the school to protect from potential grizzlies.”

This was an utterly noncontroversial stance to take. DeVos was simply pointing out that different states and localities have different needs, and attempting to mandate a nationwide one-size-fits-all policy for every American school is imprudent.

How did the media run with it? By lying through their teeth. “Betsy DeVos Says Guns Should Be Allowed in Schools. They Might Be Needed to Shoot Grizzlies” (Slate). “Betsy DeVos: Schools May Need Guns to Fight Off Bears” (The Daily Beast). “Citing grizzlies, education nominee says states should determine school gun policies” (CNN). “Betsy DeVos says guns in schools may be necessary to protect students from grizzly bears” (ThinkProgress.) “Betsy DeVos says guns shouldn’t be banned in schools … because grizzly bears” (Vox). “Betsy DeVos tells Senate hearing she supports guns in schools because of grizzly bears” (The Week). “Trump’s Education Pick Cites ‘Potential Grizzlies’ As A Reason To Have Guns In Schools” (BuzzFeed).

The intellectual dishonesty at play here is hard to overstate. DeVos never said or even intimated that every American school or even very many of them might need to shoot bears. She merely used one school as an example of the necessity of federalism and as-local-as-possible control of the education system.

Rather than report accurately on her stance, these media outlets created a fake news event to smear a reasonable woman’s perfectly reasonable opinion.

January 26: The ‘Resignations’ At the State Department
On January 26, the Washington Post’s Josh Rogin published what seemed to be a bombshell report declaring that “the State Department’s entire senior management team just resigned.” This resignation, according to Rogin, was “part of an ongoing mass exodus of senior Foreign Service officers who don’t want to stick around for the Trump era.” These resignations happened “suddenly” and “unexpectedly.” He styled it as a shocking shake-up of administrative protocol in the State Department, a kind of ad-hoc protest of the Trump administration.

The story immediately went sky-high viral. It was shared nearly 60,000 times on Facebook. Rogin himself tweeted the story out and was retweeted a staggering 11,000 times. Washington Post columnist Anne Applebaum had it retweeted nearly 2,000 times; journalists and writers from Wired, The Guardian, the Washington Post, Bloomberg, ABC, Foreign Policy, and other publications tweeted the story out in shock.

There was just one problem: the story was more or less a load of bunk. As Vox pointed out, the headline of the piece was highly misleading: “the word ‘management’ strongly implied that all of America’s top diplomats were resigning, which was not the case.” (The Post later changed the word “management” to “administrative” without noting the change, although it left the “management” language intact in the article itself).

More importantly, Mark Toner, the acting spokesman for the State Department, put out a press release noting that “As is standard with every transition, the outgoing administration, in coordination with the incoming one, requested all politically appointed officers submit letters of resignation.” According to CNN, the officials were actually asked to leave by the Trump administration rather than stay on for the customary transitional few months. The entire premise of Rogin’s article was essentially nonexistent.

As always, the correction received far less attention than the fake news itself: Vox’s article, for instance, was shared around 9,500 times on Facebook, less than one-sixth the rate of Rogin’s piece. To this day, Rogin’s piece remains uncorrected regarding its faulty presumptions.

January 27: The Photoshopped Hands Affair
On January 27, Observer writer Dana Schwartz tweeted out a screenshot of Trump that, in her eyes, proved President Trump had “photoshopped his hands bigger” for a White House photograph. Her tweet immediately went viral, being shared upwards of 25,000 times. A similar tweet by Disney animator Joaquin Baldwin was shared nearly 9,000 times as well.

The conspiracy theory was eventually debunked, but not before it had been shared thousands upon thousands of times. Meanwhile, Schwartz tweeted that she did “not know for sure whether or not the hands were shopped.” Her correction tweet was shared a grand total of…11 times.

January 29: The Reuters Account Hoax
Following the Quebec City mosque massacre, the Daily Beast published a story that purported to identify the two shooters who had perpetrated the crime. The problem? The story’s source was a Reuters parody account on Twitter. Incredibly, nobody at the Daily Beast thought to check the source to any appreciable degree.

January 31: The White House-SCOTUS Twitter Mistake
Leading up to Trump announcing his first Supreme Court nomination, CNN Senior White House Correspondent Jeff Zeleny announced that the White House was “setting up [the] Supreme Court announcement as a prime-time contest.” He pointed to a pair of recently created “identical Twitter pages” for theoretical Justices Neil Gorsuch and Thomas Hardiman, the two likeliest nominees for the court vacancy.

Zeleny’s sneering tweet—clearly meant to cast the Trump administration in an unflattering, circus-like light—was shared more than 1,100 times on Twitter. About 30 minutes later, however, he tweeted: “The Twitter accounts…were not set up by the White House, I’ve been told.” As always, the admission of mistake was shared far less than the original fake news: Zeleny’s correction was retweeted a paltry 159 times.

January 31: The Big Travel Ban Lie
On January 31, a Fox affiliate station out of Detroit reported that “A local business owner who flew to Iraq to bring his mother back home to the US for medical treatment said she was blocked from returning home under President Trump’s ban on immigration and travel from seven predominately Muslim nations. He said that while she was waiting for approval to fly home, she died from an illness.”

Like most other sensational news incidents, this one took off, big-time: it was shared countless times on Facebook, not just from the original article itself (123,000 shares) but via secondary reporting outlets such as the Huffington Post (nearly 9,000 shares). Credulous reporters and media personalities shared the story on Twitter to the tune of thousands and thousands of retweets, including: Christopher Hooks, Gideon Resnick, Daniel Dale, Sarah Silverman, Blake Hounshell, Brian Beutler, Garance Franke-Ruta, Keith Olbermann (he got 3,600 retweets on that one!), Matthew Yglesias, and Farhad Manjoo.

The story spread so far because it gratified all the biases of the liberal media elite: it proved that Trump’s “Muslim ban” was an evil, racist Hitler-esque mother-killer of an executive order.

There was just one problem: it was a lie. The man had lied about when his mother died. The Fox affiliate hadn’t bothered to do the necessary research to confirm or disprove the man’s account. The news station quietly corrected the story after giving rise to such wild, industrial-scale hysteria.

February 1: POTUS Threatens to Invade Mexico
On February 1, Yahoo News published an Associated Press report about a phone call President Trump shared with Mexican president Enrique Pena Nieto. The report strongly implied that President Trump was considering “send[ing] U.S. troops” to curb Mexico’s “bad hombre” problem, although it acknowledged that the Mexican government disagreed with that interpretation. The White House later re-affirmed that Trump did not have any plan to “invade Mexico.”

Nevertheless, Jon Passantino, the deputy news director of BuzzFeed, shared this story on Twitter with the exclamation “WOW.” He was retweeted 2,700 times. Jon Favreau, a former speechwriter for Barack Obama, also shared the story, declaring: “I’m sorry, did our president just threaten to invade Mexico today??” Favreau was retweeted more than 8,000 times.

Meanwhile, the Yahoo News AP post was shared more than 17,000 times on Facebook; Time’s post of the misleading report was shared more than 66,000 times; ABC News posted the story and it was shared more than 20,000 times. On Twitter, the report—with the false implication that Trump’s comment was serious—was shared by media types such as ThinkProgress’s Judd Legum, the BBC’s Anthony Zurcher, Vox’s Matt Yglesias, Politico’s Shane Goldmacher, comedian Michael Ian Black, and many others.

February 2: Easing the Russian Sanctions
Last week, NBC News national correspondent Peter Alexander tweeted out the following: “BREAKING: US Treasury Dept easing Obama admin sanctions to allow companies to do transactions with Russia’s FSB, successor org to KGB.” His tweet immediately went viral, as it implied that the Trump administration was cozying up to Russia.

A short while later, Alexander posted another tweet: “Source familiar [with] sanctions says it’s a technical fix, planned under Obama, to avoid unintended consequences of cybersanctions.” As of this writing, Alexander’s fake news tweet has approximately 6,500 retweets; his clarifying tweet has fewer than 250.

At CNBC, Jacob Pramuk styled the change this way: “Trump administration modifies sanctions against Russian intelligence service.” The article makes it clear that, per Alexander’s source, “the change was a technical fix that was planned under Obama.” Nonetheless, the onus was placed on the Trump administration. CBS News wrote the story up in the same way. So did the New York Daily News.

In the end, unable to pin this (rather unremarkable) policy tweak on the Trump administration, the media have mostly moved on. As the Chicago Tribune put it, the whole affair was yet again an example of how “in the hyperactive Age of Trump, something that initially appeared to be a major change in policy turned into a nothing-burger.”

February 2: Renaming Black History Month
At the start of February, which is Black History Month in the United States, Trump proclaimed the month “National African American History Month.” Many outlets tried to spin the story in a bizarre way: TMZ claimed that a “senior administration official” said that Trump believed the term “black” to be outdated. “Every U.S. president since 1976 has designated February as Black History Month,” wrote TMZ. BET wrote the same thing.

The problem? It’s just not true. President Obama, for example, declared February “National African American History Month” as well. TMZ quickly updated their piece to fix their embarrassing error.

February 2: The House of Representatives’ Gun Control Measures
On February 2, the Associated Press touched off a political and media firestorm by tweeting: “BREAKING: House votes to roll back Obama rule on background checks for gun ownership.” The AP was retweeted a staggering 12,000 times.

The headlines that followed were legion: “House votes to rescind Obama gun background check rule” (Kyle Cheney, Politico); “House GOP aims to scrap Obama rule on gun background checks” (CNBC); “House scraps background check regulation” (Yahoo News); “House rolls back Obama gun background check rule” (CNN); “House votes to roll back Obama rule on background checks for gun ownership” (Washington Post).

Some headlines were more specific about the actual House vote but no less misleading: “House votes to end rule that prevents people with mental illness from buying guns” (the Independent); “Congress ends background checks for some gun buyers with mental illness” (the Pittsburgh Post-Gazette); “House Votes to Overturn Obama Rule Restricting Gun Sales to the Severely Mentally Ill” (NPR).

The hysteria was far-reaching and frenetic. As you might have guessed, all of it was baseless. The House was actually voting to repeal a narrowly tailored rule from the Obama era. This rule mandated that the names of certain individuals who receive Social Security Disability Insurance and Supplemental Security Income and who use a representative to help manage these benefits due to a mental impairment be forwarded to the National Instant Criminal Background Check System.

If that sounds confusing, it essentially means that if someone who receives SSDI or SSI needs a third party to manage these benefits due to some sort of mental handicap, then—under the Obama rule—they may have been barred from purchasing a firearm. (It is thus incredibly misleading to suggest that the rule applied in some specific way to the “severely mentally ill.”)

As National Review’s Charlie Cooke pointed out, the Obama rule was opposed by the American Association of People With Disabilities; the ACLU; the Arc of the United States; the Autistic Self-Advocacy Network; the Consortium of Citizens With Disabilities; the National Coalition of Mental Health Recovery; and many, many other disability advocacy organizations and networks.

The media hysteria surrounding the repeal of this rule—the wildly misleading and deceitful headlines, the confused outrage over a vote that nobody understood—was a public disservice.

As Cooke wrote: “It is a rare day indeed on which the NRA, the GOP, the ACLU, and America’s mental health groups find themselves in agreement on a question of public policy, but when it happens it should at the very least prompt Americans to ask, ‘Why?’ That so many mainstream outlets tried to cheat them of the opportunity does not bode well for the future.”

Maybe It’s Time to Stop Reading Fake News
Surely more incidents have happened since Trump was elected; doubtlessly there are many more to come. To be sure, some of these incidents are larger and more shameful than others, and some are smaller and more mundane.

But all of them, taken as a group, raise a pressing and important question: why is this happening? Why are our media so regularly and so profoundly debasing and beclowning themselves, lying to the public and sullying our national discourse—sometimes on a daily basis? How has it come to this point?

Perhaps the answer is: “We’ve let it.” The media will not stop behaving in so reckless a manner unless and until we demand they stop.

That being said, there are two possible outcomes to this fake news crisis: our media can get better, or they can get worse. If they get better, we might actually see our press begin to hold the Trump administration (and government in general) genuinely accountable for its many admitted faults. If they refuse to fix these serial problems of gullibility, credulity, outrage, and outright lying, then we will be in for a rough four years, if not more.

No one single person can fix this problem. It has to be a cultural change, a kind of shifting of priorities industry-wide. Journalists, media types, reporters, you have two choices: you can fix these problems, or you can watch your profession go down in flames.

Most of us are hoping devoutly for the former. But not even a month into the presidency of Donald J. Trump, the outlook is dim.