Four years ago, a French schoolteacher tried to post a picture of Gustave Courbet’s “L’Origine du Monde” to his Facebook friends. The painting is world-renowned and hangs in the Musée d’Orsay, but because it features some close-up lady parts, Facebook’s moderators torpedoed it.
The teacher, affronted by Facebook’s decision — and apparently quite defensive of Courbet, his fellow Frenchman — actually took Facebook to court over its choice to take “L’Origine” down. Not an American court, mind you, with its laissez-faire attitude toward content moderation and its cool outlook on anything approaching “obscenity.” He sued in a French court. A Parisian court. A court that Facebook’s terms of service say, in no uncertain terms, it absolutely cannot be sued in.
On Thursday, the court nevertheless agreed to hear the case — an upset that some observers have already called a win.
See, regardless of your opinion on art or nudity or Courbet, the case has big implications for American social media companies and their moderation policies. By saying it has jurisdiction over this issue, the French court is saying Facebook has to follow French laws. And not just on practical matters of takedown requests or court orders, but on stuff as granular as a nude painting — inherently moral, and relative, calls.
This is troubling for many reasons. (For starters, it opens Facebook up to a whole lot of French lawsuits.) But given that the teacher who posted Courbet’s painting is litigating for reasons of censorship and free speech, it seems particularly ill-advised to kick the issue of moral policing up to the state.
Facebook’s own policing is infamously imperfect. Just last week, the site suspended New York art critic Jerry Saltz over the “offensiveness” of a few medieval paintings. Previously, Facebook has censored work from institutions as diverse as the New York Academy of Art and the Centre Pompidou. (We won’t even get into non-artistic representations of nudity or gore or violence, which come with their own little trails of controversies.)
Facebook has stated “community guidelines” to govern these kinds of things. But the guidelines are enforced by thousands of individual moderators, which means their strong suit is often not consistency. And the guidelines themselves come off as puritanical, to some: an attempt to accommodate a billion-odd users who are more offendable than your average French sophisticate. As Facebook has grown, those poles have only grown further apart. The writer Adrian Chen calls it “the Grandma Problem” — the impossible paradox of running a site that increasingly must appeal to both reckless, f-bomb-spouting teens and their kindly, conservative grandparents.
Saltz, the New York art critic, was apparently a victim of “the Grandma Problem,” too, reported to Facebook by some of his more censorious followers. “Over the last bunch of months, I’d run afoul of art-world conservatism and moralism,” he complained.
No one can ever be totally happy under this arrangement, of course. It is the uneasy compromise at the heart of every social network: Facebook basically has to pick, and then enforce, the moral code that seems to tick off the fewest people. Naturally, the code is vague: Of nudity, Facebook says, “we aspire to respect people’s right to share content of personal importance.”
One practical translation, as of 2012: no “private parts,” no “sexual activity,” no nude cartoons — “art nudity is okay.” There’s no exact definition for “art nudity.” And Facebook declined to share the guidelines its moderators use now, or to otherwise elaborate.
But as arbitrary and unfair as this system may seem, the French case would only open up Facebook users to more uncertainty. As things stand now, Facebook is basically the last authority on the subject of its own moderation. It’s headquartered in Silicon Valley and, under U.S. law, has more or less free rein over what people on the site are allowed to do/say.
Maybe that’s frustrating in a progressive, Western country like France, where a close-up of genitalia is (apparently?) NBD. But in much of the world, the fact that Facebook has the final say over content moderation is critically important to dissent and free speech. The alternate vision that our Courbet-posting friend has pitched is seductive, maybe, to those who think Facebook is too powerful, or conservative, or something else entirely. But calling on local governments to police the site’s policing seems, on balance, even more risky.
To be clear, the Paris court has only agreed to hear this Courbet case; the trial doesn’t start until May, and the court could very well end up ruling for Facebook. But as the plaintiff’s lawyer gleefully told the media last week, even winning this jurisdictional battle sets a powerful precedent.
Perhaps all this turmoil will persuade Facebook to overhaul moderation internally. The American Civil Liberties Union has called on the site to reform how it handles appeals — that way users have a means to provide critical context and correct moderators who make mistakes. There have also long been calls to get Facebook to standardize its moderation processes. Maybe, one writer suggested, tongue-in-cheek, the nudity censors could take a quick class in art theory.
Saltz, for his part, has a simpler solution: If someone’s Facebook page offends you, why don’t you just unfollow it?
Alas, outraged Internet users rarely respond to such elegant logic.