Bullfighting is a controversial sport; even within Spain, few people still follow it. But columnists from Madrid to Malaga bristled at the suggestion that a federally recognized piece of heritage could be branded offensive.
“Facebook equates bullfighting with prostitution,” declared ABC, the country’s third-largest newspaper, on Jan. 14. Days later, when Facebook inevitably backtracked and deleted its references to bullfighting — clarifying, in a statement to The Post, that it had been included mistakenly — Spain’s second-largest paper, El Mundo, rejoiced that the network had “rectified” the situation.
But unfortunately for the suits at Facebook, who had suffered considerable headaches over the bullfighting mess, that episode was just the latest in a string of unintended culture clashes as inevitable as they are unending. As Facebook has tentacled out from Menlo Park, Calif., gaining control of an ever-larger slice of the global commons, the network has found itself in a tenuous and culturally awkward position: how to determine a single standard of what is and is not acceptable — and apply it uniformly, from Maui to Morocco.
For Facebook and other platforms like it, incidents such as the bullfighting kerfuffle betray a larger, existential difficulty: How can you possibly impose a single moral framework on a vast and varying patchwork of global communities?
If you ask Facebook this question, the social-media behemoth will deny doing any such thing. Facebook says its community standards are neutral, universal, agnostic to place and time. The site doesn’t advance any worldview, it claims, besides the uncontroversial opinion that people should “connect” online.
“Every day, people come to Facebook to connect with people and issues they care about,” a spokeswoman said in a statement. “Given the diversity of the Facebook community, this means that sometimes people share information that is controversial or offends others. That’s why we have a set of global Community Standards that explain what you can and cannot do on our service. . . We work hard to strike the right balance between enabling expression while providing a safe and respectful experience.”
Facebook has modified its standards several times in response to pressure from advocacy groups — although the site has deliberately obscured those edits, and the process by which Facebook determines its guidelines remains stubbornly opaque. On top of that, at least some of the low-level contract workers who enforce Facebook’s rules are embedded in the region — or at least the time zone — whose content they moderate. The social network staffs its moderation team in 24 languages, 24 hours a day.
In response to recent criticism that Facebook has mishandled takedown requests from users in the Middle East, Facebook’s policy director for the region assured users that “all reports are assessed by teams of multilingual, impartial and highly trained people” — including native speakers of Hebrew and Arabic, who presumably understand the region’s particular issues.
And yet, observers remain deeply skeptical of Facebook’s claims that it is somehow value-neutral or globally inclusive, or that its guiding principles are solely “respect” and “safety.” There’s no doubt, said Tarleton Gillespie, a principal researcher at Microsoft Research in New England, that the company advances a specific moral framework — one that is less of the world than of the United States, and less of the United States than of Silicon Valley.
If you study Facebook’s community standards, going back to the long-forgotten time when users voted on a version of them, the site has always erred on the side of radical free speech, corporate opacity and a certain American prudishness: Its values are those of the early Web, moderated by capitalist conservatism.
The values that Facebook articulates are not always the ones it enforces. Below that top-level standard are the unknown thousands of invisible click-workers forced to interpret it, and below them are the self-deputized users flagging their friends’ content. Between the site’s demonstrable U.S. orientation and the layers of obfuscation below, there can be little doubt that the values Facebook ends up imposing on its “community” of 1.55 billion people are not shared by many — perhaps even most — of them.
Somehow, it seems that we only notice the imposition when there’s a glitch in the machine: I can’t use a tribal name on Facebook? The site maligned bullfighting? Why, how dare this private company impose its worldview on me!
This is not merely a problem for Facebook; Gillespie, the Microsoft researcher, calls it the unsolvable “basic paradox” of all Internet companies: They’re private and have their own corporate motives, but they’re called upon to police public speech. Alas, as their public grows more diverse, the worldviews of the “community” and its corporate sponsor would appear to align less and less. As of 2013, eight of the world’s 10 top Web properties were based in the United States — and 81 percent of their users were located outside of it. (If nothing else, there’s a compelling statistical reason why Google, Amazon.com, Facebook and Apple, collectively acronymed “GAFA,” have been called the new face of “American cultural imperialism.”)
Facebook will never make everyone happy, of course; nor does anyone suggest it should. But in a better world, the largest social network would at least admit that it’s not an impartial, value-neutral observer. After all, every single thing Facebook does — from advancing a single global “community” to adding six extra words to a dialogue box — reshapes the public space of its users.
“The myth of the social network as a neutral space is crumbling, but it’s still very powerful,” Gillespie said. “For Facebook to finally say, ‘Yes, we construct social life online. We construct public discourse’ — that would be so important, but for them, dangerous.”