Facebook announced Friday that it would begin to consider the newsworthiness and public interest of difficult or graphic content before censoring it for being in violation of its site rules, a process that has also long been a crucial editorial function of news organizations.
Founder Mark Zuckerberg recently said that Facebook is “a tech company, not a media company.” When Facebook censored, and then later reinstated, Nick Ut’s famous Vietnam War photograph of a naked girl fleeing a napalm attack, it demonstrated just how hard it will be for the company to keep the two things entirely separate.
Ut’s photograph is an iconic representation of the horrors of war. As a tech company, Facebook saw the photograph as a violation of its rules against nudity. The change on Friday — which will open up the company to weighing its rules and the importance of the image together — could be read as an implicit acknowledgment that decisions like this will no longer be so simple for the platform.
While the company has no plans to exist as a traditional part of the media, Facebook is, in many ways, already very much a part of the media industry.
A month ago, Norway’s largest newspaper — and even the country’s prime minister — loudly criticized Facebook for removing Ut’s image from the page of a well-known Norwegian author. Espen Egil Hansen, the editor of Norway’s Aftenposten newspaper, called Zuckerberg the “world’s most powerful editor” in a front-page open letter to the company. Hansen wrote that the decisions newsrooms make about the newsworthiness of difficult or graphic images such as Ut’s “should not be undermined by algorithms encoded in your office in California.”
Forty-four percent of the general population in the United States says it gets its news from Facebook, according to a recent Pew study. Facebook’s Friday announcement suggests that the company will now play a more active role in making sure that the newsworthiness of the images and videos posted there is not undermined by the rules it designed to protect its users from potentially offensive content.
To do that, Facebook VPs Joel Kaplan and Justin Osofsky wrote on Friday that the company will “work with our community and partners” over the coming weeks to figure out “new tools and approaches to enforcement” for its community standards.
“Our intent is to allow more images and stories without posing safety risks or showing graphic images to minors and others who do not want to see them,” the announcement explains. The brief post provides no specific information on how Facebook will start to make these determinations. The company did say it would seek input from “experts, publishers, journalists, photographers, law enforcement officials and safety advocates” in doing so, however.
Facebook initially defended its censorship of Ut’s photo in early September but reversed the decision after international outrage, citing the photograph’s historical importance. But the Vietnam War photo hardly marks the only time that Facebook has faced criticism for its handling of newsworthy but difficult content on its site.
Some other recent examples:
- In March, Facebook suspended the accounts of users who shared an article about Australian Aboriginal feminism, because the article’s share image was a photograph of Aboriginal women in traditional dress, including ceremonially painted bare breasts.
- In July, Facebook briefly removed Diamond Reynolds’s Facebook Live video that showed the dying moments of Philando Castile after he was shot by a Minnesota police officer during a traffic stop. The video was reinstated with a graphic-content warning, and Facebook explained the video’s brief disappearance as a “technical glitch.”
- In August, law enforcement successfully petitioned Facebook to disable the social-media accounts of Korryn Gaines, who was live-streaming her armed standoff with Baltimore County police. Gaines was later killed, and her young son injured, in a shootout with law enforcement.
- In September, activists claimed that Facebook had temporarily censored a live stream of a mass arrest during a Dakota Access pipeline protest. A Facebook spokesperson told Motherboard that the video was mistakenly removed as a result of an error with its automatic spam filters.
Separately, Facebook has faced criticism over the accuracy and presentation of the news stories and topics it surfaces as “trending” on the site, an issue The Intersect is currently exploring in depth.