Meta’s independent Oversight Board announced Wednesday that it had overturned the social media giant’s decision earlier this year to remove a Facebook post that likened Russian soldiers who invaded Ukraine to Nazis.
“In this context, neither Meta’s human rights responsibilities nor its Hate Speech Community Standard protect soldiers from claims of egregious wrongdoing or prevent provocative comparisons between their actions and past events,” the board wrote in its ruling.
The ruling represented a rare rebuke of Meta’s ongoing response to the flood of posts about the war in Ukraine on its social networks. The Oversight Board, a group of academics, experts and lawyers who review the company’s content moderation decisions, had previously expressed interest in weighing in on Meta’s evolving policies regarding content about Russia’s invasion of Ukraine but was rebuffed.
Under the rules, Facebook and its users can ask the Oversight Board to review cases in which the company has taken down posts for violating its community standards, the rules it imposes against hate speech, harassment and other problematic content. The decisions the board makes in these cases are considered binding.
The Oversight Board can also issue policy recommendations for changes to the company’s content moderation practices, but those are not considered binding.
The board was examining Facebook’s initial decision to remove an April post in which a Latvian user shared an image of what appeared to be the dead body of someone shot in Bucha, Ukraine. Along with the photo, which showed no visible wounds, the user compared World War II and the invasion of Ukraine and argued that the Russian army “became fascist.”
The user said the Russian army in Ukraine “rape[s] girls, wound[s] their fathers, torture[s] and kill[s] peaceful people” and that Ukrainians may also want to repeat such actions. The post included excerpts of the poem “Kill Him!” by Soviet poet Konstantin Simonov.
Meta removed the post for violating its rules against hate speech, which bar users from posting “dehumanizing” content about people or groups on the basis of sensitive characteristics such as race or sexual orientation. The user appealed Meta’s decision to the Oversight Board. After the board selected the case, Meta rescinded its previous decision and restored the post. Later, the company applied a warning screen to the photograph, alerting users that the content might be violent or graphic.
But the board argued that the user wasn’t making a general accusation that “Russian soldiers are Nazis” but rather drawing historical parallels based on the soldiers’ behavior in a particular time and place.
“The post also targets Russian soldiers because of their role as combatants, not because of their nationality,” the Oversight Board wrote.
In effect, the board’s ruling overturned Meta’s decision to put a warning screen on the post. The company said in a statement that it will review other posts with identical content to determine whether to take action.
The Oversight Board added that the excerpted poem was being deployed as a rhetorical device to warn about cycles of violence rather than to explicitly encourage such violence.
So far, the board has ruled on a wide range of cases, including whether Facebook should have suspended the account of former president Donald Trump in the wake of the Jan. 6, 2021, rioting at the Capitol.
Earlier this year, Meta decided to allow some calls for violence against Russian invaders, creating an unusual exception to its long-standing hate speech rules prohibiting such language. Meta President of Global Affairs Nick Clegg wrote in an internal post that the company would ask the Oversight Board to review the guidance it had issued to moderators, according to a copy of the post viewed by The Washington Post.
Later, Meta withdrew its request for the Oversight Board to review its approach to content about the war, citing “ongoing safety and security concerns.” That prompted criticism from the Oversight Board.
“While the Board understands these concerns, we believe the request raises important issues and are disappointed by the company’s decision to withdraw it,” the board said in a statement at the time.