Facebook will start removing misleading and inflammatory posts that may trigger violent attacks, the social network said Wednesday, as it faces criticism over its response to sectarian conflict in countries such as Myanmar and Sri Lanka.
Facebook said that a new policy will cover misinformation shared on the platform to instigate or amplify violence. The policy applies to written posts and manipulated images. Civil-society groups and threat-intelligence agencies are among the partners that Facebook said will help the company flag incendiary posts and review their potential impact. Facebook said that its local and international partners must confirm that flagged information is false and show that the material could contribute to imminent violence. Once a threat is confirmed, Facebook said it will remove the content and take down similar posts.
The announcement came as chief executive Mark Zuckerberg sought to clarify his recent remarks that people who deny the Holocaust may do so in good faith, according to an interview with Recode. Zuckerberg later said, “I personally find Holocaust denial deeply offensive, and I absolutely didn’t intend to defend the intent of people who deny that.”
In the interview with Recode, Zuckerberg said that Facebook sees a substantial difference between false information and the type of false information that can result in physical harm. While Facebook won’t ban Infowars, a prominent right-wing outlet known for spreading conspiracy theories, the social network will take down posts that may lead to violence, Zuckerberg said. He pointed to Myanmar and Sri Lanka, where social media may have contributed to deadly sectarian conflict, according to United Nations and government officials.
“Reducing the distribution of misinformation — rather than removing it outright — strikes the right balance between free expression and a safe and authentic community,” Facebook said in a statement Wednesday. The company added that the policy change would allow it to take down posts that contribute to physical harm.
The new policy was first enacted last month in Sri Lanka, the company said. Facebook removed content that falsely claimed Muslims were poisoning food given to Buddhists; similar posts had recently contributed to violence in the country. Government officials there grew so concerned about the inflammatory posts that Sri Lanka temporarily blocked access to Facebook earlier this year to stem sectarian violence.
Facebook said the policy change will roll out in the coming months.