Update: During Facebook founder Mark Zuckerberg's appearance before a joint meeting of two Senate panels on Tuesday, he was asked about his company's alleged role in spreading hate speech against Rohingya Muslims in Burma.
Sen. Patrick J. Leahy (D-Vt.) raised the issue of death threats against a Burmese journalist that spread on Facebook in 2016 and were highlighted by Burmese groups. As an aide held up an example of the threats on a board, Leahy said it took “attempt after attempt after attempt” and the involvement of civil society groups to remove the threats.
“What's happening in Myanmar is a terrible tragedy, and we need to do more,” Zuckerberg answered, using another name for Burma. He said Facebook was hiring “dozens” more Burmese-language content reviewers to look for hate speech, as well as working with civil society to identify “specific hate figures” who should be banned and working with its product teams on further technical solutions to the problems.
Ahead of Facebook founder Mark Zuckerberg's appearance on Capitol Hill this week, a number of civic organizations and human rights groups in Burma shared a detailed presentation with U.S. lawmakers on how the social network helped spread hate speech in the country.
The presentation, sent to key American lawmakers who will be questioning Zuckerberg on Tuesday and Wednesday, highlights a number of examples of what the groups describe as negligence by Facebook that helped propel violence in the Southeast Asian nation.
It features a video shared by a nationalist monk on Facebook in January 2016 that provocatively reenacted the controversial rape and murder of a Burmese woman. It took three days for the video to be removed, the organizations say, although it was “clearly designed to ignite further tensions between Buddhist and Muslim communities.”
Before being removed from Facebook, the video had been viewed more than 120,000 times, according to the presentation. Even after the original video was taken down, the groups found copies of it in at least eight places on Facebook.
In other examples, the groups pointed to a November 2016 post on Facebook that described a respected journalist working for international media as “a terrorist” and called for him to be killed. The post was not found to violate the social network's community standards, but it was later removed when Facebook representatives were contacted directly, according to the presentation.
The groups also cited the Facebook page of nationalist Buddhist monk U Wirathu. His page had been repeatedly reported over the years for spreading inflammatory content, but it kept returning.
In a letter to U.S. lawmakers, the organizations emphasized that Burma should be a key priority in the questioning of Facebook because “lives are literally on the line.” They pointed to a recent statement from United Nations investigators that concluded Facebook had played a “determining role” in alleged ethnic cleansing in the country.
In a phone call, Htaike Htaike Aung, director of the Myanmar ICT for Development Organization, said that U.S. pressure was necessary after years of promises for action from Facebook. “After these endless promises, we believe that the U.S. Congress can really help force Facebook to do something,” she said.
The groups suggested four questions for senators to ask Zuckerberg, included verbatim below:
- What concrete actions are you going to take to ensure that content that clearly breaches your own Community Standards is immediately removed?
- Countries such as Germany require Facebook to remove hate speech within 24 hours or face significant fines. Will you commit to removing all incitements to violence in Myanmar within 24 hours?
- Your emergency escalation processes clearly do not work. Will you put in place clear, easily accessible emergency escalation processes in all relevant languages?
- There is extensive misuse of your platform by bad actors, yet even serial offenders very rarely face closure of their account. How do you decide who is taken down, and what concrete actions are you going to take to remove users who breach your Community Standards?
Zuckerberg's appearance on Capitol Hill this week comes as the American social media giant reels from the alleged misuse of its platform for political influence in the 2016 U.S. election. Last month, it was revealed that a political consultancy firm hired by the Donald Trump campaign and other Republicans had improperly gained access to the data of 87 million Facebook users, including 71 million Americans.
In planned testimony released this week ahead of Wednesday's appearance before the House Energy and Commerce Committee, Zuckerberg is expected to personally apologize for the misuse of the platform he created. “I started Facebook, I run it, and I’m responsible for what happens here,” the Facebook executive plans to tell lawmakers.
The testimony is expected to deal almost exclusively with issues surrounding Cambridge Analytica, the political consultancy firm, and alleged Russian interference in Western elections. Facebook's role in smaller countries such as Burma is not directly addressed.
Burma represents only a tiny slice of Facebook's giant global business. However, the social network has played a powerful role in Burmese society as the country opened up to the world after years of isolation. One study found that 38 percent of Facebook users in Burma got most, perhaps even all, of their news from the site.
The U.S. social media giant has been accused of spreading “fake news” that inflamed ethnic tensions between the country's Buddhist majority and its minority Muslim populations. In 2017, a military crackdown caused more than 600,000 mostly Muslim Rohingya in the country's Rakhine State to flee across the border into Bangladesh — and left an unknown number of people dead.
In an email, the founder of one of the Burmese organizations said they have been corresponding with a number of senators' offices about Zuckerberg's comments. “We would therefore be very surprised if one or more senators did not ask about Myanmar,” said David Madden of Phandeeyar, a leading tech hub that has worked with Facebook to draft its community standards for the Burmese language.
Sen. John Neely Kennedy (R-La.) of the Senate Judiciary Committee told CBS News last week that he wants Facebook to “stop people from running advertisements on Facebook that encourages the genocide of the Rohingya Muslims in Burma.”
Leahy, a long-serving Democrat on the Judiciary Committee, quizzed Facebook's general counsel last October about its role in Burma. “You have a great responsibility,” Leahy said. “Not only can elections be swayed by people who are not favorable to the United States, but people can die.”
Zuckerberg has acknowledged some of the criticisms from groups such as Phandeeyar. The Facebook CEO responded with a personal message to the groups after they published an open letter that accused him of misrepresenting Facebook's success in combating hate speech in Burma during a recent media interview. A spokeswoman for the groups later told BuzzFeed News that the response was “grossly insufficient.”
Although the situation in Burma is especially fraught, critics in other non-Western countries have argued that Facebook turns a blind eye to the misuse of its platform. On Monday, a number of activists and independent media professionals in Vietnam released their own open letter to Zuckerberg, pointing to account suspensions in the country.
“Without a nuanced approach, Facebook risks enabling and being complicit in government censorship,” the Vietnamese groups said.
Htaike Htaike said it was clear that concerns about Facebook are a global problem. Its actions in the United States and Europe may have drawn most of the attention, she said, but many people in other countries are asking: “What about us?”