Facebook has been accused of spreading “fake news” that inflamed ethnic tensions between Burma's Buddhist majority and its minority Muslim populations. (Kwaw Phyo Hein/AFP/Getty Images)

Earlier this week, Facebook founder Mark Zuckerberg told Vox that his company was well aware of criticism that the social media platform has been used to spread misinformation and hate speech in Burma, explaining that the issue has “gotten a lot of focus inside the company.”

Describing sensational messages that were inciting violence on the platform, the Facebook founder suggested that a mechanism had kicked in to stop them from spreading. “Now, in that case, our systems detect that that’s going on,” Zuckerberg said. “We stop those messages from going through.”

However, according to a group of activists and civic organizations that work in Burma, that's wrong. In fact, these groups argued in an open letter released Thursday, the case Zuckerberg highlighted showed exactly what was problematic about Facebook's view of misinformation and hate speech in the country.

“From where we stand, this case exemplifies the very opposite of effective moderation: It reveals an overreliance on third parties, a lack of a proper mechanism for emergency escalation, a reticence to engage local stakeholders around systemic solutions and a lack of transparency,” the organizations wrote.

Worse still, the letter argued, this was not an isolated incident. “This case further epitomizes the kind of issues that have been rife on Facebook in Myanmar for more than four years now and the inadequate response of the Facebook team,” it continued, using an alternative name for Burma.

Six organizations signed the letter, including Phandeeyar, a leading tech hub that had worked with Facebook to draft its community standards for the Burmese language. They said that Facebook took action against the messages Zuckerberg referred to only after Burmese organizations sent the company an email about them.

In a statement, a Facebook spokesman said these groups have helped the company combat content that spread hatred and violence in recent years. “We are sorry that Mark did not make clearer that it was the civil society groups in Myanmar who first reported these messages,” the statement said. “We took their reports very seriously and immediately investigated ways to help prevent the spread of this content.”

Facebook's role in Burma has sparked widespread criticism from human rights groups in recent years. The U.S. social media giant has been accused of spreading “fake news” that inflamed ethnic tensions between the country's Buddhist majority and its minority Muslim populations.

The spread of this misinformation may have had violent repercussions. A military crackdown caused more than 600,000 mostly Muslim Rohingya in the country's Rakhine state to flee across the border with Bangladesh by the end of 2017 and left an unknown number of people dead.

Though Burma is only a tiny slice of Facebook's giant global business, the social network has played a powerful role in Burmese society as the country opened up to the world after years of isolation. One study found that 38 percent of Facebook users in Burma got most, perhaps even all, of their news from the site.

Last June, a Facebook vice president wrote on the company's “Hard Questions” blog that part of the problem was understanding the context in the country. “We expect this to be a long-term challenge,” he wrote.

Zuckerberg's interview with Vox came as the company struggles with a huge scandal about its use of data in the United States and Europe. But the backlash from Burmese civic organizations showed how difficult it may be for the company to address the problems it can help cause in smaller countries around the world, too.

In the open letter, the organizations pointed to two messages shared widely on Facebook Messenger that were clearly designed to inflame racial tensions — one aimed at Buddhists, which used the racist term “kalar” to say Muslims were planning to launch a “jihad,” and another that warned Muslims to prepare for an “anti-kalar movement” led by “extremist nationalists.”

Both suggested that an event would take place Sept. 11, 2017. The group said that despite Zuckerberg's claims, it had to report the messages to Facebook via email Sept. 9, at which point the messages had already been circulating for three days.

“Though these dangerous messages were deliberately pushed to large numbers of people — many people who received them say they did not personally know the sender — your team did not seem to have picked up on the pattern,” the organizations argued in their letter. “For all of your data, it would seem that it was our personal connection with senior members of your team which led to the issue being dealt with.”

The letter went on to highlight a number of factors that slowed the process of reporting the messages, including users' inability to flag content as a priority and an apparent lack of Burmese-speaking Facebook staff. The organizations said that while they had interacted with Facebook's policy team, there had been no direct engagement with the product, engineering and data teams that might have been better able to implement plans to stop hate speech in Burma.

“The risk of Facebook content sparking open violence is arguably nowhere higher right now than in Myanmar,” the letter concluded. “We appreciate that progress is an iterative process and that it will require more than this letter for Facebook to fix these issues.”

In its statement, Facebook said that many of its problems in Burma were being addressed.

“We are rolling out the ability to report content in Messenger, and have added more Burmese language reviewers to handle reports from across all our services. There is more we need to do and we will continue to work with civil society groups in Myanmar and around the world to do better,” it said in its statement.

But some of these problems may still need work. On Friday, Victoire Rio, a social media analyst based in the country, noted that Facebook's job listing for a “Myanmar Market Specialist” described Burmese language ability as ideal rather than essential. Facebook quickly changed the listing, but the position was based in the company's offices in Dublin.

“I take it it's not that easy to find Burmese Speakers in Dublin,” Rio noted.
