Ressa, who was featured on the cover of Time magazine’s 2018 Person of the Year issue, identified 26 accounts that reached more than 3 million Facebook users in mid-2016. That October, she asked Facebook to remove them, she said, arguing it would be too dangerous for her news outlet to publish the findings first.
She feared for her safety and that of her colleagues because social media mobs had already silenced other journalists and civic leaders who criticized the extrajudicial killings of drug addicts and dealers that Duterte was promoting. “I gave the data to Facebook because I was hoping they would fix it, and then we could do the story,” Ressa said, adding that the executives looked “shocked” at what she told them.
Ressa had intended to write a story about Facebook taking down the accounts. But when Facebook did not act, her publication, Rappler, readied a three-part series. An avalanche of threats and lawsuits followed its publication, culminating in Ressa’s arrest and overnight detention in a cyber-libel case against her. She is free on bail awaiting arraignment March 1 and has been forced to increase security for herself and Rappler. “If Facebook had taken action in 2016, I wouldn’t be in this position,” Ressa said.
David Kaye, the United Nations special rapporteur on freedom of expression, said in an interview there is no legal mechanism to hold the company accountable. Countries have given Facebook legal immunity for the content it publishes.
Simon Milner, Facebook’s vice president of public policy in the Asia-Pacific region, said, “Keeping our community, especially those who are at risk, safe is our top priority.”
In the Philippines, Milner said, the company has increased the number of people policing content, built better technology to report abuse more quickly and expanded digital literacy efforts. It has also invested more in training news outlets in best practices and analytics.
“There is always more to do, and that’s why we have a dedicated team of product, policy and partnerships experts who are focused on helping keep our community in the Philippines safe,” he said. The dedicated team was put in place last year after widespread criticism of Facebook surfaced following the 2016 U.S. election, when Russian operatives easily used the platform to spread disinformation in an effort to help elect Donald Trump as president.
Ressa said she gave the executives the account names at the first meeting and assumed they had recorded them because they were taking notes. Facebook spokeswoman Ruchika Budhraja defended the company’s response to Ressa by saying that executives had asked the journalist for the Internet addresses of the fake accounts but that she didn’t send them until weeks after publication. Facebook, Budhraja said, “took action on some of these accounts in October, but we only had the article to go on.” After Ressa sent them all 26 accounts in November, “we took action on the remainder of accounts that violated our policies.”
Ressa’s discoveries showed Facebook’s failure to enforce its own policies against fake accounts and calls for violence. Rappler’s series described how “sock puppets,” fake accounts controlled by a network of Duterte supporters, engaged real people online and spread lies, misleading photos and false incidents of rampant crime to drum up support for Duterte’s hard-line anti-drug policies. The accounts called for violence against legislators, civic activists and journalists who spoke up against Duterte’s tactics. Ressa was among them.
It is against Facebook’s policy to create accounts under false identities, as these accounts did, and to use the platform to call for violence against individuals or groups, as many of these accounts also did. Facebook has said it was “too slow” to develop the technology and to employ enough people to spot large quantities of bad content and either remove it or reprogram its algorithm to push it so low in a user’s Facebook feed as to make it unlikely to be seen.
Ressa’s legal troubles and the continuing violent threats against the 55-year-old journalist are widely viewed as the government’s way to shut down Rappler and drive her out of the business of revealing government wrongdoing. As she predicted, “the online threats increased exponentially after we published our three-part series,” she said. “The charges for the cases later filed were seeded in social media, repeated exponentially. A lie told a million times becomes truth.”
Ressa did not stop pursuing Facebook executives after the initial 2016 meeting. In a recent interview with “Frontline,” she described meeting with more than 50 employees, including chief executive Mark Zuckerberg, to urge them to stop the systematic abuse taking place on Facebook’s pages.
In April, Facebook hired Rappler to become part of its new news verification program, which fact-checks on behalf of the social media giant. Ressa said the company is doing a better job than it did before. Neither Facebook nor Rappler would discuss the financial arrangement. Still, Rappler staff members have been overwhelmed by the volume of false information flooding the platform.
Last month, Zuckerberg said he had “fundamentally changed how we run this company” in response to the dangers its technology has enabled, although there is no independent way to verify those assertions.
The human rights case against Facebook is growing. Last year, U.N. human rights investigators found it had played a “determining role” in the genocide of Myanmar’s Muslim Rohingya by allowing its platform to be used to incite widespread violence against the minority group. Sri Lankan authorities temporarily banned Facebook last year when calls to kill Muslims circulated freely, inciting riots and killings.
After Rappler’s series, threatening, hate-filled Facebook posts poured into Ressa’s page at a rate of 90 hate messages an hour, Ressa said. She again pressed the company to do more and was told she needed to formally report the messages. She said such a task would have taken her 24 hours a day because of the high volume. At that point, Facebook also told her there was nothing more it could do because it considered her a public figure, she said. Facebook has since lowered its threshold for removing threats against journalists, the spokeswoman said.
About 90 percent of Facebook’s market is overseas. In the Philippines, 95 percent of people online use it, a popularity seen in other developing countries where it has become the primary way to communicate. To grow the Philippine market, Facebook trained then-presidential candidate Duterte and his campaign staff in how to use its technology. The company gave similar training to many other political leaders, including autocrats in Egypt, Myanmar, Turkey and elsewhere.
Ressa said she had been one of Facebook’s biggest fans back then. She believed its campaign efforts would empower more citizens to take part in the political process. “I thought there was great potential.” She invited Facebook executives on her television show to promote the platform’s use during the presidential campaign. But shortly after Duterte’s election, when he began his anti-drug crusade and massive disinformation campaign, Rappler began investigating.
Ressa was already steeped in social network analysis. After the 9/11 terrorist attacks in the United States, as a CNN correspondent, she traced the al-Qaeda terrorist group’s network around the Philippines and across Southeast Asia. Her journalism was sometimes groundbreaking, disclosing connections that authorities were unaware of.
Rappler’s October 2016 series offered, in retrospect, a surprisingly accurate blueprint for the Russian disinformation operation already in full bloom in the U.S. presidential campaign, which used similar digital techniques. The disinformation in the United States in 2016, though, was foreign rather than domestic in origin, and it sought less to call for violence than to sow social discord.
Ressa still holds Facebook responsible for allowing the scale of false information in the Philippines to grow so fast. “They built this. It’s theirs,” she told “Frontline.” “It seems like they just gave everyone the guns, and they said, ‘Whomever … kills the most people wins.’ There were no rules.”
The United Nations’ Kaye said Facebook must be more transparent about its actions and decision-making so the public can debate and perhaps revise options for holding it accountable. “Right now,” he said, “we really rely on the company to do the right thing.” In Ressa’s case, Kaye said, “She told them, ‘They are threatening me.’ She didn’t hear anything. It was radio silence.”