Happy Wednesday! We have some news we’re excited to finally share with you all.
Want to commemorate by sharing a news tip? We won’t argue: cristiano.lima@washpost.com.
Below: Russia turns up the pressure on YouTube, and how Ukraine is urging tech companies to step up behind closed doors. First up:
Facebook’s most popular posts show how it’s vulnerable to exploitation, report finds
The vast majority of the most widely viewed posts on Facebook failed basic tests designed to spot spammy or unreliable sources, according to a group of researchers including former company staffers. They say the finding shows how susceptible the platform is to manipulation by bad actors vying for mass reach.
The report by the Integrity Institute — a new nonprofit made up of tech industry veterans — found that Facebook’s lists of its popular content in 2021 were dominated by posts that either came from anonymous or spammy sources or contained unoriginal material.
Researchers say those characteristics are hallmarks of prominent disinformation campaigns, and the fact that spammy or aggregated posts are so popular should be cause for significant concern at Facebook.
“We’ve seen time and again … bad actors find success on platforms within various communities by using this fundamental tactic of not being transparent about who they are,” said Jeff Allen, a co-founder of the Integrity Institute and a former data scientist at Facebook and for the Democratic National Committee.
According to the report, shared with The Technology 202, more than two-thirds of the posts on Facebook’s quarterly “Widely Viewed Content Report” for much of 2021 failed at least one media literacy test, with the most common failure being recycled or aggregated links.
Joe Osborne, a spokesperson for Facebook parent company Meta, said some of the report’s findings “don't match our source data,” citing what they called discrepancies in the number of posts that were removed. Osborne added that the reports were created “so we can move forward and address gaps in our enforcements and policies.”
Allen said the trend means it’s easy for untrustworthy sources to piggyback off or even co-opt viral stories for their own motives, just like Russia’s notorious Internet Research Agency did during the 2016 U.S. elections to sow discord online.
“If you don't build systems into the platform that actually care about [identifying and surfacing valuable content] … then it's almost inevitable that your content, your platform will be taken over by non-original content,” he said.
Researchers also found that a handful of posts that cracked the most-viewed lists either violated or appeared to violate Facebook’s rules.
Allen called that a major red flag. “I think from an integrity professional point of view, any content on the top 20 lists that is violating your community standards is a five-alarm fire,” he said.
The report — which uses one of the few data points about the reach of content that Facebook proactively shares — also raises questions about the company’s transparency.
Facebook faced backlash last year after it was reported that the company shelved one of its content reports amid concerns that the list — topped by an article suggesting a Florida doctor died because of the coronavirus vaccine — would make it look bad. A medical examiner’s report later said there wasn’t enough evidence to link the two. (The company said at the time that it knew “there were fixes to the system we wanted to make.”)
Since then, the social network has released lists featuring links that were “removed by Facebook for violating Community Standards” without detailing what they were. Others, researchers found, appeared to break Facebook’s rules and were posted by pages that the company had taken action against — but that weren’t clearly labeled as such.
Facebook has noted in its reports that despite their reach, the posts on the widely viewed content reports still only represent “a small fraction of all content views” on the site.
Allen said that while it’s hard to make statements about the “overall health of the ecosystem” based on the data, it’s still “really good for diagnosing … any large-scale problem that you should be very concerned about.” And ultimately, he said, external researchers can only vet the data Facebook is willing to share.
Since Facebook whistleblower Frances Haugen disclosed internal research showing how the company’s products can pose risks to users, there’s been a renewed campaign in Congress to pass laws requiring that platforms open themselves up more to outside experts. If successful, the push could give researchers including at the Integrity Institute even more oversight tools.
In the meantime, the group is also offering up ideas on how Facebook and other social networks could bolster their defenses against platform manipulation.
One proposal outlined in the report would be to repurpose a ranking algorithm first developed by Google’s founders known as PageRank that measures the quality of search results. Researchers at the Integrity Institute are currently updating the tool to use on social media; they argue it could significantly cut down on spammy or unoriginal content.
“PageRank is a good example of an alternative to engagement-based ranking that the platforms could use at any time,” Allen said.
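The report doesn’t publish the Integrity Institute’s adapted tool, but the core idea of PageRank is simple: an account’s score comes from the scores of the accounts pointing to it, so pages that only recycle or cross-link their own content can’t accumulate authority the way genuinely cited sources do. Below is a minimal power-iteration sketch on a hypothetical graph of accounts; the graph, node names, and parameters are illustrative assumptions, not drawn from the report.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each node to the list of nodes it points to."""
    nodes = set(links) | {t for targets in links.values() for t in targets}
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        # Every node keeps a small baseline; the rest flows along links.
        new_rank = {n: (1.0 - damping) / len(nodes) for n in nodes}
        for node, targets in links.items():
            if targets:
                share = damping * rank[node] / len(targets)
                for t in targets:
                    new_rank[t] += share
            else:  # dangling node: redistribute its rank evenly
                for n in nodes:
                    new_rank[n] += damping * rank[node] / len(nodes)
        rank = new_rank
    return rank

# Hypothetical graph: independent readers cite an original reporter,
# while two spam pages only link to each other.
graph = {
    "original_reporter": ["aggregator"],
    "aggregator": ["original_reporter"],
    "spam_page": ["spam_page2"],
    "spam_page2": ["spam_page"],
    "reader_a": ["original_reporter"],
    "reader_b": ["original_reporter"],
}
scores = pagerank(graph)
```

In this toy graph the original reporter ends up scoring well above the self-referential spam pair, which is the intuition behind using a PageRank-style signal instead of raw engagement: authority has to come from outside your own cluster.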
Our top tabs
Facebook paid a GOP firm to smear rival TikTok
Facebook parent company Meta is paying Targeted Victory, one of the biggest Republican consulting firms in the country, to carry out a nationwide campaign painting rival TikTok as a major threat, my colleagues Taylor Lorenz and Drew Harwell report.
"The campaign includes placing op-eds and letters to the editor in major regional news outlets, promoting dubious stories about alleged TikTok trends that actually originated on Facebook, and pushing to draw political reporters and local politicians into helping take down its biggest competitor," Taylor and Drew report.
Targeted Victory declined to respond to questions about the campaign, saying only that it has represented Meta for several years and is “proud of the work we have done.” Meta spokesperson Andy Stone defended the campaign by saying, “We believe all platforms, including TikTok, should face a level of scrutiny consistent with their growing success.”
Russia is threatening YouTube with fines for ‘information war’
Roskomnadzor, Russia’s official digital censor, accused YouTube of “participating in the information war against Russia” and threatened to fine the company for not removing certain videos from its service, my colleague Gerrit De Vynck reports.
“YouTube is hugely popular in Russia and has been a key way for millions of Russians to consume news and entertainment videos for years. It has more users in the country than any other social network, including Russia’s homegrown Facebook competitor, VK. Since Russia’s Feb. 24 invasion, the country has blocked other U.S.-owned social media networks such as Facebook, Instagram and Twitter, leaving YouTube as one of the few ways for Russians to see content from outside the country without having to download special software to trick their Internet providers into believing they are not inside Russia,” Gerrit writes.
A Google spokesperson did not respond to a request for comment.
How Ukraine's digital minister is urging tech companies to step up behind closed doors
Mykhailo Fedorov, Ukraine's minister of digital transformation, has “emerged as one of Ukraine’s most visible leaders” as Russia wages war on his country, my colleague Cat Zakrzewski reports.
According to the report, Fedorov is “using the Internet to align major tech companies and marshal the country’s resources in a digital front, for a conflict he has begun to call 'World Cyber War One.' He gained global notoriety for using his Twitter as a cannon to pressure Apple, Facebook and more of the world’s largest companies to build a ‘digital blockade’ against Russia.”
While Fedorov has publicly pressured tech companies to come to Ukraine's aid on social media, he's also waged a behind-the-scenes offensive to get them to step up their contributions.
“His ministry has embarked on an extensive outreach campaign, sending more than 4,000 requests to companies, governments and other organizations, each one personally signed by Fedorov. His office is in touch with thousands of CEOs of smaller businesses,” Cat writes.
Inside the industry
State attorneys general ask Snap and TikTok to give parents more control over apps. (New York Times)
Activision Blizzard officially settles federal sexual harassment suit for $18 million (Shannon Liao)
Mentions
- Former Joint Chiefs of Staff chairman Gen. Joseph F. Dunford Jr. and former acting director of the CIA Michael Morell have joined the national security advisory board for the American Edge Project, a coalition backed by two dozen organizations including Facebook parent Meta.
Before you log off
There're two types of cats.🐈🦐😅 pic.twitter.com/W9NQWsr5fv
— 𝕐o̴g̴ (@Yoda4ever) March 28, 2022
That’s all for today — thank you so much for joining us! Make sure to tell others to subscribe to The Technology 202 here. Get in touch with tips, feedback or greetings on Twitter or email.