“So YouTube’s algorithm massively recommends Russia’s take on the investigation into Russia’s interference in the 2016 election,” Chaslot tweeted Thursday night.
Formerly known as Russia Today, RT is one of the most popular media channels on YouTube, claiming more than 2 billion views. The video interview it recommended, posted by RT’s U.S.-focused division RT America, is sharply critical of American press coverage of the Mueller report and calls journalists “Russiagate conspiracy theorists” and “gossiping courtiers to the elite.”
YouTube disputed AlgoTransparency’s methodology, data and findings and said it could not reproduce the group’s results. General searches about the Mueller report were showing results from verified news sources more often than the RT video, the company said.
“The AlgoTransparency tool was created outside of YouTube and does not accurately reflect how YouTube’s recommendations work, or how users watch and interact with YouTube,” YouTube spokesman Farshad Shadloo said. “We’ve designed our systems to help ensure that content from more authoritative sources is surfaced prominently in search results and ‘watch next’ recommendations in certain contexts, including when a viewer is watching news-related content on YouTube.”
In Chaslot’s analysis, some Mueller-related videos were recommended more often overall. “The Late Show With Stephen Colbert” was recommended more than 5 million times. Other channels, such as Fox and PBS NewsHour, drew hundreds of thousands of recommendations for their Mueller videos. But no other video tracked by AlgoTransparency was recommended by as many different channels as the RT one.
Chaslot, a French expert in artificial intelligence who worked on YouTube’s recommendation algorithm before leaving the company in 2013, is among the most prominent critics of the company. His analyses of the functioning of YouTube’s recommendation engine are frequently quoted in news reports — and nearly as often disputed on methodological grounds by the company, as is common with critical studies of technology companies by outside researchers.
For his analysis of the Mueller report, Chaslot said he collected the recommendations from 1,000 popular channels. He searched the resulting data for “Mueller” and, based on that search, believes the RT program was recommended at least 406,841 times. He said the real number of recommendations, from all YouTube channels, is likely far higher.
Chaslot said in an interview Friday that while the RT video ultimately did not get massive viewership — only about 55,000 views — the number of recommendations suggests that Russians have grown adept at manipulating YouTube’s algorithm, which uses machine-learning software to surface videos it expects viewers will want to see. The result, Chaslot said, could be a gradual, subtle elevation of Russian views online, because such videos generate more recommendations and, ultimately, more views that can bring more advertising revenue and reach.
“Every time they make something more pro-Russian, they get more and more views,” Chaslot said. “That’s extremely worrying.”
The Justice Department told RT America that it needed to register as a foreign agent following a report by the U.S. intelligence community that said the outlet had played a key role in Russia’s efforts to help elect President Trump in 2016.
The Kremlin-backed media site, which U.S. intelligence has called Russia’s premier international propaganda arm, has often enjoyed top rankings in YouTube’s search results. It was also the first news channel to pass 1 billion views on YouTube in 2013.
RT’s deputy editor in chief Anna Belkina said in a statement, “RT produces high quality, interesting and original content; we’re good, thus we’re popular. People are always gonna be jealous of Number One.”
Sen. Mark R. Warner (D-Va.) said the findings again highlighted how tech companies should be doing more to combat disinformation and misuse. “It’s extremely concerning that YouTube still hasn’t fixed the problems with its algorithm that make it so susceptible to gaming and questionable sources like RT,” he said.
YouTube, the world’s largest online video service with 2 billion users, has been coming under increasingly sharp criticism over the past two years for a range of issues, including the ability of hate groups, conspiracy theorists, pedophiles and others to manipulate the platform to spread content they favor.
YouTube’s recommendation algorithms are designed to steer viewers to videos they might not otherwise have searched for, including by automatically playing more videos through its “Up next” function. But experts said that functionality can lead viewers down a rabbit hole of increasingly troubling videos featuring conspiracy theories, disinformation or offensive content.
YouTube’s algorithms were previously designed to maximize watch time, an approach Chaslot and others have criticized as rewarding more shocking or sensational videos. YouTube said it now relies on information such as user surveys, likes, dislikes and shares to improve its recommendations.
Even its efforts to combat misinformation have in some cases backfired, as happened this month when videos of the flaming collapse of the spire of Notre Dame Cathedral in Paris were incorrectly identified by YouTube as imagery from the Sept. 11, 2001, terrorist attack on the World Trade Center in New York.
Tony Romm contributed to this report.
CORRECTION: An earlier version of this story incorrectly said that YouTube had recommended RT programming more often than other networks’ programming for analysis of the Mueller report. In fact, some programming was recommended more often, but in Chaslot’s analysis none was recommended by as many channels as RT’s.