Investigators, who were still looking into Kelley's motives when the video was published Monday, said later that day that Kelley's actions didn't appear to be fueled by racial or religious issues but by an ongoing “domestic situation.” Meanwhile, the conclusions drawn in the YouTube video remained prominently featured online, with more than 70,000 views.
While social media sites are filled with fringe theories and unverified information, the YouTube video's persistent high ranking highlights a challenge social platforms face during breaking news events: delivering news to eager audiences before key details have been vetted.
Americans are increasingly turning to social media to get their news, according to a recent Pew Research Center report. About a quarter of adults say they get news from two or more social media sites, and 18 percent of all Americans say they get news on YouTube, making the platform the second-most common social media news source, behind Facebook, the report said.
YouTube said in a statement Monday that the company is continuing to add new features and changes to provide authoritative results when people look to its platform for news. “There is still more work to do, but we're making progress,” the company said.
YouTube isn't the only platform in Alphabet's portfolio that surfaces breaking news. When a person begins to type the gunman's name into Google search, the third term that Google suggests searching is “devin kelley antifa.”
Google said in a statement, “We try to be careful with autocompletions on names, and in this case, our system did not work as intended. We’re currently working on our system for name detection to improve this process moving forward.”
While conspiracy theories have long been a feature of YouTube, which 1.5 billion people use every month, the distribution of fake news has drawn heightened scrutiny in recent months as Congress investigates the role played by social media platforms in the Russian misinformation campaign to influence the 2016 presidential election.
The absence of detailed, verified information hasn't stopped people from consuming and distributing hastily crafted news and analysis.
“You can make it up faster than you can look it up, and that’s the information vacuum that these people operate in,” said Ben Nimmo, a fellow with the Atlantic Council’s Digital Forensic Research Lab. Unlike law enforcement, researchers and journalists, the people behind these types of social media posts are not bound by professional standards to verify claims. Nimmo said part of the purpose behind posts like the Antifa YouTube video is to “reinforce your side,” to weaponize disinformation for political ends.
The video was YouTube's fourth-highest-ranking result when the gunman's name was searched Monday. It had fallen to 29th on the list as of Tuesday morning.
“I have 100 percent proof here that he is far-leftist and he's Antifa,” the video's narrator says, referring to the anti-fascist protest movement. “This is a whole conspiracy right now that the media is doing. The FBI is scrubbing all these Facebook pages right now, so take these images and everything I am giving you and save them somewhere.” The video was posted by an account called The Patriotic Beast.
Even as YouTube and other social media companies work to address the spread of misinformation and other objectionable content, some experts point to problems beyond technological fixes.
“There is no denying that the dominance of social media in providing news to people now has made it way, way easier to spread misinformation,” said Casey Fiesler, a professor of information science at the University of Colorado Boulder. But “a lot of people want to have what they already think verified. And if that's the case, it doesn't matter what order an algorithm might show you search results in. If you just want your existing opinions or thoughts to be verified, then you are not going to care what the algorithm thinks is more valid.”