Officials at YouTube, the world’s largest video site, said the clip showcased “demonstrably false content that undermines trust in the democratic process,” and they blocked it from pulling in advertising money. But they kept it viewable, attaching only a small disclaimer saying election “results may not be final,” because they said it did not directly break rules against videos that “materially discourage voting.” It has been viewed more than 400,000 times.
Videos on YouTube and other sites have become one of the most potent funnels of disinformation on the Web, helping boost election conspiracy theories and political point-scoring attempts faster than social media companies and fact-checkers can respond.
As Trump falsely claimed victory and his supporters rallied against the ongoing vote count, many turned to raucous live streams, out-of-context clips and viral conspiracy theories to win attention and undermine foes. Since the election, millions have watched and shared videos featuring debunked or unproven allegations involving mysterious wagons, ripped-up ballots and vote-deleting felt-tip pens.
The rumor mill churned across text-based social media, too, often with the videos held up as proof. But away from the social media sites that have removed posts or flagged them as misleading, a number of the videos have benefited from YouTube’s huge audiences, light-touch regulations and limited disclaimers on even the most certifiably false videos.
Evelyn Douek, a lecturer at Harvard Law School who researches online speech, said video sites like YouTube too often get a pass from the discussions over hate speech, conspiracy theories and viral misinformation that have defined Facebook’s and Twitter’s last few years.
Part of that is technical: Videos are harder to track and more time-consuming to check than simple, searchable text. But Douek also said it’s because regulators and journalists underappreciate the major pull videos have on our information ecosystem: YouTube’s parent Google says more than a billion hours of video are watched there every day.
“We’re so focused on the other platforms that we don’t demand the same accountability and transparency from (YouTube), and nobody kicks up a fuss,” she said. But “that has created a blind spot in our public discourse about misinformation and disinformation and all the same content moderation issues” facing the rest of the Web, she added. “We can’t just let them get away with this.”
YouTube spokesman Farshad Shadloo said the company’s rules are “generally on par and in some instances more aggressive” than other social media sites’, including targeting doctored “deepfake” videos and conspiracy theories “used to justify real-world violence.” The site, he added, had been enforcing its election-misinformation rules since January and formally announced them in March. Last month, it expanded its misinformation policies to ban videos falsely claiming vaccines kill people.
But video streaming is a fundamentally different policing challenge than text, Shadloo said, and requires a different response. Following years in which YouTube was criticized for sometimes sweeping viewers down rabbit holes of increasingly extreme videos, the site has worked to buoy authoritative news videos and bury disinformation and spam on its home page and in its next-video recommendations and search results.
Instead of removing most false videos outright, the company has more often reduced their spread in automated recommendations and appended disclaimer labels and “information panels” beneath the videos with links to reputable sources. Some of the most recent disinformation videos, Shadloo argued, have gained their largest audiences from Facebook, Twitter or other sites.
It’s easy to see why online videos have become so useful for getting one’s point across. They’re visceral, colorful, shareable; often way more fun than text. They lend a tinge of unearned authenticity, just by virtue of being visual. And they reach people without the time, energy and interest to read — on websites and apps devoted to socializing and entertainment, not just the news.
Often overlooked amid Facebook’s crowded conversations, Instagram’s glossy imagery and Twitter’s fire hose of news, YouTube and other video sites have typically been regarded as niche online properties with little overlap with mainstream American debate.
They’re actually anything but: One in four U.S. adults say they get their news from YouTube, and more than 70 percent of those say they expect what they learn there to be largely accurate, according to a September survey by the Pew Research Center. And viewers generally don’t care where that news is coming from: Just as many said they learn about current events from traditional outlets as from independent channels, including many with a clear political bent.
That includes rising conservative stars like the Right Side Broadcasting Network, an Alabama-based channel “On The Right Side of History” that has been viewed more than 170 million times since 2014. More than 2 million users tuned in for at least some portion of its eight-hour live stream on election night, which featured interviews with former Fox host Bill O’Reilly and the National Rifle Association’s ex-spokeswoman Dana Loesch.
Promoted to viewers “tired of the mainstream media overwhelming you with numbers and information without proper perspective,” the broadcast opened with host Mike Slater setting their coverage apart from everyone else: “Have you learned nothing these last four years about the media and how they manipulate you? Do you trust them? … [Are we] going to start trusting them tonight?”
And unlike Fox News, the channel’s viewers can talk with each other in a fast-scrolling chat box. “The media is the long arm of the globalist,” one popular comment said. “TRUMP is the arm breaker!”
The final days of the election have been rife with videos deceptively made or ripped out of context. In the last weeks of the campaign, President Trump’s son Eric and allies in the GOP and Fox News shared a deceptively edited video in which Democratic candidate Joe Biden appears to take pride in the biggest “voter fraud organization” in history — when, in reality, he was saying exactly the opposite.
Some social media sites flagged it, but not before it’d gained millions of views; on Wednesday, the newly elected Republican member of Congress Marjorie Taylor Greene, a booster of the far-right QAnon conspiracy theory aligning Democrats with Satanists, pointed to the clip as proof of fraud “on full display right now!”
Videos with no basis in reality easily gained lives of their own. On Wednesday, Eric Trump alerted his 4 million Twitter followers to a video, supposedly of 80 Trump ballots being burned. The video was suspect on its face: It offered no clear evidence and had come from a since-suspended account, named “Ninja_StuntZ,” with QAnon links in its profile.
Election officials in Virginia Beach had, a day before, confirmed that the charred remains were actually sample ballots, easily printed by anyone. But Eric Trump’s tweet, which offered no disclaimer that the video was misleading, nevertheless gained more than 30,000 retweets. The video has since been more widely distributed across far-right militia and pro-Trump online groups as proof of voter fraud. One repost of the misleading video on YouTube, titled “TWITTER REMOVED THIS VIDEO!”, has been viewed more than 100,000 times. (Eric Trump did not respond to requests for comment.)
Videos have been used to bolster some unique methods of Election Day mischief, including trolls who broadcast live streams of fake election results for hours on YouTube before the site’s moderators marked them as spam.
They’ve also played a growing role as the president and his allies have cast doubt on the integrity of American elections. When Trump supporters rallied in protest outside election centers — first to count the vote, then to stop it — many live-streamed the confrontations with officials for their own online audiences.
Rambling live streams from far-right provocateurs also have helped rally Trump supporters to his defense. The conservative commentator Steven Crowder trashed Democratic voters and interviewed far-right figures like the conspiracist Alex Jones, who labeled Biden a “walking corpse,” in a seven-hour live stream that has been viewed more than 8 million times.
Some voting officials have attempted to roll out videos in a way that could help smooth the post-election aftermath. Election authorities in places like Arizona’s Maricopa County and Washington’s King County have launched live video feeds so Americans can see for themselves the thrilling dramas of signature verification and envelope review.
But even those bids to boost transparency inside the country’s election system have been misconstrued as proof there’s something to hide. Video of an official covering the windows of a Detroit ballot-counting center has been passed around as proof of a coverup. “SHADY,” White House press secretary Kayleigh McEnany tweeted Wednesday, in a post retweeted more than 17,000 times.
One video by the pro-Trump YouTube creator Austin Fletcher, taken from a live stream inside an Atlanta vote-counting center, suggested that an election worker had crumpled and tossed a ballot away. It’s unclear whether it was actually a ballot, though; the paper in the video is not identifiable. And election supervisors with Fulton County, whose officials did not immediately respond to requests for comment, can be seen monitoring nearby. The video was nevertheless viewed more than 4 million times on Thursday, including with the help of a retweet by Donald Trump Jr.
“If that’s not voter fraud, I don’t know what is,” Fletcher said, though he offered no actual evidence. Trump Jr.’s tweet hasn’t been flagged as “disputed” on Twitter, where it’s gained more than 30,000 retweets.
Just how effective are those disclaimers, anyway? The social media companies don’t offer details, but the endlessly multiplying nature of the Web likely limits their usefulness. An election-night video in which Trump supporters in Arizona said their votes with Sharpie pens had been mysteriously invalidated — totally wrong, election officials said — was labeled as “false information” by Facebook, where it’s been shared more than 125,000 times. The baseless “Sharpiegate” conspiracy theory is now practically unavoidable on Trump-boosting tweets, posts and message boards. It’s also potentially headed to court.
There is plenty of truth in online video, too. Streaming sites once devoted to quirky dances and video games, like the explosively popular short-video app TikTok and the live-streaming gaming site Twitch, now overflow with clever creators offering engaging riffs on current events. One of the most popular Twitch streams on Election Day was from the Bernie Sanders-loving commentator Hasan Piker, whose frenetic 16-hour coverage of himself eating a salad (served by his mother) and discussing the news, titled “BEDLAM IS UPON US!!!!!!!!!!!!!!!!!”, has been viewed more than 4 million times.
But the up-and-coming video giants often stumble in moderating the tense squabbles of political debate. On TikTok, for instance, the rules around acceptable content are far less clear than on the more established social media apps it’s long since passed on the download charts.
TikTok’s election-related hashtags include links to blatantly false videos as well as satires and debunks, and company officials there have said they’d prefer targeting specific videos only when they cross the line, as opposed to blanket-banning hashtags outright.
But that approach has allowed polarized hashtags like #riggedelection to gain more than 1 million views. That hashtag and another, #fraudonlywaybidenwins, were redirected to the site’s rules only after The Washington Post asked the company about them.
TikTok’s massive audience, much of it teenage, nevertheless treats the app as an informational free-for-all. In one TikTok video that suggested window coverings were hung at the Detroit vote-counting center so officials could “scam the American people in peace,” a girl in a Trump bucket hat who said she was 17 argued that the clips were “just the evidence we need” of voter fraud.
“Instagram, Twitter and TikTok are deleting posts that have those videos in them. Thankfully, though, they can’t delete all the posts,” she said with a smile. The video has gained 15,000 shares and 2,000 comments, and has been “liked” more than 88,000 times.
Tonya Riley, Isaac Stanley-Becker, Tony Romm and Cat Zakrzewski contributed to this report.