Far-right outlet Infowars made a short-lived return to YouTube on Thursday, a year after its banishment, briefly illustrating social media’s struggle to remove rule-breaking content while still maintaining a forum for free expression.
A new channel called the War Room, the name of Infowars’ afternoon program, surfaced for a few hours and started posting videos. Alex Jones, the founder of the outlet known for spreading conspiracy theories and baseless information, was banned from YouTube and numerous other sites in 2018 after policy violations including posting hate speech and violent content. He had multiple channels on the site.
The relaunch of the channel coincided with a blog post earlier this week by YouTube’s chief executive, Susan Wojcicki. In the blog post, Wojcicki called for allowing some controversial content to remain on YouTube in the interest of fostering a more informed society.
“It sometimes means leaving up content that is outside the mainstream, controversial or even offensive,” wrote Wojcicki. “But I believe that hearing a broad range of perspectives ultimately makes us a stronger and more informed society, even if we disagree with some of those views.”
Infowars did not immediately respond to a request for comment. The first video uploaded to the channel was available for about 17 hours, according to Vice, which first reported the news. Thirteen videos were uploaded before the channel was removed. Vice said one of the videos was titled, “Breaking! YouTube CEO says ‘Alex Jones’ and ‘Infowars Ban Is Over.’ ” Wojcicki’s letter did not mention Infowars or Jones.
In a statement, YouTube spokeswoman Ivy Choi said the company is “committed to preserving openness and balancing it with our responsibility to protect our community.” Choi said YouTube removed the Infowars-backed account and channel for violating its terms of service, specifically for circumventing its policies by creating another YouTube channel after its banishment.
Social media companies in recent years have struggled to balance free speech with policing hate, footage of violent crimes and other problematic content on their sites. They’ve instituted new rules and banned a number of controversial figures from their platforms. They’ve also hired thousands of contractors around the world to review and delete violent or offensive content, something that introduces its own problems, including the psychological toll on the contractors.
Still, policy-violating content gets through, as illustrated recently when grisly video recorded by the alleged perpetrator of March’s bloody massacres at two New Zealand mosques spread across YouTube and other social media.
Google-owned YouTube, the world’s largest video platform with about 2 billion people logging in monthly, in particular has faced fierce backlash from critics who say it is enabling hateful and inappropriate content to proliferate. With each crisis, YouTube has raced to update its guidelines for which types of content are allowed. But it has lagged Facebook and Twitter in hiring moderators and has faced criticism over apparent double standards when it comes to moderating content from its biggest stars.
Still, it has worked to strengthen its approach, including broadening its view of hate speech in its policies in June, specifically banning videos that espouse racial supremacy or discrimination. Choi said YouTube counts on users to flag content that violates the company’s guidelines, and that the company removed 8.3 million videos in the first quarter of this year alone.
YouTube banned Jones in 2018, long before the company implemented its new policy. Jones is well known for, among other things, espousing conspiracy theories about the Sandy Hook Elementary School shooting that led survivors and their families to be harassed and threatened. YouTube, along with Apple and Facebook, told news outlets at the time that Jones was banned for repeated community standards violations. Jones, in turn, accused the companies of censorship.
While having a “robust community” is a laudable goal for YouTube, the company’s actions Thursday show that “platforms continue to be naive about how bad actors will exploit their words,” said Angelo Carusone, president of liberal media watchdog group Media Matters for America.