Tim Harmsen, the head of the Military Arms Channel on YouTube, said in a video Saturday that he had temporarily disabled all of his videos after moderators gave him a “strike” for three firearms-related videos, including one about an exploding rifle target, that they said had violated the site's guidelines. YouTube bans accounts after three strikes.
He posted a notice from moderators that said, “We don't allow content that encourages illegal activities or incites users to violate YouTube's guidelines.” “Right now we're under attack” by YouTube employees, whom he called “far-leftist lunatics,” he said in a video that has been viewed 260,000 times.
YouTube said in a statement that some of the videos were “removed in error” and would be reinstated. “As we work to hire rapidly and ramp up our policy enforcement teams throughout 2018, newer members may misapply some of our policies resulting in mistaken removals,” the statement said.
After other examples of the misuse of its platform last year, YouTube in December said that its parent company, Google, would increase the number of people working on content moderation to 10,000 this year.
A source close to YouTube's operations said the mistaken removals included not only right-wing but also left-wing and mainstream videos. The errors, the source said, were due to the company ratcheting up its oversight of some of the more than 400 hours of video uploaded to the site every minute.
“We're flagging content at a much higher volume, so we're having more false positives because more content is being reviewed,” the source said. “As we dramatically step up hiring, we will see mistakes. This isn’t the last time this will happen.”
Sarah T. Roberts, an assistant professor at the University of California, Los Angeles, who studies content moderation, said the mistaken removals were part of YouTube's “come-to-Jesus moment around understanding their own values and economic model.” The site, she said, had long sought to avoid publicly making content-moderation decisions because it didn't want to be held responsible for deciding which speech should be protected or which videos go too far.
With “content moderation having been an absolute afterthought for so long, but with it now suddenly gaining importance and prominence in the public eye and the eye of regulators, they are really reckoning with their need to communicate to the public late in the game,” Roberts said.
The moderators, she added, can be “easy targets” for YouTube to blame in moments of pushback. “When they do do something that the public can perceive and respond to negatively, they are able to gesture at human moderators and hang the blame largely on them,” she said. “But there's a lack of clarity around what the values are ... where those policies emanate, and on whose behalf.”
Alex Jones, the conservative conspiracy theorist who publicly raised the possibility that the Parkland shooting was a “false flag” attack just hours after it happened, said his videos were among those targeted by YouTube’s efforts to crack down on content that violated its policy against harassment. He was told he had been given “two strikes” by YouTube and was at risk of being banned from the platform.
But Jones loudly complained, posting videos to his own InfoWars website alleging that YouTube was wrongly blocking content that did not in fact violate its policies. Jones said he did not dispute that the Parkland shooting happened or that the surviving students were real; he merely alleged that they were being coached in their public appearances as part of a political drive for gun control, and questioned other aspects of the mainstream reporting about the attack.
Jones said in an interview Tuesday, “The good news is that they’re now having to back off … because I didn’t say what they said I said.”