The Internet, as everyone knows, is riddled with racist, sexist and sadistic ranters, not to mention trolls, stalkers, savage bullies and inciters of mayhem, from sextortion to child pornography to terrorism. A whole industry of "moderators" now exists, thousands or perhaps hundreds of thousands of them, paid to patrol sites like Facebook and Twitter, along with news sites like The Washington Post and The New York Times, weeding out offensive posts and comments.
YouTube, among others, has tried a variety of tactics designed to subdue the beast, including real-name requirements using Google+, "flagging" by users to alert its moderators to content for possible removal, and an elite corps of "Trusted Flaggers," which the Google-owned company says "gives users access to more advanced flagging tools as well as periodic feedback, making flagging more effective and efficient."
All this to modest avail, despite the fact that, as YouTube reports, “over 90 million people have flagged videos on YouTube since 2006 — that’s more than the population of Egypt — and over a third of these people have flagged more than one video.”
So on Thursday it proposed something new, “YouTube Heroes,” essentially a gaming effort to entice users into, among other things, “mass flagging” of offensive content, which would then be reviewed by professionals and removed if warranted. Here’s how it’s supposed to work, according to the YouTube blog post:
YouTube Heroes will have access to a dedicated YouTube Heroes community site that is separate from the main YouTube site, where participants can learn from one another. Through the program, participants will be able to earn points and unlock rewards to help them reach the next level. For example, Level 2 Heroes get access to training through exclusive workshops and Hero hangouts, while Level 3 Heroes who have demonstrated their proficiency will be able to flag multiple videos at a time (something Trusted Flaggers can already do) and help moderate content strictly within the YouTube Heroes Community site.
YouTube explained the Heroes program with a little video:
By early Friday morning, it had garnered 1,015,700 views.
It had also garnered an overwhelming number of "thumbs down" votes: 342,033 of them, versus just 5,500 "thumbs up."
The comments were disabled, as they always are on that particular channel.
"youtube heros" you know you are making the right decision that everyone agrees with when you disable the comments section.
— Dick Frenzy (@TheBedfellows) September 22, 2016
But the reaction among YouTube users on Twitter, along with a bunch of parodies posted on YouTube and elsewhere, provides a (moderated) glimpse of the response. Some suggested it was a "snitch" rewards program. Others said it would encourage YouTube vigilantism.
And still others saw YouTube Heroes as a way of having users do the work the company ought to be paying for.
“As someone who’s had vids falsely reported because angry mobs were angry at them, #YouTubeHeroes seems like an idea that can go bad quick,” tweeted Jim Sterling.
“So if I have this right,” tweeted someone going under the name of ToddInTheShadows, “YouTube Heroes is more accurately YouTube Vigilante Mobs?”
youtube wants us to start mass flagging videos so i flagged their #YouTubeHeroes video. level up!!
— Deplorable Luna (@rabbitraisin) September 21, 2016
Twitter and YouTube itself were swamped with new videos ridiculing the idea.
Apart from the merits of the idea, some suggested that YouTube’s way of introducing it left too much to the fertile imagination of some users and perhaps the pecuniary aspirations of others. Some YouTubers make good money from sponsorships of their videos.
“This is a huge undertaking that you tried to explain in a minute-thirty with a nice little video,” Dan Speerin, a YouTube creator and vice president of the Independent Web Creators of Canada (IWCC), told the CBC. “We need more info.”
And in fairness, there’s kind of an iron law under which no regular user ever likes anything new introduced by the likes of Google, Facebook or Twitter.
“We don’t know if an online community of unpaid volunteers will help rid YouTube of some of its nastier trolls, but other sites also encourage users to self-police,” wrote Gordon Gottsegen at CNET, noting that Reddit employs a similar method, “allowing some members to moderate forums and even ban other users if need be. At the very least, YouTube is acknowledging it has a problem with negative comments and inappropriate videos. Let’s hope the program starts making a difference quick.”
Correction: An earlier version of this story suggested the comments were disabled after the YouTube video was posted. In fact, they were disabled when the video was posted, as they are on other videos on that channel.