But wait — there was more. The writer, Carlos Maza, protested that the streamer, Steven Crowder, still sold T-shirts declaring that “Socialism Is for F*gs.” (A literal fig leaf took the place of the asterisk.) So YouTube said Crowder would have to quit his hawking. Observers were baffled that the bigotry apparently wouldn’t be an issue if it weren’t branded on Hanes cotton, so YouTube tried again: It wasn’t just the T-shirts Crowder had to get rid of to stop being a rule-breaker. It was all that other awful stuff, too.
Finally, at the end of this days-long debacle, YouTube published a blog post explaining its thought process. The platform’s promise? It would update its rules, of course.
Rules, though, are not worth much unless they serve a principled purpose. And figuring out that purpose could be the most crucial and most complicated challenge for platforms today.
YouTube has policies prohibiting hate speech and harassment, at least in theory. Had YouTube wanted to act against Crowder’s conduct, it could have — citing the rules. And when it didn’t want to act against that conduct, it didn’t — again, citing the rules. It was reminiscent of Infowars founder Alex Jones, who stayed within the rules as he befouled the platforms with hoaxes and hate, until suddenly last summer he found himself banned.
The primary principle guiding YouTube in these cases appears to have been the imperative to avoid making very many people very angry very publicly. Rules were only, well, a fig leaf.
But really, platforms should write and apply their rules only after facing up to some thorny questions. What’s their responsibility for what happens in the world off the Web? It’s easy to say you’re for free speech. And it’s easy to say you value the safety of your community members. But what do you do when those principles run up against each other?
For YouTube and other platforms, these questions are existential. Sometimes the companies offer stages for debate, sometimes information services, sometimes just the easiest way to check up on Grandma. What do they really believe they are?
It’s a lot to grapple with. There’s a risk platforms will become censor-happy and wipe away the openness and empowerment they were built to provide. There’s a risk they won’t do enough and that people will keep getting hurt. There’s a risk that a rule saying no to that clip distorted to make teetotaler Nancy Pelosi look drunk, or to the kind of garbage that comes from Crowder, could put all sorts of content we may value — parodies, maybe, or late-night-talk-show-style snark — in danger.
There are practical problems as well as philosophical ones: These sites serve millions or even billions, and they have to operate at scale. While obvious violations can be left to an algorithm, neither computers nor low-level moderators operating under too much stress with too little pay can referee the edge cases that animate our most fractious Twitter fights.
Still, that’s no excuse not to do the grappling. YouTube could have said from the outset this week that its commitment to providing a forum for public figures to argue with each other over political topics outweighed its commitment to preventing cruelty or protecting the marginalized, if that’s what it believes. It could have said the opposite. Or it could have said that the doxxing and other attacks Maza suffered at the hands of Crowder’s followers made the balancing act moot, because any content that leads to real-world harm is unacceptable.
Instead of dispensing a bunch of unsigned tweets full of byzantine reversals, YouTube could have told us how it was coming to its decision. Better yet, it could establish a transparent process, like the sort of oversight board Facebook is crafting, untethered to profits or stock price. Platforms need to do the work and show their work — show not only that they have rules but also that those rules are built on a foundation of principle, and that the principle is more meaningful than just trying to avoid making people angry on the Internet.
Which is an impossible goal anyway.