The Washington Post: Democracy Dies in Darkness

Discord adds new Twitch-like AutoMod feature to help keep users safe

(Washington Post illustration; Discord)

Discord, a voice and text-based communications platform near-ubiquitous among gamers, is a communications marvel, but a logistics nightmare. Individual servers can teem with tens, hundreds or thousands of people, making round-the-clock moderation a nauseatingly tall order. That’s where Discord hopes AutoMod will come in.

AutoMod, which became available to Discord users Thursday, is a variation on a tool made popular on platforms like Reddit and Twitch. True to its name, it automates portions of the moderation process such that human moderators — who are often volunteers — don’t have to stay glued to their screens to keep text chats and threads free of foul language, off-topic/banned discussions and spam. On the aforementioned platforms, it’s become an indispensable tool in an era where online conversations never really end.

Discord hopes to free up human moderators for the bigger-picture issues in the communities they steward.

“Moderators spend a pretty decent amount of time just policing their servers, but I think the real superpower of moderators is when they’re more cultivating culture and running events [rather than policing individual comments],” Discord group product marketing manager Jesse Wofford told The Washington Post. “There’s a lot of things they’re doing every single day that we can automate.”


Discord’s take on AutoMod can automatically detect and block words, phrases and even portions of words deemed inappropriate before they ever become visible to other users, alerting moderators when it does. It can also automatically time out users who try to post slurs or spam; a human moderator can decide their fate at a later date.
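The behavior described above — matching banned terms (even inside longer words), blocking the message before it posts, and timing out the sender — can be sketched in miniature. This is a hypothetical illustration, not Discord's actual implementation; the keyword list, timeout length, and function names are all invented for the example.

```python
import re
import time

BANNED_TERMS = ["badword", "spamlink"]   # placeholder keyword list
TIMEOUT_SECONDS = 600                    # placeholder timeout length

# One combined pattern; plain substring-style matching catches banned
# terms even when they are embedded inside longer words.
_pattern = re.compile(
    "|".join(re.escape(t) for t in BANNED_TERMS), re.IGNORECASE
)

timeouts = {}  # user_id -> timestamp when the user's timeout expires

def moderate(user_id, message, now=None):
    """Return (allowed, reason), checked BEFORE the message is shown."""
    now = time.time() if now is None else now
    if timeouts.get(user_id, 0) > now:
        return False, "user is timed out"
    if _pattern.search(message):
        # Block the message and time the user out for review later.
        timeouts[user_id] = now + TIMEOUT_SECONDS
        return False, "message blocked: banned term"
    return True, "ok"
```

The key design point the article highlights is that this check runs before delivery, unlike third-party bots that can only delete a message after it has briefly appeared.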

Discord’s AutoMod is a cut above the user-made bots that portions of the Discord community have adopted for essentially the same purpose. Because those bots lack access to Discord’s inner workings, they can only delete harmful messages after they’ve appeared for at least a split second. Users also have to seek out bots and install them on their servers, whereas Discord’s AutoMod is built into the platform, making it more accessible.

However, rather than force bot-patrolled Discord communities to toss out their carefully curated keyword lists and start over, Discord is sharing the ability to delete messages before they go up with third-party developers, effectively upgrading user-made moderation bots as well.

“If they’re using tools that make their community safer and easier to manage, whether that’s through a third party or through us, it’s a win for us regardless,” said Wofford. He added that though Discord AutoMod shares functionality with Twitch’s variation on the theme, Discord took the bulk of its inspiration not from other platforms, but from “talking to our developers and our admins.”


This new safety-focused functionality comes at a time when Discord is under heightened scrutiny from both its own users and politicians like New York Attorney General Letitia James, following the revelation that the Buffalo shooter laid out his attack plans in advance on a small Discord server. Discord has since said he kept that server entirely private until just 30 minutes before the attack. Despite the timing, Wofford explained that AutoMod has been in the works for “quite some time” and that its launch is not a reaction to the tragedy.

Still, he said, the company’s trust and safety team — which works to alleviate larger, platform-wide issues involving harm and violence like the Buffalo shooting — is involved in many initiatives across Discord, AutoMod included.

“Cross-collaboration between our two teams is pretty constant,” said Wofford. “Even things like the predetermined list of words [AutoMod initially suggests for users to block] was worked through with our safety team and experts they have both internally and externally.”

But Discord’s suggested list is just the beginning. Memes are ever evolving, as are dog whistles, spam and scam messages, and purposeful misspellings of words that tiptoe past even the most vigilant filters. A completely innocent word in one Discord server might be a racist remark in another. Wofford and company hope AutoMod’s custom keyword lists will allow moderators to react so quickly and comprehensively that even creative trolls lose interest.
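One way a per-server custom keyword list might catch the purposeful misspellings mentioned above is to normalize common character substitutions before comparing. This is a toy sketch under that assumption — the substitution table, helper names, and matching strategy are illustrative, not Discord's method.

```python
# Map common "leetspeak" stand-ins back to letters before matching,
# so "ch34p g0ld" compares equal to "cheap gold".
SUBSTITUTIONS = str.maketrans("013457@$", "oleastas")

def normalize(text):
    """Lowercase, undo digit/symbol substitutions, drop separators."""
    text = text.lower().translate(SUBSTITUTIONS)
    return "".join(ch for ch in text if ch.isalpha())

def matches_server_list(message, server_keywords):
    """Check a message against one server's custom keyword list."""
    norm = normalize(message)
    return any(normalize(keyword) in norm for keyword in server_keywords)
```

Because each server supplies its own list, a word that is innocent in one community can still be flagged in another, which mirrors the per-server flexibility the article describes.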

“We’re investing just as much if not more effort on our side to be taking care of the safety issues even without the moderators,” he said. “But we also know that moderators often have the most context for what they do or don’t want in their community. And these types of tools in particular are about giving them more powerful ways to actually enact that for themselves.”

Discord’s newly released version of AutoMod is far from a finished product. Wofford said that as far as future features go, the team is considering more nuanced AutoMod functionality that might, for example, allow certain rules to go into effect only at specific times of day when moderators are away. But it’ll all depend on users.

“Once we release this, we want to understand how much people are actually utilizing it,” he said. “And if people aren’t utilizing it, what are the things that we’re missing?”