This past weekend, however, moderators appeared to be overzealous when deleting posts — particularly those identifying shooter Omar Mateen as a Muslim, and some non-controversial posts such as where people could give blood in the Orlando area. Things escalated further when members of the main Donald Trump community on Reddit began repeating accusations of censorship en masse and managed to dominate the site’s central page for displaying all Reddit content — known as “r/all” — with their accusations.
The situation exposed some uncomfortable truths about how small groups of Reddit users can affect what the entire community sees, and raised questions about
Reddit's reliance on volunteer moderators and its ability to act as the front page of the Internet.
To help address the issue, Reddit chief executive Steve Huffman said that the firm has tweaked the algorithm it uses to display what’s on the r/all page. Right now, the page
displays popular posts from nearly all the site’s subsections.
Under the new algorithm, Huffman said, repeatedly posting from the same community will make it less likely that posts from that specific community appear in the r/all listing.
“The gist of the way it works is that the more often a community is in the listing, its ‘hotness’ gets demoted a little bit,” Huffman said.
That, he said, should help keep small groups of users from dominating the site’s overall conversations and keep situations like this from snowballing out of control.
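The demotion Huffman describes can be sketched roughly as a ranking pass in which each additional post a community already has in the listing shrinks the effective "hotness" of its next one. The function name, the multiplicative decay, and its value below are all illustrative assumptions; Reddit has not published the actual formula.

```python
# Hypothetical sketch of the demotion idea: the more posts a community
# already has in the r/all listing, the more each of its later posts is
# penalized. The decay factor and structure are assumptions, not
# Reddit's actual implementation.

def rank_r_all(posts, decay=0.8):
    """Rank posts by hotness, demoting communities that repeat.

    posts: list of (community, hotness) tuples.
    decay: multiplier applied once per post already placed from the
           same community (assumed value).
    """
    seen = {}            # community -> posts already placed
    ranked = []
    remaining = list(posts)
    # Greedily pick the post with the highest effective hotness,
    # where effective hotness = hotness * decay ** (times seen).
    while remaining:
        best = max(
            remaining,
            key=lambda p: p[1] * decay ** seen.get(p[0], 0),
        )
        remaining.remove(best)
        ranked.append(best)
        seen[best[0]] = seen.get(best[0], 0) + 1
    return ranked
```

With a decay below 1, a community's second post must beat rivals by a widening margin, so a single subreddit flooding the listing gets spread out rather than stacked at the top.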
That change will take effect today. It is the first time Reddit has changed its r/all algorithm since 2008, said Chris Slowe, Reddit’s founding engineer. Reddit is also planning to make a few other changes in the future, including allowing all users to filter certain communities out of the r/all page.
The algorithm tweak is a very technical change to address a technical problem. But what about the cultural issue?
That’s a harder problem to solve, as it pits Reddit’s principles of free expression against its desire to moderate discussion on the site in any reasonable way. In this particular case, Huffman said that if he could go back in time, he would have had Reddit employees step in earlier.
“We would have stepped in right away and created a live thread,” Huffman said, referring to a tool the company has specifically designed for live events. These are also monitored by moderators but aren’t linked to particular subsections of the site.
The firm is also working to improve tools that allow Reddit employees to more easily contact the moderators of subreddits, in case the company itself needs to step in to arbitrate a situation.
Huffman said that Reddit is well aware that there can be problems with the way moderators are chosen, particularly because there's no real check on the control they have over what is displayed on the site. Some users have suggested that all moderators be elected, or otherwise chosen from within a community's users. However, Huffman said, the company still thinks that it should give individual communities the right to operate as they want.
“We still believe that Reddit should be operated so that communities can operate how they like — you can be as strict or as lenient as you like, as long as you’re not more lenient than Reddit’s rules,” he said.
While Huffman acknowledges that Reddit has work to do when it comes to balancing expression and a responsibility to present a (reasonably) unskewed picture of what’s happening online, he did note that these problems aren’t all specific to Reddit.
“I’ll tell you what happened on Reddit was very similar to what happened nationwide after the shooting — we had a small group of people who immediately politicized the issue,” he said. “We don’t want to see that in the world at all, but we have some control over Reddit.”