Twitter knows it has a problem with online abuse, and on Tuesday it announced three more changes it's making to help users deal with it.
The first change targets "whack-a-mole" abusers: users who return under new accounts after being suspended. Critics have long pointed to this as a consistent weakness in the network's anti-harassment policies, and Twitter has tried to address it before. In 2015, the company said it would start asking users suspended for abusive behavior to provide a phone number before reinstating their accounts, a phone number being at least somewhat harder to obtain than an email address.
Twitter did not elaborate in the post on how it will expand its efforts to crack down on repeat abusers, nor did it give an exact timeline for when users will see this feature.
The company also said it is working on altering its search function so that tweets containing "potentially sensitive" content — messages that may contain, for example, violence or nudity — won't show up in a normal search. The altered search function will also ignore messages from people a particular user has blocked or muted.
"While this type of content will be discoverable if you want to find it, it won’t clutter search results any longer," the company's post said, although it didn't say when users will see this on their own accounts.
The third change is a filter of sorts for conversations. By default, Twitter will soon show what it has judged to be the most relevant replies to a tweet. Replies that Twitter believes may be "potentially abusive" or "low-quality" will be hidden behind an expandable bar labeled "Less Relevant Replies."
The hope is that the most thoughtful and relevant replies will rise to the top. But if you still want to see the spam, abuse, or other messages that don't make the cut, you can view them by expanding the conversation all the way.
The ability to weed out tweets by relevance will roll out in the "coming weeks," Twitter said.
Twitter has touted itself for years as a network that supports freedom of expression, which can put it in a tricky position when trying to judge what constitutes abuse. But the social network has responded to criticism that it hasn't moved quickly enough to fix its abuse problems. Twitter's vice president of engineering, Ed Ho, made this clear last month in a series of tweets that included a promise to move faster to deal with these issues.
Ho said that he and his colleagues know that Twitter users want them to do more to combat online abuse — and fast. "We're thinking about progress in days and hours not weeks and months," he said.