Facebook said Wednesday that it is taking more steps to crack down on its network's “revenge porn” problem, including a new process that prevents users from reposting intimate images shared without the subject's consent.
Victims of nonconsensual porn often find it difficult to get images of themselves scrubbed from the Internet, because copies can easily resurface elsewhere even after they have been taken down. Photo-matching software will allow Facebook to prevent photos that have already been removed from surfacing again, at least on its own site.
“If someone tries to share the image after it's been reported and removed, we will alert them that it violates our policies and that we have stopped their attempt to share it,” Antigone Davis, Facebook's head of global safety, said in a post announcing the change.
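Facebook has not described its matching technology in detail. The general idea behind photo-matching systems of this kind is to compute a compact "fingerprint" of a removed image and compare new uploads against it, so that near-identical copies can be flagged even if they are resized or recompressed. The sketch below illustrates that idea with a simple difference hash; the file names, threshold, and function names are illustrative assumptions, not Facebook's actual system.

```python
# Illustrative sketch of photo-matching via a perceptual "difference hash".
# This is not Facebook's implementation; it only shows how a removed image
# can be fingerprinted so near-duplicate re-uploads are detected.
from PIL import Image  # pip install pillow


def dhash(path, hash_size=8):
    """Return a 64-bit perceptual hash of the image at `path`."""
    # Shrink to a tiny grayscale grid so minor edits don't change the hash.
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits


def hamming(a, b):
    """Number of bits that differ between two hashes."""
    return bin(a ^ b).count("1")


# Hypothetical store of hashes for images already reported and removed.
removed_hashes = {dhash("reported_image.jpg")}


def blocks_upload(path, threshold=5):
    """True if the new upload is a near-duplicate of a removed image."""
    h = dhash(path)
    return any(hamming(h, r) <= threshold for r in removed_hashes)
```

In a sketch like this, a small Hamming-distance threshold tolerates recompression or slight resizing while still rejecting unrelated photos; production systems would use more robust fingerprints and a large-scale index rather than an in-memory set.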
Policing such images has proved to be a difficult task for Facebook and other social networks that deal with a flood of posts each day and have not always consistently enforced their policies on graphic images. Twitter also says that posting revenge porn on its network violates its community standards. Both companies have supported legislation that would make it illegal to distribute these kinds of images.
About 4 percent of U.S. Internet users — 10.4 million people — have been victims of revenge porn or threatened with the posting of explicit images, according to a 2016 study by the Data and Society Research Institute.
Facebook's policies on revenge porn have come into sharp focus after members of the Marine Corps were found to be sharing nude pictures of female Marines, without permission, in a private Facebook group.
After those violations became public, many called on Facebook to clean up its act. "[With] US revenge porn law still a patchwork of difficult-to-enforce statutes, it’s increasingly incumbent on Facebook to come up with the solution itself,” wrote Wired's Emma Grey Ellis in March. She said that Facebook should devise a more proactive solution to tackling revenge porn, rather than simply reacting to reports.
The policy has its pros and cons, said Soraya Chemaly, the founder of the Safety and Free Speech Coalition. One big positive, she said, is that Facebook is expanding its efforts to combat harassment beyond text. "I think that attempt to incorporate responses to photography and its uses and abuses are a good thing," she said.
Chemaly would like to see more transparency from Facebook regarding how it will judge the tricky questions of what constitutes consent and who owns images that are posted. "I think this is a positive step, if you have to be in the business of moderating content," she said.
Companies can have a tricky time censoring user content in general, said Danielle Citron, a University of Maryland law professor who has researched online harassment. In this case, Citron said, the parameters of what constitutes "revenge porn" make it clearer to judge what is appropriate and what isn't than, for example, having to pass similar judgment on what counts as acceptable speech.
But, she said, she appreciated that Facebook is taking its time to respond to users' concerns. "They're moving slowly because they want to get it right, and don’t want to over-censor," Citron said.
The new process will go into effect on Facebook, Facebook Messenger and Instagram. Facebook said the new reporting step will feel familiar to those who have flagged posts before.
Users will be able to report images by clicking the “Report” link, which appears when they tap the down arrow — or "…” on mobile — in the upper right-hand corner of a post. The social network will then refer the image to its community operations team, where employees will review it and determine whether to remove it.
Those who've posted photos can appeal Facebook's decision.
The company said Wednesday that it worked with several groups to develop its new policies, including the Cyber Civil Rights Initiative — a group founded by Holly Jacobs, who had been a victim of revenge porn. Davis said that Facebook has worked with the group to create a “one-stop shop” for reporting revenge porn images posted on multiple sites.
“We convened over 150 safety organizations and experts last year in Kenya, India, Ireland, Washington DC, New York, Spain, Turkey, Sweden and the Netherlands to get feedback on ways we can improve,” the company said.
Rep. Jackie Speier (D-Calif.), who has sponsored legislation that would make revenge porn illegal, said in a statement that Facebook's “new tools are a huge advancement in combating nonconsensual pornography and I applaud Facebook for their dedication in addressing this insidious issue.”
While the feature may prevent reported images from spreading through Facebook, it still depends on users reporting the images — which may not happen often in closed groups such as Marines United, where users have created a community to share these types of images. Chemaly said that the move to private sharing, over messaging services as well as within private groups, is a broader concern across the Web.
Davis said Facebook's privacy tools will continue to evolve.