Facebook explains reporting tool

Ever wonder what happens after you click “Report/Mark as Spam” on Facebook?

On Tuesday, the social network released an infographic that explains that process in a little more detail, drawing back the curtain on what happens when users file reports with the site.

There are a few cases in which users might want to use Facebook’s reporting tool. For example, a user may want to file a report after seeing spam or offensive content, or after noticing a hacked or fake profile.

The company has several teams that deal with different kinds of content, according to the graphic and an accompanying blog post. There’s a Safety team, a Hate and Harassment team, an Abusive Content team and an Access team, each of which handles a specific type of reported material.

The infographic didn’t go into much detail about how the teams assess content, but it did say that in some cases teams check potential violations against Facebook’s community guidelines and, where there is a credible threat of violence, bring matters to the attention of law enforcement.

Other reports, such as accusations of harassment or spam, are put to the accused users so they can offer a rebuttal.

Facebook also said that in cases of harassment, or when someone speaks about self-harm on the network, its social reporting tool, a way for users to report problems to others in their network, should be users’ official channel.

“The safety and security of the people who use our site is of paramount importance to everyone here at Facebook. We are all working tirelessly at iterating our reporting system to provide the best possible support for the people who use our site,” the Facebook Safety team said in a company blog post.

Safety and security on the social network have come into sharp focus as the company continues to consider plans to allow children younger than 13 onto the site. Consumer groups have already urged Facebook to ban advertising aimed at children 12 and under if it goes through with the plan, and others have raised concerns about the effect cyberbullying could have on kids that young.

Mashable unscientifically polled its readers on their opinions about Facebook for a younger set and found that 78 percent would not approve of it, with many parents citing bullying as a concern.

(The Washington Post Co.'s chairman and chief executive, Donald E. Graham, is a member of Facebook's board of directors.)
