Facebook now has 1.94 billion monthly users, an increase driven by mobile growth, the company said Wednesday as it released another strong earnings report. But questions loom about whether the social network can adequately deal with some of the content posted by its growing audience.
Chief executive Mark Zuckerberg stressed the importance of creating a safe network in a news release on the earnings. “We’re continuing to build tools to support a strong global community,” he said.
Video, both live and in posted clips, is crucial to Facebook's future as the company looks to video ads to make up for an expected slowdown in revenue growth. But Facebook has had to grapple with the dark side of video as users widely shared several graphic videos on its network in recent months, including a spate of live-streamed suicides and rapes and the real-time confession of a man who had posted a video of himself gunning down a Cleveland man.
In an earlier Facebook post Wednesday, Zuckerberg said the social network is adding 3,000 workers to its “community operations” team, which fields reports from users who flag inappropriate material on the site. The hires would bring the company's global team to 7,500 workers.
The new reviewers “will also help us get better at removing things we don't allow on Facebook like hate speech and child exploitation,” Zuckerberg said. Facebook will keep working with community groups, such as suicide prevention organizations, and with law enforcement to reach people who post or appear in the videos and may need help, he said.
Merely adding staffers to react to inappropriate videos will not fix the underlying problem of users posting them in the first place, something even Zuckerberg acknowledged during a call with analysts Wednesday. “No matter how many people we have on the team, we’ll never be able to look at everything,” he said.
But with the addition of these 3,000 employees, Facebook hopes to shorten the gap between when someone reports a violent or inappropriate video and when the company takes it down. Facebook declined to comment on where the workers will be stationed. The company won't say how many Facebook Live videos are posted each day but has confirmed that one in five videos on the site is a live broadcast.
Until Wednesday, Zuckerberg had said little about these violent incidents; at the company's annual conference last month, he expressed sympathy for those affected by the crimes live-streamed on Facebook's platform.
The company has faced heavy criticism for not taking sufficient measures to vet and respond to inappropriate content streamed on the social network. That criticism came into sharp focus after Steve Stephens, a man suspected of killing 74-year-old Robert Godwin Sr. in Cleveland on Easter weekend, posted a video of the fatal shooting. Stephens later committed suicide.
Several other disturbing video incidents have drawn media attention. Chicago police charged four adults with hate crimes in January after they live-streamed themselves on Facebook beating a man. Police in Chicago also investigated the alleged sexual assault of a 15-year-old girl that was streamed on Facebook in March. More recently, a Thai man killed his infant daughter on Facebook Live late last month before killing himself. That same week, Markeice “Mari” Brown, 20, killed himself shortly after his pregnant girlfriend committed suicide; his suicide note, accompanied by a Facebook Live video, gained widespread attention.
In every case, the videos went viral, raising questions about Facebook's ability to take down such content before it spreads across the network.
The company's review team is on call 24 hours a day to handle complaints from users who have flagged content as offensive. But even with violent videos, it can be difficult to determine what is appropriate and what is not. The process can take time, giving the videos a chance to gain traction. The video of the Chicago beating, for example, remained online for 30 minutes.
For Facebook, the tricky issue is determining the video's context, which plays a large role in whether the company will allow it to remain online. As Facebook said in a July blog post:
For instance, if a person witnessed a shooting, and used Facebook Live to raise awareness or find the shooter, we would allow it. However, if someone shared the same video to mock the victim or celebrate the shooting, we would remove the video.
The company is also working on artificial intelligence tools to automatically flag inappropriate videos, but it's not clear how well equipped those algorithms will be to navigate sensitive situations. Zuckerberg said he thinks the technology is still years away from where it needs to be.
Experts said Facebook's hiring decision is a good step toward the maturation of Facebook Live. The feature has also had positive effects, particularly for political speech and for shedding light on issues such as police brutality. Those uses, experts said, are worth saving.
“The worry I have is that Facebook will ultimately say, ‘This isn’t worth it to us,’ and we’ll lose it as this larger tool for society and civic engagement moving forward,” said Benjamin Burroughs, an assistant professor at the University of Nevada, Las Vegas, who specializes in streaming media.