In the wake of two fatal shootings that streamed live to millions on Facebook, the social network is further muddying its policies on what it will and will not broadcast on its Live platform.
Per a statement released this afternoon and intended to clarify the site’s approach to live video, Facebook “understand[s] the unique challenges” of the medium and knows “it’s important to have a responsible approach.” For that reason, the statement continues, moderation staff members reserve the right to make subjective, context-based judgments on whether particular types of graphic footage are appropriate.
The company gives the example of someone streaming violent or graphic images, as happened both Wednesday and Thursday this week. In some situations those images will be permitted, and in others they will not — a distinction left to Facebook’s opaque editorial discretion.
“For instance, if a person witnessed a shooting, and used Facebook Live to raise awareness or find the shooter, we would allow it,” the company said. “However, if someone shared the same video to mock the victim or celebrate the shooting, we would remove the video.”
Facebook Live, which launched globally in April, has quickly emerged as one of the Internet’s dominant platforms for streaming unfiltered, real-time video. As Facebook has learned in the past week, however, that status comes with unique challenges.
Real-time video is exceedingly difficult to moderate, as it reaches its largest audience instantaneously and can be removed only after that moment of impact. That limits the power of even a dedicated, 24/7 moderation team, which Facebook Live has. Despite growing concern that the tool could be abused — several shootings, a police standoff and an accused jihadist’s confession have streamed on Facebook already — the company has remained intentionally (and characteristically) vague on the composition and guidelines of its moderation team. As at most other major social media companies, such workings are kept opaque both to foil mischief-makers who would try to find loopholes, and to avoid outside scrutiny.
But moderation arguably needs to be scrutinized, particularly as more people stream significant breaking news events to Facebook and, thus, grant the social network an almost unprecedented amount of civic power. Activists have cheered live-streaming as a means to speak directly to the public, free of traditional gatekeepers such as the news media. But Facebook itself is also a gatekeeper, and one without an explicit civic mission or responsibility. (In fact, as the media know all too well, Facebook privileges news only when it stands to benefit itself.)
That may become important as live-streaming grows more mainstream. On Wednesday, Diamond Reynolds live-streamed the gory aftermath of a routine traffic stop in which a police officer shot and killed her fiancé. Facebook briefly removed archived footage of the video, a deletion it has said was accidental.
One day later, several protesters at a Black Lives Matter rally in Dallas caught the unfolding terror of a sniper opening fire on police in the area. Michael Kevin Bautista’s stream of the attack has been viewed more than 5.4 million times; in the aftermath, two people wrongly accused of being suspects also declared their innocence on Facebook Live.
Moderators allowed those videos, of course. But given the inconsistency with which Facebook already enforces its community standards, and the further leeway it appears to be giving itself in the case of live video, there is little indication of what to expect in the future.