But as the video's view count was mounting overnight, Facebook removed it — prompting an uproar among some users and raising questions about how social-media sites should approach graphic content as they increasingly serve as real-time news and accountability tools.
The video was later reinstated and has now been viewed nearly 2.5 million times on the social network. Facebook blamed the video's disappearance, which lasted about an hour, on a "technical glitch" and said it was restored as soon as the company was able to investigate the issue.
"We're very sorry that the video was temporarily inaccessible," Facebook said in a statement to The Washington Post.
The details of the glitch aren't clear. But how Facebook determines what its users should or should not see on the site has become increasingly important as more and more of its 1.6 billion users get their news through the social network. According to a recent Pew Research survey, two-thirds of American Facebook users say they use the site to get news.
The social network's community standards say that Facebook will remove posts featuring violence or graphic content when they are shared for "sadistic pleasure or to celebrate or glorify violence." But they also note that people use Facebook to call attention to issues that are important to them — and in cases such as human rights abuses, that may involve sharing unsettling photos or videos.
"In many instances, when people share this type of content, they are condemning it or raising awareness about it," the community standards say.
The site also asks people to warn their audience when they share posts that include graphic violence. Facebook attached such a warning to the reinstated version of Reynolds's video.
Facebook has historically relied on users to help flag posts that they think are inappropriate, with human moderators deciding whether content should be removed. (These are often contractors overseas in places such as the Philippines.)
But the launch of Facebook Live earlier this year has brought new urgency to the question of how the site should approach moderation as tragedies unfold in real time.
Even before Reynolds's video, Facebook was grappling with where to draw the line. For example, in June the site removed a live video made by a French Islamic State sympathizer who was shot after he killed his romantic partner and a police officer.
After the Chicago video, Facebook expanded the team that reviews live content, although it did not say how many people are now part of the team. The company said its workers use a variety of factors, including the number and type of user reports and the number of viewers, to determine which videos to review first. It also plans to monitor some live broadcasts once they reach a certain popularity threshold and will take action — including interrupting the stream — if it sees a violation of its community standards.
"We do understand and recognize that there are unique challenges when it comes to content and safety for Live videos," Facebook said in the statement Thursday to The Post. "It's a serious responsibility, and we work hard to strike the right balance between enabling expression while providing a safe and respectful experience."