The Washington Post
Democracy Dies in Darkness

Facebook removed 1.5 million videos of the Christchurch attacks within 24 hours — and there were still many more

Police said they will provide “a highly visible” presence when New Zealanders return to daily life three days after an attack on two mosques killed 50 people. (Video: Monica Akhtar, Allie Caren, Drea Cornejo, Sarah Parnass, Taylor Turner/The Washington Post)

WELLINGTON, New Zealand — Facebook said that it removed 1.5 million videos of the shooting rampage at two mosques in Christchurch within 24 hours of the attack, underscoring the massive game of whack-a-mole social media giants must play with even the most high-profile problematic content on their platforms.

In a statement, Mia Garlick, spokeswoman for Facebook New Zealand, said that the company continues to “work around the clock to remove violating content from our site, using a combination of technology and people.” Of the 1.5 million videos of the massacre, filmed by a body-worn camera on the perpetrator almost in the style of a video game, 1.2 million were blocked at upload. 

Facebook’s statement came after New Zealand Prime Minister Jacinda Ardern said in a Sunday news conference that there were “further questions to be answered” by Facebook and other social media sites over their response to the events. 

Live updates: In wake of massacre, New Zealand debates gun laws

Ardern said that her country had done as much as it could to “remove or seek to have removed some of the footage” circulated in the aftermath of the attack but that ultimately it has been “up to those platforms.” 

When the horror began Friday morning in New Zealand, alleged shooter Brenton Harrison Tarrant’s Facebook followers were the first to know. He live-streamed his assault, from the time he started driving over to Al Noor mosque to the moments when he fired his first shots. 

Many hours later, and long after he and other suspects had been arrested, others were still uploading the video to YouTube and other online video platforms. A Washington Post search of keywords related to the event, such as “New Zealand,” surfaced a long list of videos, many of which were lengthy and uncensored views of the massacre.

And though Facebook, Instagram and Twitter have all removed Tarrant’s accounts, dozens of archived versions remain available, along with the links and videos he shared. 

In Brenton Harrison Tarrant’s Australian hometown, his relatives remember violent video games, trouble with women

Facebook says that it is using audio technology to detect more versions of the video, allowing it to catch footage even when there isn’t an exact match to the full version streamed by Tarrant. 

On Sunday, the New Zealand government informed online platforms that sharing any version of the footage, even edited, non-graphic versions, is a violation of the law. Facebook says that since the attack, teams have also been working to remove content in support of the massacre and other hateful posts. 

The restrictions have also applied to news media. Local media reported that Sky News Australia was pulled off New Zealand broadcaster Sky TV for airing “distressing footage.” 

Ardern acknowledged that the problem of hate speech and the difficulty of controlling the proliferation of violent videos was a global problem. 

“But it doesn’t mean we can’t play an active role in seeing it resolved,” she added. 

As the attack unfolded, the youngest victim — a 3-year-old boy — ran toward the gunman

‘Let’s get this party started’: New Zealand shooting suspect narrated his chilling rampage

Pakistani man who tried to stop shooter to be given posthumous national award
