The militant group Islamic State of Iraq and Syria (ISIS) released a video on Tuesday that appeared to show the beheading of American photojournalist James Foley. Shortly afterward, Radio Sawa's Washington correspondent Zaid Benjamin, who seems to have broken the news of the execution, had his Twitter account temporarily blocked and a tweet containing the video deleted.

A White House official told the Washington Post late Tuesday that "officials from the Departments of State and Defense reached out to relevant social media sites to inform them of the video and requested that they take appropriate action consistent with their stated usage policies."

But asked about the removal, a Twitter spokesperson said the company "do[es] not comment on individual accounts, for privacy and security reasons" and directed a reporter to its stated policy on removing media concerning the deceased at the behest of family members.

"In order to respect the wishes of loved ones," the Twitter policy reads, "Twitter will remove imagery of deceased individuals in certain circumstances. Immediate family members and other authorized individuals may request the removal of images or video of deceased individuals, from when critical injury occurs to the moments before or after death, by sending an e-mail to privacy@twitter.com. When reviewing such media removal requests, Twitter considers public interest factors such as the newsworthiness of the content and may not be able to honor every request."

Asked whether a family member or representative had in fact requested the removal of the video and photos in this case, the Twitter spokesperson said that confirmation of any requests would have to come from the Foley family.

Still, scores of horrifying images from the Foley video continued circulating on Twitter Tuesday night. The situation points to the complexity of serving as a social media platform at a time when sharing happens at the speed of light. Openness is in Twitter's DNA: the platform largely took hold in the public consciousness when it played a role in Iran's post-election uprisings in 2009, and it has served as a key medium for distributing news and outrage amid the ongoing protests over the shooting death of Michael Brown in Ferguson, Mo. By all appearances, the company embraces its role as an open publishing platform for journalists and non-journalists alike, and the very viral abundance that makes Twitter so powerful -- half a billion tweets are sent each day -- makes it difficult to police.

Andrew McLaughlin was the director of global public policy for Google from 2004 to 2009 and served as a deputy chief technology officer of the United States from 2009 to 2011. Says McLaughlin of Twitter, "They start from a free speech, pro-news bias."

"The dilemma is that you're one, a human being, and you don't want to be a jerk. So you can imagine that if you're a family member of this person, by all means you would want the horrific photos of their moment of death taken offline. But, second, the photos are obviously newsworthy. It's awful that these photos were taken, and it's awful that this moment happened, but their very existence is news. It's the sort of thing that moves history."

Companies like Twitter, says McLaughlin, "have to calibrate between those two pressures."

With Twitter as a middleman, the company is in the position of deciding whether photos like those of Foley's terrible last moments stay or go. Twitter has chosen to leave that call to his family, and to McLaughlin that makes some sense. It's they who would likely be affected most. And even given the horror of the moments the images capture, McLaughlin isn't sure the decision is an easy one. "Some families might want the photos to spread far and wide," he says, "to spark worldwide condemnation."

Last February, Google was pressed to take down a pair of videos from its YouTube service that circulated under the banner of "The Innocence of Muslims" and were seen as offensive to the prophet Muhammad.

But even once the judgment is made to scrub content, Twitter's circumstances might make acting on that decision more complex. Google, says McLaughlin, has developed advanced image analysis tools, including ones capable of searching for a unique "hash" that can serve as a fingerprint of a video or a photo. Google faced frequent complaints over child pornography and copyright, but McLaughlin says he isn't sure that Twitter has developed the same capacities.
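To make the idea concrete, here is a minimal sketch, in Python, of the kind of fingerprint matching McLaughlin describes: compute a digest of an uploaded file and check it against a list of digests already flagged for removal. The function names and the blocklist are hypothetical, not Twitter's or Google's actual tooling, and the exact-match cryptographic hash shown here would only catch byte-identical copies; production systems rely on perceptual hashes that survive re-encoding, cropping and resizing.

```python
import hashlib

# Hypothetical blocklist of digests for media already flagged for removal.
KNOWN_BAD_HASHES = {
    # entries would be full hex digests of previously flagged files
}

def fingerprint(file_bytes: bytes) -> str:
    """Return a SHA-256 digest of the raw file bytes."""
    return hashlib.sha256(file_bytes).hexdigest()

def should_block(file_bytes: bytes) -> bool:
    """Exact-match check against the blocklist.

    Only catches byte-identical re-uploads; real systems use perceptual
    hashing to recognize re-encoded or cropped copies of the same media.
    """
    return fingerprint(file_bytes) in KNOWN_BAD_HASHES
```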

What's more, even though Twitter and YouTube might appear a lot alike, their structures are very different. Photos on Twitter are often shared via third-party services like TwitPic, Imgur and yfrog. Those images display "in-line" in tweets but are hosted on non-Twitter servers. "Twitter is more like a messaging service with lots of links to things and some hosted content," says McLaughlin, "where YouTube is all hosted content." It isn't impossible to monitor linked services, says McLaughlin, but "it's a fantastical amount of processing power you would have to apply."

Zelda Williams, daughter of Robin Williams, recently faced offensive, faked images on Twitter and the photo-sharing service Instagram after her father's death by suicide, including ones altered to show bruises around his neck. "We will not tolerate abuse of this nature on Twitter," Twitter's vice president of trust and safety, Del Harvey, said in a statement at the time. But the Foley videos and photos aren't abusive per se, and a quick glance at Twitter reveals that often those sharing them are well-meaning tweeters.

For the moment, at least, the burden of avoiding the worst of the Foley images may fall on users themselves. The hashtag #ISISMediaBlackout has begun trending as a way of rallying around the idea of not sharing videos and images distributed by the group. And after the Malaysia Airlines crash in Ukraine, the journalism site Poynter offered tips for avoiding sensitive photos, such as turning off the automatic display of images in your feed.

David Nakamura contributed to this report. 