By Peter Whoriskey
Washington Post Staff Writer
Friday, September 12, 2008
The video-sharing service YouTube is banning submissions that involve "inciting others to violence," following criticism from Sen. Joseph I. Lieberman (I-Conn.) that the site was too open to terrorist groups disseminating militant propaganda.
The company earlier this year removed some of the videos that Lieberman targeted, many of which were marked with the logos of al-Qaeda and affiliated groups. But the company refused to take down most of the videos on the senator's list, saying they did not violate the Web site's guidelines against graphic violence or hate speech.
Now that videos inciting others to violence are banned, more videos by the terrorist groups in question may be removed.
"YouTube reviews its content guidelines a few times a year, and we take the community's input seriously," YouTube spokesman Ricardo Reyes said. "The senator made some good points."
"YouTube was being used by Islamist terrorist organizations to recruit and train followers via the Internet and to incite terrorist attacks around the world, including right here in the United States," Lieberman said in a statement. "I expect these stronger community guidelines to decrease the number of videos on YouTube produced by al-Qaeda and affiliated Islamist terrorist organizations."
The standoff between the senator and the nation's largest video-sharing site revived arguments that have become commonplace since Sept. 11, 2001: It pitted civil rights -- in this case, free speech -- against demands to crack down on terrorism.
In May, Lieberman issued a bipartisan report by the Senate Committee on Homeland Security and Governmental Affairs staff that described how al-Qaeda created and managed its online media.
Later that month, Lieberman wrote a letter to officials at Google (which owns YouTube) demanding that the company "immediately remove content produced by Islamic terrorist organizations from YouTube. This should be a straightforward task since so many of the Islamist terrorist organizations brand their material with logos or icons."
He also asked Google to explain what changes would be made to YouTube's guidelines to address "violent extremist material."
Because the volume of videos uploaded to YouTube is vast -- hundreds of thousands every day -- the company says it cannot monitor what gets posted. Instead, it relies on users to flag videos that violate its "Community Guidelines."
When the company removed videos after Lieberman's request in May, it did so because they violated its existing guidelines prohibiting graphic violence and hate speech. Some of the videos depicted violent attacks on U.S. soldiers in Iraq and Afghanistan.
But most of the videos highlighted by Lieberman were not removed.
"While we respect and understand his [Lieberman's] views, YouTube encourages free speech and defends everyone's right to express unpopular points of view," the company said in a statement at the time.
The company's stance now appears to have changed.
Exactly which videos will be deemed to be "inciting others to violence" will be decided on a case-by-case basis, though First Amendment experts said the company could run into trouble if the phrase is interpreted too broadly.
"We subscribe to the common sense rule," Reyes said. "Our guidelines are not written for lawyers."