Once again, a self-made video, uploaded to the world via YouTube, is at the heart of a horrific news event.

Elliot Rodger, the 22-year-old man who authorities say killed six people in the Santa Barbara, Calif., area before fatally shooting himself late Friday, posted at least two self-pitying videos to the video-sharing site shortly before he went on his rampage.

The videos — in which Rodger calmly and chillingly discusses his sexual frustrations and intent to “slaughter” those he claims harmed him — were removed by YouTube after viewers flagged them. But they were repeatedly re-posted on the site as copies spread across the Internet.

YouTube, which is owned by Google, has dealt with controversial user-generated videos since shortly after the site was founded in 2005. It bans most explicit or pornographic material uploaded by users, as well as “hate” speech or direct threats against individuals. But homemade videos or news footage involving acts of violence — street fights and car chases, primarily, but also historic events such as the Zapruder film of President John F. Kennedy’s assassination — are a staple among the millions of short clips it hosts.

One of the site’s most infamous uploads is the 13-minute trailer for “Innocence of Muslims,” a crudely made movie attacking the origins of Islam. The video was largely ignored when it was posted in English on YouTube in July 2012, but it sparked protests and riots across the Muslim world in September of that year, after it was translated into Arabic.

The Obama administration initially blamed the trailer video for inciting the attacks on a U.S. government facility in Benghazi, Libya, that left four Americans dead, including U.S. ambassador J. Christopher Stevens. YouTube voluntarily removed the video in a number of Muslim countries but declined to remove it from its main site, saying it did not qualify as “hate” speech.

A YouTube spokeswoman, Samantha Smith, declined to say when Rodger’s videos were posted, or when they came to the company’s attention. Videos are routinely flagged by YouTube’s users; the company reviews videos that have raised concerns and removes them if they violate its community guidelines.

Among other things, the guidelines prohibit videos displaying “predatory behavior, stalking, threats, harassment [and] intimidation . . . and inciting others to commit violent acts. . . . Anyone caught doing these things may be permanently banned from YouTube.”

Law enforcement authorities in Santa Barbara said they are analyzing Rodger’s videos, which he apparently had been posting online since 2012.

“Our hearts go out to the families affected by this terrible news,” Smith said in a statement. “Videos threatening violence are against YouTube’s guidelines and we remove them when they are flagged.”