Free Hess, a pediatrician and mother, had learned about the chilling videos over the summer when another mom spotted one on YouTube Kids.

She said that minutes into the clip from a children’s video game, a man appeared on the screen — offering instructions on how to commit suicide.

“I was shocked,” Hess said, noting that since then, the scene has been spliced into several more videos from the popular Nintendo game Splatoon on YouTube and YouTube Kids, a video app for children. Hess, from Ocala, Fla., has been blogging about the altered videos and working to get them taken down amid an outcry from parents and child health experts, who say such visuals can be damaging to children.

One on YouTube shows a man pop into the frame. “Remember, kids,” he begins, holding what appears to be an imaginary blade to the inside of his arm. “Sideways for attention. Longways for results.”

“I think it’s extremely dangerous for our kids,” Hess said about the clips Sunday in a phone interview with The Washington Post. “I think our kids are facing a whole new world with social media and Internet access. It’s changing the way they’re growing, and it’s changing the way they’re developing. I think videos like this put them at risk.”

A recent YouTube video viewed by The Post appears to include a spliced-in scene showing Internet personality Filthy Frank. It’s unclear why he was edited into these clips, but his fans have been known to put him in memes and other videos. There is a similar video on his channel filmed in front of a green screen, but the origins and context of the clip in question are not clear.

Andrea Faville, a spokeswoman for YouTube, said in a written statement that the company works to ensure that it is “not used to encourage dangerous behavior and we have strict policies that prohibit videos which promote self-harm.”

“We rely on both user flagging and smart detection technology to flag this content for our reviewers,” Faville added. “Every quarter we remove millions of videos and channels that violate our policies and we remove the majority of these videos before they have any views. We are always working to improve our systems and to remove violative content more quickly, which is why we report our progress in a quarterly report [transparencyreport.google.com] and give users a dashboard showing the status of videos they’ve flagged to us.”

The videos come amid mounting questions about how YouTube, the world’s largest video-sharing platform, monitors and removes problematic content.

YouTube has long wrestled with how to keep the platform free from such material — removing hateful and violent videos, banning dangerous pranks and cracking down on child sexual exploitation. As The Post’s Elizabeth Dwoskin reported last month, YouTube announced that it was redoing its recommendation algorithm to prevent it from prompting videos that include conspiracy theories and other bogus information, though the videos would remain on the site.

Hess said she has been writing about the distressing video clips on her blog, PediMom, to raise awareness and to get them removed from the platform.

Earlier this month, she found a second one — this time on YouTube.com. She recorded it, wrote about it and reported the content to the video-sharing platform, she said. The video was taken down.

Another version was reposted Feb. 12, receiving more than 1,000 views before it, too, was removed from the site.

Hess said the doctored Splatoon videos are not the only ones pushing dark and potentially dangerous content on social media platforms, particularly on YouTube Kids. In a blog post last week, Hess alerted other parents to numerous concerning videos she said she found on the app — a Minecraft video depicting a school shooting, a cartoon centered on human trafficking, a video about a child who committed suicide by stabbing and another about a child who attempted suicide by hanging.

Nadine Kaslow, a past president of the American Psychological Association, told The Post that it is a “tragic” situation in which “trolls are targeting kids and encouraging kids to kill themselves.”

Kaslow, who teaches at Emory University School of Medicine, said that some children may ignore the grim video content but that others, particularly those who are more vulnerable, may be drawn to it. She said such videos can cause children to have nightmares, trigger bad memories about people close to them who have killed themselves or even encourage them to try it, though some of them may be too young to understand the consequences.

Kaslow said parents should monitor what their children do online and tech companies should ensure such content is removed. Still, she said, it’s not enough.

“I don’t think you can just take them down,” she said about the videos. “For children who have been exposed, they’ve been exposed. There needs to be messaging — this is why it’s not okay.”

Kaslow said that though parents should talk to their children about the videos, YouTube Kids also should address the issue, explaining to children what the videos were and why children should never harm themselves.

She added that there should be “serious consequences” for those who had a hand in the videos, noting that it was “very worrisome” that they were targeting children.

According to the Centers for Disease Control and Prevention, risk factors associated with suicide may include mental disorders such as clinical depression, previous suicide attempts, a barrier to accessing mental health treatment, physical illness and feelings of hopelessness or isolation. Those who need help, including children, can call the National Suicide Prevention Lifeline at 1-800-273-TALK.
