Hess said that a few minutes into the children's video game clip, a man appeared on the screen, giving instructions on how to commit suicide.
"I was shocked," said Hess, noting that the scene has since been spliced into several other videos of the popular Nintendo game "Splatoon" on YouTube and YouTube Kids, a video app aimed at children. Hess, of Ocala, Fla., has blogged about the doctored videos and has tried to get them taken down amid an outcry from parents and child health experts, who say such imagery could be harmful to children.
In one video on YouTube, a man pops into the frame. "Remember, kids," he begins, miming a cutting motion on the inside of his forearm. "Sideways for attention. Longways for results."
"I think it's extremely dangerous for our kids," Hess said of the clips in a phone interview with The Washington Post. "I think our children are facing a whole new world with social media and Internet access, and that is changing the way they grow and develop, and I think videos like this put them in danger."
A recent YouTube video viewed by The Post appears to include a spliced scene showing the Internet personality Filthy Frank. It is unclear why he was spliced into these clips, but his fans are known for inserting him into memes and other videos. There is a similar video on his channel filmed in front of a green screen, but the background and context of the clip in question are unclear.
Andrea Faville, a spokeswoman for YouTube, said in a written statement that the company is working to make sure YouTube "is not used to encourage unsafe behavior and that we enforce strict policies prohibiting videos that promote self-harm."
"We rely on both user flagging and smart detection technology to flag this content for our reviewers," Faville added. "Every quarter, we remove millions of videos and channels that violate our policies, and we remove the majority of these videos before they are seen. We are always working on improving our systems and removing violating content more quickly, which is why we report our progress in a quarterly report (transparencyreport.google.com) and give users a dashboard showing the status of videos they've reported to us."
The videos come amid growing questions about how YouTube, the world's largest video-sharing platform, monitors and removes problematic content.
YouTube has long struggled to police the content on its platform: removing hateful and violent videos, banning dangerous pranks and fighting the sexual exploitation of children. As The Post's Elizabeth Dwoskin reported last month, YouTube announced that it was rebuilding its recommendation algorithm to stop it from suggesting videos containing conspiracy theories and other false information, even though those videos would remain on the site.
Hess said she has written about the disturbing video clips on her blog, PediMom, to raise awareness and get them removed from the platform.
Earlier this month, she found a second one, this time on YouTube. She recorded it, wrote about it and reported the content to the video-sharing platform, she said. The video was taken down.
Another version was re-uploaded on Feb. 12 and was viewed more than 1,000 times before it was removed from the site.
Hess said she has found not only the doctored "Splatoon" videos but other dark, potentially dangerous content on social media platforms, especially YouTube Kids. In a post published last week on her blog, Hess alerted other parents to several videos she had found on the app: a "Minecraft" video depicting a school shooting, a cartoon centered on human trafficking, a clip featuring a character who died by suicide, and another showing a character who attempted suicide by hanging.
Nadine Kaslow, a former president of the American Psychological Association, told The Post that it is a "tragic" situation in which "trolls are targeting children and encouraging them to kill themselves."
Kaslow, who teaches at the Emory University School of Medicine, said some children may ignore the ominous video content, but others, especially those who are more vulnerable, might be drawn to it. She added that such videos can give children nightmares, trigger painful memories of loved ones who died by suicide, or even encourage children to attempt it themselves, even though some may be too young to understand the consequences.
Kaslow said parents should monitor what their kids are doing online, and technology companies should make sure such content is removed. Yet, she said, that is not enough.
"I don't think you can just take them down," she said of the videos. "For the children who have been exposed, they have already been exposed. There needs to be a message about why this is not OK."
Parents should talk to their kids about the videos, Kaslow said, but YouTube Kids should also address the problem, explaining to children what these videos are and why kids should never hurt themselves.
She added that there should be "serious consequences" for those who create such videos, noting that it is "very disturbing" that they target children.
According to the Centers for Disease Control and Prevention, risk factors associated with suicide include mental disorders such as clinical depression, previous suicide attempts, barriers to accessing mental health treatment, physical illness, and feelings of hopelessness or isolation. Those in need of help, including children, can call the National Suicide Prevention Lifeline at 1-800-273-TALK.
This article was written by Lindsey Bever, a Washington Post reporter.