(CBS) – Warning: This article contains disturbing content and mentions of suicide.
A video offering self-harm instructions, spliced between clips from a popular video game, surfaced at least twice on YouTube and YouTube Kids, according to the pediatrician and mother who discovered it.
The suicide instructions are sandwiched between clips of Nintendo's popular game Splatoon and delivered by a man speaking in front of a green screen, in what appears to be an effort to make him blend into the rest of the animation.
"Remember, kids: sideways for attention, longways for results," the man says, miming cutting motions on his forearm. "End it."
The man featured is YouTuber Filthy Frank, who has more than 6.2 million subscribers and has been called "the embodiment of everything a person should not be," although there is no evidence that Frank, whose real name is George Miller, was involved in creating the doctored video. He did not immediately respond to CBS News' request for comment.
When Free Hess found the video on YouTube last week, she posted it on her blog, warning other parents to monitor what their kids might be watching.
"Looking at the comments, people had been reporting it for a while, some as long as eight months ago," Hess told CBS News on Friday.
Shortly after she published her blog post, YouTube removed the video, saying it violated the site's community guidelines, according to Hess.
Hess said she spotted another version of the same video on YouTube Kids in July of last year. She said she and many other parents in Facebook groups banded together to report it. The video was finally removed after one parent contacted a Google employee directly. Google did not respond to CBS News' request for information about the steps that led to the video's removal.
Hess said that after seeing rising suicide rates among children in her own emergency room in recent years, she has made it her mission to raise awareness of the troubling and violent content children are consuming on social media. She said she has reported hundreds of disturbing videos to YouTube, with some success. On Friday, she found and reported seven more disturbing videos on YouTube Kids and said they were just the tip of the iceberg.
"I had to stop, but I could have kept going," Hess said. "Once you start looking into it, things get darker and weirder. I don't understand how it isn't getting caught."
YouTube Kids is supposed to be a child-friendly version of YouTube for kids ages 8 and under, but trolls have found ways around the site's algorithm and post potentially harmful videos.
"They are awful, absolutely awful," Hess said of the content on the YouTube Kids app.
She said she logs into the app as a child rather than an adult, so she can see exactly what children around the world are seeing. The videos Hess found contain mentions or depictions of self-harm, suicide, sexual exploitation, trafficking, domestic violence, sexual abuse and gun violence, including a school shooting simulation. She said many of the children she treats in the emergency room cite YouTube videos as a way they learned destructive behaviors and self-harm techniques.
A YouTube spokesperson told CBS News on Friday that the site works hard "to ensure YouTube is not used to encourage dangerous behavior." The spokesperson also said YouTube has "strict policies" prohibiting videos that promote self-harm.
"We rely on both user flagging and smart detection technology to flag this content for our reviewers," the spokesperson said. "Every quarter we remove millions of videos and channels that violate our policies, and we remove the majority of these videos before they are ever viewed. We are always working to improve our systems and to remove violative content more quickly, which is why we report our progress in a quarterly report and give users a dashboard showing the status of videos they have reported to us."
However, YouTube Kids has a history of letting disturbing and violent videos slip past its algorithms. In 2017, searching the word "gun" on the app surfaced a video on how to build a coil gun, Mashable reported. Videos showing Mickey Mouse in a pool of blood and Spider-Man urinating on Elsa, the princess from "Frozen," also sparked backlash.
"The YouTube Kids team is made up of passionate parents, so it's extremely important to us to get this right, and we act quickly when videos are flagged to us," a YouTube spokesperson told CNET at the time. "We agree this content is unacceptable and are committed to making the app better every day."
Since the 2017 backlash, YouTube has outlined the steps it is taking to improve the safety of its kids' app. In November 2017, the company introduced a new set of guidelines, including "faster enforcement" of community guidelines and "blocking inappropriate comments." Last April, YouTube announced three new parental control features to give parents more ability to manage what their child sees on the app.
Parents also have several other ways to make the app safer, but none of them works automatically.
This week, new cases of inappropriate content sparked high-profile reactions, including from Disney and Nestlé, which pulled advertising from YouTube after a blogger described what he called "a wormhole into a soft-core pedophilia ring" on the site.
YouTube announced Thursday that it was taking aggressive action, banning more than 400 accounts and removing dozens of videos.
Critics say its approach to safety across its platforms simply is not working.
Parents remain concerned about safety on YouTube and YouTube Kids. "We should start by educating ourselves, educating our children, and speaking up when we see something dangerous for our children," Hess wrote on her blog. "We also need to push for social media platforms to be held accountable when they fail to ensure that age restrictions are followed and fail to remove inappropriate and/or dangerous content when it is reported."