NEW YORK
Warning: This article contains disturbing content and mentions of suicide.
A video containing instructions for self-harm, spliced between clips of a popular video game, surfaced at least twice on YouTube and YouTube Kids, according to the pediatrician and mother who discovered it.
The suicide instructions are spliced between clips of Nintendo's popular game Splatoon and delivered by a man speaking in front of a green screen, in an apparent effort to make him blend into the rest of the animation.
"Remember the kids, side to attract attention, long-term to get results," said the man, mimicking cutting movements on his forearm. "End it."
The man in the clip is YouTuber Filthy Frank, who has more than 6.2 million subscribers and calls himself "the embodiment of everything a person should not be," though there is no evidence that Frank, whose real name is George Miller, had any role in creating the doctored video. He did not immediately respond to CBS News' request for comment.
When pediatrician Free Hess found the video on YouTube last week, she posted about it on her blog to warn other parents to monitor what their children might be watching.
"Looking at the comments, it was a while ago, and people had even reported it eight months ago," Hess told CBS News on Friday.
Shortly after she published her blog post, YouTube took the video down, saying it violated the site's community guidelines, according to Hess.
Hess said she spotted another version of the same video on YouTube Kids in July of last year. She said she and other parents in Facebook groups banded together to report it, and the video was finally removed after one parent contacted a Google employee directly. Google did not respond to CBS News' questions about the steps that led to the video's removal.
Hess said that after seeing rising suicide rates among children in her own emergency room in recent years, she has made it her mission to raise awareness of the troubling and violent content children are consuming on social media. She said she has reported hundreds of disturbing videos to YouTube, with some success. On Friday, she found and reported seven more disturbing videos on YouTube Kids and said they were just the tip of the iceberg.
"I had to stop, but I could have gone on," said Hess. "Once you start looking at it, the situation becomes darker and weirder. I do not understand how it is not done. "
YouTube Kids is meant to be a child-friendly version of YouTube for children ages 8 and under, but trolls have found ways to get around its algorithms and post potentially harmful videos.
"They are awful. Absolutely horrible, "said Hess about the contents of the YouTube Kids app.
She said she logs into the app as a child rather than an adult so she can see exactly what kids around the world are seeing. The videos Hess has found contain references to or images of self-harm, suicide, sexual exploitation, trafficking, domestic violence, sexual abuse and gun violence, including a simulated school shooting. She said many of the children she treats in the emergency room cite YouTube videos as the way they learned destructive behaviors and self-harm techniques.
A YouTube spokesman told CBS News on Friday that the site was working hard "to make sure YouTube is not used to encourage unsafe behavior." The spokesman also said that YouTube enforced "strict rules" prohibiting videos that promote self-injury.
"We rely on both user detection and smart detection technologies to report this content to our reviewers," the spokesman said. "Each quarter we delete millions of videos and channels that violate our policies, and we remove most of those videos before they're seen. We are still working on improving our systems and eliminating illicit content more quickly. That's why we report our progress in a quarterly report and provide users with a dashboard showing the status of videos they've reported to us. "
However, YouTube Kids has a history of letting disturbing and violent videos slip past its algorithms. In 2017, Mashable reported that a search for the word "gun" on the app surfaced a video on how to build a coil gun. Videos showing Mickey Mouse in a pool of blood and Spider-Man urinating on Elsa, the princess from "Frozen," also drew backlash.
"The YouTube Kids team is made up of parents who are very interested in that. It is therefore extremely important for us to do things right. We act quickly when videos are reported to us, "a CNT spokesperson told CNET. "We agree that this content is unacceptable and we are committed to improving every day the application."
Since the 2017 backlash, YouTube has outlined steps it is taking to improve the safety of its kids' app. In November 2017, the company introduced a new set of guidelines, including "faster enforcement" of community guidelines and "blocking inappropriate comments." In April of last year, YouTube announced three new parental-control features to let parents manage what their children see on the app. Parents have several other ways to make the app safer as well, but none of them is automatic.
This week, new reports of inappropriate content sparked high-profile backlash, with Disney and Nestle pulling their ads from YouTube after a blogger described "a wormhole into a soft-core pedophilia ring" on the site.
YouTube announced Thursday that it was taking aggressive action, terminating more than 400 channels and removing dozens of videos.
Critics, however, say its platform-wide approach to safety simply is not working.
UPDATE: @YouTube @YTCreators left a comment and provided an update on what they have done in the last 48 hours to fight the horrific people on the site.
TLDR: Disabled comments on tens of millions of videos. Terminated more than 400 channels. Reported illegal comments to law enforcement. pic.twitter.com/zFHFfkX9FD
– Philip DeFranco (@PhillyD) February 21, 2019
Parents remain concerned about safety on YouTube and YouTube Kids. "We need to start by educating ourselves, educating our children, and speaking up when we see something that is dangerous for our children," Hess wrote on her blog. "We also need to fight to have the developers of social media platforms held responsible when they do not ensure that age restrictions are followed and when they do not remove inappropriate and/or dangerous content when it is reported."