Pediatrician exposes hidden child suicide tips in YouTube and YouTube Kids videos



Lindsey Bever | The Washington Post

Free Hess, a pediatrician and mother, had first heard about the disturbing videos last summer, when another mother spotted one on YouTube Kids.

She said that a few minutes into a video game clip aimed at children, a man appeared on screen giving instructions on how to commit suicide.

"I was shocked," said Hess, noting that since then, the scene has been split into several other videos of the famous Nintendo game "Splatoon" on YouTube and YouTube Kids, a video application aimed at kids. Hess, from Ocala, Fla., Has published a blog on modified videos and has tried to get rid of them in the midst of the outcry from parents and child health experts, claiming that such visual effects could be damaging for kids.

In one video on YouTube, a man pops into the frame. "Remember, kids," he begins, holding what appears to be an imaginary blade to the inside of his arm. "Sideways for attention. Longways for results."

"I think it's extremely dangerous for our kids," Hess said of the Sunday clips in a phone interview with the Washington Post. "I think our kids are facing a whole new world with social media and Internet access. This changes how they grow and how they develop. I think videos like this put them in danger. "

A recent YouTube video viewed by The Post appears to include a spliced-in scene showing the Internet personality Filthy Frank. It is not clear why he was edited into these clips, but his fans are known to feature him in memes and in other videos. There is a similar video on his channel filmed in front of a green screen, but the background and context of the clip in question remain unclear.

Andrea Faville, a spokeswoman for YouTube, said in a written statement that the company works to ensure the platform "is not used to encourage dangerous behavior and we have strict policies that prohibit videos which promote self-harm."

"We rely on both user sensing and smart sensing technologies to report this content to our reviewers," added Faville. "Each quarter we delete millions of videos and channels that violate our policies, and we remove most of those videos before they're seen. We continually strive to improve our systems and eliminate illicit content more quickly. That's why we report our progress in a quarterly report (TRANSPREEPORT.GOogle.com) and provide users with a dashboard showing the status of the videos they've reported to us. . "

These videos are generating more and more questions about how YouTube, the world's largest video-sharing platform, monitors and removes problematic content.

YouTube has long struggled to police the content on its platform: removing hateful and violent videos, banning dangerous pranks and fighting the sexual exploitation of children. As The Post's Elizabeth Dwoskin reported last month, YouTube announced it was rebuilding its recommendation algorithm to stop suggesting videos featuring conspiracy theories and other misinformation, even though those videos would remain on the site.

Hess said she has been writing about the disturbing video clips on her blog, PediMom, to raise awareness and to get them removed from the platform.

Earlier this month, she found a second one, this time on YouTube. She recorded it, wrote about it and reported the content to the video-sharing platform, she said. The video was taken down.

Another version was reposted on Feb. 12 and viewed more than 1,000 times before it was removed from the site.

Hess said that "smashed" Splatoon videos are not the only ones that offer dark and potentially dangerous content on social media platforms, especially on YouTube Kids. In an article published last week on her blog, Hess alerted other parents on many videos regarding what she found on the app: a video of "Minecraft" illustrating a shootout in a school, a caricature centered on human trafficking, that of a child who committed suicide by stabbing another who attempted to commit suicide by hanging.

Nadine Kaslow, a former president of the American Psychological Association, told The Post that it is a "tragic" situation in which "trolls are targeting children and encouraging them to kill themselves."

Kaslow, who teaches at Emory University's School of Medicine, said some children may shrug off the ominous video content, but others, particularly those who are more vulnerable, may be drawn to it. She added that such videos can give children nightmares, dredge up painful memories of loved ones who have killed themselves or even encourage them to attempt suicide, even though some of them may be too young to understand the consequences.

Kaslow said parents should monitor what their children are doing online and that technology companies should ensure this content is removed. Still, she said, that is not enough.

"I do not think you can just take them," she said of the videos. "For the children who were exposed, they were exposed. There must be messaging – that's why it's not right.
