YouTube pedophilia, the "Momo suicide challenge," and the safety of children on the platform




For years, health professionals and child advocacy groups have voiced their concerns about child safety on YouTube. The company has taken steps to try to make YouTube a safe space for kids and to protect young viewers from the dangers of the internet. For example, four years ago, it launched an app specifically designed for children's content, YouTube Kids.

But despite these efforts, the problems did not go away.

In a statement to Vox, YouTube says that it "took immediate action by removing accounts and channels" and that it "will continue to work to improve and catch abuse faster."

A few days before Wired's story was published, a YouTube vlogger, Matt Watson, said that during one of his investigations he discovered that YouTube's algorithm feeds people videos of children playing once they start looking for them – a "wormhole into a soft-core pedophile ring," as he calls it.

These findings are troubling – for parents who upload YouTube videos of their kids playing, and for children, who are increasingly forgoing television to spend time on YouTube. In some cases, children even interact with the commenters, according to Wired, answering their questions and providing personal information such as their age.

This also creates big problems for advertisers. Last week, brands such as AT&T, Disney, Hasbro, Epic Games, and Nestle pulled their ads from YouTube, saying they would not work with the tech giant until they knew how it planned to solve the problem.

Google said it had addressed the problem by banning certain accounts and closing the comment sections of some videos. But Haley Halverson of the National Center on Sexual Exploitation, based in Washington, DC, says the problem is still ongoing.

"In two clicks, I was able to get into a rabbit hole of videos where kids are eroticized by pedophiles and abusers," wrote Halverson in a statement released on Friday. "The more I clicked, the more the content was blatantly sexualized, because the YouTube recommendation algorithm was giving me more and more videos with hundreds of thousands, and sometimes millions of views. Despite YouTube's claim to clean up this content, YouTube continues to monetize videos of young children, offering pedophiles a platform to exchange information and exchange links to more graphic child pornography. "

YouTube is working out how it will handle the sexually exploitative comments as advertisers abandon the deals that help Google earn billions. But as with all issues involving tech giants, the solutions are not so simple.

How YouTube inadvertently became the hub of a "soft-core" pedophile ring

YouTube was launched in 2005 by three former PayPal employees. The idea of creating an easily accessible video site was born after one of the founders struggled to find footage of Janet Jackson's wardrobe malfunction at the 2004 Super Bowl.

In 2006, Google bought the site for $1.65 billion. It is now the second most visited website on the internet (behind Google.com), with 1.9 billion monthly users – about one-third of the internet's population.

Today, YouTube has become, among other things, a platform where personalities can earn millions of dollars from DIY videos and offbeat beauty tutorials. There are also plenty of fun videos, many of them involving children. One of the first viral hits was "Charlie bit my finger"; the clip of 3-year-old Harry getting his finger bitten by his 1-year-old brother has been viewed more than 867 million times. There was the 2009 video of David, then 7 years old, woozy on nitrous after a dentist visit, and the video of an adorable 3-year-old girl, Cody, crying because she could not handle how much she loved Justin Bieber.

Like these viral videos, the hundreds of hours of content on YouTube are mostly innocent. But child advocacy groups have been warning for years about the dangers YouTube – and the internet in general – poses to child safety, and those warnings are now coming to a head as it becomes clear that these videos are not always viewed with harmless intentions.

YouTube has clear rules against explicit content. In outlining its policy on nudity and sexual content, the company writes that "content meant to be sexually gratifying (like pornography) is not allowed on YouTube" and that "videos containing fetish content will be removed or age-restricted." It also states that "sexually explicit content featuring minors and content that sexually exploits minors" is not allowed, and says it reports content containing child sexual abuse imagery to the National Center for Missing and Exploited Children, which works with law enforcement agencies.

But as Wired's story points out, these people do not necessarily come to YouTube searching for pornographic content; they are interested in the ostensibly innocuous kind, such as videos that show children's bodies, clothed or not, while they exercise or play games. People leave suggestive comments on the videos (which mostly feature girls, some as young as 5) and share timestamps pointing to specific moments in the footage. According to the vlogger Watson's assessment, YouTube's comments section also allows these people to communicate with one another.

Child pornography "is marketed, as well as social media and WhatsApp addresses. YouTube is making it easier, "writes Watson in the description of his video, explaining how these commentators get in touch to share sexual content with children, making their network larger and wider than YouTube.

Because of YouTube's algorithm, once viewers start watching videos of children playing and jumping around, they are recommended similar videos that appear to be popular with the same audience; in this way, the video-sharing site essentially serves viewers the content they are looking for. In some cases, kids on YouTube even respond to the commenters. Per Wired, "on one video, a girl appears to ask another commenter why one of the videos made him 'grow up.'"

YouTube said it was tackling this problem "aggressively." In an email to Vox, a spokesperson wrote:

Any content – including comments – that endangers minors is abhorrent, and we have clear policies prohibiting it on YouTube. We took immediate action by removing accounts and channels, reporting illegal activity to the authorities, and disabling comments on tens of millions of videos featuring minors. There's more to be done, and we continue to work to improve and catch abuse more quickly.

The company also says it has terminated 400 accounts responsible for uploading videos that appear to exploit children and removed millions of comments. YouTube told Vox that it constantly works to remove users under the age of 13 from the platform and is trying to hire more child safety experts, including former CIA and FBI employees.

Still, YouTube hosts hundreds of thousands of videos of kids, and many of them continue to attract problematic comments.

I spent a few hours this week doing my own searching on the site. I found that many videos of children playing had their comment sections disabled; I also saw videos that had been taken down.

But there is still plenty of innocent content being exploited, such as videos of girls playing in skirts. These videos still attract timestamped comments, along with the ongoing exchange of personal information.

Another "adpocalypse" as advertisers recoil before YouTube

The problem becomes even more morally disturbing when you consider that money is being made from this content. Dozens of brands monetize those clicks, and YouTube brings Google about $3.9 billion in advertising revenue each year, per Statista.

"The pedophile crisis, like all YouTube crises, is a direct result of the platform's business model," says Josh Golin, executive director of the advocacy group Campaign for a Child Without Advertising. "It's extremely important to note that YouTube's algorithm did not work well; by recommending more and more videos to pedophiles of girls in swimsuits or gymnastics, it worked exactly as planned: keep users on the site as long as possible so that YouTube earns more money. "

Almost every major company you can name, from HBO to Peloton to L'Oreal and Samsung, advertises on the video-sharing site. After these reports came out last week, many began pulling their ads from YouTube. A spokesperson for Epic Games, which makes Fortnite, told Wired that the company had paused its advertising on YouTube and had "reached out to YouTube to determine the steps to take to eliminate this type of content from its service." Disney, Hasbro, and Nestle also pulled their YouTube ads, as did AT&T.

This is not the first time YouTube has had problems with advertisers. In 2017, brands such as Verizon, Johnson & Johnson, and AT&T pulled their ads from YouTube and Google after revelations that their ads were playing alongside extremist content promoting terrorism.

Some advocates are demanding more drastic action from YouTube, such as cracking down on all kinds of kids' content.

"Why is YouTube not taking more serious action, such as temporarily stopping all comments and recommendations, removing YouTube kids' content, and using Google's huge reach to tell parents to keep kids safe?" – and children's videos – out of YouTube? ", asks Golin." Obviously, it would act drastic measures, but what will happen if the fact of openly negotiating information with pedophiles on does your site lead you to anything? "(YouTube would not say why it waited so long to tackle the problem of sexuality.) suggestive comments on children's videos.)

But flagging content simply because it features children is probably not a viable solution, mainly because there are apparently so many videos of kids playing with toys, reviewing games, and so on. Some of these toy and gaming influencers generate huge paychecks – for themselves and for YouTube. Other creators say they are trying to handle the moderation problem themselves and do not want to be punished for it.

"I do not tell the story because it negatively affects the entire YouTube community," Daniel Keem, host of the YouTube show. DramaAlert, tweeted. His show covers all the drama unfolding in the world of social media. He fired back after a disciple who asked him why he had not mentioned the problems of YouTube with sexually suggestive comments on children's videos. "We do not need another apocalypse announcement. What I did behind the scenes, however, is to contact my YouTube contacts to show them the video and my team shows them the content to delete. This is not right for me. This concerns all my friends, big and small creators. I do not report anything that will affect their livelihood. "

For now, YouTube is trying to address the problem in the short term by limiting which videos featuring kids can run ads. On Twitter, the company said, "Even if your video is suitable for advertisers, inappropriate comments may result in your video receiving little or no ads."

This has, of course, shaken the community of YouTube stars, who fear their content will no longer earn money. Content creators can appeal if their videos are flagged and ads removed. As one mother and YouTuber tweeted: "MY 5-YEAR-OLD SON: doing gymnastics and being a happy, sweet and confident boy. youtube: NOT ADVERTISER FRIENDLY."

In the meantime, it is clear that even if YouTube solves the problem with its children's content, the task of cleaning up other material on the site will be difficult. Last week, Free Hess, a pediatrician and blogger in Gainesville, Florida, said she found content promoting suicide on YouTube, and that even after she reported the videos, they continued to reappear.

Posting this type of violent content is against the rules, but some of it cannot be found through search. Instead, Hess discovered that the clips are hidden inside children's videos. In one, a man cuts into the video to say, "Remember, kids: sideways for attention, longways for results," as he pretends to slice his arm.

"I think it's extremely dangerous for our kids," Hess told the Washington Post about YouTube. "I think our kids are facing a whole new world with social media and Internet access. This changes how they grow and how they develop. I think videos like this put them in danger. "

Searches for Peppa Pig and Doc McStuffins turn up violent and inappropriate alternative versions of the children's franchises. More recently, internet trolls have seized on the "Momo challenge" scare, in which a terrifying bug-eyed character (actually a Japanese sculpture) supposedly orders children to hurt or kill themselves. Although the theory that children are committing suicide because of Momo is a viral hoax, mothers say they are finding that Momo now appears in children's videos on YouTube, spliced into content intended to scare them (though YouTube denies that the challenge appears in videos on its site).

Cleaning up the content on YouTube will not be easy, and it will not happen overnight. But it is a pressing issue for parents and children. And if their concerns are not enough to make the tech giant act, the revenue Google stands to lose from an advertiser exodus might be.


