Businesses are pulling their advertising campaigns from YouTube amid reports that a pedophile ring has been operating openly in the comment sections of videos featuring young children, Bloomberg reported. Disney and Nestlé are among the companies that have reportedly suspended or reduced their spending after a YouTube video exposed the persistent problem.
A video posted by YouTuber Matt Watson on Sunday detailed what he called a "network of soft-core pedophiles" operating in the comment sections of videos featuring children, especially young girls. These videos, which YouTube monetizes, are flooded with comments from apparent pedophiles who exchange contact information and links to child pornography. Commenters also post timestamps marking what Watson described as "points in the video where little girls are in compromising positions, sexually implicit positions."
Watson called the YouTube recommendation algorithm that surfaces these videos a "wormhole" of exploitative content. Once a user has clicked on several of them, the suggested-content sidebar becomes flooded almost entirely with videos of children.
Wired was able to replicate Watson's findings, reporting that the videos it encountered often showed girls playing, swimming, or eating ice lollies, and in some cases included more graphic content. After a few of these videos were viewed, Wired reported, YouTube's algorithm began recommending videos that appeared to be popular with other pedophiles. In many cases, the outlet said, videos of young children with pre-roll ads attached had accumulated hundreds of thousands, if not millions, of views.
Companies are now choosing to distance themselves from the controversy, either by contacting YouTube about the problem or by pulling their ad campaigns.
"I can confirm that all Nestlé companies in the United States have suspended their ads on YouTube," a Nestlé spokesperson told Gizmodo in an emailed statement. Bloomberg cited sources saying Disney had done the same, though the company did not immediately respond to a request for comment.
A spokesperson for Epic Games, the developer behind Fortnite, told Wired that the company, through its advertising agency, had "reached out to YouTube to determine actions they'll take to eliminate this type of content from their service." Grammarly told Wired that it had also contacted YouTube about the problem.
Troubling and predatory comments on children's videos prompted a similar advertiser backlash in 2017. The company said at the time that it was working to address the issue, but it appears to remain a pervasive problem on the site.
A YouTube spokesperson said the company was working to fix the problem and had disabled comments on millions of children's videos. The company has also terminated more than 400 accounts belonging to commenters on these videos, as well as some videos that could put young people at risk. The spokesperson added that YouTube reports any illegal comments to the National Center for Missing and Exploited Children.
"Any content – including comments – that endangers minors is abhorrent, and we have clear policies prohibiting this on YouTube," a YouTube spokesperson said in an emailed statement. "We took immediate action by deleting accounts and channels, reporting illegal activity to authorities, and disabling comments on tens of millions of videos that include minors. There's more to be done, and we continue to work to improve and catch abuse more quickly." [Bloomberg]