The latest in a series of troubling Internet challenges has taken a dangerous turn, with several videos disguised as child-friendly content showing children how to harm themselves or encouraging them to attempt high-risk acts.

A trend called the "Momo Challenge" has drawn widespread attention over the past few weeks, even though the character – a gaunt, frightening figure that urges viewers to take on challenges ranging from trivial to deadly – has circulated on the Internet for less than a year, according to international police agencies and media outlets.

Momo, with her bulging eyes and stringy hair, reportedly appears on sites and apps such as WhatsApp, Facebook and YouTube, sometimes spliced into children's videos featuring the popular game "Fortnite" or the character Peppa Pig.

The image used for Momo is a photograph of a sculpture called "Mother Bird," created by a Japanese special effects company called Link Factory, which has no connection to the challenge itself.

The person or people posing as the character urge Internet users to contact them via a WhatsApp number. Then, as "Momo," they ask people to take on challenges, some involving self-injury or suicide, such as instructions on taking pills, according to international reports. Other examples include turning on the oven at night.

Momo also tells viewers that she will "curse" them if they do not do what she says, and encourages them not to tell anyone about the challenge.

The avatar of "Momo" seen on Facebook. (Photo: Amanda Oglesby / staff photo)

Some have called Momo a hoax; others insist their children have been contacted by whoever is behind the mysterious figure. The shifting, bottomless nature of Internet content, along with the tendency of YouTubers and others to quickly capitalize on frightening memes, makes it difficult to determine the trend's origin or the true extent of its impact.

However, the challenge has been linked to several suicides around the world, including those of two children who died days apart in September in Barbosa, Colombia, according to Britain's Daily Mail newspaper. The 16-year-old boy allegedly drew the 12-year-old girl into the game before their deaths. Police found game-related messages on the children's phones, the reports said.

"Our advice, as always, is to supervise the games your kids play and be extremely attentive to the videos they watch on YouTube," the Northern Ireland Police Service said Saturday in an article about the challenge. . "Make sure the devices they access are limited to age-appropriate content."

Parents and others have posted the videos on Facebook, YouTube and other sites. Although some have been removed, others remain online, and old ones sometimes reappear.

More: The Momo Challenge, a viral online game, is causing lost sleep and frayed nerves in Brick

Separate suicide messages, unrelated to Momo, have been found spliced into popular children's videos featuring "Splatoon," a children's game in which squid characters shoot ink. Filthy Frank, a character created by former YouTuber George "Joji" Miller, appears in the middle of one video and seems to give children tips on how to cut their wrists.

It's unclear how the disturbing clip was included in a video for kids.

"Finish," he says at the end of the 11-second segment, which then returns to the "Splatoon" video.

Free Hess, a Florida-based pediatrician and mother who runs her own website, said she first encountered the video containing the suicide-instruction clip about seven months ago, after it was flagged by a concerned parent.

Hess said that even though the clip had been removed from YouTube Kids – a version of YouTube available as an app designed for children – it had resurfaced on YouTube. A second video was also removed from YouTube.

"There must be a better way to ensure that this type of content is not seen by our children," Hess said in a blog post last Friday. "We can not continue to risk that."

More: Teen suicide rate more than doubled: here's how you can help save your child

More: Experts in suicide prevention: what you say (and do not say) could save a person's life

In a statement, YouTube said that videos that do not belong on the app are removed, and that the service has invested in additional parental controls to let families customize the experience.

"We make sure that YouTube videos are family friendly and take the reactions very seriously," said YouTube.

Last year, the YouTube Kids app was criticized after several videos that were not appropriate for children slipped into the app. YouTube's parent company, Google, responded with an update that lets parents restrict the app to more child-friendly channels, such as Sesame Street.

The suicide rate in the United States has risen in recent years, including among minors.

Resources to help:

Suicide Lifeline: If you or someone you know is struggling with suicidal thoughts, you can call the U.S. National Suicide Prevention Lifeline at 800-273-TALK (8255) at any time of day or night, or chat online.

Crisis Text Line provides free, confidential, 24/7 support by text message to people in crisis; text 741741 to reach a counselor.

Contributor: Brett Molina

