A creepy photo of a female sculpture with long black hair, bulging eyes, and a grin that stretches across her face has ricocheted around the internet for the past few years. But the character, named Momo, has recently begun to infiltrate YouTube videos meant for kids and has apparently been promoting suicide and other dangerous activities.
According to numerous reports, children have been watching Peppa Pig and Fortnite videos that then feature Momo instructing viewers to self-harm or perform other dangerous stunts.
The Manchester Evening News reported that a primary school in the U.K. sent out a warning to parents saying, “These video clips are appearing on many social media sites and YouTube (including Kids YouTube). One of the videos starts innocently, like the start of a Peppa Pig episode for example, but quickly turn into an altered version with violence and offensive language. … Examples we have noticed in school include asking the children to turn the gas on or to find and take tablets.”
One U.K. mother told the Daily Mail that her 8-year-old son began seeing Momo in some of the videos he watched. Lyn Dixon said her child then became scared of the dark and didn’t want to be left alone.
“He showed me an image of the [Momo] face on my phone and said that she had told him to go into the kitchen drawer and take out a knife and put it into his neck,” Dixon told the newspaper. “We’ve told him it’s a load of rubbish and there are bad people out there who do bad things but it’s frightening, really frightening … He was terrified and wouldn’t sleep in his own bed and then we got to the bottom of it and we explained it wasn’t real.”
Police officials, though, warn that the Momo challenge is really run by hackers looking for personal information and data.
“As creepy as she looks, ‘Momo’ isn’t going to crawl out of your child’s phone and kill them,” the Police Service of Northern Ireland wrote on Facebook. “The danger lies with your child feeling pressured to either follow the orders of ANY app via ‘challenges,’ or peer pressure in chat rooms and the like. This is merely a current, attention grabbing example of the minefield that is online communication for kids. In 2017 it was ‘Blue Whale,’ now it’s ‘Momo.’ There’ll be something else next.”
The game has reportedly been blamed for teen and pre-teen suicides in Argentina, France, and Belgium, though it’s unclear if there’s any link between Momo and the deaths.
Momo made its way to YouTube last year as a number of content creators produced creepy "3am" Momo challenge videos, some of which received millions of views.
Now, Momo is reportedly being spotted in videos on the YouTube Kids app.
It’s yet another problem for YouTube and its Kids app. Last week, a mother reported discovering a children’s video that had been edited to include an old Filthy Frank skit instructing viewers how to self-harm by cutting their wrists.
“We appreciate people drawing problematic content to our attention, and make it possible for anyone to flag a video,” YouTube said in a statement. “Flagged videos are manually reviewed 24/7 and any videos that don’t belong in the app are removed.”
In 2017, a number of bizarre and obscene videos featuring Frozen’s Elsa and other superheroes were discovered on the Kids app. Conspiracy theory videos had also previously appeared there.
As YouTube tends to acknowledge in situations like this, it still has more work to do to keep kids safe, because people continue to find ways to push disturbing content onto the platforms children watch.