In Ghaziabad this week, three sisters—aged 12, 14 and 16—died by suicide after allegedly being consumed by an online, task-based Korean “love game”. The tragedy has reopened a terrifying question: what are these digital challenges, and why do they hold so much power over children’s minds?
There’s a pattern here—not just in India, but across the world.
Korean Love Game
This is the online app linked to the Ghaziabad sisters’ case. Investigators say it begins with strangers, often claiming to be foreign or Korean, chatting with children online. Once trust grows, they assign a series of tasks that grow harder over days. Some reports suggest the final “challenge” is to kill yourself.
Blue Whale Challenge
First reported in Russia, this challenge sounds innocuous at first—wake up early, watch a scary movie—but escalates over a 50-day series of “dares,” ending in self-harm or suicide. It spread through social networks, often via private messages.
Momo Challenge (Urban Legend)
In 2018, stories spread about a creepy contact named “Momo” on WhatsApp that challenged kids to dangerous actions. Law enforcement says much of this was a hoax, but the panic showed how quickly fear spreads online.
PUBG and Gaming Addiction Cases
Serious harm isn’t only found in “suicide games”. Popular games like PUBG (and its Indian version, BGMI) have been associated with addiction, extreme hours of play, and stress when access is cut off. While these are not task-based suicide challenges, there have been cases of teenagers in India struggling with withdrawal or depression when the games were banned or restricted.
Even when a game is banned, kids find ways to get it:
VPNs (Virtual Private Networks): A VPN routes traffic through a server in another country, masking where you actually are. This tricks phones and app stores into allowing downloads of apps that are banned in your region.
Alternate Accounts & Fake Ages: Platforms ask for a birthdate—and many kids simply lie. Without strong age verification, they slip through.
Social Media Groups: Links and invitations to private chats spread on messaging apps and TikTok-style platforms, often hidden from parents and unmonitored by moderators.
This is how teens in 2026 can easily access games and content adults believe are blocked or too dangerous.
They Prey on Vulnerability
Experts who have studied these phenomena say something chilling: games like the Korean task app or Blue Whale don’t just “entertain”—they connect with emotional pain.
Research on similar online challenges shows they latch onto teens who feel isolated, unseen, hungry for belonging, and unsupervised. By giving tasks, praise, threats and a sense of identity, they manipulate psychological need. That’s exactly what police noticed in the Ghaziabad case—the girls had dropped out of school after Covid-19, had limited real-world social interaction and were deeply immersed in their game world.
When the phone was taken away, it wasn’t a toy being taken—it was their world.
It’s Not Just Bans — It’s Engagement
People talk about banning social media for kids — a step Australia has already taken for under-16s, and one Europe is considering — but bans alone won’t solve the emotional disasters unfolding online.
The real issue? Kids aren’t just clicking screens — they’re seeking connection, identity, meaning.
When a child feels lonely, no game is harmless.
When a child feels unseen, a task or challenge can feel like purpose.
And when a child feels alone, digital strangers can become more powerful than real caregivers.
This isn’t just about scary games on phones.
It’s about:
Why kids get pulled in
How easy it is to access dangerous content
How bans without supervision still leave cracks
How emotional neglect creates vulnerability
Because no software should ever become a child’s emotional universe.
This tragedy is not just Ghaziabad’s.
It’s a global alarm that tells us loud and clear that we need better digital literacy, stronger safeguards, and — above all — more present, engaged parenting.