What is the story about?
WhatsApp, the messaging platform owned by Meta Platforms, has announced the rollout of parent-managed accounts designed for children under the age of 13.
The new feature allows parents or guardians to create and supervise accounts for younger users while maintaining the app’s core privacy protections. Being introduced in phases, it includes tools that enable guardians to manage contact lists and privacy settings through a dedicated parental control system.
New controls aim to balance safety and access
According to a PTI report, the new system allows parents to set up accounts for pre-teens and limit their activity primarily to messaging and calling functions. The company said the controls were developed in consultation with families and child safety experts.
Under the model, parents will be able to approve contacts, manage account settings and monitor certain aspects of usage. However, message content will remain inaccessible to guardians because conversations will still be protected by end-to-end encryption, a key privacy feature of the platform.
In a blog post announcing the update, the company said the changes were designed to give families greater flexibility while maintaining safety.
“With input from families and experts, we’re rolling out parent-managed accounts that allow parents or guardians to set up WhatsApp for pre-teens with controls to limit their experience to messaging and calling,” the company said.
Access to the parental control settings will be protected through a parent PIN, ensuring that only guardians can change privacy settings on the managed device.
Strategic shift in age policy
The introduction of supervised accounts marks a shift in the platform’s long-standing policy regarding age restrictions.
Historically, WhatsApp required users to be at least 13 years old, with a higher minimum age in some regions depending on local regulations. The new system effectively allows younger users to access the platform under the supervision of a parent or guardian.
Technology analysts say the move reflects growing demand from families seeking secure communication tools for children.
“Messaging apps are increasingly becoming essential communication tools within families,” said a digital policy researcher based in London. “Supervised accounts allow platforms to address safety concerns while acknowledging that younger users are already engaging with digital communication.”
Growing focus on child safety online
The announcement comes amid increasing global scrutiny over how social media and messaging platforms handle child safety, online privacy and digital wellbeing.
Governments in several regions have introduced regulations requiring technology companies to implement stronger protections for minors. At the same time, parents and educators have raised concerns about children’s exposure to online risks, including cyberbullying, inappropriate content and contact with unknown users.
In response, many technology companies have introduced youth-focused safety tools, including restricted accounts, screen-time management features and parental supervision options.
Impact on families and the messaging ecosystem
The new parent-managed account model could give families a structured way to introduce children to digital communication platforms while maintaining safeguards.
Parents will be able to restrict who their children can communicate with, manage privacy settings and maintain control over account configuration. At the same time, the continued use of end-to-end encryption ensures that message content remains private.
Experts say the model attempts to balance two competing priorities: protecting children online while preserving user privacy.
WhatsApp said the parent-managed accounts feature will be rolled out gradually across different regions as the company gathers feedback from users and safety experts.
Industry observers expect technology companies to continue developing tools that combine privacy protections with stronger parental oversight, particularly as younger users increasingly adopt digital communication platforms.
As regulators and advocacy groups continue to scrutinize online safety practices, supervised accounts may become a more common feature across messaging and social media services worldwide.