Accountability Shifts to Platforms
In a significant legal development, a Los Angeles court ordered Meta and YouTube to pay $6 million to a young woman, now 20, who suffered mental health harms as a minor on their platforms. The jury found the companies negligent in the design of these digital spaces, which were engineered to be addictive through features such as algorithmic recommendations, continuous scrolling, and persistent notifications. Crucially, the court acknowledged that the platforms were aware of the adverse effects, including heightened anxiety, depression, and compulsive use, yet chose to prioritize the engagement metrics that fuel their substantial advertising revenue. The verdict not only establishes liability for past harms but also sets a precedent that could shape thousands of similar lawsuits pending across the United States, challenging the prevailing notion that young users alone are responsible for their online experiences.
Bans Fall Short
Recent proposals, such as Karnataka's mooted ban on social media for those under 16, echo measures adopted in several countries and reflect growing concern over youth engagement with digital platforms. India has over 460 million social media users, roughly a third of them under 18, and faces serious problems including cyberbullying and harms to self-image and sleep. The US verdict, however, undercuts both the efficacy and the philosophical basis of outright bans. Research has repeatedly shown that such restrictions are easy for children to circumvent through age misrepresentation, shared devices, or VPNs, while stringent age verification raises its own privacy and surveillance concerns. Initiatives like encouraging reading are commendable, but they do not address the fundamental problem: an attention economy built on intentionally addictive design.
Redesigning Digital Rules
The core issue, as the court's findings make clear, lies not with children's online choices but with the business models of social media platforms. These models maximize user engagement and data collection to drive advertising revenue, a practice often described as 'persuasive design' within 'platform capitalism.' Internal company documents reportedly showed awareness of the mental health risks to adolescents, yet the pursuit of engagement metrics that generate billions in profit took precedence. The verdict therefore rightly places responsibility on the platforms' architects, rejecting the idea that minors are the source of the problem.

A more effective approach, seen in frameworks such as the EU's Digital Services Act and the UK's Age-Appropriate Design Code, is robust platform-focused regulation: mandated algorithmic transparency, limits on the use of minors' data, and enforced safer design practices. India's upcoming Digital India Bill could adopt similar measures, including age-appropriate design by default, restrictions on targeted advertising to minors, and curbs on addictive features like autoplay and endless scrolling, backed by significant penalties for non-compliance.