By Nate Raymond
BOSTON, April 10 (Reuters) - Meta Platforms must face a lawsuit by Massachusetts' attorney general alleging that the Facebook and Instagram parent deliberately designed features to addict young users, the state's top court ruled on Friday.
The ruling by the Massachusetts Supreme Judicial Court marked the first time a state high court has considered whether a federal law that generally shields internet companies from lawsuits over content posted by their users would also bar claims that companies like Meta knowingly addicted young users.
Meta has denied the allegations and says the company takes extensive steps to keep teens and young users safe on its platforms.
The decision comes in the wake of a landmark trial in which a Los Angeles jury on March 25 found Meta and Alphabet's Google negligent for designing social media platforms that are harmful to young people. It awarded a combined $6 million to a 20-year-old woman who said she became addicted to social media as a child.
A separate jury a day earlier found Meta owed $375 million in civil penalties in a lawsuit by New Mexico's attorney general accusing the company of misleading users about the safety of Facebook and Instagram and of enabling child sexual exploitation on those platforms.
Thirty-four other states are pursuing similar cases against Meta in federal court. The case by Massachusetts Attorney General Andrea Joy Campbell, a Democrat, is one of at least nine that state attorneys general have pursued in state court since 2023, including one filed Wednesday by Iowa Attorney General Brenna Bird, a Republican.
Campbell's lawsuit garnered early headlines because of its allegations, aired for the first time, that CEO Mark Zuckerberg had been dismissive of concerns that aspects of Instagram could harm its users.
The lawsuit alleged that features on Instagram such as push notifications, "likes" of user posts and a never-ending scroll were designed to profit off teens' psychological vulnerabilities and their "fear of missing out."
The state alleged that internal data showed the platform was addicting and harming children, yet top executives rejected changes its research showed would improve teens' well-being.
Menlo Park, California-based Meta had sought to duck the Massachusetts case based on Section 230 of the Communications Decency Act of 1996, a federal law that broadly shields internet companies from lawsuits over content posted by users.
The state argued Section 230 does not apply to false statements it said Meta made about the safety of Instagram, its efforts to protect its young users' well-being or its age-verification systems to ensure people under age 13 stay off the platform.
A trial court judge agreed and said the law also did not apply to allegations concerning the negative impacts of Instagram's design features because the state was "principally seeking to hold Meta liable for its own business conduct," not content posted by third parties.
(Reporting by Nate Raymond in Boston; Editing by Alexia Garamfalvi and Dan Wallis)