Internal Alarms Sounded
Internal communications from Meta, revealed in a New Mexico court filing, show significant internal opposition to the planned rollout of end-to-end encryption for the company's Facebook and Instagram messaging services. While CEO Mark Zuckerberg publicly framed the initiative as a privacy enhancement, Monika Bickert, Meta's head of content policy, expressed serious reservations in March 2019, describing the plan in a candid internal exchange as "a bad thing as a company" and "so irresponsible." The documents, including emails and briefing materials obtained through legal discovery in a lawsuit filed by New Mexico Attorney General Raul Torrez, contrast starkly with the company's public narrative, detailing how senior policy and safety executives internally assessed the significant risks of the encryption plan.
Child Exploitation Risks
The core of the internal dissent concerned the severe implications for detecting and reporting child exploitation. Documents indicate that Meta executives understood encryption would dramatically reduce their ability to flag illicit activity to law enforcement. Bickert stated explicitly that with end-to-end encryption, there would be "no way to find the terror attack planning or child exploitation" for proactive referral. A briefing document from February 2019 projected a 65% drop in reports of child nudity and sexual exploitation imagery to the National Center for Missing and Exploited Children, from 18.4 million to 6.4 million, had Messenger already been encrypted. A later update to the same document estimated that Meta would have been unable to proactively provide data to law enforcement in more than 2,200 cases, including 600 related to child exploitation, 1,454 to sextortion, 152 to terrorism, and 9 involving threatened school shootings.
Executive Discrepancies
The internal communications underscore a significant divergence between public statements and private assessments. While Mark Zuckerberg publicly promoted the encryption plan on privacy grounds, senior safety and policy executives privately voiced concerns and showed little appetite for defending the initiative; Bickert, for example, admitted she was "not very invested in helping him sell this." Antigone Davis, Meta's Global Head of Safety, raised a further critical point: because Facebook is a semi-public social network, it allows predators to find both each other and children, then move easily into private messaging on Messenger. She contrasted this with WhatsApp, Meta's already encrypted service, which is not tied to a social media platform and therefore does not pose the same risk of facilitating new connections between strangers, warning that encrypting Messenger would be "far, far worse" than anything observed on WhatsApp.
Legal Battles and Safeguards
This revelation comes amid a turbulent period for Meta, which faces numerous legal and regulatory challenges worldwide over youth welfare on its platforms. The New Mexico lawsuit, brought by Attorney General Raul Torrez, specifically accuses Meta of misrepresenting the safety of its encryption plan, alleging the company gave predators unfettered access to underage users, often leading to real-world abuse and human trafficking. A trial is currently underway in this landmark case. Responding to the concerns raised by Bickert and Davis, Meta spokesperson Andy Stone said the company developed additional safety features before launching encrypted messaging on Facebook and Instagram in 2023, including the ability for users to report objectionable messages and special accounts for underage users designed to prevent unknown adults from initiating contact, measures aimed at mitigating the risks of encrypted chats.