Mandatory Age Checks
Starting next month, Discord will enforce a global age verification policy, requiring users to prove they are at least 18 years old in order to access the platform's full adult experience. This is a significant shift for the platform, which boasts over 200 million monthly active users, and it involves either a facial scan or the submission of identification documents to third-party vendors. The move aligns with a broader trend across social media and AI platforms, driven by increased scrutiny of online child safety and mounting legal challenges. It also foreshadows a future of more age-gated internet access, as some regions consider outright bans on social media for younger users. Discord, founded in 2015 and built around voice, video, and text chat, says the measures are intended to enhance user safety across its services.
Verification Methods Explained
Discord will primarily rely on its internal systems to estimate a user's age, analyzing account history, device data, and platform-wide activity patterns. When this automated estimation is insufficient, users will be prompted to verify their age through more direct means. The first option is a video selfie, where the captured footage remains on the user's device and is not stored by Discord. Alternatively, users can upload identification documents to Discord's trusted vendor partners, which process the information on the company's behalf. Discord has indicated it plans to introduce additional verification options in the future to accommodate varying user needs and preferences. Users who cannot be verified as adults, or who are determined to be under 18, will have their accounts transitioned to a 'teen-appropriate' experience that restricts access to age-restricted servers and content.
Teen Mode Differences
From early March, all Discord accounts will default to a 'teen-appropriate experience,' and users will have to confirm they are over 18 to exit this mode. Only verified adult users will gain full access to certain platform features, including age-restricted channels, servers, and advanced safety settings. Users may need to reconfirm their age to enable features such as unblurring sensitive content or disabling specific safety options, and only verified adults will be able to speak in stage channels within servers. For younger users, messages from unknown accounts will be routed to a separate inbox, and friend requests from unfamiliar users will trigger a warning prompt, adding a layer of safety and privacy within the platform.
Who Needs Verification?
Discord has clarified that the majority of adult users will not need to undergo manual age verification. The platform's sophisticated age inference model, which analyzes factors such as account longevity, device information, and aggregated community activity patterns, is expected to accurately determine the age of most users without requiring additional steps. Importantly, Discord has stated that private messages and their content are not utilized in this age estimation process. Consequently, users who are not accessing age-restricted content or attempting to modify certain safety parameters will likely continue to use the platform without any changes to their current experience. Verification will primarily be prompted for individuals seeking access to mature content or specific functionalities.
Reasons for Age Checks
The implementation of these age verification measures globally by Discord is a direct response to increasing pressure from regulatory bodies and advocacy groups focused on online child safety. Following similar initiatives by other major platforms, Discord is now standardizing these checks worldwide, building upon its existing safety infrastructure. This proactive approach aims to provide robust protections for younger users while granting verified adults the flexibility they need. As stated by Discord's head of product policy, Savannah Badalich, the company prioritizes teen safety in its product design and is committed to ongoing collaboration with safety experts, policymakers, and users to foster a secure and supportive environment on the platform.
Privacy and User Concerns
Discord's new age verification policy has sparked significant user apprehension, with many expressing frustration and contemplating leaving the platform or canceling subscriptions. Chief among these concerns are privacy issues, particularly in light of a past data breach incident. Last October, sensitive data, including government ID photos from approximately 70,000 users, was potentially exposed due to a breach at a third-party vendor handling age-related appeals. Although Discord has ceased working with that vendor, the incident amplifies existing worries about user data protection. Digital rights organizations, such as the EFF, have voiced strong opposition, arguing that mandated age verification, especially through face scans or IDs, could lead to increased surveillance, enable censorship, and compromise user anonymity, potentially silencing vital online support communities.
Industry-Wide Trends
Discord's move toward mandatory age verification is part of a wider trend across major online platforms seeking to strengthen child safety. Companies including Instagram, Roblox, YouTube, OpenAI, and Anthropic are deploying AI-powered tools to better ascertain user ages. Instagram, for instance, began requiring video selfies from underage users who changed their age to over 18 in 2022 and has since intensified efforts to place teens into more private accounts. Roblox introduced mandatory facial verification for access to chat features, with over 45 percent of its daily active users having completed an age check. YouTube launched its age-estimation technology in the US in July 2025, and OpenAI recently rolled out an age prediction model that uses account and behavioral signals to identify users under 18.