New Age Verification Systems
In a significant move to prioritize the safety of its younger users, Discord is rolling out a comprehensive suite of enhanced safety features. The initiative, set to begin in early March, will make teen-appropriate settings the default configuration for all users across the platform. Adult users who wish to relax these protections, such as loosening content filters or enabling direct messages from strangers, will be required to complete an age verification process. The platform is leveraging facial age estimation technology and partnering with third-party vendors to confirm user ages accurately. In addition, software running in the background helps estimate user ages without always requiring direct verification, aiming for a seamless yet secure experience. This approach signals a commitment to addressing the complexities of online safety for minors.
Privacy Safeguards Detailed
Discord has emphasized that the new safety protocols are designed with user privacy at their core. The company asserts that video selfies submitted for age estimation remain exclusively on the user's device and are neither transmitted nor stored externally, and that any identity documents provided for verification are promptly deleted after use. This commitment to data protection is crucial when handling sensitive biometric information. The platform also pointed to pilot programs conducted in Britain and Australia last year, which informed the decision to expand the measures worldwide, indicating a considered and tested approach to these significant changes.
Industry-Wide Trend
Discord's adoption of these age verification technologies aligns with a broader movement across social media platforms to bolster child safety measures. Many competitors have faced mounting pressure and scrutiny over their handling of online risks to minors, prompting similar actions. For instance, Roblox introduced global facial age verification for chat features in January, a response to legal challenges concerning its alleged role in enabling predatory behavior. Meta, the parent company of Instagram and Facebook, uses AI to estimate user ages and has introduced 'Teen Accounts' with automatic restrictions for those under 18, proactively removing hundreds of thousands of underage accounts in specific regions. TikTok likewise imposes daily screen time limits and notification cutoff times tailored to age groups. This collective industry shift is also being shaped by evolving legislative landscapes, with numerous US states introducing or enacting age-related social media regulations, even as some face legal challenges.