Search Engines to Redirect Self-Harm Queries and Blur Explicit Content for Child Protection
Search engines like Google and Bing will soon implement new measures to protect Australian users, particularly children, from harmful online content. Starting December 27, these platforms will automatically redirect users searching for information on suicide, self-harm, and eating disorders to mental health support services. Additionally, pornographic images in search results will be blurred to prevent accidental exposure to children, although adults can choose to view them by clicking through.

These changes are part of the Age-Restricted Material Codes, which apply to a range of online service providers, including app stores and social media services.

The eSafety Commissioner, Julie Inman Grant, emphasized the importance of these measures, noting that many young people encounter age-inappropriate content unintentionally, often through search engines. The initiative aims to prevent children from being exposed to disturbing content and to provide immediate access to professional help for those in distress.