Google implements default SafeSearch to blur explicit imagery in search results
This new policy, initially announced in February, aims to keep users and their families from stumbling upon explicit material.
Google has completed the rollout of its new SafeSearch feature, which automatically blurs explicit imagery, including violent and sexual photos, in search results. Initially announced in February, this change is now available for all users.
The feature is designed to protect users and their families from accidentally encountering explicit content. While SafeSearch blurring is now the default setting, users can still adjust or deactivate it as needed. Google will notify users when SafeSearch is activated by default, and blurred images can be revealed by clicking a ‘view image’ button.
However, SafeSearch only applies to Google search results and doesn’t block explicit content on other search engines or websites. Google initially made SafeSearch the default for signed-in users under 18 in August 2021, coinciding with increased scrutiny of tech companies’ impact on children.
This change aligns with Google’s recent efforts to enhance user control over personal information, privacy, and online safety, including easier removal of self-related search results and updated policies on explicit images in search.
Why does it matter?
Explicit content, including violent and sexual imagery, can have adverse psychological and emotional effects, particularly on younger users. Making SafeSearch the default meaningfully improves online safety, especially for less tech-savvy individuals who might struggle to activate filters manually. However, it also raises questions about who should decide what content is acceptable, and default filters that blur explicit material could be seen as Google making value judgments on behalf of its users. As Google advances its new policy, it should weigh these factors to ensure that online safety measures remain both effective and respectful of individual freedoms.