Apple Expands Child Safety Measures with New Nudity Reporting Feature in Messages

Image source: 9to5mac.com

Apple is expanding its child safety features with a new tool that lets children report nudity in photos or videos directly to Apple, as reported by The Guardian. Currently being tested in Australia on iOS 18.2, the feature extends Apple’s Communication Safety tool, which already blurs explicit content detected in photos or videos sent via Messages, AirDrop, or Contact Posters.

When an image or video containing nudity arrives, Apple’s on-device detection automatically blurs the content and displays a warning pop-up. The alert offers the user several options: messaging a trusted adult, accessing resources for help, or blocking the sender. With the new reporting option, young users can also send a report directly to Apple when they receive inappropriate material, adding a further layer of protection and response.
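
Apple has not published how Messages performs this check internally, but its public SensitiveContentAnalysis framework (iOS 17 and later) illustrates the same on-device approach. The sketch below is a minimal, hypothetical example: SCSensitivityAnalyzer and its analyzeImage call are Apple’s real API, while display, presentWarningOverlay, and the SafetyAction options are placeholders standing in for the warning pop-up described above.

```swift
import SensitiveContentAnalysis
import CoreGraphics

// Hypothetical actions mirroring the options in the warning pop-up.
enum SafetyAction {
    case messageTrustedAdult, viewResources, blockSender
}

// Placeholder UI hooks -- assumptions, not Apple's Messages internals.
func display(_ image: CGImage) { /* render the image normally */ }
func presentWarningOverlay(for image: CGImage, options: [SafetyAction]) {
    /* show the blurred image plus the listed options */
}

// Run Apple's public on-device sensitivity check before showing a
// received image; the analysis happens entirely on the device.
func handleIncomingImage(_ image: CGImage) async {
    let analyzer = SCSensitivityAnalyzer()

    // Respect the user's setting: skip analysis if the feature is off.
    guard analyzer.analysisPolicy != .disabled else {
        display(image)
        return
    }

    do {
        let analysis = try await analyzer.analyzeImage(image)
        if analysis.isSensitive {
            // Blur and offer the options the article describes.
            presentWarningOverlay(for: image,
                                  options: [.messageTrustedAdult,
                                            .viewResources,
                                            .blockSender])
        } else {
            display(image)
        }
    } catch {
        // Fail closed: if analysis errors out, keep the image hidden.
        presentWarningOverlay(for: image, options: [.viewResources])
    }
}
```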

The reporting process is detailed: it gathers the flagged image or video, the messages exchanged immediately before and after it, and contact information for both parties. Users can also complete a form describing the incident. Apple then reviews the report and, depending on the circumstances, can block the sender from sending further messages or, if warranted, refer the case to law enforcement.
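
Apple has not documented the report’s actual format. Purely as an illustration, the data described above could be modeled roughly as follows; every type and field name here is hypothetical, not Apple’s.

```swift
import Foundation

// Hypothetical model of what the article says a report gathers;
// none of these names come from Apple.
struct NudityReport: Codable {
    let flaggedAttachment: Data        // the flagged image or video
    let surroundingMessages: [String]  // messages sent just before and after
    let senderContact: String          // contact info for both parties
    let recipientContact: String
    let userDescription: String?       // optional form describing the incident
}

// Possible outcomes of Apple's review, per the article.
enum ReviewOutcome {
    case noAction
    case senderBlockedFromMessaging
    case referredToLawEnforcement
}
```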

Apple’s move aligns with recent expansions in digital safety measures from other tech giants, including Google. This week, Google announced a similar on-device Sensitive Content Warnings feature for Google Messages on Android, which likewise automatically blurs explicit content and offers support resources to users under 18.

Apple intends to make the new reporting feature available globally, though it has not given a timeline for a worldwide release. The effort marks Apple’s latest approach to child protection, following its 2021 announcement of similar child safety measures. After backlash from privacy advocates, Apple re-evaluated those plans and in 2022 canceled its proposal to scan iCloud Photos for child sexual abuse material, opting instead for on-device solutions that prioritize user privacy.