Apple announces new iPhone features to detect child sex abuse


Following a report that the company was building a tool to scan iPhones for child abuse imagery, Apple has published a post providing more details on the effort. With the release of iOS 15, watchOS 8 and macOS Monterey later this year, the company says it will introduce a variety of child safety features across Messages, Photos and Siri.

To start, the Messages app will include new notifications that will warn children, as well as their parents, when they either send or receive sexually explicit photos. When someone sends a child an inappropriate image, the app will blur it and display several warnings. “It’s not your fault, but sensitive photos and videos can be used to hurt you,” says one of the notifications, per a screenshot Apple shared. 

As an additional precaution, the company says Messages can also notify parents if their child chooses to view a sensitive photo. “Similar protections are available if a child attempts to send sexually explicit photos,” according to Apple. The company notes the feature uses on-device machine learning to determine whether a photo is explicit, and that Apple itself never gains access to the messages. The feature will be available for family iCloud accounts.
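Apple hasn’t said how the Messages classifier works beyond the fact that it runs entirely on-device. Purely as a hypothetical sketch of that general pattern, the snippet below runs an image classification request locally and flags a photo when a label crosses a confidence threshold; the “explicit” label and the 0.8 cutoff are illustrative assumptions, not anything Apple has documented.

```swift
import Vision
import Foundation

// Hypothetical sketch only: Apple has not published its Messages model.
// Vision's generic classifier stands in here to show the on-device flow;
// the "explicit" label and 0.8 threshold are illustrative assumptions.
func isLikelySensitive(imageURL: URL) throws -> Bool {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(url: imageURL)
    try handler.perform([request])          // runs entirely on-device
    let observations = request.results ?? []
    return observations.contains {
        $0.identifier == "explicit" && $0.confidence > 0.8
    }
}
```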

Apple will also introduce new software tools in iOS and iPadOS that will allow the company to detect when someone uploads content to iCloud that shows children involved in sexually explicit acts. The company says it will use the technology to notify the National Center for Missing & Exploited Children (NCMEC), which will in turn work with law enforcement agencies across the US. “Apple’s method of detecting known CSAM [Child Sexual Abuse Material] is designed with user privacy in mind,” the company claims.

Rather than scanning photos when they’re uploaded to the cloud, the system will check them against an on-device database of “known” images provided by NCMEC and other organizations. Because the database stores image hashes, which reduce each photo to a sort of digital fingerprint, the company says it is unreadable on the device.
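To make the fingerprint idea concrete, here’s a minimal sketch. It is illustrative only: a plain SHA-256 stands in for Apple’s perceptual hash (which the company calls “NeuralHash” and which is designed to match visually similar images rather than identical bytes), and the sample database entry is a made-up placeholder.

```swift
import CryptoKit
import Foundation

// Illustrative only: SHA-256 stands in for Apple's perceptual NeuralHash,
// which matches visually similar images rather than identical bytes.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Hypothetical on-device database of known fingerprints. In Apple's design
// the entries are additionally blinded so the device can't read them.
let knownFingerprints: Set<String> = [
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
]

let photo = Data("example photo bytes".utf8)
let isMatch = knownFingerprints.contains(fingerprint(of: photo))
print(isMatch)  // false for this sample data
```

With a perceptual hash in place of SHA-256, the same set-membership check would tolerate resizing or recompression of a known image.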

A cryptographic technique called private set intersection allows Apple to determine whether there’s a match without seeing the result of the process. When a match occurs, an iPhone or iPad creates a cryptographic safety voucher that encrypts the upload, along with additional data about it. Another technique, called threshold secret sharing, ensures the company can’t see the contents of those safety vouchers unless an account crosses an unspecified threshold of known CSAM matches. “The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account,” according to the company.
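Apple hasn’t published the exact construction it uses, but threshold secret sharing is a standard cryptographic primitive. Below is a minimal toy sketch of Shamir’s (t, n) scheme, which has exactly the property the company describes: any t shares reconstruct the secret, while fewer reveal nothing. The field size, parameters and secret value are all illustrative.

```swift
import Foundation

// Illustrative Shamir (t, n) secret sharing over a small prime field.
// Apple has not published its construction; this only demonstrates the
// property the system relies on: fewer than t shares reveal nothing.
let p = 2_147_483_647  // prime modulus (2^31 - 1); toy-sized, not production

func mod(_ a: Int, _ m: Int) -> Int { ((a % m) + m) % m }

// Modular exponentiation, used for inverses via Fermat's little theorem.
func powMod(_ base: Int, _ exp: Int, _ m: Int) -> Int {
    var result = 1, b = mod(base, m), e = exp
    while e > 0 {
        if e & 1 == 1 { result = mod(result * b, m) }
        b = mod(b * b, m)
        e >>= 1
    }
    return result
}

func inverse(_ a: Int, _ m: Int) -> Int { powMod(a, m - 2, m) }

// Split `secret` into n shares; any t of them reconstruct it.
func split(secret: Int, t: Int, n: Int) -> [(x: Int, y: Int)] {
    var coeffs = [mod(secret, p)]                    // constant term = secret
    for _ in 1..<t { coeffs.append(Int.random(in: 0..<p)) }
    return (1...n).map { x in
        var y = 0                                    // Horner evaluation
        for c in coeffs.reversed() { y = mod(y * x + c, p) }
        return (x: x, y: y)
    }
}

// Lagrange interpolation at x = 0 recovers the secret from t shares.
func reconstruct(_ shares: [(x: Int, y: Int)]) -> Int {
    var secret = 0
    for (i, s) in shares.enumerated() {
        var num = 1, den = 1
        for (j, other) in shares.enumerated() where j != i {
            num = mod(num * mod(-other.x, p), p)
            den = mod(den * mod(s.x - other.x, p), p)
        }
        secret = mod(secret + mod(s.y * mod(num * inverse(den, p), p), p), p)
    }
    return secret
}

let shares = split(secret: 42, t: 3, n: 5)
print(reconstruct(Array(shares.prefix(3))))  // 42: threshold reached
print(reconstruct(Array(shares.prefix(2))))  // garbage: below threshold
```

Per Apple’s description, the role of the “secret” is played by material needed to decrypt the safety vouchers, so voucher contents only become readable once an account produces enough matches.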

Developing…
