Apple confirms it will start scanning iCloud and Messages to detect child abuse images

Apple will begin using a system that detects sexually explicit photos sent through Messages and compares photos uploaded to iCloud against a database of known child sexual abuse material (CSAM).

According to the announcement (via Reuters), these new child safety measures will be implemented with the release of iOS 15, watchOS 8, and macOS Monterey later this year. The move comes several years after Google, Facebook, and Microsoft introduced similar systems: Microsoft developed the PhotoDNA hash-matching system in 2009, Google has used comparable hashing technology since 2008, and Facebook and Twitter adopted PhotoDNA in 2011 and 2013, respectively.

Update (8/14): Since the initial announcement, Apple has clarified its photo-scanning policy, stating that it will only flag images that appear in the CSAM databases of multiple national clearinghouses.

The Messages app will now alert children and their parents if sexually explicit photos are sent or received. The app will blur the image and say, "It's not your fault, but sensitive photos and videos can be used to hurt you."

The system uses on-device machine learning to analyze image attachments. If it determines that a photo is sexually explicit, the photo is blurred.
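Apple has not published the classifier Messages uses, but conceptually the on-device step resembles a standard Vision/Core ML classification followed by a blur. In the minimal Swift sketch below, the model, the "explicit" label, and the confidence threshold are all assumptions for illustration; the point is that the analysis and the blurring happen entirely on the device.

```swift
import Vision
import CoreImage

// Minimal sketch only: the classifier model, the "explicit" label, and the
// 0.9 confidence threshold are assumptions, not Apple's actual parameters.
// Everything runs locally; no image data leaves the device.
func blurIfSensitive(_ image: CIImage, classifier: VNCoreMLModel) throws -> CIImage {
    var isSensitive = false

    // Run the bundled classifier entirely on the device.
    let request = VNCoreMLRequest(model: classifier) { request, _ in
        guard let results = request.results as? [VNClassificationObservation],
              let top = results.first else { return }
        // Treat a high-confidence "explicit" label as a positive detection.
        isSensitive = (top.identifier == "explicit" && top.confidence > 0.9)
    }
    try VNImageRequestHandler(ciImage: image, options: [:]).perform([request])

    guard isSensitive else { return image }

    // Blur the thumbnail shown in Messages until the user taps through the warning.
    let blur = CIFilter(name: "CIGaussianBlur")!
    blur.setValue(image, forKey: kCIInputImageKey)
    blur.setValue(30.0, forKey: kCIInputRadiusKey)
    return blur.outputImage ?? image
}
```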

"iOS and iPadOS use a new application of cryptography to limit the spread of CSAM online while respecting user privacy."

"CSAM detection helps Apple provide law enforcement with valuable information about CSAM collections in iCloud Photos.

This system allows Apple to detect known CSAM stored in iCloud Photos and report it to the National Center for Missing & Exploited Children (NCMEC).

According to MacRumors, Apple uses a "NeuralHash" system that hashes photos on a user's iPhone or iPad and compares those hashes against the known CSAM database before the photos are uploaded to iCloud. If the system finds enough matches, the case is escalated for human review.
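NeuralHash is not a public API, so the sketch below is only a schematic of the matching step described above: photos queued for upload are reduced to compact fingerprints, compared against the on-device database, and an account is escalated only after several matches. The hash representation and the threshold value here are placeholders, not Apple's actual parameters.

```swift
import Foundation

// Illustrative sketch only: the fingerprint type and review threshold are
// placeholders standing in for Apple's unpublished NeuralHash parameters.
struct PerceptualHash: Hashable {
    let bits: UInt64   // compact fingerprint of the image's visual content
}

struct CSAMMatcher {
    /// Known-CSAM fingerprints shipped to the device.
    let knownHashes: Set<PerceptualHash>
    /// An account is escalated for human review only after several matches,
    /// not on the first hit (the exact threshold is assumed here).
    let reviewThreshold: Int

    /// Count how many photos queued for iCloud upload match the database.
    func matchCount(of uploads: [PerceptualHash]) -> Int {
        uploads.filter { knownHashes.contains($0) }.count
    }

    /// Decide whether the account crosses the human-review threshold.
    func shouldEscalate(_ uploads: [PerceptualHash]) -> Bool {
        matchCount(of: uploads) >= reviewThreshold
    }
}
```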

Apple is also updating Siri and Search to help children and parents report CSAM. When someone searches for CSAM-related topics, a prompt appears that points the user toward help and reporting resources.

Of course, such a system will always raise privacy concerns, and Apple aims to address these as well: "Apple's method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users' devices."
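Apple's actual protocol blinds the database with server-held cryptography and a private set intersection scheme, which is beyond a short example. The simplified Swift sketch below only illustrates the general idea in the quoted passage: the provider hashes are stored in a one-way, unreadable form, and the comparison happens locally on the device.

```swift
import Foundation
import CryptoKit

// Simplified stand-in: re-hashing each entry with SHA-256 is not Apple's
// blinded-hash construction, but it shows the idea of shipping the list in
// a form that cannot be read back as raw perceptual hashes.
func unreadableHashSet(from providerHashes: [Data]) -> Set<Data> {
    var blinded = Set<Data>()
    for hash in providerHashes {
        // One-way digest: the original perceptual hash cannot be recovered.
        blinded.insert(Data(SHA256.hash(data: hash)))
    }
    return blinded
}

// On-device check: the photo's hash is digested the same way and looked up
// locally, so the comparison does not require sending images to a server.
func matchesKnownCSAM(photoHash: Data, against blinded: Set<Data>) -> Bool {
    blinded.contains(Data(SHA256.hash(data: photoHash)))
}
```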

According to Apple, "the chance of accidentally flagging a particular account is less than one in a trillion per year."
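Apple has not published the per-image false-positive rate behind that figure, but a back-of-the-envelope binomial calculation shows why requiring many independent matches pushes the per-account rate so low. All of the numbers in the sketch below (per-image rate, library size, match threshold) are assumptions chosen purely for illustration.

```swift
import Foundation

// Back-of-the-envelope sketch: every number here is assumed, not Apple's.
// It computes log10 of P(at least t false matches) for n photos with
// per-image false-match probability p, using the dominant first term of
// the binomial tail, C(n,t) * p^t * (1-p)^(n-t), valid when n*p << t.
func log10FalseFlagProbability(photos n: Int, perImageRate p: Double, threshold t: Int) -> Double {
    let nD = Double(n), tD = Double(t)
    // lgamma gives log-factorials for the binomial coefficient.
    let log10Choose = (lgamma(nD + 1) - lgamma(tD + 1) - lgamma(nD - tD + 1)) / log(10)
    return log10Choose + tD * log10(p) + (nD - tD) * log10(1 - p)
}

// Example: 20,000 photos, a hypothetical one-in-a-million per-image error
// rate, and a 30-match threshold give roughly 10^-83 per account, far below
// a one-in-a-trillion (10^-12) bound.
print(log10FalseFlagProbability(photos: 20_000, perImageRate: 1e-6, threshold: 30))
```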

Even with this reportedly low rate of false flags, some fear that this type of technology could be used in other ways, such as to pursue anti-government protesters who upload images critical of a regime.

Regardless of potential privacy concerns, John Clark, chief executive of the National Center for Missing & Exploited Children, believes that what Apple is doing is more beneficial than harmful.

"With so many people using Apple products, these new safety measures have the potential to save the lives of children who are seduced online and whose horrific images are circulating as child sexual abuse material," Clark said in a statement. 'The reality is that privacy and child protection can coexist.'

In other messaging news, Google appears to be working on an upgrade to its Messages app to make it easier for Android users to text their friends on iPhones.
