Apple just delayed the iPhone photo scanning program following the backlash

Apple has reportedly decided to delay a controversial upcoming program to scan iPhones for child sexual abuse material (CSAM).

"Last month, we announced plans for a feature intended to help limit the spread of child sexual abuse materials, protecting children from predators who use communication tools to recruit and exploit them," Apple said in a statement emailed to reporters.

"Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the next few months to gather input and make improvements before releasing these critically important child safety features."

Apple's plan, announced last month, was to scan both the iPhone and iCloud for known CSAM images. The system was supposed to arrive by the end of 2021 as an update to iOS 15, which itself is expected to roll out in September or October.

This rather complex system uses artificial intelligence to examine all images in an Apple user's photo library and match them against a database of known CSAM images provided by the National Center for Missing & Exploited Children (NCMEC).

If a total of 30 CSAM matches is found across the user's iPhone and the same user's iCloud photo library, the system flags the Apple account and the matching images for human review.
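At a high level, the mechanism is a fingerprint-then-threshold flow: fingerprint each photo, compare the fingerprint against the known-CSAM list, and escalate only once the match count crosses the threshold. The Python sketch below illustrates that logic only; `image_fingerprint`, `scan_library`, and the use of SHA-256 are illustrative stand-ins, since the real system uses Apple's perceptual NeuralHash plus cryptographic threshold techniques so that sub-threshold matches are invisible even to Apple.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for Apple's NeuralHash: the real system derives a
# perceptual hash that survives resizing and re-encoding, whereas a
# cryptographic hash like SHA-256 only matches byte-identical files. This
# is purely illustrative of the match-and-threshold flow.
def image_fingerprint(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

MATCH_THRESHOLD = 30  # per Apple: roughly 30 matches before human review

def scan_library(image_paths: list[Path], known_csam_hashes: set[str]) -> bool:
    """Return True if the account should be flagged for human review."""
    matches = sum(
        1 for p in image_paths if image_fingerprint(p) in known_csam_hashes
    )
    # Below the threshold, Apple says it learns nothing about the account.
    # The real design enforces that cryptographically (threshold secret
    # sharing); this sketch simply counts matches in the clear.
    return matches >= MATCH_THRESHOLD
```

In the actual design, the comparison happens on the device against a blinded version of the NCMEC hash list, so neither the phone nor Apple can tell which individual photos matched until the threshold is crossed; the sketch collapses all of that into a plain set lookup.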

Apple defends the program as protecting both user privacy and abused children.

"We are ...... We think what we're doing here is a cutting-edge advance in privacy and enables a more private world," Craig Federighi, Apple's vice president of software engineering, told the Wall Street Journal.

Federighi added that the scanning system was designed "in the most privacy-protective way we can imagine, in the most auditable and verifiable way possible."

Apple's position is that other cloud storage companies already scan uploaded user images for CSAM without notifying users, while Apple does not and will not until its new system is in place. That is not entirely true, however: Apple already scans iCloud Mail for CSAM.

Despite this reassurance, the announcement was met with significant pushback from privacy advocates and technology policy experts. The Electronic Frontier Foundation called Apple's program "mass surveillance" and joined the ACLU, the Center for Democracy and Technology, and dozens of other groups in writing a letter to Apple CEO Tim Cook asking him to cancel the program.

"We thought our devices belonged to us, and Apple said during the Apple v. FBI trial, 'Your devices belong to you. Our devices are not ours,'" Riana Pfefferkorn, a researcher at Stanford University's Center for Internet and Society, said in an interview with the Verge." Now it looks like, well, maybe the device really is still Apple's after all, or at least the software on it."

Some in the tech community have speculated that Apple planned to implement a CSAM scanning program to satisfy law enforcement officials.

According to a Reuters report from early 2020, the company had planned in 2018 to fully encrypt iPhone iCloud backups so that even Apple could not see them, but dropped the plan after the FBI objected that it would interfere with criminal investigations.

"I think there is some political strategy going on behind the scenes here," Jen King of Stanford University's Institute for Human-Centered Artificial Intelligence told the Verge.

"If they're trying to take a big overall position on encryption, this was the part that they had to cede to law enforcement to do that."

When Tom's Guide asked if there might be some quid pro quo between Apple and the US Department of Justice regarding the CSAM scan, an Apple spokesperson had no comment.
