Apple iPhone Update to Include Photo Scanning for Child Pornography

(AP Photo/Bebeto Matthews)

Apple, a company with a very high-profile commitment to user privacy, is worrying some users as it unveils a plan to combat child pornography and sexual abuse.

In a plan released on Thursday, Apple announced it would introduce child safety features aimed at identifying child sexual abuse imagery, increasing parents’ role in their children’s communication on Apple devices, and expanding Siri so that parents and children can find help if they come across an unsafe situation.

The initiative generating perhaps the most buzz is the cryptography the company plans to introduce to scan photos as they are uploaded to iCloud.

The detection system will only flag images that are already in the National Center for Missing and Exploited Children’s (NCMEC) database of known child pornography. Parents snapping innocent photos of a child in the bath presumably need not worry. But researchers say the matching tool, which doesn’t “see” such images but only the mathematical “fingerprints” that represent them, could be put to more nefarious purposes.

Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography. That could fool Apple’s algorithm and alert law enforcement. “Researchers have been able to do this pretty easily,” he said of the ability to trick such systems.

Other abuses could include government surveillance of dissidents or protesters. “What happens when the Chinese government says, ‘Here is a list of files that we want you to scan for,’” Green asked. “Does Apple say no? I hope they say no, but their technology won’t say no.”

Here’s how it works: When Apple drops iOS 15 and macOS Monterey in the next couple of months, new software called NeuralHash will assign a special code, known as a hash, to each image. Once a user uploads their photos to iCloud, those hashes will be run against hashes of known child abuse imagery provided by child advocacy groups. NeuralHash will be able to identify a matching hash without revealing what the image is or letting the user know it matched.
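To make that matching flow concrete, here is a minimal sketch in Python. It is an illustration, not Apple’s implementation: NeuralHash is a proprietary perceptual hash produced by a neural network so that resized or re-encoded copies of an image still match, and the real protocol is designed so the device never learns the result of the comparison. The toy version below substitutes a plain SHA-256 digest and an in-memory hash set, and every name and value in it is hypothetical.

```python
# Toy illustration of hash-based matching. NOT NeuralHash: a real perceptual
# hash tolerates resizing and re-compression, and Apple's protocol hides the
# match result from the device. All names and values here are hypothetical.
import hashlib

# Hypothetical stand-in for the database of known-image hashes supplied by
# child advocacy groups.
KNOWN_ABUSE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_fingerprint(image_bytes: bytes) -> str:
    """Return a fixed-length fingerprint of an image's contents; the
    fingerprint reveals nothing about what the image depicts."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes) -> bool:
    """Check whether an image's fingerprint appears in the known-hash set."""
    return image_fingerprint(image_bytes) in KNOWN_ABUSE_HASHES

# An ordinary photo produces a fingerprint that is not in the database,
# so nothing is flagged.
print(matches_known_database(b"...ordinary family photo bytes..."))  # False
```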

More from TechCrunch:

The results are uploaded to Apple but cannot be read on their own. Apple uses another cryptographic principle called threshold secret sharing, which allows it to decrypt the contents only if a user crosses a threshold of known child abuse imagery in their iCloud Photos. Apple would not say what that threshold is, but offered an example: if a secret is split into a thousand pieces and the threshold is ten images of child abuse content, the secret can be reconstructed from any ten of those pieces.
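The “split secret” example above describes threshold secret sharing in general, the best-known version of which is Shamir’s scheme. Apple has not published its exact construction, so the sketch below is a generic, hypothetical Shamir implementation meant only to show the principle: any ten shares reconstruct the secret, while nine reveal nothing useful.

```python
# Generic sketch of threshold secret sharing (Shamir's scheme). This is not
# Apple's construction; the parameters, field size, and API are illustrative.
import random

PRIME = 2**127 - 1  # a prime field large enough for a demo secret

def make_shares(secret: int, threshold: int, total: int):
    """Split `secret` into `total` shares; any `threshold` of them suffice."""
    # Random polynomial of degree threshold-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, total + 1)]

def recover_secret(shares):
    """Reconstruct the constant term by Lagrange interpolation at x = 0."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = make_shares(secret=123456789, threshold=10, total=1000)
print(recover_secret(shares[:10]) == 123456789)  # True: any 10 shares work
print(recover_secret(shares[:9]) == 123456789)   # almost surely False: 9 are too few
```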

It’s at that point that Apple can decrypt the matching images, manually verify the contents, disable the user’s account, and report the imagery to NCMEC, which then passes it on to law enforcement. Apple says this process is more privacy-minded than scanning files in the cloud, since NeuralHash only searches for known, not new, child abuse imagery. Apple said there is a one in one trillion chance of a false positive, but an appeals process is in place in the event an account is mistakenly flagged.
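To see why a match threshold drives the account-level error rate so low, here is a back-of-the-envelope calculation. Apple has not published its per-image error rate or the actual threshold, so the numbers below (a one-in-a-million per-image rate, 10,000 photos, a threshold of ten matches) are hypothetical placeholders; only the shape of the arithmetic is the point.

```python
# Hypothetical arithmetic: how a match threshold compounds a small per-image
# false-match rate into a vanishingly small account-level rate. None of the
# numbers below are Apple's published figures.
from math import comb

def account_false_positive_rate(per_image_rate: float, photos: int, threshold: int) -> float:
    """Probability that at least `threshold` of `photos` innocent images each
    independently trip a false match (binomial tail, truncated for float range)."""
    upper = min(photos, threshold + 100)  # terms far above the threshold are negligible
    return sum(
        comb(photos, k) * per_image_rate ** k * (1 - per_image_rate) ** (photos - k)
        for k in range(threshold, upper + 1)
    )

# Made-up example: 10,000 photos, a 1-in-a-million per-image error rate,
# and a threshold of 10 matches before anything can be decrypted.
print(account_false_positive_rate(1e-6, 10_000, 10))  # roughly 3e-27
```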

Now, the fight against child sexual abuse and child pornography is a noble one, and Apple deserves to be commended for joining it. But the issue of user privacy is raising a lot of concern among Apple users and tech professionals, and there are past instances where Apple has indeed caved to government pressure over its systems.

Apple's Tim Cook / AP/Reuters Feed Library

Two years ago, at the height of the protests in Hong Kong, Apple caved at Beijing’s behest and removed from its App Store apps that dissidents were using to organize. In fact, this initiative is probably related to ongoing efforts by regulators in the European Union and elsewhere to force tech companies like Apple to do more. In fairness, I do believe Apple can do better on its own than any government program or regulation could compel it to do. But if governments keep pushing, Apple runs the risk of opening its doors to the surveillance state far more readily than it once would have.

It’s also still not clear how safe the system would be against spoofing attempts from malicious actors.

There are still a lot of questions. Initiatives like this have been on law enforcement’s wish list for a while, and while the cause is a good one, plenty of tech and privacy advocates are raising concerns that give good reason for skepticism.
