Starting with iOS 15 and iPadOS 15, Apple is going to apply a new child-protection policy when it comes to scanning photos that you upload to iCloud. This policy will help Apple report illegal child pornography images to the authorities, and on the surface, it sounds like a good thing that Apple is doing. But there’s a lot of controversy and confusion around how they’re doing it, so let’s talk about how it works, and then what you can do if you want to stop Apple from scanning your iPhone photos.
How Apple’s iPhone photo scanning feature works
Part of the confusion comes from the fact that Apple announced two child safety features together, but they work in completely different ways.
First is the child-pornography scanning feature for iCloud Photos. Here, Apple scans your photos for digital fingerprints of known child pornography and matches them against a database of CSAM (Child Sexual Abuse Material) images. That database is maintained by the National Center for Missing & Exploited Children (NCMEC), a quasi-governmental entity in the U.S.
The second feature is a machine learning-based, opt-in feature limited to the Messages app on iPhone and iPad. This is used to alert children or their parents about pornographic images in the Messages app.
The controversy surrounds the first feature, the iCloud Photos scanning, which is enabled by default for all iCloud Photos users. When your iPhone uploads a photo to iCloud Photos (assuming you have the iCloud Photos feature enabled), a multi-part algorithm does some analysis of the photo on your device and sends the result up to iCloud. iCloud then performs the other part of the analysis; if your account crosses a threshold of 30 matches against known child pornography images, Apple flags your account.
Then, Apple’s manual review process kicks in, and Apple can see the flagged images (but not the rest of your library). After the review, Apple sends a report to NCMEC, and the authorities take over from there.
Apple says that this program only runs against the database of known child pornography images from NCMEC, and doesn’t flag regular pornography, nude photos, or, for example, photos of your child in a bathtub. Apple maintains that the process is secure, and Craig Federighi goes into the technical details in a recent WSJ interview. If you’re curious, take a look at the video below.
According to Apple, there’s no real scanning of photos going on here. Essentially, Apple assigns your photo a “neural hash” (a string of numbers identifying your photo), then compares that against hashes from the CSAM database. It records the result of that comparison in what Apple calls a Safety Voucher, which is uploaded along with the image.
The cloud side then does further analysis and matching based on these hashes; only if 30 Safety Vouchers contain matches for CSAM images is your account flagged, at which point human reviewers check whether the images are actually illegal and, if so, the images and account are reported.
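The threshold logic described above can be sketched in a few lines of Python. This is a deliberately simplified illustration, not Apple’s actual implementation: the hash values, database, and function names here are all hypothetical stand-ins, and the real system uses cryptographic techniques (private set intersection and threshold secret sharing) so that neither the device nor the server learns about individual matches below the threshold.

```python
# Simplified sketch of threshold-based hash matching.
# KNOWN_CSAM_HASHES and the plain string hashes are hypothetical stand-ins;
# the real NeuralHash values and matching protocol are far more complex.

KNOWN_CSAM_HASHES = {"hash_a", "hash_b", "hash_c"}  # stand-in for the NCMEC database
MATCH_THRESHOLD = 30  # Apple's stated threshold

def account_is_flagged(photo_hashes: list[str]) -> bool:
    """Flag the account only when matches reach the threshold."""
    matches = sum(1 for h in photo_hashes if h in KNOWN_CSAM_HASHES)
    return matches >= MATCH_THRESHOLD

# 29 matching vouchers stay below the threshold; 30 trip it.
print(account_is_flagged(["hash_a"] * 29))  # False
print(account_is_flagged(["hash_a"] * 30))  # True
```

The point of the threshold is that a single false-positive hash collision never triggers a human review; only a sustained pattern of matches does.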
How to stop Apple from scanning your iPhone photos
So, now that you know how the system works, you can choose if you want to stop Apple from doing it. This scanning only happens when photos are uploaded to iCloud.
Photos sent in messaging apps like WhatsApp or Telegram aren’t scanned by Apple. Still, if you don’t want Apple to do this scanning at all, your only option is to disable iCloud Photos. To do that, open the “Settings” app on your iPhone or iPad, go to the “Photos” section, and disable the “iCloud Photos” feature. From the popup, choose the “Download Photos & Videos” option to keep a local copy of your iCloud Photos library.
You can also use the iCloud website to download all your photos to your computer. Your iPhone will now stop uploading new photos to iCloud, and Apple won’t scan any of them.
Looking for an alternative? There really isn’t one. All major cloud-backup providers do the same kind of scanning; they just do it entirely in the cloud (while Apple uses a mix of on-device and cloud matching). If you don’t want this kind of photo scanning, use local backups, a NAS, or a backup service that is fully end-to-end encrypted.