Apple doesn’t make many mistakes when it comes to the security and privacy of its billion-plus iPhone owners. But when it does, the mistakes hit extra hard. So it is with the furor building this week, as surprised users start asking whether what happens on your iPhone still stays on your iPhone.
We’re talking photos, of course, and the complex Enhanced Visual Search feature into which Apple has seemingly “auto-opted” all its users, “violating” its fabled privacy guarantee. As I reported when this was first outed, Apple is capturing, masking and then centrally analyzing portions of user photos to flag landmarks and then geolocate them against a central dataset. If it works as billed, there is no privacy risk for users. But few will understand the technicalities, and so it becomes a leap of faith.
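For the technically curious, the flow as described can be sketched in a few lines of Python. Every function below is a hypothetical stand-in invented for illustration, not Apple’s code, and the “encryption” is a deliberate toy; the point is only that an encrypted derivative of a cropped photo region, never the photo itself, is what leaves the device.

```python
# Illustrative sketch of the Enhanced Visual Search flow as described.
# Every function is a hypothetical stand-in, not Apple's code.

def detect_landmark_region(photo: bytes) -> bytes:
    """On-device ML isolates the portion of the image that may show a landmark."""
    return photo[:32]  # stand-in for a cropped, masked region

def embed(region: bytes) -> list[int]:
    """An on-device model reduces that region to a numeric embedding."""
    return [b % 16 for b in region]

def encrypt(embedding: list[int]) -> list[int]:
    """Stand-in for homomorphic encryption: the server can match against
    this without being able to read it. NOT real cryptography."""
    return [(v + 7) % 16 for v in embedding]

def lookup_via_relay(ciphertext: list[int]) -> str:
    """The query travels via an OHTTP relay, so the server never sees the
    photo, the embedding, or even the requester's IP address."""
    return "encrypted landmark match"

photo = bytes(range(256))  # stand-in for image data
print(lookup_via_relay(encrypt(embed(detect_landmark_region(photo)))))
```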
So does this really violate your privacy? No, but neither is it nothing. There are two serious issues here for Apple. The first is optics: when you build a brand inside a privacy bubble, you don’t ever want to tease it with a pin. The second is a thin-end-of-the-wedge problem. This kind of hybrid device/cloud photo scanning has gotten Apple into trouble before, with its ill-fated CSAM proposal in 2021.
That child safety upgrade was designed to screen photos on-device against a hashed dataset of known, illicit CSAM imagery and then, where multiple images were flagged, to send them for human review. As was pointed out at the time, the issue wasn’t the CSAM screening per se, but rather the door it opened to screening for other material, whether religious, sexual or political, based on local laws and regulations. As I said at the time, the solid defense that such screening is technically impossible would suddenly fall away.
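Mechanically, the proposal amounted to something like the toy sketch below. It uses a plain exact hash where Apple’s system used a perceptual one (NeuralHash) plus cryptographic threshold schemes, so this is only the shape of the idea, not the real design. It also shows exactly why the wedge argument landed: nothing in the mechanism is specific to CSAM.

```python
# Toy sketch of threshold-based on-device hash screening.
# Illustrative only: Apple's 2021 proposal used perceptual hashing,
# not plain SHA-256, and cryptographic threshold reporting.
import hashlib

KNOWN_BAD_HASHES = {          # hypothetical hashed dataset shipped to the device
    hashlib.sha256(b"example-known-image-1").hexdigest(),
    hashlib.sha256(b"example-known-image-2").hexdigest(),
}
REVIEW_THRESHOLD = 2          # only act once multiple images match

def scan_library(photos: list[bytes]) -> bool:
    """Return True if enough photos match the dataset to trigger human review."""
    matches = sum(
        1 for p in photos
        if hashlib.sha256(p).hexdigest() in KNOWN_BAD_HASHES
    )
    return matches >= REVIEW_THRESHOLD

# The wedge concern: swap in a different hash set and the same code
# screens for religious, sexual or political material instead.
library = [b"example-known-image-1", b"holiday-photo", b"example-known-image-2"]
print(scan_library(library))  # True: two matches meet the threshold
```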
And so it is here, possibly. The idea that a user’s photos can be screened against a cloud dataset for any purpose is uncomfortable to many, at least to those who see their iPhones as their personal-eyes-only vault. As privacy expert Matthew Green posted on BlueSky, “it’s very frustrating when you learn about a service two days before New Years and you find that it’s already been enabled on your phone.”
Apple says Enhanced Visual Search “allows you to search for photos using landmarks or points of interest. Your device privately matches places in your photos to a global index Apple maintains on our servers. We apply homomorphic encryption and differential privacy, and use an OHTTP relay that hides IP address. This prevents Apple from learning about the information in your photos.”
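“Homomorphic encryption” is the load-bearing phrase in that statement: it lets a server compute on data it cannot read. The runnable toy below uses the textbook Paillier scheme with insecure, demo-sized primes; Apple’s production system uses a different and far stronger scheme, so this is only a demonstration of the principle. Here, a “server” adds two numbers it only ever sees encrypted.

```python
# Toy Paillier cryptosystem: additively homomorphic encryption.
# Demo-sized primes, no padding, no relation to Apple's implementation;
# it only illustrates the principle of computing on encrypted data.
import math
import random

p, q = 1789, 1867                 # toy primes; real keys use ~1024-bit primes
n = p * q
n_sq = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)      # private key
mu = pow((pow(g, lam, n_sq) - 1) // n, -1, n)

def encrypt(m: int) -> int:
    r = random.randrange(1, n)
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c: int) -> int:
    return ((pow(c, lam, n_sq) - 1) // n * mu) % n

# The "server" multiplies the ciphertexts, which adds the hidden plaintexts,
# without ever holding the private key or seeing a single plaintext value.
a, b = encrypt(41), encrypt(1)
print(decrypt((a * b) % n_sq))    # 42, computed blind by the server
```

If the matching really happens under encryption like this, with differential privacy adding noise and the OHTTP relay stripping the IP address, then Apple’s claim that it learns nothing about your photos is at least technically coherent. Whether users should have to take that on faith is the question.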
But as Jeff Johnson, the blogger who started this furor, points out, “I don’t understand most of the technical details of Apple’s blog post. I have no way to personally evaluate the soundness of Apple’s implementation of Enhanced Visual Search… Computing privacy is simple: if something happens entirely on my computer, then it’s private, whereas if my computer sends data to the manufacturer of the computer, then it’s not private, or at least not entirely private.”
Apple’s absolute control over what happens on-device versus off-device is already being challenged by the new cloud-based AI services it is pushing out. Much effort has gone into developing and then publicizing Apple’s “groundbreaking” Private Cloud Compute, which essentially provides a cloud extension of the device’s secure enclave, enabling central processing within a user’s private space. Where processing escapes this enclave, as with ChatGPT, Apple specifically calls it out to the user. Contrast that with the quiet, unannounced nature of this update. As Green says, “it was ‘discovered’ not announced by Apple.”
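Apple’s publicized pattern can be summed up as a routing rule. The sketch below is hypothetical, a reading of Apple’s public descriptions rather than its actual logic; the names are invented for illustration.

```python
# Hypothetical sketch of the routing rule Apple publicizes for its AI stack;
# the names and logic here are illustrative, not Apple's actual code.
from enum import Enum, auto

class Destination(Enum):
    ON_DEVICE = auto()               # stays local: private by construction
    PRIVATE_CLOUD_COMPUTE = auto()   # Apple's attested cloud enclave
    THIRD_PARTY = auto()             # e.g. ChatGPT: outside Apple's guarantees

def route(workload: str, user_consented_to_third_party: bool) -> Destination:
    if workload == "fits-on-device":
        return Destination.ON_DEVICE
    if workload == "needs-apple-cloud":
        return Destination.PRIVATE_CLOUD_COMPUTE
    # Anything leaving Apple's enclave is called out and gated on consent.
    if not user_consented_to_third_party:
        raise PermissionError("ask the user before sending data off-platform")
    return Destination.THIRD_PARTY
```

By that stated standard, Enhanced Visual Search is the anomaly: photo-derived data left the device with no call-out at all.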
And that is the issue here: the lack of transparency. It is made worse because, as Michael Tsai points out, “not only is it not opt-in, but you can’t effectively opt out if it starts uploading metadata about your photos before you even use the search feature. It does this even if you’ve already opted out of uploading your photos to iCloud… I don’t think the company is living up to its ideals here.”
That is the real problem here: optics and perception, not actual privacy harm. And it is a serious mistake. Had Apple presented this more openly, there would have been little if any fuss, and most users would not have opted out. But the iMaker has elsewhere gone to such lengths to require explicit opt-ins for any off-device data capture that this stands out as an oddity. I would not be surprised to see a U-turn, or a retrospective opt-in, appear.
I have approached Apple for a response to this furor; thus far, the company has not commented.