Apple Auto-opts Everyone Into Having Their Photos Analyzed By AI For Landmarks
Apple last year deployed a mechanism for identifying landmarks and places of interest in images stored in the Photos application on its customers' iOS and macOS devices, and enabled it by default, seemingly without explicit consent.
Apple customers have only just begun to notice.
The feature, known as Enhanced Visual Search, was called out last week by software developer Jeff Johnson, who expressed concern in two write-ups about Apple's failure to explain the technology, which is believed to have arrived with iOS 18.1 and macOS 15.1 on October 28, 2024.
In a policy document dated November 18, 2024 (not indexed by the Internet Archive's Wayback Machine until December 28, 2024, the date of Johnson's initial article), Apple describes the feature as privately matching places in users' photos against a global index maintained on its servers, using homomorphic encryption, differential privacy, and an OHTTP relay that hides IP addresses from Apple.
Apple did explain the technology in a technical paper published on October 24, 2024, around the time that Enhanced Visual Search is believed to have debuted. A local machine-learning model analyzes photos to look for a "region of interest" that may depict a landmark. If the AI model finds a likely match, it calculates a vector embedding – an array of numbers – representing that portion of the image.
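To make "vector embedding" concrete, here is a minimal Python sketch. The random projection below is a stand-in for Apple's actual on-device model, which is not public; the point is only that the output is a fixed-length array of numbers describing the cropped region, suitable for similarity comparison.

```python
import numpy as np

def embed_region(pixels: np.ndarray, dim: int = 128) -> np.ndarray:
    """Toy stand-in for the on-device model: project a cropped image
    region to a fixed-length, L2-normalized vector embedding."""
    rng = np.random.default_rng(seed=0)           # fixed "weights" for the demo
    projection = rng.standard_normal((dim, pixels.size))
    vec = projection @ pixels.ravel().astype(np.float64)
    return vec / np.linalg.norm(vec)              # unit length, ready for cosine similarity

# A fake 32x32 grayscale crop standing in for the "region of interest"
region = np.random.default_rng(seed=1).random((32, 32))
embedding = embed_region(region)
print(embedding.shape)  # (128,)
```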
The device then uses homomorphic encryption to scramble the embedding in such a way that it can be run through carefully designed algorithms that produce an equally encrypted output. The goal is that the encrypted data can be sent to a remote system for analysis without the operator of that system ever learning the contents of that data; the operator can only perform computations on it, the results of which remain encrypted. The input and output are end-to-end encrypted, and not decrypted during the mathematical operations, or so it's claimed.
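The sketch below is emphatically not Apple's scheme; it uses the much simpler textbook Paillier cryptosystem, with toy-sized keys, purely to show the defining homomorphic property: arithmetic performed on ciphertexts maps to arithmetic on the hidden plaintexts, without the server ever decrypting anything.

```python
from math import gcd

# Textbook Paillier with demo-sized primes -- NOT Apple's production scheme.
# This only illustrates the core homomorphic property:
# multiplying ciphertexts adds the underlying plaintexts.
p, q = 4871, 7793                    # toy primes; real moduli are ~2048 bits
n = p * q
n2 = n * n
g = n + 1
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)  # modular inverse (Python 3.8+)

def encrypt(m, r):
    # r is normally random per message; fixed here for a reproducible demo
    assert gcd(r, n) == 1
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

c1 = encrypt(42, r=1234)
c2 = encrypt(58, r=5678)
# The server can combine ciphertexts without ever decrypting them:
c_sum = (c1 * c2) % n2
print(decrypt(c_sum))  # 100 == 42 + 58
```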
The dimension and precision of the embedding are adjusted to reduce the high computational demands of this homomorphic encryption (presumably at the cost of labeling accuracy) "to meet the latency and cost requirements of large-scale production services." That is to say, Apple wants to minimize its cloud compute cost and mobile device resource usage for this free feature.
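One plausible version of that trade-off, sketched below as an assumption rather than Apple's documented method: truncate the embedding to fewer dimensions and quantize it from 32-bit floats to 8-bit integers, shrinking the data the encrypted pipeline has to chew on.

```python
import numpy as np

def shrink_embedding(vec: np.ndarray, keep_dims: int = 64) -> np.ndarray:
    """Illustrative only: truncate an embedding and quantize it to int8,
    trading labeling accuracy for cheaper encrypted computation."""
    truncated = vec[:keep_dims]                   # lower dimension
    scale = np.abs(truncated).max() or 1.0        # avoid dividing by zero
    return np.round(truncated / scale * 127).astype(np.int8)  # lower precision

full = np.random.default_rng(seed=0).standard_normal(128).astype(np.float32)
small = shrink_embedding(full)
print(full.nbytes, "->", small.nbytes)  # 512 -> 64 bytes per embedding
```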
With some server optimization metadata and the help of Apple's private nearest neighbor search (PNNS), the relevant Apple server shard receives a homomorphically encrypted embedding from the device, performs the aforementioned encrypted computations on that data to find a landmark match from a database, and returns the result to the client device without providing identifying information to either Apple or its OHTTP partner Cloudflare.
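Stripped of the cryptography, the server's job is ordinary nearest-neighbor search. The plaintext sketch below (toy three-dimensional embeddings and invented landmark entries) shows the lookup that the encrypted protocol is designed to perform without the server seeing the query.

```python
import numpy as np

# Plaintext analogue of the server-side lookup. In the real system the
# query arrives encrypted and the comparison itself runs under homomorphic
# encryption; here everything is in the clear purely for clarity.
landmark_db = {
    "Eiffel Tower": np.array([0.9, 0.1, 0.0]),
    "Golden Gate":  np.array([0.1, 0.9, 0.1]),
    "Sydney Opera": np.array([0.0, 0.2, 0.95]),
}

def nearest_landmark(query: np.ndarray) -> str:
    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(landmark_db, key=lambda name: cosine(query, landmark_db[name]))

print(nearest_landmark(np.array([0.85, 0.2, 0.05])))  # Eiffel Tower
```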
Thus, Apple unilaterally began running people's photos through a locally running machine-learning algorithm that analyzes image details (on a purely visual basis, without using location data) and creates a value associated with what could be a landmark in each picture. That value is then used on a remote server to check an index of such values stored on Apple servers, in order to label the landmarks and places in each snap that appear in Apple's database.
Put more simply: You take a photo; your Mac or iThing locally outlines what it thinks is a landmark or place of interest in the snap; it homomorphically encrypts a representation of that portion of the image in a way that can be analyzed without being decrypted; it sends the encrypted data to a remote server to do that analysis, so that the landmark can be identified from a big database of places; and it receives the suggested location again in encrypted form that it alone can decipher.
If it all works as claimed, and there are no side-channels or other leaks, Apple can't see what's in your photos, neither the image data nor the looked-up label.
- Apple offers to settle 'snooping Siri' lawsuit for an utterly incredible $95M
- Fining Big Tech isn't working. Make them give away illegally trained LLMs as public domain
- Apple called on to ditch AI headline summaries after BBC debacle
- Apple and Meta trade barbs over interoperability requests
Apple claims that its use of this homomorphic encryption plus what's called differential privacy – a way to protect the privacy of people whose data appears in a data set – precludes potential privacy problems.
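For readers unfamiliar with the term, here is differential privacy in one toy example: the classic Laplace mechanism, which answers a query with calibrated noise so that any one person's presence in the data barely changes the distribution of answers. This illustrates the general concept only, not how Apple applies it to photo queries.

```python
import math
import random

def dp_count(values, epsilon: float = 1.0) -> float:
    """Toy Laplace mechanism: a count query has sensitivity 1 (adding or
    removing any one person shifts the true answer by at most 1), so noise
    of scale 1/epsilon statistically masks each individual's presence."""
    u = random.random() - 0.5  # inverse-CDF sampling of Laplace(0, 1/epsilon)
    noise = -math.copysign(1.0, u) * (1.0 / epsilon) * math.log(1 - 2 * abs(u))
    return len(values) + noise

visitors = ["alice", "bob", "carol"]
print(dp_count(visitors))  # e.g. 3.7 -- the true count of 3, plus noise
```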
"Apple is being thoughtful about doing this in a (theoretically) privacy-preserving way, but I don’t think the company is living up to its ideals here," observed software developer Michael Tsai in an analysis shared Wednesday. "Not only is it not opt-in, but you can’t effectively opt out if it starts uploading metadata about your photos before you even use the search feature. It does this even if you’ve already opted out of uploading your photos to iCloud."
Tsai argues Apple's approach is even less private than its abandoned CSAM scanning plan "because it applies to non-iCloud photos and uploads information about all photos, not just ones with suspicious neural hashes."
Nonetheless, Tsai acknowledges Apple's claim that data processed in this way is encrypted and dissociated from the user's account and IP address.
While there's no evidence at this point that contradicts Apple's privacy assertions, the community concern has more to do with the way in which Apple deployed this technology.
"It’s very frustrating when you learn about a service two days before New Years and you find that it’s already been enabled on your phone," said Matthew Green, associate professor of computer science at the Johns Hopkins Information Security Institute in the US.
The Register asked Apple to comment, and as usual we've received no reply. We note that this lack of communication is the essence of the community's discontent.
"My objection to Apple's Enhanced Visual Search is not the technical details specifically, which are difficult for most users to evaluate, but rather the fact that Apple has taken the choice out of my hands and enabled the online service by default," said Johnson in his second post.
He told The Register that it's unclear whether data or metadata from your Photos library is uploaded before you even have a chance to switch the feature off.
"I don't think anybody knows, and Apple hasn't said," Johnson observed. ®