Apple’s Photos app offers various functionalities to assist users in locating images within their library and gaining insights into the content of those images. One notable feature is Enhanced Visual Search. Here’s how it operates and how Apple safeguards your privacy while you use it.
Visual Look Up vs. Enhanced Visual Search
It’s essential to distinguish between Enhanced Visual Search and Apple’s Visual Look Up feature. Introduced with iOS 15, Visual Look Up allows users to identify objects, landmarks, plants, and more within the Photos app.
For instance, you can swipe up on a photo to discover the breed of dog featured in it. It’s also capable of interpreting laundry care instructions on garments and explaining those obscure symbols on your vehicle’s dashboard.
On the other hand, Enhanced Visual Search functions independently of Visual Look Up. While Visual Look Up provides information about a specific photo currently in view, Enhanced Visual Search allows you to retrieve all photos in your library that are linked to a specific landmark or location. This feature also operates without the need for geolocation data.
For example, a search for “Golden Gate Bridge” will yield relevant images from your library, even if the landmark is out of focus in the background.
How does Enhanced Visual Search protect your privacy?
Recently, reports surfaced about how Enhanced Visual Search sends data derived from your photos to Apple to help identify landmarks and points of interest. Apple describes the feature in the Settings app as follows: “Allow this device to privately match places in your photos with a global index maintained by Apple so you can search by almost any landmark or point of interest.”
This has understandably raised privacy concerns, especially since the feature is enabled by default, meaning users must opt out rather than opt in.
Nevertheless, Apple has implemented a robust privacy framework designed to safeguard your data when Enhanced Visual Search is utilized.
The process begins with a method known as homomorphic encryption, which operates as follows:
- Your iPhone encrypts a query before transmitting it to a server.
- The server processes the encrypted query and generates a response.
- This response is then sent back to your iPhone, where it gets decrypted.
Crucially, in Apple’s implementation, only your devices have the decryption key, preventing the server from accessing the original request. Apple employs homomorphic encryption for multiple features, including Enhanced Visual Search.
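The steps above can be illustrated with a toy example. Textbook RSA (unpadded and insecure, shown here purely for illustration) happens to be homomorphic under multiplication: a server can multiply two ciphertexts without ever holding the private key, and only the keyholder can decrypt the result. Apple’s production scheme is a different, lattice-based construction, but the core idea of computing on encrypted data is the same.

```python
# Toy demonstration of the homomorphic property using textbook RSA.
# This is NOT Apple's scheme and is insecure -- illustration only.

# Small RSA keypair, generated on the "client." Only the client knows d.
p, q = 61, 53
n = p * q                           # public modulus
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (Python 3.8+)

def encrypt(m):
    # Client-side: encrypt a value before sending it to the server.
    return pow(m, e, n)

def decrypt(c):
    # Client-side: decrypt the server's reply with the private key.
    return pow(c, d, n)

# The client encrypts two values; only ciphertexts leave the device.
c1, c2 = encrypt(7), encrypt(6)

# The server multiplies the ciphertexts WITHOUT the private key...
c_product = (c1 * c2) % n

# ...and only the client can decrypt the computed result: 7 * 6 = 42.
print(decrypt(c_product))  # -> 42
```

The point of the sketch is the last two steps: the server performed a useful computation while seeing only ciphertext, and the answer is meaningless to anyone without the private key.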
Additionally, Apple utilizes a technique called private nearest neighbor search (PNNS) for Enhanced Visual Search. This functionality allows a user’s device to privately query “a global index of popular landmarks and points of interest maintained by Apple to find approximate matches for places depicted in their photo library.”
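Stripped of the cryptography, the matching step in PNNS is a nearest-neighbor lookup: compare the photo’s embedding against a server-side index of landmark embeddings and return the closest entry. The sketch below shows that comparison in the clear; in Apple’s system the query embedding stays encrypted end to end, and the index, vectors, and landmark names here are invented for illustration.

```python
import math

# Hypothetical server-side index: landmark name -> embedding vector.
# Real embeddings are high-dimensional; three dimensions keep this readable.
landmark_index = {
    "Golden Gate Bridge": [0.9, 0.1, 0.3],
    "Eiffel Tower":       [0.2, 0.8, 0.4],
    "Space Needle":       [0.1, 0.3, 0.9],
}

def cosine_similarity(a, b):
    # Similarity of two vectors, independent of their magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def nearest_landmark(query_embedding):
    # Return the index entry whose embedding best matches the query.
    return max(landmark_index,
               key=lambda name: cosine_similarity(query_embedding,
                                                  landmark_index[name]))

# An embedding computed on-device from a photo's region of interest.
photo_embedding = [0.85, 0.15, 0.35]
print(nearest_landmark(photo_embedding))  # -> Golden Gate Bridge
```

The “private” part of private nearest neighbor search is that this same comparison is carried out over encrypted embeddings, so the server learns neither the query vector nor which landmark matched.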
Here is a comprehensive overview of the Enhanced Visual Search request pipeline, as detailed on Apple’s Machine Learning website:
- An on-device machine learning model evaluates a photo to identify any “region of interest” (ROI) that may contain a landmark.
- If an ROI is detected, the model generates a “vector embedding” for that section of the image.
- This vector embedding is then encrypted and sent to a server database. Notably, the photo and its pixels are not transmitted to Apple; only a mathematical representation of the identified ROI is sent.
- Apple combines differential privacy with an OHTTP relay operated by a third party to hide the device’s IP address before the request reaches Apple’s servers.
- The device also sends “fake queries along with its genuine ones, preventing the server from differentiating between the two.” Additionally, queries are processed through an anonymization network to ensure the server cannot link multiple requests to the same client.
- The server processes the relevant parts of the embeddings and returns associated metadata, like landmark names, to the device. The server does not retain any data once the results are sent back to your iPhone.
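The decoy-query step in the pipeline above can be sketched as follows. The function names, decoy-generation strategy, and batch size are all assumptions made for illustration; the idea is simply that the device mixes fake queries in with the genuine one, lets the server answer everything, and keeps only the response it actually cares about.

```python
import random

def build_query_batch(real_query, num_decoys=3, embedding_dim=3):
    # Hypothetical sketch: pad the genuine query with decoy embeddings.
    # Here decoys are random vectors, an assumption for illustration.
    decoys = [[random.random() for _ in range(embedding_dim)]
              for _ in range(num_decoys)]
    batch = decoys + [real_query]
    random.shuffle(batch)          # hide the real query's position
    return batch

def keep_real_response(batch, responses, real_query):
    # Only the device knows which query was genuine; it keeps that
    # response and silently discards the decoy results.
    return responses[batch.index(real_query)]

real = [0.85, 0.15, 0.35]          # embedding of the real photo ROI
batch = build_query_batch(real)

# The server answers every query in the batch; from its side, the
# real request is indistinguishable from the fakes.
responses = [f"metadata for query {i}" for i in range(len(batch))]
print(keep_real_response(batch, responses, real))
```

Combined with the anonymization network, this means the server can neither tell which queries are real nor link a series of queries back to one device.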
Though this may be a lot of technical jargon, it supports Apple’s assertions that the company does not have access to any specific information about the contents of your photos.
To disable Enhanced Visual Search, open the Settings app, tap “Apps,” then “Photos.” Scroll to the bottom, where you will find the Enhanced Visual Search toggle. Apple recommends using this toggle in low-data areas.
Follow Chance: Threads, Bluesky, Instagram, and Mastodon.