Does your iPhone share photo data with Apple by default?


Sure enough, when I checked my iPhone 15 Pro this morning, the toggle was on. You can find it for yourself by going to the Photos settings on your phone (via the iOS Settings app) or on a Mac (in the Photos app's settings menu). Enhanced Visual Search lets you search for landmarks you've taken photos of, or find those images by searching for the landmarks' names.

To see what it does in the Photos app, swipe up on a photo you’ve taken of a building and select “Find Landmark,” and a map will appear that conveniently identifies it. Here are some examples from my phone:

This is indeed St. Mary’s Cathedral in Austin; the image on the right, however, is not a Trappist monastery but the city hall building in Dubuque, Iowa.
Screenshots: Apple Photos

At first glance, this is a convenient extension of the Visual Look Up functionality in Photos that Apple introduced in iOS 15, which lets you identify plants or, for example, find out what the symbols on a laundry label mean. But Visual Look Up doesn’t need special permission to share data with Apple, and Enhanced Visual Search does.

A description below the toggle says you’re allowing Apple to “privately match locations in your photos with a global index maintained by Apple.” As for how, there are details in an Apple Machine Learning Research blog post about Enhanced Visual Search that Johnson links to:

The process begins with an on-device ML model that analyzes a given photo to determine if there is a “region of interest” (ROI) that may contain a landmark. If the model detects an ROI in the “landmark” domain, a vector embedding is calculated for that region of the image.
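Neither Apple’s models nor its code are public, so the following is only an illustrative sketch of what that quoted on-device step describes; the detect_landmark_roi and embed_region functions below are hypothetical stand-ins, not Apple’s APIs, and the 128-dimension size is an arbitrary choice for the example.

```python
import numpy as np

def detect_landmark_roi(photo: np.ndarray) -> tuple[int, int, int, int] | None:
    """Toy stand-in for the on-device detector: a real model would decide whether
    the photo contains a landmark and return its bounding box. Here we simply
    return a fixed box covering the whole image."""
    h, w = photo.shape[:2]
    return (0, 0, w, h)

def embed_region(photo: np.ndarray, roi: tuple[int, int, int, int]) -> np.ndarray:
    """Toy stand-in for the embedding model: crops the region of interest and
    reduces it to a fixed-length, normalized vector."""
    x, y, w, h = roi
    crop = photo[y:y + h, x:x + w].astype(np.float32).ravel()
    # Pad or trim to a fixed length so every region maps into the same
    # n-dimensional space regardless of its original size.
    vec = np.resize(crop, 128)
    return vec / (np.linalg.norm(vec) + 1e-9)

photo = np.random.rand(256, 256, 3)       # placeholder for a photo's pixel data
roi = detect_landmark_roi(photo)          # step 1: is there a landmark-like region?
if roi is not None:
    embedding = embed_region(photo, roi)  # step 2: vector embedding for that region
```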

According to the blog, this vector embedding is then encrypted and sent to Apple for comparison with its database. The company offers a very technical explanation of vector embeddings in a research paper, but IBM puts it more simply, writing that embeddings transform “a data point, such as a word, phrase, or image, into an n-dimensional array of numbers representing the characteristics of this data point.”
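To give a sense of what “comparison with a global index” can mean for embeddings, here is an equally hypothetical sketch: the index is just a table of landmark embeddings, and matching is a nearest-neighbor search by cosine similarity. It skips the encryption step the blog describes entirely and does not reflect Apple’s actual implementation; the landmark names and vectors are made up.

```python
import numpy as np

# A made-up "global index": landmark names mapped to normalized embeddings.
rng = np.random.default_rng(0)
index = {
    "St. Mary's Cathedral, Austin": rng.standard_normal(128),
    "City Hall, Dubuque": rng.standard_normal(128),
    "Eiffel Tower, Paris": rng.standard_normal(128),
}
index = {name: v / np.linalg.norm(v) for name, v in index.items()}

def match_landmark(query: np.ndarray, index: dict[str, np.ndarray]) -> tuple[str, float]:
    """Return the index entry whose embedding is closest to the query.
    With unit-length vectors, the dot product is the cosine similarity."""
    name, vec = max(index.items(), key=lambda item: float(query @ item[1]))
    return name, float(query @ vec)

# A query embedding like the one the on-device step would produce.
query = rng.standard_normal(128)
query /= np.linalg.norm(query)
name, score = match_landmark(query, index)
print(f"best match: {name} (cosine similarity {score:.2f})")
```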

Like Johnson, I don’t fully understand Apple’s research blogs, and Apple did not immediately respond to our request for comment on Johnson’s concerns. It appears the company has gone to great lengths to maintain data privacy, in part by condensing image data into a format readable by an ML model.

Still, making this an opt-in toggle, like the one for sharing analytics or recordings of Siri interactions, rather than something users have to discover and switch off, seems like it would have been the better option.

Correction, December 29th: An earlier version of this story misstated the location of the Enhanced Visual Search toggle. It’s in iOS Settings under Apps > Photos and in Photos > Settings in the macOS Photos app. The headline has also been clarified.
