Enhanced Visual Search is enabled by default on iPhones and Macs running the latest operating system. | Image credit – Lapcat Software.
However, Apple first mentioned Enhanced Visual Search only on October 24, and even then only discreetly. The feature allows your iPhone to identify photos of places, or more specifically landmarks, in your photo library.
Enhanced Visual Search, which allows a user to search their photo library for specific locations, such as landmarks and points of interest, is an illustrative example of a useful feature powered by combining machine learning (ML) with homomorphic encryption (HE) and private server lookups. Using PNNS (private nearest neighbor search), a user's device privately queries a global index of popular landmarks and points of interest maintained by Apple to find approximate matches to the locations depicted in their photo library.
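To make the idea of an HE-backed lookup concrete, here is a toy sketch in Python. It uses the simple Paillier scheme with tiny demo parameters rather than the production-grade lattice scheme Apple describes, and the "embeddings" and landmark names are made up for illustration, but it shows the core property PNNS relies on: the server can score an encrypted query against its landmark index without ever decrypting it.

```python
# Toy additively homomorphic encryption (Paillier) demo: the server scores
# similarity against an ENCRYPTED query without seeing the query in the clear.
# Illustration only -- not Apple's scheme, parameters, or data.

import math
import random

# -- tiny Paillier keypair (demo-sized primes; far too small for real security) --
p, q = 10007, 10009
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)                      # valid because g = n + 1

def encrypt(m: int) -> int:
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    u = pow(c, lam, n2)
    return (((u - 1) // n) * mu) % n

def add_encrypted(c1: int, c2: int) -> int:
    """Enc(a) * Enc(b) mod n^2  ==  Enc(a + b)."""
    return (c1 * c2) % n2

def scale_encrypted(c: int, k: int) -> int:
    """Enc(a) ^ k mod n^2  ==  Enc(k * a)."""
    return pow(c, k, n2)

# Client side: encrypt a small non-negative "embedding" of the photo's region.
query = [3, 1, 4, 1]                      # hypothetical quantized embedding
enc_query = [encrypt(x) for x in query]   # only ciphertexts leave the device

# Server side: score the encrypted query against plaintext index entries
# by computing an encrypted dot product for each landmark.
landmarks = {"Eiffel Tower": [3, 1, 4, 0], "Big Ben": [0, 2, 0, 5]}
enc_scores = {}
for name, vec in landmarks.items():
    score = encrypt(0)
    for c, d in zip(enc_query, vec):
        score = add_encrypted(score, scale_encrypted(c, d))
    enc_scores[name] = score              # server never sees the query or the scores

# Client side: decrypt the similarity scores and pick the best match.
scores = {name: decrypt(c) for name, c in enc_scores.items()}
print(scores)                             # {'Eiffel Tower': 26, 'Big Ben': 7}
```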
A machine learning (ML) model built into the device analyzes a photo and decides whether there is a region of interest (ROI) that could contain a landmark. If there is, an encrypted query is sent to Apple's server, which performs a set of operations protected by techniques such as homomorphic encryption and differential privacy, and returns candidate landmarks to the iPhone. A reranking model built into the device then takes over and picks the best candidate.
The photo’s metadata is then updated with the landmark’s tag, allowing you to easily find it by searching for its name on your iPhone.
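For orientation, here is a minimal sketch of that end-to-end flow. Every function name is hypothetical and every body is a stub; this is not Apple's code or API, just the steps from the paragraphs above laid out in order.

```python
# Purely illustrative sketch of the Enhanced Visual Search flow described above.
# All names are hypothetical and the bodies are stubs standing in for real models,
# encryption, and server calls.

from dataclasses import dataclass, field

@dataclass
class Photo:
    pixels: bytes
    keywords: list[str] = field(default_factory=list)

def find_landmark_roi(photo: Photo) -> bytes | None:
    """On-device ML model: return a crop that might contain a landmark."""
    return photo.pixels or None           # stub: pretend the whole image is the ROI

def embed_and_encrypt(roi: bytes) -> bytes:
    """Compute an embedding of the ROI and encrypt it before it leaves the device."""
    return b"<encrypted embedding>"       # stub

def query_landmark_index(encrypted_query: bytes) -> list[str]:
    """Server side: match the encrypted query against the global landmark index
    and return candidate landmark names."""
    return ["Eiffel Tower", "Tokyo Tower", "Blackpool Tower"]   # stub

def rerank(roi: bytes, candidates: list[str]) -> str:
    """On-device reranking model: pick the most likely candidate."""
    return candidates[0]                  # stub: pretend the first one wins

def tag_photo(photo: Photo) -> None:
    roi = find_landmark_roi(photo)
    if roi is None:
        return                            # no landmark-like region, nothing leaves the device
    candidates = query_landmark_index(embed_and_encrypt(roi))
    best = rerank(roi, candidates)
    photo.keywords.append(best)           # the metadata tag makes the photo searchable by name

photo = Photo(pixels=b"fake-image-bytes")
tag_photo(photo)
print(photo.keywords)                     # ['Eiffel Tower']
```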
While there may be many use cases for this feature and many people may find it useful, it should not have been enabled by default, given that it sends information about your photos to Apple.
Worse yet, Apple never publicly announced this feature. And in case you are wondering, it is not part of Apple Intelligence: I found the setting on my iPhone 14 Pro, which can't run Apple Intelligence, and immediately disabled it.