I have an iPhone 15 Pro, which means I've had access to most Apple Intelligence tools since Apple launched its suite of AI features with iOS 18.1 last fall. Most tools, but not all.
When Apple launched the iPhone 16 lineup last year, it also announced a new feature called Visual Intelligence. With the Camera Control button on those iPhone 16 models, you can summon Apple's answer to Google Lens and get more information, and even a handful of useful commands, depending on what you point your phone's camera at.
Although the iPhone 15 Pro and iPhone 15 Pro Max have enough RAM and processing power to run Apple Intelligence features, Visual Intelligence wasn't one of them. That meant the $799 iPhone 16 could do something the phone you paid at least $999 for could not, and when the $599 iPhone 16e debuted last month, we learned that it could access Visual Intelligence too, while iPhone 15 Pro and Pro Max owners remained left out. Why, the very idea!
That changed, however, with the arrival of the second public beta of iOS 18.4. If you're running that beta on an iPhone 15 Pro, you now have the ability to use Visual Intelligence. And while that's not necessarily a game-changer, it does give your older iPhone new powers it didn't have before. Some of those powers are quite useful, too.
Here's a quick overview of how iPhone 15 Pro and iPhone 15 Pro Max users can set up their devices to take advantage of Visual Intelligence, along with a refresher on what you can use this AI-powered feature for.
How to set up Visual Intelligence on an iPhone 15 Pro in iOS 18.4
If you want to use Visual Intelligence on your iPhone 15 Pro, you'll have to find another way to launch the feature, since only iPhone 16 models ship with a Camera Control button. Fortunately, you have two other options, thanks to the iOS 18.4 update.
iOS 18.4 adds Visual Intelligence as an option for the Action button, so you can use that button on the left side of your iPhone to trigger Visual Intelligence. Here's how.
1. Open the Action Button menu in Settings
Launch the Settings app and, on the main screen, tap Action Button.
2. Select Visual Intelligence as your action
You'll see a list of possible shortcuts you can trigger with the Action button. Swipe through until you see Visual Intelligence, then exit Settings.
From then on, pressing and holding the Action button will launch Visual Intelligence.
If you'd rather not tie your Action button to Visual Intelligence, you can instead use iOS 18's customizable lock screen shortcuts to add a Visual Intelligence control. iOS 18.4 adds a Visual Intelligence control that you can place at the bottom of your lock screen.
To add the control, edit your lock screen and select the shortcut slot you want to customize. (In the screenshot above, we put it in the bottom-left corner.) Select Visual Intelligence from the available options; you'll find it under Apple Intelligence & Siri controls, though you can also use the search bar at the top of the screen to track down the control. Tap the icon to add it to your lock screen.
What you can do with Visual Intelligence
So now your iPhone 15 Pro is set up to launch Visual Intelligence, either from the Action button or a lock screen shortcut. What can you actually do with the feature?
Essentially, Visual Intelligence turns your iPhone's camera into a search tool. We have step-by-step instructions on how to use Visual Intelligence, but if your experience is anything like mine, you'll find it all very intuitive.
Once Visual Intelligence is up and running, point your camera at whatever you want to look up; it could be a business sign, a poster or almost anything else. The iOS 18.3 update that arrived last month added the ability to identify plants and animals, for example.
The information that appears on your screen varies depending on what you're pointing at. A restaurant storefront might pull up the hours the place is open, and you can also grab phone numbers and URLs by capturing them with Visual Intelligence.
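For the curious, here's a rough sense of the machinery behind that kind of capture: a minimal Swift sketch, assuming a plain UIImage as input, that uses Apple's Vision framework to read text from a photo and NSDataDetector to fish out phone numbers and URLs. It illustrates the general technique; it is not Apple's actual Visual Intelligence code.

```swift
import UIKit
import Vision

// Illustrative only: reads text from a photo, then scans it for phone
// numbers and URLs, roughly the kind of capture described above.
// This is not Apple's Visual Intelligence implementation.
func extractContactInfo(from image: UIImage) throws {
    guard let cgImage = image.cgImage else { return }

    let request = VNRecognizeTextRequest { request, _ in
        // Take the best candidate string from each recognized text block.
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        let fullText = observations
            .compactMap { $0.topCandidates(1).first?.string }
            .joined(separator: "\n")

        // NSDataDetector picks structured data (numbers, links) out of plain text.
        let types: NSTextCheckingResult.CheckingType = [.phoneNumber, .link]
        guard let detector = try? NSDataDetector(types: types.rawValue) else { return }

        let range = NSRange(fullText.startIndex..., in: fullText)
        for match in detector.matches(in: fullText, options: [], range: range) {
            switch match.resultType {
            case .phoneNumber: print("Phone:", match.phoneNumber ?? "")
            case .link:        print("URL:", match.url?.absoluteString ?? "")
            default:           break
            }
        }
    }
    request.recognitionLevel = .accurate

    // Run the recognition request synchronously against the image.
    try VNImageRequestHandler(cgImage: cgImage).perform([request])
}
```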
I captured a poster for an upcoming event with my iPhone 15 Pro, and Visual Intelligence gave me the option to create a calendar event with the date and time already filled in. It would have been nice if the location had been copied over too, since that information was also on the poster, but I'll chalk that up to these being early days for Visual Intelligence.
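For a sense of what that calendar hand-off involves on the developer side, here's a minimal Swift sketch using Apple's EventKit framework, with a made-up title and date standing in for whatever a capture might pull off a poster. It's an assumption-laden illustration, not how Apple wires this up internally.

```swift
import EventKit

// Illustrative only: creates a calendar event from details like those a
// poster capture might yield. The title and dates are placeholders, and
// this is not Apple's internal Visual Intelligence flow.
func addPosterEvent() async throws {
    let store = EKEventStore()

    // iOS 17+ permission prompt; requires a calendar usage string in Info.plist.
    guard try await store.requestWriteOnlyAccessToEvents() else { return }

    // Pretend these values were recognized on the poster.
    let start = DateComponents(calendar: Calendar.current,
                               year: 2025, month: 4, day: 12, hour: 19).date ?? Date()

    let event = EKEvent(eventStore: store)
    event.title = "Spring Concert"                        // placeholder title
    event.startDate = start
    event.endDate = start.addingTimeInterval(2 * 60 * 60) // assume a two-hour event
    event.calendar = store.defaultCalendarForNewEvents

    try store.save(event, span: .thisEvent)
}
```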
That said, Visual Intelligence can also get tripped up by this kind of capture. When I tried to add one specific soccer match to my calendar from a schedule listing several dates, Visual Intelligence got confused about which one to pick. (It seems to default to the date at the top of the list.) Being handed incorrect data defeats most of the purpose of this particular capability, but expect Apple to expand Visual Intelligence's bag of tricks over time.
You have two other options for expanding on the information Visual Intelligence gives you. If you've enabled ChatGPT in Apple Intelligence, you can share what you've captured with ChatGPT by selecting the Ask button, or you can tap Search to run a Google search on the image you've captured.
Of those two options, ChatGPT seems the more polished in my experience. When I captured a bean dip recipe, ChatGPT initially summarized the article, but with a follow-up question I got the chatbot to list out the ingredients and steps, which I could then copy and paste into the Notes app myself. To me, that's far more convenient than having to pinch and zoom in on a photo of a recipe or, worse, transcribe the whole thing by hand.
Google searches on images captured with Visual Intelligence can be much more hit-or-miss. A photo of a restaurant marquee near me produced search results with similarly named restaurants, but not the actual restaurant I was standing in front of.
Google Search did a much better job when I took a photo of a book cover via Visual Intelligence, with the subsequent search results surfacing reviews of the book from various sites. That could come in genuinely handy the next time I'm in a bookstore (that's a place that sells printed volumes, kids; ask your parents) and want to know whether the book I'm thinking of buying is actually as good as its cover.
That's my experience with Visual Intelligence so far, but my colleagues have been using it since its release last year for everything from a virtual guide in an art museum to a navigation tool for escaping a corn maze. If you have an iPhone 15 Pro, you can now find your own uses for Visual Intelligence.