Apple’s biggest upcoming AI features will arrive in iOS 18.4, with a particular focus on Siri upgrades. But it’s on the Vision Pro that I’m most excited to use these capabilities. Here’s why.
The new era of Siri brings gradual change
Siri is in the middle of a transition period. Apple says we’ve entered “a new era” for Siri, but that change happens over time.
- iOS 18.1 debuted a new Siri design and some minor improvements
- iOS 18.2 brought powerful ChatGPT integration
- iOS 18.4 will bring some of the biggest upgrades yet
So far, ChatGPT support has been by far the best part of this new Siri era.
But in some ways, Apple’s need to integrate OpenAI’s chatbot is just further evidence of Siri’s own shortcomings. Fortunately, iOS 18.4 should address many of them.
Three new Siri superpowers coming to iOS 18.4
iOS 18.4 is expected to enter beta next month and launch for all users in April.
When the update arrives, Siri will gain three new superpowers:
- New application actions: Apple says Siri will be able to perform hundreds of new actions in Apple apps, without needing to open those apps. App Intents will also bring the same functionality to supported third-party apps.
- Knowledge of personal context: Like a real-life assistant, Siri will understand references to texts you’ve received, past calendar events, and other personal data, so you can get truly intelligent assistance.
- Screen Awareness: Siri will know what’s on your screen, so you can easily ask her to take action on whatever you’re looking at.
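On the third-party side, these capabilities run through Apple’s App Intents framework, which lets an app declare actions Siri can invoke without the app being opened. As a rough sketch (the intent, its parameter, and the `NotesStore` helper are all hypothetical, not from any real app), a notes app might expose an action like this:

```swift
import AppIntents

// Hypothetical example: a notes app exposing a "create note" action to Siri.
struct CreateNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Create Note"

    // If the user omits this parameter, Siri can prompt for it by voice.
    @Parameter(title: "Note Text")
    var text: String

    func perform() async throws -> some IntentResult {
        // In a real app, this would save the note through the app's storage layer.
        NotesStore.shared.add(text) // hypothetical helper
        return .result()
    }
}
```

Intents declared this way are discovered by the system at build time, so Siri and Shortcuts can surface them without any extra registration step.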
If these features work as expected and with the level of reliability we all hope for (and yes, I’m skeptical), they could be a game changer.
Siri might finally become the kind of intelligent assistant she was always meant to be.
Now let me turn to the Vision Pro, because that’s where these Siri changes could be particularly appealing.
Spatial computing works best when Siri is at its best
I am a fairly new Vision Pro owner. I’ve had the device for less than a month and, between holiday travel and a busy schedule, haven’t used it much yet.
But one thing I discovered very quickly is how important Siri is to the platform.
Spatial computing makes voice computing more important than ever.
On my iPhone and iPad, my main computing devices, I use Siri regularly, but I don’t rely on the assistant.
Siri is great for quickly starting timers, setting reminders, and other small tasks.
But on Vision Pro, I found myself relying on Siri for a lot more things.
For example, Siri is my preferred method for opening apps in visionOS.
This may just be part of my visionOS learning curve, but I find it much quicker and more intuitive to use Siri to navigate the system.
I never open apps with Siri on my other devices, but it works extremely well on Vision Pro.
Siri is also my choice for dictating responses in Messages and Mail. Although I often have a hardware keyboard connected via Bluetooth, it feels more natural to dictate a response.
iOS 18.4 features on Vision Pro could prove revolutionary
Apple Intelligence is not yet supported on Vision Pro, although it will likely arrive this year.
But imagine what iOS 18.4’s Siri capabilities could look like on Vision Pro.
When Siri can perform hundreds of advanced actions in apps, understand what’s on your screen, and draw on your personal context, spatial computing can truly begin to deliver on its science-fiction promise.
Siri upgrades will be great for the iPhone, but they could be crucial to the success of Apple’s “spatial computing.”
The dream of simply speaking out loud to your computer and having it take intelligent actions for you is starting to become more real.
Do you use Siri a lot on Vision Pro? What do you think of the iOS 18.4 upgrades coming to visionOS? Let us know in the comments.