A year after Apple unveiled the Vision Pro, and about four months after its muted launch, the spatial computing headset still feels surprisingly undercooked. Simple features, like the ability to organize icons on the visionOS home screen, are nowhere to be found. Content that truly shows off the Vision Pro’s immersive capabilities is still rare (the recent Marvel experience was just a glimpse of what’s possible).
According to the latest report from Bloomberg’s Mark Gurman, the company will show off visionOS 2 at its Worldwide Developers Conference (WWDC 2024), but the update will mostly focus on polishing the Vision Pro experience. We can expect native Vision Pro versions of Apple software (right now the headset uses iPad versions of many apps), as well as a Passwords app and new environments. Apple’s major AI push will also reportedly be called “Apple Intelligence,” a cheeky way of colonizing the term “AI.”
Beyond minor polishing and bug fixes, here’s what I’d like to see on the Vision Pro at WWDC 2024 (or really, anytime in the next year, Apple!).
iPhone and iPad screen mirroring
Perhaps the most baffling aspect of the Vision Pro is how it refuses to play well with the iPhone. If you ever need to unlock your phone to use an authentication app, or quickly peep a Slack message, you’ll either have to remove the Vision Pro to use Face ID, or type in your PIN and squint through the headset’s middling cameras. Why?!
If Apple can already deliver sharp and lag-free macOS mirroring, it’s not a huge leap to give us something similar for iPhones and iPads. Sure, ideally you’d be able to manage your text messages and other tasks on the Vision Pro without relying on other devices. Realistically, though, the headset’s Messages app doesn’t always receive texts as quickly as your iPhone does, and its message history and contacts often don’t match, either.
Offering a quick pop-up of your iPhone’s screen would erase those issues, and it would keep you within the flow of whatever you’re working on in the Vision Pro. As for the lack of Face ID, Apple could tie your iPhone’s authentication to your Apple ID. You already have to sign into your Vision Pro with a PIN or an Optic ID scan, as well as log into your Apple ID itself, so Apple already knows who you are.
When it comes to iPads, screen mirroring could be just as useful as it is on Macs. If you were typing away on a document on an iPad Pro with a Magic Keyboard, why shouldn’t you be able to continue doing that on the Vision Pro? Supporting less powerful iPads could also be useful, since they could mirror downloaded media or games. Why burden the headset’s M2 processor when you could tap into an M2 chip on an iPad Air?
Taking this concept a step further, it would also be nice to have Apple Watch mirroring eventually. Imagine lifting your wrist and getting a glanceable view of notifications or media controls while using the Vision Pro. What if you could immediately see a 300-inch version of your Apple TV’s home screen as soon as you sit down on your couch? Apple has the potential to shape reality itself while you’re wearing its headset, so why not lean into that for its own devices?
More native Vision Pro apps
Recent rumors suggest we’ll see native versions of Apple’s apps on the Vision Pro (many are just repackaged iPad apps right now), but I’m hoping to see more developers jump on the platform. There still aren’t any Vision Pro apps for Netflix, YouTube or Spotify. If you want to use those services, you’ll have to log in through a web browser or rely on a third-party app like Supercut. This isn’t the seamless spatial computing future I was promised, Apple.
Now I’m sure it’ll be tough for Apple to get YouTube to play nice with the Vision Pro, especially as Google recently struck a mysterious partnership with the AR headset company Magic Leap. But not being able to get Netflix and Spotify on the headset remains a huge problem for Apple. Without the apps we live with every day, the Vision Pro will always seem undercooked.
Cast audio to speakers and home theater systems
The Vision Pro’s built-in speakers are fine, but they lack the depth of a proper pair of bookshelf speakers or Apple’s own HomePod. And they certainly don’t have the low-end kick you’d get from a complete home theater system and subwoofer. So why can’t we just send audio easily to those devices?
Let us AirPlay to HomePods on a whim! Let me sit in my home theater and enjoy the massive speakers surrounding me, while watching Fury Road at near-IMAX scale on the Vision Pro! While I enjoy using AirPods Pro for immersive audio on the go, they can’t hold a candle to the Dolby Atmos-equipped towers in my basement.
I’m sure home theater users aren’t a high-priority consideration for Apple, but at the moment, who else is known for spending way too much money on hardware that isn’t meant for everyone?