At Apple’s WWDC23, I think I saw the future. [Pausing to ponder.] Yeah, I’m pretty sure I saw the future–or at least Apple’s vision of the future of computing. On Tuesday morning, I got to try the Apple Vision Pro, the new $3,499 mixed-reality headset that was announced this week and ships next year.
I’m here to tell you the major details of my experience, but my overall impression is that the Vision Pro is the most impressive first-gen product I’ve seen from Apple–more impressive than the 1998 iMac or the 2007 iPhone. I’m fully aware that other companies have made VR headsets, but Apple does that thing that it does: it applies its understanding of what makes a satisfying user experience to create a new product in an existing market, one that sets a higher bar of excellence.
Yes, it’s expensive, and yes, this market hasn’t proven that it can move beyond being niche. Those are very important considerations to discuss in other articles. For now, I’ll convey my experiences and impressions here, from a one-hour demonstration at Apple Park. (I was not allowed to take photos or record video; the photos posted here were supplied by Apple.) The device I used is an early beta, so it’s possible—likely even—that the hardware or software could change before next year.
An emotional high, thanks to the displays
My Apple Vision Pro demo covered a lot of ground, but the demonstrations of spatial photos and video and of immersive experiences are what put me in awe of what the headset can do. With the spatial media Apple demoed, I was put right in the middle of the recorded memory, and it triggered memories and emotions of my own from similar moments. How would it feel to see my own spatial media moments? Incredible, I imagine.
The immersive video gave me thrills. My body reacted to situations and my mind responded to the sights and sounds. In one demo, I got to pet a freakin’ dinosaur. Not a beast that looks like a 3D model set against an illustrated background, but a realistic-looking dinosaur that sniffed my hand and let me pet it. And it sent chills up and down my spine.
Now, this sense of immersion isn’t new to the world of VR headsets–it’s the core of the product category. But the difference with the Apple Vision Pro is the pair of displays set in front of your eyes. Their resolution and color are fantastic and make things look realistic. They aren’t perfect: I noticed pixelation at times, and I often saw stuttering–not in the demo videos, but in the real-time view of the live people in the room with me.
Apple’s demo videos during the keynote and on its website give the impression that wearers never see the headset’s frame surrounding the video. You do see it, but it doesn’t interfere with the sense of immersion.
Usability and wearability
When I saw the Home screen for the first time, my instinct was to tap an icon with my finger, as you would on an iPhone or iPad. Instead, you look at what you want to use and then perform an action with hand gestures. At first, I thought I would have a hard time adjusting to this method, but after 20 minutes, it felt natural.
It helps that you don’t have to hold your arms out and gesture like an orchestra conductor. I was able to keep my arms comfortably at my sides and navigate the UI while I sat on a couch, and the eye tracking was accurate and didn’t feel like it put a strain on my eyes. My experience with the OS was limited: I didn’t get a chance to use the on-screen keyboard, nor did I use Bluetooth input devices. I also didn’t use the Apple Vision Pro as a Mac display, a major component of the keynote presentation.
Before my demo, I was afraid that the headset would be ill-suited for my head, which is on the big side (my hat size is between 7 ½ and 7 ¾). But when I first put the Apple Vision Pro on, I had to tighten–not loosen–the straps to get a proper fit. In my demo, the headset had a Velcro strap that went across the top of my head; this strap is not shown in Apple’s product photos or videos. I think that after my head measurements were taken, Apple determined that I would benefit from the top strap.
Apple uses what it calls a Light Seal to close the gap between your face and the headset. It blocks out the light in the room, and for me, it fit correctly across the top, where my eyebrows are. But underneath, there was a visible gap between my nose and the headset. When asked how Light Seal sizing works, Apple said that it doesn’t use a simple small, medium, and large scheme; the Seal is available in a multitude of shapes and sizes–after all, faces come in a multitude of shapes and sizes. Had I been in a retail situation, I could have had Apple adjust the fit.
After about an hour, my demo was done and I took off the headset. I didn’t have any fatigue in my neck or feel any tenderness where the headset and straps hugged my head. I felt like I could’ve gone on longer with the session, and I would’ve if Apple had let me. I left the demo feeling lucky to have seen something genuinely innovative that could really change how we use computers in the future, if Apple plays its cards right and the market embraces the Vision Pro. And I think it has a chance to do so.
Learn more about the Apple Vision Pro
If you want to hear more about my experience with the Apple Vision Pro and visionOS, tune in to the latest episode of the Macworld Podcast. I talked about getting fitted for my glasses, the headset lenses, making a FaceTime call, visionOS’s support for user accounts, and more. You can listen to the podcast below or find it in Apple Podcasts.