You can get a lot of work done while wearing Apple’s Vision Pro and have fun doing it—but it’s not yet at the stage where most of us will want to fully embrace spatial computing as the new way of working.
I spent more than a week working almost exclusively in the Vision Pro. I carried on Slack conversations, dialed into Zoom video calls, edited Google Docs, wrote articles, and did everything else I do within my day-to-day responsibilities as an editor at Ars Technica.
Throughout the experience, I never stopped thinking about how cool it was, like I was a character in a cyberpunk novel. The Vision Pro opens some new ways of approaching day-to-day work that could appeal to folks with certain sensibilities, and it offers amenities that someone who hasn’t invested much in a home office setup might not otherwise have.
At the same time, though, I never zeroed in on a specific application or use case that made me think it could replace my normal habit of working on a MacBook Pro with three external monitors. If you don’t already have a setup like that—that is to say, if you’ve just been working on a laptop on its own—then the Vision Pro can add a lot of value.
I plan to explore more use cases in the future, like gaming, but this is the last major piece in a series of sub-reviews examining the Vision Pro’s various applications, from entertainment to use as an on-the-go mobile device.
My goal has been to see if the Vision Pro’s myriad use cases add up to $3,500 of value for today’s computing enthusiast. Productivity is front and center in how Apple markets the device, so this is an important one. Let’s see how it holds up.
Table of Contents
- The basics
- Vision Pro with peripherals
- Turning my whole home into an office
- Working with a Mac
- Meetings and Personas
- Could the Vision Pro be your main workhorse?
- The good
- The bad
- The ugly
The basics
Outside the realm of entertainment, visionOS and its apps are mostly about flat windows floating in 3D space. There are very few apps that make use of the device’s 3D capabilities in new ways that are relevant to productivity.
There are two types of visionOS apps: spatial apps and “Compatible Apps.” The former are apps designed to take advantage of the Vision Pro’s spatial computing capabilities, whereas Compatible Apps are simply iPad apps that work just fine as flat windows within the visionOS environment.
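To make that distinction a little more concrete for developers, here’s a minimal sketch of my own (not taken from Apple’s sample code) of how a visionOS app declares both a flat window, which is essentially what a Compatible iPad app gets, and a volumetric window that occupies 3D space; the “Globe” asset name is hypothetical.

```swift
import SwiftUI
import RealityKit

@main
struct SpatialDemoApp: App {
    var body: some Scene {
        // A flat 2D window, similar to what a "Compatible" iPad app presents.
        WindowGroup(id: "main") {
            Text("Hello, visionOS")
                .padding()
        }

        // A volumetric window whose contents occupy real 3D space around you.
        WindowGroup(id: "globe") {
            Model3D(named: "Globe") // "Globe" is a hypothetical 3D asset in the app bundle
        }
        .windowStyle(.volumetric)
    }
}
```

Where those windows actually end up is left to the user, who drags them around the room—which is what makes the whole-apartment arrangement described later possible.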
In either case, though, you’re usually just getting the ability to put windows around you. For example, I started out by sitting at my kitchen table and putting my writing app in front of me, Slack and my email app off to the side, and a browser window with a YouTube video playing on the other side. This felt a bit like using several large computer monitors, each with an app maximized. It’s cool, and the ability to shift between your real environment and fully immersive virtual ones can help with focus, especially if you do intensive creative work like writing.
If there’s one thing Apple has nailed better than any of its competitors in the mixed reality space, it’s the interface. Wherever you look, the UI element under your gaze glows to let you know it’s the item you’ll interact with if you click. Clicking is done by simply tapping two of your fingers together almost anywhere around your body; the headset has cameras all over, so you don’t have to hold your hands up or out in front of you to do it. There are also simple pinch-and-move gestures for scrolling and zooming.
I had to train myself to keep my gaze on the target a moment longer as I tapped my fingers together. This felt slow at first, but I adjusted before too long. Apple has done an amazing job with the interface, and it’s hard to go back to other headsets and their clunky controllers now.
The on-screen keyboard is a mixed bag, though. You can sort of type on it virtually as if it were a real keyboard, but only your index fingers work, and there’s no feedback when you hit a key, making it a suboptimal experience. The alternative is gazing directly at each key on the keyboard and tapping your fingers anywhere, the same way you would “click” in other parts of the interface. I found this to be more practical, but it’s not as good as using a real keyboard. Fortunately, you can connect a Bluetooth keyboard—more on that shortly.
As I noted in my article on entertainment with the Vision Pro, I don’t find the headset uncomfortable to wear, even for hours on end. Other people feel differently, though, and the specific light seal and headband fit you get seems highly relevant. If you try it and it’s uncomfortable, try asking Apple to help you find different components for a better fit—but as with AirPods and other wearables, it seems that some people may simply never find it comfortable.
It’s also important to talk about the battery life. Apple says it can offer a little over two hours without being plugged in, and that matches my experience. When working at a desk, I always kept it plugged in, though, so this wasn’t an issue most of the time. I did try a more mobile approach to working, for which the battery life is a notable limitation. I’ll talk more about that in a bit.
Vision Pro with peripherals
In addition to the eye tracking, the pinching gestures, and the floating virtual keyboard, I tried connecting the Vision Pro to Apple’s Magic Keyboard and Magic Trackpad. Obviously, the keyboard made typing much better. Still, I’m not a fan of the Magic Keyboard, so I tried my own keyboard (a Keychron K1), and that worked just as well. From then on, I rarely used the on-screen keyboard.
I saw a lot less value in the trackpad, for two reasons. First, its behavior felt awkward: the pointer gets attracted to UI elements and sticks to them as if they were magnets, and it simply pops from one window to another rather than traveling through the space between them. Second, the eye tracking and finger gestures felt just as effective as a mouse pointer, if not more so.
As a result, I stopped using the trackpad. It was too different from the mouse pointer experience I’m accustomed to, and it didn’t seem like it offered any advantage over Apple’s already well-thought-out default visionOS interface.
Turning my whole home into an office
My initial instinct when using the Vision Pro was to simply sit at a table and arrange the virtual windows around me just like I would physical monitors, and that’s exactly what I did for a few days. After a bit, though, I decided to try something different: I designated different rooms in my apartment for different work activities and positioned the relevant application windows in those rooms. I then walked around the apartment as I worked.
I made the dining room table my writing space; that’s where I put Microsoft Word and the web browser with an active WordPress tab. The room that is usually my office became the communications area, where I dropped Slack, Discord, and Spark, my email app. The living room was a media space where I kept the Apple Music and Apple TV apps. And finally, the kitchen was the planning area, where I placed both the Google Sheets app (which displayed a document the Ars staff uses to track articles) and Parse.ly, a traffic monitoring and reporting tool for websites.
Up until this point, working in the Vision Pro had usually felt like a qualified replacement for my desk setup; it gave me some of the perks that normally require sitting in a specific spot with non-portable physical hardware, even when I was on my laptop or away from my office.
The placing-windows-around-the-house approach marked the first time I felt like using the Vision Pro allowed me to do something that I couldn’t do before.
A few months ago, I replaced a simple Parsons table that had been my home office desk for a decade with something much more specialized and heavy-duty: Secretlab’s Magnus Pro, a metal desk with a whole bunch of cable management features, adjustable monitor arms, and the ability to switch between user-definable standing and sitting positions.
After years of simply sitting in a chair at my desk, I started spending at least 15 minutes out of every hour standing instead. I initially did this for health reasons, but I started feeling that I performed better doing certain types of work this way. Now, I have a hard time imagining going back.
Walking around my entire apartment with Vision Pro on my head, strolling between large windows that cover different walls in each space, with specific rooms dedicated to certain kinds of work activities, felt like a radical extension of the standing desk.
It’s not something that will appeal to everyone, but I do think there’s something to the idea that moving around or changing contexts can keep the mental juices flowing in a way that staying in one spot, looking at one screen, cannot—especially for creative work. Experts on positive habit-building have written about the idea that each space should be used for one task if possible. Why not extend that into the digital, too?
Nonetheless, I think this would be a difficult transition to make, even for people to whom it appeals. Changing habits is difficult, and we’ve been using the same devices and types of spaces for productivity for decades. It seems like there could be a benefit to this approach, but it will require deliberate personal rewiring to make it the default instead of an exception.
Consider me sold on the prospect of spatial computing being a meaningful new way to work. I’m just not sure how likely it is that people will want to put in the effort to change their working habits around it—and the battery life is a problem for this use case.
Working with a Mac
You can get quite a bit done with the visionOS apps that are available, especially when you include iPadOS apps that work well in visionOS, even though they don’t have any additional spatial bells or whistles. But depending on the nature of your work, you might still want to use a Mac.
Fortunately, you can. The Vision Pro can act as a virtual external display for any modern Mac. With a MacBook, you just look at its screen, and a little UI button appears above it; gaze at the button and tap your fingers to launch the virtual display. With a Mac desktop, you have to launch it from Control Center.
Intel Macs are supported, but only at a maximum resolution of 3K; for Apple Silicon Macs, the resolution is effectively 4K. The virtual display feels responsive, works with connected keyboard and mouse peripherals, and renders text that is highly readable. There are some limitations, though. You can only have one virtual display, so you can’t mimic a multi-monitor setup, and activating it disables the built-in screen on your MacBook. Further, you can’t play audio from the Mac through the headset’s speakers.
I don’t have any complaints about how the virtual display itself works—it’s great. It just seems like the full potential of this feature would be in having multiple displays or, alternatively, turning each macOS window into its own display-independent spatial window. You can’t do that, though you can use the virtual Mac display alongside windows of visionOS apps running on the headset.
I ended up using the Mac virtual display for apps, like Final Cut, that were either more robust on the Mac or were unavailable on visionOS/iPadOS and then using visionOS or iPadOS apps alongside those. It was nice to be able to go into full immersion mode to write, and the large virtual display would be better than using my 14-inch MacBook Pro’s small built-in screen if I didn’t have physical monitors to use. But when working on a Mac, I’d generally prefer using my usual setup.
Still, if you want a bigger screen while you’re away from your usual office setup (or if you don’t have an external monitor setup in your office), I could see this being a useful feature. I’ll be traveling over the next three weeks, and I expect to use the Vision Pro this way when I’m in an Airbnb without my triple-monitor setup.
Meetings and Personas
“Productivity” and “meetings” might be contradictory terms in many cases, but there’s no escaping the fact that meetings are a major part of day-to-day work for many people, whether they work in an office or fully remotely.
Apple has included two features in the Vision Pro meant to facilitate communication with others: EyeSight for in-person interactions and Personas for video calls.
The Vision Pro has cameras inside the headset that record your eyes, and there’s a screen on the front that reproduces them with a 3D effect so people around you can see where you’re looking, whether you’re winking or blinking, and so on. EyeSight is sort of effective at conveying essential social cues to, say, a household member who walks in to ask you a question while you’re working, but I can’t imagine relying on it in an in-person meeting in a corporate setting. It looks uncanny, and that just wouldn’t seem professional to me.
I discussed EyeSight at length in my article on wearing the Vision Pro in public, so I won’t go into too much more detail here. As I said then, I don’t think it’s a successful feature.
Personas are 3D avatars that take your place in video calls. EyeSight is actually based on your Persona, which you have to go through a quick setup process to create. The headset scans your face to build a 3D model of it, then uses its built-in cameras to read your facial expressions and present them to the other people on the call.
You appear as a sort of disembodied ghost that looks a bit like the most uncanny-valley attempts at 3D human characters from 2000s Hollywood movies. I joined a couple of Zoom calls using the feature, and it was clear that my co-workers found it distracting. They commented on it, including saying “yikes” as soon as they saw it. No one thought it looked accurate or good. Like EyeSight, it doesn’t work very well. I genuinely think Apple would have been better off with a cartoon-character approach rather than producing these eerie simulacra. Neither approach would fly in a professional setting, though.
I touched on this point a bit when talking about using the Vision Pro in public, but it’s even more important in this context: Apple’s social and communications ambitions for Vision Pro are the platform’s weakest link.
Most of the day-to-day communication on the fully remote Ars Technica team happens via Slack or email. Different staffers have varying experiences as far as meeting frequency goes. People who are entirely focused on writing might go a whole week without any meetings. Editors and management have more, as they work with multiple writers or collaborate with outside groups within the wider company.
As an editor, I have more meetings than most of the writers do, but I don’t have nearly as many as a lot of other corporate information workers do. I often have at least one Zoom meeting a day, and sometimes two or three, but they don’t make up the majority of my time. Even for me, though, those meetings ended up being the main barrier to feeling like I could fully do my job in the Vision Pro. Fortunately, I work with a bunch of tech geeks who knew I was reviewing the device, so they were both amused by and patient with the fact that I was living inside this thing and showing up to meetings as a weird CGI character. I’m not so sure that would be true for most other people who take meetings in their daily work.
A few years ago, I had a job where I was in back-to-back meetings all day every day, from 9 am to 6 pm, mostly in person. Obviously, the Vision Pro would not have worked for me then. If a big part of your job is meeting with people on video calls, this is not the productivity device for you. If you work in an office rather than at home, it’s even harder to recommend.
Could the Vision Pro be your main workhorse?
Throughout this series, I’ve been asking whether Apple’s Vision Pro can replace existing personal tech to justify its $3,500 cost. That $3,500 is hard to justify if you’re spending it on top of what you’ve already spent on your computer, your phone, your TV, and so on—but it could seem more reasonable if the Vision Pro could manage to replace multiple devices in one swoop, just like the iPhone did when it arrived.
To answer that, we must first ask whether you can actually get serious work done while wearing the Vision Pro. I think you can. If you can do your work on an iPad, you can do it on the Vision Pro—and I even feel that Vision Pro is more natural to use than an iPad for most productivity tasks. Due to the lack of some heavy-duty applications like Xcode, Final Cut, Maya, and others, it can’t necessarily replace a Mac for everyone, depending on the type of work they do.
Of course, it works great in tandem with a Mac for those applications (though I’d like to see more features added on that front, like multiple monitors or better audio management), but then it’s replacing your monitor, not your whole computer.
Using Apple’s Vision Pro feels futuristic and cool compared to traditional ways of computing. The user interface is intuitive and effective after just a short adjustment period. And for the most part, app support is strong.
When I reviewed the iPad Pro in the past, I made similar attempts to do all my work on the device to test whether that was truly practical. I always concluded it wasn’t. That’s not the case with the Vision Pro, though. Unlike with the iPad, there was never any point when I felt I had to take it off and use another device to get something done.
It’s a great productivity device, and I could see it replacing a Mac or PC, provided you’re not using one of the heavy-duty creative applications I mentioned. That said, the focus on 2D apps in 3D space doesn’t do a lot to sell the user on the idea that they’re doing anything here they can’t do already on a laptop or desktop. As I noted earlier, there’s something neat about treating your whole home as a workspace, with apps positioned in various rooms, and I think that setup will resonate with some people in the same way that standing desks do for some. But most won’t feel the drive to change their habits to adopt that way of working.
The real Achilles’ heel here for many is the fact that there’s no practical way to use the Vision Pro in meetings, either in person or over video. Apple has put a lot of effort into solving that problem, but I don’t believe most users will think it was successful. If you have to join Zoom calls for work, there’s no way this will replace your computer. Certain kinds of individual contributors who have few if any meetings will fare better, though.
It’s cool to have a big-screen monitor that you can take with you while you travel, and the immersive environments can be great for focus. It all adds up to a nice experience—it’s just not an essential one. Still, I’ve seen enough demos of theoretical AR apps to know that there is potential here for truly new ways of working. As with so many other aspects of the Vision Pro, though, it seems like the potential is in the future, not the present.
So where does that leave us on the Vision Pro’s value? Based on a few weeks of using it a lot in various contexts, I’ve found that the case is strongest for frequent travelers who want all the perks of home—a big-screen TV, an external monitor for a laptop, and so on—when they’re away. Its value for that user is crystal clear. I just so happen to fit that profile, but I don’t believe I’m representative of most users.
When you wear the Vision Pro, you can’t escape how exciting it is to use. It really is amazing, and your mind easily wanders to all the neat things it could do in the future. We’ll just have to wait to see where future hardware and software iterations—plus continually expanding third-party software support—lead us.
Right now, the Vision Pro is a powerful tool for frequent travelers and a neat toy for a handful of other tech enthusiasts. It will take new applications and a significantly lower price to convince most other people, but I could see that happening further down the road.
Just… maybe steer clear of the uncanny valley 3D avatars in the future, Apple.
The good
- The eye-tracking and finger-gesturing interface is brilliantly conceived and beautifully realized—it’s genuinely a delight to use
- Passthrough looks great, as do the immersive settings, so there’s a lot of flexibility for your working environment
- Being able to walk around to purpose-focused spaces in your home or office could help keep the creative juices flowing
- It works seamlessly with the Mac and with Bluetooth keyboards, allowing much heavier-duty productivity than you can get on most other mobile devices
- App support is very strong for a device this soon after launch, thanks to both spatial apps and compatible iPad apps
- Text is clear and legible, and the interface feels highly responsive
The bad
- The price tag is way too steep for most people
- Mac integration could be a lot more robust, with more monitors, break-out apps, and audio management
- The virtual keyboard is not very nice to use
- Battery life needs to improve to take full advantage of new ways of working
- There are very few apps that truly take advantage of the device’s 3D spatial capabilities
The ugly
- Uncanny 3D avatars mean it feels like social suicide to use the Vision Pro in meetings