This is the third part of my series of early notes on the Vision Pro. This one focuses on media consumption; earlier entries covered the hardware and interface, and productivity.
Media consumption with photos and videos is fantastic. There’s nothing like it. Watching movies on it is even better. I’ve never had a particularly good home theater system. Now I do.
I started by watching Moana in the Disney+ app in their theater environment. It is like having your own movie theater. I got so absorbed in the movie’s climax that I teared up a bit. Since I couldn’t wipe my eyes with the Vision Pro on, my light seal cushions got wet, which was kind of funny.
3D videos are impressive but, at this point, feel more like a demo. Once I have 3D videos of my own family, they’ll start ruining light seals the way Moana did.
Panoramas look great. I will be shooting a lot more of them going forward. I can tell newer vs. older panoramic photos based on their fidelity. I want to be able to make some of them the equivalent of a background wallpaper so I can put apps in front of them. My guess is Apple is more focused on Environments.
I watched a Netflix show in Safari. It was also great, but app-specific media is better.
The big asterisk with media consumption is that it is a solitary experience. There are shows I watch without my family, and it’s great for that. The device does not enable any joint viewing experience.
visionOS has roots in iPadOS, and it shows. You’ll be disappointed if you are looking for a Vision Pro to replace a Mac.
Instead, I’ve focused on ways Vision Pro is superior to the Mac for productivity, like my writing cabin.
Vision Pro is very good at keeping me isolated for focused work. I can already be productive with the device where that focus matters.
We don’t yet have enough Environments to get the most out of that last point.
I found a connected Bluetooth keyboard a big help. I use a trackpad much less, but it can also come in handy.
That said, dictation is much better than it used to be; don’t forget to use it with the Vision Pro.
Fantastical is a stand-out app. Putting up your calendar and making it the size of a wall is pretty damn cool. It works particularly well for big-picture monthly, quarterly, and yearly views. I’ve got a massive version of my monthly calendar installed on my ceiling. As I think about next month, I can look at the ceiling to see what’s on deck.
MindNode Next is also an interesting early entry. It’s a mind-mapping app but also a brainstorming app where you can locate ideas in space.
Idea development (as in MindNode) is an excellent use case for Vision Pro. Apple’s Freeform could also serve in this capacity, but it’s not there yet. My experiments continue.
If you want to capture a lot of text, try Aiko, an AI-based transcription tool. You just hit the record button, and it converts the recording to text with the Whisper AI engine. I checked with the developer, who reports all work is done on-device.
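If you’re curious what on-device Whisper transcription looks like under the hood, here’s a minimal sketch using the open-source whisper Python package. This is my own illustration of the technique, not Aiko’s actual code, and the filename is hypothetical:

```python
# Minimal sketch of on-device Whisper transcription (not Aiko's code).
# Requires: pip install openai-whisper (plus ffmpeg for audio decoding)
import whisper

# Load a small model locally; no audio leaves the machine.
model = whisper.load_model("base")

# "memo.m4a" is a hypothetical recording, like one captured in Aiko.
result = model.transcribe("memo.m4a")
print(result["text"])
```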
Mac display mode acts as an escape hatch, but I don’t see it replacing monitors for extended work sessions. It makes tons of sense for giving a laptop a big display in a hotel room, or for moving your desktop Mac’s display to a different room, though.
We are in early days for the productivity question on Vision Pro. There are still many workflows to be explored and apps to be released.
Now that I’ve logged some serious hours in the Vision Pro, I thought I’d share some thoughts about it. This post focuses on the hardware and interface.
Strapping into the Vision Pro does feel a little bit like getting ready for a spacewalk. I generally charge the battery with it disconnected, which lets me store the hardware (along with a keyboard) in a drawer. When it’s time to go into the device, I put the battery in a pocket and run the cable under my shirt to my neck so I don’t get tangled in things if I go mobile.
For productivity work, a keyboard is necessary. I had an extra keyboard and trackpad. I’ve combined them into one unit using this gizmo from Amazon. Twelve South also makes one that looks a little nicer.
The screens are excellent, and anything rendered by them (apps, icons, environments) is entirely believable. The pass-through cameras, however, are darker and grainier than I expected.
The pre-release discussion of it being too heavy was overblown. I’ve worn it for hours without much trouble.
The Dual Loop Band is more comfortable for me than the Solo Knit Band, but the Solo Knit Band is more straightforward to configure. I use the Solo Knit Band for short sessions and the Dual Loop Band for longer ones, like watching movies.
The audio on the Vision Pro is much better than I expected. I connected my AirPods earlier today to confirm they work, but I’ve been using the built-in speakers exclusively thus far for everything (including watching movies), and they seem fine to me.
You must train yourself to avoid picking it up by the light seal. The seal attaches with a light magnetic connection, and it’s easy to drop the device.
Touch targets on iPad apps are too small. The eye tracking works great with native apps but is sometimes tricky with iPad apps.
One of the nice touches: when you grab the handle of a window, it automatically rotates to face where you’re standing in the room. There are so many subtle details in the way it renders windows. The shadows it casts on real-world objects are another of my favorites.
If you’re having trouble with tracking, make the object bigger by stretching it or bringing it closer to you. I kept forgetting about that.
You can rotate a window by rotating your head.
The pinch gesture only works with your palm facing down; I never got it to work with my palm up.
You can long-press the pinch gesture, and you get right-click options. I’d like to know how many other ideas they have for gestures as this product matures.
Strangely, I think I feel things when I touch them: virtual keyboard keys, butterflies, and the like.
I struggle a little bit with app management. There aren’t any options except to go through the alphabetical list.
It seems silly that you can’t set favorites, have a dock, or otherwise arrange your applications beyond the main screen.
With a device so dependent on dictation, there should be an easier way to trigger dictation without resorting to the virtual keyboard.
We had a meetup in the MacSparky Labs a few days ago about how we’re doing with the Vision Pro, where we answered questions and discussed whether or not we’re keeping these new gizmos. It was a lot of fun and quite informative, with members sharing their experiences and workflows. …
This entire post was composed on Apple Vision Pro with dictation and a Bluetooth Apple Keyboard attached…in virtual Yosemite Valley.
One of my interests in the visionOS platform is whether or not I can use it to get work done. Apple thinks so and has made that a big part of the marketing push for this device. However, it is a new platform with a fledgling App Store and many questions surrounding whether it is useful for productive work.
Moreover, the types of workflows that lend themselves to the platform are also in question. Don’t forget that visionOS is based on iPadOS, not macOS. It’s easy to strap on this new device thinking you can turn it into a Mac. (The fact that you can mirror a Mac display makes it even more tempting.) That’s the mistake I made with the iPad, and I spent years drilling empty wells, looking for productivity workflows that would let me duplicate Mac workflows. It was only after I accepted the iPad as an iPad that it became productive for me.
I’m not going to make that mistake with the Vision Pro. I’m going into this thing with open eyes and a sense of curiosity for where it can be used to get work done.
This is not a Macintosh. It is something else. And that is where the opportunity lies. While Mac workflows don’t work here in visionOS, are there things in visionOS that don’t work on a Mac? That is where we should be looking.
And for me, that starts with the idea of contextual computing. I have long felt that computers put too much interference between you and your work.
If you want to write an email, you need to open an email application, which shows you a bunch of new emails but not a compose window where you can write that email. So many times, you’ll set out to write that important email and never actually find your way to the compose window. If you want to work on your task list, you often have to wade through screens and screens of existing tasks before you can get to the ones you need. Put simply, computers need to put you in the context of the work with as little interference as possible.
Sadly, most modern software doesn’t do that. Instead, it does the exact opposite. This is partly due to bad design and partly because tech companies have figured out ways to monetize your attention. They are intentionally trying to divert you from the work. That’s how they keep the lights on. One of the easiest ways to be more productive on any platform is to find quick ways to get yourself in the context of the work you seek to do with as little interference as possible.
This is where visionOS and Vision Pro come in. It’s a new platform tightly controlled by one of the few big tech companies interested in user privacy. visionOS is a place where you can get real work done if you are smart about it.
I’m still experimenting and figuring out my workflows, but here’s an easy one I’ve been using in visionOS for several days: my context-based writing space.
It starts in Yosemite Valley. Using the visionOS “Environments” space, I have found myself in an immersive rendition of the Yosemite Valley in winter. There’s snow on the ground, but I’m sitting there right now comfortably with just my socks on … which is nice.
The main screen in front of me has Apple Notes, where I’m writing this article. To my left is a floating research pane with Safari in it. That’s it. A little research. A place to write. Yosemite Valley. I’ve written about 3,000 words here in the valley over the last few days, which is very comforting. I’ve got a focus mode on, so I don’t get interrupted, and I genuinely feel alone with my words. That’s important. For this to work, I need to be off the grid. This is my cabin in the woods, where I do my writing.
When I’m not writing, I don’t go to Yosemite to watch a visionOS movie, check email, or play with some other aspect of visionOS. My brain is already figuring out that Yosemite Valley equals writing. My Mac is far away, back at my studio, along with the cognitive load that comes with the work I do on it. That’s all a distant memory here in Yosemite Valley. My brain is successfully duped.
As the context sticks, the work gets easier. This is a form of contextual computing that I’ve never experienced before. I’ve tried it with other headsets, but the poor-quality screens made it unbearable. I expect this writing context will only get easier over time. As the habit sticks and more writing apps and tools start showing up, I’ll consider bringing the better ones along on future trips to the valley.
When I’m done writing, I leave this place, knowing Yosemite Valley will be there the next time I want to write.
This immersive context is not possible while sitting at a Mac. And for me, it is just the beginning of these explorations. I’m considering building a similar workflow in some other environment for journaling. And I’ve got more ideas after that.
This started simply as a proof-of-concept experiment, but now it’s set for me. I’ll return here the next time I need to do some serious writing. It’s already working: the valley appears, and my brain says, “Okay. Let’s get to it. Let’s start moving that cursor.”
This is a digitally created, distraction-free environment made possible by visionOS. And this is the productivity story for Vision Pro: I’m not looking to replace an existing platform but to find new workflows that are only possible on the new platform. The valley proves it’s possible. So now I need to see what else it can do. visionOS isn’t at a place where it can become my only operating system, but that doesn’t mean it can’t be an essential tool in helping me get my work done.
I’ve been using Vision Pro for several days now. Here’s a video where I share my initial impressions and answer questions about the new platform from labs members. I’m simultaneously releasing the audio from this to the podcast feed…
The Vision Pro review videos are all dropping now. The ones I’ve enjoyed the most are listed below. I think it’s fun that this new product category has everyone doing some head-scratching. None of the reviews have yet gone deep on using the device for productivity. That’s something I intend to explore.
One interesting effect of watching these reviews with my wife in the room is that she now wants her own fitted light seal to watch all her Disney movies in Theater mode. So, if you watch the below links with loved ones around, you’ve been warned.
Hope you’re looking forward to a good week. I heard from Labs members over the weekend about Vision Pro and have more thoughts. Here’s a short Monday Brief video on the topic…
Today, I woke up at 4:30 AM so I could give Apple $4,000 for a product I’ve never tried before. Crazy.
The order process went smoothly (at first). I got through the checkboxes fast enough. There are upgrades for additional storage: an extra $200 will double it to 512GB, and an extra $400 will get it to 1TB. There is also AppleCare for $499 or $25/mo.
I had a lot of questions about glasses. My distance vision is 20/20, but I need readers for books and screens. A dialog box asked if I needed glasses, and then it asked what kind of glasses. I ticked the box for readers and told them 1.5-1.75 works for me, and that was it. I was not required to upload a prescription.
My final order was a 512GB device (probably dumb to add the extra storage). I did not order any additional accessories. I’m going to wait to see how I use the device first. Then I clicked the button to pay and (foolishly) picked Apple Pay in haste. The problem is that my business card is not part of Apple Pay. (My bank only supports Apple Pay for personal cards, not business cards.) I had a moment of crisis there but decided I’d go ahead and pay on my personal card and let my accountant sort it out.
The app gave me a 9:00 AM pickup time at my local Apple Store on February 2, and I was good to go. I pushed the Buy button.
Declined.
I have no idea why. That card has a balance of a few hundred dollars and plenty of credit. Likely a fraud thing.
So I switched (in Apple Pay) to my company debit card. The only problem was that my pickup window was then gone, so I had to pick a new one. 11:30 on February 2. Check. Press Buy.
Declined.
Again, I have no idea why. Plenty of money to pay for this ridiculous headset.
So then, I canceled the checkout. I figured at that point I had a 50/50 chance that pushing the cancel button would reset the whole transaction, and then I wouldn’t be able to get one, since I’d lose my place in line. At that point, I was okay with that potential outcome.
So I pressed Cancel.
Good news? It didn’t cancel the transaction but just brought me back to the screen where I could choose to pay via Apple Pay or traditionally with my company card (as I usually do with Apple transactions). Now the first available time is 3:00 PM on February 2. Click Buy.
Transaction failed. The allotted time is already taken. Pick a new one.
This went on multiple times: I’d pick a time, and by the time I pushed Buy a second later, it was no longer available. It finally worked with a pickup at 12:30 PM on February 3.
So success? I think? I have to admit I’m mixed about spending so much on a product I haven’t tried and don’t fully understand. I’m hoping that there is a productivity/contextual computing story around this headset, and the only way I’m going to really know that is to try it for myself. So I have some trepidation and am mindful of that return window. But I’m also excited to try something entirely new from Apple. So often, it is when they come to an existing platform with their own unique spin that Apple does their best work, and I want to see them do that again. Either way, here we go.
P.S. For you Mac Power Users listeners, Stephen also got one, so we’ll be sharing thoughts soon.
I’ll join the digital queue this Friday morning to purchase my Vision Pro. This is an interesting product as we head towards its launch because it appears that while it won’t be a big seller (on an Apple scale), it may still be hard to buy.
If the rumors are true, those fancy screens are hard to make and will limit the number of units Apple can ship. I also can’t help but wonder if Apple doesn’t particularly want to make this first iteration of the Vision Pro something that sells in the millions. I suspect they are still figuring out the product category themselves and getting feedback from a few hundred thousand users will give them a lot of good ideas.
The Vision Pro is expensive, and the story is unclear. A lot of the Apple faithful will pass, at least initially. This point landed for me in a recent MacSparky Labs meetup. Labs Members like Apple products. A lot. Yet we had a room full of Apple fans and only a few of them intend to buy one. Again, I expect that is due to the price and the fact that people aren’t sure what they would do with it.
The interesting point is that even though demand for the Vision Pro is lower than for other Apple products, the rumored limited quantities could still make it hard to get. (Strange, right?)
Regardless, the story of this product is not about its first iteration. Apple is thinking long-term, as they always do. Fourteen years ago, John Gruber wrote about how iteration is Apple’s superpower. Here we go again.