
What Apple Vision Pro Needs to Do

Updated: May 20




Should you buy the Apple Vision Pro in 2024 for $3500? If you're not a developer, no.

But will you buy an Apple Vision Pro (or non-Pro) in the next couple of years? Well, that's an entirely different question. There are a few things Apple needs to do to turn that answer into a "Yes" for as many people as possible, and here are the ones I believe to be the most important.


Make It Lighter

The first iteration of Apple Vision Pro has many of the expected first-generation flaws, but probably the most significant is comfort. It is a heavy headset, and reviewers have consistently noted that much of that weight falls directly on your cheeks. The Dual Loop Band (currently included in the box, but featured in none of Apple's marketing material) helps with this a little, but the headset is still heavy, and still front-weighted.


This is a problem for comfort, of course, but it's a problem for avatars, too. Anything that uses the avatar system - FaceTime and EyeSight, specifically - is informed by the unit's sensors looking at your eyes and face from its vantage point just in front of your face. When the weight of the device is pressing into your cheeks, it distorts that input data and contributes to the uncanny-valley quality of the avatar.


It's easy to say "yeah Apple, make it lighter already" from the sidelines, but that's just something that needs to wait for the progression of technology, right? Well, there are at least two ways they could facilitate this without waiting for the next leap in technology.


Hyper-personalize Each Unit

Right now, when you buy an Apple Vision Pro, you get the same unit as everyone else. You get a faceplate, chosen from a number of options, that best fits your face, and if you have prescription glasses you get Zeiss lenses; these all snap on magnetically. But remove these things, and one AVP is the same as another. For weight and thickness reduction, this is a problem.


The AVP must have motors and tracks in it to move its lenses left and right to match your interpupillary distance, and the prescription lenses snap overtop of a lens assembly that is already tuned to 20/20 vision. The faceplate fits your face pretty well but not perfectly, which makes it a bit less comfortable and allows some light leakage.


But what if that faceplate was 3D printed to match your specific face? What if the lens assembly wasn't tuned to 20/20 with a prescription attachment overtop, but was tuned to your prescription from the start? What if the unit was manufactured to your specific IPD from the get-go?


All of these things would reduce the weight and the required distance from your eyes, massively increasing the comfort of the device. This is what the Bigscreen Beyond headset does, allowing it to bill itself as "the world's smallest VR headset" without compromising on visuals. The AVP, being a standalone unit rather than a PCVR display, needs more hardware in it than the Bigscreen Beyond and so can't get quite that small, but it can certainly get close by maximizing customization.


Ditch the Heavy, Premium Materials

This one is pretty straightforward. At its price point, AVP is a premium product, and Apple clearly wanted it to feel premium when you hold it in your hands. But people don't use the headset by holding it in their hands, and reducing weight is far more important for widespread adoption. Metal and glass feel nice until they're pressing on your cheeks; plastic is what we need for the next version.


Double Down on EyeSight

One of the most frequently criticized aspects of the Vision Pro is EyeSight, the avatar eyes that appear on the front of the device for other people to see. The display is dim and low-resolution, and the avatar is deep in the uncanny valley. The speculation in many of these reviews is that the feature will not survive to the next iteration of Apple Vision.


Apple would be wise to ignore such suggestions. EyeSight needs to be improved, not removed. The first major software update, now in beta, is already reducing the creepiness of the uncanny-valley avatars, and the next revision to the hardware can improve the other flaws.


An AVP without EyeSight would be an isolating, lonely device. The AVP with the current iteration of EyeSight is, admittedly, a slightly creepy and unusual device. But the AVP with a properly functioning EyeSight is the device that could become ubiquitous.


Many people seem to be obsessed with the idea that the AVP is a precursor to AR spectacles - but I'm not convinced. Not only is the technology not there, the technology will probably never be there. In order for spectacle-style AR to look as good as AVP's passthrough-camera style AR, the latency needs to be zero - not just low, but literally nothing - plus it needs to be able to actually obscure light coming in, not just add to it. There is no display technology now or on the horizon that could accomplish either of these things (especially the first), and the result is that virtual elements of spectacle-style AR are relegated to a floaty, glowy space that can never be fully integrated. Don't get me wrong, there's a place for that kind of computing, but it will never be as immersive as passthrough, and I don't believe the Vision Pro is going to make that immersion sacrifice in the foreseeable future.


EyeSight is the alternative to being spectacles. If EyeSight is improved enough, the AVP will look like spectacles for any reasonable purpose, but without the nagging downside of being technologically impossible.


Make the "Apple Vision Air" a Display for an iPhone or iPad


The AVP requires the use of a pocket-bound battery pack to reduce the weight of the headset. For now, this is a required piece of the puzzle for comfort. One day we'll probably be able to get everything integrated on the head, but until that happens, Apple may as well lean on the pocket unit to offload as much as possible - including the apps. And once you do that, you basically have an iPhone running the AVP. So why not have an iPhone actually just run it?


The division of labor between the M2 and the R1 processors in the AVP seems clear. The M2 is essentially running like an iPad, while the R1 is throwing all that iPad stuff into headset mode. All the major video-bandwidth stuff - sensors and displays - is hooked into the R1, freeing the M2 to run the apps themselves and tell the R1 what needs to be displayed where. Due to the bandwidths involved and the importance of low latency, the R1 needs to be in the headset itself, close to all the sensors and displays. The M2, however, has no such limitations - it has a lot more leeway in terms of latency and it isn't shifting anywhere near as much data around. That means the M2 could be at the other end of a cable, just like the battery pack. In fact, the M2 could just... be an iPhone. Make this version of the headset the "headphones for your eyes" that Steve Jobs once talked about: just an accessory to the iPhone that holds your apps and powers it.
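
To make that division of labor a bit more concrete, here is a minimal sketch of the idea in Swift. Every type here is hypothetical - this is not Apple's actual visionOS or R1 interface - it just illustrates why the app-running half can tolerate a cable while the compositing half cannot.

```swift
// Conceptual sketch only: hypothetical types, not Apple's actual
// visionOS or R1 interfaces.
import Foundation

// What the app processor (an M2 today, or an iPhone at the end of a cable)
// produces: a latency-tolerant description of what to show and where.
struct WindowDescription {
    let id: UUID
    let position: SIMD3<Float>   // metres, world space
    let size: SIMD2<Float>       // metres
    let contents: Data           // rendered texture, encoded
}

// What only the headset-side chip can supply: fresh sensor data.
struct HeadPose {
    let position: SIMD3<Float>
    let orientation: SIMD4<Float>  // quaternion (x, y, z, w)
    let timestamp: TimeInterval
}

// The headset-side compositor. It never waits on the app processor:
// every display refresh it takes the most recent scene it has received
// and re-projects it against the newest head pose, so motion-to-photon
// latency stays local to the headset.
final class HeadsetCompositor {
    private var latestScene: [WindowDescription] = []

    // Called whenever a new description arrives over the cable (may be late).
    func receive(scene: [WindowDescription]) {
        latestScene = scene
    }

    // Called at display refresh rate, driven entirely on-headset.
    func composite(for pose: HeadPose) {
        for window in latestScene {
            // Re-project `window` into the current view using `pose` and
            // submit it to the displays (omitted in this sketch).
            _ = (window, pose)
        }
    }
}
```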


This might be where spectacle-style AR could have a future within Apple's spatial computing vision. If you are powering far fewer displays (no EyeSight, and considerably less rendering for the user-facing displays), then leaning on the battery power of an iPhone is far less onerous, plus the R1 processor could be pared down enough to fit into reasonable-looking spectacles. The R1 would take care of watching your eyes and hands and translating that into virtual taps and drags; the iPhone would tell the R1 what all the windows look like. It would support some lightweight 3D content, but nothing like what the Apple Vision Pro supports, because it conceptually could never look as good.
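
The same split can be sketched for input. Again, everything below is hypothetical and only illustrative: the glasses-side chip turns raw gaze and hand data into small pointer events, and the phone only ever sees "tap here in window X", never the sensor streams themselves.

```swift
// Hypothetical sketch of the input path in a spectacle-style "Vision Air":
// none of these types are real Apple APIs.
import Foundation

// Produced on the glasses by eye tracking.
struct GazeSample {
    let windowID: UUID              // which floating window the user is looking at
    let pointInWindow: SIMD2<Float> // normalized 0...1 coordinates
}

// Produced on the glasses by hand tracking.
enum HandState {
    case pinching
    case open
}

// The compact, latency-tolerant events the iPhone actually receives.
enum PointerEvent {
    case tap(windowID: UUID, at: SIMD2<Float>)
    case dragMoved(windowID: UUID, to: SIMD2<Float>)
}

// Runs on the glasses-side chip at sensor rate; only its small output
// crosses the cable to the phone.
final class GlassesInputTranslator {
    private var pinchActive = false

    func process(gaze: GazeSample, hand: HandState) -> PointerEvent? {
        switch (hand, pinchActive) {
        case (.pinching, false):
            // Simplified: treat the start of a pinch as a tap on whatever
            // the user is looking at.
            pinchActive = true
            return .tap(windowID: gaze.windowID, at: gaze.pointInWindow)
        case (.pinching, true):
            // Holding the pinch while the gaze moves becomes a drag.
            return .dragMoved(windowID: gaze.windowID, to: gaze.pointInWindow)
        case (.open, _):
            pinchActive = false
            return nil
        }
    }
}
```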


This would be one of the larger divisions between a Pro and non-Pro product line in Apple's history (the two might barely, if at all, be running the same OS), but it would allow the greatest coverage of the AR market. Those who just want some data floating in the real world with minimal intrusion can accept the floatiness and lower visual quality of the spectacle-style Vision Air; those who want better apps and better integration into the world can get the heavier, more dedicated Vision Pro.


