As we enter a new year, it’s time for our annual ritual of synthesizing the lessons from the past twelve months and formulating the outlook for the next twelve. 2023 was an incremental year for AR & VR, which both continue to gradually trudge uphill toward mainstream traction.
Highlights in 2023 included the beginnings of XR’s convergence with AI, Apple Vision Pro’s unveiling, and some ups and downs for VR. 2023 was also defined by the rise of passthrough AR that’s incubated within VR a la Quest 3, as well as some smart glasses milestones.
These ends of the spectrum – passthrough AR and lightweight smart glasses – represent two paths toward AR’s holy grail. The former will get there through graphically rich UX in bulky form factors that slim down over time, while the latter starts with wearability and gains UX richness over time.
Until then, what does spatial computing’s near term look like? Aligned with the more extensive predictions of our research arm, ARtillery Intelligence, we’ve devised 5 AR Insider predictions. We’ll break them down weekly, concluding with Prediction 5: Apple Vision Pro is Propelled by Wearables.
Prediction 1: AR and AI Get Hitched
Prediction 2: A Smart Glasses Turning Point
Prediction 3: New APIs & SDKs Elevate XR
Prediction 4: Mixed Reality, The New VR Standard
Prediction 5: Apple Vision Pro is Propelled by Wearables
Experiential Depth
Wearables will be a key piece of the AR puzzle. They’ll create experiential depth through integrations and continuity features between devices. For example, smartwatches can be positioned as inputs and controllers for AR glasses, while smartphones offer processing muscle.
Moreover, because near-term smart glasses will be optically underpowered (see earlier prediction), elegant integrations with wearables could make up for it by deepening experiences in other ways. We’re talking about integrations that provide utility and AI-driven personalization.
Apple Vision Pro (AVP), meanwhile, will embody such integrations more than any other AR device. This boils down to Apple’s classic vertical-integration playbook, and the opportunities for sensor fusion and continuity across its suite of ubiquitous wearables like Watch and AirPods.
The idea is to allow AVP – a new concept that needs all the adoption help it can get – to piggyback on established devices. Given the existing installed base enjoyed by Apple Watch and AirPods, they’ll be enlisted to add utility and appeal to AVP through several integrated use cases.
In fact, Apple has already started down this path and given us a few clues. Apple Watch now has the same finger-tap gestural input as AVP, while AirPods support low-latency spatial audio. Lastly and most notably, the iPhone 15 records spatial video for AVP playback.
All the above applies our signature “follow the money” framework for weighing financial motivations. Among other things, Apple is sufficiently motivated to have you buy several devices – a key part of its ecosystem approach that boosts average revenue per user (ARPU).
Historical Patterns
Though Apple could lead the way in wearables integrations for all the above reasons, it won’t be alone. Meta is investing in deep research on wrist-worn inputs, and even futuristic brain-computer interfaces (BCI). These are longer-term innovations that won’t appear in 2024.
Meanwhile, Meta will continue to invest heavily in wearable input devices that elevate and add experiential dimension to its XR devices. That includes the two “paths” to AR glasses noted earlier: AI-fueled low-immersion smart glasses and mixed-reality devices like Quest 3.
However, Meta faces one challenge: it is already committed to hand-controller-based inputs on its VR and mixed-reality devices. This is partly a function of the already-established foundation of games and apps designed around those controllers.
Developers of these experiences will find it difficult to port them to other input formats, which could slow Meta’s evolution along the above lines. That said, Meta is increasingly diversifying into controller-less hand tracking across gaming and non-gaming experiences.
Back to Apple, we predict that it will reveal more ways this year that AirPods and Watch (as well as iPhone, iPad, and others) integrate with, and add depth to, spatial experiences. These will unlock more Vision Pro use cases, giving existing Apple device users greater reason to adopt.
That will happen to the tune of 225,000 AVP unit sales in 2024. This figure takes into account the historical sales patterns of Apple wearables, adjusted for AVP’s heftier price tag. 225K units is low in Apple terms, but big enough to inflect the low-scale AR headset market as it stands today.
For more color, see the full report on spatial computing’s 2023 lessons and 2024 outlook…