As we approach a new year, it’s time for our annual ritual of synthesizing the lessons from the past twelve months and formulating the outlook for the next twelve. 2023 was an incremental year for AR & VR, which both continue to gradually trudge uphill toward mainstream traction.
Highlights this year include the beginnings of XR’s convergence with AI, Apple Vision Pro’s unveiling, and some ups and downs for VR. 2023 was also defined by the rise of passthrough AR that’s incubated within VR a la Quest 3, as well as some smart glasses milestones.
These ends of the spectrum – passthrough AR and lightweight smart glasses – represent two paths toward AR's holy grail. The former will get there through graphically rich UX in bulky form factors that slim down over time, while the latter gains UX richness to accompany its wearability.
Until then, what does spatial computing’s near-term look like? Aligned with the more extensive predictions of our research arm, ARtillery Intelligence, we’ve devised 5 AR Insider predictions. We’ll break them down weekly, continuing here with prediction 3: New APIs & SDKs Elevate AR.
Prediction 1: AR and AI Get Hitched
Prediction 2: A Smart Glasses Turning Point
Prediction 3: New APIs & SDKs Elevate AR
Prediction 4: Mixed Reality, The New VR Standard
Prediction 5: Apple Vision Pro is Propelled by Wearables
Distributed Innovation
To date, AR achieves the greatest scale with free creation platforms. For example, Snap Lens Studio brings low-friction AR tools to developers and creative professionals. That has scaled up lens creation, variety and consumer entry points. The result is 6 billion lens plays per day.
Another version of this principle comes in APIs and SDKs. The former provide functionality that developers can infuse into their AR (and non-AR) apps. The latter are toolkits for creating full-blown AR experiences and apps, such as Apple ARKit, Snap Camera Kit, and Niantic Lightship.
We expect both APIs and SDKs to be vehicles for AR growth in 2024. Starting with APIs, we believe Google could create one around Google Lens, its visual search tool that lets users point their phones at physical objects to identify and contextualize them.
As we examined recently, Google Lens carries advanced functionality that Google uniquely possesses, including the wide-scale image training set (Google Images) required for visual object recognition. That kind of unique technical capability is often a condition for an API to thrive.
In such cases, developers are more inclined to use existing best-of-breed technologies than build advanced functionality from scratch. Moreover, visual search will continue to grow in demand and is additive to a wide range of apps – everything from shopping to travel to education.
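To be clear, Google Lens has no public API today; that's the prediction. But Google's existing Cloud Vision API (its images:annotate endpoint) offers a plausible template for what a Lens API's developer surface could look like. Here's a minimal Swift sketch against that existing endpoint; the notion that a Lens API would mirror this shape is our assumption.

```swift
import Foundation

// Minimal sketch: identify an object in a photo the way a hypothetical
// Google Lens API might. Modeled on Google's existing Cloud Vision
// `images:annotate` REST endpoint; a dedicated Lens API is our prediction,
// not a shipping product, so treat this call shape as an assumption.
func identifyObject(in imageData: Data, apiKey: String) async throws -> [String] {
    let url = URL(string: "https://vision.googleapis.com/v1/images:annotate?key=\(apiKey)")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")

    // Cloud Vision accepts base64-encoded image bytes plus a feature list.
    let annotateRequest: [String: Any] = [
        "image": ["content": imageData.base64EncodedString()],
        "features": [["type": "LABEL_DETECTION", "maxResults": 5]]
    ]
    request.httpBody = try JSONSerialization.data(withJSONObject: ["requests": [annotateRequest]])

    let (data, _) = try await URLSession.shared.data(for: request)

    // Pull out label descriptions, e.g. ["sneaker", "footwear", ...].
    let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
    let responses = json?["responses"] as? [[String: Any]]
    let labels = responses?.first?["labelAnnotations"] as? [[String: Any]] ?? []
    return labels.compactMap { $0["description"] as? String }
}
```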
Meanwhile, Google has a proven interest in API approaches to AR, as seen in its Geospatial API. This lets developers tap into functionality that Google has uniquely built for location-based AR experiences (drawing on its vast Street View data), such as neighborhood wayfinding.
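The Geospatial API is already live, so we can ground this one closer to shipping code. Below is a minimal Swift sketch against the ARCore iOS SDK's published surface (GARSession and friends); exact signatures can shift between SDK releases, so treat the method names as assumptions, and the coordinate as illustrative.

```swift
import ARKit
import ARCore        // ARCore iOS SDK; module naming can vary by install method
import CoreLocation
import simd

// Minimal sketch: anchor content to a real-world coordinate via the ARCore
// Geospatial API. Assumes a GARSession already configured with
// geospatialMode enabled; the coordinate below (Times Square) is illustrative.
func placeGeospatialAnchor(arFrame: ARFrame, garSession: GARSession) throws -> GARAnchor? {
    // Feed each ARKit frame to ARCore so it can localize the device
    // against Google's Street View-derived VPS data.
    let garFrame = try garSession.update(arFrame)

    // Earth tracking only resolves once VPS localization succeeds.
    guard let earth = garFrame.earth, earth.trackingState == .tracking else {
        return nil
    }

    // Identity rotation in the east-up-south convention keeps content upright.
    return try garSession.createAnchor(
        coordinate: CLLocationCoordinate2D(latitude: 40.7580, longitude: -73.9855),
        altitude: earth.cameraGeospatialTransform?.altitude ?? 0,
        eastUpSouthQAnchor: simd_quatf(ix: 0, iy: 0, iz: 0, r: 1))
}
```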
Experiential Spectrum
As for SDKs, our money is on a kit that supports spatial audio, or audio AR. Apple has already done the hard part in conditioning user behavior around AirPods: we wear them persistently, but they're inactive most of the time, and use cases are narrow (e.g., phone calls).
Building on that installed base, the opportunity is to stimulate audio-rich apps that utilize proximate iPhone sensors like GPS and the accelerometer. The result could be audio AR apps ranging in functionality from local discovery to social connection to sports and performance.
For example, golf apps could whisper advice and intelligence as you step up to the tee box on a given hole. Local discovery apps could give you subtle audio cues when you're in proximity to pre-defined interests (think: craft beers). Travel apps could offer GPS-enabled walking tours.
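No such Apple SDK exists yet, but the raw ingredients already ship in iOS. Here's a minimal Swift sketch of the local discovery example above, pairing CoreLocation with AVAudioEngine's environment node to play a spatialized cue near a point of interest; the class name, coordinates, and 50-meter radius are our own illustrative choices, not any announced Apple API.

```swift
import AVFoundation
import CoreLocation

// Minimal sketch of the audio-AR pattern described above: when the user
// nears a point of interest, render a spoken cue as spatialized audio.
// Requires NSLocationWhenInUseUsageDescription in Info.plist.
final class PoiCueEngine: NSObject, CLLocationManagerDelegate {
    private let engine = AVAudioEngine()
    private let environment = AVAudioEnvironmentNode() // 3D-positions sources around the listener
    private let player = AVAudioPlayerNode()
    private let locationManager = CLLocationManager()
    private let poi = CLLocation(latitude: 45.5231, longitude: -122.6765) // e.g., a craft brewery
    private var cueFile: AVAudioFile?

    init(cueURL: URL) {
        super.init()
        cueFile = try? AVAudioFile(forReading: cueURL)
        engine.attach(environment)
        engine.attach(player)
        // Mono sources routed through an environment node get spatialized.
        let mono = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 1)
        engine.connect(player, to: environment, format: mono)
        engine.connect(environment, to: engine.mainMixerNode, format: nil)
        locationManager.delegate = self
        locationManager.requestWhenInUseAuthorization()
        locationManager.startUpdatingLocation()
        try? engine.start()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let here = locations.last, let file = cueFile else { return }
        // Fire the cue inside a 50 m radius, positioned slightly off to the
        // side so it reads as ambient rather than an interruption.
        if here.distance(from: poi) < 50, !player.isPlaying {
            player.position = AVAudio3DPoint(x: 2, y: 0, z: -1)
            player.scheduleFile(file, at: nil)
            player.play()
        }
    }
}
```

Notably, Apple already exposes AirPods head tracking through CoreMotion's CMHeadphoneMotionManager; presumably a dedicated audio AR SDK would fold that listener pose into hooks like these automatically.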
In all these cases, the common thread is situational awareness and AI-driven insight, delivered in subtle and ambient ways via the AirPods you already wear. And it doesn't have to be just AirPods: Meta is keen on this opportunity, as it has demonstrated with the latest Ray-Ban Meta smart glasses.
For Apple, a spatial audio SDK could not only breathe new life into AirPods (already a high-stakes Apple revenue center) but also broaden the experiential spectrum and immersive capabilities of Apple Vision Pro. In both cases, Apple has a lot riding on adoption and unit-sales growth.
Synthesizing all of the above into a concrete 2024 prediction: we'll see a Google Lens-based API and a spatial audio SDK from Apple. The latter would exist alongside other Apple SDKs such as ARKit and those for iOS, watchOS, visionOS, and tvOS. Both moves could meaningfully advance AR.
We’ll pause there and circle back next week with another 2024 prediction. Meanwhile, see more color in our full report on 2023 lessons and 2024 outlook.