Spatial computing – including AR, VR, and other immersive tech – continues to alter the ways that we work, play, and live. But there have been ups and downs, characteristic of hype cycles. The pendulum has swung towards over-investment, then towards market correction.

That leaves us now in a sort of middle ground of reset expectations and moderate growth. Among XR subsectors, those seeing the most traction include AR brand marketing and consumer VR. Meta continues to advance the latter with massive investments and loss-leader pricing.

Beneath these user-facing products lies a spatial tech stack, involving a cast of supporting players. We’re talking processing muscle (Qualcomm), experience creation (Adobe), and developer platforms (Snap). These picks and shovels are the engines of AR and VR growth.

So how is all of this coming together? Where are we in XR’s lifecycle? And where are there gaps in the value chain that signal opportunities? This is the topic of ARtillery Intelligence’s recent report Reality Check: the State of Spatial Computing, which we’ve excerpted below.

Reality Check: The State of Spatial Computing

The Long Run

Most of this report so far has focused on the present and near-term outlook for segments of the spatial spectrum. But what about the longer-term future? When talking about emerging tech, the discussion should include both short- and long-run perspectives.

That brings us full circle to spatial computing’s lifecycle. One of the lessons learned after XR’s circa-2016 period of elevated hype is that it’s ill-advised to set overblown expectations. Many companies and investors got burned by believing that XR’s revolutionary impact was imminent.

Consumers have also been turned off to some degree. That’s not because these technologies aren’t compelling, but because they’ve been disappointing relative to their hyped promises. Magic Leap’s first headset is one example of this sequence, and the company has paid for it.

With that backdrop, what’s the timeline for fully actualized AR and VR? For example, with AR, when will we get all-day AR glasses that offer both graphically robust UX and stylistic viability? That combination isn’t possible today due to the technological realities and tradeoffs.

The consensus is that these goals could be reached sometime in the 2030s. For example, Snap CEO Evan Spiegel has been realistic about this longer time horizon in his public statements. Meta CEO Mark Zuckerberg has begun to do the same.

But to show rather than tell, here are a few examples of those comments….

Mark Zuckerberg on The Information Podcast

“When we got phones, we didn’t get rid of our computers. We maybe just shifted some of our time towards phones. My guess is that we’ll have phones for a while too, so that part of what we do will always be important. But I think over time, [AR] will become the platform for more and more people, and I think that there’s a lot of awesome stuff that comes from that. If you can deliver a computing platform that’s fundamentally more human and about creating natural interactions between people, that’s sort of the dream that we’ve been chasing for a long time. If we can build that — and I do think it will be in a decade — a lot of the things we’ve talked about today should be delivered and at scale. I think that that will be very exciting. A lot of this stuff will start to come about in the 2020s. It may not really reach the full scale until 2030.”

Evan Spiegel at TechCrunch Disrupt

“Spectacles represent a long-term investment in augmented reality hardware. […] So I think it’ll be roughly ten years before there’s a consumer product with a display that could be really widely adopted. But in the meantime, we’ve built a relationship with our community and all these people who love building [AR] experiences and we’re sort of working our way towards that future, rather than go in a hole or in an R&D center, and try to make something that people like, then show them ten years later. We’ve sort of created a relationship with our community where we build that future together.”


Fully-Actualized Self

So there you have it… though the road to AR glasses’ endpoint is long and winding, there are meaningful wins along the way. The above two tech leaders helm companies that are achieving such milestones, including user engagement and revenue from viral mobile AR lenses.

Meta, Snap, and others are also reaching tangible milestones on the way to headworn AR. For example, Ray-Ban Meta smart glasses – though not AR glasses – represent a turning point in compelling UX and focused use cases. The same could be said for Quest 3.

Taken together, these two devices represent separate tracks toward AR’s fully-actualized self. Ray-Ban Meta smart glasses start with style and wearability, working towards more robust AR over time. Quest 3 starts with rich AR (via passthrough cameras) and aims to slim down over time.

These two tracks will eventually meet somewhere in the middle. The big questions are which side will get there first, and which tech giants will reach that convergence point. Apple could likewise bet on both horses, with Vision Pro as the first step… and lighter smart glasses to follow.

Meanwhile, these short-term wins fuel Zuckerberg’s ability to speak honestly about the long term, per the above. He can temper expectations for fully-formed AR while armed with the confidence that the XR work being done today is producing tangible – albeit gradual – progress.

This framing makes AR’s 2030s deferment easier to swallow. And the sooner we all come to terms with that – notwithstanding op-eds that continue to characterize AR glasses’ world-changing impact as imminent – the more we’ll set these technologies up to succeed through realistic expectations.
