Spatial computing – including AR, VR, and other immersive tech – continues to alter the ways that we work, play, and live. But there have been ups and downs, characteristic of hype cycles. The pendulum has swung towards over-investment, then towards market correction.

That leaves us now in a sort of middle ground of reset expectations and moderate growth. Among XR subsectors, those seeing the most traction include AR brand marketing and consumer VR. Meta continues to advance the latter with massive investments and loss-leader pricing.

Beyond user-facing products, a spatial tech stack lies beneath. This involves a cast of supporting parts. We’re talking processing muscle (Qualcomm), experience creation (Adobe), and developer platforms (Snap). These picks and shovels are the engines of AR and VR growth.

So how is all of this coming together? Where are we in XR’s lifecycle? And where are there gaps in the value chain that signal opportunities? This is the topic of ARtillery Intelligence’s recent report Reality Check: the State of Spatial Computing, which we’ve excerpted below.

Reality Check: The State of Spatial Computing

Elusive Animal

In the last installment of this series, we reviewed Magic Leap 2 from a hardware perspective, as well as its market positioning. Now we switch gears to do the same thing for a different AR headset and market segment: Snap Spectacles and consumer AR.

Backing up, consumer AR glasses are an elusive species. Prominent players like Microsoft and Magic Leap have pivoted to enterprise, while consumer-grade lines like Nreal's Light and Air continue to advance. Venerable AR glasses specialists like Vuzix sit somewhere in between.

There’s also an emerging and broadly-defined crop of “smart glasses,” which could be a sizable category in the long term. These are defined by smart features like cameras and speakers, a la Ray-Ban Stories, but don’t have optical systems to display visual content.

Snap Spectacles occupied this smart-glasses territory for several generations until the latest version launched with full-blown AR capabilities. But there's one drawback: they're not for sale. Instead, they're meant for developers, to seed the content ecosystem.


On Brand

Upon unboxing, Spectacles' most striking attribute is their build quality. The glasses feel solid and sturdy but aren't onerously heavy at 134 grams – a key consideration for anything face-worn. Spring-loaded arms adjust to head size, while input buttons and touch panels are intuitive.

On brand for Snap, the glasses are also halfway stylish. This is notable, given design tradeoffs endemic to AR glasses. Every UX enhancement carries a cost, usually heat and bulk – neither of which is onerous with Spectacles. The displays also output 2,000 nits of brightness.

So where's the tradeoff? Battery life is relatively short at 30 minutes. But again, this device is purpose-built for developers. When it makes its way to consumer markets, battery life will presumably be addressed, as it's a key factor for any wearable.

As for potential use cases, they're represented in lens creators' work so far. For example, a solar system lens signals the potential for education by letting you walk around orbital bodies rendered with anchored, positionally-tracked graphics, courtesy of Snap's Spatial Engine.


From Handheld to Headworn

Another potential category is meditative lenses, such as "Metascapes" and "The Door." In the latter, animations guide hand gestures that unlock calming sequences – demonstrating the Spatial Engine's hand-tracking capability. AR input will continue to be a developing area.

Beyond lenses, Spectacles' killer app may be their Scan feature. Already prominent in Snapchat's mobile app, this visual search feature lets you tap a physical button on the device's frame to activate it. It identified simple objects in our tests, including bushes, trees, and benches.

As we've examined in the past, visual search could indeed be AR's killer app, especially as it transitions from handheld to headworn. Identifying objects is both practical and monetizable – including making the world sharable and shoppable. These factors align well with Snap's AR playbook.

A similar evolutionary path is apparent in Snapchat's mobile AR expansion from front-facing lenses (selfies) to rear-facing lenses that augment the broader canvas of the physical world. These developments are all related because, with AR glasses, all lenses are world-facing.


Cars to Couches

This transition from user-facing to world-facing AR also has revenue implications. Because a wider range of use cases occupies that "broader canvas," there's a larger addressable market of brands that can develop sponsored lenses – everything from cars to couches.

But to get to that point, Snap knows it has to start with developers. Putting Spectacles into their hands first gets them thinking spatially and lets them lead the way on world-facing AR lens standards. And because this is a new medium, it will be a learning process.

Back to the hardware: though Spectacles are developer-first, they foreshadow the design and UX of an eventual consumer model. They'd have to, in order to maintain continuity for developers. And if that's the case, Spectacles' future – just like their optics – is looking fairly bright.
