Qualcomm today announced the latest addition to its fleet: the Snapdragon AR2. Built for AR glasses, the chipset comprises a main processor, a co-processor, and a connectivity chip. It joins the Snapdragon XR chips used across 60+ AR and VR headsets, splitting off into an AR-dedicated design.

This is a significant moment for Qualcomm and for AR: spatial computing’s AR subset is getting its own chip. It signals Qualcomm’s confidence that the category will grow large enough to support purpose-built processors. And the company is putting its money where its mouth is.

Going deeper on the chipset, the components listed above are meant to be distributed across the AR glasses frame. The main processor fits on one of the glasses’ arms, the connectivity (Wi-Fi 7) chip is embedded on the other arm, and the co-processor goes on the nose bridge.

This configuration is designed for optimal distributed processing. For example, the main processor handles scene understanding and display output, among other things; the co-processor powers eye tracking; and the connectivity chip bridges to a host device, such as a smartphone.

This distributed load is meant to lower design barriers for AR glasses manufacturers (Qualcomm’s customers) by spreading out the heat and bulk of the processing units. That matters because AR glasses design is otherwise a game of tradeoffs between UX and wearability.

“One of the reasons I’m excited about this chip is that we all want fashionable, sleek devices to be available,” said Hugo Swart, Qualcomm VP and GM of XR, during a press pre-briefing we attended, “but yet not to compromise on performance.”

Play to Your Strengths

One thing that’s clear from AR2’s design is that Qualcomm continues to double down on distributed processing as a design principle for AR glasses. This is all about sharing the computing load between the glasses themselves, as noted, and a proximate host device.

Stepping back, AR glasses come in several designs, including standalone (HoloLens 2) and tethered (Magic Leap 2). Sitting between them are “viewers,” which include face-worn display peripherals (Nreal Light) and smart viewers that split the computational load (Lenovo A3).

That last category is where distributed processing shines, achieving power efficiency and a richer UX. Here, it’s all about offloading high-thermal compute processing from a device meant to be worn on your face. Doing so alleviates the underlying design challenges noted above.

The result is that glasses can handle positional tracking and computer vision, while the smartphone or computing puck does the heavy lifting for running apps and graphics processing (GPU). This not only spreads things out but lets each piece of hardware play to its strengths.
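To make the division of labor concrete, here is a minimal sketch of a split-compute pipeline. The function names, data shapes, and exact split are hypothetical illustrations of the principle described above, not Qualcomm’s actual API.

```python
# Hypothetical sketch of distributed processing for AR glasses:
# lightweight tracking runs on the glasses, heavy rendering on the host.
# All names and structures here are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float
    z: float

def glasses_track_pose(sensor_sample: dict) -> Pose:
    """On-glasses work: low-power positional tracking / computer vision."""
    return Pose(sensor_sample["x"], sensor_sample["y"], sensor_sample["z"])

def host_render_frame(pose: Pose) -> dict:
    """On-host work (smartphone or puck): app logic and GPU rendering."""
    return {"pose": pose, "pixels": f"frame@({pose.x},{pose.y},{pose.z})"}

def run_split_pipeline(sensor_sample: dict) -> dict:
    pose = glasses_track_pose(sensor_sample)   # stays on the face: cool, light
    frame = host_render_frame(pose)            # offloaded: hot, heavy compute
    return frame

frame = run_split_pipeline({"x": 0.1, "y": 1.6, "z": -0.3})
```

The design point is that the thermally expensive step never touches the head-worn device; only compact pose data crosses the wireless link in one direction and rendered frames in the other.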

For example, the AR2 achieves sub-2-millisecond latency and low power consumption for advanced functions like reprojection. This is the process by which an AR device, as its wearer moves around digitally placed objects, constantly re-localizes and tracks its own motion to keep those objects anchored in place.
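The idea behind reprojection can be shown with a toy example: a frame rendered at one head pose is shifted to compensate for head movement that occurred before display, so anchored content stays put. This is a simplified one-axis sketch under assumed numbers, not Qualcomm’s implementation.

```python
# Toy reprojection sketch (illustrative assumptions throughout):
# shift a rendered pixel to offset head yaw between render time and
# display time, keeping a virtual object visually anchored.

def reproject(pixel_xy, render_yaw_deg, display_yaw_deg, px_per_degree=20.0):
    """Shift a pixel horizontally to cancel head yaw accumulated since render."""
    delta = display_yaw_deg - render_yaw_deg      # how far the head turned
    shift_px = delta * px_per_degree              # angular error -> pixels
    return (pixel_xy[0] - shift_px, pixel_xy[1])  # move content opposite the turn

# Head turned 0.5 degrees right after rendering; content shifts 10 px left.
corrected = reproject((960.0, 540.0), render_yaw_deg=10.0, display_yaw_deg=10.5)
```

Because this correction must land within a frame’s display window, the latency budget is tiny, which is why the sub-2-millisecond figure above matters.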

“Some customers that want to do standalone AR can still use our XR product line,” said Swart. “But if you want to have a sleek, low-power device, you need an AR solution. It’s a very different chip. We had to create an AR-only chip. That’s why we created the new product category.”

Software & Silicon

So how does the AR2 clock in? According to Swart, it consumes 50 percent less power and delivers 2.5x the AI performance of the legacy XR2. This is useful when AR glasses scan their surroundings to identify objects, localize, and reproject – precursors to overlaying the right content.

That’s not to say the XR2 is sub-par, but the AR2 is built solely for AR glasses and doesn’t have to do double duty across device classes. The AR2 is also 40 percent thinner and lighter than the XR2 and requires 45 percent less in-device wiring, which enables more wearable AR designs.

All of this expands Qualcomm’s positioning in the XR world. It continues to become more vertically integrated for both AR and VR. For example, the AR2 joins last November’s launch of its Snapdragon Spaces platform, giving Qualcomm tighter integration of software and silicon.

Meanwhile, Snapdragon AR2 is already being used and tested by AR leaders including Adobe, Microsoft, and Niantic. And beyond these players, the AR industry as a whole benefits, as AR headsets get their own purpose-built chip for the first time. Consider it AR’s coming of age.

“Why do we need something purpose-built?” said Swart. “We can’t be an XR across-the-board chip. We need a dedicated solution for Augmented Reality.”