In a “one more thing”-style moment at its Partner Summit today, Snap unveiled developer-facing AR glasses. As the next generation of its Spectacles camera glasses, they feature optical and display systems for the first time. That’s right, Spectacles are now true AR glasses.
Before going into hardware specs and strategy, one point to reiterate is the key qualifier above: ‘developer-facing.’ These glasses won’t be for sale, but rather will be made available to select Lens Studio developers to envision and prototype AR experiences with a tighter feedback loop.
In other words, these glasses are built with the primary purpose of giving developers hardware that will inspire new lens use cases that are native to a face-worn orientation. This is on-brand for Snap in setting up its lens creators with tools to push AR experiences further.
“We’ve offered Spectacles to a select group of global creators,” said Evan Spiegel from the Partner Summit virtual stage. “For the first time, creators can build Lenses, see them overlaid on the world, and realize their vision by sending Snaps to friends, right from the glasses.”
Under the Hood
Going deeper on the hardware, these are 134-gram standalone smart glasses with optical and display systems to overlay images on the wearer’s field of view. They employ a stereo color display (both eyes) with 480 x 564 resolution and 26.3-degree diagonal field of view.
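For a rough sense of how sharp that display is, the quoted figures can be turned into a pixels-per-degree estimate. This is a back-of-envelope sketch that assumes the 480 x 564 resolution is per eye and that the 26.3-degree field of view is measured along the display diagonal, as Snap's spec phrasing suggests:

```python
import math

# Back-of-envelope angular resolution for the new Spectacles display,
# using Snap's quoted figures (assumed per eye / diagonal FOV).
width_px, height_px = 480, 564
diagonal_fov_deg = 26.3

# Pixel count along the display diagonal.
diagonal_px = math.hypot(width_px, height_px)

# Pixels per degree (PPD) along that diagonal.
ppd = diagonal_px / diagonal_fov_deg

print(f"Diagonal: {diagonal_px:.0f} px, ~{ppd:.1f} pixels per degree")
# → Diagonal: 741 px, ~28.2 pixels per degree
```

Roughly 28 pixels per degree is serviceable for overlaid graphics and UI, though well short of the ~60 PPD often cited as the threshold where individual pixels become indistinguishable.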
The dynamically adjusting display shines at up to 2,000 nits of brightness, which Snap says makes the glasses suitable for outdoor use. They also have four built-in microphones and stereo speakers for capturing and playing spatial audio, which could engender audio AR experiences.
The glasses can be controlled through a touchpad, two physical buttons, and voice commands. These inputs will have standard system-wide functions but could also evolve and be repurposed as use cases themselves do. The glasses also have a 30-minute run time and USB-C charging.
Importantly, the frames include two RGB 30 FPS cameras. In past Spectacles, these captured stereoscopic video for 3D playback. But in these AR-enabled frames, the cameras scan the wearer’s surroundings to build spatial maps that can inform the believable placement of AR graphics.
Snap says that this is all powered by its Spatial Engine, which supports six degrees of freedom, as well as hand, marker and surface tracking. There will also be tight integration with Snap’s Lens Studio so that developers can build AR experiences natively for the new form factor.
Speaking of native development, this is one of the main drivers for Snap’s new Spectacles, as noted. The goal is for new AR use cases to be discovered as developers get a feel for face-worn AR orientation. This will let them better envision and prototype lens experiences.
This follows an important trend in Snapchat’s AR evolution. Though early use cases leaned heavily towards selfie lenses, Snap has been explicit that the future of AR will also involve the broader canvas of the physical world, seen through the rear-facing smartphone camera.
It now doubles down on this principle with AR glasses, as the cameras are only world-facing. This should accelerate the development of world-facing lenses — something that Snapchat has already begun to cultivate through developments like Local Lenses and Snap Scan.
While we’re naming trends and patterns, the latest model is aligned with the Spectacles’ purpose from the beginning. It’s often missed that the true purpose of Spectacles is to learn from user behavior and social dynamics. Now Snap is doing the same, but in a developer-facing way.
It’s also worth noting that this is the anti-Apple approach. Secrecy in hardware is in Apple’s DNA, and its much-rumored AR glasses are no different. But is it missing out on key behavioral insights that Snap continues to internalize through a more open market-testing approach?
Follow the Playbook
Back to the developer-facing designation for Snap’s new Spectacles, this is the latest in a long line of Snapchat moves to feed and fuel its 200,000-strong lens-creator community. As we’ve examined in the past, lens creators are the first step in kicking off Snap’s AR virtuous cycle.
In other words, lens libraries attract users and deepen engagement. Growing audiences then attract more lens developers, who further expand the library and, in turn, draw more users. All of the above then attracts the real endgame: advertisers. It all starts with developers.
Spectacles now follow the same playbook, which isn’t surprising. But though they’re developer-facing, these Spectacles could foreshadow the UX for eventual consumer AR Spectacles. For continuity in developer prototyping, that would have to be the case to some degree.
Meanwhile, Snap just took a sizable evolutionary step in its AR road map. It continues to signal that it wants to steer the technology’s direction. It’s driven to do so as it sees AR as a core component of its product, its business, and its continued actualization as a camera company.