Consumer AR glasses are an elusive species. Prominent players like Microsoft and Magic Leap have pivoted to enterprise (for now), while consumer-grade AR glasses like Nreal Light have been difficult to get until recently. Venerable players like Vuzix sit somewhere in between.
Then of course there’s an emerging and broadly-defined crop of smart glasses, which could be a sizable category in the near term. These are defined by smart features like cameras and speakers, à la Ray-Ban Stories, but don’t have optical systems to display visual content.
Snap’s Spectacles occupied this smart-glasses territory for several generations, until the latest version launched with full-blown AR capabilities. But there’s one drawback: they’re not for sale. Instead, they’re meant for Snap lens creators to gain creative footing for AR’s next era.
Despite these access challenges, we recently got our hands on a pair… under the supervision of Snap, of course. Perched in the heights of Beverly Hills, we received a briefing with Snap execs to unpack recent AR moves, strategic trajectory, and hands-on Spectacles action.
Style & Sturdiness
So what were our impressions? We’ll break them down below, but first, a few disclaimers: this isn’t a formal product review, because Spectacles aren’t yet a consumer product. And we won’t list all the hardware specs, which are well-established, including in our launch coverage.
With that backdrop, the most striking thing when unboxing Spectacles is the build quality. The glasses feel solid and sturdy but aren’t onerously heavy – a key attribute for anything face-worn. Spring-loaded arms adjust to head size, while input buttons and touch panels are intuitive.
True to Snap’s brand persona, the glasses are also halfway stylish. This is an accomplishment, given the design tradeoffs endemic to AR glasses. Every graphical and UX enhancement carries a cost, usually in the form of heat and bulk – neither of which is a deal breaker for Spectacles.
So where’s the tradeoff? Though it didn’t come up in our short demo, battery life is relatively short at 30 minutes. But again, this device is purpose-built for developers, who can sequence their work accordingly – programming and prototyping first, then testing their creations in shorter bursts.
When the device makes its way to consumer markets, this will presumably be addressed. In fact, Snap has already announced battery-saving features such as “endurance mode” that helps developers and early users get around battery constraints in smart and software-based ways.
On to the UX: the first impression when firing up Spectacles is the display’s brightness. It rocks 2,000 nits – enough for crisp graphics outdoors. This is critical, says Head of AR Platform Partnerships Sophia Dominguez, as the point is to enhance the world… which contains daylight.
As for lenses, the range of potential use cases is represented in creator work so far. For example, a solar system lens signals the potential for education by letting you walk around orbital bodies with anchored and positionally-tracked graphics, care of Snap’s Spatial Engine.
On the other end of the experiential spectrum, a zombie game lets you run from cartoon undead. Again, positional tracking is solid without noticeable drift. Turning back to see your assailant reveals realistic and believable pacing and positioning. It’s as scary as it sounds.
Another potentially-impactful category is meditative lenses, such as Clay Weishaar’s Metascapes and Heather Dunaway Smith’s “The Door.” For the latter, animations direct hand gestures that unlock calming sequences – demonstrating the Spatial Engine’s hand tracking capability.
Beyond lenses, Spectacles’ killer app may be its Scan feature. Already prominent in Snapchat’s mobile app, Scan is activated by tapping a physical button on the device’s frame. It successfully identified objects for me, including a bush, tree and bench (insert “not hot dog” joke).
In fact, visual search could be an AR killer app – especially in the transition from handheld to faceworn. Identifying real-world objects has a certain James Bond quality. Moreover, it’s potentially sticky in practical (and monetizable) ways, including making the world sharable and shoppable.
In all the above, a key theme is a developer-first mindset. As with Snap’s growth in mobile AR, the goal is for use cases to emerge as developers gain creative footing. As noted, Snap is getting a head start on AR glasses by giving those creators tools to start developing their creative muscles.
In fact, this is the latest in a long line of Snapchat moves to feed and fuel its 250,000-strong lens-creator community. As we’ve examined, lens creators kick off Snap’s AR virtuous cycle, which leads to lens engagement, more creators and lenses and, finally, brand marketing dollars.
With Spectacles, lens creators will also lead the way in discovering use cases. This will be a creative process, says Dominguez, given a new vantage point. In other words, early lenses were selfie-based, while Snap’s AR future takes place on the broader canvas of the physical world.
That evolution has been underway with mobile AR for some time, given Snap evolutions like Landmarkers and Local Lenses. But Spectacles accelerate the trend, given that the cameras and user perspective are only world-facing. It’s more about surroundings than selfies.
The payoff could be sizable given a wider range of use cases that occur within that broader canvas. Beyond organic fare, this fuels Snap’s growth as a business, as the addressable market of sponsored lenses extends from face fodder to anything that fits in the physical world.
Back to the hardware, though Spectacles are developer-first, they could foreshadow the design and UX of an eventual consumer model. To maintain continuity with lens-creator prototyping, that would have to be the case. And if it is, Spectacles’ future – just like its optics – is looking bright.
Disclosure: the author of this article owns stock in SNAP. See AR Insider’s Disclosure and Ethics Policy in full here.