There’s no doubt that Snap is all in on AR. The technology has fueled its revenue growth over the past few years, and Snap has internalized that feedback loop: platform investment drives AR engagement, which drives revenue, which funds more investment. AR continues to be its North Star.

Fueling that feedback loop are Snap’s AR usage milestones. It now has 250 million daily lens users who engage 6 billion times per day collectively, and 5 trillion times cumulatively. This all comes from a universe of 250,000 creators who have developed more than 2.5 million lenses.

Snap knows that all of the above is propelled by lenses themselves, which get built by all those creators. So the name of the game is to keep them well-fed with AR tools. And that’s what Snap Partner Summit is all about, including a parade of announcements about new capabilities.

These include Lens Cloud, new commerce and product try-on features, ray tracing for greater lens detail, and deeper lens analytics, among other things. So for this week’s XR Talks, we break down a few of the AR highlights, with video and summarized takeaways below.

From Puma to Prada

Starting with AR fashion and shopping, Snap announced “Dress Up,” a new feature that becomes the primary home within Snapchat for virtual try-ons. It offers full-body lenses for wide-angle shots that let users try on entire outfits in addition to individual style items.

Beyond user-facing features, Snap is also launching tools that lessen friction for fashion brands to create lenses. This involves its lens creation technology that can be brought into these brands’ own websites and development environments where their 3D workflows live.

This is essentially a fashion-forward extension of Camera Kit – Snap’s SDK that lets any business take Snap’s AR creation into their own environment. In fact, Camera Kit is one of the most under-exposed components of Snap’s growth strategy, used by consumer giants like Disney.
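
For a sense of what that looks like in practice, here’s a minimal sketch of embedding a lens on a brand’s own web page with Camera Kit’s Web SDK (@snap/camera-kit). Method names follow Snap’s published docs but can vary by SDK version, and the API token, lens ID, and lens group ID below are placeholders.

```typescript
// Minimal sketch of embedding Snap AR in a brand's web page via Camera Kit's
// Web SDK. Names follow Snap's published docs but may differ by version; the
// token and IDs are placeholders to be filled from the Camera Kit portal.
import { bootstrapCameraKit, createMediaStreamSource } from '@snap/camera-kit';

async function startTryOn(canvas: HTMLCanvasElement): Promise<void> {
  // Initialize the SDK with the partner's API token.
  const cameraKit = await bootstrapCameraKit({ apiToken: '<YOUR_API_TOKEN>' });

  // Create a rendering session that draws AR output to the page's canvas.
  const session = await cameraKit.createSession({ liveRenderTarget: canvas });

  // Feed the user's camera into the session.
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });
  await session.setSource(createMediaStreamSource(stream));

  // Load a try-on lens from a lens group configured in the portal, then render.
  const lens = await cameraKit.lensRepository.loadLens('<LENS_ID>', '<LENS_GROUP_ID>');
  await session.applyLens(lens);
  session.play();
}
```

From there, the brand controls the surrounding page, checkout flow, and branding, which is the point of taking lens creation beyond Snap’s own roof.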

The idea is that this casts a wider net and broadens Snap’s addressable market. In addition to lens creation under its own roof, this extends Snap’s reach through third parties. The result could be Snapchat fashion-lens capability showing up on websites and apps from Puma to Prada.

Digital Companion

Along the same lines of casting a wider net, Snap announced a partnership with Live Nation to bring more event-themed lenses to sports, concerts, and festivals. This opens the door to digital companion experiences at live events that deepen fan relationships.

One example is what it already did at this year’s Super Bowl. Cartoonish player animations were available for fans to play with. Similar use cases at concerts could include posing with the band through dimensional lenses, or trying on band merchandise before purchasing.

These types of experiences broaden AR use cases in new directions as the technology continues to find its footing. Event-based experiences could also be well received not only by fans but by performers. Indeed, performers are hungry for fan engagement as business models get upended.

First, streaming music flipped revenue potential from recorded music to live events. Then the pandemic killed that last vestige of monetization. So as paid events return, interactive lenses could be one way that artists deepen fan relationships and boost the appeal of attending.

Cloud Cover

But it’s not all fun and games. Snap elevated its AR prowess across the board with new AR infrastructure. Its new Lens Cloud is a server-side utility that beefs up lens capabilities in several ways. For example, it helps achieve the coveted “multi-player” AR functionality.

For those unfamiliar, AR on its own requires triggers to activate graphics and place them in the right spots. This gets more complicated when several people need to see the same thing: graphics must be anchored to a given point in a way that renders correctly from each viewer’s angle.
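
To make that challenge concrete, here’s a minimal sketch of the coordinate math involved. It is not Snap’s API: the Pose type, the yaw-only rotation, and the anchorInCameraFrame helper are all illustrative stand-ins for what a shared-AR runtime does once every client agrees on one anchor in a common world frame.

```typescript
// Illustrative sketch (not Snap's API): once all clients share one anchor
// pose in a common world frame, each client re-expresses that pose in its
// own camera frame so the graphic lines up from every viewpoint.

type Vec3 = [number, number, number];

// Simplified pose: position plus heading (yaw only, rotation about the Y axis).
interface Pose {
  position: Vec3;
  yawRadians: number;
}

// Hypothetical server-resolved anchor in the shared world frame.
const sharedAnchor: Pose = { position: [2.0, 0.0, -3.5], yawRadians: Math.PI / 2 };

// Express the shared anchor relative to one client's camera pose.
function anchorInCameraFrame(anchor: Pose, camera: Pose): Pose {
  // Translate into camera-relative coordinates...
  const dx = anchor.position[0] - camera.position[0];
  const dy = anchor.position[1] - camera.position[1];
  const dz = anchor.position[2] - camera.position[2];
  // ...then rotate by the inverse of the camera's heading.
  const c = Math.cos(-camera.yawRadians);
  const s = Math.sin(-camera.yawRadians);
  return {
    position: [c * dx + s * dz, dy, -s * dx + c * dz],
    yawRadians: anchor.yawRadians - camera.yawRadians,
  };
}

// Two users at different spots see the same anchor, placed consistently.
const seenByUserA = anchorInCameraFrame(sharedAnchor, { position: [0, 0, 0], yawRadians: 0 });
const seenByUserB = anchorInCameraFrame(sharedAnchor, { position: [4, 0, -1], yawRadians: Math.PI });
console.log(seenByUserA, seenByUserB);
```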

Lens Cloud provides the infrastructure to pull that off. It also comes with a location-based component that lets creators anchor lenses to specific places for persistent AR. And deeper analytics let them measure usage in more granular ways so they can optimize lenses accordingly.

Lastly, Lens Cloud offers data storage, which expands AR capability beyond local device limits. It also lets developers store lens assets on Snap’s servers and load them up when needed. Practical outcomes include letting users leave a lens, then pick it up later in the same state.
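
As a rough sketch of that save-and-resume idea: the RemoteStore interface and both helpers below are hypothetical stand-ins, not Lens Cloud’s actual API, but they show the shape of keeping per-user lens state on a server, keyed by lens and user.

```typescript
// Hypothetical sketch of server-side lens state (not Lens Cloud's real API):
// state is serialized and keyed by lens + user so a session can resume later.

interface LensState {
  score: number;
  placedObjects: { id: string; position: [number, number, number] }[];
}

// Stand-in for a remote key-value store backed by the platform's servers.
interface RemoteStore {
  put(key: string, value: string): Promise<void>;
  get(key: string): Promise<string | null>;
}

async function saveLensState(
  store: RemoteStore, lensId: string, userId: string, state: LensState
): Promise<void> {
  await store.put(`${lensId}:${userId}`, JSON.stringify(state));
}

async function resumeLensState(
  store: RemoteStore, lensId: string, userId: string
): Promise<LensState | null> {
  const raw = await store.get(`${lensId}:${userId}`);
  return raw ? (JSON.parse(raw) as LensState) : null; // null means start fresh
}
```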

https://youtu.be/cqIsDo-f670

Simple Fun

Finally, Snap sprinkled in some wow factor. In the spirit of Steve Jobs’ signature “one more thing,” Evan Spiegel unveiled Pixy. Shown in the video above, this is a nifty little $229 camera drone with automated controls that let it follow users and capture their life moments from above.

The aerial companion is designed to follow the user wherever they go and fly in pre-set patterns such as linear or orbital paths. While doing that, it stays focused on its subject (presumably through short-range tracking of the user’s mobile hardware) to capture footage all along the way.

And just like Spectacles, the resulting footage can be edited with Snapchat lenses and other tools, such as the new “Director Mode.” Finished clips can then be exported to any other platform. This open approach is smart, adding appeal and exposure for the new aerial modality.

But most of all, this Jobsian coda adds some magic. Beyond the technicalities inherent in developer events, Pixy has a coolness factor that makes you just want one. After all, that “experience sell” – eschewing tech specs for simple fun – is at the core of Snap’s DNA.

We’ll pause there and cue the full keynote below…
