Snap continues to invest in AR lenses as a driver for its advertising business. This is evident in its latest usage numbers, including six billion lens engagements per day. As we examined earlier this week, it also announced 3.5 trillion cumulative views for 2.5 million lenses.

But beyond new usage figures, Snap also continues to advance its Lens Studio creation platform. At its Lens Fest last week, it announced version 4.10, which brings broader lens sound libraries, expanded depth mapping, and creator monetization tools such as integrated calls to action.

Beyond their individual capabilities, these updates represent a few key themes in Snap's evolution as an AR platform. So we're highlighting the Lens Studio keynote and its strategic takeaways for this week's XR Talks (video and summary below).

Snap Lens Fest: the Data Dive

Depth & Immersion

Zeroing in on those themes, one key pattern is lens depth and immersion. This was demonstrated by more licensed music in Snap's Sounds library, giving lenses greater audio dimension. Snap also extended its World Mesh immersive AR capabilities to low-end phones.

Unpacking that a bit, World Mesh enables devices to scan extensive depth maps of a given space before placing a lens. Armed with that capability, lenses can more realistically interact with physical space. And an updated physics engine emulates forces like gravity.

Additionally, Lens Studio 4.10 brings greater access to APIs. For example, lens creators can tap APIs from Snap partners to integrate elements like stock tickers or animations that sync with weather conditions. The idea is for creators to run with these integrations in several directions.

Similarly, there are now more geo-local capabilities for lenses that are discoverable at specific places. Following Snap’s “Space Race” play, this takes the geo-local capabilities of Landmarkers and opens them up for creators to scan a given place, then create an associated lens.

Lastly, there are a few key updates to Snap Spectacles, which are mostly a developer tool at this point (more on that below). Among other things, a new Endurance Mode adjusts display brightness to preserve battery life. And Connected Lenses enable synced multi-user experiences.

The AR Space Race, Part IV: Snap

Demand Signals

Another key theme evident throughout Lens Fest is Snap's investment in empowering the Lens Studio creator community. Snap recognizes that lens creators kick off the "virtuous cycle" that attracts users, which in turn attracts more developers, more users… then brand marketers.

This theme is evident in all the Lens Studio updates outlined above. But it’s also seen in new tools that support creator exposure (profiles and networking) and creator monetization (lens-based calls to action). The latter can link users to creators’ eCommerce stores, for example.

Furthering this monetization potential, creators can assemble "lens packs" that brands can purchase. These flow into Camera Kit, Snap's API that lets brands integrate and customize Snap lenses directly in their own apps rather than build AR experiences from scratch.

Meanwhile, Snap’s AR innovation lab "Ghost" will let lens creators apply for grant funding of up to $150,000 to develop new lenses. This follows the recent launch of AR Lab and Arcadia, which similarly support lens development — albeit geared more towards brand advertisers.

Lastly, Lens Studio 4.10 comes with deeper analytics features baked in. That way, creators can get a better sense of which lenses resonate most. They can use these demand signals to course-correct or optimize their time – particularly creators who make a living on Lens Studio.

Snapchat’s Virtuous AR Cycle: Users, Developers & Advertisers

Developing Muscles

In all of the above, another theme is the extension of lens capabilities from selfie fodder to world-facing lenses. As we’ve examined, this broadens AR use cases – and correspondingly, the addressable market of advertisers – beyond things that go on one’s face.

But more importantly, this shift to world-immersive lenses primes Lens Studio creators to start thinking spatially. In other words, they’re developing muscles for AR’s next evolution: AR glasses. In that form factor, selfie lenses will mostly recede, as all AR lenses become world-facing.

But the key term is “developing muscles.” One misconception about Spectacles is that Snap is positioning them as a revenue center. Rather, Spectacles are a means to an end: getting creators trained on world-immersive lenses, pursuant to a hardware-agnostic play.

“We will continue to look for opportunities to work with any company doing innovative work,” Snap’s Bobby Murphy told Protocol. “Whether building our own hardware or operating our software on others’ hardware, we’re going to empower the best AR experiences that we can.”

We’ll pause there and cue the full video below…

More from AR Insider…