Almost one year after Snap announced Lens Studio 5.0 from the AWE conference stage (we were on stage with them at the time), the company has released version 5.10. This version features a new development toolset called Bitmoji Suite, and new assets to power AR games.

Starting with Bitmoji Suite, the toolset brings new capabilities to personalize and animate the traditionally static Bitmoji. Creators can now design custom outfits, stylized props, and animations, and tap into Snap’s game-engine-like functions to build mini-games.

That brings us to the second big update in Lens Studio 5.10: new game functions in Snapchat’s Asset Library, spanning Bitmoji and beyond. For those unfamiliar, the Asset Library is a collection of tools and templates that creators can use to build games users play with and against each other.

New gaming functions include turn-based mechanics to support back-and-forth play between friends. There’s also a new movement system for customized game mechanics, a character controller, player metrics, and updated leaderboard functions, among other things.

In addition to these tools and templates, Snap is offering some inspiration with a few fully baked Bitmoji Game Lenses. These include Bitmoji Bistro and Bitmoji Buckets, which make good use of the above functions, such as turn-based play. We’ll see where creators go with it.

That last part is the point, and the ethos behind Lens Studio. Snap has invested ample resources in the platform to make it creator-friendly and robust. The idea is that user engagement is elevated by the breadth and depth of available lenses – which in turn scale through creators. 


Democratization Move

All the above fits within the broader evolution of Lens Studio, which brings us back to the moves Snap has made in the year since Lens Studio 5.0 launched. Many platform updates since then have been AI-fueled, including Easy Lens, AI Video Lenses, and Sponsored AI Lenses.

This signals a clear direction for Snap generally, and Lens Studio specifically. AR and AI go together in several ways, which works for Snap on a few levels. One level is AI’s ability to streamline creation – again, a big priority for Snap. Another is to elevate user-facing experiences.

Taking those one at a time, creator-facing AI includes things like Easy Lens, which lets AR creators generate lenses using text prompts – a democratization move. Snap revealed recently that Easy Lens has been used to create 10,000 lenses since it launched in December.

As for user-facing AI, the goal is to bring all of the imaginative benefits of generative AI to lenses – we like to call it generative AR. In other words, just like users today can generate 2D images through text prompts, they will be able to do the same to generate AR lenses on the fly.

There are computational challenges in bringing this vision to life, but Snap is working on it. Meanwhile, features like AI Video Lenses and Sponsored AI Lenses bring Snap closer to that goal with lenses that use AI to adapt to real-world scenes and interact dynamically and dimensionally.

In the meantime, we’ll hear more about Lens Studio 5.10 and Snap’s AR and AI evolutions next week at AWE. Evan Spiegel will keynote the event and reveal more about Snap’s spatial trajectory. We’ll be in the front row to hear what he has to say and will report back with highlights.
