Snap this week took the latest step in its ongoing quest to converge AR and AI. Its new Sponsored AI Lenses bring generative AI to Snap’s established but evolving slate of immersive brand marketing tools. In doing so, it boosts the appeal and addressable market of AR marketing.

Backing up for context, sponsored lenses have historically included virtual try-ons and what we call “incarnations.” The latter are thematic lenses that exude brand vibes, but aren’t necessarily try-ons. Think: a beer brand dressing you in your favorite team gear on MLB opening day.

In these use cases and others, AI can make lenses more dynamic, materializing in real time based on environmental factors and spoken prompts. This was most recently seen in Snap’s new Video Generative AI, which features wildlife that interacts with whatever is in the frame.

In that light, the new Sponsored AI Lenses represent two main evolutions for Snap’s AR endeavors. First, they bring its work in AI-fueled lenses to brand marketers, which is a big step for AR monetization. Second, they introduce new possibilities for Snap’s Lens UX.


Business Case

Taking those one at a time, the monetization angle is meaningful for Snap because the company is all-in on AR as an engagement driver for its users. That in turn fuels advertiser interest in reaching AR-forward users – including brand-coveted Gen-Z – using the same UX language.

All of this has translated to meaningful revenue for Snap’s business. Now with AI, it can amplify paid lens efforts in a few key ways. First, it meets a cultural moment when advertisers are keen to learn how they can use AI. But more importantly, it broadens creative possibilities.

On the second point, Sponsored AI Lenses offer a more varied set of possibilities because they’re generated on the fly. Each lens uses a preset prompt to keep it somewhat consistent and on brand. But from there, environmental inputs are used to create several variations per lens.

This engenders diverse experiences that are customized for each user on the fly and avoid repetition. That will improve the UX and inspire repeat engagement, which brands want. In fact, Sponsored AI Lens tests with Uber and Tinder show higher playtimes than standard lenses.

All the above will appeal to brands on practical levels too. AI’s role in the equation reduces production costs and shortens timelines. Rather than traditional VFX workflows, brands can use lens templates to unlock a wider range of creative possibilities. It also democratizes AR marketing for SMBs.


Flip the Script

That brings us to the second point of evolution noted above: the AR user experience itself. Most lenses to date apply a graphical element that interacts with a physical scene – such as a user’s face. This has always been appealing because face fodder is whimsical, fun, and viral.

But Snap’s efforts to merge generative AI and AR effectively flip the script. In other words, augmentations aren’t things that go on your face or in your scene… but become the scene itself. This happens as a gen AI composite of a user’s selfie is placed into a gen AI environment.

One point of appeal in this approach is that it carries the vanity that drives engagement around face filters. But it does so in a way that has more creative depth and breadth. Users’ AI composites can be placed within – and interact with – fun environments from wineries to ski resorts.

That brings us back to brand marketers. All of the above means a wider variety of creative engagements they can build. After all, with selfie lenses, there’s only so much that can go on your face. But broadening lenses to scenes and surroundings opens things up.

Put another way, all the above talk about creative range boils down to one key metric for Snap: total addressable market. Its TAM just got a lot bigger, as a wider range of brands will be drawn to the ability to create fun ways for their products and consumers to dance together.
