There’s no doubt that Snap is all in on AR. The technology has fueled its revenue growth over the past few years, and that revenue in turn funds further AR investment in its platform. Snap has internalized that feedback loop and continues to double down on AR. This remains its North Star.
Fueling that feedback loop are Snap’s AR usage milestones. It now has 250+ million daily lens users who engage 6 billion times per day collectively, and 5 trillion times cumulatively. This all comes from a universe of 300,000 creators who have developed more than 3 million lenses.
Snap knows that all of the above is propelled by lenses themselves, which get built by all those creators. So the name of the game is to keep them well-fed with AR tools. And that’s what Snap Partner Summit (SPS) is all about, including a parade of announcements and new capabilities.
One theme at this year’s SPS was reaching beyond the walls of Snapchat. This includes scaling lenses by bringing them to various other environments. So for this week’s XR Talks, we break down the AR highlights, including the full keynote video and summarized takeaways.
Spaces & Screens
One of the places that Snapchat lenses are expanding is sporting events. Starting in Snapchat’s hometown, the LA Rams’ SoFi Stadium will use Snap Lenses on its jumbotron “Infinity Screen.” Lenses will be used to enliven fan footage in playful ways throughout a given game.
Via Snap’s Camera Kit SDK, lenses will also expand into other types of events, namely concerts. Building on its relationship with Live Nation, Snap will create custom lenses and offer them to fans in and around 16 shows and festivals, including Lollapalooza and Bonnaroo.
As Snap recently announced, lenses will also come to Microsoft Teams calls. During remote calls and meetings, Teams users can apply lenses, including their Bitmoji. This can add playfulness or utility, the latter applying to situations when you’re not “camera ready” for a work call.
Lastly, Snap will expand into new hardware. Through retail partners and customers of ARES (Snap’s new AR Enterprise Services SaaS offering), it will offer smart mirrors that bring try-on lenses to retail dressing rooms. A shoe-oriented version will sit close to the ground, angled upward, for virtual shoe try-ons.
This brings several advantages to retailers, including streamlined inventory: shoppers can try on a vast range of styles and colors without needing physical items on hand. Similarly, sneakerheads can design custom shoes, try them on virtually, then order them.
Panning back, all these lens expansions mark an evolution in Snap’s AR business. By moving beyond the walls of Snapchat itself, Snap can scale up its addressable market. This is all about finding new users, use cases, and outlets (read: sponsored lens inventory) for AR engagement.
AI Integrations

Besides lens expansion to new spaces and screens, the other big AR news out of SPS is AI integration. Jumping on the recent advancements and excitement in generative AI, Snap is merging the technology with AR. This plays out in a few ways that boost lenses.
For example, it will begin rolling out AR lenses that are powered by generative AI. The first of these was introduced at SPS in the form of a “Cosmic Lens” that turns your face and surroundings into an animated sci-fi scene. Several other generative AI lenses will follow in the near term.
Snap’s previously released My AI has also come out of beta and is now available to everyone. This conversational AI chatbot can field questions on topics such as nearby businesses and activities. As for AR integrations, it can suggest the best lenses to use in a given scenario.
This is the beginning of Snap’s AI integrations, as AR and AI will continue to converge in several ways. These include AR lenses that are generated on the fly, as well as generative tools that let lens creators streamline their workflows. All these possibilities fit well with Snap’s AR efforts.
“AI and AR are deeply interconnected, powering a new spectrum of creative possibilities,” Snap CTO Bobby Murphy said from the keynote stage. “Our vision is to weave computing seamlessly into the world — and support our vibrant community of AR creators, developers, and partners.”
See the full keynote below…