Snap announced Q4 earnings late yesterday, including 10 percent year-over-year revenue growth to $1.7 billion. Net income was $45 million, and adjusted EBITDA was $358 million. Altogether, it was a strong quarter that beat estimates and has seen a positive market reaction so far.

Explicitly credited with driving that revenue growth were Snap's continued AR efforts. Today, that means mobile AR lenses – including 8 billion daily lens engagements – and the monetization that flows from them. The latter involves AR lenses that let brands market themselves in creative and immersive ways.

But beyond that business story, Snap is in the midst of an expansion from handheld to headworn AR. This includes its highly anticipated consumer-geared Spectacles, scheduled to launch later this year, as well as its newly formed and wholly owned Specs Inc. subsidiary.

As we examined last week, the key driver behind Specs Inc. is focus. That focus materializes both operationally and in capitalization. Snap wants to separate its investor messaging around its core business from that of its faceworn future – two inherently different narratives.

Some pundits pegged Specs Inc. as a hedge, letting Snap commit to Spectacles but not in ways that would bring Snap Inc. down if the effort fails. If true, there’s nothing wrong with that approach, as it demonstrates confidence in Spectacles while formulating a smart capitalization model.


Highlight Reel

With that backdrop and macroenvironmental context, what were the AR highlights from Snap’s Q4 earnings call? We listened and extracted AR-relevant segments to save time for AR Insider readers. Here they are in no particular order, with all quotes attributed to Evan Spiegel.

General AR Positioning (prepared comments)

Our long-term vision for augmented reality extends beyond the smartphone to a future when computing is more natural, contextual, and seamlessly integrated into the real world. For more than a decade, we have invested in building a platform that brings digital experiences closer to how people see, move through, and interact with their everyday environments. Specs are central to this vision. After 5 generations of development and refinement, we plan to launch Specs publicly in 2026, which we believe represents a significant step forward in human-centered computing and the evolution of our AR platform. As we prepare for launch, we have continued to strengthen both the platform and the ecosystem that is designed to support adoption at scale.

We began testing Snap Cloud powered by Supabase to make advanced back-end capabilities more accessible within Lens Studio, enabling developers to build richer, more dynamic AR experiences. We also announced that all lenses built today for Spectacles will be compatible with Specs at launch, providing continuity and scale for developers from day one. Partners and developers are already building compelling AR experiences that demonstrate the breadth of what is possible on Specs. Star Wars: Holocron Histories from ILM is now live on Spectacles, highlighting the power of smart glasses for immersive storytelling with one of the world’s most beloved franchises. This experience showcases the studio’s continued innovation, technology, and platforms through an extension of the Star Wars Galaxy.

In addition, developer Harry Banda created Card Master, a multiplayer AR card game that lets players face AI opponents in classic card games with tutorials and achievements, evolving into a broader suite of AR card experiences for Specs. We believe Snap is uniquely positioned to lead the next wave of spatial computing. With Snap OS 2.0, Lens Studio, Snap Cloud, and a global developer ecosystem, we have built an end-to-end AR platform spanning software, tools, and hardware. Together, these capabilities position us to deliver fully stand-alone human-centered eyewear that expands creative expression and unlocks new ways for people to engage with the world around them.

AI Integrations with Lenses & Lens Studio (prepared comments)

We’re enhancing our camera with AI-powered capabilities that make creation more intuitive, dynamic, and social. Recent breakthroughs in our proprietary models allow us to deliver high-quality generative AI camera experiences efficiently at scale by running our models on device. AI-driven lenses represent a meaningful evolution from traditional lenses, shifting the experience from applying a fixed set of visual overlays to creating images and scenes dynamically through generative AI. Snapchatters can now prompt, explore, and co-create personalized content in real time, and this shift is already resonating with our community. More than 700 million Snapchatters have engaged with generative AI Lenses more than 17 billion times, often discovering and sharing these lenses through conversations with friends and family.

Our Imagine Lens, launched in September, has already been engaged with nearly 2 billion times, highlighting strong early traction and repeat usage. This momentum is supported by a global creator and developer ecosystem that is unmatched in scale. More than 450,000 creators from nearly every country have built over 5 million Lenses using our industry-leading AR and AI tools, helping ensure that camera experiences remain fresh, relevant, and closely aligned with how our community builds relationships. Sharing Snaps with friends and family remains the foundation of Snapchat and a core driver of engagement, retention, and long-term value creation. Our platform is designed around visual communication that enables frequent interactions and helps our community maintain close relationships over time.

Specs Drilldown (during Q&A segment)

We’re super excited about what’s ahead this year with the launch of Specs and, obviously, graduating from the R&D phase of Specs to broader consumer adoption. In preparation for that, we’ve been working on several prior versions of Specs, including, most recently, the version released in 2024 to developers, who can subscribe to Specs and start building Lens experiences.

We’ve seen some people build really spectacular things, whether it’s utilities or new educational tools – for example, an at-home chemistry lab you can have in augmented reality – to even some of the more interesting work we’ve been doing with the browser and the ability to stream video on a virtual screen grounded in the real world through your glasses. So it’s been really exciting to see all the new use cases that developers are building for Specs with the current version released back in 2024. And those will be able to run on the forthcoming version of Specs to be released later this year.

So I think we’ll be able to launch with a really wide variety of compelling experiences, which I think is so important for the early success of a product like this. And we’re just really focused on getting it into the hands of early adopters. We’re so fortunate to have this passionate base of developers, hundreds of thousands of developers who’ve used Lens Studio to build lenses. And I think they’re really excited about this forthcoming product. So really trying to engage them and early adopters with Specs later this year is super exciting.

And I think as we look out to future generations of the product through the end of this decade, we’ve got a really clear path here to lightweight, affordable, and incredibly powerful glasses that can deliver immersive experiences in the real world.

AR’s Future (during Q&A segment)

To take a step back on why we started working on Specs in the first place: when we invented Snap and worked on things like ephemeral messaging, Stories that put content in chronological order, or even opening to the camera, our work was really designed to make computing and smartphones feel more human. And we think that plays a really important role in connecting people with their friends and their family. But we also saw a lot of limitations of the smartphone and of computers. And I think today, people are spending something like 7 hours a day in front of a screen.

And so I think there is, at this moment, a real opportunity to change what the computer is: instead of something that you’re constantly operating with a keyboard and a mouse, it becomes something that, now powered by AI, can actually get work done for you. And so in that way, it’s really a continuation of this vision of working to make computing more human for folks. And so I think now that we are exiting the R&D phase of Specs development, there are a couple of important things.

One is developing a strong stand-alone brand. I think Specs, the product itself, in many ways appeals to a different audience segment than the core Snapchat audience, and it’s going to be really important for us to develop a stand-alone brand identity for Specs. And then I think longer term, as we look at the rollout and broader deployment of Specs, there may be opportunities to raise additional capital to accelerate – balancing that, obviously, with our own ownership interest and any potential dilution. So I think right now, given that we’re so close to launch, the key here is really just nailing the launch and making sure that we deliver an extraordinary product.

And then I think we have a lot of flexibility to think about how we want to capitalize it moving forward.

More to Come

So there you have it. We’ll be watching Snap and the rest of the AR market closely, as always, to see which aspirations turn into accomplishments in Q1 and FY 2026.