Snap today rolled out the latest feature in its Lens Studio AR creation platform: ray tracing. For those unfamiliar, the technique enables more realistic AR, including more accurate color and lighting. By building it into Lens Studio, Snap continues its mission to democratize advanced AR.
Going deeper, ray tracing is often used in advanced digital production such as console gaming, where it creates realistic lighting that bounces off in-game objects. Applying it to AR achieves a similar effect, helping digital objects show realistic light reflections.
It does this by applying an algorithm that traces the path a beam of light would take in the physical world. Using this technique, a platform like Lens Studio can determine the source of light in a given environment, then simulate light direction and intensity to render reflections accurately.
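To make the idea concrete, here is a minimal toy sketch of that core loop in Python: cast a ray, find where it hits a surface, then use the light's direction and the surface normal to estimate how bright the reflection should be. This is an illustration of the general technique only, not Snap's implementation; all names and values are hypothetical, and real engines layer on specular highlights, shadows, and environment maps.

```python
import math

def normalize(v):
    """Scale a 3D vector to unit length."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    """Dot product of two 3D vectors."""
    return sum(x * y for x, y in zip(a, b))

def intersect_sphere(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray never touches the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def shade(origin, direction, center, radius, light_dir):
    """Trace one ray; return a 0..1 brightness from Lambertian reflection."""
    t = intersect_sphere(origin, direction, center, radius)
    if t is None:
        return 0.0  # ray misses the object: background
    hit = tuple(o + t * d for o, d in zip(origin, direction))
    normal = normalize(tuple(h - c for h, c in zip(hit, center)))
    # Lambert's law: brightness scales with the angle between the
    # surface normal and the direction toward the light source.
    return max(0.0, dot(normal, normalize(light_dir)))

# One ray fired straight at a unit sphere, lit from the upper left.
brightness = shade((0, 0, -3), (0, 0, 1), (0, 0, 0), 1.0, (-1, 1, -1))
print(round(brightness, 3))
```

A real renderer repeats this per pixel and per light bounce; the point here is just the "trace, hit, shade" sequence the paragraph above describes.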
Use Your Illusion
The result is that AR objects can reflect a given environment's true light. This is a key step because, though AR can render objects with dimensional accuracy, lighting can make or break a true sense of presence. If the digital object doesn't share the environment's lighting, it doesn't look right.
This is analogous to the way that lighting can make or break realism in the use of green screens in live-action film production. Though a fake background may look right in every other way, if the light shining on the actor doesn’t match the background light, it breaks the illusion.
Back to AR, what does Lens Studio gain from ray tracing? Beyond more realistic lens rendering in a variety of environments, it could broaden the platform’s capabilities to shine (literally) with certain surfaces or textures. We’re talking things that are metallic, glass, stone, etc.
To that end, making lenses more realistic is great for user engagement, but it could really pay dividends with another constituency: brand marketers. Rendering the surface of that Rolex watch with greater luster could boost the brand’s incentive to jump into paid AR marketing.
The same goes for other luxury goods, from sports cars to diamonds. In fact, Snap is already gaining traction at the high end. Tiffany & Co. is the first brand out of the gate to utilize the new ray tracing functionality, in its “Lock Lens.” See it in action using this snapcode.
Back to the part about democratizing advanced AR, this has been Snap’s core mission with Lens Studio since its inception. Making AR lenses more functional and easier to create is a key component of Snap’s drive to bring more creators onto the platform and keep them active.
This is a core priority at Snap because it knows that creators kick off the virtuous cycle that ends in revenue. In short, creators build content that drives users and usage. Those lifts in engagement attract more creators, who attract more users… who ultimately attract brand advertisers.
That last part is how Snap’s AR efforts make money, in the form of sponsored lenses that offer paid amplification. This is a growing piece of Snap’s ad revenue mix, and one that Evan Spiegel continues to emphasize. He has sent the message loud and clear that Snap is all in on AR.
Back to the virtuous cycle, all of that starts with creators, who in turn are attracted by Lens Studio capabilities. Snap has internalized this cycle, which is why you see rapid product release cycles for Lens Studio, spanning AR functionality, a la ray tracing, as well as non-AR functions.
The latter includes networking tools to find paid work, and analytics to refine and optimize lenses. Expect more such updates at a rapid clip as Snap continues to evolve Lens Studio. It’s all about staying ahead in the AR platform wars, including new competition from TikTok.