There have always been divergent paths in AR development and, as with other software, that forces developers to make choices. Beyond the choice of AR creation platform – everything from Snap’s Lens Studio to Meta Spark – there’s a broader decision about native apps versus web AR.
Apps are more polished and functional, but the end-user experience requires activation energy. That includes waiting for an app to download, which doesn’t align with AR’s serendipitous real-world encounters. AR is too early and unproven to create additional friction for itself.
Web AR, conversely, offers easier activation. All you need is a mobile browser, a fairly modern smartphone, and a link or QR code. It trails native apps in functionality, but that gap is closing thanks to the work of innovators and proponents – most notably Niantic’s 8th Wall.
But a third path has emerged: App Clips. This is Apple’s standard – along with Google’s counterpart Instant Apps – that lets users launch app-like experiences without the download. This is much broader than AR, but could end up showing the most value as an AR delivery system.
So how do App Clips work? First launched at WWDC in 2020, they atomize functions traditionally housed in full-blown apps and launch them on the fly. They do this by downloading a temporary micro engine that runs an intended function in a quick and lightweight way.
This means that real-world activities and serendipitous encounters don’t pass you by while you wait for an app to download. There’s also a convenience factor in random activities like paying a parking meter. Again, the standard wasn’t created for AR, but AR could see the most benefit.
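To make the mechanics concrete: an App Clip is triggered by an invocation URL (encoded in a QR code, App Clip Code, NFC tag, or link), and the system hands that URL to the clip so it can jump straight to one function. Below is a conceptual sketch of that routing in Python – not Apple’s API (a real App Clip would read the URL from an `NSUserActivity` in Swift), and the path and parameter names (`pay-meter`, `meter`) are hypothetical examples, not part of any spec.

```python
from urllib.parse import urlparse, parse_qs

def parse_invocation(url: str) -> dict:
    """Conceptual sketch: map an App Clip invocation URL to one
    atomized function plus its parameters, skipping a full app."""
    parsed = urlparse(url)
    # Flatten query parameters, keeping the first value of each key.
    params = {key: values[0] for key, values in parse_qs(parsed.query).items()}
    # The path segment names the single function to launch (hypothetical naming).
    return {"action": parsed.path.strip("/"), **params}

# e.g. a QR code on a parking meter encoding its ID:
print(parse_invocation("https://example.com/pay-meter?meter=1042"))
# → {'action': 'pay-meter', 'meter': '1042'}
```

The point of the sketch is the delivery model: everything needed to launch the right experience travels in the URL itself, so nothing has to be installed before the encounter.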
One company that’s beginning to demonstrate that potential is Adobe. Its Aero creation engine resides within the pervasive Creative Cloud and lets users build AR animations and interactions in a low-code way. That Creative Cloud positioning could massively accelerate AR development.
As for App Clips, Adobe has chosen them as one way to distribute AR experiences. This has been built into the Aero workflow so that creators can not only build but also distribute their AR experiences in low-friction ways. That includes generating a QR code for App Clip launches.
This was on display during a recent AWE Nite SF. Adobe’s Ben Sax and James Zachary demonstrated the end-to-end process of creating an AR experience, then launching it as an App Clip. You can see the end result, including a QR code to launch the App Clip, here.
Back to AR’s two-party system (apps and web AR), the question is whether App Clips will emerge as the best of both worlds. There are some downsides, such as heat and power consumption (just like AR apps). And will the market be fragmented between App Clips and Google’s Instant Apps?
These matters aside, App Clips could offer advantages, which raises the question of why they’re not more prevalent. The answer has a lot to do with warring factions in the AR world. Most creation platforms are tied to an associated delivery network, such as Lens Studio (Snapchat) and Meta Spark (Facebook and Instagram).
Adobe, being independent of those networks, has chosen App Clips, which makes the most sense for its particular positioning. But we could see others adopt App Clips more and more. Our money is on Snap, as it seeks AR growth through channels that extend beyond its own walls.
App Clips could also find fertile soil with AR glasses. That’s partly due to their alignment with AR glasses use cases that require robust yet fast-launching experiences. It’s also partly due to the influence wielded by App Clips’ creator, Apple. The same could be true for Google’s Instant Apps.
But that future is years away. Apple’s nearer-term play will likely be VR (including passthrough AR) before we see any sort of everyday smart glasses. But if and when the latter materializes, App Clips could be a go-to delivery system. Apple has from now till then to plant the right seeds.