“Trendline” is AR Insider’s series that examines trends and events in spatial computing, and their strategic implications. For an indexed library of spatial computing insights, data, reports and multimedia, subscribe to ARtillery PRO.

AR continues to hold lots of potential as a new visual front end to our computing lives. That includes use cases that range from gaming to product visualization. But one increasingly evident direction where AR is headed — and aligned with the return to “normal” — is local commerce.

As we discussed with M7 Innovations, this can play out through AR lenses that contextualize products in retail and QSR settings. AR is also inherently conducive to local commerce: its geo-anchoring capabilities provide a foundation for geo-relevant local search and discovery.

This is a key principle behind the AR cloud: a persistent data layer that’s anchored to the physical world to activate AR graphics. For example, Google wants to build an Internet of places — revealed through Google Lens — by indexing the physical world just like it indexed the web.

Payoffs for this vision include monetization potential — through advertising, affiliate revenue, or other models — to facilitate local offline commerce. It’s often forgotten that brick-and-mortar commerce (at least in normal times) accounts for a commanding majority of consumer spending.

Transactional Layer

All of the above has been top of mind, but it was underscored further in announcements from Snap and Apple at their respective developer conferences. Amidst lots of other AR and non-AR feature launches, there were some notable parallels in each company’s AR commerce ambitions.

Starting with Snap, its Local Lenses will let developers create geo-anchored persistent content that Snap users can discover through the camera interface. The company claims that this will also include the ability for users to leave persistent AR graphics for friends to discover.

The use case that Snap has promoted is more about fun and whimsy, such as painting the world with expressive digital graffiti (see video below). But in a more practical sense, this could lead to user-generated info (think Yelp reviews and ratings) overlaid on local storefronts.

One potential outcome is something we’ve been calling visual SEO. Just as local businesses apply rigor to local listings management for web search, a sub-sector could develop around optimizing data to show up prominently and correctly in AR and visual search experiences.

Separate Snap developments could tie in. For example, Snap Minis are new HTML5-based apps that will live in Snapchat’s Chat section and offer micro-functionality and utilities. The concept is similar to Apple’s subsequently released App Clips (more on that in a bit).

Launch partners include Coachella (coordinate and plan a festival experience); Headspace (launch meditation sessions and send them to friends); and Movie Tickets by Atom (choose showtimes, watch trailers, buy tickets). These demonstrate the range of lightweight, task-focused utilities Minis can support.

Minis could also be developed to discover, plan, and transact local activities such as dining out; the model is what WeChat’s mini programs have done in China. Tying this back to AR, Minis could provide the transactional layer that flows from geo-anchored Local Lenses experiences like those above.

Atomizing Commerce

Moving on to Apple, it similarly continues to show AR aspirations. The latest is GeoAnchors for ARKit, announced at WWDC. These evoke AR’s location-based potential by letting users plant and discover spatially anchored graphics that persist across sessions and users.

Similar to Google’s Live View AR navigation, this will tap into Apple’s Street View-like “Look Around” feature. GeoAnchors can use this data to localize a device before rendering spatially anchored AR graphics in the right place. This will be Apple’s version of an AR cloud.

But the other key parallel is how, like Snap, these front-end AR experiences could tie into the transactional infrastructure that Apple is separately building. Apple’s new App Clips will atomize functions previously housed in full-blown apps and make them more accessible on the fly.

Activities like pre-paying a parking meter will no longer require downloading the associated app. Instead, users will scan a QR code to access the mini-app functionality that fills the meter, one of many possible use cases that developers and Apple’s retail partners may devise.

This ties back to AR because separate indications around “Project Gobi” suggest that Apple will plant QR codes at retail partner locations (think Starbucks), which will activate the smartphone camera to overlay informational AR graphics such as product details or promotions.

In that way, the same infrastructure could serve both AR experiences and transactional functionality in App Clips. This would put real utility behind AR, which the technology needs. Meanwhile, Apple’s halo effect and revenue incentives could drive brand and retailer adoption.

Of course, a lot of this is speculative in terms of Snap and Apple’s intentions. But the evidence is stacking up. And though the current state of the world isn’t very conducive to local offline commerce, Apple and Snap could be planting seeds for AR’s role in local commerce’s return.

Header image credit: Apple
