“Trendline” is AR Insider’s series that examines trends and events in spatial computing, and their strategic implications. For an indexed library of spatial computing insights, data, reports and multimedia, subscribe to ARtillery PRO.


Announcements at WWDC this week continue to paint a picture of Apple’s long-term AR play. As we examined, that includes AirPods Pro spatial audio — signaling a continued march towards audio AR, and a wearables suite that carries different flavors of sensory augmentation.

But more closely related to AR and its common graphical connotations, Apple announced GeoAnchors for ARKit 4. These evoke AR's location-based potential by letting users plant and discover spatially-anchored graphics that persist across sessions and users.
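For developers, this surfaces as a new anchor type tied to a real-world coordinate. Here's a minimal sketch of planting one, assuming ARKit 4's geo-tracking configuration; the coordinate and rendering setup are illustrative, not from Apple's announcement:

```swift
import ARKit
import CoreLocation

// Minimal sketch: run geo tracking and plant an anchor at a coordinate.
// The coordinate is illustrative; rendering content at the anchor is
// left to the app's renderer (e.g., RealityKit or SceneKit).
func plantGeoAnchor(in session: ARSession) {
    guard ARGeoTrackingConfiguration.isSupported else { return }
    session.run(ARGeoTrackingConfiguration())

    let coordinate = CLLocationCoordinate2D(latitude: 37.7956, longitude: -122.3935)
    let anchor = ARGeoAnchor(coordinate: coordinate) // altitude resolved by ARKit
    session.add(anchor: anchor)
}
```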

AR proponents and enthusiasts will recognize this for its AR cloud underpinnings: to overlay location-relevant graphics, devices must first understand a scene and localize themselves. That can happen by mapping the contours of the scene in real time, or by drawing on previously-mapped spatial data.

Google’s answer to this challenge is to utilize imagery from Street View as a visual database for object recognition so that AR devices can localize. That forms the basis for its storefront recognition in Google Lens, as well as AR-infused urban navigation in its Live View feature.


Look Around

Back to GeoAnchors: they’ll similarly tap into Apple’s Street View-like “Look Around” feature, including the purpose-built point clouds we theorized about at its launch. GeoAnchors can use this data to localize a device before rendering spatially-anchored AR graphics in the right place.

Also similar to Google Lens, Apple will use a combination of data sources. Image recognition via Look Around’s visual database is just one. Other inputs will likely include a device’s position (via GPS), where it’s pointing (compass), and how it’s moving (IMU).
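ARKit fuses those signals internally, but each one comes from a familiar iOS framework. A sketch of where the inputs originate, for illustration only:

```swift
import CoreLocation
import CoreMotion

// Illustrative only: ARKit consumes these signals itself. This just
// shows the source of each input listed above.
final class PoseSignals: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private let motionManager = CMMotionManager()

    func start() {
        locationManager.delegate = self
        locationManager.requestWhenInUseAuthorization()
        locationManager.startUpdatingLocation()  // position (GPS)
        locationManager.startUpdatingHeading()   // where it's pointing (compass)

        // How it's moving (IMU): rotation rate and acceleration
        motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
            guard let motion = motion else { return }
            print("IMU: \(motion.rotationRate) | \(motion.userAcceleration)")
        }
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let location = locations.last else { return }
        print("GPS: \(location.coordinate), ±\(location.horizontalAccuracy)m")
    }

    func locationManager(_ manager: CLLocationManager, didUpdateHeading newHeading: CLHeading) {
        print("Compass: \(newHeading.trueHeading)°")
    }
}
```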

This combination of inputs balances localization accuracy with data efficiency. Spatial maps and point clouds carry data-heavy payloads, so an AR device (in this case, an iPhone) can selectively pull just-in-time data based on where it is. This is a core AR cloud principle.
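To make that principle concrete, here's a hypothetical sketch (not an Apple API) of keying heavy point-cloud data by coarse location tiles, so a device downloads only the tile it's standing in:

```swift
import CoreLocation

// Hypothetical tile scheme: fetchTile stands in for a network call to
// a spatial-data service. None of this is an Apple API.
struct TileKey: Hashable {
    let x: Int
    let y: Int

    init(_ coordinate: CLLocationCoordinate2D, tileDegrees: Double = 0.001) {
        x = Int((coordinate.longitude / tileDegrees).rounded(.down))
        y = Int((coordinate.latitude / tileDegrees).rounded(.down))
    }
}

var tileCache: [TileKey: Data] = [:]

func spatialData(near coordinate: CLLocationCoordinate2D,
                 fetchTile: (TileKey) -> Data) -> Data {
    let key = TileKey(coordinate)
    if let cached = tileCache[key] { return cached }  // already on device
    let tile = fetchTile(key)                         // download just this tile
    tileCache[key] = tile
    return tile
}
```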

Speaking of which, GeoAnchors round out AR cloud initiatives from the tech giants. Google has the above-mentioned efforts. Facebook has Live Maps and ongoing acquisitions to tie it together. And Snap revealed its AR cloud play last week, which will crowdsource data from snaps.

Speaking of crowdsourcing, Niantic/6d.ai has a similar approach where roaming Pokémon Go players map the world as they play. We’ll continue to see efforts evolve from these and other players, each mapping (excuse the pun) to their strengths and “follow the money” goals.


Planting Seeds

Back to Apple: it will roll out GeoAnchors in select cities, logically the same cities where Look Around is active. But what it’s really doing, in line with Doug Thompson’s astute theory on Apple’s diabolical plan, is planting seeds by motivating developers to start building things.
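That city-by-city rollout is visible to developers too: apps can ask ARKit whether geo tracking is available at the user's current location, and fall back gracefully elsewhere. A brief sketch:

```swift
import ARKit

// Ask ARKit whether geo tracking works where the user is standing.
func checkGeoTracking() {
    ARGeoTrackingConfiguration.checkAvailability { isAvailable, error in
        if isAvailable {
            // Supported city: safe to run ARGeoTrackingConfiguration.
        } else {
            // Outside the rollout: fall back to standard world tracking.
            print("Geo tracking unavailable: \(error?.localizedDescription ?? "unsupported location")")
        }
    }
}
```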

Meanwhile, other Apple initiatives play into the same long-term vision. As we’ve examined, Codename: Gobi will sit at the center of an effort to plant QR codes throughout retail partner locations. These will trigger AR information and discounts when activated by an AR camera.
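Gobi's trigger mechanism isn't public, but the generic pattern is straightforward: detect a code in the camera feed and map its payload to AR content. A sketch using Apple's Vision framework, with the payload handling left hypothetical:

```swift
import Vision

// Generic QR detection on a camera frame; the payload callback is
// where an app would look up AR content or a discount (hypothetical).
func detectQRCodes(in pixelBuffer: CVPixelBuffer,
                   onPayload: @escaping (String) -> Void) {
    let request = VNDetectBarcodesRequest { request, _ in
        for case let barcode as VNBarcodeObservation in request.results ?? [] {
            if let payload = barcode.payloadStringValue {
                onPayload(payload)
            }
        }
    }
    request.symbologies = [.qr] // QR codes only
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}
```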

This essentially means Apple has dual tracks to AR. World-immersive AR cloud activations represent one track, while Gobi represents a marker-based approach. The latter is more rudimentary, but also practical and user-friendly for mainstream consumers and retailers.

Closely related to this are App Clips, announced at WWDC. These atomize functions traditionally housed in full-blown apps and make them accessible on the fly. So real-world activities like pre-paying a parking meter will no longer require downloading the associated app.

Instead, QR codes — we believe the same ones deployed for Gobi — will let users scan to access mini-app functionality to fill that parking meter… or other use cases that developers come up with. Apple’s halo effect and revenue incentives will drive brand and retailer adoption.
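Mechanically, an App Clip receives the URL encoded in the scanned code as a user activity. A minimal SwiftUI sketch; the parking-meter URL scheme and query parameter are assumptions for illustration:

```swift
import SwiftUI

// Sketch of an App Clip view picking up its invocation URL.
// The "id" query parameter is illustrative, not an Apple convention.
struct MeterView: View {
    @State private var meterID: String?

    var body: some View {
        Text(meterID.map { "Pay meter \($0)" } ?? "Scan a code to begin")
            .onContinueUserActivity(NSUserActivityTypeBrowsingWeb) { activity in
                // Invocation URL, e.g. https://example.com/meter?id=1234
                guard let url = activity.webpageURL,
                      let items = URLComponents(url: url, resolvingAgainstBaseURL: false)?.queryItems
                else { return }
                meterID = items.first { $0.name == "id" }?.value
            }
    }
}
```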


Lead Time

As this happens, App Clips QR codes will begin to populate high-traffic physical spaces. Practical, everyday uses will drive adoption and participation from retailers, brands, and developers. But just like Apple Maps’ Look Around, the effort will support longer-term AR outcomes.

In other words, with QR codes populating stores, public spaces and points of interest, Apple will have an installed network of AR activation triggers. Rolling it out in this sequence will allow time for the physical infrastructure to expand, while mobile AR adoption itself gradually grows in step.

Speaking of lead time and long-term thinking, the other wild card is Apple’s AR glasses, which will piggyback on all of the above. Apple wants the infrastructure in place by the time its glasses launch in the next few years, so that compelling use cases are ready to go.

Importantly, this applies to both users and developers. Similar to our longstanding narrative around ARKit’s very existence, Apple’s intent with much of the above is to acclimate users and get the AR demand juices flowing, while getting developers to start thinking spatially and building things.

All of the above could give AR glasses a fighting chance for consumer adoption, and Apple has a lot riding on that. Fortunately, its signature halo effect will also lend a hand. If anyone can pull this off — including planting QR codes across the physical world — it’s probably Apple.


Header image credit: TechCrunch
