As you likely know, one of AR’s foundational principles is to fuse the digital and physical. The real world is a key part of that formula… and real-world relevance is often defined by location. That same relevance and scarcity are what drive real estate value… location, location, location.

Synthesizing these factors, one of AR’s battlegrounds will be augmenting the world in location-relevant ways. That could be wayfinding with Google Live View or visual search with Google Lens: point your phone (or future glasses) at places and objects to contextualize them.

As you can tell from the above examples, Google will have a key stake in this “Internet of Places.” But it’s not alone. Apple signals interest in location-relevant AR through its geo-anchors and Project Gobi. Facebook is building “Live Maps,” and Snapchat is pushing Local Lenses.

These are a few of the utilitarian, commerce, and social angles. How else will geospatial AR materialize? And what are its active ingredients, including 5G and the AR cloud? This is the theme of our new series, Space Race, where we break down who’s doing what… continuing here with Apple.

AR’s ‘Space Race’ Revs Up

Part II: Apple 

Apple’s AR play — as with most endeavors — is to sell hardware. That makes its AR ambitions less related to location-based content and monetization, relative to other AR players. For example, the last installment of this series examined Google’s natural “Internet of Places” play.

But that doesn’t mean Apple lacks geospatial components in its AR master plan. For one, ARKit GeoAnchors evoke AR’s location-based potential by letting users plant and discover spatially-anchored graphics that persist across sessions and users.
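For developers, the entry point is ARKit 4’s geo-tracking APIs. A minimal Swift sketch of planting one of these anchors (the coordinate below is illustrative, not from this article):

```swift
import ARKit
import CoreLocation

// Minimal sketch: run a geo-tracking session and plant an anchor
// at a real-world coordinate (requires iOS 14+ / ARKit 4).
let session = ARSession()
session.run(ARGeoTrackingConfiguration())

// Illustrative coordinate only.
let coordinate = CLLocationCoordinate2D(latitude: 37.7956, longitude: -122.3936)
let anchor = ARGeoAnchor(coordinate: coordinate)
session.add(anchor: anchor)

// Once ARKit localizes the device against Apple's map data, the
// anchor resolves to a position in world space; attach content in
// ARSessionDelegate's session(_:didAdd:) or a renderer callback.
```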

AR enthusiasts will recognize this for its AR cloud underpinnings: to overlay location-relevant graphics, a device must first understand the scene and localize itself. That can happen by mapping the scene’s contours on the fly or by referencing previously-mapped spatial data.

GeoAnchors will do this by tapping into Apple’s Street View-like “Look Around” feature, including purpose-built point clouds, as we theorized at its launch. GeoAnchors can use this data to localize a device before showing the right spatially-anchored AR graphics in the right places.
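Because that localization depends on Apple having mapped a given area, ARKit lets apps check coverage before committing to geo tracking. A sketch of that gate, assuming a fallback to plain world tracking:

```swift
import ARKit
import Foundation

// Geo anchors only resolve where Apple has mapped coverage, so
// check device support and local availability before running.
func startGeoTracking(on session: ARSession) {
    guard ARGeoTrackingConfiguration.isSupported else {
        return // requires recent hardware (A12+ chip and GPS)
    }
    ARGeoTrackingConfiguration.checkAvailability { available, error in
        DispatchQueue.main.async {
            if available {
                session.run(ARGeoTrackingConfiguration())
            } else {
                // No coverage at this location: fall back to
                // plain world tracking without geo anchors.
                session.run(ARWorldTrackingConfiguration())
            }
        }
    }
}
```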

Look Around

If this sounds familiar, it’s because it echoes Google’s geospatial AR efforts. As we recently examined, Google uses imagery from Street View and other sources as a visual database for object recognition so AR devices can localize. This is how storefront recognition works in Google Lens.

In addition to Look Around’s visual database, Apple will localize AR devices using other data sources, including position (via GPS), directional heading (compass), and movement (IMU). Because Apple owns the entire tech stack in iPhones, it has sensor fusion on its side.
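To make those inputs concrete, here’s a minimal Swift sketch of tapping position, heading, and motion with CoreLocation and CoreMotion. The fusion itself happens inside Apple’s stack; this only shows the raw signals an app can observe:

```swift
import CoreLocation
import CoreMotion

// Sketch of the raw signals: GPS position, compass heading, and
// IMU motion. ARKit fuses these with camera imagery internally.
final class SensorInputs: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private let motionManager = CMMotionManager()

    func start() {
        locationManager.delegate = self
        locationManager.requestWhenInUseAuthorization()
        locationManager.startUpdatingLocation()  // position (GPS)
        locationManager.startUpdatingHeading()   // heading (compass)
        motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
            guard let motion = motion else { return }
            // movement (IMU): rotation rate and user acceleration
            _ = (motion.rotationRate, motion.userAcceleration)
        }
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        // Latest GPS fix (coarse; visual localization refines it).
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateHeading newHeading: CLHeading) {
        // Compass heading in degrees from north.
    }
}
```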

This combination of inputs also enables power and data efficiency. Spatial maps and point clouds carry data-heavy payloads, so an AR device (in this case an iPhone) can selectively access just-in-time data based on where it is. This will also tap Apple’s spatial mapping efforts.
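Apple hasn’t published how this retrieval works. Purely as a hypothetical sketch, just-in-time loading could amount to quantizing a GPS fix into a tile key and fetching only that tile’s localization data (all names and the tiling scheme below are invented for illustration):

```swift
import CoreLocation

// Hypothetical sketch only: Apple's actual tiling scheme, cache,
// and endpoints are not public.
struct TileKey: Hashable {
    let latIndex: Int
    let lonIndex: Int
}

// Quantize a GPS fix into a coarse tile (~0.01 degrees, roughly 1 km).
func tileKey(for coordinate: CLLocationCoordinate2D) -> TileKey {
    TileKey(latIndex: Int(coordinate.latitude / 0.01),
            lonIndex: Int(coordinate.longitude / 0.01))
}

var tileCache: [TileKey: Data] = [:]

// Fetch localization data only for the tile the device occupies,
// instead of downloading a city-scale point cloud up front.
func localizationData(at coordinate: CLLocationCoordinate2D,
                      fetch: (TileKey) -> Data) -> Data {
    let key = tileKey(for: coordinate)
    if let cached = tileCache[key] { return cached }
    let data = fetch(key) // stand-in for a network request
    tileCache[key] = data
    return data
}
```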

Meanwhile, Apple is rolling out GeoAnchors in select cities, logically the same cities where Look Around is active. But what it’s really doing, in line with Doug Thompson’s astute theory on Apple’s diabolical plan, is planting seeds by motivating developers to start building.

Dual Tracks

Meanwhile, other Apple initiatives play into its geospatial AR efforts. As we’ve examined, Codename: Gobi will sit at the center of an effort to plant QR codes throughout retail partner locations. These will trigger AR information and discounts when activated by an AR camera.

This essentially means Apple has dual tracks for AR. World-immersive AR cloud activations represent one track, while Gobi represents a marker-based approach. The latter is more rudimentary but also practical and user-friendly for consumers and retailers.
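Gobi’s exact pipeline isn’t public, but marker-based AR of this flavor is already expressible with ARKit’s image tracking. A hedged sketch, assuming the trigger is a known reference image bundled in an asset catalog (the “StoreMarkers” group name is hypothetical):

```swift
import ARKit
import SceneKit
import UIKit

// Sketch of marker-based AR: detect a known reference image and
// overlay content on it. Gobi's real codes/pipeline aren't public.
final class MarkerViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
        sceneView.delegate = self

        let configuration = ARImageTrackingConfiguration()
        // "StoreMarkers" is a hypothetical asset-catalog group.
        if let markers = ARReferenceImage.referenceImages(
                inGroupNamed: "StoreMarkers", bundle: .main) {
            configuration.trackingImages = markers
        }
        sceneView.session.run(configuration)
    }

    func renderer(_ renderer: SCNSceneRenderer,
                  didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor else { return }
        // Overlay a plane (e.g. an offer card) on the detected marker.
        let size = imageAnchor.referenceImage.physicalSize
        let overlay = SCNNode(geometry: SCNPlane(width: size.width,
                                                 height: size.height))
        overlay.eulerAngles.x = -.pi / 2 // lie flat on the image
        node.addChildNode(overlay)
    }
}
```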

Closely related are Apple’s App Clips. These atomize functions traditionally housed in full-blown apps and make them more accessible on the fly, so real-world activities like pre-paying a parking meter no longer require downloading the associated app.

Instead, QR codes (we believe the same ones deployed for Gobi) will let users scan to access mini-app functionality to fill that parking meter… or whatever other use cases developers come up with. Apple’s halo effect and revenue incentives will drive brand and retailer adoption.
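Mechanically, an App Clip receives the scanned code’s invocation URL as an NSUserActivity. A minimal SwiftUI sketch of the parking-meter case (the URL format and parameter names are hypothetical):

```swift
import SwiftUI
import Foundation

@main
struct ParkingClipApp: App {
    @State private var meterID: String?

    var body: some Scene {
        WindowGroup {
            // Hypothetical UI for the scanned meter.
            Text(meterID.map { "Pay meter \($0)" } ?? "Scan a meter code")
                .onContinueUserActivity(NSUserActivityTypeBrowsingWeb) { activity in
                    // The code encodes an invocation URL, e.g.
                    // https://example.com/meter?id=1234 (hypothetical).
                    guard let url = activity.webpageURL,
                          let parts = URLComponents(url: url,
                                                    resolvingAgainstBaseURL: false)
                    else { return }
                    meterID = parts.queryItems?
                        .first { $0.name == "id" }?.value
                }
        }
    }
}
```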

Paving the Way

As this happens, App Clips QR codes will begin to populate high-traffic physical spaces. Practical non-AR uses will drive adoption and participation from retailers, brands, and developers. But just like Apple Maps’ Look Around, the effort will support longer-term AR outcomes.

In other words, with QR codes populating stores, public spaces, and points of interest, Apple will have an installed network of AR activation triggers. Rolling it out in this sequence allows time for the physical infrastructure to expand while mobile AR adoption grows in step.

Speaking of lead time and long-term thinking, the other wild card is Apple’s AR glasses, which will piggyback on all of the above. Apple wants the infrastructure in place by the time its glasses launch in the next few years, so compelling use cases are ready to go.

This applies to both users and developers. Similar to our longstanding narrative around ARKit’s very existence, Apple’s intent is to acclimate users and get AR demand flowing, while getting developers to start thinking spatially and building.

Means to an End

All of the above could give AR glasses a fighting chance for consumer adoption, and Apple has a lot riding on that. Fortunately, its signature halo effect will also lend a hand. If anyone can pull this off — including planting QR codes across the physical world — it’s probably Apple.

These efforts will be accelerated by evolving hardware, including LiDAR. Speaking of which, autonomous vehicles, which collect point clouds to navigate, could feed into an AR cloud database. Just as data is “the new oil” in other areas, it will play a key role in AR’s future.

Boiling it all down, Apple’s geospatial AR efforts aren’t a monetization endpoint, as they are for Google, but a means to an end. With architecture and content in place, AR experiences will be more compelling, and that will be required to sell AR glasses and other wearables.

We’ll pause there and return in the next Space Race segment with a different company profile…
