When it comes to emerging tech buzzwords, it doesn’t get any buzzier than 5G. Possibly in second place is the metaverse. But beyond the jargon, these are legitimate concepts in the AR and VR orbit. 5G will be an enabler and a force multiplier, while the metaverse is XR’s endgame.

But how do these two concepts come together? This was the topic of a recent VentureBeat Summit panel discussion with Verizon, Niantic and author Joost van Dreunen. The panel tackled definitions and strategic implications, and it’s the topic of this week’s XR Talks (video and takeaways below).


Level Setting 

The metaverse of course is a broad term with several definitions and connotations that continue to morph. In the VR world, the metaverse often refers to an alternate digital domain where synchronous interaction takes place between placeshifted participants (think: Ready Player One).

Niantic CEO John Hanke thinks of the metaverse more in AR terms. And this is what Niantic is building with its Real World Platform. We’re talking digital enhancements to a physical world that are synchronous (experienced together at the same time) and persistent (anchored to locations).

Pokémon Go and Ingress are the first manifestations of that vision, but the idea of a geo-relevant and place-anchored metaverse will be much broader. If this all sounds familiar, it’s because it’s generally aligned with the well-worn principle of the AR Cloud. It’s AR’s metaverse.

Back to 5G: it will be a necessary technology for the AR metaverse (ahem, cloud) to exist. In other words, a parallel representation of the physical world can’t really happen without the functional capacity that 5G offers. That’s everything from low latency to millimeter-wave precision.


High Frequency

Going deeper on 5G, it brings several advantages that could feed into what we now envision as a metaverse or AR cloud. The most commonly known advantage is greater network speeds and low latency. These factors will enable bandwidth-intensive and interactive AR graphics.

Beyond speed is the matter of volume. Network capacity will be able to handle many more simultaneous users in close proximity — the type of network load that breaks 4G. To quantify this, Hanke specifies that 5G can serve 10x more devices per square kilometer than 4G.

There’s also location accuracy. 5G’s short-range, high-frequency signal enables millimeter-level precision. This compares to GPS’ meter-level precision, which often fails in urban areas. This will be critical for geo-spatial AR use cases like holding up your phone to identify waypoints.
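To see why that precision gap matters for place-anchored AR, here’s a back-of-the-envelope sketch (the function name and the specific error figures are illustrative assumptions, not from the panel): the angular drift of an AR label grows with the device’s position error relative to the waypoint’s distance.

```python
import math

def ar_label_error_deg(position_error_m: float, waypoint_distance_m: float) -> float:
    """Worst-case angular offset of an AR label caused by device position error.

    If the device's estimated position is off by `position_error_m`
    perpendicular to its line of sight, a label anchored to a waypoint
    `waypoint_distance_m` away appears rotated by this many degrees.
    """
    return math.degrees(math.atan2(position_error_m, waypoint_distance_m))

# Meter-level (GPS-class) error vs. a waypoint 20 m away:
gps_drift = ar_label_error_deg(5.0, 20.0)       # roughly 14 degrees of drift

# Millimeter-level (5G-class) error vs. the same waypoint:
mmwave_drift = ar_label_error_deg(0.001, 20.0)  # a tiny fraction of a degree
```

At ~14 degrees of drift, a label meant for one storefront can land on its neighbor; at millimeter-level error, the anchor is visually stable, which is what geo-spatial use cases like waypoint identification require.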

Add it all up and Verizon’s Ronan Dunne frames 5G as being both a quantitative and qualitative enabler. Quantitatively speaking, anything you can do in 4G, you can do in 5G at scale. Qualitatively speaking, there will be new use cases altogether that 5G unlocks, just as 4G did.


Think Natively

To achieve all of the above, a tech stack will develop around 5G. And the combination of those factors can be considered a new platform, says Hanke. This includes telcos that deliver service, hardware capabilities to process experiences, and the app layer for user experiences.

These elements will ratchet up in step, feeding into each other’s heightening capabilities. For example, edge computing will boost processing while 5G elevates speed, and LiDAR enhances spatial mapping. Altogether, says Dunne, we’ll “dematerialize traditional constraints.”

But this could take a while to fully actualize. For example, novel use cases often aren’t devised until new platforms seep into the developer mindset. Apps like Uber — utilizing the mobile form factor and 4G — weren’t imagined when these enabling technologies were themselves devised.

Rather, it took time and acclimation before developers could start thinking natively. Only then could they build experiences that tap into the unique advantages and capabilities of a new platform. The same process will unfold with 5G, meaning we have a lot of innovation to look forward to.

See the full panel discussion below.


Header image source: WayRay
