Metaverse mania continues to escalate… though it could follow a typical hype cycle and fall into a trough soon. In the meantime, it’s reaching overblown and sometimes-comical levels of attention and association. Look out for that metaverse cover story in Scrapbooking Monthly.
But beyond the overuse and ambiguity, the metaverse holds some legitimate principles. Well-informed thought exercises aren’t a bad thing, as they can help conceptualize our connected future and inform tech companies as they build towards something resembling a metaverse.
Among those bona fide voices is Avi Bar-Zeev. A legend in the AR world, he’s weighed in on the metaverse in Medium posts (and articles here on AR Insider). His recent AWE presentation was also a show favorite, and the focus of this week’s XR Talks (video below).
Defying Definition
One model that’s continually espoused for the metaverse is the good-old web. It has the interoperability that everyone keeps talking about. And like the web, the metaverse will and should evolve organically, says Bar-Zeev. We don’t know what it will eventually look like.
For example, the web started with one website, then two, then several. The browser, links, and search engines followed to make those sites interoperable. The metaverse may evolve similarly, but how will interoperability happen? Portals, analogous to iframes, could be one answer.
Bar-Zeev explains that this could be a small piece of content from one location or domain that’s planted on another domain (again, like iframes) and serves as a bridge between the two. But this model could also raise issues around imposing one domain’s style and function on the other.
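For a concrete anchor in today’s web, here’s a minimal sketch of the iframe version of that idea, assuming a hypothetical host element and URL: one domain embeds a sandboxed slice of another, and the sandbox flags are exactly where the style-and-function tension shows up.

```typescript
// Illustrative only: today's web analog of a "portal" – embedding a slice of
// one domain inside another via an iframe. The sandbox settings hint at the
// questions a metaverse portal would face: how much of the host's look,
// input, and permissions does the embedded content inherit?
function embedPortal(hostSelector: string, remoteUrl: string): HTMLIFrameElement {
  const host = document.querySelector(hostSelector);
  if (!host) throw new Error(`No host element matches ${hostSelector}`);

  const frame = document.createElement("iframe");
  frame.src = remoteUrl;              // content served by another domain
  frame.sandbox.add("allow-scripts"); // embedded code may run...
  // ...but it can't navigate the host page, open popups, or read host cookies,
  // because those sandbox tokens are deliberately left out.
  frame.style.width = "320px";
  frame.style.height = "240px";
  frame.style.border = "none";        // the host, not the guest, controls the framing

  host.appendChild(frame);
  return frame;
}

// Hypothetical usage: plant a bridge to another "space" inside this one.
embedPortal("#plaza", "https://example.com/portal-view");
```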
Speaking of issues, the web isn’t a perfect model for the metaverse. Though it ticks the box for interoperability, there are clearly issues around data privacy and centralized authority (a web3 battle cry). Is the construction of a metaverse an opportunity to rethink that architecture?
Personal and Asynchronous
Meanwhile, questions of interoperability apply equally but differently to AR and the real-world metaverse. As background, the metaverse could have two tracks: the online/multiplayer metaverse often discussed, as well as digital content that adds dimension to the physical world.
In that second (and potentially more significant) track, Bar-Zeev argues that it shouldn’t be interoperable. That’s simply because experiences should be personal and asynchronous. For example, a digital wayfinding arrow on the street should be seen by you alone, not everyone.
Similarly, geospatial AR experiences like wayfinding or local discovery will need a filtration system or contextual discovery engine. That way, everyone doesn’t see everything – the alternative being something like the famous Hyper Reality vision, the classic example of AR’s potential dystopia.
The hardware used to access the metaverse – possibly AR glasses – could be that filter. In that sense, Bar-Zeev submits that AR glasses could serve as a sort of browser for the geospatial web. They could hold locally stored data and settings that govern your preferences and permissions.
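To make that “glasses as browser” idea concrete, here’s a minimal sketch (not Bar-Zeev’s spec), assuming hypothetical content and preference shapes: the device holds the wearer’s settings locally and decides what actually gets rendered, so everyone doesn’t see everything.

```typescript
// Illustrative sketch: AR glasses acting as a "browser" that filters the
// geospatial layer against locally stored preferences. Every type and field
// here is a hypothetical stand-in.

interface GeoAnchoredContent {
  id: string;
  category: "wayfinding" | "commerce" | "social" | "art";
  publisher: string;   // the domain/source that published this content
  personal: boolean;   // true if addressed to this user alone (e.g. your wayfinding arrow)
}

interface LocalPreferences {
  blockedCategories: Set<string>;  // categories the wearer never wants to see
  trustedPublishers: Set<string>;  // sources allowed to render non-personal content
  maxItemsInView: number;          // a crude guard against Hyper Reality-style overload
}

// Runs on-device: the cloud can offer everything nearby, but only what
// passes the wearer's local settings ever reaches the display.
function filterView(
  nearby: GeoAnchoredContent[],
  prefs: LocalPreferences
): GeoAnchoredContent[] {
  return nearby
    .filter((item) => !prefs.blockedCategories.has(item.category))
    .filter((item) => item.personal || prefs.trustedPublishers.has(item.publisher))
    .slice(0, prefs.maxItemsInView);
}
```

Because the filter runs on-device, those preferences never need to leave the glasses – which dovetails with the data-privacy critique of the web-as-model above.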
Land Grab
As all of the above develops, we could benefit from a framework to help conceptualize the metaverse’s evolutionary arc, and set a series of targets. In fact, there’s already a model for this in the “levels” of autonomy that are used as a framework for autonomous vehicle evolution.
With that framework, the metaverse we discuss today is just one “verse” while others involve specific evolutionary steps (see video). And looming over all of this is terminology. Companies will co-opt (and have co-opted) the m-word, as it often goes in land grabs and hype cycles.
But there are other terms that are technically more demonstrative and less likely to be co-opted. Bar-Zeev proposes a few options such as web3.D or co-reality. But he admits the terminology will evolve on its own, just as “the web” did (versus “cyberspace”), and we can’t force it too much.
We’ll pause there and cue the full presentation below…