Excitement over the coming of the Metaverse, an all-encompassing digital world meant to keep us constantly connected to the internet and to one another, has turned a bright new spotlight on AR technology. VR tethers users to one spot, usually indoors: headsets block out the real world to keep us immersed in a virtual world.
Augmented reality, in contrast, lets users move freely through the real world. With AR eyeglasses, we can live as avatars in a virtual world and stay engaged with real life at the same time. This distinction makes AR essential to experiencing the Metaverse. Of course real life happens outdoors as well as inside, which leads to a key question: Is AR ready to take the Metaverse outside?
In his Metaverse Primer, Epyllion CEO Matthew Ball predicts that the Metaverse will be as “transformative” as mobile computing has been, advancing “the role of computers and the internet in our lives” by putting us “inside an ‘embodied’, or ‘virtual’ or ‘3D’ version of the internet and on a nearly unending basis.”
But we’ll need more than a mobile phone to enter that next-level internet. Spatial computing—mapping data to our eyes when we’re out and about in the real world—is vital to the Metaverse, and AR is a core component of spatial computing.
Waveguide technology is the predominant optical technology in AR today, with diffractive waveguides the optics most commonly used in near-to-eye displays. But diffractive waveguides have drawbacks, like color breakup, “waveguide glow,” and up to 50 percent forward light leakage. (Leakage allows digital content to show through the lenses, compromising appearance and privacy.)
Most problematic, the limited brightness of diffractive waveguide displays hinders their performance in daylight. The surrounding environment must be darkened dramatically, or the digital screen won’t show up at all. And with that, the primary benefit of AR—seeing the physical world clearly along with digital content—is lost.
However, another waveguide solution preserves the full advantage of AR, even in the brightest sunlight. Based on a 2D expanding reflective waveguide architecture, this approach is ideal for powering aesthetically appealing smart eyeglasses.
Its optical technology uses transflective (partially reflective) mirrors to direct light into the viewer’s eye. Reflective waveguides adapt to any environment, always providing a clear view of the physical world—including the great outdoors. It’s the only AR solution that keeps users grounded in the real world as they engage with digital content.
Reflective waveguides also eliminate the main shortcomings of diffractive waveguides—color breakup, waveguide glow, and light leakage. Further, reflective waveguides are far more power-efficient than diffractive solutions, allowing batteries to last up to ten times longer.
The Metaverse can indeed work outside – and it must do so to fulfill its promise. It’s a matter of deploying technology that handles spatial computing indoors and outdoors with equal ease. That’s how AR will bring the Metaverse into our daily lives – and out the door.
David Goldman is VP of Marketing at Lumus.