
Going all the way back to Google Glass, the past decade-plus in AR has been defined by excitement, future-gazing, and big things perpetually around the corner. And through that industry evolution, there’s been no shortage of flashy demo videos showcasing AR’s promise.
But what’s different today? Is there real market traction to point to? More importantly, how does that translate to go-to-market strategies for companies up and down the AR stack? These are questions that Mike Festa tackles on the latest Future Of 3D podcast, with guest Eric Johnsen.
Johnsen is the right person for these questions, as he’s lived through XR’s current era, which we define as beginning with Meta’s (then Facebook) acquisition of Oculus in 2014. During that period, Johnsen has held roles at Google, Amazon, and 6d.ai and, more recently, has worked as a fractional BD pro in XR.
Business Case
Getting back to the question of what’s different today compared with past aspirational stages of AR’s current cycle, one answer is AI. It has elevated AR and made many of its past promises possible. For example, it unlocks all-day ambient intelligence that’s delivered visually.
The Ray-Ban Meta Display Glasses unveiled this week are one example. They take the multimodal AI at the heart of the hit Ray-Ban Meta smart glasses (RBMS) and add visuals. Meanwhile, RBMS has validated a business case, with a projected 4 million units sold this year.
These and other developments – such as the rise of display glasses from Xreal and VITURE – start to answer the question of why things are different today than in AR’s recent past. But that leaves the question of go-to-market realities, and Johnsen shines a light on them in his discussion with Festa.
See the full episode below, brought to you by SuperDNA 3D Lab…
