
Going all the way back to Google Glass, the past ten years in AR have been defined by excitement, future gazing, and big things perpetually around the corner. And through that industry evolution, there’s been no shortage of flashy demo videos to showcase AR’s promise.
But what’s different today in signaling actual market traction? More importantly, how does that translate to practical go-to-market strategies for companies up and down the AR stack? These are questions that Mike Festa tackles on the latest Future Of 3D podcast, with guest Eric Johnsen.
Johnsen is the right person for these questions, as he’s lived through XR’s current era, which we define as beginning with Meta’s (then Facebook) acquisition of Oculus in 2014. In that time, Johnsen has held roles at Google, Amazon, and 6d.ai and, more recently, has worked as a fractional BD pro in XR.
Business Case
Getting back to the question of what’s different today versus past aspirational stages of AR’s current cycle, one answer is AI. It has elevated AR and made many of its past promises possible, such as all-day ambient intelligence that materializes in visual line-of-sight utilities.
For example, though this episode aired prior to Meta’s unveiling of its Ray-Ban Display glasses this week, those glasses take the multimodal AI that makes Ray-Ban Meta smart glasses (RBMS) shine and add a display. Meanwhile, RBMS has validated a business case, with millions of units sold to date.
These and other events — such as the rise of display glasses from Xreal and VITURE — start to answer the question of why things are different today than in AR’s recent past. But that still leaves the question of go-to-market realities and best practices. Johnsen shines a light in the full video below.
