XR Talks is a series that features the best presentations and educational videos from the XR universe. It includes embedded video, as well as narrative analysis and top takeaways. Speakers’ opinions are their own.
Facebook announced a lot of things at OC6 this week. That most notably included its admission of long-rumored work on AR glasses, as well as its own version of an AR cloud, known as Live Maps. These are small steps towards what it explicitly calls the next computing platform.
“Small steps” is the key term, as Facebook was a bit more reserved than in past years in its proclamations about AR and VR’s imminent revolution. Though the typical keynote gloss was present, including Live Maps’ wildly ambitious components, there was a bit more realism this year.
This was most evident in CTO Michael Abrash’s calculated backpedaling from prior years’ predictions. He graciously admitted that predictions in tech can be unforgiving, so he eschewed his standard forecasting in favor of a look at today’s engineering work and his desired endpoints.
Speaking of backpedaling, Mark Zuckerberg’s opening talk mostly avoided the overblown statements of the past, such as the infamous 1-billion-user figure. Instead, he discussed what has to happen in the near term to get VR to healthy adoption levels, a classic chicken-and-egg challenge.
“On the ecosystem side, the first step is reaching critical mass in the community. And once we get to a certain size, it’s gonna become economical for all developers — from independent folks up to the biggest AAA developers — to build content for VR. And once we reach that point, the amount of content is just going to explode and that’s going to push adoption further. So, in getting there, Quest is off to a great start. It’s only been on sale for four months now, and we are selling them as fast as we can make them.”
To quantify that, Oculus has sold $100 million worth of content to date, 20 percent of it purchased on Quest. That indicates accelerating sales, given that this share of cumulative content revenue accrued in just the few months since Quest launched. The next step will be continuing to refine UX and inputs, such as the newly announced hand tracking.
“So there’s a lot of work that we still need to do to get to where we all want. But I think what you’re starting to see is the hardware is getting out of the way. And with each step, we are getting to a more immersive and natural experience… Hand tracking is great, it doesn’t require controllers. But it still requires you to use your hands. And in the future we want to get to an input where we can just think something and it happens — so what people call a neural interface.”
Hardware lead Andrew Bosworth likewise walked a fine line between blue-sky visions and realities of today. VR’s social endpoints like co-presence are great, he said, but they’re narrow in use because they isolate participants from people around them. So AR could fill a key gap.
“To get to this future, we are building AR glasses. We have a few working prototypes, but these are still a few years out, so in the meantime, we’re focusing on the deep tech stack necessary to bring these to life. Today, Spark AR is the largest augmented reality platform for the phone. We’re working on deepening the technology to bridge the physical and digital divides.”
That’s where Live Maps comes in, as we examined yesterday. Like Magic Leap’s Magicverse, this is Facebook’s version of an AR cloud. According to its concept video (grain of salt), it will have persistent, spatially-anchored data, uncovered through AR devices and permissioned access.
“At Facebook Reality Labs, our research teams are starting to build the core infrastructure that will underpin tomorrow’s AR experiences. We call this research Live Maps. It’s a shared virtual map of the world that involves machine perception, computer vision, and a bunch of core technologies, software and hardware… Rather than reconstructing your surroundings in real-time, the glasses are going to tap into pre-existing 3D maps of the space... At its core, technology like this, and on this scale, will rely on crowdsourced information captured by connected devices.”
The development of Live Maps seems a little vague, but that’s likely by design. Crowdsourced mapping is also partly reminiscent of 6D.ai (a potential acquisition target) and Bruce Wayne’s fictional surveillance system in The Dark Knight. We’ll have to see how this plays out, and when.
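To make the AR-cloud concept a bit more concrete, here is a minimal, purely illustrative sketch of the idea Bosworth describes: devices publish persistent, spatially-anchored content to a shared map, and other devices, once localized against that map, query for content near their position rather than reconstructing the scene themselves. Facebook has published no Live Maps API, so every name here is hypothetical.

```python
import math
from dataclasses import dataclass, field

@dataclass
class Anchor:
    """A persistent piece of content pinned to a location in a shared world map."""
    anchor_id: str
    position: tuple  # (x, y, z) in the shared map's coordinate frame, meters
    payload: dict    # app content attached to this location

@dataclass
class SharedMap:
    """Toy stand-in for a crowdsourced 'AR cloud' map service (hypothetical)."""
    _anchors: dict = field(default_factory=dict)

    def publish(self, anchor: Anchor) -> None:
        # A contributing device uploads an anchor it created.
        self._anchors[anchor.anchor_id] = anchor

    def query_nearby(self, position: tuple, radius_m: float) -> list:
        # A viewing device, localized against the shared map, asks for
        # persistent content near its estimated position.
        return [
            a for a in self._anchors.values()
            if math.dist(a.position, position) <= radius_m
        ]

# Device A leaves a note anchored to a spot; device B later finds it.
shared = SharedMap()
shared.publish(Anchor("note-1", (2.0, 0.0, 1.5), {"text": "meet here"}))
nearby = shared.query_nearby((2.5, 0.0, 1.5), radius_m=1.0)
```

The hard parts Facebook is alluding to, crowdsourced map construction, relocalization, and permissioned access, are exactly what this sketch glosses over.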
Back to the balance of realism and wide-eyed optimism, one theme across all of the above is that Facebook has internalized the chicken-and-egg dilemma. It has mostly learned that lesson in a VR context, as reflected in Zuckerberg’s comments above. But it will apply to AR too.
In other words, prospective AR glasses need content and killer apps before users will buy. Meanwhile, a critical mass of installed hardware is required to incentivize developers to build that content. A Live Maps platform on which to build AR experiences could be the answer to both.
See the full presentation below.
Disclosure: AR Insider has no financial stake in the companies mentioned in this post, nor received payment for its production. Disclosure and ethics policy can be seen here.
Header image credit: Facebook, YouTube