It’s been a complicated week for Meta. The company announced a Quest 2 price hike and reported Q2 earnings that included its first-ever revenue decline. Meanwhile, the FTC is suing to block its acquisition of Within, maker of the popular VR fitness game Supernatural.
But it’s not all bad news for Meta: it continues to see year-over-year revenue growth in Meta Reality Labs (MRL), the division that houses all things XR. And though the unit’s net losses are still the size of an entire nascent industry, its revenue is growing faster than those losses (albeit at very different scales).
But putting that crazy week aside, it’s useful to pan back and look at the big picture. Indeed, that’s what Meta is doing with its long-game metaverse play. Through that lens, what is it spending all those billions on? It recently gave us a sneak peek via a deep dive into its VR prototypes.
Following up on that under-the-hood look, which we covered in a previous XR Talks installment, the astute folks at TESTED traveled to the Seattle area for a deeper technical dive into the XR tech Meta is developing. That’s the topic of this week’s XR Talks…call it Part II of the series.
Hardware is Hard
First, we’ll reiterate what we said in Part I: hardware is hard. For most of its life, Meta has focused on software…which left it beholden to platforms like Android and iOS. Having learned that lesson the hard way, Zuckerberg wants to own the next computing platform’s full stack.
And that’s what MRL is all about: not only hardware, but an XR-centric OS and software layers. This isn’t just about the business advantages of vertical integration; with XR in particular, sensor fusion and purpose-built technologies are needed to overcome steep technical hurdles.
Mark Zuckerberg has explained this principle, including in past talks that we’ve broken down. But seeing the intricate work in progress offers a different perspective. And that’s what TESTED’s Norm Chan uncovered on his trip to Meta Reality Labs’ headquarters outside of Seattle.
Again, this builds on what Meta has already unveiled in a rare under-the-hood look at its hardware prototypes. As we examined in Part I, it has four prototypes (and likely more behind the scenes), each tasked with solving a different VR technical hurdle, such as resolution, focal depth, or HDR.
The hope is that focusing on each of these with separate hardware lets it reach collective solutions faster. And it has separate teams with specific domain expertise working on each of these…a small glimpse of the investment Meta is making to bankroll the XR industry’s advancement.
Retinal Resolution
Taking those one at a time, the vergence-accommodation conflict is usually discussed in AR contexts. But its focal-depth challenges are present in VR too, given that everything is displayed at a single fixed focal distance (the screen in front of your face), no matter how near or far objects appear to be. This prevents your eyes from focusing the way they do in real life.
One approach Meta is applying involves varifocal optics, which shift the display’s focal distance forward and back to match objects rendered near and far. Eye tracking follows where you’re naturally looking, and the system translates that gaze convergence into the right focal depth.
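To make that concrete, here’s a minimal sketch – not Meta’s actual pipeline – of how gaze convergence from an eye tracker could be triangulated into a target focal distance. The gaze angles and IPD value are illustrative assumptions.

```python
import math

def focal_depth_from_gaze(left_yaw_deg: float, right_yaw_deg: float,
                          ipd_m: float = 0.063) -> float:
    """Estimate the fixation distance (meters) from each eye's horizontal
    gaze angle. Convention: positive yaw = eye rotated inward (toward the
    nose). The two gaze rays converge at the fixation point, and simple
    triangulation gives its depth."""
    theta_l = math.radians(left_yaw_deg)
    theta_r = math.radians(right_yaw_deg)

    total_convergence = theta_l + theta_r
    if total_convergence <= 0:
        return float("inf")  # eyes parallel or diverging: focus at infinity

    # For symmetric convergence, each eye sits ipd/2 off the midline,
    # so fixation depth ≈ (ipd/2) / tan(theta). Averaging the two angles
    # keeps the estimate simple.
    mean_theta = total_convergence / 2
    return (ipd_m / 2) / math.tan(mean_theta)


# Example: ~1.8 degrees of inward rotation per eye ≈ a 1 m fixation distance
depth = focal_depth_from_gaze(1.8, 1.8)
print(f"Estimated fixation depth: {depth:.2f} m")

# A varifocal system would then drive its display or lens actuator toward
# this focal distance -- typically with smoothing and hysteresis so the
# optics don't chase every noisy eye-tracker sample.
```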
Another challenge is reaching “retinal resolution” – about 60 pixels per degree, the point at which the eye can’t discern additional detail. Spread across a VR headset’s wide field of view, that’s a total pixel count several times greater than today’s high-end monitors. Meanwhile, HDR is another goal, similar to the dynamic brightness and contrast seen in modern HDR TVs…but much harder to achieve in a headset.
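For a rough sense of scale, here’s the back-of-envelope math behind that claim; the per-eye field-of-view figures are assumptions for illustration, not Meta’s specs.

```python
# Back-of-envelope: pixels needed for "retinal resolution" at 60 ppd,
# assuming (hypothetically) a ~110° x 100° field of view per eye.
PPD = 60                  # pixels per degree for retinal resolution
FOV_H, FOV_V = 110, 100   # degrees, assumed per-eye field of view

pixels_per_eye = (PPD * FOV_H) * (PPD * FOV_V)
print(f"{pixels_per_eye / 1e6:.0f} MP per eye")   # ~40 MP per eye

# Compare to a 4K monitor (3840 x 2160 ≈ 8.3 MP): several times more
# pixels per eye, roughly an order of magnitude more across both eyes.
four_k = 3840 * 2160
print(f"~{(2 * pixels_per_eye) / four_k:.0f}x a 4K monitor (both eyes)")
```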
Through all these endeavors, one underlying goal discussed previously by Zuckerberg is to pass the “visual Turing test.” A play on Alan Turing’s test of whether a machine’s responses are indistinguishable from a human’s, the idea is to achieve visual quality in VR that’s indistinguishable from real life.
We’re still a ways from that goal, but Meta is investing heavily to get there faster. The prototypes include Butterscotch (retinal resolution), Half Dome (focal depth), Starburst (HDR), and Holocake (slim holographic lenses). See them all in the videos further below.
Big Pivots
Panning back, one prominent takeaway from Chan’s visit to Seattle is how hard some of these design challenges are. Another important piece of subtext is, again, the level of investment that Meta is making to bankroll the advancement of VR and AR. It exceeds $10 billion per year.
Meta’s other crimes tend to overhang everything it does, including these efforts. And though its work here is self-serving, it’s doing the AR and VR sectors a favor by accelerating user traction. VR would not be where it is today without the Quest 2 and other Meta loss leaders.
Another misconception around Meta’s investments in AR, VR, and all things metaverse is that they’re a distraction or a trite endeavor while its core business declines. In fact, these investments are being made precisely because its ad business is declining. Meta is building lifeboats.
Indeed, Mark Zuckerberg hinted at this during the Q2 earnings call, saying he wants Meta’s XR and metaverse businesses to eventually surpass the revenues of its current ad business. It’s analogous to where Apple sat before it launched what would become its new primary business: the iPhone.
We’ll pause there and cue the full video below…