A common AR industry sentiment is that the smartphone will pave the way for smart glasses. Before AR glasses achieve consumer-friendly specs and price points, AR’s delivery system is the device we all have in our pockets, where it can stimulate demand for AR experiences.
This thinking holds up, but a less-discussed product class could have a greater impact in priming consumers for AR glasses: wearables. Among other outcomes, the cultural barriers facing AR glasses could be lowered by conditioning consumers to wear sensors on their bodies.
Meanwhile, tech giants have strong motivations in wearables. Each is building a wearables strategy that supports or future-proofs its core business, where tens of billions in annual revenues are at stake. For example, Apple’s wearables offset iPhone sales declines.
But how will wearables continue to penetrate consumer markets and benefit AR glasses? This is the topic of a recent ARtillery Intelligence report, Wearables: Paving the Way for AR Glasses. The device class continues to grow and acclimate the world to AR glasses still to come.
Picking up where we left off last week, what’s Facebook’s wearables play? Before answering that question, let’s pan back for context to examine its multi-dimensional spatial-computing ambitions. At the center of this master plan is the thing that kicked off the current AR & VR era: Oculus.
Though Facebook’s VR moves stand out, they’re just one piece of the puzzle. The eventual endpoint is for VR to future-proof Facebook’s social graph with another modality for people to connect. Starting with its social VR app Horizon, the goal is for immersive human interaction.
Facebook is also investing in mobile AR through the Spark AR lens development platform. This is a stepping stone to AR’s next era by getting users and developers acclimated to AR. And it doesn’t hurt if it generates real revenue in the process, as Facebook’s sponsored lenses do.
Lastly, Facebook has its eye on AR glasses, including explicit proclamations that it’s currently developing them. This includes supporting pieces such as Live Maps for AR-cloud support, and lots of deep research around the uncharted territory of spatial interaction.
That brings us back to wearables. Facebook’s AR endpoints are predicated on the assumption that people will comfortably wear AR glasses in public. As Google Glass learned the hard way, there are deep-rooted cultural barriers that stand in the way of social acceptance.
Enter Project Aria. Unveiled at last September’s Facebook Connect event, this program will test the waters for AR glasses’ social dynamics. How do glasses wearers behave, and how do non-wearers react? These questions can only be answered through field testing.
This is notably the same principle driving Snap’s wearables strategy. With Spectacles, it has been feeling out the social dynamics of camera glasses, gaining key insights into how AR glasses should be designed, culminating in its first foray into the category last week.
Project Aria will also work to collect spatial maps for Live Maps. The challenge in any AR cloud initiative is that the world is a big place, so assembling comprehensive spatial maps to support AR experiences benefits from crowdsourcing, which is exactly what Facebook plans to do.
Yet another vector in Facebook’s AR development is sound. Facebook Reality Labs is investing heavily in research to unlock the promise of spatial audio. Similar to Apple’s work, this could take the form of standalone hearables and “audio AR” devices, or become a component of its AR glasses.
Using beamforming, along with some machine-learning magic, this research explores ways to selectively optimize sound waves based on where you’re looking. Goals include hearing your friend in a noisy bar by suppressing background noise and amplifying the parts you want to hear.
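To make the beamforming idea concrete, here’s a minimal delay-and-sum sketch in Python. It’s an illustrative toy, not Facebook’s implementation (which layers machine learning on top of far more sophisticated signal processing); the function name, array geometry, and parameters are all assumptions for demonstration. The core idea is that a gaze-derived direction vector tells the microphone array which channel delays to compensate for, so sound from where you’re looking adds coherently while off-axis noise averages down.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # meters per second, approximate speed of sound in air

def delay_and_sum(signals, mic_positions, look_dir, fs):
    """Steer a mic array toward `look_dir` by aligning channels, then average.

    signals:       (n_mics, n_samples) recorded channels
    mic_positions: (n_mics, 3) microphone coordinates in meters
    look_dir:      unit vector toward the target (e.g. derived from eye gaze)
    fs:            sample rate in Hz
    """
    # A mic with a larger projection onto look_dir is closer to the source,
    # so its channel records the wavefront earlier ("leads").
    lead = np.round(mic_positions @ look_dir / SPEED_OF_SOUND * fs).astype(int)
    shifts = lead.max() - lead          # delay leading channels to line up
    n = signals.shape[1] - shifts.max() # common length after shifting
    aligned = np.stack([s[k:k + n] for s, k in zip(signals, shifts)])
    # Coherent sum: the on-axis signal adds in phase; off-axis noise does not.
    return aligned.mean(axis=0)
```

In a glasses-mounted array, `look_dir` would come from eye-tracking, which is why the article pairs beamforming with gaze signals: the wearer’s eyes effectively point the “acoustic spotlight.”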
Facebook calls this “perceptual superpowers,” which could work in corrective ways for some people and optimize experiences for others — all in tandem with AR glasses to process signals like eye gaze. This could also work toward Facebook’s broader social goals by “defying distance.”
“The only reason we need for virtual sound to be made real is so that I can put a virtual person in front of me and have a social interaction with them that’s as if they were really there,” Facebook Research Lead Philip Robinson wrote in a blog post. “And remote or in person, if we can improve communication even a little bit, it would really enable deeper and more impactful social connections.”
We’ll pause there and circle back in the next installment to tackle more wearables strategies.