“Wearables Wars” is AR Insider’s mini-series that examines how today’s wearables will pave the way and prime consumer markets for AR glasses. Each installment will profile a different tech leader’s moves and motivations in wearables. For more, subscribe to ARtillery PRO.

Common wisdom states that mobile AR is the forebear to smart glasses. Before the latter achieves consumer-friendly specs and price points, AR's delivery system is the device we have in our pockets. There, it can seed user demand for AR and get developers to start thinking spatially.

That’s still the case, but a less-discussed product class could have a greater impact in priming consumer markets for AR glasses: wearables. As we’ve examined, AR glasses’ cultural barriers could be lessened to some degree by conditioning consumers to wear sensors on their bodies.

Tech giants show signs of recognizing this, and are developing various flavors of wearables. As in our ongoing "follow the money" exercise, each is building a wearables strategy that supports or future-proofs a core business where tens of billions in annual revenues are at stake.

Earlier in this series, we examined Google’s ambitions to create more direct user touchpoints (literally) that drive revenue-generating search by voice, visual and text. The story is similar for Amazon, Microsoft and Bose (kind of), and the 800-pound gorilla in wearables, Apple.

Orbiting Initiatives

Moving on to Facebook, it has several orbiting initiatives that comprise its spatial computing master plan. Before getting into how wearables fit in, let’s take a quick look at that entire picture. For one, it includes the thing that kicked off the current AR & VR era: Oculus.

But though Facebook's VR moves are its most prominent, they're just one piece of the puzzle. The eventual endpoint is for VR to future-proof Facebook's social graph with another modality for people to connect. Starting with Horizon, the goal is more immersive human interaction.

Facebook is also investing in mobile AR, which takes form in the Spark AR lens-development platform. This is a stepping stone to AR's next era, meant to get users and developers acclimated to spatial experiences. And it doesn't hurt if it generates real revenue in the process.

Lastly, Facebook has its eye on an AR glasses future, including explicit proclamations (the anti-Apple approach) that it's developing them. This includes supporting pieces such as Live Maps for AR-cloud support, and lots of deep research into the uncharted territory of spatial interaction.

Social Experiment

That brings us back to wearables. The above AR endpoints are predicated on the assumption that people will comfortably wear AR glasses in public. As Google Glass learned the hard way, there are deep-rooted cultural barriers to sensor-infused glasses gaining social acceptance.

Enter Facebook's Project Aria. Unveiled at the recent Connect 7 event, this program is all about testing the waters for AR glasses' social dynamics. How do glasses wearers behave, and how do non-wearers react? These are questions that can only be answered through field testing.

If this sounds familiar, it was the same driving principle we discussed in Part VI of this series on Snap. With its Spectacles (not AR glasses), it’s likewise feeling out the social dynamics of camera glasses. From that, it will gain key insights around how AR glasses should be designed.

Project Aria will also work to collect spatial maps for Live Maps. The challenge in any AR cloud initiative is that the world is a big place. So getting comprehensive spatial maps to support AR experiences can benefit from a crowdsourcing approach (see Facebook’s Scape acquisition).

Perceptual Superpowers

Yet another vector in Facebook’s AR development is sound. Facebook Reality Labs is investing heavily in research to unlock the promise of spatial audio. Similar to Apple’s work, this could take form in standalone hearables and “audio AR” devices; or as a component to its AR glasses.

Using beamforming and some machine-learning magic, this research explores ways to selectively optimize soundwaves based on where you're looking. Goals include hearing your friend in a noisy bar by suppressing background noise and amping up the important parts.

Facebook calls this “perceptual superpowers,” which could work in corrective ways for some people and enhance experiences for others — all in tandem with AR glasses to process signals like eye gaze. This could also work towards Facebook’s broader social goals by “defying distance.”

“The only reason we need for virtual sound to be made real is so that I can put a virtual person in front of me and have a social interaction with them that’s as if they were really there,” Facebook Research Lead Philip Robinson wrote. “And remote or in person, if we can improve communication even a little bit, it would really enable deeper and more impactful social connections.”
