A common AR industry sentiment is that the smartphone will pave the way for smart glasses. Before AR glasses achieve consumer-friendly specs and price points, AR’s delivery system is the device we all have in our pockets. There, it can stimulate demand for AR experiences.
This thinking holds up, but a less-discussed product class could have a greater impact in priming consumers for AR glasses: wearables. Among other outcomes, the cultural barriers facing AR glasses could be lessened by conditioning consumers to wear sensors on their bodies.
Meanwhile, tech giants are motivated toward wearables. They’re each building wearables strategies that support or future-proof their core businesses, where tens of billions in annual revenues are at stake. For example, Apple’s wearables offset iPhone sales declines.
But how will wearables continue to penetrate consumer markets and benefit AR glasses? This is the topic of a recent ARtillery Intelligence report, Wearables: Paving the Way for AR Glasses. The device class continues to grow and acclimate the world to AR glasses still to come.
Sensory Nerves
After covering Apple last week, what moves is Google making in wearables? So far, it has entered the wearables race in several ways, including its Pixel Buds wireless earbuds. It also acquired Fitbit late last year as a move to buttress its lingering Wear OS platform.
But as with Apple, a key question is: why? And like Apple, Google’s motivation for wearables is to protect and future-proof its core business. For Google, that of course means search. So widening the funnel that brings people into search is the name of the game.
Several Google moves over the past decade share that underlying goal, including Android (drives mobile search) and voice search (expands search formats). Google’s biggest AR play, visual search, similarly traces back to the goal of driving search volume.
With wearables, the same endgame is in play. But unlike the above software-based initiatives, wearables involve hardware. That means a literal touchpoint with users. Think of this as a trojan horse for positioning Google’s core product closer to users’ sensory nerves.
Sound Investment
Google’s most notable wearables investment is in hearables. Though Apple has a big head start, Google appears intent on a hearables future, as evidenced by its less-popular Pixel Buds. Though not as sleek as AirPods, they’re a vessel for a superior voice/AI engine: Google Assistant.
In fact, Apple’s Achilles heel for AirPods is the famously inept Siri. Google Assistant is positioned to win the voice search and general-knowledge AI game, given the extensiveness of Google’s knowledge graph. It can process voice queries and answer questions with much greater reliability.
And this could be a winning factor. Hardware sleekness can be improved more easily than a quality AI engine can be built. So Apple will have to counterbalance Siri’s shortcomings with more killer apps for AirPods, or by opening up to developers as it has done with ARKit and other SDKs.
Google also wins on sheer scale. Apple’s AirPods have a total addressable market of about 900 million iPhones globally. Android has a much larger global base, closer to 2.5 billion devices. Those aren’t all compatible with Pixel Buds, but it’s a larger shell to grow into.
Killer Audio Apps
So what will Pixel Buds do with that knowledge-graph backbone? This is where potential “audio AR” killer apps enter the conversation. Hearables today are mostly about phone calls and music, but the real potential lies in situationally aware, intelligent notifications.
For example, AR can add lots of value in local discovery. This is an area Google has already cultivated with local search, given that proximity drives search intent. Audio AR will play into this with audible cues in commerce contexts, such as navigating a store aisle or finding a bar.
More broadly speaking, the vision is an all-day ambient audio channel for personalized messaging. This can happen through traditional Google searches (in this case via voice) and through predictive alerts, which Google is already developing with Assistant.
Speaking of Google Assistant, a potentially compelling audio AR use case is real-time language translation. Google Assistant already does this, but when brought directly to your ear, this could be a true utility for seamless foreign language translation on the fly.
Share of Ear
The above scenarios align with Google’s smartphone-era construct of “micro-moments.” These are the content-snacking moments in the grocery line or on the subway — pulling out your phone for a quick fix of email, Instagram or Twitter. They created a greenfield for media (and ad) delivery.
But audio’s advantage is discreetness: it’s less cumbersome than pulling out your phone. And because AR glasses are held back by cultural and stylistic factors, the subtlety of ambient audio could fill a key gap before they arrive. All-day use also creates lots of content “inventory.”
This raises another concept we’ve been toying with: share of ear. Given that we’re inundated with visual stimuli, there’s a zero-sum competitive landscape for capturing that attention. But ambient audible stimulus throughout the day is still a greenfield. This is what has Google salivating.
The place where share of ear gets the most attention is smart speakers, which is the wrong discussion. Though they’re a favorite topic of the tech press, their 200-million-unit installed base is dwarfed by a prospective all-day wearable that piggybacks on 3.4 billion smartphones.
We’ll pause there and return next week with more from this report. Meanwhile, check out the full report here.