Google, along with other tech giants, is blitzing AR & VR. But it doesn’t like those terms, preferring “immersive computing” instead. This week at its Pixel 2 event, Google unveiled several products along that immersive-computing spectrum.

For this week’s featured video, we’ve clipped the relevant parts where these products were demoed. You can see those videos embedded below, coded to start at the right points. We’ve also provided color commentary on what it all means for AR & VR… ahem, immersive computing.


Google Lens

Google Lens falls into the category of “visual search.” A close cousin of AR, it identifies objects in the real world. But instead of interactive graphics, it overlays search-like information, such as purchase details, on targeted objects. It will be a key component of Google’s mobile AR strategy.

The Lens demo at Google I/O spotlighted the ability to identify local storefronts. But this week we got a more in-depth look. Tapping into Google’s AI engine (the same one that powers Google Assistant), the demo showed how Lens will identify things like dog breeds, architecture and wall art.

Lens will be accessible in various apps, first on Pixel phones and eventually rolling out to the broader Android universe. Its availability will be signaled by the Lens logo, first seen in the camera app — a fitting context for scanning an environment when one’s phone is already held up.

(click embedded video, coded to start at the right point)

AR Stickers

As the first deployment of ARCore, AR stickers will be available on Pixel phones and initially in the camera app. They’ll include movie and TV tie-ins — such as Stranger Things characters — that have animation and dimensional interactivity with the real-world spaces they inhabit.

Though stickers are a bit trite, Google is smart in following the market validation — set by Snapchat and others — for socially shared, cartoon-like AR objects. We’ll see more utilitarian or practical AR graphics evolve as developers get their hands on ARCore.

These stickers are also a subtle reflection of Google’s AR strategy: to reduce friction for AR experiences. Compared to Apple’s app-heavy approach, ARCore assets will be atomized and available in several contexts, such as the mobile web and (in this case) the camera.

(click embedded video, coded to start at the right point)

Daydream View

A new Daydream View was also unveiled this week. It has incremental though meaningful improvements, including a wider field of view, better heat dissipation and a more comfortable fit. The price was bumped to $99, but Google knows it has some pricing leeway given the comparable Gear VR’s $129 price tag.

(click embedded video, coded to start at the right point)

Pixel Buds

In our recent report on ARCore and ARKit, we predicted Google would launch an AirPods competitor. And that’s what we saw this week. Like AirPods, Pixel Buds create another user touchpoint for serving information — in this case, audio from Google Assistant and other channels.

What’s the AR angle? Though they aren’t explicitly for AR, Pixel Buds align with our predictions for AR’s trajectory. Like Apple’s AirPods, they represent an unsung AR modality: sound. Ambient audio information about one’s surroundings could be a prominent type of AR “overlay.”

For example, using Google Assistant, Pixel Buds can perform real-time language translation. Think of it like the ear-whisper translation system used by UN delegates, but for the rest of us. In fact, live audible foreign-language translation is a great example of what we call “AR audio.”

(click embedded video, coded to start at the right point)


For a deeper dive on AR & VR insights, see ARtillry’s new intelligence subscription, and sign up for the free ARtillry Weekly newsletter. 

Disclosure: ARtillry has no financial stake in the companies mentioned in this post, nor has it received payment for its production. Our disclosure and ethics policy can be seen here.