Immersive shopping is proving to have experiential impact for consumers, and revenue impact for brands. Related to – but separate from – AR advertising, this is when AR is used as a tool to visualize and contextualize products to engender more informed consumer purchases.

This is a subset of AR that we call camera commerce. It comes in a few flavors, including visualizing products on “spaces and faces.” It also includes visual search – pointing one’s smartphone camera at a given product to get informational, identifying, or transactional overlays.

In each case, AR brings additional context and confidence to product purchases. And this value has been elevated during a pandemic, as AR brings back some of the product dimension and tactile detail that’s been taken away from consumers during retail lockdowns.

Synthesizing these factors, ARtillery Intelligence recently produced a report to dive into the drivers and dynamics of camera commerce. How is the field shaping up? Who’s doing what? And how big is the market opportunity? We’ve excerpted the report below for AR Insider readers.

Camera Commerce: AR Monetization Materializes

Search What You See

One promising form of camera commerce is visual search. This is when users hold up their phones and point their cameras at a given real-world object. The idea is to identify, contextualize, or even buy the same or similar items. It’s an AR shopping format we’re bullish on.

Though visual search is less prevalent than AR shopping’s other main format – product visualization (a.k.a. “try-before-you-buy”) – it has greater potential due to its high-intent orientation. Visual searches happen when consumers actively want to identify an item right in front of them.

This makes visual search a natural evolutionary step from web search. Indeed, one of the things that’s made web search so lucrative for Google and others is the same “high-intent” orientation where consumers explicitly indicate a specific need. That makes contextual advertising natural.

Visual search takes that principle into the next generation of camera-based experiences and visual media. It won’t replace web search, of course, but it will supplement it with an alternative visual input – one that resonates with camera-forward millennials and Gen Z.

In fact, these are the reasons Google is so keen on visual search. Along with voice search, Google sees it as a way to boost search query volume by letting people search from more places and in more modalities. It’s also a play to future-proof its core search business by leaning into emerging tech.

What’s Google’s AR Incubation Play?

Bearing Fruit

Google’s visual search efforts are already bearing fruit: its flagship visual search tool, Google Lens, now recognizes 15 billion products and is used 3 billion times per month. That volume still pales in comparison to web and mobile search, but it’s a strong start.

Beyond sheer numbers, this growth validates Google Lens’ broadening capability. Lens launched initially with use cases around identifying pets and flowers, but the eventual goal — in true Google fashion — is to be a “knowledge layer” for monetizable searches like shoppable products.

This raises the question of what types of products shine in visual search. Early signs point to items with visual complexity and unclear branding. This includes style items (“who makes that dress?”) and in-aisle retail queries, which could position it well for the post-Covid world.

Another natural use case is local discovery. Visual search could be a fitting tool to learn more about a new restaurant — or book a reservation — by pointing your phone at it. The smartphone era has taught us that search intent is high when the subject is in proximity.

In fact, Google has already begun to develop this opportunity with its Live View urban navigation feature. When using it, consumers can see businesses along their route identified through AR overlays – a first step toward a visually driven local search tool.

Data Dive: Google Lens is Used 3B Times Per Month

10 Blue Links

Google is primed for the above efforts given its knowledge graph, assembled from 20+ years as the world’s search engine. This arms it with a training set for image matching, including products (Google Shopping), general interest (Google Images), and storefronts (Street View).

But these efforts could take a while to materialize — at least the monetization components. Google is in the process of testing visual search, optimizing the UX, and devising interfaces for sponsored content insertion. A key question: what will be the “results page” of visual search?

The challenge — just like with voice search — is that there isn’t a “10 blue links” results page. So monetization will defy the traditional search model. This could involve enhanced results (think: “buy” buttons) when a visual search advertiser is discovered on Google Lens.

Until then, Google can use visual search behavior to optimize web search results. In other words, you won’t see sponsored results in a visual search flow, but you’ll see visual-search-informed results when back on web search – assuming you’re signed in to the same Google account.

To further grease the adoption wheels, Google continues to develop visual search “training wheels.” This includes putting Google Lens front and center in well-traveled places such as Google’s iPhone app. That prominence could reduce friction and help incubate visual search behavior.

We’ll pause there and circle back with more analysis in the next report excerpt. Meanwhile, check out the full report here and the video companion below. 
