AR comes in many flavors. This is true of any technology in its early stages as it twists and takes shape. The most prevalent format so far is social lenses, which enhance and adorn sharable media. Line-of-sight guidance in industrial settings is also proving valuable.
But a less-discussed AR modality is visual search. Led by Google Lens and Snap Scan, it lets users point their smartphone cameras (or future glasses) at real-world objects to identify them. It contextualizes them with informational overlays… or “captions for the real world.”
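To make that "captions for the real world" idea concrete, here's a minimal sketch of the identify-then-overlay loop. The internals of Google Lens and Snap Scan aren't public, so this uses Google's Cloud Vision API as a stand-in, and the image file name is purely illustrative.

```python
# A minimal sketch of the "identify, then contextualize" flow, using the
# public Cloud Vision API as a stand-in for Lens/Scan (whose internals
# aren't public). Requires the google-cloud-vision package and credentials.
from google.cloud import vision


def caption_the_real_world(image_path: str) -> list[str]:
    """Return human-readable labels for the objects in a photo."""
    client = vision.ImageAnnotatorClient()

    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())

    # Label detection identifies what's in the frame; a real AR client would
    # then render these results as informational overlays on the camera view.
    response = client.label_detection(image=image)
    return [f"{label.description} ({label.score:.0%})"
            for label in response.label_annotations]


if __name__ == "__main__":
    for caption in caption_the_real_world("storefront.jpg"):  # hypothetical photo
        print(caption)
```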
Visual search flips the script for AR in that it identifies unknown items rather than displaying known ones. That broadens the potential use cases, transcending pre-ordained experiences with relatively narrow utility. Visual search has the entire physical world as its canvas.
This is the topic of a recent report from our research arm, ARtillery Intelligence. Entitled Visual Search: AR’s Killer App?, it dives deep into the what, why, and who of visual search. And it’s the latest in our weekly excerpt series, with highlights below on visual search’s drivers & dynamics.
Naturally Monetizable
Many of the visual search use cases examined in the previous part of this series have one thing in common: shopping. The endgame is monetizable visual searches for shoppable items. This can be seen in use cases developing at Snap, such as local discovery and “outfit inspiration.”
The thinking is that visual search is inherently monetizable because of the lean-forward commercial intent built into its activation. Actively holding up one's phone to identify real-world items flows naturally into transactional outcomes, making it a strong fit for brand marketing.
Amplifying these benefits is another big factor: Generation Z. Gen Z has a high affinity for the camera as a way to interface with the world, and that affinity will only grow as the cohort gains purchasing power and phases into the adult consumer population. The oldest Gen Zers are almost 30.
Lastly, all of the above accelerated in the Covid era. That goes for general digital transformation and the rise of eCommerce during lockdowns. But it also applies to the post-Covid era, which will see more technologies that blend the physical and digital. That bodes well for visual search.
So if we synthesize visual search’s benefits, they include…
– Easy to comprehend
– Tangible utility (like web search)
– Broadly applicable (like web search)
– High-frequency (like web search)
– Broad appeal
– High-intent use case (monetizable)
– Gen Z-aligned
– Covid-accelerated
– Google-accelerated
By the Numbers
All of the above advantages are factored into ARtillery Intelligence's market sizing for visual search revenue. But first, how will it be monetized? In short, much like web search: companies will bid for placement in sponsored results that accompany organic results.
Of course, the dynamics will differ from web search in that results pages (SERPs) won't include several ad slots; ad inventory will be limited. That scarcity, plus the high-intent use cases outlined above, means sponsorship will carry a premium.
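The report doesn't spell out the auction mechanics, so the single-slot, second-price format in the sketch below is just an illustrative assumption of how bidding for scarce sponsored inventory could work.

```python
# A toy illustration of bidding for scarce visual search inventory: one
# sponsored slot per result, awarded to the highest bidder. The second-price
# charge is an assumption for illustration; the report doesn't specify a format.
from dataclasses import dataclass


@dataclass
class Bid:
    advertiser: str
    amount: float  # max willingness to pay per sponsored impression


def run_single_slot_auction(bids: list[Bid]) -> tuple[str, float]:
    """Award the lone sponsored slot to the top bidder at the runner-up's price."""
    if len(bids) < 2:
        raise ValueError("Need at least two bids to set a clearing price.")
    ranked = sorted(bids, key=lambda b: b.amount, reverse=True)
    winner, runner_up = ranked[0], ranked[1]
    return winner.advertiser, runner_up.amount


if __name__ == "__main__":
    # Hypothetical advertisers and bid amounts.
    bids = [Bid("BrandA", 2.40), Bid("BrandB", 1.90), Bid("BrandC", 1.10)]
    winner, price = run_single_slot_auction(bids)
    print(f"{winner} wins the sponsored result at ${price:.2f}")
```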
With all that in mind, ARtillery Intelligence projects in its recent mobile revenue forecast that visual search will grow from $166 million last year to $2.26 billion in 2026. Though it’s under-monetized today, it will grow to a leading share of mobile AR ad revenue by 2026.
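For a sense of scale, here's the back-of-envelope math that forecast implies. Only the dollar figures come from the report; the five-year span (2021 to 2026) is an assumption based on "last year" in the excerpt.

```python
# Back-of-envelope math on the forecast: $166M -> $2.26B. The five-year span
# is an assumption; only the start and end values come from the report.
def implied_cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate implied by a start value, end value, and span."""
    return (end_value / start_value) ** (1 / years) - 1


start, end = 166e6, 2.26e9
print(f"Growth multiple: {end / start:.1f}x")                            # ~13.6x
print(f"Implied CAGR over 5 years: {implied_cagr(start, end, 5):.0%}")   # ~69%
```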
Why is visual search under-monetized? It's not as established as other AR ad formats such as social lenses, and visual search players like Google are still refining the UX and building consumer traction before they flip the monetization switch. Google Lens already sees 10 billion searches per month.
Speaking of Google, it will dominate visual search just as it does web search. But there will be ample opportunity for more specialized players like Snap Scan and Pinterest Lens, which will focus on a narrower set of high-value use cases such as fashion and food discovery.
We’ll pause there and circle back in the next installment with more visual search analysis and examples.