Immersive shopping is proving to have experiential impact for consumers and revenue impact for brands. Related to – but separate from – AR advertising, this is when AR is used as a tool to visualize and contextualize products, engendering more informed consumer purchases.
This is a subset of AR that we call camera commerce. It comes in a few flavors, including visualizing products on “spaces and faces.” It also includes visual search – pointing one’s smartphone camera at a given product to get informational, identifying, or transactional overlays.
In each case, AR brings additional context and confidence to product purchases. And this value has been elevated during a pandemic, as AR brings back some of the product dimension and tactile detail that’s been taken away from consumers during retail lockdowns.
Synthesizing these factors, ARtillery Intelligence recently produced a report to dive into the drivers and dynamics of camera commerce. How is the field shaping up? Who’s doing what? And how big is the market opportunity? We’ve excerpted the report below for AR Insider readers.
Search What You See
After looking at Google’s “search what you see” play (Google Lens) last week, what other players are innovating? Is there room for specialty players in business verticals, or in horizontal use cases like shopping? If so, a clear candidate is increasingly becoming evident: Pinterest.
To set the stage on Pinterest’s overall positioning, AR is one of many initiatives as it continues to enjoy notable growth and Wall Street performance. This was seen in its Q4 2020 earnings where it achieved 76 percent year-over-year revenue growth, partly riding the Covid-era eCommerce wave.
The number of advertisers on Pinterest meanwhile grew 6x, product searches grew 20x, and active users grew 37 percent to 459 million. That engagement centers on its flagship pins and boards for sharing visual media like food and fashion, which form the foundation for its AR opportunity.
Specifically, Pinterest Lens is its visual search feature that lets users point their phones at items to identify them. Last year, it extended its “Shoppable Pins” functionality to Lens so that visual search could seamlessly lead into transactions. This fits directly into Pinterest’s broader ethos.
“We see shopping as this bridge between the two halves of our mission, inspiration and action,” Pinterest CEO Ben Silbermann said during the company’s Q4 earnings call. “For pinners, we’ve made progress by expanding the number of surfaces to let them shop.”
Training Set
Going deeper, Pinterest Lens recognizes 2.5 billion products, and its engagement continues to grow. To achieve this, Pinterest draws on the visual product database it has developed over years of user-pinning behavior, which serves as a sort of AI training set for visual object recognition.
If this sounds familiar, it’s similar to how Google Lens works. As noted last week, Google is positioned well for visual search because it has served for 20+ years as the world’s search engine, including images. Pinterest can’t rival that breadth of imagery, but maybe it doesn’t need to.
In other words, Pinterest’s visual database is strong where it needs to be: shoppable products. This is narrower than Google’s “all the world’s information” mission, but it aligns with monetization. That could induct Pinterest into the small club of revenue-generating AR players.
In that way, visual search supports Pinterest’s road map to increase ad inventory by making the physical world “pinnable.” That’s particularly true in Pinterest-strong verticals like fashion, home goods, and food, where these products surround us and can trigger shoppable engagement.
Pinterest is likely closer to that point than anyone else, given its established use case for product-based image search. This means that the leap to AR isn’t a big one. Camera commerce in the above product categories is a natural extension to everything Pinterest has already built.
In View
The remaining question is how visual search will gain traction with a broader base of mainstream consumers. Is holding up your phone easier than typing or speaking a search query? It depends on the subject: holding your phone up to a lamp, for example, is easier than describing it in words.
But visual search isn’t culturally mainstream yet, and it requires a behavioral shift that’s physical in nature (holding up a phone). History has taught us that this is a difficult and slow-moving process. And it will only apply in situations where the subject is in view or in proximity, rather than recalled from memory.
Visual search will continue to gain traction as a feedback loop reinforces its value and reliability. This will be accelerated by camera-native Gen Z as they gain purchasing power. Whether by Pinterest, Google, or Snap, visual search has strong potential as a camera-commerce format.