This post is adapted from ARtillery Intelligence’s latest report, AR Commerce: Monetization Comes Into View, and includes some of its data and takeaways. More can be previewed here, and you can subscribe for the full report.
Many questions surround the sometimes-overhyped AR sector. The biggest is how it will make money. There are several answers to that question, including enterprise productivity, AR advertising, gaming (e.g., in-app purchases) and others we’ve examined in past reports.
But the area of AR monetization that’s perhaps most logical – and already underway – is commerce. This is the segment of AR in which graphical overlays inform consumers while they shop, providing contextual product information that guides and incentivizes purchases.
AR commerce will be an impactful technology for consumer spending: ARtillery Intelligence projects that $6.1 billion in annual transaction value will flow through AR interfaces by 2022. In other words, AR will be used somewhere in the consumer shopping journey for that volume of transactions.
Picking up where we left off last week, we’ll dive deeper into visual search – just one form of AR shopping – particularly its use outside of retail environments. That includes a look at what Google is doing with its visual search play, Google Lens, the sector’s heir apparent leader.
Google Lens: Organizing the World’s Imagery
The term “best of both worlds” was used earlier in this report to reference AR’s fusion of online and offline commerce. Here we’ll use it again in reference to Google’s efforts: while it pursues the in-store retail AR opportunity with VPS (its visual positioning service), it’s also capturing the e-commerce opportunity with Google Lens.
So when you’re away from VPS’ in-store forte, Lens can perform visual searches on a wider variety of products you encounter in the physical world. That includes the ability not only to get information on items, but also to purchase them through Google Shopping or its search advertisers.
Its use cases will expand, but currently include general-interest (dogs, flowers) and commercial (products) searches. In fact, just like online search, Lens will be a free utility for general-interest queries; the business case lies in the smaller share of commercial-intent searches.
For example, point your phone at a store or restaurant to get business details overlaid graphically. Point your phone at a pair of shoes you see on the street to pull up prices, reviews and purchase info. All of these use cases will draw on Google’s vast image database and knowledge graph.
“So now, when your friend is wearing a cool new pair of sunglasses, or you see some shoes you like in a magazine, you can use Lens to find them online and browse similar styles,” said Google’s Brian Rakowski in October. This style example has “legs” and Google knows it.
Google Lens will also use computer vision and machine learning to ingest and process text. For example, it will scan restaurant menus to search for the ingredients in a dish. It will do the same for street signs and other use cases that develop in logical – and eventually monetizable – ways.
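To make that concrete, here’s a minimal sketch of the text-recognition building block behind such features. Google Lens itself has no public API, so this sketch uses Google’s Cloud Vision API – a publicly available service with comparable OCR capability – purely as an illustration; the “menu.jpg” filename is a hypothetical stand-in.

```python
# Illustrative sketch only: Google Lens has no public API, so this uses
# Google's Cloud Vision API, which exposes a comparable OCR building block.
# Assumes the google-cloud-vision package is installed and credentials
# are configured; "menu.jpg" is a hypothetical example image.
from google.cloud import vision

def extract_text(image_path: str) -> str:
    """Run OCR on a photo (e.g. a restaurant menu) and return its text."""
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.text_detection(image=image)
    if response.error.message:
        raise RuntimeError(response.error.message)
    # The first annotation aggregates all text detected in the image.
    annotations = response.text_annotations
    return annotations[0].description if annotations else ""

print(extract_text("menu.jpg"))
```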
All of this can be considered an extension of Google’s mission to “organize the world’s information.” But instead of a search index and typed queries, it utilizes machine learning and computer vision to process visual queries captured (arguably more intuitively) by the camera.
“The camera is not just answering questions, but putting the answers right where the questions are,” said Google’s Aparna Chennapragada at Google’s I/O conference in May.
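At a high level, visual search of this kind follows a common pattern: embed the camera frame into a vector, then match it against an index of known items linked to labels or knowledge-graph entries. The sketch below is a toy illustration of that pattern – the random vectors and four-item index are stand-ins, not Google’s actual pipeline, which would use a trained image-embedding model and an approximate-nearest-neighbor index at massive scale.

```python
# Toy illustration of the generic visual-search pattern: embed a camera
# frame, then match it against an index of known items tied to labels.
# The random vectors and four-item index are hypothetical stand-ins; a
# real system would use a trained CNN embedding and an ANN index.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for an indexed image database plus knowledge-graph labels.
index_vectors = rng.random((4, 128))
index_labels = ["labrador retriever", "tulip", "running shoe", "espresso machine"]

def visual_search(query: np.ndarray, top_k: int = 1) -> list[str]:
    """Return the labels of the indexed items most similar to the query."""
    scores = (index_vectors @ query) / (
        np.linalg.norm(index_vectors, axis=1) * np.linalg.norm(query) + 1e-9
    )
    best = np.argsort(scores)[::-1][:top_k]
    return [index_labels[i] for i in best]

query_embedding = rng.random(128)  # stand-in for embed(camera_frame)
print(visual_search(query_embedding))
```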
Case Study: Test Driving Google Lens
As Google Lens and its machine learning brains develop, it’s being dispatched to more access points. First available on the Pixel 2, then in Google Photos on iPhone, it’s now in Google’s core iOS app. So we decided to test it on a variety of objects – everything from electronics to dogs.
The first thing that’s different about the latest iOS update is that Lens is activated right in the search box of Google’s app. Unlike the previous Google Photos integration on iOS (which required users to take a picture first), this version lets you simply tap on objects in the camera’s field of view.
This makes it easier and more dynamic, including walking around with the phone and identifying items on the fly. Its machine learning and image recognition go to work fairly quickly, tapping into Google’s knowledge graph to identify items as they come into the camera’s view.
Most of this is hidden until you actively decide what to identify. Object search prompts take the form of Google-colored dots that, when tapped, launch a card from the bottom of the screen. This employs Google’s familiar Material Design cards, seen in apps like Google Assistant.
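Mechanically, that dot-and-card pattern maps detected objects to tap targets. Here’s a toy sketch of the idea, assuming hypothetical detection boxes; a real app would get them from an on-device object detector and render an info card when a dot is tapped.

```python
# Toy sketch of the detection-to-tap-target pattern described above. The
# detection boxes are hypothetical; a real app would get them from an
# on-device object detector, draw a dot at each box center, and open an
# info card when a dot is tapped.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str
    box: tuple[float, float, float, float]  # (x_min, y_min, x_max, y_max), normalized

def tap_targets(detections: list[Detection]) -> list[tuple[str, float, float]]:
    """Place one tappable dot at the center of each detected object."""
    return [
        (d.label, (d.box[0] + d.box[2]) / 2, (d.box[1] + d.box[3]) / 2)
        for d in detections
    ]

dots = tap_targets([Detection("laptop", (0.1, 0.2, 0.5, 0.8))])
print(dots)  # e.g. [('laptop', ~0.3, ~0.5)] -> render dot; on tap, launch card
```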
We tested Lens on a range of household objects – laptops, packaged goods, even dogs – most of which it identified accurately. This range is what will make Google shine in visual search compared with, say, Amazon, which will excel in product searches but not general-interest ones.
Google is rightly holding users’ hands by promoting select categories and use cases, like dogs & flowers (fun) and identifying signage & menu items (useful). Drawing on its vast knowledge graph, Google will develop these vertical specialties and condition consumer behavior over time.
We found it works best on “structured” visual queries – packaged goods where it can pick out a logo, as opposed to generic items like a pillow. Products also represent the most reliable images (professionally shot, clean) in the database against which it matches objects.
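A quick way to see why logo-bearing products behave as “structured” queries: a logo is a discrete, nameable landmark that detectors can match with high confidence. The sketch below uses Cloud Vision’s logo detection as a public stand-in for this capability (again, not Lens itself); “soda_can.jpg” is a hypothetical image.

```python
# Hedged illustration of why logo-bearing packaged goods make "structured"
# queries. Uses Cloud Vision's logo detection as a public stand-in for the
# capability (not Lens itself); "soda_can.jpg" is a hypothetical image.
from google.cloud import vision

def detect_logos(image_path: str) -> list[tuple[str, float]]:
    """Return (brand, confidence) pairs for any logos found in the image."""
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.logo_detection(image=image)
    return [(logo.description, logo.score) for logo in response.logo_annotations]

# A branded can typically yields something like [("Coca-Cola", 0.9)];
# a plain pillow usually yields [] -- no discrete landmark to match.
print(detect_logos("soda_can.jpg"))
```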
It’s also interesting that the Google Lens button sits right next to voice search in the search bar. This is symbolic of their common goal. Like voice, visual search’s job is to create more search modalities and touch points. This equates to more query volume (meaning revenue) for Google.
Google will continue to invest heavily in AR for the above reasons. It’s off to a good start with a training-wheels approach: educating users and planting Lens in easy access points. Bringing it to the main Google iOS app is a big step in that direction. We’ll see many more in 2019.
See more details about this report or continue reading here.
For deeper XR data and intelligence, join ARtillery PRO and subscribe to the free AR Insider Weekly newsletter.
Disclosure: AR Insider has no financial stake in the companies mentioned in this post, nor did it receive payment for its production. Disclosure and ethics policy can be seen here.
Header image credit: Google