
When it comes to AR efforts among tech giants, Google isn’t always noted at the top of the list, but it could be better positioned than anyone. For example, its image-recognition capabilities, honed through years of knowledge-graph indexing, make it a leader in visual search – a flavor of AR.
Visual search – identifying objects you point your camera at – checks all the boxes for a potential killer app. It’s a utility that’s activated with high frequency (just like web search). And it’s highly monetizable, given the explicit intent present in every user interaction (just like web search).
Consequently, Google is driven to invest in visual search to future-proof its core search business. But it’s not the only one. Visual search is a breakout feature of Ray-Ban Meta smart glasses and their multimodal AI capabilities. And Apple’s getting into the game with Visual Intelligence.
As you can tell from these examples, visual search is one area where AR continues to converge with AI. Object recognition is heavily dependent on machine learning and large-scale visual-database training. Again, that’s one reason Google and its knowledge graph are in prime position.
That traction is already evident: Google Lens – the company’s visual search app – is used 20 billion times per month. Building on that foundation, a few developments emerged in the past week that advance Google’s visual search and broader AR efforts – mostly around shopping.
Vision Match
Taking those updates one at a time, Google’s new Vision Match feature lets users enter prompts to describe desired fashion gear. In addition to returning an imaginative image, a la generative AI, Google will search its shopping database to match the image with related gear.
What makes this different from standard Google search is that an image is generated based on the user’s search terms. That image is then presented to the user as the basis for the visual search that follows. Think of it as if generative AI and visual search had a baby.
To use it, searchers can type product attributes into the search bar and then scroll to the “Can’t find it? Create it” button. They can also end up in the same place through a “Create & Shop” option on the left panel of Google’s Shopping tab. From there they can start entering prompts.
Using an example provided by Google, users can enter descriptive terms like “colorful midi dress with big daisies.” That prompt will then generate a made-up image, as noted, which serves as a sort of quick feedback loop for users to validate that it’s what they’re looking for.
That image is then used to find the right real-life products using image matching and object recognition – similar to the tech behind Google Lens. Altogether, it takes keyword search and fuses it with generative AI and visual search for a new spin on shopping and product discovery.
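To make the pipeline concrete, here’s a heavily simplified sketch of how a generated image could be matched against a product catalog. Everything here is hypothetical: real systems like Lens rely on learned image embeddings from deep neural networks and large-scale indexes, whereas this toy version uses hand-picked stand-in vectors and plain cosine similarity.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Toy product catalog: name -> precomputed image embedding (stand-in values).
CATALOG = {
    "floral midi dress": [0.9, 0.8, 0.1],
    "plain black dress": [0.1, 0.2, 0.9],
    "daisy-print skirt": [0.8, 0.7, 0.2],
}

def match_products(query_embedding, catalog, top_k=2):
    """Rank catalog items by similarity to the generated image's embedding."""
    ranked = sorted(
        catalog.items(),
        key=lambda item: cosine_similarity(query_embedding, item[1]),
        reverse=True,
    )
    return [name for name, _ in ranked[:top_k]]

# Pretend a text-to-image model rendered the prompt "colorful midi dress
# with big daisies" and an image encoder produced this embedding:
generated_image_embedding = [0.85, 0.75, 0.15]

print(match_products(generated_image_embedding, CATALOG))
# -> ['floral midi dress', 'daisy-print skirt']
```

The point of the intermediate image is the feedback loop: the user confirms the generated picture matches their intent before the system spends effort retrieving real products that embed near it.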
Comprehensive Simulation
Sticking with the visual aspects of this latest development, Google has coupled it with a few additions to its AR beauty feature. For those unfamiliar, this is an existing Google feature that lets users virtually try on cosmetics from a range of brands including E.L.F., Fenty, and Glossier.
With the latest update, they can try on multiple products at the same time. This offers a more comprehensive simulation of beauty try-ons. It’s also truer to life, where users can try on lipstick and eyeshadow to see how they look together, rather than visualizing them one at a time.
The update also brings more natural language search. Before getting to the virtual try-on stage, users first have to find the products they want. And they can do so by describing specific styles and looks in natural language – such as “soft glam” – rather than rigid keywords or color attributes.
Lastly, Google is expanding its virtual try-on experience, which lets searchers see how garments look on a variety of models and body types. First launched in 2023 to simulate garment sizing and overall look, the feature now supports several new brands and items.
As for when you can see all of the above, the updated cosmetic try-ons are available now in Google search, while Vision Match is limited to Google Labs users in the U.S. But depending on its performance, you could soon see it more commonly in a Google search near you.
