Pull up a photo of yourself, tap “Try it on,” and watch your sneakers turn into stilettos — or at least a pretty convincing version of them. Google this week expanded the little bit of magic it’s been slipping into shopping results: the company now lets you virtually “try on” shoes from Google Shopping using generative AI, and it’s also opening the wardrobe a little wider internationally. The result feels at once futuristic and a little uncanny — a fitting snapshot of where online shopping tech is headed.
What it does (and how it looks)
Here’s how it works: when you find a pair of shoes on Google Shopping, a new “Try it on” button can generate a preview of how those shoes would look on you. You upload a full-length photo, Google’s model detects the footwear currently in the frame, and then it replaces it with the pair you’re browsing — sneakers, sandals, heels — rendered to match the pose, lighting and angle of your picture. Google’s examples show white trainers being swapped for several different designs, including black open-toe heels.
Crucially, Google says you don’t have to upload detailed photos of your feet. The same family of tools that powers the try-on can generate realistic feet and lower legs when needed, so the system will fill in whatever it takes to make the shoe sit believably in the shot. That’s useful — and a little weird. The company has been experimenting with this “selfie-to-try-on” approach for months and launched broader clothing try-on features earlier this summer.
This feature leans on the same technical playbook as Doppl, an experimental Google Labs app that lets you try on outfits and even generates short AI clips of you wearing them. Doppl made waves this year because, while it often produces impressive drape and motion for clothes, it sometimes invents details to complete a look — including “fake” feet or unexpected changes to garments. Testers and reviewers flagged those oddities quickly: Doppl can be fast and fun, but it’s still a generative system that hallucinates to fill gaps.
Why retailers like this (and why they should)
From a merchant’s perspective, this is a tidy trick: when a customer can quickly visualise a shoe on their own body, they’re more likely to feel confident about buying. Google argues this reduces friction and returns by helping shoppers pick things that look right for them. For smaller brands and marketplaces, it’s a chance to get product photos in front of people in a much more personal way — and for Google, it’s another reason to keep users inside its Shopping graph.
But don’t mistake pretty pictures for a perfect fit
There are two obvious limits. First, a convincing visual is not the same as a good fit. AI can render how a heel looks from a photo, but it can’t measure arch height, foot width, or whether a loafer will pinch after a mile. Second, the generative step — where the model invents feet or adjusts clothes — can introduce errors: proportions that are off, fabric that doesn’t behave like the real thing, or shoes that look convincing in still images but betray themselves in motion. Early hands-on reviews and tests of Doppl showed these exact quirks — fun and occasionally eerie, but not yet a replacement for trying shoes on in person.
Privacy, biometric data and the awkward ethics of “your” picture
Uploading a full-body photo to a commercial service raises privacy questions worth thinking about. A body photo can reveal biometric markers (face shape, posture, gait) and — depending on where you live — that can fall into legally sensitive territory. Privacy advocates and legal analysts note that companies need to be transparent about how long they keep images, whether they use them to train models, and how they protect that data. Google presents Doppl and the “Try it on” features as experimental and AI-powered; still, it’s reasonable for users to ask for clarity on retention, reuse, and whether their images feed future model training.
So what should shoppers do?
Treat the tool like a style assistant, not a measuring tape. Use the preview to get a sense of color, heel height and overall silhouette, but double-check size charts, reviews, and return policies before you click “buy.” If a company’s virtual try-on lets you save or share images, consider whether you’re comfortable with that photo existing in somebody else’s cloud. For now, the tech is best at lowering uncertainty about how something looks; it’s not a substitute for trying things on when fit matters.
The bigger picture
Google’s push is part of a broader move to fold generative AI into everyday shopping: imagine search results that don’t just point you to a listing but show you a version of the product personalized to your body, style and preferences. That is enticing for consumers and profitable for platforms. It also amplifies old debates about data, consent and the accuracy of algorithmic representations. Expect retailers, regulators and consumers to test the technology in the months ahead — how they respond will shape whether virtual try-ons become a helpful accessory or another source of friction in e-commerce.
