Google’s new “Try On” feature promises to take the guesswork out of online clothes shopping—no more wondering if those high‑waisted jeans will fit right or if that summer dress will hang the way it looks on a model. At Google I/O in May, the company offered a sneak peek at an AI‑powered tool designed to let shoppers literally see themselves in apparel items. Now, after a brief limited preview, that experiment has graduated into a full rollout for U.S. users across Google Search, Google Shopping, and Google Images. Here’s what you need to know about putting yourself front and center in your next shopping spree.
The essence of “Try On” is deceptively simple: upload a single, full‑body photo of yourself, and the AI will drape your digital likeness in any garment that appears on Google’s shopping surfaces. Tap the “try on” icon on a product listing—on Google Search, Google Shopping, or any apparel result in Google Images—and you’ll be prompted to pick a photo from your camera roll or take a quick selfie. In moments, you’ll see a rendered image of how those gingham-print pants or that trench coat might look on you. You can scroll through different views you’ve tried, save your favorites, or share them with friends for a second opinion.
Powering this experience is a custom image‑generation model Google says was trained to account for how different materials fold, stretch, and drape across a variety of body shapes. Rather than slap a flat “sticker” of clothing onto your body, the model simulates the physics of fabrics—silk will shimmer and crease differently than denim, and a knit sweater will hug curves in a more forgiving way. According to Google, the AI taps into billions of items in its Shopping Graph, a vast index of product metadata and imagery. That breadth means it should recognize most pieces from popular retailers right away, though you might occasionally encounter an item that trips up the model—think highly intricate lace or avant‑garde silhouettes that aren’t well represented in its training data.
Alongside the virtual try‑on capability, Google has supercharged its price‑tracking tools. Previously, you could track price changes on a product page; now, within the Try On interface, you can specify not only the size and color you want but also the exact price point that makes sense for your budget. Hit “track price,” enter your desired threshold, and Google will ping you the moment that bag, pair of sneakers, or floral midi dress dips below that amount. “The Shopping Graph has products and prices from all across the web—so we’ll let you know when there’s an offer that meets your criteria,” Google explains. No more bookmarking a page and nervously checking back—set an alert and carry on with your day.
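Google hasn’t published how its alert matching works under the hood, but conceptually it behaves like a threshold filter over offers in the Shopping Graph: for each tracked item, keep only offers that match your chosen size and color and fall at or below your price. The sketch below is purely illustrative—the `PriceAlert` type and the offer fields are invented for this example, not part of any Google API:

```python
from dataclasses import dataclass

@dataclass
class PriceAlert:
    """A saved alert: notify when an offer for this item drops to or below the target price."""
    item: str
    size: str
    color: str
    target_price: float

def matching_offers(alert: PriceAlert, offers: list[dict]) -> list[dict]:
    """Return offers that satisfy the alert's size, color, and price threshold."""
    return [
        o for o in offers
        if o["size"] == alert.size
        and o["color"] == alert.color
        and o["price"] <= alert.target_price
    ]

# Hypothetical tracked item and a batch of offers seen across retailers.
alert = PriceAlert(item="floral midi dress", size="M", color="blue", target_price=40.00)
offers = [
    {"retailer": "StoreA", "size": "M", "color": "blue", "price": 45.99},  # too expensive
    {"retailer": "StoreB", "size": "M", "color": "blue", "price": 38.50},  # triggers the alert
    {"retailer": "StoreC", "size": "S", "color": "blue", "price": 35.00},  # wrong size
]
hits = matching_offers(alert, offers)
```

Here only the StoreB offer survives the filter, which is the moment a real system would send the notification.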
To start, U.S. users should look for a small, unassuming “try on” icon (a stylized outline of a shirt or trousers) on product listings in Search results, under the “Shopping” tab, and on image results tagged as apparel. Tap it, upload your photo, and voilà—you’re modeling. The rollout began in late July 2025, and Google says it should reach the majority of U.S. shoppers by the end of the week.
Understandably, giving an app a full‑body photo raises questions about data use and privacy. Google assures users that your uploaded image remains on your device; it’s not stored in your Google account or used for any other purposes. The AI processing happens client‑side or is ephemeral, meaning your likeness doesn’t become fodder for training more models or targeted ads. If you ever want to revoke access, simply delete the photo from the Try On interface and clear your browser cache or app data.
While Google’s AI is surprisingly adept at rendering everyday styles—denim, T‑shirts, basic knits—more complex garments can pose a challenge. Shoppers trying on pleated skirts with elaborate patterns, multi‑layered outfits, or highly reflective fabrics may see skewed results or artifacts. Google acknowledges that it’s a work in progress and encourages user feedback on mismatches or glitches. Most items from major retailers (think Macy’s, Kohl’s, H&M) should work smoothly, but niche brands or bespoke labels might not be immediately supported.
The Try On launch is only the first act in Google’s roadmap. Later this year, the company plans to integrate more AI‑driven shopping experiences into “AI Mode,” the dedicated tab it unveiled for Search this spring. By fall, you’ll not only be able to try on clothes but also seek curated outfit ideas and interior‑design suggestions directly from an AI chat interface. Imagine asking, “Show me three ways to style these boots for fall,” or “What pillows would go with this sofa?” and getting shoppable visuals in return. It’s a direct challenge to inspiration-heavy platforms like Pinterest—only now, the images are not just aspirational, they’re immediately purchasable.
E‑commerce has long struggled with one glaring drawback: you can’t touch, feel, or try on a product before you buy. High return rates—often upwards of 20–30% for apparel—are a costly consequence of that uncertainty. By giving shoppers a clearer sense of fit and style before checkout, Google aims to reduce returns, boost retailer confidence, and ultimately make online shopping more efficient for everyone. For consumers, it means less guesswork—and maybe fewer awkward return labels.
If you’re in the U.S. and ready to play virtual dress‑up, head to Google Search or Shopping on your mobile device, find a clothing item you like, and look for that “try on” icon. Upload your best full‑body shot—ideally taken in good lighting against a plain background—and see yourself in that next favorite shirt, jacket, or pair of pants. Then set a price alert, kick back, and wait for the AI to tell you when the deal is just right. Happy shopping!
Discover more from GadgetBond