Google’s new generative AI lets you preview clothes on different models

Google, ever eager to lean into generative AI, is launching a new shopping feature that shows clothes on a lineup of real-life fashion models.

Part of a wide range of updates to Google Shopping rolling out in the coming weeks, Google’s virtual try-on tool for apparel takes an image of clothing and attempts to predict how it would drape, fold, cling, stretch and form wrinkles and shadows on a set of real models in different poses.

Virtual try-on is powered by a new diffusion-based model Google developed internally. Diffusion models — which include the text-to-art generators Stable Diffusion and DALL-E 2 — learn to gradually subtract noise from a starting image made entirely of noise, moving it closer, step by step, to a target.
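
For readers unfamiliar with the mechanics, here is a minimal, illustrative sketch of that denoising loop in Python. Nothing here reflects Google’s actual model: `predict_noise` is a placeholder for a trained neural network (which in a try-on system would presumably be conditioned on the garment image and the target pose), and the update rule is a toy simplification of the noise schedules real diffusion models use.

```python
# Toy sketch of the reverse-diffusion loop: start from pure noise and
# repeatedly subtract the model's estimate of the remaining noise.
import numpy as np

def predict_noise(noisy_image, step):
    # Hypothetical stand-in for a trained denoiser network.
    # A real model would take conditioning inputs (garment, pose) as well.
    return np.zeros_like(noisy_image)

def generate(shape=(256, 256, 3), steps=50, seed=0):
    rng = np.random.default_rng(seed)
    image = rng.standard_normal(shape)          # start from an image of pure noise
    for step in reversed(range(steps)):
        noise_estimate = predict_noise(image, step)
        image = image - noise_estimate / steps  # move a small step toward the target
    return image

sample = generate()
print(sample.shape)  # (256, 256, 3)
```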

Google trained the model using many pairs of images, each including a person wearing a garment in two unique poses — for instance, an image of someone wearing a shirt standing sideways and another of them standing forward. To make the model more robust (i.e., combat visual defects like folds that look misshapen and unnatural), the process was repeated using random image pairs of garments and people.
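
The pairing described above could be assembled along the lines of the sketch below. The data structures, field names and helper functions are assumptions made for illustration, not Google’s actual training pipeline.

```python
# Illustrative sketch: build (source pose, target pose) training pairs for the
# same person wearing the same garment, plus random garment/person pairs for robustness.
import itertools
import random
from dataclasses import dataclass

@dataclass
class Photo:
    person_id: str
    garment_id: str
    pose: str
    pixels: object  # image data, omitted here

def paired_examples(photos):
    """Yield (source, target) pairs: same person, same garment, two different poses."""
    key = lambda p: (p.person_id, p.garment_id)
    for _, group in itertools.groupby(sorted(photos, key=key), key=key):
        group = list(group)
        for a, b in itertools.permutations(group, 2):
            if a.pose != b.pose:
                yield a, b  # e.g. sideways shot -> forward-facing shot

def random_examples(photos, n):
    """Robustness pass: pair random garment photos with random person photos."""
    for _ in range(n):
        yield random.choice(photos), random.choice(photos)
```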

Image: Google’s new AI-powered try-on feature, which taps generative AI to adapt clothing to different models.

Starting today, U.S. shoppers using Google Shopping can virtually try on women’s tops from brands including Anthropologie, Everlane, H&M and LOFT. Look for the new “Try On” badge on Google Search. Men’s tops will launch later in the year.

“When you try on clothes in a store, you can immediately tell if they’re right for you,” Lilian Rincon, senior director of consumer shopping product at Google, wrote in a blog post shared with TechCrunch. She cites a survey showing that 42% of online shoppers don’t feel represented by images of models, while 59% are dissatisfied with an item they shopped for online because it looked different on them than expected.

“You should feel just as confident shopping for clothes online,” Rincon continued.

Virtual try-on tech isn’t new. Amazon and Adobe have been experimenting with generative apparel modeling for some time, as has Walmart, which since last year has offered an online feature that uses customers’ photos to model clothing. AI startup AIMIRR takes the idea a step further, using real-time garment rendering technology to overlay images of clothing on a live video of a person.

Google itself has piloted virtual try-on tech in the past, working with L’Oréal, Estée Lauder, MAC Cosmetics, Black Opal and Charlotte Tilbury to allow Search users to try on makeup shades across an array of models with various skin tones.

Image Credits: Google

But as generative AI increasingly encroaches on the fashion industry, it’s been met with pushback from models who say it’s exacerbating long-standing inequalities.

Models are largely low-paid independent contractors, on the hook for high agency commission fees (~20%) as well as business expenses including plane tickets, group housing and the promotional materials required to land jobs with clients. And, reflecting biased hiring preferences, they’re pretty homogeneous: according to one survey, 78% of models in fashion adverts were white as of 2016.

Levi’s, among others, has tested tech to create customized AI-generated models. The company defended it in interviews, saying it would “increase the diversity of models shoppers can see wearing its products.” But it didn’t respond to critics who asked why the brand didn’t simply recruit more models with the diverse characteristics it’s seeking.

In the blog post, Rincon stressed that Google opted to use real models — and a diverse range of them, spanning sizes XXS to 4XL and representing different ethnicities, skin tones, body shapes and hair types. But she didn’t address the elephant in the room: whether the new try-on feature might lead to fewer photo shoot opportunities for models down the line.

Image: To train the generative AI model behind its new try-on feature, Google assembled a data set of models and garments.

Coinciding with the rollout of virtual try-on, Google is launching filtering options for clothing searches powered by AI and visual matching algorithms. Available within product listings on Shopping, the filters let users narrow their searches across stores using inputs like color, style and pattern.

“Associates can help with this in a store, suggesting and finding other options based on what you’ve already tried on,” Rincon said. “Now you can get that extra hand when you shop for clothes online.”