Google is turning Google Photos into a virtual stylist. The new wardrobe feature lets users see outfit combinations pulled from their own photo library, then try them on digitally before getting dressed.

The tool works by scanning photos you've already taken and identifying the clothes you've worn. It then suggests different ways to mix and match those pieces. You can preview how outfits look on a virtual mannequin or in a fitting-room view before committing to wearing them in real life.

This lands just over a week after Google added AI-powered "beautification" filters to the same app, showing the company's aggressive push to embed generative AI into everyday tools. Google Photos isn't just organizing your memories anymore. It's becoming a style consultant that knows your actual wardrobe.

The wardrobe feature taps into something obvious but underdeveloped: your phone already contains a visual record of everything you own. Most people never leverage that data. Google is betting they will if the friction disappears. No need to think about what goes with what. The app does it for you, based on patterns in your own behavior.

It's a play aimed at both convenience and consumption. Make outfit planning frictionless, and people might buy more clothes knowing the app will help them use what they have. Or they might buy fewer. Either way, Google gets deeper into the daily rhythms of how you dress.