Adobe’s Project Stardust is a sneak preview of its next-gen AI photo editing engine

Adobe’s Project Stardust AI photo editor already leaked thoroughly earlier this month, but at its MAX conference, the company has now officially launched the project as a sneak preview of what a next-generation AI-powered photo-editing engine could look like. Powered by Firefly Model 2, which is also launching today, Stardust lets users easily remove objects and people from a scene, change backgrounds and more. The idea, at its core, is to let anyone get creative with their image editing using Adobe’s AI tools.

Just to be clear: for now, this is only what Adobe would consider a “sneak,” a public preview of some of the technology it is cooking up behind the scenes that may or may not ever make it into a final product. In this case, though, because Stardust essentially takes a lot of existing Firefly-based AI tools and puts a new wrapper around them, we’ll likely see more of it in the near future. It combines Adobe’s object-recognition models with existing AI-powered features like generative fill (once you move an object, something has to take its place, after all).

In many ways, it’s similar to what Google is trying to do with its Magic Editor on Android. Both tools aim to make what used to be painstaking image editing work seem easy.

Adobe’s director of product management for Project Stardust gave me a brief live demo of the service last week. To get started, users can upload their own photo or have Firefly create one. Firefly then automatically analyzes the image in the background and creates layers for the various objects it finds. From there, moving things around is simply a matter of dragging and dropping, with the AI tools filling in the blanks. Just like in Photoshop today, it’s also easy enough to add new objects to a scene, with the service providing four different options for every prompt. In total, Adobe said, more than a dozen different AI models work together to power Stardust’s various features.
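Adobe hasn’t published any of Stardust’s internals, but the workflow described in the demo (auto-segmenting a photo into per-object layers, then generatively filling the hole an object leaves behind) maps onto a well-known pattern. Purely as an illustration of that pattern, here is a minimal sketch using open-source stand-ins; the model choices (Mask2Former for segmentation, Stable Diffusion for inpainting), the file names and the single-object selection are assumptions for the example, not anything Adobe has confirmed:

```python
# Illustrative sketch only -- not Adobe's implementation. This shows the
# generic "segment into layers, then inpaint the gap" pattern the article
# describes, using open-source stand-ins.
from PIL import Image
import torch
from transformers import pipeline
from diffusers import StableDiffusionInpaintPipeline

# Resize to 512x512, the native resolution of this inpainting model
# (a real editor would preserve the aspect ratio).
image = Image.open("photo.jpg").convert("RGB").resize((512, 512))

# Step 1: detect objects and treat each mask as its own editable "layer".
segmenter = pipeline(
    "image-segmentation",
    model="facebook/mask2former-swin-base-coco-panoptic",
)
layers = segmenter(image)  # list of {"label", "score", "mask" (PIL image)}

# Step 2: "delete" one object by inpainting the region it occupied --
# the rough equivalent of generative fill after a layer is moved or removed.
mask = layers[0]["mask"]  # white where the chosen object sits
inpainter = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting",
    torch_dtype=torch.float16,
).to("cuda")
result = inpainter(
    prompt="empty background, photorealistic",
    image=image,
    mask_image=mask,
).images[0]
result.save("edited.jpg")
```

A production tool would layer many more models on top of this (Adobe cites over a dozen), but the two-step structure, recognize objects first and synthesize pixels second, is the core of what the drag-and-drop demo showed.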

Maybe it’s a sign of how quickly this technology is moving that I watched the demo and wasn’t blown away by something that would undoubtedly have felt like magic only a few years ago. Now, with Google already demonstrating similar capabilities (if not actually releasing them), the discussion seems to have moved beyond being wowed by the technology to what it means for photography in the long run.