A new 20% time project has just launched at Google called Google Similar Images. It’s pretty self-explanatory: when you search for an image and find one close to what you’re looking for, Google can now surface others it believes to be the same, or similar. Like.com has offered this kind of visual search for a while, but it’s potentially much more powerful at Google’s scale, since the search giant indexes hundreds of millions of pictures of just about anything you can imagine from across the web.
But it’s interesting that this is apparently not done with the kind of recognition technology that Apple, for example, uses in the newest version of iPhoto to scan faces in images and determine whether other images contain the same people. Because Google already has a vast database of web images and plenty of metadata, thanks to projects like Google Image Labeler (the game in which you tag photos) and, more recently, the ability to sort image results by color, it has a number of different data points it can draw on to make Similar Images work.
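To make the idea of a non-recognition-based signal concrete, here is a minimal sketch (not Google’s actual system, and the images, bin counts, and function names are all hypothetical) of one such data point: comparing two images by coarse color histograms, the kind of feature that powers sorting results by color.

```python
# Hypothetical sketch: comparing "images" (lists of (r, g, b) pixel tuples)
# by coarse color histograms. This is an illustration of a color-based
# similarity signal, not Google's implementation.

def color_histogram(pixels, bins_per_channel=4):
    """Quantize each RGB pixel into a coarse bin and count occurrences."""
    step = 256 // bins_per_channel
    hist = [0] * (bins_per_channel ** 3)
    for r, g, b in pixels:
        idx = ((r // step) * bins_per_channel ** 2
               + (g // step) * bins_per_channel
               + (b // step))
        hist[idx] += 1
    total = sum(hist)
    # Normalize so images of different sizes are comparable.
    return [h / total for h in hist]

def histogram_similarity(h1, h2):
    """Histogram intersection: 1.0 means identical color distributions."""
    return sum(min(a, b) for a, b in zip(h1, h2))

# Two mostly-red "images" should score higher against each other
# than either does against a mostly-blue one.
red_a = [(250, 10, 10)] * 90 + [(200, 200, 200)] * 10
red_b = [(240, 20, 20)] * 80 + [(180, 180, 180)] * 20
blue = [(10, 10, 250)] * 100

sim_red = histogram_similarity(color_histogram(red_a), color_histogram(red_b))
sim_blue = histogram_similarity(color_histogram(red_a), color_histogram(blue))
```

A real system would combine many such signals (color, metadata, surrounding page text) rather than relying on any single one, which is presumably why the vast image database matters so much here.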
Let’s take this new toy for a spin. “Apple” is a good query to test with, since it returns both the company and the fruit. Once you find an image close to what you’re looking for, you click “Similar images” and get more of the same. This is where Google’s technology takes over, searching for images it believes to be similar. For example, if I decide I want the silver version of the Apple logo, I’ll see this:
But maybe I want another version of the Apple logo shown on the initial page. In that case, I may see this:
Or maybe I decide that I don’t want an Apple logo at all. Maybe I just want a picture of an apple, the fruit. Then I may see this:
As you can see, the filtering works very well. Here’s an interview with Radhika Malpani, the director of engineering working on this project.
Watch more in the video from Google below: