Microsoft’s Bing today announced an update to its autosuggest feature that expands the number of categories that will trigger it. Bing first introduced this tool in May for celebrities, politicians, athletes and publicly available LinkedIn profiles. Today, it is adding brands, movies, albums, places, software, sports teams, animal species “and more.”
Just like in its first iteration, Bing can disambiguate searches right in the search box, before you even start a search. Say you search for “pitbull.” Bing doesn’t know whether you mean the dog breed or the artist, so it gives you the opportunity to specify what you’re looking for right in the search box. Thanks to its Satori entity engine, which is similar in ambition to Google’s Knowledge Graph, Bing can understand the difference between the two and then show you the right search results.
For some results, Bing also shows relevant information right in the search box so “you have an initial sense of the results you’ll get when you click through.”
As Microsoft’s Stefan Weitz told me when the company first launched this feature, the company’s goal is to “make sense of the physical world by using the digital world as a very high-definition proxy.” Google, in a way, is trying to do the same thing with its Knowledge Graph, though the two projects are taking different paths to get there. Microsoft hopes that one day you’ll be able to use Bing to find out anything about any object. Today’s update moves it yet another small step in that direction.