Emerging search engines such as Like.com allow searches via image queries, but are far from becoming mainstream. Most image search today works by analyzing the metadata around the image. So an image search on Google for “flower” returns results that have the tag “flower” in the photo metadata – 11.4 million results. But if you only speak Spanish, a search for “flores” returns just 2.2 million results.
Search engines are slowly beginning to allow cross-language queries to return more results to people who speak less popular languages. Google’s effort, launched earlier this year, handles just a dozen major languages and does not address image search.
PanImages, by contrast, supports over 300 languages. Users simply type a query, select the language they are speaking, and see a result set that includes translations. Clicking on a result returns Google Image and Flickr search results for that term.
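The flow described above — look up translations of the query term, then fan each translation out to existing image search engines — can be sketched roughly as follows. This is a minimal illustration, not PanImages' actual code; the translation table entries and URL patterns are assumptions for the example.

```python
from urllib.parse import urlencode

# Toy translation table standing in for PanImages' dictionary data.
# Entries here are hypothetical examples, not the real dataset.
TRANSLATIONS = {
    ("flower", "en"): {"es": "flores", "fr": "fleur"},
}

def cross_language_queries(term, source_lang):
    """Return the original term plus any known translations of it."""
    terms = [term]
    terms.extend(TRANSLATIONS.get((term, source_lang), {}).values())
    return terms

def image_search_urls(term):
    """Build image-search URLs for a single term (illustrative URL patterns)."""
    q = urlencode({"q": term})
    return [
        f"https://images.google.com/images?{q}",
        f"https://www.flickr.com/search/?{q}",
    ]

# Expand one English query into searches across all known translations.
queries = cross_language_queries("flower", "en")
urls = [u for t in queries for u in image_search_urls(t)]
```

In this sketch, a user who types “flower” effectively also searches for “flores” and “fleur”, which is why the combined result counts can be so much larger than a single-language search.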
The data itself is still somewhat thin, and users are asked to add translations when the site doesn’t already know a term. But the early results are useful, particularly for speakers of fairly obscure languages. An image search for the Zulu word for refrigerator returns just two results. The same search on PanImages returns 472,000 results.