I’m here at the Computer History Museum in Mountain View, CA where Google is holding an event to showcase its upcoming search technologies. Google hasn’t said much about what we’ll be hearing about today, but it’s clear they’ve got some big things planned — they’ve decked out the museum with colorful Google logos, have set up some search kiosks just outside the main stage, and are projecting recent search queries on giant screens set up throughout the main room. I’ll be liveblogging my notes below.
The room is filled with press, and there are a few other notable faces in the crowd. MySpace’s Chief Product Officer Jason Hirschhorn just walked in.
There are also some interesting desks laid out in front of the auditorium. These clearly have something to do with the technology we’re learning about today, but Google employees aren’t talking. I’ve included some photos of them below.
Marissa Mayer has just taken the stage.
MM: We see search in four ways: Modes, media, language, and personalization.
Modes: Right now you type your search into your desktop computer. But what happens when you move to mobile phones? In the future there will be many ways of searching.
On the flip side, there’s media. The web has gotten very rich with photos, news, maps. Our search results have to mirror that same richness.
The next piece is language. At Google we're trying to break the language barrier, with a focus on language and translation. We really foresee a world in the future where you can search and look at content through any language.
Personalization: Search will become increasingly personalized to show results relevant to you.
There’s a fifth component to the future of search: the rate of progress. We like to launch early and often.
We’ve launched 33 search innovations in 67 days. Music search, malware details, flu shot search, global roll out of Google suggest. Lots more.
Today’s event will focus mostly on Modes and Media.
VP of Engineering Vic Gundotra has taken the stage.
VG: The first trend we're seeing: Moore's law. Next year the devices you have will be faster than ever. The second trend is far more recent: connectivity. A decade and a half ago, if someone had told you that a time was coming when every device would be connected to every other device, you wouldn't have believed them. We're also seeing the emergence of powerful clouds: huge amounts of computing resources that developers have at their disposal.
Let’s talk about mobile devices. *pulls out a Droid*
It has a camera, a GPS chip, a speaker, an accelerometer. By themselves these sensors aren't that extraordinary, but when you connect them to the cloud they become much more powerful. What we're seeing this morning are innovations combining these trends with mobile phones: search by sight, search by location, and search by voice.
VG: We launched voice recognition on the iPhone a while ago, in Google Mobile App. (Shows the Google Mobile App for iPhone.) We take the voice file you record, send it up to the cloud, and rank it against all the potential matches, all within a fraction of a second. We've improved accuracy and added more languages. Last month we launched Mandarin voice search. *Now Vic is showing off the app in Mandarin.*
Today joining English and Mandarin, we’re launching support for Japanese.
VG: Imagine you could say something in one language, have it translated in the cloud, and have it come back read aloud in a different language. This is a tech demo of a feature we hope to offer in 2010. In English, Vic just asked where the nearest hospital was. Two seconds later, the phone read aloud a translated version in Spanish. In effect, this is an audio real-time translator. Very cool.
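Conceptually the demo chains three cloud services: speech recognition, translation, and speech synthesis. Here is a minimal sketch of that pipeline; all three stage functions are hypothetical stand-ins I've invented for illustration, not Google's actual services.

```python
# Sketch of a speech-to-speech translation pipeline:
# recognize -> translate -> synthesize. Every function here is a
# hypothetical stand-in for a cloud service, hardcoded to handle
# the one demo sentence from the event.

def recognize(audio, lang):
    # Stand-in recognizer: pretend we transcribed the audio clip.
    return "where is the nearest hospital"

def translate(text, src, dst):
    # Stand-in translator: a one-entry phrase table for the demo.
    phrases = {
        ("en", "es", "where is the nearest hospital"):
            "donde esta el hospital mas cercano",
    }
    return phrases[(src, dst, text)]

def speak(text, lang):
    # Stand-in synthesizer: tag the text instead of producing audio.
    return f"[{lang} audio] {text}"

def voice_translate(audio, src="en", dst="es"):
    """Full pipeline: audio in the source language comes back as
    spoken audio in the destination language."""
    return speak(translate(recognize(audio, src), src, dst), dst)

print(voice_translate(b"..."))  # "[es audio] donde esta el hospital mas cercano"
```

The interesting engineering point is that each stage runs server-side, so the phone only ships audio up and plays audio back.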
VG: We want to make location a first-class component of all the products we build. We can make results more personal, faster. On average we save 9 seconds per user on Google Mobile Maps, because we know where you are and render it immediately. We're doing some of the same in other properties: in Google Suggest, our data shows that over 40% of mobile queries are from people selecting a suggested item.
*Vic pulls out two iPhones*.
Both are using a new version of the Google.com mobile homepage. They've hardcoded one phone to believe it's in Boston; the other thinks it's in SF. A search for "Re" on the Boston phone yields "Red Sox"; "Re" in SF shows "REI" as the top suggestion. Suggest is being customized based on location. The same applies to product search: in addition to shopping online, there are times you want to know if a product is available locally. Combining the inventory feeds of local stores with product search, the top two results for "Canon EOS" show stores nearby that have the product in stock: Best Buy has it 1.3 miles away, Sears has it 7 miles away.
VG: We're going to bake location into the Google.com homepage. New feature: "Near me now" on Google.com mobile. Hit Near Me Now and it shows you nearby restaurants, coffee shops, bars, ATMs. Hit the down arrow, and it will show all the locations nearby. This is very cool, and a strong alternative to Yelp mobile. Today we also have a new version of Google Mobile Maps for Android. Among the new features is What's Nearby: long-press on a location, hit What's Nearby, and it shows a list of nearby points of interest.
VG: Two-thirds of our brain is involved in visual processing. I'm happy to announce a new product available in Google Labs. Google Goggles represents our earliest efforts in the field of computer vision. Take a picture of an item, and use that picture as the query. Say you have a bottle of wine and want to see if it's any good: take a picture, Goggles looks it up and shows that it has hints of apricots, etc. It's in Labs for two reasons. It's nascent: it works on certain types of objects in certain categories, and we want to be able to handle any image. Today you have to frame a photo; in the future, you'll just have to point at an object. We're a long way from that, but today marks the beginning of that journey.
Marissa Mayer has just taken the stage again. Along with what we just saw, media is going to change the fundamental way we search. It's our relevance problem: can you find the right information?
Google Fellow Amit Singhal has taken the stage.
AS: What we are going to announce is one of the most exciting things I've seen in my career. *Outlines the process of document creation and transport.* The Internet was a great revolution. In the early days of Google we would crawl and index the web every month. Clearly a month was not fast enough; now we crawl every few minutes. In today's world even that's not fast enough. The world is producing information from around the globe every second: creating web pages, writing blogs. Information is being created at a pace I have never seen before. Seconds matter in this environment.
We are here today to announce Google Real-Time Search. It's Google's relevance technology meeting the real-time web. Relevance is the foundation of this product. It's relevance, relevance, relevance. That is key to a product like this. (Jason's note: Twitter isn't so great at that…)
Amit does a query for "Obama", and the results page comes up with the latest results for Obama streaming in. There is a widget on the standard results page, with results sliding by; they're nested in the standard results page. This is the first time any search engine has integrated the real-time web into the results page. Google's Matt Cutts just tweeted something, and it immediately showed up in the search results. This is huge.
We have been testing this internally with our Googlers, and as we have, I've received good examples from my friends of how they are experiencing the product. Type "GM" into Google and you'd see recent news updates, but also notice the breaking news that CEO Fritz Henderson had stepped down just seconds ago. This is the first time we are presenting the real-time web on the results page. In this real-time section there is a scroll bar to the right; you can scroll back and go forward. It shows the source (twitter.com). This is a comprehensive real-time web, with tweets, news articles, and blog articles.
New link under search options: "Latest results". In addition to the old ones: "Past hour, past 24 hours, past week, past year". Available today to some users, rolling out over the next few days. Works on both iPhone and Android. Google Trends is also leaving Labs today. For anyone who wants to try it now, you can go to the Google.com/trends page, where clicking on a hot topic will show Google's real-time results.
To build this, we had to develop dozens of new technologies: language models, deciding whether a certain tweet contained information or was just something tweeted automatically, and much more. There's a lot going on behind the scenes to make this work.
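To make the "did this tweet contain information?" idea concrete, here is a toy heuristic filter. The rules and markers below are entirely my own invention, standing in for the language models Singhal describes:

```python
# Toy heuristic for separating informative posts from automated
# noise. The marker strings and length cutoff are invented
# placeholders, not Google's actual signals.

AUTOMATED_MARKERS = ("i just became the mayor of", "auto-posted", "#fb")

def looks_informative(text):
    """Return True if the post plausibly carries information
    worth surfacing in real-time results."""
    lowered = text.lower()
    # Posts matching known auto-posting patterns are dropped.
    if any(marker in lowered for marker in AUTOMATED_MARKERS):
        return False
    # Very short posts rarely carry standalone information.
    if len(lowered.split()) < 4:
        return False
    return True

print(looks_informative("GM CEO Fritz Henderson has stepped down"))  # True
print(looks_informative("I just became the mayor of Joe's Diner!"))  # False
```

A production system would use trained language models rather than hand-written rules, but the filtering step sits in the same place in the pipeline: between the firehose of posts and the ranked results page.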
Marissa Mayer has taken the stage. Two huge new partner announcements. Facebook will be providing us with a feed from Facebook Pages (shared publicly, obviously), appearing in Google's real-time results. The second is MySpace.
Last year on our tenth anniversary, we outlined our vision of search. When you look at today's announcements, you see search engines with eyes, that can understand you when you talk.
Q: Given Google's acquisition of Neven Vision, to what extent can you recognize faces?
A: His previous company did a lot with face recognition. Faces are among the objects we can recognize, but for this product we made the decision not to do facial recognition. We are delaying it until we have more safeguards in place.
Q: What is the advertising opportunity around realtime search?
A: Right now we are concentrating on bringing most valuable results to users. As time goes on new models will develop. The companies we’re working with are experimenting with multiple models. I think all of these companies have added tremendous value to the world.
Q: How much real-time data are you crawling?
A: We’re crawling a lot of content ~1 billion pages a day. Many sources. Both new sources, and if a company announces a new product and does a release, we get that. And new blog posts. So we’re casting a very wide net. The key here is comprehensiveness of realtime integration.
Q: What about availability of real-time in non-US markets?
A: This first launch is available in all English speaking locales. Very soon, some time in Q1, we plan to launch many different languages.
Q: How do you prevent spammers from taking advantage of real-time search results?
A: We have the best systems in place to prevent gaming of the system. Our spam lead out here (Matt Cutts) runs the best spam-prevention team there is. We have had experience with this for a long time, and we've developed algorithms so we can counter things almost before they happen. Real-time is moving from minutes to seconds.
Q: Will you be able to log in to Google using Facebook Connect, OAuth, etc.? Can you customize based on the people you know?
A: We are excited about what's happening, but we are just getting started. For now, you have a real-time product.
Q: Regarding Facebook/MySpace privacy.
A: Updates from Facebook will come from Facebook Pages. For MySpace, they come from all user profile pages and any updates designated as public. Users can decide what they want to see offered.
Q: Do you need relationships with these providers? What if there was a content provider you didn’t have a partnership with?
A: Our goal is comprehensiveness. The more comprehensive we can be, the better we can serve our users. That’s why we’re making ours the most comprehensive available.
Q: Financial deals with FB/Twitter?
A: We can’t disclose the financial arrangements or if they exist.
Q: Are there any sources you wouldn’t include?
A: We don’t exclude any source. Any source of real time information, we would like to have integrated into our system.
Q: Does Google intend to reduce support for non-Google devices?
A: Absolutely not. At times we choose different priorities in terms of which ones we do first.
Q: Do you feel realtime search will be the death of journalism? If knowledge is power, does that make Google the most powerful in the world?
A: Your questions are clearly very loaded… I can't even contemplate the "death of journalism" based on real-time search, because you bring so much value to the world. Regarding the second question, our goal has always been to bring timely information to our users.
Q: What is your vision from the user's perspective? Real-time results vs. longer term?
A: Power of universal search is that users don’t have to think about it. We don’t think of it as ‘this search’ vs ‘that search’. Whatever needs to be seen by the user now should be integrated on the results page.
A: The web thrives on openness. We have all this data because the web is open. To say we'll develop the most authoritative page on something might not be good. But there does come a point where we treat a restaurant as an entity and show results with reviews. We want to get people places faster, but we don't want to say we're the most authoritative on a page.
Q: Can you bring truth into the equation along with recency and relevance?
A: Right now we emphasize quantity and relevance, and that often brings the truth. But there may be times when the truth is not black and white, and it's a very hard problem because language understanding is still an unsolved problem.
Q: Is there a way to pause real time search results?
A: Based on lots of testing, and experimentation, I think the current interface conveys the realtime nature. But we’ll be working on the user experience going forward. Clearly real-time search is a new feature we’re very excited about.
MM: The UI may ultimately change. A few years ago PDF results were showing up too often, and lots of users emailed in asking us to change that. Over time the relevancy got better.
Q: For geo, do you have good coverage internationally?
A: Our time to locate varies depending on the source of geo data. GPS on some cell phones can sometimes take a long time; in those cases we fall back on cell towers. In some cases we can use AGPS, which can almost instantly give you a reasonably accurate placement. Most places internationally are covered, and coverage has grown by an order of magnitude in the last year.
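The fallback Gundotra describes amounts to a trade-off between fix latency and accuracy. A minimal sketch of that decision, with invented latency and accuracy numbers standing in for real device measurements:

```python
# Sketch of a GPS -> AGPS -> cell-tower fallback. The latency and
# accuracy figures here are invented placeholders for illustration.

def best_fix(fixes, deadline_ms):
    """Pick the most accurate location source whose fix arrives
    before the deadline; coarser sources win when precise ones
    are too slow. `fixes` maps source -> (latency_ms, accuracy_m)."""
    available = {
        source: accuracy
        for source, (latency, accuracy) in fixes.items()
        if latency <= deadline_ms
    }
    if not available:
        return None
    # Smallest accuracy radius wins among fixes we can wait for.
    return min(available, key=available.get)

fixes = {"gps": (8000, 5), "agps": (300, 50), "cell": (100, 1500)}
print(best_fix(fixes, deadline_ms=500))    # "agps": GPS is too slow
print(best_fix(fixes, deadline_ms=10000))  # "gps" once we can wait
```

With a tight deadline the system settles for the coarse-but-fast fix, which is exactly why a cell-tower placement can beat a stalled GPS lock for a search query that needs an answer now.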
Q: Will real-time be supported in API?
A: We haven't yet looked into the details of that, but we will.
Q: When will all this be live?
A: Live today are Japanese voice search, Google Mobile Maps, and Google Goggles. All already available. Some new innovations on the Google homepage will come in the weeks ahead; we don't know the exact dates yet. Product search comes a few more weeks beyond that. The translation demo is the exception: that's a concept demo, and the first products to include it will come some time in Q1.
A: For real-time search, we're starting the process today. Some users will see it by the end of the day, and over the next few days all users will. For now, if you want to access real-time search, go to google.com/trends and click on the new hot topics panel on the left, or type a query under the panel, and you'll get real-time results now.
Q: Can you say more about the integration of product inventory availability?
A: That will be integrated into product search some time in Q1. The partners we mentioned are Sears and Best Buy, and we're working with many other partners.