In an encouraging act of collaboration, Google, Yahoo and Microsoft announced tonight that they will all begin using the same Sitemaps protocol to index sites around the web. Now based at Sitemaps.org, the system instructs webmasters on how to install an XML file on their servers that all three engines can use to track updates to pages. This should make it easier to get your pages indexed in a simple, standardized way. People who use Google Sitemaps don't need to change anything; those maps will now be indexed by Yahoo and Microsoft as well.
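As documented at Sitemaps.org, the XML file in question is a simple list of URLs with optional metadata. A minimal sketch of one, with example.com and all the values standing in as placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- The page's full URL; the only required child element -->
    <loc>http://www.example.com/</loc>
    <!-- Optional hints: last modification date, expected change rate,
         and relative priority within this site (0.0 to 1.0) -->
    <lastmod>2006-11-15</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

A webmaster places a file like this on the server and submits its location to each engine, which then crawls the listed URLs on its own schedule.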
The protocol is offered under an Attribution-ShareAlike Creative Commons License, which means any search engine can adopt it, derivative variations can be created under the same license, and it can be used for commercial purposes.
Any time competitors agree on open standards, that's an enabler of further innovation and something to celebrate. It's also great to see Creative Commons receive yet more validation.
Search engine guru Danny Sullivan wrote the following tonight about the move.
Overall, I’m thrilled. It took nearly a decade for the search engines to go from unifying around standards for blocking spidering and making page description to agreeing on the nofollow attribute for links in January 2005. A wait of nearly two years for the next unified move is a long time, but far less than 10 and progress that’s very welcomed. I applaud the three search engines for all coming together and look forward to more to come.
Several people have made early public statements indicating that the next move will be to develop meaningful standards support for robots.txt files. Imagine a future when these players agree on standards for user control of data, microformats or truly neutral party click-fraud tracking and prevention. Maybe that’s crazy.
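For context, the robots.txt convention the engines would presumably be formalizing has been a de facto standard since the mid-1990s but was never ratified by anyone. A typical file, with the paths here purely illustrative:

```
# robots.txt lives at the root of the site, e.g. http://www.example.com/robots.txt
# "*" applies the rules to all crawlers; Disallow lists path prefixes to skip
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/
```

Each engine currently interprets edge cases (wildcards, conflicting rules, casing) slightly differently, which is exactly the kind of divergence a unified standard would clean up.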