Monday, 16 April 2007

How to get indexed by the main search engines

A good piece of news announced last week by Google and the other main search engines, Yahoo! and MSN, was a further development of the standardised Sitemap protocol, which should enable websites to get their pages indexed much more easily and quickly.

We reported in our December newsletter the significant step forward made by the 'big 3' search engines in agreeing a standardised protocol for submitting a Sitemap document - in a number of different formats - which would enable these search tools to index pages from a website more efficiently and more often, if required.
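As a reminder, the standard Sitemap format is a simple XML file listing the pages on your site. A minimal example might look something like this (the address and dates here are placeholders, not real pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want the search engines to know about -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-04-16</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

The `lastmod` and `changefreq` fields are optional hints that tell the spiders how fresh each page is and how often to come back.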

Now Google and the others (who have also been joined by Ask) have taken this a step further by announcing a simple addition that can be made to a robots.txt file, which lets the indexing 'spiders' find your Sitemap files automatically. This means you don't necessarily need to register for Google's Webmaster tools, for example (although this can still be the best option); you can simply edit your robots file to include the location of the Sitemap file and then let the search engines do the rest.
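The addition itself is just one extra line in your robots.txt file pointing at the Sitemap's full web address. For example, assuming your Sitemap is saved as sitemap.xml at the root of your site (the example.com address below is a placeholder):

```
# Allow all spiders to crawl the whole site
User-agent: *
Disallow:

# New: tell the search engines where the Sitemap lives (full URL required)
Sitemap: http://www.example.com/sitemap.xml
```

The Sitemap line can appear anywhere in the file and is picked up by all four search engines that support the protocol.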

There's more information about this on the official Sitemaps protocol website.


Blogger Unknown said...

That's great news - it should make it simpler to get the site map indexed and new content updated, plus every website should be using the robots tag anyway!

19 April 2007 at 4:49 pm  
