Search Engine Optimization and Marketing for E-commerce

Mission Control we have liftoff!

by Andrew Kagan 29. April 2009 04:57

Launching the Searchpartner.pro website was an interesting experiment in measuring Google's crawl rate. The domain had been parked at a registrar for nearly a year, so Googlebot and other crawlers would have known about it but would have found no content. This may have been a negative factor in the subsequent crawl rate.

Before launching the website, all the appropriate actions were taken to ensure a rapid crawl and index rate:


  • Creation of all relevant pages, with informational pages of high quality and narrow focus
  • Implementation of appropriate META data
  • Validation of all links and HTML markup
  • Implementation of crawler support files such as robots.txt and an XML sitemap

Finally, the sitemap was registered with Google and the site brought online...and then the waiting began.
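For illustration, the crawler support files mentioned above might look like the following. These are generic sketches, not the actual files used on Searchpartner.pro; the URLs are placeholders. A minimal robots.txt that allows all crawlers and advertises the sitemap location:

```
# robots.txt -- allow all user agents to crawl everything
User-agent: *
Disallow:

# Advertise the XML sitemap so crawlers can discover it
Sitemap: http://www.searchpartner.pro/sitemap.xml
```

And a minimal XML sitemap following the sitemaps.org protocol, with one entry per page:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.searchpartner.pro/</loc>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

The Sitemap directive in robots.txt is optional if the sitemap is submitted directly through Google's webmaster tools, but including both gives crawlers two discovery paths.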
  • It took more than two days (approx. 57 hours) after registering the sitemap for Google to actually parse it. Google found no errors.
  • It took three more days after parsing the sitemap for Googlebot to actually crawl the site. 
  • More than 24 hours after crawling the site, Google had added only three pages to its index.
It seems that the days of "launch today, indexed tomorrow" are in the past. Even when publishing a website built to Google's best practices, Google appears to be somewhat overwhelmed at this point, and crawl rates for new sites are being delayed.

Two unknowns:
  • Does leaving a domain parked for a long time negatively impact the initial crawl rate?
  • Does the TLD -- "COM", "NET", "PRO" -- affect the crawl rate? Does Google give precedence to well-regarded TLDs over new/marginal TLDs?

I will be testing these hypotheses with additional sites in the near future.



General | SEO
