Anyone with an average understanding of web development will know a few things about web spiders (or crawlers) and their directive file, robots.txt. Those of you who are not yet up to speed on this topic might find this post on the Official Google Blog to be of benefit.
Before going crazy with allowing full and frequent indexing of your web site, you should also take bandwidth and other system resource consumption into account. Believe me, you wouldn't want half your monthly bandwidth allocation gobbled up by spiders crawling your site.
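As a sketch of how you might rein that in, here is a minimal robots.txt that blocks spiders from resource-heavy areas and asks well-behaved bots to slow down. The paths shown are hypothetical examples, and note that the Crawl-delay directive is honoured by some crawlers (such as Bingbot and Yandex) but ignored by Googlebot, which uses its own crawl-rate settings:

```
# Applies to all crawlers
User-agent: *
# Keep spiders out of areas that burn bandwidth or CPU (example paths)
Disallow: /cgi-bin/
Disallow: /search/
# Ask supporting crawlers to wait 10 seconds between requests
Crawl-delay: 10
```

Place the file at the root of your domain (e.g. /robots.txt); crawlers will not look for it anywhere else.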
Once you have a better understanding of spiders and search engine indexing, you might want to learn how to optimise crawl rates via sitemaps.
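A sitemap lets you hint at how often each page changes, so crawlers can revisit busy pages frequently and leave static ones alone. Here is a minimal example following the sitemaps.org protocol; the URL and dates are placeholders, and keep in mind that changefreq and priority are hints, not commands:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- Placeholder URL: substitute your own page -->
    <loc>http://www.example.com/blog/</loc>
    <lastmod>2008-06-01</lastmod>
    <!-- Hint: this page changes often, so crawl it more -->
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

You can then point crawlers at the sitemap by adding a `Sitemap: http://www.example.com/sitemap.xml` line to robots.txt, or by submitting it through the search engine's webmaster tools.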