I run Google AdSense on several of my Drupal sites. That is a whole other discussion for a whole other time. Suffice to say, AdSense now gives your site a health ranking or score for ad display.
Crawler errors are one particular problem that can hurt your AdSense score.
Drupal, by default, ships with a pretty complete “robots.txt” file right out of the box. However, I have found that it needs a few lines added to it.
A robots.txt file in your site’s root directory is read (and respected) by most web indexing search crawlers. It tells the crawlers that they shouldn’t index parts of your site (like admin pages and other things that aren’t meant to be content).
In my case, the crawler errors probably show up because my sites serve file download links from those locations. If you are not using those folders for file downloads, you may never see this particular crawler error. I doubt anyone would really want these directories crawled, though, so it shouldn’t hurt to update your robots.txt file with the following lines:
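As a sketch, assuming the flagged locations are Drupal’s usual file directories (swap in whatever paths your crawler error report actually lists), the added lines would look something like this:

```
# Assumed file-download directories; adjust to match your site
Disallow: /files/
Disallow: /sites/default/files/
```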
You can just tack those onto the “# Directories” section of robots.txt along with the other “Disallow:” lines.