I have no problem with Google crawling any of my sites.
In my Options > Firewall Rules settings, I have the following crawler options set:
Immediately block fake Google crawlers: (unchecked)
How should we treat Google's crawlers: Verified Google crawlers have unlimited access to the site
If anyone's requests exceed: 60 per minute then throttle it.
If a crawler's page views exceed: 60 per minute then throttle it.
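For context on what "verified" means in the settings above: Google recommends confirming a claimed Googlebot by a reverse DNS lookup on the IP, checking the hostname ends in googlebot.com or google.com, then a forward lookup to confirm the hostname resolves back to the same IP. A minimal sketch of that check (the function name and injectable resolvers are my own, for illustration):

```python
import socket

def is_verified_googlebot(ip,
                          reverse_lookup=socket.gethostbyaddr,
                          forward_lookup=socket.gethostbyname):
    """Verify a claimed Googlebot IP via reverse-then-forward DNS.

    Resolver functions are injectable so the logic can be tested
    without network access.
    """
    try:
        # Reverse lookup: IP -> hostname
        hostname = reverse_lookup(ip)[0]
    except OSError:
        return False
    # Genuine Google crawlers resolve under these domains
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # Forward lookup: hostname must resolve back to the same IP
        return forward_lookup(hostname) == ip
    except OSError:
        return False
```

A fake crawler that merely sends a Googlebot user-agent string fails this check, because its IP will not reverse-resolve to a Google hostname.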
I would imagine that on larger sites the request and page-view limits might need to be increased so that genuine crawlers are not throttled.
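The "60 per minute, then throttle" behaviour above amounts to a sliding-window rate limit. A minimal sketch of that idea (not Wordfence's actual implementation; the injectable clock is just to make the logic testable):

```python
import time
from collections import deque

def make_throttle(limit=60, window=60.0, now=time.monotonic):
    """Return an allow() function enforcing `limit` events per `window` seconds.

    Uses a sliding window of timestamps; `now` is injectable for testing.
    """
    hits = deque()

    def allow():
        t = now()
        # Drop timestamps that have aged out of the window
        while hits and t - hits[0] >= window:
            hits.popleft()
        if len(hits) < limit:
            hits.append(t)
            return True   # request permitted
        return False      # over the limit: throttle
    return allow
```

With the defaults, the 61st request inside a single minute is refused; once early requests age past 60 seconds, capacity frees up again.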