Reduce the Googlebot crawl rate
Google has sophisticated algorithms to determine the optimal crawl rate for a site. Our goal is to crawl as many pages from your site as we can on each visit without overwhelming your server's bandwidth. In some cases, Google's crawling of your site might be causing a critical load on your infrastructure, or causing unwanted costs during an outage. To alleviate this, you may decide to reduce the number of requests made by Googlebot.
If you need to urgently reduce the crawl rate for a short period of time (for example, a couple of hours, or 1-2 days), return a 500, 503, or 429 HTTP response status code to crawl requests instead of 200. Googlebot reduces your site's crawl rate when it encounters a significant number of URLs with 500, 503, or 429 HTTP response status codes (for example, if you disabled your website).
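As a rough illustration of one way to do this, the sketch below is a minimal WSGI middleware that answers Googlebot requests with a 503 while a maintenance flag is set. The flag name, the user-agent check, and the Retry-After value are assumptions for the example, not part of the official guidance; adapt them to your own stack.

```python
# Minimal sketch: temporarily serve 503 to Googlebot while MAINTENANCE is set.
# MAINTENANCE, the user-agent substring check, and the Retry-After value are
# illustrative assumptions; your infrastructure may expose this differently.

MAINTENANCE = True  # flip to False once the emergency is over


def crawl_throttle(app):
    """Wrap a WSGI app so Googlebot gets a temporary 503 during maintenance."""
    def middleware(environ, start_response):
        user_agent = environ.get("HTTP_USER_AGENT", "")
        if MAINTENANCE and "Googlebot" in user_agent:
            start_response(
                "503 Service Unavailable",
                [("Content-Type", "text/plain"), ("Retry-After", "3600")],
            )
            return [b"Temporarily unavailable"]
        # All other traffic (and normal operation) passes through unchanged.
        return app(environ, start_response)
    return middleware
```

Once the critical load or outage is resolved, remove the middleware (or clear the flag) so Googlebot receives 200 responses again.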
The reduced crawl rate affects the whole hostname of your site (for example, subdomain.example.com), covering both the URLs that return errors and the URLs that return content. Once the number of these errors drops, the crawl rate automatically starts increasing again.
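To confirm that Googlebot is actually receiving the temporary errors, and later that you have stopped serving them, you might scan your access logs. The sketch below assumes a combined-format log at a hypothetical path; the path and parsing are assumptions for illustration only.

```python
# Sketch: count which status codes Googlebot received, assuming a
# combined-format access log at the hypothetical path below.
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"  # assumed location

counts = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        # In combined log format the status code follows the quoted request line.
        parts = line.split('"')
        try:
            status = parts[2].split()[0]
        except IndexError:
            continue
        counts[status] += 1

total = sum(counts.values())
if total:
    errors = sum(counts[s] for s in ("500", "503", "429"))
    print(f"Googlebot requests: {total}, temporary errors: {errors} "
          f"({errors / total:.0%})")
```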
Keep in mind that a sharp increase in crawling may be caused by inefficiencies in your site's structure or by other issues with your site. Check our guide about optimizing crawling efficiency.
If serving errors to Googlebot is not feasible for your infrastructure, file a special request to reduce the crawl rate. You cannot request an increase in crawl rate.