If you’re a regular user of Google Search Console you’ve probably come across the Crawl Stats report a few times. It shows Googlebot’s visit stats for your website, with data stretching back just 90 days.
In the top graph (blue) you can see how many pages are being crawled on any given day during the reporting period. The middle graph (red) shows how many kilobytes are being downloaded per day. The final graph at the bottom shows the time spent downloading data from the website.
In the last day or two of these graphs you can see a very clear upward spike in both the pages crawled and time spent downloading, but the data downloaded didn’t change that much (if at all).
Any sudden upward spikes in any of these graphs can be an indicator of something wrong, but not every upward spike is a problem. Knowing some background info about this website is important.
In the months prior to this event Google had been progressively de-indexing pages on the site due to an issue with a Country Blocker. The blocker had been preventing Googlebot from crawling the website. In short – a disaster for organic rank and referrals from search.
The client came to us for a solution, which we promptly delivered, but that still left them with few or no pages in search, so we re-submitted sitemaps and used Fetch as Googlebot to crawl the site. The result was more than the server could handle while still serving content to users, because our request caused Google to swamp the server with page requests.
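The article doesn’t detail how the Country Blocker issue was resolved, but a common approach is to exempt Googlebot from geo-blocking – and, because anyone can fake a Googlebot user agent, Google recommends verifying the crawler with a reverse-then-forward DNS check. The sketch below is my own illustration of that check, not the fix actually used on this site; the function name and the injectable resolver parameters are assumptions for testability:

```python
import socket

def is_verified_googlebot(ip,
                          reverse_dns=socket.gethostbyaddr,
                          forward_dns=socket.gethostbyname):
    """Return True if `ip` belongs to a genuine Googlebot crawler.

    Google's documented verification: reverse-DNS the IP, confirm the
    hostname ends in googlebot.com or google.com, then forward-DNS that
    hostname and confirm it resolves back to the same IP.

    The resolver arguments exist only so the check can be exercised
    without live DNS; in production the socket defaults are used.
    """
    try:
        host = reverse_dns(ip)[0]
    except OSError:
        return False  # no PTR record: not a verifiable crawler
    if not host.endswith((".googlebot.com", ".google.com")):
        return False  # hostname outside Google's crawler domains
    try:
        return forward_dns(host) == ip  # forward lookup must round-trip
    except OSError:
        return False
```

A country blocker would then call this before applying its geo-rules and skip blocking for verified crawler IPs, so legitimate crawling continues while spoofed user agents are still blocked.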
The fix was very simple: reducing the crawl rate in the site’s Crawl Rate control curbed the excessive requests from Googlebot and freed the server to tend to the all-important user requests.