When viewing the Google Search Console stats for the amount of time Googlebot spends downloading a page, it is common to see wide fluctuations in that time. Sometimes the reason is easy to guess; other times it isn’t clear at all.
John Mueller from Google answered a question about the report: specifically, why a site that hasn’t made any changes can still show these wild fluctuations.
The fluctuations aren’t necessarily a sign that a site is undergoing many changes and needs more crawling. The time is based on the actual pages crawled: if some pages are bigger, or include more resources for Googlebot to process, the report can show large swings in the time spent downloading a page.
Of course, the reason can also be site-specific, such as the server responding more slowly to Googlebot’s crawl requests for certain pages.
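To see how this plays out in the report, here is a minimal sketch with hypothetical per-page download times (the URLs and timings are invented for illustration). The site itself never changes; only the mix of URLs crawled on a given day differs, yet the daily average swings widely:

```python
# Hypothetical per-URL download times in seconds (assumed values, not real data).
page_times = {
    "/home": 0.2,        # small, fast page
    "/blog/post": 0.4,
    "/big-report": 2.5,  # large page requiring heavy server processing
}

def average_download_time(crawled_urls):
    """Average time per download for one day's crawl mix."""
    return sum(page_times[u] for u in crawled_urls) / len(crawled_urls)

# Day 1: the crawl happens to hit mostly small pages -> low average.
day1 = ["/home", "/home", "/blog/post", "/blog/post"]
# Day 2: same unchanged site, but the crawl hits the big page -> high average.
day2 = ["/big-report", "/big-report", "/home", "/blog/post"]

print(round(average_download_time(day1), 2))  # 0.3
print(round(average_download_time(day2), 2))  # 1.4
```

Nothing about the site changed between the two days; the shift in which URLs were crawled is enough to move the reported average by more than 4x.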
Here is the tweet:
Some URLs might take longer to download than others (either because they're bigger, or require more server processing on your side), and depending on which ones we crawl, you might see effects like this.
— John ☆.o(≧▽≦)o.☆ (@JohnMu) May 24, 2018