In yesterday’s Google Q&A with John Mueller and Andrey Lipattsev, I asked a question about Googlebot suddenly increasing the amount of crawling it does on a site when there are no obvious reasons – such as a new URL structure – for Google to do so. Is it a sign of a Google update or an issue with the site? Or is it something that a site owner can’t really read anything into?
Some have claimed that an increased crawl rate is directly related to an upcoming Penguin, Panda or other algo update. Others claim it is a sign that Google is evaluating a site for a possible manual action, or that the site is about to be hit algorithmically. But is this really the case?
First, Andrey Lipattsev joked and asked John Mueller if he would like to talk about the secret crawler that goes to a website if it has been naughty, to which John responded, “there is none.”
This is totally unrelated. So, the crawling that you might see is not related with manual actions, definitely not with manual actions, that’s something that is completely separate from crawling and generally not related to any other kind of algorithmic changes with a web site.
And I guess some of the reasons why we might crawl more are things that sometimes don’t even have much to do with the website, where our algorithms just decide oh, I want to double check all those URLs that I found on that website a while back and it just goes off and does that and suddenly we’re crawling twice as much for a couple of days and it doesn’t really mean that anything is a problem or anything that you need to worry about.
It’s essentially just our algorithms deciding that maybe today is a good day to look at this awesome website or maybe today is a good day to look at all these old URLs that we’ve found maybe 3 or 4 years ago and that we never got any content for and maybe there’s good content there today. So that’s something that’s not necessarily related to anything on the site or happening to the site.
So the next time you see a spike in the number of pages being crawled on your site without a specific reason why, it isn’t a sign that your site has been bad and is about to get hit with a manual action or some other Google search algo.