When it comes to crawl frequency, everyone wants their pages crawled multiple times per day, even if nothing changes on the page, under the assumption that more is better. But do higher crawl rates translate into better rankings? The answer is no.
In another recent hangout, John Mueller commented that individual pages on a site can have very different crawl rates. This question, however, was specifically about the crawl frequency of a site overall.
Here is what John said in response to a question about whether Google crawling 5,000 pages per day on a site with 30,000 pages was a good thing or a bad thing.
What is worth keeping in mind with the crawl rate of a website is that we’re not crawling random URLs from the website, like a random 5,000 URLs, or lining all the URLs up and going through those 5,000. We are crawling some pages a lot more frequently and other pages a little less frequently. So within those 5,000, you’ll probably have a lot of pages that get crawled maybe daily, a bunch of pages that get crawled maybe once a week, and then a bunch of other pages that get crawled a lot less frequently.
So it’s kind of a mix there. You can’t say, well, 5,000 a day and 30,000 total, therefore it’ll take, I don’t know, six days for the whole website to be crawled. That’s not how it works. 5,000 a day for a website of that size is a good thing.
Another thing to maybe keep in mind with the crawl rate is that crawling more doesn’t necessarily mean that your website will be seen as more relevant or rank higher. So there is no need to artificially push a higher crawl rate if we’re already picking up all of the content you are providing.
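If you want to see this mixed crawl pattern on your own site, one way is to count Googlebot requests per URL in your server access logs. Here is a minimal sketch in Python, assuming an Apache/nginx combined-format log and a placeholder file path; note that the user-agent string can be spoofed, so serious log analysis should also verify Googlebot hits via reverse DNS.

```python
from collections import Counter
import re

LOG_PATH = "access.log"  # placeholder; point this at your server's access log

# Combined log format: IP - - [time] "METHOD /path HTTP/x.x" status size "referer" "user-agent"
LINE_RE = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+" \d{3} \S+ "[^"]*" "([^"]*)"')

hits = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = LINE_RE.search(line)
        if match and "Googlebot" in match.group(2):
            hits[match.group(1)] += 1

# The URLs Googlebot fetched most often in this log window
for url, count in hits.most_common(20):
    print(f"{count:6d}  {url}")
```

Over a week or two of logs, the daily-versus-weekly split John describes should be plainly visible in the counts.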
That said, you don’t want to throttle Googlebot either, unless it is crawling so much that it causes server performance issues. If Google is successfully crawling all of your content and indexing it correctly, there is no reason to try to push a higher crawl rate strictly from a ranking perspective.
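If Googlebot really is overloading your server, the standard signal for asking it to back off is answering with a 503 (or 429) status, which Googlebot treats as a cue to slow down; you can also lower the crawl rate in Search Console’s site settings. Below is a minimal sketch of the idea using only the Python standard library; the load check is a placeholder you would replace with a real measurement.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

def server_overloaded() -> bool:
    """Placeholder load check; swap in a real measure (CPU, request queue depth, etc.)."""
    return False

class ThrottlingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        user_agent = self.headers.get("User-Agent", "")
        # Under heavy load, ask crawlers to come back later; Googlebot slows down on 503/429.
        if "Googlebot" in user_agent and server_overloaded():
            self.send_response(503)
            self.send_header("Retry-After", "3600")  # seconds; a hint, not a guarantee
            self.end_headers()
            return
        self.send_response(200)
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"Normal page response\n")

if __name__ == "__main__":
    HTTPServer(("", 8000), ThrottlingHandler).serve_forever()
```

Treat this as a last resort: sustained 503s will eventually cause Google to drop URLs from the index, so only serve them while the server is genuinely struggling.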
Of course, there are always exceptions, such as pages that change frequently, although Google is pretty good about crawling those pages more often than ones that seldom change. A sitewide change, such as a migration to HTTPS, will also trigger Googlebot to recrawl the site faster so it can process the new secure URLs.
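On the HTTPS point, what Googlebot actually processes during such a migration is the permanent redirect from each old HTTP URL to its secure counterpart. In practice this is configured in the web server or CDN rather than application code, but as a rough sketch of the behavior (with a hypothetical hostname):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

CANONICAL_HOST = "www.example.com"  # hypothetical hostname for illustration

class HttpsRedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # 301 tells Googlebot the move is permanent, so the HTTPS URL
        # replaces the old HTTP one as it recrawls the site.
        self.send_response(301)
        self.send_header("Location", f"https://{CANONICAL_HOST}{self.path}")
        self.end_headers()

if __name__ == "__main__":
    # Port 8080 for the sketch; a real deployment would answer on port 80.
    HTTPServer(("", 8080), HttpsRedirectHandler).serve_forever()
```

The faster recrawl John mentions is Google working through exactly these redirects to swap the secure URLs into the index.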