When SEOs assess whether Google sees a site as high quality, crawl frequency is one signal many of them look at. If Googlebot visits internal pages only rarely, or doesn't crawl very deep into a site, those can be signs that Google considers the site low quality.
John Mueller confirmed this in today's hangout.
In a lot of cases, what will happen when it comes to spammy links, like if we're just looking at spammy links in general, then what happens there is if this website is kind of a low quality site that we don't really care about, then we're not going to value that that much anyway. So crawling faster isn't really something that we think makes sense there, it's not going to have a big effect anyway.
Now, this isn't a total surprise. After all, we know it can sometimes take up to nine months for Google to crawl some of those really spammy links that have been disavowed, which is why Google recommends disavowing at the domain level instead of simply disavowing the individual links. But low crawl frequency can definitely be a very visible sign to a webmaster that Google doesn't consider their site high quality, or even of middling quality.
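For readers unfamiliar with the distinction, a domain-level entry in the disavow file (uploaded through Search Console) covers every link from that domain at once, while a URL-level line covers only one page. The domain name below is a placeholder:

```text
# Disavow every link from this entire domain:
domain:spammy-example.com

# By contrast, a URL-level line only disavows links from that one page:
http://spammy-example.com/some-page.html
```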
Of course, if a site owner is doing something to throttle or block Googlebot, then infrequent crawling is the result of that artificial suppression and doesn't necessarily mean the site is low quality. But if the site owner is letting Googlebot crawl naturally, and it still crawls very infrequently with no other reason for it to do so, then site quality should be high on the list of things to fix.
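One practical way to check how often Googlebot actually visits is to count its hits per day in the server access logs. Below is a minimal sketch assuming the common "combined" log format; the sample log lines and function name are made up for illustration, and a real check should verify the crawler's IP via reverse DNS, since the user-agent string can be spoofed:

```python
# Sketch: count Googlebot hits per day in a web server access log.
# Assumes the common Apache/Nginx "combined" log format.
import re
from collections import Counter

# Made-up sample log lines for illustration:
SAMPLE_LOG = """\
66.249.66.1 - - [24/May/2018:06:12:01 +0000] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.9 - - [24/May/2018:06:15:44 +0000] "GET /page HTTP/1.1" 200 2048 "-" "Mozilla/5.0"
66.249.66.1 - - [25/May/2018:07:02:13 +0000] "GET /deep/page HTTP/1.1" 200 4096 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
"""

# Matches the date portion of the log timestamp, e.g. "[24/May/2018".
DATE_RE = re.compile(r"\[(\d{2}/\w{3}/\d{4})")

def googlebot_hits_per_day(log_text):
    """Count lines whose user agent claims to be Googlebot, grouped by date.
    Note: the user-agent can be spoofed; a production check should also
    confirm the IP resolves to a googlebot.com host via reverse DNS."""
    counts = Counter()
    for line in log_text.splitlines():
        if "Googlebot" in line:
            m = DATE_RE.search(line)
            if m:
                counts[m.group(1)] += 1
    return counts

print(googlebot_hits_per_day(SAMPLE_LOG))
```

If the counts stay near zero for weeks on a site that isn't blocking crawlers, that is the kind of low crawl frequency discussed above.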