When site owners assess whether Google sees their site as high quality, crawl frequency is one of the signals many SEOs look at. If Googlebot visits internal pages only rarely, or doesn't crawl very deep into a site, these are signs that Google probably considers the site low quality.
John Mueller confirmed this in today's Google Webmaster Hangout:
> In a lot of cases, what will happen when it comes to spammy links, like if we're just looking at spammy links in general, then what happens there is if this website is kind of a low quality site that we don't really care about, then we're not going to value that that much anyway. So crawling faster isn't really something that we think makes sense there, it's not going to have a big effect anyway.
Now, this isn't a total surprise. After all, we know that it can sometimes take up to nine months for Google to crawl some of those really spammy links that have been disavowed, which is why Google recommends disavowing at the domain level instead of simply disavowing the individual links. But low crawl frequency can definitely be a very visible sign to a webmaster that Google doesn't consider their site high quality, or even of middling quality.
Of course, if a site owner is throttling or blocking Googlebot, then infrequent crawling is artificially induced and doesn't necessarily mean the site is low quality. But if the site owner is letting Googlebot crawl naturally, and it still crawls very infrequently without any other explanation, then site quality should definitely be high on the list of things to fix.
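For site owners who want to gauge crawl frequency themselves, a rough starting point is counting Googlebot requests per day in the server's access logs. The sketch below is a minimal, hypothetical example that matches on the Googlebot user agent string in combined-log-format lines; the sample log lines and function name are illustrative, not from the article. Note that user agents can be spoofed, so a production check should also verify the requesting IP really belongs to Google (for example via reverse DNS lookup).

```python
import re
from collections import Counter

# Hypothetical sample of combined-log-format access log lines.
# In practice you would read these from your web server's log file.
SAMPLE_LOG = """\
66.249.66.1 - - [10/Feb/2018:06:14:02 +0000] "GET / HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.9 - - [10/Feb/2018:06:15:40 +0000] "GET /about HTTP/1.1" 200 3921 "-" "Mozilla/5.0"
66.249.66.1 - - [12/Feb/2018:09:01:11 +0000] "GET /blog/post HTTP/1.1" 200 7110 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
"""

# Matches the date portion of a combined-log-format timestamp, e.g. [10/Feb/2018
DATE_RE = re.compile(r"\[(\d{2}/\w{3}/\d{4})")

def googlebot_hits_per_day(log_text):
    """Count log lines claiming a Googlebot user agent, grouped by date."""
    counts = Counter()
    for line in log_text.splitlines():
        if "Googlebot" in line:
            match = DATE_RE.search(line)
            if match:
                counts[match.group(1)] += 1
    return counts

print(googlebot_hits_per_day(SAMPLE_LOG))
```

A steadily low or declining daily count across deep internal pages (rather than just the homepage) would be the kind of pattern worth investigating.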