When people are assessing whether Google sees their site as high quality, crawl frequency is one of the signals many SEOs look at. If Googlebot visits internal pages only rarely, or doesn't go very deep into a site when crawling, these can be signs that Google sees the site as low quality.
John Mueller confirmed this in today's hangout:
In a lot of cases, what will happen when it comes to spammy links, like if we're just looking at spammy links in general, then what happens there is if this website is kind of a low quality site that we don't really care about, then we're not going to value that that much anyway. So crawling faster isn't really something that we think makes sense there, it's not going to have a big effect anyway.
Now, this isn’t a total surprise. After all, we know it can sometimes take up to nine months for Google to crawl some of those really spammy links that have been disavowed, which is why Google recommends disavowing at the domain level instead of simply disavowing individual links. But low crawl frequency can definitely be a very visible sign to a webmaster that Google doesn’t consider their site high quality, or even middling quality.
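For reference, a disavow file lets you use either format. The `domain:` prefix covers every link from that site, while a bare URL disavows only that single page (spammyexample.com below is a placeholder, not a real site):

```text
# Domain-level disavow: covers all links from this site, including ones Google hasn't crawled yet
domain:spammyexample.com

# URL-level disavow: only covers this one page, so other spammy links from the same site still count
http://anotherspammysite.example/some-link-page.html
```

The domain-level entry is the safer choice for spammy sites precisely because Google may take months to crawl every individual linking URL.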
Of course, if a site owner is doing something to throttle or block Googlebot, then a low crawl rate is being caused artificially and doesn’t necessarily mean the site is low quality. But if the site owner is letting Googlebot crawl naturally, and it still crawls very infrequently without any other explanation, then site quality should definitely be high on the list of things to fix.