When it comes to crawling the web, Googlebot crawls some pages many times per day, while other pages can go months between Googlebot visits. John Mueller from Google made some interesting comments about why some pages are crawled infrequently and whether that should be a concern for SEOs.
First, he says that regular crawling is not a requirement for infrequently crawled pages to rank or be visible in the search results. Just because Googlebot doesn't crawl a page daily, weekly or even monthly doesn't mean Google will exclude the page from the search results.
However, a lack of regular Googlebot crawling of certain URLs can sometimes be a signal of a quality issue. For pages with low-quality links, we know it can take Googlebot some time to recrawl those URLs for disavow purposes. That said, Googlebot can also crawl quality evergreen pages less frequently, especially static pages with a history of showing no changes when Googlebot crawls them.
However, in the case John Mueller was responding to, the site owner said that those infrequently crawled pages also receive only a small percentage of traffic, meaning they could be low quality, or could be pages Google is filtering out as duplicates, something fairly common on ecommerce sites with near-identical products.
So if Googlebot is crawling some of your pages infrequently, it doesn't mean Google won't show or rank them in its search results. But it is advisable to check whether there are underlying reasons causing Googlebot to crawl them less frequently, since those same reasons could affect how well the pages rank.
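One practical way to spot infrequently crawled URLs is to check your server access logs for Googlebot requests. The sketch below is a minimal, hypothetical example (the log lines and URLs are made up): it counts requests per URL whose user-agent claims to be Googlebot, from logs in the common/combined format. Note that matching on the user-agent string alone is naive, since anyone can fake it; verifying genuine Googlebot traffic requires a reverse-DNS lookup on the requesting IP.

```python
import re
from collections import Counter

# Extract the requested URL from a common/combined-format access log line.
LOG_LINE = re.compile(r'"(?:GET|HEAD) (?P<url>\S+) HTTP/[^"]*"')

def googlebot_hits(log_lines):
    """Return a Counter mapping URL -> number of Googlebot requests.

    Naive check: trusts the user-agent string. Real verification
    requires a reverse-DNS check on the client IP.
    """
    hits = Counter()
    for line in log_lines:
        if "Googlebot" not in line:
            continue
        m = LOG_LINE.search(line)
        if m:
            hits[m.group("url")] += 1
    return hits

# Hypothetical sample log lines for illustration.
sample = [
    '66.249.66.1 - - [04/Jun/2018:10:00:00 +0000] "GET /product-a HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [04/Jun/2018:11:00:00 +0000] "GET /product-a HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.5 - - [04/Jun/2018:11:05:00 +0000] "GET /product-b HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(googlebot_hits(sample))
# /product-a is counted twice; /product-b is not a Googlebot request
```

URLs that never (or rarely) appear in such a tally over weeks of logs are the ones worth auditing for quality or duplication issues.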
I suspect the crawling period is fine for things like that. Regular crawling isn't a requirement for visibility or ranking in search. (Often the URL count is also misleading, especially on ecomm a lot of those are often "unnecessary" URLs; filters/sorting/etc)
— John ☆.o(≧▽≦)o.☆ (@JohnMu) June 4, 2018
Posted by Jennifer Slegg, June 4, 2018