During Gary Illyes' keynote with Stone Temple, one of the questions was about how Google handles a long-term noindex on a page with regard to Googlebot crawling that page, and whether the crawl frequency declines over time.
Gary Illyes confirmed that yes, they do drop the crawl frequency, not only for noindex but for other problems with page availability as well.
Yes. Typically it will decline for any page we cannot index for whatever reason. Basically, we will try to crawl a few more times to see if the noindex is gone or if the page recovered from a 500 or whatever, and if the noindex is still there, then we will slowly start to move or to not crawl that page that often.
We will still probe that page every now and then, probably every two months or every three months, we will visit the page again to see if the noindex is still there. But we will very likely not crawl it that often anymore.
So the “for whatever reason” could also include cases where there are other long-term issues on pages within a site, such as a database connection error or other server responses such as a 403 Forbidden or a 500-level server error, not just pages that are noindexed.
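For reference, a noindex can be declared either in the page's HTML via a robots meta tag or in an X-Robots-Tag HTTP response header. As a rough sketch of how you might audit a page for either form before assuming Google will keep crawling it (the `has_noindex` helper below is hypothetical, using only Python's standard library):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "meta" and attr_map.get("name", "").lower() == "robots":
            self.directives.append(attr_map.get("content", "").lower())

def has_noindex(html_body, headers=None):
    """Return True if the page asks not to be indexed, via either an
    X-Robots-Tag response header or a robots meta tag in the HTML."""
    headers = headers or {}
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        return True
    parser = RobotsMetaParser()
    parser.feed(html_body)
    return any("noindex" in directive for directive in parser.directives)

page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(has_noindex(page))  # → True
```

A check like this is useful because a noindex left behind by a staging environment or plugin is easy to miss, and per the statement above, the longer it lingers, the less often Googlebot will come back to notice its removal.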
Do you have a page that carried a noindex or was otherwise inaccessible for some time, and are you struggling to get it reindexed? Your best bet is to do a fetch and submit in Google Search Console to attempt to jump-start the crawling and indexing of the page.