Having a website go offline is one of the top concerns for nearly every webmaster. Not only are you losing traffic and sending visitors to your competitors instead, there is also the very real issue of how the downtime affects your Google rankings.
The question came up in the last Google Webmaster Office Hours. John Mueller recommended serving a 503 if at all possible when you need to take your website offline.
But what if you can't serve Googlebot a 503, and it ends up thinking the entire site is 404 and perhaps offline permanently? Mueller explained exactly how Googlebot handles a website it discovers is offline when a 503 status code isn't served.
With regard to this situation where maybe 500 errors were showing or the server was down, that's something that when we recrawl those pages, we'll be able to take that into account again and index them and rank them as we did before.
So it’s not something where we kind of artificially hold a website back but it’s more of a technical issue that we have to recrawl those pages and recognize that they’re okay and put them back in our index together with the old signals that we had.
To some extent we try to recognize a kind of failure when we see it happening and keep those pages in our index anyway, just because we think maybe this is temporary and the website will be back soon. So some of that might have actually worked here, but some of it might be that we actually recrawled those pages a bunch of times and they dropped out, and we don't have them for ranking anymore.
The good part here is that if we recognize that a page is kind of important for your website, we'll generally crawl it a bit more frequently. So if it drops out of the index because of a failure like this, then we'll generally crawl it a bit more frequently and bring it back a little bit faster than we would some random page on your website that hasn't changed in the last few years.
My guess is something like this: if you have to take the server down for a day, you might see maybe a week, two weeks, at the most maybe three weeks' time where things are kind of in flux and settling down again, but it shouldn't take much longer than that.
So if possible, serve Googlebot a 503. But if that isn't possible, then as long as your website isn't down for a long period of time, the outage shouldn't hurt your site too badly beyond those initial few weeks after the downtime.
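For those who can control what the server returns during planned downtime, the idea can be sketched as a tiny maintenance responder. This is an illustrative example, not something from the Office Hours discussion: it is a minimal WSGI app (the function name `maintenance_app` and the one-hour `Retry-After` value are assumptions for the sketch) that returns a 503 status with a `Retry-After` header, which signals to Googlebot that the outage is temporary rather than a permanent removal.

```python
def maintenance_app(environ, start_response):
    """Respond to every request with 503 Service Unavailable.

    Mount this (or the equivalent in your own stack) while the real
    site is offline, so crawlers see a temporary error instead of
    404s or a dead connection.
    """
    status = "503 Service Unavailable"
    headers = [
        ("Content-Type", "text/html; charset=utf-8"),
        # Retry-After hints when the crawler should come back:
        # either a number of seconds (here: one hour) or an HTTP date.
        ("Retry-After", "3600"),
    ]
    start_response(status, headers)
    return [b"<h1>Down for maintenance</h1><p>We should be back within the hour.</p>"]
```

During a real outage you could serve this with `wsgiref.simple_server` on the same port, or achieve the same effect with a rewrite rule in Apache or nginx; the essential parts are the 503 status code and the `Retry-After` header.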