Not taking your website's downtime seriously?
Here's a warning from Google.
Google will de-index your web pages from search results if your site is down for more than a couple of days.
This was announced by John Mueller, Search Advocate at Google, during the Google Search Central SEO office-hours hangout on December 10.
During the hangout, an SEO professional asked Mueller how to minimize the potential impact of a week or more of website downtime while they resolved some bugs.
But Mueller's answer was surprising.
He said that the website's pages would be delisted from the search engine if the site stayed down for more than a couple of days:
“I don’t think you’ll be able to do it for that time, regardless of whatever you set up. For an outage of maybe a day or so, using a 503 result code is a great way to tell us that we should check back. But after a couple of days, we think this is a permanent result code, and we think your pages are just gone, and we will drop them from the index.”
He added that the pages will be crawled again once they come back:
“And when the pages come back we will crawl them again and we will try to index them again. But it’s essential during that time we will probably drop a lot of the pages from the website from our index, and there’s a pretty good chance that it’ll come back in a similar way but it’s not always guaranteed.”
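The 503 approach Mueller describes can be sketched as a minimal maintenance responder. This is an illustrative example, not anything from the hangout; the port and the one-hour `Retry-After` value are assumptions you would tune for your own outage window.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    """Answer every request with 503 so crawlers treat the outage as temporary."""

    def do_GET(self):
        self.send_response(503)                  # Service Unavailable: "check back later"
        self.send_header("Retry-After", "3600")  # optional hint: retry in about an hour
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"<h1>Down for maintenance. Back shortly.</h1>")

    def log_message(self, format, *args):
        pass  # keep the console quiet

def run(port: int = 8080) -> None:
    """Serve the maintenance response until interrupted."""
    HTTPServer(("", port), MaintenanceHandler).serve_forever()

# run()  # uncomment to serve the maintenance page on port 8080
```

The key point is the status code itself: a 503 tells Googlebot the page still exists and should be re-checked, whereas a 404 or a connection failure sustained for days reads as "gone."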
Increased Downtime Will Lead to Increased Fluctuations
While Google can re-index the pages once they are back, longer downtime can hurt search rankings.
Here's what Mueller had to say on this:
“So any time you have a longer outage, where I’m thinking more than a couple of days, I would assume that at least temporarily you will have really strong fluctuations and it’s going to take a little bit of time to get back in. It’s not impossible because these things happen sometimes. But if there’s anything you can do to avoid this kind of outage, I would try to do that.”
How to Deal with an Extended Downtime
First of all, try to avoid extended downtime altogether. If an outage is unavoidable, set up a static version of the main website that you can point users to in the meantime. Even then, it shouldn't stay in place for more than a day.
As Mueller put it:
“…setting up a static version of the website somewhere and just showing that to users for the time being. But especially if you’re doing this in a planned way I would try to find ways to reduce the outage to less than a day if at all possible.”
How to Prevent Your Website from Getting De-Indexed
Remember that a page delisted by Google affects more than page views. Deals, business opportunities, conversion rates, and the internal and external pages that link to it can all suffer. A page that returns after being de-indexed, for example, may not regain the ranking it held before.
That's why it's important to stay watchful and do everything you can to keep your pages live and indexed. Here are some strategies you can use to avoid being de-indexed.
Stay Watchful:
Regularly confirm that your pages are live, and check them for bugs.
Fix the Issues as Soon as You Notice Them:
If you notice an issue on your website, fix it as soon as possible, and use a 503 status code to let Google know the outage is temporary. If the page isn't back within a couple of days, it can be delisted.
Keep Your Alternative Ready:
If the site is down for maintenance or any other reason, serve a static version of the website for the time being.
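The "stay watchful" advice above can be automated with a small uptime check. This is a minimal sketch, not a production monitor; the URLs, timeout, and function name are all illustrative assumptions.

```python
import urllib.request
import urllib.error

def check_pages(urls, timeout=10):
    """Return a dict mapping each URL to its HTTP status code (or an error note).

    Anything other than 200 is worth a closer look; a 5xx that persists
    for days risks the page being dropped from Google's index.
    """
    results = {}
    for url in urls:
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                results[url] = resp.status
        except urllib.error.HTTPError as e:
            results[url] = e.code               # server answered, but with an error code
        except (urllib.error.URLError, OSError) as e:
            results[url] = f"unreachable: {e}"  # DNS failure, timeout, refused connection
    return results

# Example (placeholder URL, not from the article):
# check_pages(["https://example.com/"])
```

Running a check like this on a schedule means you notice an outage within minutes, well inside the "couple of days" window Mueller describes.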
By now it should be clear how a long outage can make Google de-index your pages. Recovery is possible, but prevention is better: keep checking your website for the issues that can lead to downtime.