StatusCake

The Impact of Website Downtime on SEO

When your website goes offline there are many factors for you to consider as a priority, such as the loss of traffic, sales, and leads. However, another factor to consider is how a period of downtime will affect your standing in search engines such as Google.

Website downtime correlates directly with lost rankings in search, which in turn leads to a long-term loss of traffic for weeks and months after the initial period of downtime.

But what if your website only goes offline for a short period of time – would you still face a decline in your hard-won search rankings? In this article, we take a look at how Google analyses and interprets downtime, and what some of the senior figures at Google have said about how downtime relates to SEO.

Website crawling and Googlebot

When your website goes offline, how will Google know there is an issue?

Google indexes websites for its search engine using Googlebot – a web crawler that collects data from across the web to add to Google's search index.

If your website is experiencing a period of downtime when Googlebot comes to crawl your site, it will be met with the same error code as an end-user – most likely a 500 Internal Server Error response.
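You can see for yourself what status code a crawler would receive by fetching your site and inspecting the response. Here is a minimal sketch using Python's standard library; the `check_status` helper and its User-Agent string are illustrative, not part of any StatusCake or Google tooling:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def check_status(url: str) -> int:
    """Fetch a URL and return the HTTP status code it serves (0 if unreachable)."""
    req = Request(url, headers={"User-Agent": "uptime-check/1.0"})
    try:
        with urlopen(req, timeout=10) as resp:
            return resp.status  # 2xx/3xx responses land here
    except HTTPError as e:
        # 4xx and 5xx responses raise HTTPError; e.code is the status
        # code a crawler such as Googlebot would be served
        return e.code
    except URLError:
        # DNS failure or connection refused: no HTTP status at all
        return 0
```

A healthy page returns 200, while a site mid-outage typically returns 500 – exactly the signal the studies below measure.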

Just as an end-user will react negatively to a website that is unavailable, so too will Googlebot. A study by SEO specialists Moz found that intermittent 500 Internal Server Error responses caused tracked keywords to plummet in search, often dropping out of the top 20 completely.

The study also found that affected pages eventually received fewer crawls per day, suggesting that the more often Googlebot is served a server error, the less frequently it visits a page. This means the damage done by downtime compounds the longer the downtime lasts.

Google’s user-friendly mission

Why is it the case that websites which are frequently offline decline in rankings over time?

The most obvious answer is Google’s obsessive focus on providing users of the search engine with the best possible user experience.

The Google search algorithm calculates the rank of pages for any given search term based on hundreds of ranking factors, each of which is weighted according to the importance Google places on it. Many of the most important ranking signals, such as page speed, mobile-friendliness, and bounce rate, help to evaluate the overall experience a particular page is likely to provide to the end-user.

As such, a site that is frequently found to be offline by Googlebot will be evaluated as serving a sub-optimal user experience, and this will eventually be reflected in its ranking in search if the issue is not addressed in time.

What Google says about website downtime

Top Google employees are often pretty tight-lipped about search engine ranking factors, so when they do speak, the SEO world listens.

Matt Cutts was one of the most senior employees in the search team at Google and has provided a lot of insight into how downtime can affect your rankings. Cutts explained that if your website is down for just a day, there is unlikely to be any negative impact on your search rankings. However, an extended period of downtime, stretching over days or weeks, could result in your website losing search rankings for the simple reason that Google does not want to send a user to a website that is frequently offline.

Cutts also said that Google makes allowances for websites experiencing sporadic downtime, with Googlebot generally returning to a site that was offline 24 hours later to see if it is back online.

Google's John Mueller offered a slightly different perspective. He claims that search rankings will see a period of flux, lasting 1–3 weeks, after just one day of server downtime. As long as the downtime lasts no longer than a day, rankings will return to normal levels once that period of flux subsides.

The delay in rankings returning to normal is accounted for by Googlebot having to recrawl the site and make a judgment on its stability. It is for this reason that frequently offline pages and sites decline so significantly in rankings over time: Googlebot reduces its crawl frequency, and may even de-index a site entirely if it believes the site is permanently offline.

In conclusion, both third-party studies such as Moz's and the words of senior Google employees show that website downtime can wreak havoc on your hard-won search rankings. This is not an absolute rule, however: allowances are clearly made for websites that experience a short period of downtime, although even in this scenario rankings may fluctuate for a period of weeks before returning to normal. It is websites that experience sustained periods of downtime which are most liable to see a significant drop in search rankings, and the severity of that drop appears to increase the longer a website is found to be offline.

StatusCake provides a suite of uptime monitoring tools that are easy to set up and use, and provide you with the insights you need to prevent website downtime. Our free plan includes a range of free tools, including page speed monitoring, while our paid plans include SSL Monitoring, Server Monitoring, Domain Monitoring, and Virus Scanning.
