11 Reasons Why You’ve Seen a Sudden Drop in Website Traffic

The most common cause of a sudden drop in website traffic is a recent search algorithm update. Penalties, redirect errors, incorrect robots.txt rules and ranking losses are other common reasons why you may see a drop in website traffic.

Fortunately, if you’re affected by a sudden decline in traffic, there are several things you can check beyond those I’ve mentioned above. Hopefully, by the end, you’ll be able to diagnose why things have changed.

Website traffic loss checklist

Here are 11 points I like to check over if I notice that a site is experiencing lower monthly visits or a sudden decline in traffic:

  1. Algorithm updates
  2. Tracking errors
  3. Robots.txt rules
  4. Redirects
  5. Crawl errors
  6. Ranking losses
  7. XML sitemap changes
  8. Manual penalties
  9. De-indexing
  10. Cannibalisation
  11. SERP layout changes

1. Algorithm updates

Google doesn’t hide away from the fact that it releases multiple updates throughout the year, some more significant than others. Unfortunately, trying to get solid details of the changes is quite frankly like trying to get blood from a stone.

However, an easy way to gauge whether your site may have been impacted by an algorithm update is to keep a close eye on confirmed changes from Google themselves.

But, by far the easiest way to get information on algorithm changes is to make use of tools such as Mozcast - from Moz.com and the SEMrush Sensor from SEMrush. If neither of those takes your fancy, Algoroo is another algorithm tracking tool available free of charge.

If you find that there has in fact been a recent update, I’d highly recommend spending some time analysing the sites that have been affected the most. Try to spot any correlation between them and ensure that your site doesn’t suffer the same fate.

2. Tracking errors

Even now, I’m amazed at how many webmasters and site owners manage to pull their tracking codes from the site and wonder why traffic nosedives.

Fortunately, it’s a mistake that can be easily fixed, but in the long run, you will miss out on data - so the quicker you spot this and get it sorted, the better!

If you notice that there are suddenly no sessions being recorded in Google Analytics, or a tag isn’t firing, then chances are the tracking codes either have an error or have been removed entirely. If you have access, check to make sure that the code is present and correct.

Alternatively, contact your developers and confirm that the tracking code is where it needs to be and is working.
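
If you’d like to automate that check, here’s a minimal sketch in Python using the requests library; the page URL and tracking IDs are placeholders for your own:

```python
# A minimal sketch: fetch a page and check that the analytics snippet is still
# in the HTML. The URL and tracking IDs below are placeholders for your own.
import requests

PAGE = "https://www.example.com/"
TRACKING_IDS = ["UA-XXXXXXX-1", "GTM-XXXXXXX"]

html = requests.get(PAGE, timeout=10).text
for tracking_id in TRACKING_IDS:
    status = "found" if tracking_id in html else "MISSING"
    print(f"{tracking_id}: {status}")
```

Run it against a handful of key templates (homepage, category, article) rather than just one URL, since tags are often removed from a single template during a redesign.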

3. Incorrect robots.txt rules

Are you sure that your site isn’t blocking search engines from crawling in the robots.txt file?

It isn’t uncommon for developers to leave robots.txt files unchanged after migrating from a development or staging website. Most of the time, when this happens it’s completely accidental.

Go to your site’s robots.txt file and make sure that the following rule isn’t present:

Example of a robots.txt file

If it is, you’ll need to remove the Disallow rule and resubmit your robots.txt file through Google Search Console and the robots.txt tester.
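
If you’d like to double-check this programmatically, here’s a minimal sketch using Python’s built-in robotparser; the domain is a placeholder for your own site:

```python
# A minimal sketch using Python's built-in robotparser to confirm that search
# engine crawlers aren't blocked. The blanket rule to look out for is:
#   User-agent: *
#   Disallow: /
# The domain below is a placeholder for your own site.
from urllib import robotparser

parser = robotparser.RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()

for agent in ("Googlebot", "*"):
    allowed = parser.can_fetch(agent, "https://www.example.com/")
    print(f"{agent}: {'allowed' if allowed else 'BLOCKED'}")
```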

4. Redirect errors

Most sites, especially large websites, will have redirects in place. They’re most frequently added via a .htaccess file, or if you’re using WordPress, a plugin to make life a little easier.

Whenever you add a new permanent redirect (301) to your site, I’d highly recommend testing it before pushing it to a live environment, even more so if you're adding large quantities of redirects.

To make sure that redirects are still working as expected, I simply use a web crawler (my preference is Screaming Frog): switch to list mode (Mode > List), paste in the list of URLs that are being redirected, crawl them, and then analyse the response codes and final destinations:

Example of a URL list to crawl with Screaming Frog
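
If you don’t have a crawler to hand, a short script can do the same spot check. Here’s a minimal sketch in Python using the requests library; the URLs are placeholders for your own redirects:

```python
# A minimal sketch: follow each redirect and print the chain and final status.
# The URLs below are placeholders; swap in the redirects you want to test.
import requests

REDIRECTED_URLS = [
    "https://www.example.com/old-page/",
    "https://www.example.com/old-category/old-post/",
]

for url in REDIRECTED_URLS:
    response = requests.get(url, allow_redirects=True, timeout=10)
    chain = " -> ".join(f"{hop.status_code} {hop.url}" for hop in response.history)
    print(f"{url}\n  {chain} -> {response.status_code} {response.url}\n")
```

You’re looking for a single 301 hop to the intended destination; long chains, loops or a final 404 are the things to fix.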

5. Crawl errors

Using the new Search Console, open up the Index Coverage Report and check for any URLs that have an Error.

Any URLs in the coverage report that have an error associated with them won’t be included in the index. Typical errors that are found in this report include:

  • Server errors
  • Redirect errors
  • URLs that are blocked by robots.txt
  • URLs that are marked with a noindex tag
  • Soft 404 errors
  • URLs that return an unauthorised (401) response
  • URLs that aren’t able to be located (404s)
  • Crawling errors

More information on these reports can be found in Google’s Index Coverage report documentation.

6. Ranking losses

Another really common reason for a decline in website traffic is a loss of organic rankings.

Now, if you’re tracking your performance using a rank tracker, then troubleshooting this will be a lot easier. If you’re not, then utilising data from Search Console will be your best bet.

I use the following process to get an idea of any ranking changes:

  1. Using Google Analytics and Search Console or your preferred rank tracking tool (my preference is AccuRanker), identify when traffic started to drop
  2. Take an export of the ranking keywords before and after the drop
  3. Using Excel or Google Sheets, create a table and paste in the data side by side
  4. Compare the change in positions
  5. Retarget dropped terms with keyword research and mapping

Example of ranking loss comparison spreadsheet
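
If you’d rather script the comparison than build the table by hand, here’s a minimal sketch in Python with pandas, assuming two exports saved as CSVs with keyword and position columns (the file and column names are placeholders):

```python
# A minimal sketch, assuming two rank-tracker or Search Console exports saved
# as CSVs with "keyword" and "position" columns. File names are placeholders.
import pandas as pd

before = pd.read_csv("rankings_before.csv")  # export from before the drop
after = pd.read_csv("rankings_after.csv")    # export from after the drop

merged = before.merge(after, on="keyword", suffixes=("_before", "_after"))
merged["change"] = merged["position_before"] - merged["position_after"]

# A negative change means the keyword has dropped; sort to surface the biggest losses.
losses = merged.sort_values("change").head(20)
print(losses[["keyword", "position_before", "position_after", "change"]])
```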

Alternatively, tools such as SISTRIX are also really useful to help identify keywords that have dropped from page one or even the top 100 results.

Here are some more technical SEO tips to help your site rank, if you'd like to read up a little more on the subject.

7. XML Sitemap changes

If you’re an SEO, you’ll (hopefully) know that only URLs that return a 200 response and are indexable should appear in your sitemaps, unless you’ve purposely left redirected URLs in there so that search engines pick the redirects up quicker.

One reason why you could be seeing traffic plummet is a change in your XML sitemap.

Crawl the sitemap URLs and ensure that they all return a 200 OK response and that any new landing pages or articles are included too. If your site contains 200 URLs and there are only 50 in the sitemap you’ll want to regenerate and resubmit it using Search Console.
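
To automate that check, here’s a minimal sketch in Python using the requests library; the sitemap URL is a placeholder for your own:

```python
# A minimal sketch: fetch an XML sitemap and report any URLs that don't
# return 200 OK. The sitemap URL below is a placeholder for your own.
import requests
import xml.etree.ElementTree as ET

SITEMAP = "https://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP, timeout=10).content)
urls = [loc.text for loc in root.findall(".//sm:loc", NS)]

print(f"{len(urls)} URLs found in the sitemap")
for url in urls:
    # HEAD keeps the check lightweight; some servers may require GET instead.
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    if status != 200:
        print(f"{status}  {url}")
```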

8. Manual actions and penalties

A manual action will be issued against your site if one of Google’s eagle-eyed human reviewers finds content on the site that goes against their guidelines. You can find more information in Google’s Webmaster Guidelines.

Screenshot of Google Search Console's manual actions

You can see if your site has been affected by manual actions by using the manual actions report in Search Console.

9. URLs being de-indexed

Google recently tweeted about a reported ‘de-indexing’ bug that was causing important pages to drop out of the index almost overnight. But this isn’t just a recent problem.

Finding that important URLs are no longer appearing in the search results can be a major clue when investigating a sudden loss of website traffic.

  1. Check the index coverage report in Search Console for any errors
  2. Using the URL inspection tool, check that important pages are still in the index
  3. If not, use the ‘REQUEST INDEXING’ option in Search Console
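
The coverage and URL inspection checks above have to be done in Search Console itself, but you can at least rule out an accidental noindex on key pages with a quick script. Here’s a minimal sketch, assuming a hand-picked list of important URLs (the URLs are placeholders):

```python
# A minimal sketch: rule out accidental noindex directives on important pages.
# The URLs below are placeholders for your own key landing pages.
import requests

IMPORTANT_URLS = [
    "https://www.example.com/",
    "https://www.example.com/key-landing-page/",
]

for url in IMPORTANT_URLS:
    resp = requests.get(url, timeout=10)
    header_noindex = "noindex" in resp.headers.get("X-Robots-Tag", "").lower()
    # A rough string check for a robots meta tag; a proper HTML parser would be stricter.
    meta_noindex = 'name="robots"' in resp.text.lower() and "noindex" in resp.text.lower()
    print(f"{url}: status {resp.status_code}, "
          f"noindex header: {header_noindex}, noindex meta (rough check): {meta_noindex}")
```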

10. Keyword cannibalisation

If you’ve recently created a lot of new content around a specific topic without considering the keyword targeting, you may have accidentally fallen victim to a keyword cannibalisation issue.

Cannibalisation occurs when a website appears for a keyword with multiple URLs. For example, Ahrefs.com has a lot of content around broken link building:

Example of keyword cannibalisation (Source: Ahrefs.com)

If traffic is being spread across multiple pages or posts, you could be losing valuable organic traffic. The easiest way I’ve found to highlight cannibalisation errors has been through the use of BigMetrics.io and the cannibalisation report.

Simply create an account (trial or paid), connect it to your Search Console property and export the cannibalisation report.
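
If you’d prefer to dig into the raw data yourself, the same signal can be pulled from a Search Console performance export: queries where more than one URL is picking up clicks are cannibalisation candidates. Here’s a minimal sketch with pandas, assuming an export with query, page and clicks columns (the file and column names are placeholders):

```python
# A minimal sketch, assuming a Search Console performance export saved as a CSV
# with "query", "page" and "clicks" columns. The file name is a placeholder.
import pandas as pd

df = pd.read_csv("search_console_export.csv")

# Queries where more than one URL receives clicks are cannibalisation candidates.
pages_per_query = df[df["clicks"] > 0].groupby("query")["page"].nunique()
candidates = pages_per_query[pages_per_query > 1].sort_values(ascending=False)
print(candidates.head(20))
```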

11. SERP layout changes

A recent change in the way Google and search engines display organic results can have an impact on your traffic levels. So making sure that you’re adaptable and willing to make changes will go a long way.

Google, in particular, has made a number of changes to the way results are displayed: Featured Snippets, Knowledge Graphs and more prominent ads, to name a few, all of which can make life for SEO agencies and professionals very frustrating.

Example of a competitive SERP layout

In the screenshot above you’ll see that before you see any sign of an organic result you need to compete with Ads, Knowledge Graphs, Featured Snippets and Google’s Suggestions. This doesn't even take into consideration a number of other SERP features.

Analyse the keywords that you’re targeting; just because they didn’t trigger a SERP feature in the past doesn’t mean they don’t now. AccuRanker's SERP checker is great for this.

If the keywords you’re targeting are triggering featured snippets and instant answers, and you’re not the featured snippet, you’re going to be losing clicks and traffic to your site.

Summary

Seeing website traffic drop can be very disheartening, but there is always a reason why, and if there’s a reason, it can usually be fixed.

If you take one thing away from this post, it’s that a sudden decline could be due to a number of reasons combined, or even just one key traffic-rich page that has fallen from the index.

Make sure that you investigate every possible avenue, and you’ll quickly discover the cause and get a recovery plan in place.
