Finding a significant drop in organic traffic is the stuff of SEO nightmares.
Naturally, panic sets in… and stress… and embarrassment… and the fear that it’s somehow your fault as the SEO.
If that’s where you find yourself now, take a moment and a big, deep breath. (It’s going to be okay, friend.)
Every effect has a cause, and a decline in organic traffic is no exception. Many of those causes are out of an SEO’s control. There are even times when losing traffic isn’t a problem at all.
The best way to validate whether there’s an issue - and find the contributing factors - is to conduct an SEO investigation.
When you’re ready, we’ll guide you through a proven investigation process that’s both measured and methodical — including where to focus and what to look for along the way.
Every SEO likely has a laundry list of fixes or optimizations they’d like to see implemented. (You know, the stuff that’s probably in your SEO roadmap.)
When organic traffic starts to tumble downhill, it’s easy for the first reaction to be “It’s that thing I’ve been telling them about!”
Here’s why you still need to validate your hunch. Imagine this:
You make a quick, off-the-cuff recommendation because “this has to be it!” Four weeks after implementation, traffic is still in freefall. Even worse, the resources that could have been put toward the right work just went to the wrong fix.
But if that thing isn’t it, why is organic traffic down? Where do I even start?
SEO can feel like a labyrinth of “it depends” scenarios. It’s hard to suss out clarity. Nevertheless, that’s our job. And while it’s not fun to face the team without an immediate answer, taking the time to find the REAL answer is fundamental to reversing the downtrend.
First, we need to figure out whether this drop in traffic has to do with a real problem or simply a tracking issue. In our experience, faulty analytics and tracking account for ~15-20% of cases. The great news is, an analytics snafu is typically a quick fix.
Here are a few real-life scenarios we’ve seen with our clients, just to give a sense of some common issues:
Whether you’ve implemented Google Analytics tracking code directly on your site or you’re using Google Tag Manager, there are a couple of things to look for:
Note: Internal analytics tools measure differently than an external tool like Google Search Console, so small disparities are normal.
A third-party GA and GTM audit can help make sure everything’s working right.
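Before commissioning an audit, a quick self-serve sanity check is to line up daily organic sessions from your analytics tool against daily clicks from GSC and look for a sudden divergence between the two. Here’s a minimal pandas sketch of that idea; the file names and column names are placeholders for whatever your exports actually contain:

```python
import pandas as pd

# Hypothetical daily exports: organic sessions from your analytics tool,
# clicks from Google Search Console (file and column names are placeholders).
ga = pd.read_csv("organic_sessions_daily.csv", parse_dates=["date"])
gsc = pd.read_csv("gsc_clicks_daily.csv", parse_dates=["date"])

merged = ga.merge(gsc, on="date", how="inner")

# Small disparities between tools are normal. What we care about is a sudden
# shift in the ratio of sessions to clicks, which points to a tracking break
# rather than a real traffic loss.
merged["sessions_per_click"] = merged["sessions"] / merged["clicks"]
merged["ratio_7d"] = merged["sessions_per_click"].rolling(7).mean()

baseline = merged["ratio_7d"].median()
suspect = merged[(merged["ratio_7d"] - baseline).abs() > 0.3 * baseline]

print(suspect[["date", "sessions", "clicks", "ratio_7d"]])
```

If the ratio holds steady while both lines fall together, the drop is probably real rather than a measurement artifact.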
If your analytics are trustworthy, then the real fun kicks off. It’s time to dig through the gray area and find some answers. But first, we have to find the source.
Not all organic traffic is a result of SEO work — it has a huge brand component too. The bigger a brand, the more organic traffic it gets from search terms with the brand name in the string.
Before launching into a full-fledged SEO investigation, the question is whether organic traffic loss is happening because fewer people are directly searching for the brand. That’s easy to find by looking at user queries in GSC. Filter to queries containing the brand and compare performance over different windows of time.
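To make that comparison concrete, here’s a minimal pandas sketch, assuming you’ve exported a GSC query report for each window; the brand regex and column names are placeholders for your own:

```python
import re
import pandas as pd

# Hypothetical GSC query exports for two comparable date ranges
# (columns: query, clicks). The brand pattern below is a placeholder.
current = pd.read_csv("gsc_queries_current.csv")
previous = pd.read_csv("gsc_queries_previous.csv")

BRAND_PATTERN = re.compile(r"acme|acmeco", re.IGNORECASE)

def brand_split(df: pd.DataFrame) -> pd.Series:
    """Sum clicks for brand vs. non-brand queries."""
    is_brand = df["query"].fillna("").str.contains(BRAND_PATTERN)
    labels = is_brand.map({True: "brand", False: "non-brand"})
    return df.groupby(labels)["clicks"].sum()

comparison = pd.DataFrame({
    "previous": brand_split(previous),
    "current": brand_split(current),
})
comparison["change_%"] = (comparison["current"] / comparison["previous"] - 1) * 100
print(comparison)
```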
If traffic from non-brand terms is relatively flat or up, while traffic from brand terms is significantly down, it’s probably the result of changes at the brand or channel level. We’re talking about stuff like:
No matter the channel, the impact on organic traffic is probably going to show up in brand traffic. After all, the fewer people who see your name, the fewer people who search for it.
Are a majority of clicks disappearing from non-brand terms? The next step is to figure out exactly where the leaks are. Look for connections between queries or pages seeing the most loss.
When digging into the data, identifying patterns becomes easier when you filter and sort by different variables. More often than not, the patterns you find will be the most telling clues.
While internal analytics and third-party tools like Semrush can make certain pieces easier, GSC should be all you need to look into the items below. Mind all four metrics for each: clicks, impressions, CTR, and average position. (There’s a rough example of slicing the data this way right after the list.)
Seasonality - Zoom out and look year over year - or at an even longer window - to check if there’s a consistent decline at the same time. (ex: Are you a gifting-heavy eCommerce business that experiences an annual post-holiday hangover?)
URL - Is the loss mostly coming from one page? A few? Many? Are these URLs part of the same subfolder? Do they share the same page template?
Query - Which search terms are contributing to the decline? Do they seem to be part of the same keyword cluster or topic area?
SERP Feature - Losing SERP features to competitors can cause significant traffic loss even when organic rankings stay mostly strong, because your organic listing is likely pushed below the fold.
Shopping & Media - Image and video searches, as well as organic results in Google Shopping, can drive significant traffic to relevant URLs. Look at each individually to parse out relevant issues.
Geography - Have there been changes in ranking at a regional level? This could indicate a potential decline due to shifts in regional interest or localization of search results. (Note: You’ll need a third-party tool other than GSC to drill down beyond the country level.)
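As promised above, here’s a rough pandas sketch of one way to slice the data, assuming you’ve exported the GSC “Pages” report for the declining period and a comparison period (file and column names are placeholders):

```python
import pandas as pd

# Hypothetical GSC "Pages" exports for the declining period and a
# comparison period (columns: page, clicks, impressions, ctr, position).
current = pd.read_csv("gsc_pages_current.csv")
previous = pd.read_csv("gsc_pages_previous.csv")

merged = previous.merge(current, on="page", suffixes=("_prev", "_curr"), how="outer").fillna(0)
merged["click_change"] = merged["clicks_curr"] - merged["clicks_prev"]

# Pages losing the most clicks, with the first subfolder pulled out so
# template- or section-level patterns stand out.
losers = merged.sort_values("click_change").head(25).copy()
losers["subfolder"] = losers["page"].str.extract(r"https?://[^/]+(/[^/]*)", expand=False)

print(losers[["page", "subfolder", "clicks_prev", "clicks_curr", "click_change"]])
print(losers.groupby("subfolder")["click_change"].sum().sort_values())
```

The same approach works against the query report if the pattern looks topical rather than template-driven.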
Woah, woah, woah… how could losing traffic not matter? Well, in most cases, traffic isn’t the primary metric a business cares about. More likely, the metric that matters is whatever the business counts as a conversion in its measurement plan.
Always prioritize conversion leaks over traffic leaks! Before sounding the alarms about traffic loss, check whether the impacted URLs are also losing conversions.
Your site may have a couple of home-run, top-of-funnel blog posts that drive a considerable amount of traffic. If they slip just one or two rankings, that could have a big traffic impact.
But is that traffic qualified? Is it valuable in the long term? This is where internal analytics tools can help dig into conversions.
Looking at a first-touch attribution model helps you understand how well upper-funnel pages move people toward conversions (since we can’t reasonably expect users to convert that early in the customer funnel).
If pages with high first- or last-touch conversion value are losing traffic, that’s a big deal.
If traffic loss isn’t resulting in conversion loss, it might not be a problem worth sounding the alarm over.
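A minimal sketch of that cross-check, assuming you can export conversions by landing page from your analytics tool for the same two periods (file and column names are placeholders):

```python
import pandas as pd

# Hypothetical exports for the same two periods: clicks by landing page
# from GSC, conversions by landing page from your analytics tool.
traffic = pd.read_csv("clicks_by_page.csv")            # page, clicks_prev, clicks_curr
conversions = pd.read_csv("conversions_by_page.csv")   # page, conv_prev, conv_curr

df = traffic.merge(conversions, on="page", how="left").fillna(0)
df["click_change"] = df["clicks_curr"] - df["clicks_prev"]
df["conv_change"] = df["conv_curr"] - df["conv_prev"]

# The pages worth escalating are the ones losing both traffic AND conversions.
priority = df[(df["click_change"] < 0) & (df["conv_change"] < 0)]
print(priority.sort_values("conv_change")[["page", "click_change", "conv_change"]])
```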
Your knowledge of SEO is the scale upon which you’ll weigh your findings and how significant each issue is. Before jumping to any conclusions, make sure to get ALL of the issues out on the table.
At this point, you know where the site is losing traffic from and whether it’s affecting critical business metrics. Now, let’s turn our focus to why it’s happening. Our first stop is the site, where we’ll focus on content and technical SEO.
The steepness of the trendline can give some clues off the bat:
If traffic saw a steep drop-off, we’re probably looking for major changes to content or something that broke in the technical setup.
If the decline is gradual, a “smoking gun” is less likely. There could be multiple variables with contributions that add up over time.
Unless swaths of content disappear from the site (or crawlers can no longer access it, which is a technical issue), a content issue will likely lead to a gradual decline.
Search engines like Google are constantly refining their understanding of what users want and mean when they use certain search terms — and in turn, what constitutes helpful content.
Even if a site continues to improve the content it creates, it’s almost inevitable that some past content will start to slide without updates. Yesterday’s content won’t always meet today’s standards.
Conducting a content audit can help you understand whether you’re following content best practices and whether issues like cannibalization or thin content are at play.
Any patterns in URLs you found when analyzing your data can help guide where to look for technical SEO issues.
If a good chunk of the URLs belong to a particular page template, then there’s a good chance there’s an issue specific to that template. Make it a point of focus in your tech audit and conduct SEO QA based on the type of page.
The most likely culprit behind a steep decline is going to be a change in how pages are either indexed or crawled. Think: page rendering, robots.txt changes, widespread meta robots changes, or canonicalization issues.
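For a handful of URLs from the affected template, a quick scripted spot check of status codes, robots directives, and canonicals can confirm or rule out those usual suspects. A rough sketch (the URLs are placeholders, and it only inspects raw HTML, so rendering issues still need a separate check):

```python
import urllib.robotparser

import requests
from bs4 import BeautifulSoup

# Placeholder URLs from the affected page template.
urls = [
    "https://www.example.com/category/widgets/",
    "https://www.example.com/category/gadgets/",
]

robots = urllib.robotparser.RobotFileParser("https://www.example.com/robots.txt")
robots.read()

for url in urls:
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")

    meta_robots = soup.find("meta", attrs={"name": "robots"})
    canonical = soup.find("link", attrs={"rel": "canonical"})

    print(url)
    print("  status code:", resp.status_code)
    print("  blocked by robots.txt:", not robots.can_fetch("Googlebot", url))
    print("  x-robots-tag header:", resp.headers.get("X-Robots-Tag", "none"))
    print("  meta robots:", meta_robots["content"] if meta_robots else "none")
    print("  canonical:", canonical["href"] if canonical else "none")
```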
Time to pull the blinds up and take a good look outside — outside of the site, that is. Even a squeaky-clean domain with great content can slip based on external factors. If your rankings stay consistent, traffic can still slip for a variety of reasons.
Backlinks operate like word-of-mouth recommendations that help Google understand a site’s “reputation.”
You can think of the site as a restaurant and each referring domain as a different person who recommends the restaurant. Backlinks are the number of times they recommend it to someone else.
An acquaintance at work might recommend a restaurant to all of your coworkers (lots of backlinks), but that still doesn’t hold as much weight as one recommendation from a James Beard Award-winning chef (a high-authority, contextually relevant referring domain).
That’s why we’re looking for either:
Third-party tools like Semrush, Ahrefs, or Majestic SEO make this pretty easy to find.
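If you’d rather eyeball the data yourself, here’s a minimal sketch comparing two referring-domain exports from whichever backlink tool you use (file and column names are placeholders and vary by tool):

```python
import pandas as pd

# Hypothetical referring-domain exports taken before and after the drop
# (columns: domain, authority_score).
before = pd.read_csv("referring_domains_before.csv")
after = pd.read_csv("referring_domains_after.csv")

lost = before[~before["domain"].isin(after["domain"])]

# A pile of lost low-quality domains matters far less than one lost
# high-authority, contextually relevant domain.
print(lost.sort_values("authority_score", ascending=False).head(20))
```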
If you’ve been in the SEO game long enough, the words “algorithm update” probably just sent a little shudder down your spine. That’s because an inopportune algo update can tank traffic fast.
Even worse, not every update gets a formal announcement. Even if you don’t see anything directly from Google, the algorithm may still have changed in meaningful ways.
This is where tapping into the rest of the SEO community can help. There are lots of experts out there tracking shifts in the Googlesphere and adapting best practices, including the types of sites and content that are most impacted.
Many even develop tools that make it easier to understand and identify impact from algorithm updates. Here are a couple of my personal favorites:
At this point, we’re all pretty familiar with the Knowledge Panel on Google SERPs.
Way back in 2012, the introduction of the Knowledge Panel sent SEOs into a tizzy. Many users no longer had to click through to organic results — even the one featured in the Knowledge Panel — because the information they needed was right there on the SERP.
Over the years, Google has released a slew of SERP features, made ads more prominent on the SERP, and further deprioritized traditional organic results.
This isn’t to say that all of this is bad or good. It’s merely to point out that the SERP is in every sense a variable — it’s in constant flux, and we have to consider that. Changes can have a big impact on whether organic users see a brand or make it to a site.
Here’s some rather alarming news: just because you’re not actively implementing shady back-alley SEO tactics doesn’t mean your domain is exempt from Google penalties. Yes, even SEOs with hearts of gold find themselves scrambling against penalization from time to time.
You can check for a penalty notification in your GSC account by navigating to Security & Manual Actions > Manual actions. If there is one, Google’s documentation on manual actions might come in handy, along with this complete list of Google SEO penalties.
One quick note: It’s important to make the distinction between impacts from algorithm updates and penalization. The former is Google making a change that it feels will serve users with more helpful, relevant content. Sometimes that can cause a sharp drop, but you’re not being penalized.
If you’re penalized, it’s because Google sees something on your site as harmful, manipulative, or extremely low-quality. Whereas if an algorithm update sends traffic spiraling, it’s because the target moved.
Preferences change, competition evolves, and organic traffic is at the whim of both. Digging into the market landscape as a factor is a matter of asking two questions.
You’re competing against other domains for organic rankings. What they do has an impact on you too.
If competitors overtake you for key terms, rankings, and SERP features, that means what they’re doing is working. Take note of their advantages by conducting a competitive analysis looking at both page- and domain-level factors.
At the page level, compare your URL(s) on the SERP versus competitors who now outrank you for the same term. Ask:
To assess domain-level factors, consider using a domain scorecard approach for a useful, at-a-glance assessment of site-level factors.
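To narrow down where that comparison is most urgent, here’s a rough sketch using a rank-tracking export that includes one competitor’s positions (all file and column names are placeholders):

```python
import pandas as pd

# Hypothetical rank-tracking export comparing your positions with a
# competitor's across tracked keywords.
ranks = pd.read_csv("rank_tracking_export.csv")
# columns: keyword, our_pos_prev, our_pos_curr, competitor_pos_curr

# Keywords where we slipped AND the competitor now sits above us.
overtaken = ranks[
    (ranks["our_pos_curr"] > ranks["our_pos_prev"])
    & (ranks["competitor_pos_curr"] < ranks["our_pos_curr"])
]

# These are the terms and URLs to run through the page-level questions above.
print(overtaken.sort_values("our_pos_curr")[
    ["keyword", "our_pos_prev", "our_pos_curr", "competitor_pos_curr"]
])
```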
If you’re in the U.S., you might remember the great toilet paper scare of 2020. With the COVID-19 lockdown imminent, consumers went scrambling for TP, leading to bare shelves and backorders.
Interestingly, that led to bidet interest in the United States reaching a five-year high.
Meanwhile, for equally obvious reasons, searches for concerts plummeted.
This is an extreme example, but it goes to show that consumer behavior changes — sometimes by necessity, sometimes by preference, but always with SEO impact.
How has volume for your most prominent keywords changed over time?
If fewer users are looking for your bread-and-butter terms, fewer users are going to make their way to the site. Using historical keyword data as digital market intelligence is a powerful way to understand whether the interest pool is shrinking.
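A minimal sketch of that check, assuming a monthly search-volume export for your core keywords from whatever keyword tool you use (file and column names are placeholders):

```python
import pandas as pd

# Hypothetical monthly search-volume history for core keywords
# (columns: keyword, month as YYYY-MM, search_volume).
volumes = pd.read_csv("keyword_volume_history.csv")
volumes["year"] = pd.to_datetime(volumes["month"]).dt.year

# Average monthly demand per keyword per year, most recent year last.
yearly = volumes.groupby(["keyword", "year"])["search_volume"].mean().unstack()

latest, prior = yearly.columns[-1], yearly.columns[-2]
yearly["yoy_change_%"] = (yearly[latest] / yearly[prior] - 1) * 100

# Keywords where the interest pool itself is shrinking.
print(yearly.sort_values("yoy_change_%").head(20))
```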
Ultimately, the landscape is made up of a number of factors: what you’re doing, what Google is doing, what competitors are doing, what users are doing, and what’s driving that behavior socioeconomically.
The clues you find across them often weave into a telling narrative that’s a lot bigger than SEO. For a great example, just check out Erika Varangouli’s LinkedIn post breaking down the performance of competing brands in the creator economy.
That wasn’t so bad, right? You have a list of issues from your investigation, which is the foundation of the roadmap that’s going to help recover your traffic. (Aka the answer you need to get to leadership.)
From here, it’s a matter of weighing the value of each potential optimization by putting your SEO product management hat on:
As always, we’re here to help.