SEO · 22.04.2026 · 9 min read

Why Is My Website Traffic Dropping — and How to Find the Real Cause

Kirill Bashorin
Founder

The worst thing you can do when traffic drops is guess at the cause and start fixing things at random. I've seen teams publish 20 new posts, rebuild their site architecture, and run a paid campaign — all in response to a drop that turned out to be a broken GA4 tag. Three months of wasted effort because nobody checked the measurement layer first.

There are roughly six categories of causes for a traffic drop. They require completely different responses. The diagnosis comes first.

Check Whether the Drop Is Real Before Doing Anything Else

A significant portion of “traffic drops” I get asked about aren't drops at all — they're measurement failures. A caching plugin update strips the GA4 tag from page output. A developer pushes a change that breaks the GTM container. Someone adds a cookie consent banner without configuring it correctly and it blocks analytics for users who decline. The session count in GA4 falls 40%, and everyone assumes the site lost ranking.

The verification takes five minutes. Open Google Search Console and look at clicks over the same period. Search Console measures from Google's side — it counts clicks from search results before anyone arrives on your site, so it's completely independent of your tracking setup. If Search Console shows stable clicks and GA4 shows a drop, your tracking is broken. If both show a drop, you have a real traffic problem.

Also check GA4 Realtime: open your site in a private browsing window and watch whether your own session appears. Two GA4 events per pageview means a duplicate tag. Zero means the tag isn't firing at all. This is the fastest sanity check available and it catches the majority of measurement issues in under two minutes.
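The cross-check between Search Console and GA4 can also be scripted once you have daily exports from both. A minimal sketch, assuming two lists of daily counts covering two comparable weeks (the function name and threshold are my own, not from either tool):

```python
def diagnose_drop(gsc_clicks, ga4_sessions, threshold=0.25):
    """Compare week-over-week change in Search Console clicks against
    GA4 sessions. Each argument is a list of 14 daily counts, oldest
    first. GSC stable + GA4 falling points at broken tracking."""
    def wow_change(series):
        prev, curr = sum(series[:7]), sum(series[7:])
        return (curr - prev) / prev if prev else 0.0

    gsc_delta = wow_change(gsc_clicks)
    ga4_delta = wow_change(ga4_sessions)

    if ga4_delta < -threshold and gsc_delta >= -threshold:
        return "tracking problem: GA4 dropped but Search Console is stable"
    if ga4_delta < -threshold and gsc_delta < -threshold:
        return "real traffic drop: both sources fell"
    return "no significant drop"
```

The same logic works at any granularity; daily data over two weeks is simply the fastest window in which a tag breakage becomes unambiguous.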

Algorithm Updates Are the Most Common Real Cause — and the Hardest to Hear

Google runs several major algorithm updates per year, each of which can materially redistribute rankings across thousands of sites simultaneously. If your traffic dropped in a specific week and you didn't change anything on the site, the first thing to check is whether a Core Update, Helpful Content Update, or spam update rolled out during that window. Google publishes these; cross-reference the dates.

The uncomfortable truth about algorithm-related drops: they usually mean the site has a real quality problem that the update surfaced, not that Google made an arbitrary error. Recoveries that come from “waiting it out” until the next update are rarer than the SEO industry implies. More commonly, recovery requires substantive changes to content quality, E-E-A-T signals, or site structure.

The diagnostic question is where the drop is concentrated. If it's across the entire site, an algorithm update is likely. If it's isolated to specific page types — blog posts, product pages, category pages — the cause is more likely to be topical rather than sitewide. Pull the Traffic acquisition report in GA4 filtered by Landing page, sort by sessions, and identify which specific pages lost the most traffic. That tells you whether you're dealing with a sitewide signal or a content-specific one.
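Grouping the losses by page type can be done in a few lines from a landing-page export. A sketch, assuming two dicts of URL → sessions for comparable periods (this is not GA4's API, just a way to read its export):

```python
from collections import defaultdict
from urllib.parse import urlparse

def losses_by_section(before, after):
    """Aggregate session loss by top-level path segment, so /blog/...
    pages roll up into one '/blog' bucket. Returns sections sorted
    by sessions lost, biggest loser first."""
    losses = defaultdict(int)
    for url, prev_sessions in before.items():
        curr_sessions = after.get(url, 0)
        path = urlparse(url).path
        section = "/" + (path.split("/")[1] if "/" in path else "")
        losses[section] += prev_sessions - curr_sessions
    return dict(sorted(losses.items(), key=lambda kv: -kv[1]))
```

If one section dominates the output, you have a content-specific problem; if the loss is spread evenly across sections, look at sitewide causes first.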

One pattern I see repeatedly: sites that grew rapidly by publishing high volumes of thin, AI-assisted content and then got hit by a Helpful Content Update. The signal Google is penalizing isn't AI usage — it's content that doesn't demonstrate genuine expertise or add something a reader couldn't find on the first page of results. The fix requires auditing and either upgrading or removing the low-quality pages, not publishing more of the same.

Technical Issues Can Kill Traffic Silently Over Weeks

Unlike an algorithm hit, technical problems don't always produce a sudden drop. Crawl issues, broken redirects, and indexing failures accumulate gradually — and often start in a part of the site nobody checks regularly.

The audit starts in Google Search Console. The Coverage report (now called Indexing in newer GSC versions) shows how many pages are indexed versus how many Google has discovered but not indexed, and why. A sudden increase in “Crawled — currently not indexed” or “Discovered — currently not indexed” URLs often precedes a traffic drop by several weeks, because Google removes pages from rankings before they disappear from the coverage report.

Core Web Vitals deserve attention separately. A site that passes CWV thresholds doesn't get a ranking boost — it just avoids a penalty. But a site that drops below threshold after a theme update or image optimization change can lose rankings on competitive terms where page experience is the margin. The CWV report in Search Console flags which URLs are failing and which metric is the culprit.

Redirect chains created by multiple site migrations are another quiet traffic killer. A page that redirected from URL A to URL B after migration one, then from B to C after migration two, passes progressively less link equity with each hop. If your site has been through more than one CMS change or domain migration, run a crawl with Screaming Frog and filter for redirect chains longer than two steps. Collapsing them to direct redirects is a low-effort fix with real impact on pages that depend on passed authority to rank.
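Once a crawl gives you the source → target pairs, collapsing the chains is mechanical. A sketch of the resolution step, assuming a plain dict of redirects (the loop guard and hop limit are my own additions):

```python
def collapse_chains(redirects, max_hops=10):
    """Given {source_url: target_url}, resolve each source to its
    final destination and count the hops, so chains longer than two
    steps can be flagged and flattened into direct redirects."""
    resolved = {}
    for src in redirects:
        seen, hops, cur = {src}, 0, src
        while cur in redirects and hops < max_hops:
            cur = redirects[cur]
            hops += 1
            if cur in seen:  # redirect loop: stop rather than spin
                break
            seen.add(cur)
        resolved[src] = (cur, hops)
    return resolved
```

Every entry with a hop count above one is a candidate for a direct rule pointing straight at the final URL.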

Content Decay Is Real and Predictable

Pages don't hold rankings indefinitely. Content that ranked well in 2022 on a topic that's evolved — SaaS pricing models, AI tools, regulatory requirements, technical best practices — will lose positions over time as fresher, more accurate content replaces it in results. This isn't punitive. It's Google surfacing more current answers.

The identification is simple: pull Search Console data for the past 16 months, filter by page, and look for pages where average position has been moving steadily downward over three to six months. A post dropping from position 5 to position 9 over six months is decaying. It hasn't fallen off the page yet, but the trajectory is clear. Catching it at position 9 is cheaper than catching it at position 24.
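The "steadily downward" judgment can be made objective with a least-squares slope over the monthly averages. A sketch, with a decay threshold that is my own illustrative choice, not a Search Console value:

```python
def position_trend(monthly_positions):
    """Least-squares slope of average position across consecutive
    months. A positive slope means the page is sliding down the
    results, because position numbers are increasing."""
    n = len(monthly_positions)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(monthly_positions) / n
    num = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(xs, monthly_positions))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

def is_decaying(monthly_positions, slope_threshold=0.3):
    """Flag pages losing more than ~0.3 positions per month."""
    return position_trend(monthly_positions) > slope_threshold
```

Run this over every page with meaningful impressions and you get a prioritized refresh queue instead of a gut feeling.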

Content decay fixes aren't about adding more words. A post that's decaying because the information is outdated needs updated information, not expanded sections on tangential topics. The update signal that actually moves rankings is changing the content in ways that reflect how the topic has changed — new data, revised recommendations, removed advice that no longer applies. Update the published date only when the content change is substantive. Changing the date on stale content without changing the content itself is a tactic that stopped working years ago.

Lost Backlinks Are an Underdiagnosed Cause

Backlinks expire. A publication replatforms and loses all its external links. A site closes. An editor does a content audit and removes old articles. Any of these can strip link equity from pages that depend on it to hold competitive rankings, and the traffic loss follows weeks or months later when rankings slip.

In Ahrefs, the Lost Backlinks report filtered to the last 90 days will show you which links have gone. The ones that matter are links from referring domains with real authority pointing to pages that rank for competitive terms. A link from a DA-60 publication to your most important landing page disappearing is worth investigating. A link from a low-quality blog you don't recognize disappearing is probably net neutral.
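That triage rule is easy to encode against a lost-backlinks export. A sketch with hypothetical field names (not Ahrefs' API; adjust keys to whatever your export uses):

```python
def triage_lost_links(lost_links, important_pages, min_dr=40):
    """Filter a lost-backlinks export down to losses worth chasing:
    authoritative referring domains pointing at pages you rely on
    for competitive rankings. Field names are illustrative."""
    return [
        link for link in lost_links
        if link["domain_rating"] >= min_dr
        and link["target_url"] in important_pages
    ]
```

Everything the filter drops is noise you can safely ignore; everything it keeps gets a manual look.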

If the lost links are from active sites that simply removed or updated content, outreach to request reinstatement or replacement is worth doing. If the referring domain has gone offline, you need to rebuild the lost equity through new link acquisition on the affected pages — which is a slower fix but the only one available.

SERP Feature Changes Reduce Clicks Without Reducing Rankings

You can hold position three for a high-volume keyword and still lose 30% of the clicks if Google adds a featured snippet, a Local Pack, a Shopping carousel, or an AI Overview above your result. The ranking didn't change — the click-through rate dropped because the SERP now answers the query before anyone reaches the organic results.

This is diagnosed in Search Console by looking at average position versus clicks side-by-side. If position is stable or improving while clicks are falling on the same pages, a new SERP feature has taken over the real estate above your result. Search the keyword yourself to see what changed.
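The stable-position-falling-clicks pattern can be checked programmatically across every page at once. A sketch, assuming per-period dicts with `position` and `clicks` (tolerances are my own illustrative defaults):

```python
def ctr_displacement(prev, curr, pos_tolerance=1.0, click_drop=0.2):
    """True when a page's average position held steady between two
    comparable periods while clicks fell enough to suggest a new
    SERP feature is absorbing them above the result."""
    stable = abs(curr["position"] - prev["position"]) <= pos_tolerance
    fell = (prev["clicks"] > 0 and
            (prev["clicks"] - curr["clicks"]) / prev["clicks"] >= click_drop)
    return stable and fell
```

Pages the function flags are the ones worth searching manually to see which feature appeared.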

The fix depends on the feature. For featured snippets, optimizing your content to win the snippet back is often the right move — being the snippet means zero clicks from competitors above you and a higher CTR from your result. For AI Overviews, the evidence on whether they structurally reduce clicks is still mixed, but the pattern I've seen is that informational queries are most affected while commercial and navigational queries are largely unchanged. Shifting content strategy toward commercial intent terms insulates traffic from zero-click search better than doubling down on purely informational keywords.

Seasonality Looks Like a Drop Until You Compare the Right Periods

Many businesses have seasonal traffic patterns they've never mapped because they only look at month-over-month comparisons. A B2B site that loses 25% of traffic in August every year isn't dropping — it's cycling. Comparing August 2026 to July 2026 looks alarming. Comparing August 2026 to August 2025 is the relevant measurement.

GA4 makes year-over-year comparison straightforward: in any report, switch the date range comparison from “preceding period” to “same period last year.” If the current period is down 5% versus last year and down 22% versus last month, you have seasonality, not a crisis. If both comparisons show decline, you have a real problem worth investigating.
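The two-comparison logic reduces to a small classifier. A sketch of the decision rule described above, with a 10% threshold as an illustrative default:

```python
def classify_drop(current, last_month, last_year, threshold=0.10):
    """Classify a traffic change using both month-over-month and
    year-over-year comparisons: a MoM drop with a stable YoY is
    seasonality; a drop on both axes is a real decline."""
    mom = (current - last_month) / last_month
    yoy = (current - last_year) / last_year
    if mom < -threshold and yoy >= -threshold:
        return "seasonality"
    if mom < -threshold and yoy < -threshold:
        return "real decline"
    return "stable"
```

The point of the function is the same as the point of the GA4 comparison toggle: never act on a single-axis comparison.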

The Diagnosis Determines the Fix

Traffic recovery without a correct diagnosis is just expensive experimentation. A site that lost traffic to an algorithm update won't recover from publishing more content. A site with a broken GA4 tag won't recover from anything — the traffic was never lost, only invisible. A page losing clicks to a featured snippet won't recover from a redirect fix.

The order of the investigation matters: verify measurement first, then cross-reference Search Console for real organic signals, then narrow the drop to specific pages or page types, then check the timeline against algorithm updates, technical changes, and backlink losses. Each step eliminates a category of causes and focuses the fix.

If you're seeing a drop and the cause isn't clear after running through this, take a look at our SEO services. A proper audit typically surfaces the cause within the first two weeks — and more often than not, the fix is less work than the initial panic suggested.

