The modern search landscape is undergoing a fundamental shift. For two decades, the relationship between search visibility and site traffic was linear: rank higher, get more clicks, more site visitors, more revenue. Today, that connection is weaker. Generative summaries and refined SERP layouts answer many queries directly on the results page, so searches produce fewer clicks.
A Pew Research Center analysis of March 2025 browsing behaviour found that when an AI Overview appeared, users clicked a traditional search result link 8% of the time versus 15% when no AI summary appeared; clicks on links within the AI summary itself occurred only 1% of the time. (Yikes. Nail in the coffin? No. Barry Adams says Google Zero is a lie. But if you ignore what’s happening, it becomes a self-fulfilling prophecy.)
If you’re seeing performance issues with your search traffic, here’s how to diagnose what’s happening. This is a 20-minute walkthrough designed for marketers who are familiar with GA4 reports—but don’t want to “become an SEO person” overnight. You’ll learn how to measure what changed, pinpoint where the loss lives, and decide whether it’s a DIY fix or time for a professional audit. (Video not for you? Jump down to the Assessment Workflow.)
If you’re the kind of person who wants more context, keep reading.
Background context #1: Anchor Your Analysis to the 2025 Core and Spam Update Windows
Before auditing your content, you must establish your search-performance baseline. In 2025, Google’s ranking systems underwent significant shifts across four major windows.
- March 2025 Core Update: Mar 13 → Mar 27, 2025
- June 2025 Core Update: Jun 30 → Jul 17, 2025
- August 2025 Spam Update: Aug 26 → Sep 21, 2025
- December 2025 Core Update: Dec 11 → Dec 29, 2025
You can view recent updates using the Search Status Dashboard.
Google’s own guidance for assessing the impact of these updates is surprisingly practical: confirm the rollout is complete, wait at least a full week after completion before analyzing, then compare the week after the rollout completed with the week before it began. Also worth noting: recovery is rarely instant. Improvements following a core update can take months to materialize and may not be fully reflected in the data until the next major system update. Knee-jerk changes during active turbulence are almost always counterproductive.
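If it helps to make those comparison windows concrete, here’s a small sketch that computes them from the rollout dates, using the March 2025 core update from the list above (the one-week windows reflect Google’s week-after vs. week-before guidance):

```python
from datetime import date, timedelta

# March 2025 Core Update window (from the Search Status Dashboard).
rollout_start = date(2025, 3, 13)
rollout_end = date(2025, 3, 27)

# Google's guidance: wait at least a full week after completion before
# analyzing, then compare the week after the rollout with the week before it.
analysis_ready = rollout_end + timedelta(days=7)
pre_window = (rollout_start - timedelta(days=7), rollout_start - timedelta(days=1))
post_window = (rollout_end + timedelta(days=1), rollout_end + timedelta(days=7))

print("Safe to analyze from:", analysis_ready)
print("Pre-update week:", pre_window)
print("Post-update week:", post_window)
```

Swap in the start and end dates for whichever update window you’re investigating.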
Background context #2: Distinguish Between SEO and GEO (Generative Engine Optimization)
To survive and thrive in the new search landscape, you must separate traditional search visibility from AI-driven citations. This distinction is the key to a clear diagnosis.
I recently participated in a ReaderBound webinar on search, SEO, and GEO [watch the replay for a good explainer of these terms and commentary on the impact of GEO on publisher sites]. In short:
- SEO (Search Engine Optimization) is about improving how your webpages are discovered and ranked in Google’s index—so your book pages, author pages, catalogues, and news articles show up for relevant searches. Google frames SEO as “making your site better for search engines,” with the practical goal of attracting more relevant traffic. Huzzah, we like that.
- GEO (Generative Engine Optimization) is the newer layer (sometimes called AEO, answer engine optimization): optimizing so that your content is selected, summarized, or cited by generative systems (like AI Overviews, AI Mode, and other AI answer engines). Do we want this? Not for whole texts, but maybe for “what to read next” queries.
And here’s the key shift I see in analytics:
- If you lose impressions + clicks, it is often a ranking / indexing / relevance issue (SEO can help).
- If you keep impressions but lose clicks (CTR), it is often a SERP layout / AI Overview / snippet competition issue (overlapping SEO + GEO strategies are required).
The Assessment Workflow: a step-by-step process to identify traffic loss after Google Core Updates
Below is the workflow I’d follow if you hired Boxcar Marketing. But I’ve adapted the steps so that you can run it in-house first. Follow it in order—because the later steps won’t make sense if the early steps are wrong.
First off, identifying the cause of traffic loss requires reconciling “pre-click” data from Google Search Console (GSC) with “post-click” data from Google Analytics (GA4). We need to look at both, and ideally Google Analytics (GA4) and Google Search Console (GSC) are linked.
Step 1: The “Boring-But-Necessary” Technical Check
Before diving into any analysis, rule out malware and setup errors in your data:
- Use the Correct GSC Property: Ensure you aren’t monitoring the wrong protocol (HTTP vs. HTTPS) or subdomain (WWW vs. non-WWW). This is the leading cause of “lost” traffic (i.e., it’s not lost; you’re looking in the wrong place).
- Manual Actions & Security: Check the “Security & Manual Actions” tab in GSC. If your site is suppressed by a penalty or malware, no amount of SEO or GEO strategy will help.
Step 2: The GA4 Phase: Channel Isolation
Confirm the traffic loss is actually Google organic search.
- Navigate to GA4 Reports → Acquisition → Traffic acquisition.
- Isolate Organic Search. Add a secondary dimension for Session source / medium and filter for google / organic. This ensures you aren’t blaming Google for a drop in Bing or DuckDuckGo traffic. (Alternatively, change the dropdown dimension from Session primary channel group to Session source / medium and look for google / organic)
- The Landing Page Rule: Use the Landing Page report rather than “Pages and screens” to understand the “first page in session” context. This is vital for seeing which entry points have actually been lost. Remember to add a secondary dimension for Session source / medium and filter for google / organic.
*If you’re lost, here’s how to use secondary dimensions.
Step 3: The GSC Phase: Three-Pass Analysis
Confirm whether the traffic loss is anchored to the core or spam update windows. Remember: You can view Google’s core and spam updates using the Search Status Dashboard.
Now, we assess:
- Pass 1: Executive Summary: Compare one week post-update vs. one week pre-update. Note the trend: Are impressions down (Ranking Loss) or is just CTR down (SERP Layout/AI Competition)?
- Pass 2: Page-Level Diagnosis: Sort by the largest click deltas (the change in clicks). Identify if the loss is concentrated in specific silos, such as author bios, editorial resources, or product pages.
- Pass 3: Query-Level Diagnosis: Look for “Content Decay.” If impressions for a query are stable but clicks have fallen, you are facing a GEO problem where AI is satisfying the user’s intent on the SERP.
Step 4: Create Your Triage Dashboard with Looker Studio and export to Google Sheets
For publishers with vast websites, manual review is impractical. Instead, build a diagnostic “Traffic Loss” spreadsheet to automate prioritization [watch the video walkthrough on diagnosing search traffic loss].
Google’s Search Central team provides a Looker Studio template for monitoring Search Console data. We can modify that template to see what pages have the greatest loss, and what search queries are factoring into that loss.
- Log into your Google Account (whatever you use to view GA4 and GSC)
- Go to the Google Search Console Looker Studio template
- Add your own data, then use Edit and Share so you have a copy you can modify.
- Copy the Landing Page table to a new page.
- Add the dimension Query
- Add the metric Average Position
- Set your reporting timeframe to the period you are investigating (one week post-update).
- Turn on compare and set the comparison period to one week pre-update.
- Adjust the style so you have a table with the dimensions Page and Query, followed by the metrics Impressions, Impression Change, Clicks, Click Change, CTR, CTR Change, Average Position, and Average Position Change.
- Set the delta columns to show absolute change rather than % change.
- Now export the data to CSV or Google Sheets (click the three dots in the top right corner of the table)
Step 5: Automatic Diagnosis Labels in Google Sheets
Now that your data is in Google Sheets, you can sort, filter, run pivot tables and complete your analysis using IF-based logic formulas. Here are the labels I assign to each row:
- Ranking Loss: If Position has worsened AND Impressions are down. This indicates an SEO/Relevance issue.
- CTR Problem: If Positions and Impressions are stable, but Clicks have dropped significantly. This is the sign of AI Overview competition.
- Demand Drop: If Positions are steady, but both Impressions and Clicks have fallen. This indicates the topic itself is losing search volume (seasonality or trend shift).
Here’s the IF statement I use to automate the label/analysis process. The benchmarks for what counts as “stable” or “loss” come from the dataset I was reviewing; adjust them to yours. With Page in column A and Query in column B, the metric columns follow the export order above: D is Impression Change, F is Click Change, H is CTR Change, and J is Average Position Change.
=IF(AND($F2>=0,$D2>0),"OK. Clicks are stable or growing",
IF(AND($J2>2,$D2<-50),"Ranking Loss: Position drop AND Impressions are down. This indicates an SEO/Relevance issue.",
IF(AND(ABS($D2)<=50,$J2<2,$H2<0),"CTR Problem: Positions and Impressions are stable, but Clicks have dropped significantly. This is the sign of AI Overview competition.",
IF(AND(ABS($J2)<=2,$D2<0,$F2<0),"Demand Drop: Positions are steady, but both Impressions and Clicks have fallen. This indicates the topic itself is losing search volume (seasonality or trend shift).","Other"))))
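If you’d rather script this step than work in Sheets, here’s a minimal pandas sketch of the same labeling logic, using the same thresholds as the formula above. The column names and sample rows are assumptions; rename them to match your own export.

```python
import pandas as pd

# Toy rows standing in for your export; in practice, load the CSV
# with pd.read_csv() and rename columns to match these names.
df = pd.DataFrame({
    "page": ["/books/a", "/authors/b", "/blog/c", "/resources/d"],
    "impression_change": [-120, -10, -300, 40],
    "click_change": [-15, -9, -30, 5],
    "ctr_change": [-0.01, -0.02, 0.0, 0.01],
    "position_change": [3.5, 0.4, 1.0, -0.2],
})

def label(row):
    # Same benchmarks as the Sheets formula: "stable" impressions means
    # |change| <= 50, "stable" position means |change| <= 2. Tune to your data.
    if row["click_change"] >= 0 and row["impression_change"] > 0:
        return "OK"
    if row["position_change"] > 2 and row["impression_change"] < -50:
        return "Ranking Loss"
    if abs(row["impression_change"]) <= 50 and row["position_change"] < 2 and row["ctr_change"] < 0:
        return "CTR Problem"
    if abs(row["position_change"]) <= 2 and row["impression_change"] < 0 and row["click_change"] < 0:
        return "Demand Drop"
    return "Other"

df["diagnosis"] = df.apply(label, axis=1)
print(df[["page", "diagnosis"]])
```

The upside of scripting is repeatability: you can rerun the same diagnosis after each update window without rebuilding the spreadsheet.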
Bonus: Embrace Pivot Tables
I like exporting to Google Sheets so that I can automate the diagnosis but I also love using pivot tables to identify:
- Types of Pages losing traffic (what content type has the loss? Book pages, author bios, blog posts, resources?)
- Quick SEO opportunities (what pages have fallen off page 1 of SERPs? Identify the pages in position 8-20 with high impressions?)
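Both of those pivot-table views can also be sketched in pandas against the exported data. The page-type grouping below assumes the content type is the first URL segment (e.g. /books/, /authors/); adjust the parsing and thresholds to your own site structure.

```python
import pandas as pd

# Toy export rows; in practice, load your Google Sheets export with pd.read_csv().
df = pd.DataFrame({
    "page": ["/books/a", "/books/b", "/authors/c", "/blog/d"],
    "click_change": [-40, -25, -10, -5],
    "impressions": [900, 1200, 300, 5000],
    "avg_position": [4.2, 12.5, 3.1, 9.8],
})

# Derive a rough content type from the first URL segment
# (an assumption about your URL structure).
df["page_type"] = df["page"].str.split("/").str[1]

# 1) Which types of pages are losing the most clicks?
loss_by_type = df.pivot_table(index="page_type", values="click_change", aggfunc="sum")

# 2) Quick SEO opportunities: pages just off page 1 (position 8-20)
#    that still earn meaningful impressions.
quick_wins = df[df["avg_position"].between(8, 20) & (df["impressions"] >= 1000)]

print(loss_by_type)
print(quick_wins[["page", "avg_position", "impressions"]])
```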
If your Traffic Loss spreadsheet says “Ranking Loss” across core revenue pages (book pages, author pages, submissions), that’s your cue for a technical audit. If it’s “CTR Problem” on informational content, that’s your cue for a refreshed GEO + SEO content strategy. Either way, panic is not a KPI. Let me know if you want help.
Long-Term Survival in the Generative Landscape
As we look forward, SEO strategy must pivot from chasing clicks to building a resilient content destination.
- Adopt GEO Structures: Optimize for “citations” by using clear, concise answers to specific questions and adopting conversational language that generative models can easily parse and credit.
- Diversify Discovery: Reduce your reliance on Google. Shift resources toward email marketing, LinkedIn, Substack and YouTube to build direct audience relationships that don’t depend on a middleman.
- Unique Value Provision: Move away from commodity content (clickbait content, non-differentiated content). Exclusive research, deep-dive interviews, and unique analysis provide utility that an AI summary cannot replicate.
In conclusion
Traffic volatility is no longer a temporary glitch; it is the new baseline. As AI Overviews expand across more than 100 countries and dozens of languages, publishers must confront a difficult strategic question:
Does your content provide enough unique insight or utility to justify a click, even after an AI has already provided the “answer”?
If the answer is no, SEO is a short-term Band-Aid. The path to reclaiming lost traffic lies in becoming a destination that users seek out directly—a brand that users trust for the nuance that AI cannot provide.