BUSINESS GROWTH SPECIALIST
CALL TODAY AND SELL MORE TOMORROW

Is Google’s AI Messing With Your Click-Through Rates?

Google’s shiny AI Overviews (the little chatbot-y blurbs at the top of search results) are stealing attention away from your website. No conspiracy theories or guesswork required. A recent Pew Research Center study shows what publishers have been suspecting all along: when Google serves up an AI-generated summary, users click far less often on actual links.

The reality? Your meticulously SEO-optimized page might never get seen. Because AI’s already given people what they think they were looking for.

Quick Recap: What Are Google’s AI Overviews Again?

In case you haven’t been keeping up, Google started placing AI Overviews (those AI-written answer boxes) at the top of search results. They pull together an answer from multiple sources, ideally giving people a quicker way to find info without clicking through to individual pages.

Great for users in a rush. Not so great if your business relies on organic traffic. And according to Pew, the effect isn’t just anecdotal - it’s measurable.

The Stats Google Would Prefer You Didn’t Focus On

Let’s look at the numbers behind the impact. Pew analyzed browsing data from 900 people in March 2025, and the results show a sharp dip in clicks when AI summaries are on screen:

  • Only 8% of visits with an AI summary led to people clicking on a regular search result.
  • That nearly doubles - to 15% - on pages without an AI summary.
  • Just 1% of visits with an AI summary included a click on a link inside the AI-generated answer. In other words: people treat it as the final stop.

But here’s the real eye-opener: users were far more likely to quit searching entirely when an AI summary was on the page - 26% of those sessions ended right there, compared with only 16% on standard search results.

So much for giving users more “insight.” The result? They don’t dig deeper. They don’t click. They don’t come to your website.

Who Does Show Up in the AI Summaries?

This won’t shock anyone. Big, broadly trusted sources show up the most - Wikipedia, YouTube and Reddit took the lion’s share of links in both AI summaries and regular search results. Together, those three made up roughly 15–17% of cited links.

That said, the mix shifts depending on where the links appear:

  • Wikipedia shows up more often in the AI summaries than in the regular results.
  • Government (.gov) sites also popped up more frequently: 6% of links in AI summaries vs. 2% in standard results.
  • Actual news websites? Only 5% - and no more visible in AI summaries than in regular results.

If you run an editorial site, your odds aren’t just long - they’re fading fast.

Why Some Queries Trigger AI Summaries - and Some Don’t

Turns out the structure of your query matters a lot:

  • Longer, wordier searches (think 10+ words) were much more likely to trigger an AI response: 53%, versus just 8% of one- or two-word searches.
  • Start your search with a question (“what is…”, “how do…”, etc.)? Even more likely: 60% of question-style searches brought up AI content.
  • Basically: if you’re Googling the way people type into ChatGPT, you’re getting ChatGPT-like answers back. Go figure.

Why This All Matters

Google’s goal is to keep users in its walled garden. And so far, the AI experience is doing exactly that - keeping searchers inside Google’s platform longer while cutting down on your traffic. The good news? The damage isn’t irreversible.

You need to rethink your SEO around trust and authority - your brand has to become a name people remember, not just one they stumble across in Google. Better yet, focus on creating the kind of content AI cites - Google tends to pull from major, authoritative information hubs.

Build content that’s quote-worthy, stat-backed, and positioned as an authority piece in your field, or it won’t get a second glance.