AI Referral Traffic: Who Sends It, Who Hides It

ChatGPT sends 78-87% of all AI referral traffic, but much of it hides as "direct" in GA4. Here's what the data actually shows about volume, conversion, and attribution across ChatGPT, Perplexity, and Gemini.

ChatGPT sends roughly 78-87% of all AI referral traffic to websites. Perplexity cites more sources but sends fewer visitors. Gemini barely registers today but is growing at 388% year-over-year. And none of it adds up to more than about 1% of total web traffic.

That last number matters. A 527% year-over-year increase in AI referrals sounds dramatic until you realize the base was nearly zero. The real story isn't volume. It's what happens when those visitors actually arrive.

Citations Don't Equal Clicks

The most counterintuitive finding in SE Ranking's research is the gap between being cited and getting traffic. Perplexity cited 8,047 sources across one case study site; ChatGPT cited 5,195. But ChatGPT delivered 73% of actual traffic versus Perplexity's 17%.

Why the disconnect? Users consume Perplexity's answers without leaving. Its citation-first design means the answer is the destination. ChatGPT, by contrast, displays clickable source cards that function more like search results. The platforms have fundamentally different relationships with the click.

This reframes what "visibility" means in an AI context. Getting cited in Perplexity is a brand awareness play. Getting cited in ChatGPT is closer to a traditional referral channel. Both matter, but they're not the same metric, and conflating them leads to bad strategy.

For the case study site that tracked this closely (80,000 annual sessions), AI platforms collectively sent about 200 sessions over 12 months. Google organic still drove 95% of all traffic.
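A back-of-envelope calculation makes the per-citation gap concrete. This sketch assumes the 73%/17% traffic shares and the ~200-session annual total refer to the same case study site; that pairing is our assumption, not something the research states outright.

```python
# Back-of-envelope: sessions earned per citation, per platform.
# Citation counts and traffic shares are from the SE Ranking case
# study discussed above; pairing them with the ~200-session total
# is our assumption.
total_ai_sessions = 200

platforms = {
    #             (citations, share of AI referral traffic)
    "ChatGPT":    (5195, 0.73),
    "Perplexity": (8047, 0.17),
}

for name, (citations, share) in platforms.items():
    sessions = total_ai_sessions * share
    print(f"{name}: {sessions:.0f} sessions / {citations} citations "
          f"= {sessions / citations:.4f} sessions per citation")

# ChatGPT:    146 sessions / 5195 citations = 0.0281 sessions per citation
# Perplexity: 34 sessions  / 8047 citations = 0.0042 sessions per citation
```

On these assumptions, a ChatGPT citation is worth roughly six to seven times more clicks than a Perplexity citation.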

We're talking about a signal that's genuinely tiny in absolute terms but disproportionately valuable per visit.

Small Volume, Extreme Conversion Gap

The volume numbers are modest. The conversion numbers are not.

AI-sourced traffic converts to sign-ups at 1.66% versus 0.15% from organic search; that's an 11x difference. Visitors from AI platforms spend 67.7% more time on pages: 9 minutes 19 seconds versus 5 minutes 33 seconds for organic search visitors.

Our read: this makes intuitive sense. Someone who clicks through from an AI answer has already been pre-qualified by the model. They didn't land on your page from a broad keyword match. They arrived because an AI system specifically cited your content as authoritative on a question they cared about. That's a fundamentally different kind of visitor.

The practical implication is that optimizing for AI citations is less about traffic volume and more about reaching high-intent users at exactly the moment they're evaluating solutions. A hundred AI-referred visitors converting at 11x the organic rate may be worth more than a thousand organic visitors who bounce.
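The arithmetic behind that comparison, using the conversion rates reported above:

```python
# Expected sign-ups per channel at the reported conversion rates.
ai_rate, organic_rate = 0.0166, 0.0015  # 1.66% vs 0.15%

ai_signups = 100 * ai_rate              # 100 AI-referred visitors
organic_signups = 1000 * organic_rate   # 1,000 organic visitors

print(f"AI:      {ai_signups:.2f} expected sign-ups")       # 1.66
print(f"Organic: {organic_signups:.2f} expected sign-ups")  # 1.50
```

One-tenth the traffic, slightly more conversions.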

Your Analytics Are Probably Wrong

Not all AI traffic shows up correctly in GA4, and the platform that sends the most traffic is the worst at attribution.

Dark AI traffic: AI-referred visits that analytics can't properly attribute. ChatGPT's Atlas browser frequently strips referrer headers, causing real visitors from ChatGPT to appear as "Direct" or "(not set)" in your reports. Your actual AI traffic is almost certainly higher than what GA4 shows.

Perplexity, by contrast, passes clean referrer data through perplexity.ai/referral, making it straightforward to track. The irony: the platform sending less traffic is far easier to measure accurately.

One way to investigate: look at your direct traffic trends. If direct sessions have increased while no other acquisition channel explains the growth, some of that "direct" traffic may actually be AI-referred visitors whose referrer headers were stripped. Segment by landing page and look for patterns: if pages that rank well in search (and are therefore likely to be cited by AI systems) show unusual spikes in direct traffic, that's a strong signal.
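A minimal sketch of that investigation, assuming session data exported from GA4 (for example, the BigQuery export flattened to a CSV). The column names date, default_channel, landing_page, and sessions are illustrative, not a GA4 schema:

```python
# Flag landing pages whose latest-month direct traffic is well above
# their trailing average -- candidates for stripped-referrer AI visits.
import pandas as pd

df = pd.read_csv("ga4_sessions.csv", parse_dates=["date"])  # hypothetical export
direct = df[df["default_channel"] == "Direct"]

# Month-by-month direct sessions per landing page.
monthly = (
    direct
    .groupby([pd.Grouper(key="date", freq="MS"), "landing_page"])["sessions"]
    .sum()
    .unstack("landing_page", fill_value=0)
)

baseline = monthly.iloc[:-1].mean()  # trailing average, excluding latest month
latest = monthly.iloc[-1]
spikes = (latest / baseline.replace(0, pd.NA)).dropna().sort_values(ascending=False)

# Cross-check the top entries against pages that rank well organically.
print(spikes.head(10))
```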

The baseline is a custom "AI/LLM Traffic" channel in GA4, built from regex patterns that match chatgpt.com and perplexity.ai as session sources. But understand that this only captures the traffic that self-identifies correctly.
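A minimal sketch of the same classification logic outside GA4, useful for applying one consistent regex to exported data. The source list is a starting point, not an exhaustive inventory:

```python
# Classify session sources into an "AI/LLM Traffic" bucket using the
# same kind of regex you'd register as a custom channel group in GA4.
import re

AI_SOURCE_PATTERN = re.compile(
    r"(chatgpt\.com|chat\.openai\.com|perplexity\.ai"
    r"|gemini\.google\.com|copilot\.microsoft\.com)",
    re.IGNORECASE,
)

def channel_for(source: str) -> str:
    return "AI/LLM Traffic" if AI_SOURCE_PATTERN.search(source) else "Other"

print(channel_for("chatgpt.com / referral"))    # AI/LLM Traffic
print(channel_for("perplexity.ai / referral"))  # AI/LLM Traffic
print(channel_for("google / organic"))          # Other
```

Revisit the pattern list periodically; new AI platforms appear in referrer data faster than default channel groupings are updated.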

Gemini Is the Long Game

ChatGPT dominates AI referrals right now with roughly 78% market share. Gemini sits at 6.4%. But Gemini's growth trajectory and structural advantages make it the platform to watch for the next two years.

Gemini is embedded in Google Search AI Overviews. It's integrated into Android at the system level. It sits in the Chrome sidebar. These aren't app installs competing for user attention; they're ambient access points reaching 2 billion monthly searchers who never have to open a separate product.

Digiday's data shows Gemini referral traffic grew 388% year-over-year compared to ChatGPT's 52%. The absolute numbers are still small, but the distribution advantage is structural, not cyclical. Where ChatGPT needs users to actively choose it, Gemini intercepts them during searches they were already doing.

Content already ranking well in Google and Bing naturally appears as citations in AI responses. That's not a coincidence; it's the retrieval pipeline at work. Strong organic foundations feed AI citation eligibility, which is why our AEO guide emphasizes treating answer engine optimization as an extension of SEO, not a replacement.

Our read: The practitioners tracking AI referral traffic today are mostly measuring ChatGPT's contribution and calling it a day. That's a reasonable short-term approach but a poor long-term strategy. Gemini's integration into Google's existing surfaces means it will likely surpass ChatGPT in total AI interactions within two years, even if ChatGPT retains higher per-session click-through rates.

The metric that matters isn't which platform sends the most traffic right now. It's which one your audience will encounter most frequently in 2027. Build your measurement and optimization strategy for that world, not this one.

Frequently Asked Questions