What We’re Learning from Early AI Overview & LLM Visibility Tracking

AI-driven content discovery is evolving fast. Google AI Overview, ChatGPT, Perplexity, Gemini… these platforms are starting to influence what users see and what clicks through to your site. But tracking performance here is tricky.

Unlike traditional SEO or GA4 analytics, there’s no standard dataset, no universally agreed-upon methodology, and, for now, only “good, not great” ways to measure visibility.

In this blog, we’ll share our early experiments with GEO (generative engine optimisation) and AEO (answer engine optimisation) tracking, lessons learned from peers, and what you can practically do today to understand when and how your content shows up in AI-driven results.

Why GEO and AEO Tracking Is So Hard

AI tools personalise results for each user. Factors include:

  • Location: Geographically specific content can surface differently.
  • Query context: User history, phrasing, and intent affect what they see.
  • Model behaviour: Each LLM has its own “citation” logic and internal weighting for sources.

This means that two users asking the same question might get different answers. There’s no way to know exactly what everyone sees, and no way to capture all prompts.

In short, signals exist, but they’re noisy. And yet, tracking early trends can be surprisingly useful.

Early Experiments: What We’ve Tried

From recent experiments and Reddit discussions with other SEOs exploring AI visibility, a few patterns have emerged:

1. Citation Patterns Vary Across LLMs

Each model seems to pull from different sources or emphasise them differently. Roughly 40–50% of the time, the same domains appear across Google AI Overview, ChatGPT, and Perplexity citations; the rest of the time, each model leans on its own set of sources.

Reddit users noted that this may stem from search-engine dependency: some LLMs indirectly surface information ranked by search engines, while others rely more on training data or “trusted” sources.

2. Structured Data Helps, But It’s Not Magic

Some early experiments suggest that structured data, clean HTML, and well-implemented schema markup can increase the likelihood that your content is surfaced. But as one Reddit user put it: “Simple. They do not weigh schema in a traditional sense.” It helps models better understand context, but it’s not a direct ranking factor.
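To make that tangible, here’s a minimal sketch of generating FAQPage JSON-LD with Python. The questions and answers are placeholders we’ve invented for illustration; you’d embed the output in a <script type="application/ld+json"> tag on the relevant page.

```python
import json

# Hypothetical FAQ content; swap in questions your page actually answers.
faqs = [
    ("What is GEO?",
     "Generative engine optimisation: improving how AI systems surface and cite your content."),
    ("Does schema guarantee AI citations?",
     "No. It helps models understand context, but it is not a direct ranking factor."),
]

# Build a schema.org FAQPage object and serialise it as JSON-LD.
faq_schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": question,
            "acceptedAnswer": {"@type": "Answer", "text": answer},
        }
        for question, answer in faqs
    ],
}

print(json.dumps(faq_schema, indent=2))
```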

3. Referral Tracking Is Still Useful

Even without full visibility into AI prompts, website analytics provide measurable signals:

  • Traffic from links surfaced in AI tools
  • Clicks from AI-assisted answers
  • Landing page behaviour

While most AI answers don’t generate direct clicks, traffic from sources you can measure is real and actionable.
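As a starting point, here’s a rough sketch of how referral sessions exported from GA4, BigQuery, or server logs might be bucketed by AI platform. The referrer domains and field names below are assumptions rather than a definitive list, and they’ll need maintaining as platforms change how they pass referrers.

```python
# Rough sketch: classify referral sources from a GA4/BigQuery export or server logs.
# The domain list is an assumption; AI platforms change referrer strings over time.
AI_REFERRERS = {
    "chatgpt.com": "ChatGPT",
    "chat.openai.com": "ChatGPT",
    "perplexity.ai": "Perplexity",
    "gemini.google.com": "Gemini",
    "copilot.microsoft.com": "Copilot",
}

def classify_source(referrer: str) -> str:
    """Map a raw referrer string to an AI platform label, or 'Other'."""
    referrer = referrer.lower()
    for domain, platform in AI_REFERRERS.items():
        if domain in referrer:
            return platform
    return "Other"

# Example rows as they might come out of an export (field names are placeholders).
sessions = [
    {"referrer": "https://chatgpt.com/", "landing_page": "/blog/ai-visibility"},
    {"referrer": "https://www.google.com/", "landing_page": "/"},
]

for session in sessions:
    session["ai_platform"] = classify_source(session["referrer"])
    print(session["landing_page"], "->", session["ai_platform"])
```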

4. Query Fan-Out and Observability

Reddit contributors described “sandboxing” as a way to track AI visibility:

  • Create controlled prompts to test how your content is surfaced.
  • Compare output across multiple LLMs.
  • Identify “data vacuums”, which are queries your content doesn’t cover, leaving the AI to pull from competitors.

This is more observability than traditional analytics: you log prompts, responses, citations, and embeddings, then correlate them to relevance and user intent.
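Here’s a minimal sketch of what that kind of observation log could look like in Python. The model names, domain, and fields are placeholders, and how you actually capture each response (API, browser automation, or manual copy-paste) is left out.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Minimal observability record for sandboxed prompts. How you query each
# LLM is out of scope here; this only shows what you might log per run.
@dataclass
class PromptObservation:
    prompt: str
    model: str                      # e.g. "gpt-4o", "perplexity", "ai-overview"
    response: str
    cited_domains: list[str] = field(default_factory=list)
    our_domain_cited: bool = False
    captured_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

def record_observation(prompt: str, model: str, response: str,
                       citations: list[str],
                       our_domain: str = "example.com") -> PromptObservation:
    """Store one prompt/response pair and flag whether our domain was cited."""
    return PromptObservation(
        prompt=prompt,
        model=model,
        response=response,
        cited_domains=citations,
        our_domain_cited=any(our_domain in c for c in citations),
    )

obs = record_observation(
    prompt="What is answer engine optimisation?",
    model="perplexity",
    response="AEO is the practice of ...",
    citations=["searchengineland.com", "example.com/blog/aeo-guide"],
)
print(obs.our_domain_cited)  # True
```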

5. Early Tools Are Emerging

Some SEOs are testing early AI-visibility platforms:

  • Peec / Aiclicks: Track brand mentions and AI citations.
  • Brandlight: Aggregated visibility metrics and recommendations.
  • Verbatim Digital: Simpler outputs closer to what users see in AI responses.

None are perfect yet, but they help spot trends before full datasets exist.

6. Bing Webmaster Tools Introduces AI Citation Tracking

One of the biggest recent developments in AI visibility tracking comes directly from the search engines themselves. Microsoft has launched AI Performance reporting in Bing Webmaster Tools (Public Preview), giving publishers their first native view into how their content appears in AI-generated answers across Bing, Copilot, and partner integrations.

This is a meaningful shift. For the first time, site owners can see when their content is being cited by AI systems, not just when it ranks in traditional search.

The new dashboard includes:

  • Total citations: How often your content is referenced as a source in AI-generated answers.
  • Cited pages: Which specific URLs are being cited most frequently.
  • Grounding queries: The phrases and topics AI systems associate with your content when generating answers.
  • Citation trends over time: How your visibility in AI answers is increasing, decreasing, or changing.

This doesn’t show rankings in the traditional sense, but it does reveal something arguably more important: whether your content is being trusted as a source. And that distinction matters.

In AI-driven discovery, visibility isn’t just about appearing as a blue link. It’s about being referenced, summarised, and used to construct answers. This is effectively the first native tooling for GEO.
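Bing hasn’t (to our knowledge) published a formal export schema for the preview yet, so the file name and columns below are assumptions. The sketch simply shows how “citation trends over time” and “cited pages” can be rolled up once the data sits in a flat export.

```python
import csv
from collections import defaultdict

# Assumed export format (hypothetical columns): date, cited_url, citations.
# Adjust the field names to whatever the Bing Webmaster Tools export provides.
monthly_citations: dict[str, int] = defaultdict(int)
top_pages: dict[str, int] = defaultdict(int)

with open("bing_ai_performance_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        month = row["date"][:7]                     # bucket by YYYY-MM for a simple trend
        monthly_citations[month] += int(row["citations"])
        top_pages[row["cited_url"]] += int(row["citations"])

print("Citation trend by month:", dict(sorted(monthly_citations.items())))
print("Most-cited pages:", sorted(top_pages.items(), key=lambda kv: -kv[1])[:5])
```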

What This Means in Practice

This release validates what many early experiments have already suggested: AI visibility is measurable, just differently from traditional SEO. It also provides practical ways to improve your content:

  • Strengthen topical depth: Pages that are cited often tend to have clear expertise and focus.
  • Improve structure and clarity: Well-structured headings, FAQs, and clean formatting make it easier for AI systems to extract and reference information.
  • Keep content fresh: Updated content is more likely to be surfaced and cited.
  • Align entities consistently: Make sure your brand, products, and services are clearly and consistently described across your site.

Importantly, this moves AI visibility from guesswork toward observable performance. It’s still early. Bing’s data is only one piece of the ecosystem, and other platforms like Google AI Overview and ChatGPT don’t yet provide equivalent reporting. But it’s a clear signal that AI visibility is becoming a measurable channel.

We’re Learning That Tracking Is Good, Not Great

Across experiments, a few truths are clear:

  • Results are personalised: Each user sees something slightly different.
  • Direct traffic is measurable, but limited: Not all AI answers generate clicks.
  • Observational data is messy: Anecdotal evidence helps spot patterns, but isn’t statistically robust.
  • Early signals are strategic, not tactical: AI visibility often reflects intent alignment rather than pure traffic potential.

This is why we call it “good, not great” tracking: it’s directional, it surfaces opportunities, but it won’t replace core SEO or paid media tracking.

How to Put This Into Practice

Here’s a framework for turning fuzzy AI signals into actionable insights:

  1. Track referral traffic from AI-sourced links in GA4 or Looker Studio.
  2. Monitor AI citations using early visibility tools and note recurring domains.
  3. Sandbox prompts to test which content gets surfaced across models.
  4. Identify gaps where your content isn’t appearing; treat these “data vacuums” as targets for new content (see the sketch after this list).
  5. Focus on intent-heavy queries: AI answers reflect clear user intent, which can translate to higher-value leads, even if traffic volume is low.
  6. Compare against traditional SEO data: question-based or long-tail phrases can guide AEO targeting.
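To make step 4 concrete, here’s a minimal “data vacuum” check that builds on the observation log sketched earlier: compare the queries you want to be cited for against the prompts where your domain actually appeared. The queries below are invented examples.

```python
# Minimal "data vacuum" check: target queries you want to be cited for,
# minus queries where your domain actually appeared in AI citations.
target_queries = {
    "what is answer engine optimisation",
    "how to track ai overview citations",
    "does schema markup help ai citations",
}

# In practice this would come from your prompt/citation log (see the earlier sketch).
cited_for = {
    "what is answer engine optimisation",
}

data_vacuums = target_queries - cited_for
for query in sorted(data_vacuums):
    print("Content gap:", query)
```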

From Unmeasurable to Actionable

Tracking AI visibility is messy, imperfect, and early-stage. But the signals are there, and they’re useful:

  • AI-driven visibility tells you where your content aligns with user intent.
  • Observing trends helps fill content gaps and maintain a competitive advantage.
  • Measurable traffic, though limited, provides concrete ROI signals.

Like early SEO in the 2000s, it’s fragmented, unstandardised, and sometimes frustrating. But for those willing to experiment, it’s an opportunity to get ahead of competitors and understand how AI sees your content.

Next Steps:

  • Start logging AI prompts and tracking citations for your content.
  • Build a dashboard for referral and LLM-driven signals.
  • Use insights to refine clusters, structured data, and content targeting.
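If you want a concrete starting point for the first two items, here’s one way (of many) to log observations somewhere a dashboard can read them. The SQLite table and column names are suggestions, not a prescribed schema.

```python
import sqlite3

# Suggested (not prescribed) schema for logging prompts and citations so a
# dashboard or notebook can query them later. Column names are placeholders.
conn = sqlite3.connect("ai_visibility.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS ai_observations (
        captured_at      TEXT NOT NULL,
        model            TEXT NOT NULL,
        prompt           TEXT NOT NULL,
        cited_domains    TEXT,          -- comma-separated for simplicity
        our_domain_cited INTEGER        -- 0/1 flag for quick aggregation
    )
""")
conn.execute(
    "INSERT INTO ai_observations VALUES (?, ?, ?, ?, ?)",
    ("2025-01-15T10:00:00Z", "perplexity", "what is aeo", "example.com", 1),
)
conn.commit()

# Dashboard-friendly rollup: citation rate per model.
for model, rate in conn.execute(
    "SELECT model, AVG(our_domain_cited) FROM ai_observations GROUP BY model"
):
    print(model, f"{rate:.0%}")
```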

It’s not perfect, but directionally, it’s better than flying blind.

If you’d like to understand how your content is performing across AI search, or identify opportunities to improve your visibility, get in touch with Yoghurt Digital. We’ll help you translate early AI signals into a clear, actionable strategy for growth.
