Intermediate

AI Overview Presence Tracker

Determine whether a target query triggers a Google AI Overview, who is cited inside it, and what content shape the cited sources share.

When to use this prompt

Run this when you are tracking a specific set of priority commercial queries and need to know whether Google AI Overviews are appearing on them, who is cited, and how the cited content is structured. Pair it with the citation share audit for a complete view of AI search visibility.

This is also the right prompt to run after Google rolls out an AI Overview update. Coverage and citations shift after each update, and teams that notice late usually lose citation share to competitors who adjusted first.

The prompt

<role>Search analyst checking Google AI Overview presence and citation patterns. You have live web access. Use it.</role>

<task>For each query, check the live Google search result. Record whether an AI Overview appears, who is cited, what shape the overview takes, and how the cited pages are structured.</task>

<inputs>
<target_domain>[YOUR DOMAIN]</target_domain>
<queries>
1. [QUERY 1]
2. [QUERY 2]
3. [QUERY 3]
4. [QUERY 4]
5. [QUERY 5]
</queries>
</inputs>

<instructions>
1. Run a live Google search for each query. Do not infer from training data.
2. For each query, record:
   - AI_OVERVIEW_PRESENT: Yes or No.
   - CITED_DOMAINS: comma-separated domains cited in the AI Overview, in the order they appear.
   - CITED_TITLES: a one-line topic summary for each cited page, in the same order.
   - ANSWER_SHAPE: exactly one of [Definition, Comparison, List, How-To, Recommendation].
   - WORD_COUNT_ESTIMATE: approximate word count of the AI Overview itself.
3. If a query does not produce an AI Overview, mark all subsequent fields "n/a" and move on.
4. After the table, output a summary block per the format below.
5. Do not speculate. If you cannot retrieve a value, write "could not retrieve" rather than fabricating one.
</instructions>

<output_format>
Markdown table with columns:
Query | AI Overview | Cited Domains | Cited Titles | Answer Shape | Word Count

After the table, output exactly:

Summary:
- AI Overview present on: X/5 queries
- Target domain cited in: X/X queries with overviews
- Most common answer shape: [Definition / Comparison / List / How-To / Recommendation]
- Pattern in cited page types: [1-2 sentences naming the dominant page type, e.g. "Comparison pages and definition pages dominate; pillar pages cited more than blog posts."]
</output_format>

How it works

XML tags separate inputs from instructions, which is critical when the prompt is run inside an agent that is also routing tool calls. The "use live web access" line in <role> blocks the most common failure mode: the model hallucinating overview content from its training data instead of running a real search.
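If you track the same panel every week, it is worth filling the template programmatically instead of by hand. Here is a minimal sketch, assuming you substitute the target domain and numbered queries into the prompt text before sending it to a web-enabled model; build_prompt is an illustrative helper, not part of any library, and the template is truncated after <inputs>.

```python
# Minimal sketch, not a library API: fill the template before sending it
# to a web-enabled model. The template is truncated after <inputs>; paste
# the <instructions> and <output_format> sections in full when you use it.
PROMPT_TEMPLATE = """<role>Search analyst checking Google AI Overview presence and citation patterns. You have live web access. Use it.</role>

<task>For each query, check the live Google search result. Record whether an AI Overview appears, who is cited, what shape the overview takes, and how the cited pages are structured.</task>

<inputs>
<target_domain>{domain}</target_domain>
<queries>
{queries}
</queries>
</inputs>

[<instructions> and <output_format> continue exactly as above]"""


def build_prompt(domain: str, queries: list[str]) -> str:
    # Number the queries 1..N to match the placeholder format in the prompt.
    numbered = "\n".join(f"{i}. {q}" for i, q in enumerate(queries, 1))
    return PROMPT_TEMPLATE.format(domain=domain, queries=numbered)


print(build_prompt("yourdomain.com", [
    "best product analytics tool 2026",
    "what is product analytics",
]))
```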

The ANSWER_SHAPE field is the key insight. If 80% of your priority queries trigger comparison-style AI Overviews, the highest-leverage content investment is comparison pages, not blog posts. Most teams skip this step and write whatever they were going to write anyway.

The CITED_TITLES field tells you what type of page is winning. If Reddit threads dominate, your problem is not content structure; it is content surface area. You may need to seed discussion in places you cannot directly control.
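Both reads, the answer-shape distribution and the cited-domain counts, are easy to pull out of the returned table. A minimal sketch, assuming the model followed the output format above exactly; parse_rows and summarize are illustrative helpers, and real model output may need sturdier parsing.

```python
from collections import Counter


def parse_rows(markdown_table: str) -> list[dict]:
    # Naive markdown-table parse: keep pipe-prefixed lines, skip the
    # header and separator rows, zip cells against the header names.
    lines = [ln for ln in markdown_table.strip().splitlines() if ln.startswith("|")]
    headers = [h.strip() for h in lines[0].strip("|").split("|")]
    rows = []
    for line in lines[2:]:
        cells = [c.strip() for c in line.strip("|").split("|")]
        rows.append(dict(zip(headers, cells)))
    return rows


def summarize(rows: list[dict], target_domain: str) -> None:
    overviews = [r for r in rows if r["AI Overview"] == "Yes"]
    shapes = Counter(r["Answer Shape"] for r in overviews)
    domains = Counter(
        d.strip() for r in overviews for d in r["Cited Domains"].split(",")
    )
    cited = sum(1 for r in overviews if target_domain in r["Cited Domains"])
    print(f"AI Overview present on: {len(overviews)}/{len(rows)} queries")
    print(f"Target domain cited in: {cited}/{len(overviews)} queries with overviews")
    print("Answer shapes:", shapes.most_common())
    print("Most-cited domains:", domains.most_common(3))
```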

Example output

| Query | AI Overview | Cited Domains | Cited Titles | Answer Shape | Word Count |
| --- | --- | --- | --- | --- | --- |
| best product analytics tool 2026 | Yes | g2.com, capterra.com, mixpanel.com | Top analytics list, Analytics buyer guide, Mixpanel pillar page | Comparison | ~180 |
| what is product analytics | Yes | amplitude.com, gartner.com, heap.io | Product analytics definition, Gartner definition, Heap explainer | Definition | ~120 |
| mixpanel vs amplitude | No | n/a | n/a | n/a | n/a |

Summary:
- AI Overview present on: 4/5 queries
- Target domain cited in: 1/4 queries with overviews
- Most common answer shape: Comparison (3/4)
- Pattern in cited page types: Comparison pages and definition pages dominate citations; pillar pages are cited more than blog posts.

Variations

  • Position tracking: Add a CITATION_POSITION column to track whether your domain is the first, second, or third citation. Position matters because the first citation gets the most click-through.
  • Page-type focus: Limit the panel to one page type (definitions, comparisons, how-tos) to deeply optimize for one shape at a time.
  • Weekly diff: Run weekly and add a "previous overview present" column to flag queries that gained or lost AI Overview coverage. A minimal diff sketch follows below.
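For the weekly diff, a minimal sketch, assuming each run is saved as a mapping from query to its AI_OVERVIEW_PRESENT value; this snapshot format is an assumption, not a fixed schema.

```python
# Sketch of the weekly diff (assumption: each run is saved as a dict of
# query -> AI_OVERVIEW_PRESENT value; this schema is illustrative).
def diff_runs(previous: dict[str, str], current: dict[str, str]) -> None:
    for query in sorted(set(previous) | set(current)):
        before, after = previous.get(query, "n/a"), current.get(query, "n/a")
        if before != after:
            change = "gained" if after == "Yes" else "lost"
            print(f"{query}: {change} AI Overview ({before} -> {after})")


last_week = {"mixpanel vs amplitude": "No", "what is product analytics": "Yes"}
this_week = {"mixpanel vs amplitude": "Yes", "what is product analytics": "Yes"}
diff_runs(last_week, this_week)
# -> mixpanel vs amplitude: gained AI Overview (No -> Yes)
```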