
What is AI visibility?

The measurable surface between your content and the AI engines that cite it.

AI visibility is whether ChatGPT, Claude, Perplexity, and Gemini can find your page, read it, understand it, and cite it inside the answers they generate for users. It is measurable, it drifts week over week, and it is not the same thing as ranking on Google. This page defines the term, names its five components, and explains the gap between AI visibility and the SEO playbook most sites already run.

By Martin Yarnold · Updated
Free Sentinel snapshot
Promagen Sentinel measures your AI visibility weekly and emails you the score. See the same audit Promagen runs on itself.
See how Sentinel measures it →

What AI visibility is — and is not

It is

Whether AI engines can crawl your pages, read the content, understand the entities, parse the structured data, and reach each page through internal links. A measurable composite score that drifts over time.

It is not

Google rankings, AI Overviews opt-in status, ad spend, social engagement, or backlink count. None of these predict whether a generative engine will cite your page inside its answer.

The five components Sentinel measures

Sentinel computes a composite 0–100 score from five components every Monday. The weights are fixed in `frontend/src/types/sentinel.ts` and have stayed constant since Sentinel went live. They sum to 1.0 by construction.

| Component | Weight | What it measures |
| --- | --- | --- |
| Availability | 40% | Pages that return 200 to AI crawlers across the measurement window. |
| Metadata | 20% | Pages where title, meta description, and canonical URL are all present. |
| Schema | 15% | Pages emitting valid JSON-LD structured data AI engines can parse. |
| Regression burden | 15% | Count of unresolved week-over-week drops, weighted by severity. |
| Orphan risk | 10% | Pages with fewer than three inbound internal links from the same site. |
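The weighted sum behind a composite like this can be sketched in a few lines. This is an illustrative reconstruction, not the actual Sentinel implementation: the type name, function name, and the assumption that every component is pre-normalised to a 0–100 subscore (with regression burden and orphan risk already inverted so that higher is better) are ours; only the weights come from the table above.

```typescript
// Illustrative only: the real weights live in frontend/src/types/sentinel.ts.
// Assumes each component is already normalised to 0-100, higher = healthier.
type ComponentScores = {
  availability: number;
  metadata: number;
  schema: number;
  regressionBurden: number;
  orphanRisk: number;
};

// Weights from the table above; they sum to 1.0 by construction.
const WEIGHTS: ComponentScores = {
  availability: 0.4,
  metadata: 0.2,
  schema: 0.15,
  regressionBurden: 0.15,
  orphanRisk: 0.1,
};

function compositeScore(scores: ComponentScores): number {
  const total = (Object.keys(WEIGHTS) as (keyof ComponentScores)[]).reduce(
    (sum, key) => sum + WEIGHTS[key] * scores[key],
    0,
  );
  return Math.round(total);
}
```

With all five components at 100 the composite is 100; dropping availability to 50 while the rest stay perfect costs 20 points, which is why availability dominates the score.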

Why this is not a rebadged SEO score

SEO grew up on the assumption that a human is going to read a list of ten blue links and click one. Every signal in classic SEO — keyword targeting, click-through rates, time on page, bounce rate — assumes a human reader on the receiving end. AI visibility assumes a generative engine on the receiving end. The engine does not click. It reads, decides whether your page deserves to be cited, paraphrases it, and writes the answer for the human.

That changes which signals matter. Crawl response codes, schema validity, entity clarity, internal-link topology, and freshness all matter more. Keyword density, anchor-text optimisation, and PageRank-style link authority all matter less. A page can score 95 on a traditional SEO audit and still be invisible to ChatGPT because schema is missing, or because the engine cannot disambiguate the page's primary entity.
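Missing schema is concrete: it means the page serves no JSON-LD block an engine can parse. A minimal example of what such a block could look like for this kind of page is below; the URL and field values are illustrative placeholders, not Promagen's actual markup.

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "What is AI visibility?",
  "author": { "@type": "Person", "name": "Martin Yarnold" },
  "publisher": { "@type": "Organization", "name": "Promagen" },
  "mainEntityOfPage": "https://example.com/what-is-ai-visibility"
}
```

A block like this, served in a `<script type="application/ld+json">` tag, is what lets an engine disambiguate the page's primary entity instead of guessing from prose.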

Frequently asked questions

How is AI visibility different from SEO?

SEO measures whether a search engine ranks your page in a results list a human clicks. AI visibility measures whether a generative engine — ChatGPT, Claude, Perplexity, Gemini — cites your page inside a synthesised answer. The two correlate weakly: AI engines retrieve from different indexes, weight different signals, and present sources differently. A page can rank well on Google and be invisible to ChatGPT, or sit on page two of Google and still be cited by Perplexity. Treat AI visibility as its own measurement, not a derivative of SEO.

What are the five components of an AI visibility score?

The composite Sentinel publishes weights five components: availability (40%), metadata (20%), schema (15%), regression burden (15%), and orphan risk (10%). Availability is whether pages return 200 to AI crawlers. Metadata is whether title + meta description + canonical URL are all present. Schema is whether JSON-LD structured data describes the page. Regression burden penalises unresolved week-over-week drops. Orphan risk penalises pages with too few inbound internal links. The weights are defined in `frontend/src/types/sentinel.ts` and have stayed constant since Sentinel went live.

Can I check whether ChatGPT can see my site?

Yes — three quick tests. First, confirm `OAI-SearchBot`, `GPTBot`, and `ChatGPT-User` are allowed in your robots.txt (they have distinct purposes; one allow line is not enough). Second, ask ChatGPT directly: prompt it to find your most important page and report what it sees — if it cannot find the page or paraphrases stale content, the page is invisible or out of date in its index. Third, run a structured audit that checks crawl response codes, metadata, JSON-LD schema, and inbound link topology in one pass. The third is what Sentinel does every Monday.
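The first test can be partially automated. The sketch below is a naive robots.txt check, not a full RFC 9309 parser: it ignores `Allow` directives and path wildcards, and the function name is ours, not a Sentinel API. It only flags agents that are blanket-disallowed (`Disallow: /`), either in their own group or via the `*` fallback.

```typescript
// Hypothetical helper, not part of Sentinel: flag AI agents that a
// robots.txt blanket-disallows. Naive parser; no Allow or wildcard support.
const AI_AGENTS = ["OAI-SearchBot", "GPTBot", "ChatGPT-User"];

function blockedAgents(robotsTxt: string): string[] {
  const disallows = new Map<string, string[]>();
  let agents: string[] = []; // user agents named by the current group
  let inGroup = false; // true once the current group has rules
  for (const raw of robotsTxt.split("\n")) {
    const line = raw.split("#")[0].trim();
    const colon = line.indexOf(":");
    if (colon < 0) continue;
    const field = line.slice(0, colon).trim().toLowerCase();
    const value = line.slice(colon + 1).trim();
    if (field === "user-agent") {
      if (inGroup) agents = []; // a new group starts
      agents.push(value.toLowerCase());
      inGroup = false;
      for (const a of agents) if (!disallows.has(a)) disallows.set(a, []);
    } else if (field === "disallow") {
      inGroup = true;
      for (const a of agents) disallows.get(a)!.push(value);
    }
  }
  return AI_AGENTS.filter((agent) => {
    // A specific group overrides the * group, even if it is empty.
    const rules =
      disallows.get(agent.toLowerCase()) ?? disallows.get("*") ?? [];
    return rules.includes("/");
  });
}
```

Note the per-agent lookup: this is why one `Allow` line for a single bot is not enough. Each of the three agents resolves its own group independently, so each needs to be checked by name.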

Which AI engines cite sources most often?

Perplexity is the most citation-heavy by design — every answer carries numbered source links, and source quality directly affects ranking. ChatGPT Search shows source attribution on retrieval-augmented answers; pure model knowledge does not. Claude returns inline citations when invoked with search or document context. Gemini cites visibly in AI Overviews and Search Generative Experience but less consistently in the standalone Gemini app. The rule of thumb: the more retrieval an engine performs at answer time, the more your page can be cited.

How often do AI engine crawl and citation patterns change?

Crawl behaviour drifts week over week. Engines re-index on their own cadence, add and remove sources from training cuts, and change retrieval ranking quietly. A site that was cited prominently in one month can drop out of an engine's answer set the next without any change to the site itself. This is why a snapshot audit is insufficient: AI visibility needs continuous measurement on a recurring cadence (Sentinel runs weekly) so drift is detected before it costs traffic.
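Continuous measurement reduces to a simple comparison over the weekly series. The sketch below is our illustration of week-over-week drift detection, not Sentinel's actual regression logic; the type, function name, and default threshold are assumptions.

```typescript
// Illustrative drift check: flag any week whose composite score dropped
// more than `threshold` points versus the previous week. Not Sentinel code.
type WeeklyScore = { week: string; score: number };

function detectRegressions(history: WeeklyScore[], threshold = 5): string[] {
  const flagged: string[] = [];
  for (let i = 1; i < history.length; i++) {
    if (history[i - 1].score - history[i].score > threshold) {
      flagged.push(history[i].week);
    }
  }
  return flagged;
}
```

A one-off audit is the `history.length === 1` case: there is nothing to compare against, so drift is invisible by construction.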

Does llms.txt improve AI visibility?

It helps, but it is not the lever most sites think it is. llms.txt is a machine-readable markdown summary AI engines can ingest as authoritative metadata for a site. Serving one is a Tier-2 differentiator most sites skip. But llms.txt does not replace robots.txt, sitemap.xml, structured data, or substantive on-page content — it sits on top of them. A site with broken availability, missing schema, and no inbound links will not become citable by adding an llms.txt file. Fix the foundation first, then add llms.txt as a final clarifying signal.
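For concreteness, a minimal llms.txt following the shape of the public llms.txt proposal (an H1 site name, a blockquote summary, then H2 sections of annotated links) could look like the sketch below. The URLs and descriptions are illustrative placeholders, not Promagen's actual file.

```markdown
# Promagen

> Promagen measures AI visibility: whether AI engines can find, read,
> understand, and cite your pages inside generated answers.

## Key pages

- [What is AI visibility?](https://example.com/what-is-ai-visibility): the definition and the five score components
- [Sentinel](https://example.com/sentinel): the weekly AI-visibility audit and composite score
```

The file is served at the site root as plain markdown; it summarises the site for engines, it does not substitute for the crawlability and schema work beneath it.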

Does AI visibility translate to traffic?

It translates differently than SEO. Click-through from an AI citation is lower than from a search result, because the answer is often complete enough that the user does not click. But when they do click, intent is much higher — they have already read a summary that matches their question and want the source. Treat AI citations as a brand and intent surface, not a volume surface. Measure them separately from organic search traffic so trends are not masked by the larger SEO channel.

Get a free Sentinel snapshot →

ChatGPT is a trademark of OpenAI. Claude is a trademark of Anthropic. Perplexity is a trademark of Perplexity AI. Gemini is a trademark of Google LLC. Promagen Ltd is independent of these companies.

Methodology source: Sentinel health-score weights are defined in `frontend/src/types/sentinel.ts` and have not changed since Sentinel went live. See /what-is-ai-visibility/claims.json for the machine-readable claim ledger and provenance hash.

provenance: sha256:0461f1e9f753ea80