SISTRIX measured 82,619 ChatGPT prompts over 17 weeks and found that 74% of cited domains rotate out within seven days. A single probe tells you what ChatGPT cited on that day — not what it will cite tomorrow.
- 74% of cited domains rotate out within 7 days (SISTRIX, 82,619 prompts)
- Single-run citation checks have 40%+ week-over-week variance
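If single probes are mostly noise, the practical fix is repeated sampling: run the same prompt weekly and keep only domains that recur. A minimal sketch, with a hypothetical `stable_domains` helper and made-up probe data:

```python
from collections import Counter

def stable_domains(weekly_citations, min_weeks=3):
    """Count how many weekly probe runs cited each domain and keep only
    domains cited in at least `min_weeks` runs. With ~74% of domains
    rotating out within 7 days, one probe is churn; repeated sampling
    separates durable citations from one-off appearances."""
    counts = Counter(d for week in weekly_citations for d in set(week))
    return {d for d, n in counts.items() if n >= min_weeks}

# Hypothetical results from four weekly runs of the same prompt.
runs = [
    {"a.com", "b.com", "c.com"},
    {"a.com", "d.com"},
    {"a.com", "b.com", "e.com"},
    {"a.com", "b.com"},
]
print(stable_domains(runs, min_weeks=3))  # only a.com and b.com recur enough
```

The `min_weeks` threshold is a tuning knob, not a value from the study; the point is that a citation claim should rest on a frequency across runs, not a single snapshot.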
When ChatGPT and Perplexity are asked the same query, only 11% of cited domains overlap. Any "AI visibility score" that averages across engines is therefore statistically meaningless: with 89% divergence, the two engines are drawing from effectively different distributions, and a composite number describes neither.
- 11% domain overlap between ChatGPT and Perplexity for identical queries (ziptie.dev, 2026)
- Cross-engine composite scores have no statistical validity
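The overlap claim is easy to operationalize. A sketch, assuming overlap is measured as Jaccard similarity over cited-domain sets (the source does not specify its metric), with invented example data:

```python
def domain_overlap(cited_a, cited_b):
    """Jaccard overlap between two engines' cited-domain sets:
    |A ∩ B| / |A ∪ B|. Low values mean the engines cite largely
    disjoint sources, so report visibility per engine instead of
    averaging into one composite score."""
    a, b = set(cited_a), set(cited_b)
    union = a | b
    return len(a & b) / len(union) if union else 0.0

# Hypothetical citation sets for one query.
chatgpt = {"x.com", "y.com", "z.com"}
perplexity = {"x.com", "p.com", "q.com", "r.com"}
print(round(domain_overlap(chatgpt, perplexity), 2))  # 0.17
```

At ~11% overlap, a blended score mixes two mostly unrelated samples, which is why per-engine reporting is the only defensible presentation.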
Perplexity's citation probability drops sharply with content age: pages updated in the last 30 days hold a 45 percentage-point citation advantage over older pages, and content that has gone more than 18 months without an update drops out of Perplexity's citation set entirely.
- 45pp citation advantage for content under 30 days old (ziptie.dev)
- Content older than 18 months is effectively invisible to Perplexity
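A content audit can apply these age thresholds directly. A minimal sketch, using the 30-day and 18-month cutoffs from the text; the function name and bucket labels are illustrative, not from the source:

```python
from datetime import date

def citation_risk(last_updated, today=None):
    """Bucket a page by content age using the recency thresholds above:
    under 30 days carries the large citation advantage, while pages not
    updated in over ~18 months (approximated here as 540 days) are
    effectively invisible to Perplexity."""
    today = today or date.today()
    age_days = (today - last_updated).days
    if age_days <= 30:
        return "fresh"       # inside the 45pp-advantage window
    if age_days <= 540:      # ~18 months
        return "aging"
    return "invisible"       # past the 18-month cliff

print(citation_risk(date(2026, 1, 10), today=date(2026, 2, 1)))  # fresh
```

In practice this suggests a simple operational rule: schedule refreshes so no page you care about crosses the 18-month boundary.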