Help Center•March 22, 2026
XML Sitemap: Your Site's Roadmap for AI Crawlers
Why a valid XML sitemap helps AI bots discover all your important content.
What it measures
This check verifies whether your site has a valid, accessible XML sitemap — either at the default /sitemap.xml location or declared in your robots.txt file.
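In practice, a checker like this looks in two places: any `Sitemap:` directives in robots.txt, plus the conventional `/sitemap.xml` default. A minimal sketch of that discovery step (the function name and heuristics here are illustrative, not the exact logic this check runs):

```python
from urllib.parse import urljoin

def sitemap_candidates(base_url: str, robots_txt: str) -> list[str]:
    """Collect sitemap URLs to probe: any declared in robots.txt,
    plus the conventional default at /sitemap.xml."""
    candidates = []
    for line in robots_txt.splitlines():
        # robots.txt sitemap directives look like: "Sitemap: <absolute URL>"
        key, _, value = line.partition(":")
        if key.strip().lower() == "sitemap" and value.strip():
            candidates.append(value.strip())
    default = urljoin(base_url, "/sitemap.xml")
    if default not in candidates:
        candidates.append(default)
    return candidates
```

Each candidate URL would then be fetched and parsed to confirm it is valid, well-formed XML.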
Why it matters for AI
An XML sitemap is a roadmap that tells AI crawlers about every important page on your site. Without it, crawlers have to discover pages by following links — which means they may miss pages that aren't well-linked. GPTBot, ClaudeBot, and PerplexityBot all check for sitemaps to ensure comprehensive coverage.
How to improve
- Create an XML sitemap — List all important pages with their last-modified dates
- Submit to Google Search Console — This also helps AI crawlers that piggyback on Google's index
- Keep it updated — Add new pages and remove deleted ones automatically
- Declare it in robots.txt — Add `Sitemap: https://yoursite.com/sitemap.xml` to your robots.txt
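For reference, a minimal sitemap covering two pages looks like this (the URLs and dates are placeholders for your own):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yoursite.com/</loc>
    <lastmod>2026-03-01</lastmod>
  </url>
  <url>
    <loc>https://yoursite.com/pricing</loc>
    <lastmod>2026-02-15</lastmod>
  </url>
</urlset>
```

Each `<url>` entry needs a `<loc>`; `<lastmod>` is optional but helps crawlers prioritize recently changed pages.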
💡 Quick win
Most CMS platforms (WordPress, Ghost, etc.) generate sitemaps automatically — check if yours is already at /sitemap.xml.
