Help Center • March 22, 2026
Bot Access Rules: robots.txt & llms.txt for AI Crawlers
How robots.txt and llms.txt control which AI bots can access your content.
What it measures
This check analyses your robots.txt and llms.txt files to determine whether AI crawlers are allowed to access your content. It checks for blocks against specific AI user agents like GPTBot, ClaudeBot, and PerplexityBot.
Why it matters for AI
robots.txt is the primary way websites tell crawlers what they can and can't access. If you've blocked AI crawlers (either by name or with a blanket "Disallow: /"), compliant bots will respect that and skip your content entirely. The newer llms.txt file is a proposed standard that can provide additional guidance specifically for AI agents.
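For example, a robots.txt like the following (a hypothetical sketch) blocks two AI crawlers by name while leaving every other crawler unrestricted:

```
# Block specific AI crawlers entirely
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

# All other crawlers may access everything
User-agent: *
Disallow:
```

Note that an empty `Disallow:` line means "nothing is disallowed" — it is the opposite of `Disallow: /`.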
| Status | Meaning | Impact |
|---|---|---|
| All agents allowed | No blocks in robots.txt or llms.txt | AI crawlers can access all content |
| Some agents blocked | Specific AI bots are disallowed | Those bots won't index your content |
| All blocked | Blanket Disallow: / for all agents | No AI crawler can access your site |
How to improve
- Review your robots.txt — Ensure it doesn't block GPTBot, ClaudeBot, or PerplexityBot
- Remove blanket Disallow: / — If you have this under User-agent: *, all crawlers are blocked
- Consider adding llms.txt — This proposed standard lets you provide guidance specifically for AI agents
- Be selective — Block specific paths you don't want indexed, not entire domains
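Putting the last two points together, a selective robots.txt allows all crawlers site-wide and fences off only the paths you don't want indexed (the `/private/` path here is a hypothetical example):

```
# Allow everything except one private section
User-agent: *
Disallow: /private/
```

For llms.txt, the emerging proposal is a plain markdown file at your site root: an H1 with the site name, a blockquote summary, then sections of links to your most useful content. A minimal sketch (names and URLs are placeholders):

```
# Example Site
> One-line summary of what this site offers and who it is for.

## Docs
- [Getting started](https://yoursite.com/docs/start): setup guide
- [API reference](https://yoursite.com/docs/api): endpoint details
```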
💡 Quick win
Check your robots.txt right now — visit yoursite.com/robots.txt and look for "Disallow" rules that might be blocking AI bots.
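You can also check programmatically with Python's standard-library robot-file parser. The rules string below is a hypothetical example; in practice you would feed in the lines of your live robots.txt file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content -- replace with the real
# contents of https://yoursite.com/robots.txt
rules = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# GPTBot is blocked everywhere; other agents only under /private/
print(parser.can_fetch("GPTBot", "/article"))         # False
print(parser.can_fetch("PerplexityBot", "/article"))  # True
print(parser.can_fetch("PerplexityBot", "/private/x"))  # False
```

`can_fetch(user_agent, url)` applies the same matching rules a compliant crawler uses, so it's a quick way to confirm which AI bots your current file actually blocks.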
