    Help Center · March 22, 2026

    Bot Access Rules: robots.txt & llms.txt for AI Crawlers

    How robots.txt and llms.txt control which AI bots can access your content.

    What it measures

    This check analyses your robots.txt and llms.txt files to determine whether AI crawlers are allowed to access your content. It checks for blocks against specific AI user agents like GPTBot, ClaudeBot, and PerplexityBot.

    Why it matters for AI

    robots.txt is the primary way websites tell crawlers what they can and can't access. If you've blocked AI crawlers (either by name or with a blanket "Disallow: /"), compliant bots such as GPTBot, ClaudeBot, and PerplexityBot will respect that and skip your content entirely. The newer llms.txt file can provide additional guidance specifically for AI agents.
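For example, a robots.txt that blocks one AI crawler by name while leaving the rest of the site open might look like this (a sketch, not a recommendation for any particular bot):

```text
# Block OpenAI's crawler from the whole site
User-agent: GPTBot
Disallow: /

# All other crawlers may access everything
User-agent: *
Allow: /
```

Each "User-agent" group applies only to bots whose name matches it; bots without a matching group fall back to the "*" rules.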

    | Status | Meaning | Impact |
    | --- | --- | --- |
    | All agents allowed | No blocks in robots.txt or llms.txt | AI crawlers can access all content |
    | Some agents blocked | Specific AI bots are disallowed | Those bots won't index your content |
    | All blocked | Blanket "Disallow: /" for all agents | No AI crawler can access your site |

    How to improve

    1. Review your robots.txt — Ensure it doesn't block GPTBot, ClaudeBot, or PerplexityBot
    2. Remove blanket Disallow: / — If you have this under User-agent: *, all crawlers are blocked
    3. Consider adding llms.txt — This newer standard lets you provide guidance specifically for AI agents
    4. Be selective — Block specific paths you don't want indexed, not entire domains
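Steps 1, 2, and 4 together might produce a robots.txt like the following (the paths shown are placeholders; substitute whatever you actually want kept out of AI indexes):

```text
# Keep AI crawlers out of specific areas, not the whole site
User-agent: GPTBot
User-agent: ClaudeBot
User-agent: PerplexityBot
Disallow: /admin/
Disallow: /internal/

# Everything else remains open to all crawlers
User-agent: *
Allow: /
```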

    💡 Quick win

    Check your robots.txt right now — visit yoursite.com/robots.txt and look for "Disallow" rules that might be blocking AI bots.
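If you'd rather check programmatically, Python's standard-library robots.txt parser can tell you whether a given bot is allowed to fetch a path. This sketch parses an inline example; in practice you would load the contents of yoursite.com/robots.txt instead:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; replace with your site's actual file
robots_txt = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

AI_BOTS = ["GPTBot", "ClaudeBot", "PerplexityBot"]

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Report whether each AI crawler may fetch a sample page
for bot in AI_BOTS:
    allowed = parser.can_fetch(bot, "/some-page")
    print(f"{bot}: {'allowed' if allowed else 'blocked'}")
```

With the rules above, GPTBot reports as blocked while ClaudeBot and PerplexityBot fall through to the "*" group and report as allowed.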
