Help Center • March 22, 2026
HTML Payload Size: Keeping Pages Light for AI Crawlers
How large HTML payloads can cause AI crawlers to truncate or skip your content.
What it measures
This check measures the total size of your page's HTML response (in KB) and DOM element count. Large pages with bloated HTML take longer to download and are more likely to be truncated by AI crawlers.
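Both numbers are easy to compute yourself. The sketch below (names like `DOMCounter` and `measure` are illustrative, not part of any tool) measures the uncompressed HTML byte size in KB and counts elements with Python's standard-library parser:

```python
from html.parser import HTMLParser

class DOMCounter(HTMLParser):
    """Counts opening tags as a rough proxy for DOM element count."""
    def __init__(self):
        super().__init__()
        self.elements = 0

    def handle_starttag(self, tag, attrs):
        self.elements += 1

def measure(html: str) -> tuple[int, int]:
    """Return (payload size in KB, element count) for an HTML string."""
    size_kb = len(html.encode("utf-8")) // 1024
    counter = DOMCounter()
    counter.feed(html)
    return size_kb, counter.elements
```

Note this measures the raw HTML string; a crawler's limit may apply before or after decompression, so treat the KB figure as an approximation.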
Why it matters for AI
AI crawlers have limits on how much HTML they'll process per page. Very large payloads (>200 KB) may be truncated, meaning content at the bottom of the page never gets indexed. DOM bloat from deeply nested layouts also slows parsing.
| HTML Size | Rating | Impact |
|---|---|---|
| < 100 KB | Good | Lightweight — fully processed by all crawlers |
| 100–200 KB | Moderate | Some crawlers may truncate |
| > 200 KB | Large | High risk of truncation — important content at the end may be lost |
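The bands in the table map directly to a simple classifier; this sketch (the function name is illustrative) assumes the boundaries are inclusive on the "Moderate" side:

```python
def rate_html_size(size_kb: float) -> str:
    """Map an HTML payload size in KB to the rating bands above."""
    if size_kb < 100:
        return "Good"       # lightweight, fully processed
    if size_kb <= 200:
        return "Moderate"   # some crawlers may truncate
    return "Large"          # high risk of truncation
```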
How to improve
- Remove inline CSS and JS — Move styles and scripts to external files
- Reduce DOM complexity — Simplify nested div structures
- Remove HTML comments and whitespace — Enable HTML minification
- Lazy-load non-critical content — Below-the-fold content can be deferred
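As a rough illustration of the minification step, the naive sketch below strips HTML comments and collapses whitespace between tags. It is deliberately simplistic: a production minifier (e.g. your build tool's HTML plugin) also handles `<pre>`, `<textarea>`, and inline scripts correctly, which this regex approach does not.

```python
import re

def minify_html(html: str) -> str:
    """Naively strip HTML comments and collapse inter-tag whitespace."""
    html = re.sub(r"<!--.*?-->", "", html, flags=re.DOTALL)  # drop comments
    html = re.sub(r">\s+<", "><", html)                      # drop whitespace between tags
    return html.strip()
```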
💡 Quick win
Check if your page has inline `<style>` or `<script>` blocks that could be moved to external files.
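That check can be automated with the standard-library parser. The sketch below (the `InlineBlockFinder` class is a hypothetical name) flags `<style>` and `<script>` elements that carry an inline body, while skipping external scripts loaded via `src`:

```python
from html.parser import HTMLParser

class InlineBlockFinder(HTMLParser):
    """Collects <style>/<script> tags whose bodies are inlined in the page."""
    def __init__(self):
        super().__init__()
        self._open = None
        self.inline_blocks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("style", "script"):
            # <script src="..."> is external; only inline bodies count.
            if tag == "script" and dict(attrs).get("src"):
                return
            self._open = tag

    def handle_data(self, data):
        if self._open and data.strip():
            self.inline_blocks.append(self._open)

    def handle_endtag(self, tag):
        if tag == self._open:
            self._open = None
```

Feed it your page's HTML and inspect `inline_blocks`; each entry is a candidate for moving into an external file.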
