The single file that tells ChatGPT, Claude, Perplexity, and Gemini exactly what your website does, what pages exist, and how to cite you accurately. Without it, AI agents guess — and guess wrong.
Free to generate · Deploys in 60 seconds · Supported by Claude, Perplexity & more
## What is an llms.txt file?

llms.txt is a plain-text file you place at the root of your website — at yoursite.com/llms.txt — that serves as a machine-readable map of your entire site for AI language models.
Think of it as robots.txt for the AI era. Where robots.txt told search engine crawlers what not to index, llms.txt actively tells AI agents what your site is, what your pages contain, and how to accurately represent your brand.
Without it, when a user asks ChatGPT “who are the best AI SEO tools in the UK?” — the AI is guessing at your relevance. With a well-formed llms.txt, you control the narrative.
The llms.txt standard was proposed in 2024 and has already been adopted by Perplexity, Claude, and several major AI agents as a trusted source for site-level context.
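A minimal llms.txt follows a simple Markdown shape: an H1 title, a blockquote summary, then sections of annotated links. The domain and pages below are illustrative, not a real site:

```markdown
# Acme Digital

> Acme Digital is a London-based AI SEO agency offering GEO audits,
> llms.txt generation, and structured-data services.

## Services

- [GEO Audit](https://acme.example/services/geo-audit): Full AI-visibility audit with an actionable score
- [Pricing](https://acme.example/pricing): Current plans and what each tier includes
```

Because the format is plain Markdown, both humans and AI agents can read the same file without any special tooling.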
## Why you need llms.txt

AI language models don't browse the web the way humans do. They need structured, token-efficient signals to understand who you are.
AI agents process thousands of tokens per second. Your llms.txt gives them your entire site's purpose, structure, and key pages in under 200 tokens — the most efficient signal you can send.
Without llms.txt, AI agents hallucinate details about your business — wrong prices, outdated services, incorrect locations. With it, they read your authoritative version of the truth.
You decide which pages get cited. llms.txt lets you surface your highest-value content — case studies, pricing, key services — ahead of everything else.
Once deployed, AI agents begin reading your llms.txt within hours. No waiting for crawl schedules, no hoping the algorithm finds you — it's a direct channel to AI reasoning.
The spec's optional section lets you mark lower-priority URLs that AI agents can skip — login pages, internal tools, draft content. You decide what gets surfaced and what gets left out. Full control, both ways.
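In the llms.txt format, this is expressed as an `## Optional` section at the end of the file; agents treat anything listed there as skippable when context is limited. A sketch, with illustrative URLs:

```markdown
## Optional

- [Login](https://acme.example/login): Internal account access, not useful for citation
- [Blog archive](https://acme.example/blog/archive): Older posts, skip if context is tight
```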
Fewer than 5% of websites have an llms.txt today. Deploying one now puts you in an elite tier of AI-readable businesses while your competitors remain invisible to AI agents.
## Generate your llms.txt in 60 seconds

Innotek automates every step. No manual writing, no guessing at format — just a perfect, deployable file.
1. Paste your domain. Innotek begins crawling immediately — no account required for the first scan.
2. Our crawler identifies every indexable page, extracts titles and descriptions, and scores them by AI relevance.
3. Each page gets a token-optimised description — concise, factual, and structured in the exact format AI agents expect.
4. Download your llms.txt file. Upload it to your server root. Done. AI agents start reading it within hours.
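Before uploading, you can sanity-check that the file has the expected shape. A minimal sketch in Python (not an official validator; the checks and the sample content are illustrative):

```python
import re

def check_llms_txt(text: str) -> list[str]:
    """Return a list of problems found in an llms.txt string (empty = looks OK)."""
    problems = []
    lines = text.strip().splitlines()
    # The file should open with a single H1 title line.
    if not lines or not lines[0].startswith("# "):
        problems.append("missing H1 title on the first line")
    # A '> ' blockquote gives agents a one-line summary of the site.
    if not any(line.startswith("> ") for line in lines):
        problems.append("missing '> ' blockquote summary")
    # Page entries are Markdown links: - [Title](https://...)
    link = re.compile(r"^- \[[^\]]+\]\(https?://[^)]+\)")
    if not any(link.match(line) for line in lines):
        problems.append("no Markdown link entries found")
    return problems

sample = """# Acme Digital

> AI SEO agency in London.

## Services

- [Pricing](https://acme.example/pricing): Plans and pricing
"""
print(check_llms_txt(sample))  # → []
```

An empty list means the basics are in place; anything else tells you what to fix before it goes live at your server root.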
| Feature | Writing manually | Other tools | Innotek |
|---|---|---|---|
| Auto-crawls all pages | ✗ | – | ✓ |
| Token-optimised descriptions | ✗ | ✗ | ✓ |
| Paired with JSON-LD schema | ✗ | ✗ | ✓ |
| GEO audit score included | ✗ | ✗ | ✓ |
| Updates automatically on re-audit | ✗ | ✗ | ✓ |
| MCP-compatible output | ✗ | ✗ | ✓ |
| Time to generate | 2–4 hours | 30 mins | 60 seconds |
| Cost | Your time | Varies | Free tier available |
Every file follows the official llms.txt spec — clean Markdown, token-efficient descriptions, and an optional section for pages you want to exclude.