llms.txt and MCP
Comparing llms.txt and Model Context Protocol (MCP) — static discovery vs dynamic protocol, mcpdoc bridge, evolution scenarios
What is MCP
Model Context Protocol (MCP) is an open protocol for LLM interaction with external data and tools. It was created by Anthropic (November 2024) and transferred to the Linux Foundation (December 2025).
Numbers as of early 2026 (source: modelcontextprotocol.io): 8M+ SDK downloads per week and 4,000+ MCP servers. Adopted by OpenAI (March 2025), Google, and Microsoft.
MCP enables AI models to:
- Read files and databases
- Make API calls
- Execute tools
- Retrieve context from external sources
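Under the hood, these capabilities are exchanged as JSON-RPC 2.0 messages. A minimal sketch of what a tool-execution request looks like, built as a Python dict; the tool name `query_database` and its arguments are hypothetical, and real tools are advertised by the server at runtime:

```python
import json

# Sketch of the JSON-RPC 2.0 shape MCP uses for tool execution.
# "query_database" and its arguments are made up for illustration.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "query_database",
        "arguments": {"query": "SELECT name FROM users LIMIT 5"},
    },
}

# The client serializes this and sends it over the chosen transport
# (stdio, HTTP, ...); the server replies with a result or an error.
print(json.dumps(request, indent=2))
```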
Comparison
| | llms.txt | MCP |
|---|---|---|
| Type | Static file | Dynamic protocol |
| Format | Markdown | JSON-RPC 2.0 |
| Discovery | Passive (HTTP GET) | Active (client-server) |
| Requirements | Any HTTP client | MCP SDK (client + server) |
| Content | Read-only | Read + write + tools |
| Complexity | Create in 15 minutes | Server development required |
| Security | Read-only; minimal attack surface | Tool poisoning, prompt injection |
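The "passive discovery" side of this comparison is easy to demonstrate: consuming llms.txt needs nothing beyond an HTTP GET and a small parser. A sketch, assuming the common `- [Title](URL): description` line form from the llms.txt proposal; real files may deviate from it:

```python
import re

# Matches "- [Title](URL): description" lines; the description is optional.
LINK_LINE = re.compile(
    r"^-\s*\[(?P<title>[^\]]+)\]\((?P<url>[^)]+)\)(?::\s*(?P<desc>.*))?$"
)

def parse_llms_txt(text: str) -> list[dict]:
    """Extract (title, url, description) entries from llms.txt content."""
    entries = []
    for line in text.splitlines():
        m = LINK_LINE.match(line.strip())
        if m:
            entries.append({
                "title": m.group("title"),
                "url": m.group("url"),
                "description": m.group("desc") or "",
            })
    return entries

sample = """# Astro Documentation
## Docs
- [Getting Started](https://docs.astro.build/en/getting-started/): Quick start guide
"""
print(parse_llms_txt(sample))
```

In practice the `text` argument would come from a plain `GET https://example.com/llms.txt`; no SDK, session, or handshake is involved.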
Complementary Roles
llms.txt and MCP solve different problems at different levels:
- llms.txt = Discovery Layer → "What's on this site?"
- MCP = Interaction Layer → "Give me specific data"

llms.txt is the entry point: an AI system learns what documentation is available, reads the descriptions, and selects the pages it needs.

MCP is the deep-interaction layer: an AI system calls tools, requests specific data, and executes actions.
Bridge: mcpdoc
mcpdoc (langchain-ai, 79+ stars) is an MCP server that takes llms.txt as a data source and serves the documentation via the MCP protocol.
```json
{
  "mcpServers": {
    "docs": {
      "command": "uvx",
      "args": [
        "--from", "mcpdoc", "mcpdoc",
        "--urls",
        "Astro:https://docs.astro.build/llms.txt",
        "Stripe:https://docs.stripe.com/llms.txt",
        "--transport", "stdio"
      ]
    }
  }
}
```

Another tool, mcp-llms-txt-explorer, is an MCP server for navigating any site's llms.txt.
llms.txt as MCP Data Source
Scenario: an AI assistant in an IDE (Claude Code, Cursor, Windsurf) connects documentation via MCP.
Step 1: Site publishes llms.txt
```markdown
# Astro Documentation

> Astro is a web framework for building content-driven websites.

## Docs

- [Getting Started](https://docs.astro.build/en/getting-started/): Quick start guide
- [Configuration](https://docs.astro.build/en/reference/configuration/): astro.config.mjs
```

Step 2: mcpdoc turns llms.txt into an MCP server

```sh
uvx --from mcpdoc mcpdoc \
  --urls "Astro:https://docs.astro.build/llms.txt" \
  --transport stdio
```

Step 3: AI assistant requests documentation

The AI receives structured context from llms.txt, loads the pages it needs, and responds with up-to-date documentation.
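The "selects needed pages" step in this flow can be sketched with a naive keyword score over the llms.txt entries; real assistants let the model itself choose, so this is illustrative only, and the sample entries below are hypothetical:

```python
# Naive sketch of page selection: score each llms.txt entry by how many
# words of the question appear in its title or description.
entries = [
    {"title": "Getting Started",
     "url": "https://docs.astro.build/en/getting-started/",
     "description": "Quick start guide"},
    {"title": "Configuration",
     "url": "https://docs.astro.build/en/reference/configuration/",
     "description": "astro.config.mjs"},
]

def pick_page(question: str, entries: list[dict]) -> dict:
    """Return the entry with the highest keyword overlap with the question."""
    words = set(question.lower().split())
    def score(entry: dict) -> int:
        text = f"{entry['title']} {entry['description']}".lower()
        return sum(w in text for w in words)
    return max(entries, key=score)

best = pick_page("astro configuration reference", entries)
print(best["url"])  # the Configuration page wins on keyword overlap
```

An AI assistant would then fetch `best["url"]` (directly, or through an MCP tool exposed by mcpdoc) and answer from that page's content.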
Live Examples
Sites implementing both approaches:
- mcpdoc.ru: MCP server documentation, with a published llms.txt
- n8n-mcp.ru: N8N + MCP integration, also with a published llms.txt
Security
MCP introduces new attack vectors that don't apply to llms.txt:
| Threat | MCP | llms.txt |
|---|---|---|
| Tool poisoning | Malicious server injects harmful instructions | Not applicable (no tools) |
| Prompt injection | Server data modifies LLM behavior | Minimal risk (read-only) |
| Rug-pull | Server changes behavior after gaining trust | Not applicable (static file) |
| Data exfiltration | Via tool calls | Not applicable |
llms.txt is a static, read-only file; its security profile is equivalent to that of any public HTML page.
Evolution Scenarios
Section titled “Evolution Scenarios”MCP Absorbs llms.txt Use Cases
In this scenario, MCP gains a standard discovery mechanism and sites register MCP servers instead of publishing llms.txt. Possible, but it requires significantly more infrastructure from each site.
Coexistence
llms.txt remains a lightweight discovery layer, a "business card" for AI, while MCP serves those who need deep interaction. The most likely scenario is that the two formats complement each other.
llms.txt as Entry Point for MCP
llms.txt includes a link to an MCP server:
```markdown
# My Project

> Project description

## MCP

- [MCP Server](mcp://myproject.dev/mcp): Dynamic API access via MCP
```

This model already works through mcpdoc: llms.txt → MCP server → AI assistant.