
llms.txt and MCP

Comparing llms.txt and Model Context Protocol (MCP) — static discovery vs dynamic protocol, mcpdoc bridge, evolution scenarios

Model Context Protocol (MCP) is an open protocol for connecting LLMs to external data and tools. It was created by Anthropic (November 2024) and transferred to the Linux Foundation (December 2025).

Numbers as of early 2026 (source: modelcontextprotocol.io): 8M+ SDK downloads per week, 4,000+ MCP servers. Adopted by OpenAI (March 2025), Google, and Microsoft.

MCP enables AI models to:

  • Read files and databases
  • Make API calls
  • Execute tools
  • Retrieve context from external sources
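On the wire, these interactions travel as JSON-RPC 2.0 messages. A minimal Python sketch of the kind of request an MCP client sends when calling a tool (the `tools/call` method name comes from the MCP specification; the tool name and arguments below are hypothetical):

```python
import json

def make_tool_call(req_id, tool_name, arguments):
    """Build a JSON-RPC 2.0 request of the kind MCP uses for tool calls.

    The envelope (jsonrpc / id / method / params) follows JSON-RPC 2.0;
    "tools/call" is the MCP method for invoking a tool. The tool name
    and arguments here are illustrative, not from a real server.
    """
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

msg = make_tool_call(1, "read_file", {"path": "README.md"})
```

Contrast this with llms.txt, where "discovery" is nothing more than an HTTP GET of a static file.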
| | llms.txt | MCP |
|---|---|---|
| Type | Static file | Dynamic protocol |
| Format | Markdown | JSON-RPC 2.0 |
| Discovery | Passive (HTTP GET) | Active (client-server) |
| Requirements | Any HTTP client | MCP SDK (client + server) |
| Content | Read-only | Read + write + tools |
| Complexity | Create in 15 minutes | Server development required |
| Security | Read-only, no risks | Tool poisoning, prompt injection |

llms.txt and MCP solve different problems at different levels:

llms.txt = Discovery Layer → "What's on this site?"
MCP = Interaction Layer → "Give me specific data"

llms.txt is the entry point: an AI system learns what documentation is available, reads the descriptions, and selects the pages it needs.

MCP is the deep-interaction layer: an AI system calls tools, requests specific data, and executes actions.
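The discovery layer can be sketched in a few lines: fetch llms.txt over plain HTTP and extract its link entries. The regex below targets the `- [Title](url): description` line format from the llms.txt proposal; the sample text is taken from the Astro example later in this article.

```python
import re

# "Discovery" step sketch: given the text of an llms.txt file, extract
# (title, url, description) tuples from its link lines. The line format
# "- [Title](url): description" follows the llms.txt proposal.
LINK_RE = re.compile(
    r"^- \[(?P<title>[^\]]+)\]\((?P<url>[^)]+)\)(?::\s*(?P<desc>.*))?$"
)

def parse_llms_txt(text):
    entries = []
    for line in text.splitlines():
        m = LINK_RE.match(line.strip())
        if m:
            entries.append((m["title"], m["url"], m["desc"] or ""))
    return entries

sample = """# Astro Documentation
> Astro is a web framework for building content-driven websites.
## Docs
- [Getting Started](https://docs.astro.build/en/getting-started/): Quick start guide
"""
entries = parse_llms_txt(sample)
```

No SDK, no handshake: any HTTP client plus a dozen lines of parsing is enough, which is exactly the asymmetry the comparison table describes.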

mcpdoc (langchain-ai, 79+ stars) is an MCP server that takes llms.txt as its data source and serves the documentation over the MCP protocol.

```json
{
  "mcpServers": {
    "docs": {
      "command": "uvx",
      "args": [
        "--from", "mcpdoc",
        "mcpdoc",
        "--urls",
        "Astro:https://docs.astro.build/llms.txt",
        "Stripe:https://docs.stripe.com/llms.txt",
        "--transport", "stdio"
      ]
    }
  }
}
```

Another tool: mcp-llms-txt-explorer — an MCP server for navigating any site’s llms.txt.

Scenario: an AI assistant in an IDE (Claude Code, Cursor, Windsurf) connects documentation via MCP.

Step 1: The site publishes llms.txt

```markdown
# Astro Documentation
> Astro is a web framework for building content-driven websites.
## Docs
- [Getting Started](https://docs.astro.build/en/getting-started/): Quick start guide
- [Configuration](https://docs.astro.build/en/reference/configuration/): astro.config.mjs
```

Step 2: mcpdoc turns llms.txt into an MCP server

```shell
uvx --from mcpdoc mcpdoc \
  --urls "Astro:https://docs.astro.build/llms.txt" \
  --transport stdio
```

Step 3: AI assistant requests documentation


The AI receives structured context from llms.txt → loads needed pages → responds with up-to-date documentation.
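The selection step in that flow can be sketched as well. Assuming an index of (title, url, description) tuples already parsed from llms.txt, a naive keyword-overlap heuristic picks candidate pages; real assistants let the model itself choose, and the actual page fetch is elided here.

```python
# Illustrative sketch of "selects needed pages": rank llms.txt index
# entries against a user query by counting keyword hits in the title
# and description. Purely a stand-in for the model's own selection.
def select_pages(index, query, limit=3):
    words = {w for w in query.lower().split() if len(w) > 2}
    def score(entry):
        title, url, desc = entry
        text = f"{title} {desc}".lower()
        return sum(1 for w in words if w in text)
    ranked = sorted(index, key=score, reverse=True)
    return [e for e in ranked[:limit] if score(e) > 0]

index = [
    ("Getting Started", "https://docs.astro.build/en/getting-started/",
     "Quick start guide"),
    ("Configuration", "https://docs.astro.build/en/reference/configuration/",
     "astro.config.mjs"),
]
hits = select_pages(index, "how do I configure astro.config.mjs")
```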

Some sites implement both approaches.

MCP introduces new attack vectors that don’t apply to llms.txt:

| Threat | MCP | llms.txt |
|---|---|---|
| Tool poisoning | Malicious server injects harmful instructions | Not applicable (no tools) |
| Prompt injection | Server data modifies LLM behavior | Minimal risk (read-only) |
| Rug-pull | Server changes behavior after gaining trust | Not applicable (static file) |
| Data exfiltration | Via tool calls | Not applicable |

llms.txt is a static, read-only file. Its security is equivalent to any public HTML page.

One evolution scenario: MCP gains a standard discovery mechanism, and sites register MCP servers instead of publishing llms.txt. Possible, but it demands significantly more infrastructure from every site.

llms.txt remains a lightweight discovery layer — a “business card” for AI. MCP serves those who need deep interaction. Most likely scenario: both formats complement each other.

llms.txt includes a link to an MCP server:

```markdown
# My Project
> Project description
## MCP
- [MCP Server](mcp://myproject.dev/mcp): Dynamic API access via MCP
```
This model already works through mcpdoc: llms.txt → MCP server → AI assistant.
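A client could detect such an advertised server with a one-line scan. Note the `mcp://` scheme and the `## MCP` section come from the illustration above, not from a finalized standard.

```python
import re

# Sketch: find MCP server links advertised inside an llms.txt file,
# matching markdown links whose URL uses the (illustrative) mcp:// scheme.
def find_mcp_links(text):
    return re.findall(r"\[([^\]]+)\]\((mcp://[^)]+)\)", text)

llms_txt = """# My Project
> Project description
## MCP
- [MCP Server](mcp://myproject.dev/mcp): Dynamic API access via MCP
"""
links = find_mcp_links(llms_txt)
```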