MCP LLMS-TXT Documentation Server
Overview
`llms.txt` is a website index for LLMs that provides background information, guidance, and links to detailed markdown files. IDEs (e.g., Cursor, Windsurf) and apps (e.g., Claude Code/Desktop) can use `llms.txt` to retrieve context for tasks. However, these apps use different built-in tools to read and process files like `llms.txt`. The retrieval process can be opaque, and there is not always a way to audit the tool calls or the context returned.

This project provides an MCP server that exposes (1) a user-defined list of `llms.txt` files and (2) a simple `fetch_docs` tool that reads URLs within any of the provided `llms.txt` files. This allows the user to audit each tool call as well as the context returned.
Quickstart
Install uv
- Please see the official uv docs for other ways to install `uv`.
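For example, uv's standalone installer can be used (this is the command from the uv installation docs; verify it against the current docs before running):

```bash
# Install uv via the official standalone installer script
curl -LsSf https://astral.sh/uv/install.sh | sh
```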
Choose an `llms.txt` file to use.
- For example, here's the LangGraph `llms.txt` file.
[!Note] Security and Domain Access Control: for security reasons, mcpdoc implements strict domain access controls. Documentation can only be retrieved from domains explicitly approved by the user (for example, the domains of the `llms.txt` files you provide), which prevents the `fetch_docs` tool from reading content from untrusted sources.
(Optional) Test the MCP server locally with your `llms.txt` file(s) of choice:
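A launch command along these lines should work (the LangGraph `llms.txt` URL and the `--transport`/`--host`/`--port` options are shown as a sketch to adapt to your own sources; port 8082 matches the address below):

```bash
# Serve the listed llms.txt file(s) over SSE on localhost:8082 (sketch; adjust to your sources)
uvx --from mcpdoc mcpdoc \
  --urls "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt" \
  --transport sse \
  --host localhost \
  --port 8082
```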
- This should run at: http://localhost:8082
- Run the MCP inspector and connect to the running server (an example command is shown after this list):
- Here, you can test the tool calls.
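One way to launch the MCP inspector (assuming Node.js is installed; this is the standard inspector package), after which you can point it at the locally running server from the previous step:

```bash
# Launch the MCP inspector UI, then connect it to the local mcpdoc server
npx @modelcontextprotocol/inspector
```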
Connect to Cursor
- Open Cursor Settings and the MCP tab.
- This will open the `~/.cursor/mcp.json` file.
- Paste the server configuration into the file (we use the `langgraph-docs-mcp` name and link to the LangGraph `llms.txt`); an example entry is shown after this list.
- Confirm that the server is running in your Cursor Settings/MCP tab.
- Best practice is to then update Cursor Global (User) rules.
- Open Cursor Settings/Rules and update User Rules with a prompt like the example shown after this list (or similar).
- `CMD+L` (on Mac) to open chat.
- Ensure `agent` is selected.
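Here is a sketch of the server entry for `~/.cursor/mcp.json` (the `langgraph-docs-mcp` name is from this guide; the LangGraph `llms.txt` URL and the exact `uvx` arguments are assumptions to adapt to your own sources):

```json
{
  "mcpServers": {
    "langgraph-docs-mcp": {
      "command": "uvx",
      "args": [
        "--from", "mcpdoc", "mcpdoc",
        "--urls", "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt",
        "--transport", "stdio"
      ]
    }
  }
}
```

And an example rules prompt along these lines (the wording is illustrative; `fetch_docs` is the tool described above, while `list_doc_sources` is assumed to be the companion tool that lists the configured `llms.txt` files):

```
for ANY question about LangGraph, use the langgraph-docs-mcp server to help answer --
+ call the list_doc_sources tool to get the available llms.txt file
+ call the fetch_docs tool to read it
+ reflect on the urls in llms.txt
+ reflect on the input question
+ call fetch_docs on any urls relevant to the question
+ use this to answer the question
```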
Connect to Windsurf
- Open Cascade with `CMD+L` (on Mac).
- Click Configure MCP to open the config file, `~/.codeium/windsurf/mcp_config.json`.
- Update with `langgraph-docs-mcp` as noted above.
- Update Windsurf Rules/Global rules with a prompt like the Cursor rules example above (or similar).
- It will perform your tool calls.
Connect to Claude Desktop
- Open Settings/Developer to update `~/Library/Application\ Support/Claude/claude_desktop_config.json`.
- Update with `langgraph-docs-mcp` as noted above.
- Restart Claude Desktop app.
[!Note] If you run into issues with Python version incompatibility when trying to add MCPDoc tools to Claude Desktop, you can explicitly specify the filepath to the `python` executable in the `uvx` command.
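For example, a sketch of a config entry that pins the interpreter (the `--python` flag selects the interpreter for `uvx`; the path shown is a placeholder to replace with a compatible Python):

```json
{
  "mcpServers": {
    "langgraph-docs-mcp": {
      "command": "uvx",
      "args": [
        "--python", "/path/to/python",
        "--from", "mcpdoc", "mcpdoc",
        "--urls", "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt",
        "--transport", "stdio"
      ]
    }
  }
}
```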
[!Note] Currently (3/21/25) it appears that Claude Desktop does not support `rules` for global rules, so append a prompt like the Cursor rules example above to your request.
- You will see your tools visible in the bottom right of your chat input.
- It will ask to approve tool calls as it processes your request.
Connect to Claude Code
- In a terminal, after installing Claude Code, run this command to add the MCP server to your project (an example is shown at the end of this section):
- You will see `~/.claude.json` updated.
- Test by launching Claude Code and viewing your tools (see the note at the end of this section).
[!Note] Currently (3/21/25) it appears that Claude Code does not support `rules` for global rules, so append a prompt like the Cursor rules example above to your request.
- It will ask to approve tool calls.
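A sketch of the add command (the `claude mcp add-json` subcommand, the `-s local` scope flag, and the JSON payload are assumptions about Claude Code's MCP CLI; adjust the name and URL to your setup):

```bash
# Register the mcpdoc server with Claude Code in the local (project) scope
claude mcp add-json langgraph-docs-mcp '{
  "type": "stdio",
  "command": "uvx",
  "args": [
    "--from", "mcpdoc", "mcpdoc",
    "--urls", "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt",
    "--transport", "stdio"
  ]
}' -s local
```

Inside a Claude Code session, the `/mcp` command should then list the configured servers and their tools.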
Command-line Interface
The `mcpdoc` command provides a simple CLI for launching the documentation server. Example invocations for each mode are sketched after this list.
- Using a YAML config file: this will load the LangGraph Python documentation from the `sample_config.yaml` file in this repo.
- Using a JSON config file: this will load the LangGraph Python documentation from the `sample_config.json` file in this repo.
- Directly specifying `llms.txt` URLs with optional names: URLs can be specified either as plain URLs or with optional names using the format `name:url`. You can specify multiple URLs by using the `--urls` parameter multiple times. This is how we loaded `llms.txt` for the MCP server above.
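The commands below are sketches of each mode (the `--yaml` and `--json` flags and the LangGraph URL follow the descriptions above, but confirm the exact option names with `mcpdoc --help`):

```bash
# Using a YAML config file
mcpdoc --yaml sample_config.yaml

# Using a JSON config file
mcpdoc --json sample_config.json

# Directly specifying llms.txt URLs with optional names
mcpdoc --urls "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt" \
       --transport sse --host localhost --port 8082
```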
Additional Options
- `--follow-redirects`: Follow HTTP redirects (defaults to False)
- `--timeout SECONDS`: HTTP request timeout in seconds (defaults to 10.0)
Configuration Format
Configuration files (YAML or JSON) contain a list of documentation sources. Each source must include an `llms_txt` URL and can optionally include a `name`.

YAML Configuration Example (sample_config.yaml)
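A sketch of what such a file might look like (the entry is illustrative; substitute your own `llms.txt` sources):

```yaml
# sample_config.yaml (sketch): a list of documentation sources
- name: LangGraph Python
  llms_txt: https://langchain-ai.github.io/langgraph/llms.txt
```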
JSON Configuration Example (sample_config.json)
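The equivalent sketch in JSON (again, the entry is illustrative):

```json
[
  {
    "name": "LangGraph Python",
    "llms_txt": "https://langchain-ai.github.io/langgraph/llms.txt"
  }
]
```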
Programmatic Usage
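A minimal sketch of creating the server from Python, assuming the package exposes a `create_server` helper in `mcpdoc.main` (the function name, signature, and keyword arguments are assumptions; check the package source for the actual API):

```python
from mcpdoc.main import create_server  # assumed entry point; verify against the mcpdoc package

# Build an MCP server over one or more llms.txt sources (the entry below is illustrative).
server = create_server(
    [
        {
            "name": "LangGraph Python",
            "llms_txt": "https://langchain-ai.github.io/langgraph/llms.txt",
        },
    ],
    follow_redirects=True,  # mirrors the --follow-redirects CLI option
    timeout=15.0,           # mirrors the --timeout CLI option
)

# Run over stdio so an MCP host (Cursor, Windsurf, Claude Code/Desktop) can connect.
server.run(transport="stdio")
```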