Provides AI systems with access to documentation from llms.txt files by fetching and parsing content from specified URLs.

Created: Apr 22, 2025

MCP LLMS-TXT Documentation Server

Overview

llms.txt is a website index for LLMs, providing background information, guidance, and links to detailed markdown files. IDEs like Cursor and Windsurf or apps like Claude Code/Desktop can use llms.txt to retrieve context for tasks. However, these apps use different built-in tools to read and process files like llms.txt. The retrieval process can be opaque, and there is not always a way to audit the tool calls or the context returned.
MCP offers a way for developers to have full control over the tools used by these applications. Here, we create an open-source MCP server that provides MCP host applications (e.g., Cursor, Windsurf, Claude Code/Desktop) with (1) a user-defined list of llms.txt files and (2) a simple fetch_docs tool to read URLs within any of the provided llms.txt files. This allows the user to audit each tool call as well as the context returned.

llms-txt

You can find llms.txt files for LangGraph and LangChain in the repository's table of documentation sources.

Quickstart

Install uv
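The install command did not survive extraction; uv's documented standalone installer for macOS/Linux is:

```shell
# Install uv via the official standalone installer (pip install uv also works)
curl -LsSf https://astral.sh/uv/install.sh | sh
```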

Choose an `llms.txt` file to use.

  • For example, here's the LangGraph llms.txt file.
Note: Security and Domain Access Control. For security reasons, mcpdoc implements strict domain access controls. This prevents unauthorized access to domains not explicitly approved by the user, ensuring that documentation can only be retrieved from trusted sources.
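The list of controls itself was lost above, but the core idea can be sketched: resolve each requested URL and refuse any host outside an explicit allowlist. The helper below is illustrative only, not mcpdoc's actual code:

```python
from urllib.parse import urlparse

def is_allowed(url: str, allowed_domains: set[str]) -> bool:
    """Return True only if the URL's host is in the explicit allowlist."""
    host = urlparse(url).hostname or ""
    return host in allowed_domains

allowed = {"langchain-ai.github.io"}
print(is_allowed("https://langchain-ai.github.io/langgraph/llms.txt", allowed))  # True
print(is_allowed("https://evil.example.com/llms.txt", allowed))                  # False
```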

(Optional) Test the MCP server locally with your `llms.txt` file(s) of choice:
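The local test commands were lost in extraction; under the uvx-based setup used throughout this README, the invocation would be along these lines (the LangGraph URL and port are assumptions):

```shell
# Start the server over SSE so it can be inspected locally
uvx --from mcpdoc mcpdoc \
    --urls "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt" \
    --transport sse \
    --port 8082 \
    --host localhost

# In another terminal, open the MCP inspector and connect to the SSE endpoint
npx @modelcontextprotocol/inspector
```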

  • Here, you can test the tool calls.

Connect to Cursor

  • Open Cursor Settings and MCP tab.
  • This will open the ~/.cursor/mcp.json file.
  • Paste the following into the file (we use the langgraph-docs-mcp name and link to the LangGraph llms.txt).
  • Confirm that the server is running in your Cursor Settings/MCP tab.
  • Best practice is to then update Cursor Global (User) rules.
  • Open Cursor Settings/Rules and update User Rules with the following (or similar):
  • Press CMD+L (on Mac) to open chat.
  • Ensure agent is selected.
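The config block to paste into ~/.cursor/mcp.json did not survive extraction; a plausible sketch, assuming the uvx-based launcher and the LangGraph llms.txt URL:

```json
{
  "mcpServers": {
    "langgraph-docs-mcp": {
      "command": "uvx",
      "args": [
        "--from", "mcpdoc", "mcpdoc",
        "--urls", "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt",
        "--transport", "stdio"
      ]
    }
  }
}
```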
Then, try an example prompt.

Connect to Windsurf

  • Open Cascade with CMD+L (on Mac).
  • Click Configure MCP to open the config file, ~/.codeium/windsurf/mcp_config.json.
  • Update with langgraph-docs-mcp as noted above.
  • Update Windsurf Rules/Global rules with the following (or similar):
Then, try the example prompt.
  • It will perform the tool calls.

Connect to Claude Desktop

  • Open Settings/Developer to update ~/Library/Application\ Support/Claude/claude_desktop_config.json.
  • Update with langgraph-docs-mcp as noted above.
  • Restart Claude Desktop app.
[!Note] If you run into issues with Python version incompatibility when trying to add MCPDoc tools to Claude Desktop, you can explicitly specify the path to the Python executable in the uvx command.
[!Note] Currently (3/21/25) it appears that Claude Desktop does not support global rules, so append the following to your prompt.
  • Your tools will be visible in the bottom right of your chat input.
Then, try the example prompt.
  • It will ask to approve tool calls as it processes your request.

Connect to Claude Code

  • In a terminal after installing Claude Code, run this command to add the MCP server to your project:
  • You will see ~/.claude.json updated.
  • Test by launching Claude Code and running the following to view your tools:
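The exact command was lost in extraction; with recent Claude Code versions, it would be something along these lines (the server name, URL, and JSON shape are assumptions):

```shell
# Add the mcpdoc server to the current project
claude mcp add-json langgraph-docs-mcp '{
  "command": "uvx",
  "args": ["--from", "mcpdoc", "mcpdoc",
           "--urls", "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt",
           "--transport", "stdio"]
}'
```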
[!Note] Currently (3/21/25) it appears that Claude Code does not support global rules, so append the following to your prompt.
Then, try the example prompt.
  • It will ask to approve tool calls.

Command-line Interface

The mcpdoc command provides a simple CLI for launching the documentation server.
You can specify documentation sources in three ways, and these can be combined:
  1. Using a YAML config file:
  • This will load the LangGraph Python documentation from the sample_config.yaml file in this repo.
  2. Using a JSON config file:
  • This will load the LangGraph Python documentation from the sample_config.json file in this repo.
  3. Directly specifying llms.txt URLs with optional names:
  • URLs can be specified either as plain URLs or with optional names using the format name:url.
  • You can specify multiple URLs by using the --urls parameter multiple times.
  • This is how we loaded llms.txt for the MCP server above.
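The name:url convention could be parsed along these lines (an illustrative helper, not mcpdoc's actual code): split on the first colon, unless the text before it is a URL scheme rather than a label:

```python
from typing import Optional, Tuple

def split_named_url(arg: str) -> Tuple[Optional[str], str]:
    """Split an optional 'name:url' argument into (name, url).

    A plain URL such as 'https://...' has no name, so treat the
    argument as unnamed when the text after the first ':' begins
    with '//' (i.e. the prefix was a URL scheme, not a label).
    """
    prefix, sep, rest = arg.partition(":")
    if sep and not rest.startswith("//"):
        return prefix, rest
    return None, arg

print(split_named_url("LangGraph:https://example.com/llms.txt"))
# ('LangGraph', 'https://example.com/llms.txt')
print(split_named_url("https://example.com/llms.txt"))
# (None, 'https://example.com/llms.txt')
```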
You can also combine these methods to merge documentation sources:
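The command block was lost in extraction; combining a config file with extra command-line URLs would look like this (file name as in the repo, URL assumed):

```shell
# Merge sources from a YAML config with an extra llms.txt URL
mcpdoc \
    --yaml sample_config.yaml \
    --urls "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt"
```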

Additional Options

  • --follow-redirects: Follow HTTP redirects (defaults to False)
  • --timeout SECONDS: HTTP request timeout in seconds (defaults to 10.0)
Example with additional options:
This will load the LangGraph Python documentation with a 15-second timeout and follow any HTTP redirects if necessary.
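The command itself was lost in extraction; given the flags above, it would be along these lines:

```shell
mcpdoc --yaml sample_config.yaml --follow-redirects --timeout 15
```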

Configuration Format

Both YAML and JSON configuration files should contain a list of documentation sources.
Each source must include an llms_txt URL and can optionally include a name:

YAML Configuration Example (sample_config.yaml)
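The YAML block is missing above; per the format described, it would look like this (the URL is an assumption):

```yaml
# sample_config.yaml — a list of documentation sources
- name: LangGraph Python
  llms_txt: https://langchain-ai.github.io/langgraph/llms.txt
```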

JSON Configuration Example (sample_config.json)
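The JSON block is missing above; the equivalent of the YAML example would be (the URL is an assumption):

```json
[
  {
    "name": "LangGraph Python",
    "llms_txt": "https://langchain-ai.github.io/langgraph/llms.txt"
  }
]
```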

Programmatic Usage
