Created: Apr 22, 2025

MCP-Ragdocs

A Model Context Protocol (MCP) server that enables semantic search and retrieval of documentation using a vector database (Qdrant). This server allows you to add documentation from URLs or local files and then search through them using natural language queries.

Quick Install Guide

  1. Install the package globally:
  2. Start Qdrant (using Docker):
  3. Ensure Ollama is running with the default embedding model:
  4. Add to your configuration file:
  5. Verify installation:
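The code blocks for these steps were lost in extraction. A plausible sketch, assuming the package is published as `@qpd-v/mcp-server-ragdocs` and the default Ollama embedding model is `nomic-embed-text` (both names are assumptions):

```shell
# 1. Install the package globally (package name assumed)
npm install -g @qpd-v/mcp-server-ragdocs

# 2. Start Qdrant via Docker, exposing the REST API on port 6333
docker run -d -p 6333:6333 qdrant/qdrant

# 3. Pull the default embedding model for Ollama (model name assumed)
ollama pull nomic-embed-text

# 4. Add the server to your MCP client configuration (see Configuration below)

# 5. Verify that Qdrant is reachable; it responds with version info
curl http://localhost:6333
```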

Version

Current version: 0.1.6

Features

  • Add documentation from URLs or local files
  • Store documentation in a vector database for semantic search
  • Search through documentation using natural language
  • List all documentation sources

Installation

Install globally using npm:
This installs the server into your global npm directory; you'll need that install location for the configuration steps below.
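The install command itself was lost in extraction; assuming the package name matches the project (an assumption, not confirmed by the source):

```shell
# Install the MCP server globally (scoped package name assumed)
npm install -g @qpd-v/mcp-server-ragdocs
```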

Requirements

  • Node.js 16 or higher
  • Qdrant (either local or cloud)
  • One of the following for embeddings:
      • Ollama running locally (default)
      • An OpenAI API key

Qdrant Setup Options

Option 1: Local Qdrant

  1. Using Docker (recommended):
  2. Or download from Qdrant's website
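The Docker command was lost in extraction; the standard way to run Qdrant locally is:

```shell
# Pull and run Qdrant, exposing the REST API on port 6333
docker run -d --name qdrant -p 6333:6333 qdrant/qdrant
```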

Option 2: Qdrant Cloud

  1. Create an account at Qdrant Cloud
  2. Create a new cluster
  3. Get your cluster URL and API key from the dashboard
  4. Use these in your configuration (see Configuration section below)

Configuration

The server can be used with both Cline/Roo and Claude Desktop. Configuration differs slightly between them:

Cline Configuration

Add to your Cline settings file (%AppData%\Roaming\Code\User\globalStorage\saoudrizwan.claude-dev\settings\cline_mcp_settings.json) and/or your Roo-Code settings file (%AppData%\Roaming\Code\User\globalStorage\rooveterinaryinc.roo-cline\settings\cline_mcp_settings.json):
  1. Using npm global install (recommended):
     For OpenAI instead of Ollama:
  2. Using local development setup:
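The JSON snippets for these options were lost in extraction. A sketch of the global-install entry, assuming the package exposes an `mcp-server-ragdocs` command and reads the environment variables listed under Environment Variables (server key and command name are assumptions):

```json
{
  "mcpServers": {
    "ragdocs": {
      "command": "mcp-server-ragdocs",
      "env": {
        "QDRANT_URL": "http://localhost:6333",
        "EMBEDDING_PROVIDER": "ollama"
      }
    }
  }
}
```

For OpenAI instead of Ollama, set `"EMBEDDING_PROVIDER": "openai"` and add `"OPENAI_API_KEY": "your-api-key"` to the `env` block.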

Claude Desktop Configuration

Add to your Claude Desktop config file:
  • Windows: %AppData%\Claude\claude_desktop_config.json
  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  1. Windows Setup with Ollama (using full paths):
     Windows Setup with OpenAI:
  2. macOS Setup with Ollama:
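The original config examples were lost in extraction. A sketch of a Windows entry with full paths; the Node and install paths below are hypothetical placeholders and will differ per machine:

```json
{
  "mcpServers": {
    "ragdocs": {
      "command": "C:\\Program Files\\nodejs\\node.exe",
      "args": [
        "C:\\Users\\YOUR_USER\\AppData\\Roaming\\npm\\node_modules\\@qpd-v\\mcp-server-ragdocs\\build\\index.js"
      ],
      "env": {
        "QDRANT_URL": "http://localhost:6333",
        "EMBEDDING_PROVIDER": "ollama"
      }
    }
  }
}
```

On macOS, the same shape applies with POSIX paths (e.g. a globally installed script under `/usr/local/lib/node_modules`); run `npm root -g` to find your global module directory.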

Qdrant Cloud Configuration

For either Cline or Claude Desktop, when using Qdrant Cloud, modify the env section:
With Ollama:
With OpenAI:
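The modified `env` fragments were lost in extraction. A sketch, with placeholder cluster URL and keys:

```json
"env": {
  "QDRANT_URL": "https://YOUR-CLUSTER-URL.cloud.qdrant.io:6333",
  "QDRANT_API_KEY": "your-qdrant-api-key",
  "EMBEDDING_PROVIDER": "ollama"
}
```

With OpenAI, additionally set `"EMBEDDING_PROVIDER": "openai"` and `"OPENAI_API_KEY": "your-openai-api-key"`.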

Environment Variables

Qdrant Configuration

  • QDRANT_URL (required): URL of your Qdrant instance
  • QDRANT_API_KEY (required for cloud): Your Qdrant Cloud API key

Embeddings Configuration

  • EMBEDDING_PROVIDER (optional): Choose between 'ollama' (default) or 'openai'
  • EMBEDDING_MODEL (optional): Embedding model to use; the default depends on the chosen provider
  • OPENAI_API_KEY (required if using OpenAI): Your OpenAI API key

Available Tools

  1. add_documentation
  2. search_documentation
  3. list_sources

Example Usage

In Claude Desktop or any other MCP-compatible client:
  1. Add documentation:
  2. Search documentation:
  3. List sources:
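The original example prompts were lost in extraction. Requests of roughly this shape would exercise the three tools listed above (the exact phrasing is up to the client):

```
Add the documentation from https://docs.qdrant.tech to my knowledge base.
Search my documentation for "how do I create a collection?"
List all the documentation sources that have been added.
```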

Development

  1. Clone the repository:
  2. Install dependencies:
  3. Build the project:
  4. Run locally:
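The commands for these steps were lost in extraction. A typical Node.js workflow, assuming standard `build` and `start` npm scripts (the repository URL is a placeholder):

```shell
# Repository URL is a placeholder; substitute the actual repo
git clone https://github.com/<owner>/mcp-ragdocs.git
cd mcp-ragdocs

# Install dependencies, build, and run (script names assumed)
npm install
npm run build
npm start
```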

License

MIT

Troubleshooting

Common Issues

  1. Qdrant Connection Error
  2. Ollama Model Missing
  3. Configuration Path Issues
  4. npm Global Install Issues
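The original fixes for these issues were lost in extraction. Some plausible first checks (the embedding model name is an assumption):

```shell
# Qdrant connection error: confirm the instance responds on port 6333
curl http://localhost:6333

# Ollama model missing: pull the default embedding model (name assumed)
ollama pull nomic-embed-text

# npm global install issues: find where global packages are installed
npm root -g
```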
For other issues, please check:
  • Docker logs: docker logs $(docker ps -q --filter ancestor=qdrant/qdrant)
  • Ollama status: ollama list
  • Node.js version: node -v (should be 16 or higher)

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.
