While LLM apps can access web search tools with MCP, GPT Researcher MCP delivers deep research results. Standard search tools return raw results requiring manual filtering, often containing irrelevant sources and wasting context window space.
GPT Researcher autonomously explores and validates numerous sources, focusing only on relevant, trusted, and up-to-date information. Though slightly slower than standard search (~30 seconds), it delivers:

- Higher quality information
- Optimized context usage
- Comprehensive results
- Better reasoning for LLMs
## Available Tools & Resources

### Resources
- `research_resource`: Get web resources related to a given task via research.

### Primary Tools
- `deep_research`: Performs deep web research on a topic, finding the most reliable and relevant information.
- `quick_search`: Performs a fast web search optimized for speed over quality, returning search results with snippets. Supports any web retriever supported by GPT Researcher, such as Tavily, Bing, or Google.
- `write_report`: Generate a report based on research results.
- `get_research_sources`: Get the sources used in the research.
- `get_research_context`: Get the full context of the research.
### Prompts
- `research_query`: Create a research query prompt.
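As a rough sketch of how a client consumes these tools, the snippet below launches the server over stdio and calls `deep_research` using the official `mcp` Python SDK. The `server.py` entry point and the `query` argument name are assumptions for illustration, not taken from this documentation:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main():
    # Launch the MCP server as a subprocess and talk to it over stdio.
    # "server.py" is an assumed entry point for this sketch.
    params = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Invoke the deep_research tool; the "query" argument name
            # is assumed here for illustration.
            result = await session.call_tool(
                "deep_research",
                arguments={"query": "state of quantum computing in 2025"},
            )
            print(result.content)


asyncio.run(main())
```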
## Prerequisites
Before running the MCP server, make sure you have:
- Python 3.10 or higher installed
- API keys for the services you plan to use, e.g. `OPENAI_API_KEY` for the LLM and `TAVILY_API_KEY` for web retrieval
## Installation
Clone the GPT Researcher repository:
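For example, assuming the repository's usual GitHub location:

```bash
git clone https://github.com/assafelovic/gptr-mcp.git
cd gptr-mcp
```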
Install the gptr-mcp dependencies:
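Assuming the repository ships a standard `requirements.txt`, a pip-based install looks like:

```bash
pip install -r requirements.txt
```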
Set up your environment variables:
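For example, a minimal `.env` file might contain placeholders like these (the exact keys depend on the services you use):

```bash
OPENAI_API_KEY=your-openai-api-key
TAVILY_API_KEY=your-tavily-api-key
```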
You can also add any other environment variable used by your GPT Researcher configuration.
## Running the MCP Server
You can start the MCP server in two ways:
### Method 1: Directly using Python
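Assuming `server.py` is the server's entry point, this amounts to:

```bash
python server.py
```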
### Method 2: Using the MCP CLI (if installed)
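With the MCP CLI (available via `pip install "mcp[cli]"`), the same entry point can be run as:

```bash
mcp run server.py
```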
Once the server is running, you'll see output indicating that the server is ready to accept connections.
## Integrating with Claude
You can integrate your MCP server with Claude using:
- [Claude Desktop Integration](https://docs.gptr.dev/docs/gpt-researcher/mcp-server/claude-integration) - for use with the Claude Desktop application on Mac
For detailed instructions, follow the link above.
## Claude Desktop Integration
To integrate your locally running MCP server with Claude Desktop on Mac, you'll need to register the server in Claude Desktop's configuration file, along the lines of the sketch below.
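A minimal sketch of a `claude_desktop_config.json` entry follows; the server name, paths, and key values are placeholders to adapt to your setup:

```json
{
  "mcpServers": {
    "gpt-researcher": {
      "command": "python",
      "args": ["/absolute/path/to/gptr-mcp/server.py"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "TAVILY_API_KEY": "your-tavily-api-key"
      }
    }
  }
}
```

After saving the file, restart Claude Desktop so the new server is picked up.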