
GPT Researcher

Enables real-time web research, information gathering, and report generation, with tools for conducting deep research, quick searches, report writing, and source retrieval.

Created: Apr 22, 2025

GPT Researcher MCP Server

Why GPT Researcher MCP?

While LLM apps can access web search tools with MCP, GPT Researcher MCP delivers deep research results. Standard search tools return raw results requiring manual filtering, often containing irrelevant sources and wasting context window space.
GPT Researcher autonomously explores and validates numerous sources, focusing only on relevant, trusted, and up-to-date information. Though slightly slower than a standard search (roughly a 30-second wait), it delivers:
  • Higher quality information
  • Optimized context usage
  • Comprehensive results
  • Better reasoning for LLMs

Claude Desktop Demo

Resources

  • research_resource: Get web resources related to a given task via research.

Primary Tools

  • deep_research: Performs deep web research on a topic, finding the most reliable and relevant information
  • quick_search: Performs a fast web search optimized for speed over quality, returning search results with snippets. Supports any GPTR-supported web retriever, such as Tavily, Bing, or Google.
  • write_report: Generate a report based on research results
  • get_research_sources: Get the sources used in the research
  • get_research_context: Get the full context of the research

Prompts

  • research_query: Create a research query prompt

Prerequisites

Before running the MCP server, make sure you have:
  1. Python 3.10 or higher installed
  2. API keys for the services you plan to use (e.g., your LLM provider and your chosen web retriever)

Installation

  1. Clone the GPT Researcher repository:
  2. Install the gptr-mcp dependencies:
  3. Set up your environment variables:
You can also add any other environment variables supported by your GPT Researcher configuration.
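The three installation steps above might look like the following on the command line. The repository URL, file names, and API key names are assumptions based on the public gptr-mcp repository and the services mentioned in this document; adjust them to match your setup:

```shell
# 1. Clone the GPT Researcher MCP repository (URL assumed)
git clone https://github.com/assafelovic/gptr-mcp.git
cd gptr-mcp

# 2. Install the gptr-mcp dependencies
pip install -r requirements.txt

# 3. Set up your environment variables in a .env file
#    (key names are examples; use the keys for the services you enabled)
cat > .env << 'EOF'
OPENAI_API_KEY=your_openai_key_here
TAVILY_API_KEY=your_tavily_key_here
EOF
```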

Running the MCP Server

You can start the MCP server in two ways:

Method 1: Directly using Python

Method 2: Using the MCP CLI (if installed)
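Assuming the server entry point is named `server.py` (a guess; check the repository for the actual file name), the two methods would look like:

```shell
# Method 1: run the server directly with Python
python server.py

# Method 2: run it through the MCP CLI, if the mcp package is installed
mcp run server.py
```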

Once the server is running, you'll see output indicating that the server is ready to accept connections.

Integrating with Claude

You can integrate your MCP server with Claude using:
[Claude Desktop Integration](https://docs.gptr.dev/docs/gpt-researcher/mcp-server/claude-integration) - For use with the Claude Desktop application on Mac
For detailed instructions, follow the link above.

Claude Desktop Integration

To integrate your locally running MCP server with Claude for Mac, you'll need to:
  1. Make sure the MCP server is installed and running
  2. Configure Claude Desktop
For complete step-by-step instructions, see the Claude Desktop Integration guide.
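As a sketch, Claude Desktop is typically configured by registering the server under `mcpServers` in its `claude_desktop_config.json` file. The server name, command, path, and key names below are assumptions; adapt them to your installation:

```json
{
  "mcpServers": {
    "gpt-researcher": {
      "command": "python",
      "args": ["/absolute/path/to/gptr-mcp/server.py"],
      "env": {
        "OPENAI_API_KEY": "your_openai_key_here",
        "TAVILY_API_KEY": "your_tavily_key_here"
      }
    }
  }
}
```

After editing the file, restart Claude Desktop so it picks up the new server.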

Example Usage with Claude

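As an illustration (this is a hypothetical exchange, not taken from the original), a typical session chains the tools together:

```
You: I'm considering an investment in NVIDIA. Can you research the current
     state of the company and summarize the key risks?

Claude: [calls deep_research with the query and waits for results]
        [calls write_report to turn the research context into a report]
        Here is a summary of the findings, with sources...
```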

Troubleshooting

If you encounter issues while running the MCP server:
  1. Make sure your API keys are correctly set in the .env file
  2. Check that you're using Python 3.10 or higher
  3. Ensure all dependencies are installed correctly
  4. Check the server logs for error messages

Next Steps

License

This project is licensed under the MIT License - see the LICENSE file for details.

Support / Contact
