OpenAPI to Model Context Protocol (MCP)

![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)
The OpenAPI-MCP proxy translates OpenAPI specs into MCP tools, enabling AI agents to access external APIs without custom wrappers!
OpenAPI-MCP

Bridge the gap between AI agents and external APIs

The OpenAPI to Model Context Protocol (MCP) proxy server bridges the gap between AI agents and external APIs by dynamically translating OpenAPI specifications into standardized MCP tools, resources, and prompts. This simplifies integration by eliminating the need for custom API wrappers.

If you find it useful, please give it a ⭐ on GitHub!

Key Features

  • FastMCP Transport: Optimized for stdio, working out-of-the-box with popular LLM orchestrators.
  • OpenAPI Integration: Parses and registers OpenAPI operations as callable tools.
  • Resource Registration: Automatically converts OpenAPI component schemas into resource objects with defined URIs.
  • Prompt Generation: Generates contextual prompts based on API operations to guide LLMs in using the API.
  • OAuth2 Support: Handles machine authentication via Client Credentials flow.
  • JSON-RPC 2.0 Support: Fully compliant request/response structure.
  • Auto Metadata: Derives tool names, summaries, and schemas from the OpenAPI specification.
  • Sanitized Tool Names: Ensures compatibility with MCP name constraints.
  • Flexible Parameter Parsing: Supports query strings (with a leading "?") and multiple JSON variations (including keys with dots and numeric values).
  • Enhanced Parameter Handling: Automatically converts parameters to the correct data types.
  • Extended Tool Metadata: Includes detailed parameter information and response schemas.

Quick Start

Installation
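A typical setup for a Python MCP server of this kind is sketched below; the repository URL, directory layout, and requirements file name are placeholders rather than confirmed details of this project:

```bash
# Placeholder repository URL and layout; adjust to the actual project.
git clone https://github.com/your-org/openapi-mcp.git full_path_to_openapi_mcp
cd full_path_to_openapi_mcp
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt   # requirements file name is an assumption
```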

LLM Orchestrator Configuration

For Claude Desktop, Cursor, and Windsurf, use the snippet below and adapt the paths accordingly:
Apply this configuration to the following files:
  • Cursor: ~/.cursor/mcp.json
  • Windsurf: ~/.codeium/windsurf/mcp_config.json
  • Claude Desktop: ~/Library/Application Support/Claude/claude_desktop_config.json
Replace `full_path_to_openapi_mcp` with your actual installation path.
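A minimal sketch of such a snippet, following the `mcpServers` layout these clients share; the server name, entry-point script, environment variable name, and example spec URL are illustrative assumptions, not confirmed values:

```json
{
  "mcpServers": {
    "openapi_proxy": {
      "command": "full_path_to_openapi_mcp/venv/bin/python",
      "args": ["full_path_to_openapi_mcp/src/server.py"],
      "env": {
        "OPENAPI_URL": "https://petstore3.swagger.io/api/v3/openapi.json"
      }
    }
  }
}
```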

Environment Configuration

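The proxy is configured through environment variables. The names below are illustrative assumptions based on the features described above (spec location and the OAuth2 Client Credentials flow), not an authoritative list:

```bash
# URL of the OpenAPI specification (JSON or YAML); variable name is illustrative.
export OPENAPI_URL="https://petstore3.swagger.io/api/v3/openapi.json"
# OAuth2 Client Credentials (names are assumptions; set only if the target API requires auth).
export OAUTH_CLIENT_ID="your-client-id"
export OAUTH_CLIENT_SECRET="your-client-secret"
export OAUTH_TOKEN_URL="https://auth.example.com/oauth/token"
```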

How It Works

  1. Parses the OpenAPI Spec: Loads the OpenAPI specification with httpx, using PyYAML when the spec is provided as YAML.
  2. Registers Operations: Extracts API operations and generates MCP-compatible tools with proper input and response schemas (a simplified sketch follows this list).
  3. Resource Registration: Automatically converts OpenAPI component schemas into resource objects with assigned URIs (e.g., /resource/{name}).
  4. Prompt Generation: Creates contextual prompts based on API operations to assist LLMs in understanding API usage.
  5. Authentication: Supports OAuth2 authentication via the Client Credentials flow.
  6. Parameter Handling: Converts parameters to the required data types and supports flexible query string and JSON formats.
  7. JSON-RPC 2.0 Compliance: Ensures standard communication protocols for tool interactions.
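To make steps 1 and 2 concrete, the sketch below fetches a spec and derives sanitized tool descriptions from its operations. It is a simplified approximation with hypothetical helper names, not the project's actual implementation:

```python
import re
import httpx
import yaml  # PyYAML, used when the spec is served as YAML


def load_spec(url: str) -> dict:
    """Fetch an OpenAPI spec and parse it as JSON, falling back to YAML."""
    response = httpx.get(url)
    response.raise_for_status()
    try:
        return response.json()
    except ValueError:
        return yaml.safe_load(response.text)


def sanitize_tool_name(name: str) -> str:
    """Reduce an operationId or path to characters allowed in MCP tool names."""
    return re.sub(r"[^a-zA-Z0-9_]+", "_", name).strip("_").lower()


def extract_tools(spec: dict) -> list[dict]:
    """Turn each OpenAPI operation into a minimal tool description."""
    tools = []
    for path, operations in spec.get("paths", {}).items():
        for method, op in operations.items():
            if method.lower() not in {"get", "post", "put", "patch", "delete"}:
                continue
            name = op.get("operationId") or f"{method}_{path}"
            tools.append({
                "name": sanitize_tool_name(name),
                "description": op.get("summary", ""),
                "parameters": op.get("parameters", []),
            })
    return tools


if __name__ == "__main__":
    spec = load_spec("https://petstore3.swagger.io/api/v3/openapi.json")
    for tool in extract_tools(spec):
        print(tool["name"], "-", tool["description"])
```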

Resources & Prompts

In addition to tools, the proxy server now automatically registers:
  • Resources: Derived from OpenAPI component schemas, resource objects are registered with defined URIs (e.g., /resource/{name}) for structured data handling.
  • Prompts: Contextual prompts are generated based on API operations to provide usage guidance to LLMs, enhancing their understanding of available endpoints.
This extended metadata improves integration by providing comprehensive API context.
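For example, once a component schema is registered, an MCP client can fetch it with a standard JSON-RPC 2.0 resources/read request; the URI below follows the /resource/{name} pattern above and is purely illustrative:

```json
{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "resources/read",
  "params": {
    "uri": "/resource/Pet"
  }
}
```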

Contributing

  • Fork this repository.
  • Create a new branch.
  • Submit a pull request with a clear description of your changes.

License

MIT License
If you find it useful, please give it a ⭐ on GitHub!
