# Databricks MCP Server

Created: Apr 22, 2025
A Model Context Protocol (MCP) server for Databricks that provides access to Databricks functionality via the MCP protocol. This allows LLM-powered tools to interact with Databricks clusters, jobs, notebooks, and more.
## Features

- **MCP Protocol Support**: Implements the MCP protocol to allow LLMs to interact with Databricks
- **Databricks API Integration**: Provides access to Databricks REST API functionality
- **Tool Registration**: Exposes Databricks functionality as MCP tools
- **Async Support**: Built with asyncio for efficient operation
## Available Tools

The Databricks MCP Server exposes the following tools:

- `list_clusters`: List all Databricks clusters
- `create_cluster`: Create a new Databricks cluster
- `terminate_cluster`: Terminate a Databricks cluster
- `get_cluster`: Get information about a specific Databricks cluster
- `start_cluster`: Start a terminated Databricks cluster
- `list_jobs`: List all Databricks jobs
- `run_job`: Run a Databricks job
- `list_notebooks`: List notebooks in a workspace directory
- `export_notebook`: Export a notebook from the workspace
- `list_files`: List files and directories in a DBFS path
- `execute_sql`: Execute a SQL statement
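Once the server is running, an MCP client invokes these tools through JSON-RPC 2.0 `tools/call` requests. The sketch below builds such a request in Python for the `list_clusters` tool; the exact arguments each tool accepts are assumptions here, so consult the server's tool schemas for the authoritative shapes.

```python
import json

# Minimal sketch of an MCP "tools/call" request as it appears on the wire.
# MCP messages are JSON-RPC 2.0; "list_clusters" is one of the tools listed
# above, and the empty arguments object is an assumption for this tool.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "list_clusters",
        "arguments": {},
    },
}

# Serialize for transport (stdio or HTTP, depending on the client).
wire_message = json.dumps(request)
print(wire_message)
```

A tool such as `execute_sql` would carry its inputs in the same `arguments` object (for example, the SQL text and a warehouse identifier), with the exact field names defined by the server's tool registration.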
## Installation

### Prerequisites

- Python 3.10 or higher
- The `uv` package manager (recommended for MCP servers)
### Setup

1. Install `uv` if you don't have it already. Restart your terminal after installation.
2. Clone the repository.
3. Set up the project with `uv`.
4. Set up environment variables. You can also create a `.env` file based on the `.env.example` template.
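The steps above can be sketched as shell commands. The `uv` installer URL is the official one; the repository URL, the project directory name, the use of `uv sync`, and the `DATABRICKS_HOST`/`DATABRICKS_TOKEN` variable names are assumptions (the latter follow common Databricks conventions), so check the repository's `.env.example` for the actual variables.

```shell
# 1. Install uv (official installer), then restart your terminal
curl -LsSf https://astral.sh/uv/install.sh | sh

# 2. Clone the repository (URL is a placeholder; substitute the real one)
git clone <repository-url>
cd databricks-mcp-server

# 3. Create the environment and install dependencies
uv sync

# 4. Configure credentials, or put these in a .env file per .env.example
export DATABRICKS_HOST="https://<your-workspace>.cloud.databricks.com"
export DATABRICKS_TOKEN="<your-personal-access-token>"
```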
## Running the MCP Server

To start the MCP server, run the wrapper scripts. These execute the actual server scripts located in the `scripts` directory; the server will start and be ready to accept MCP protocol connections.

You can also run the server scripts in the `scripts` directory directly.
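As a sketch, starting the server might look like the following; the wrapper and script names here are hypothetical, so check the repository root and the `scripts` directory for the actual file names.

```shell
# Via the wrapper in the repository root (name is an assumption)
./start_mcp_server.sh

# Or run the underlying server script directly (path is an assumption)
uv run python scripts/start_mcp_server.py
```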
## Querying Databricks Resources

The repository includes utility scripts to quickly view Databricks resources.
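For example (the script names below are hypothetical; list the `scripts` directory for the real ones):

```shell
# View clusters and jobs in the configured workspace (names are assumptions)
uv run python scripts/show_clusters.py
uv run python scripts/show_jobs.py
```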
## Project Structure

See `project_structure.md` for a more detailed view of the project structure.
## Development

### Code Standards

- Python code follows the PEP 8 style guide with a maximum line length of 100 characters
- Use 4 spaces for indentation (no tabs)
- Use double quotes for strings
- All classes, methods, and functions should have Google-style docstrings
- Type hints are required for all code except tests
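As an illustration of these standards, here is a hypothetical helper (not part of the actual codebase) with a Google-style docstring, type hints, double quotes, and 4-space indentation; the return value is stubbed for demonstration.

```python
def restart_cluster(cluster_id: str, timeout_seconds: int = 600) -> dict:
    """Restart a Databricks cluster and wait for it to become available.

    Args:
        cluster_id: The unique identifier of the cluster.
        timeout_seconds: Maximum time to wait for the cluster to restart.

    Returns:
        A dictionary describing the cluster's final state.
    """
    # Stubbed result for illustration; a real implementation would call
    # the Databricks REST API and poll the cluster state.
    return {"cluster_id": cluster_id, "state": "RUNNING"}
```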
### Linting

The project uses the following linting tools:
### Testing

The project uses pytest for testing. To run the tests:

You can also run the tests directly with pytest:

A minimum code coverage of 80% is the goal for the project.
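A typical invocation might look like this; whether the project wraps pytest in a script, and whether `pytest-cov` is installed for coverage reporting, are assumptions.

```shell
# Run the full test suite (assumes a standard tests/ layout)
uv run pytest tests/

# With a coverage report, to track progress toward the 80% goal
# (requires the pytest-cov plugin)
uv run pytest --cov=src tests/
```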
## Documentation

- API documentation is generated using Sphinx and can be found in the `docs/api` directory
- All code includes Google-style docstrings
- See the `examples/` directory for usage examples
## Examples

Check the `examples/` directory for usage examples. To run examples:
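For instance (the example script name is hypothetical; list the `examples/` directory for the real ones):

```shell
# Run an example against the configured workspace (name is an assumption)
uv run python examples/direct_usage.py
```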
## Contributing

Contributions are welcome! Please feel free to submit a pull request.

- Ensure your code follows the project's coding standards
- Add tests for any new functionality
- Update documentation as necessary
- Verify all tests pass before submitting
## License

This project is licensed under the MIT License; see the LICENSE file for details.