# Databricks MCP Server

A Model Context Protocol (MCP) server that connects to the Databricks API, allowing LLMs to run SQL queries, list jobs, and check job status.
## Features
- Run SQL queries on Databricks SQL warehouses
- Get status of specific Databricks jobs
- Get detailed information about Databricks jobs
## Prerequisites

- Python 3 installed locally
- A Databricks workspace with:
  - A running SQL warehouse
  - Permission to create a personal access token
## Setup

- Clone this repository
- Create and activate a virtual environment (recommended)
- Install dependencies
- Create a `.env` file in the root directory with your Databricks credentials (host, token, and HTTP path)
- Test your connection (optional but recommended)
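The steps above might look like the following. The repository URL, dependency file, and environment variable names are illustrative placeholders; adjust them to match the actual project:

```shell
# Clone the repository (URL is a placeholder)
git clone https://github.com/your-org/databricks-mcp-server.git
cd databricks-mcp-server

# Create and activate a virtual environment
python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate

# Install dependencies (assuming a requirements.txt is provided)
pip install -r requirements.txt

# Create the .env file with your Databricks credentials
# (variable names are assumptions; use the names the server expects)
cat > .env <<'EOF'
DATABRICKS_HOST=your-instance.cloud.databricks.com
DATABRICKS_TOKEN=your-personal-access-token
DATABRICKS_HTTP_PATH=/sql/1.0/warehouses/your-warehouse-id
EOF

# Test the connection
python test_connection.py
```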
## Obtaining Databricks Credentials

- Host: Your Databricks instance URL without the scheme (e.g., `your-instance.cloud.databricks.com`)
- Token: Create a personal access token in Databricks under Settings > Developer > Access tokens
- HTTP Path: The path for your SQL warehouse, shown on the warehouse's Connection details tab (e.g., `/sql/1.0/warehouses/your-warehouse-id`)
## Running the Server

Start the MCP server from the project root. You can then test it interactively using the MCP Inspector.
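Assuming the server's entry point is `main.py` (a placeholder; use the actual filename), the commands would be:

```shell
# Start the MCP server
python main.py

# Or launch it under the MCP Inspector for interactive testing
# (requires Node.js; the inspector wraps the server command)
npx @modelcontextprotocol/inspector python main.py
```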
## Available MCP Tools

The following MCP tools are available:

- `run_sql_query(sql: str)` - Execute SQL queries on your Databricks SQL warehouse
- `list_jobs()` - List all Databricks jobs in your workspace
- `get_job_status(job_id: int)` - Get the status of a specific Databricks job by ID
- `get_job_details(job_id: int)` - Get detailed information about a specific Databricks job
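Under the hood, the job tools map onto the Databricks Jobs REST API. A minimal sketch of what `get_job_status(job_id)` might do is shown below; the helper names and environment variable names are illustrative, while `/api/2.1/jobs/get` is the actual Jobs API endpoint:

```python
import json
import os
import urllib.request


def job_status_url(host: str, job_id: int) -> str:
    """Build the Jobs API URL for fetching a single job's metadata."""
    return f"https://{host}/api/2.1/jobs/get?job_id={job_id}"


def get_job_status(job_id: int) -> dict:
    """Fetch job metadata, authenticating with the personal access token."""
    host = os.environ["DATABRICKS_HOST"]
    token = os.environ["DATABRICKS_TOKEN"]
    req = urllib.request.Request(
        job_status_url(host, job_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```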
## Example Usage with LLMs
When used with LLMs that support the MCP protocol, this server enables natural language interaction with your Databricks environment:
- "Show me all tables in the database"
- "Run a query to count records in the customer table"
- "List all my Databricks jobs"
- "Check the status of job #123"
- "Show me details about job #456"
## Troubleshooting

### Connection Issues

- Ensure your Databricks host is correct and doesn't include the `https://` prefix
- Check that your SQL warehouse is running and accessible
- Verify your personal access token has the necessary permissions
- Run the included test script: `python test_connection.py`
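If you are unsure whether your configured host is in the right form, a small normalization helper (illustrative, not part of the server) can strip an accidental scheme prefix and trailing slash:

```python
def normalize_host(host: str) -> str:
    """Strip an accidental scheme prefix and trailing slash from a Databricks host."""
    for prefix in ("https://", "http://"):
        if host.startswith(prefix):
            host = host[len(prefix):]
    return host.rstrip("/")


print(normalize_host("https://your-instance.cloud.databricks.com/"))
# your-instance.cloud.databricks.com
```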
## Security Considerations

- Your Databricks personal access token grants direct access to your workspace
- Secure your `.env` file and never commit it to version control
- Use a token with only the permission scopes the server needs
- Run this server in a secure environment