A Model Context Protocol (MCP) server for interacting with Keboola Connection. This server provides tools for listing and accessing data from the Keboola Storage API.
Requirements
Python 3.10 or newer
Keboola Storage API token
Snowflake or BigQuery Read Only Workspace
Installation
Installing via Smithery
To install Keboola Explorer for Claude Desktop automatically via Smithery:
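A typical Smithery install command looks like the following; the exact package name on Smithery is an assumption, so verify it before running:

```bash
# package name assumed to be "keboola-mcp-server"; check Smithery if it differs
npx -y @smithery/cli install keboola-mcp-server --client claude
```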
Manual Installation
First, clone the repository and create a virtual environment:
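For example (the repository URL is assumed; substitute the actual location of the repository):

```bash
git clone https://github.com/keboola/keboola-mcp-server.git
cd keboola-mcp-server
python3 -m venv .venv
source .venv/bin/activate   # on Windows: .venv\Scripts\activate
```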
Install the package in development mode:
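With the virtual environment activated:

```bash
pip install -e .
```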
For development dependencies:
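Assuming the project defines a dev extra in its packaging metadata:

```bash
# "[dev]" extra name is an assumption; check pyproject.toml / setup.cfg
pip install -e ".[dev]"
```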
Claude Desktop Setup
To use this server with Claude Desktop, follow these steps:
Create or edit the Claude Desktop configuration file:
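The file lives in Claude Desktop's application data directory:

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`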
Add the following configuration (adjust paths according to your setup):
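A sketch of the JSON entry, assuming the package is runnable as a keboola_mcp_server module and accepts an --api-url option (both assumptions; check the server's --help):

```json
{
  "mcpServers": {
    "keboola": {
      "command": "/path/to/keboola-mcp-server/.venv/bin/python",
      "args": [
        "-m", "keboola_mcp_server",
        "--api-url", "https://connection.YOUR_REGION.keboola.com"
      ],
      "env": {
        "KBC_STORAGE_TOKEN": "your-keboola-storage-token",
        "KBC_WORKSPACE_SCHEMA": "your-workspace-schema"
      }
    }
  }
}
```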
Replace:
/path/to/keboola-mcp-server with your actual path to the cloned repository
YOUR_REGION with your Keboola region (e.g., north-europe.azure). If your project URL is simply connection.keboola.com (no region prefix), omit this part entirely
your-keboola-storage-token with your Keboola Storage API token
your-workspace-schema with your Snowflake schema or BigQuery dataset of your workspace
Note: If you are using a specific version of Python (e.g., 3.11 due to package compatibility issues),
update the command to point to that exact interpreter, e.g. /path/to/keboola-mcp-server/.venv/bin/python3.11
Note: The workspace can be created in your Keboola project; it is the same project where you obtained
your Storage API token. The workspace provides all the necessary connection parameters, including the schema or dataset name.
After updating the configuration, restart Claude Desktop for the changes to take effect.
Troubleshooting
If you encounter connection issues:
Check the logs in Claude Desktop for any error messages
Verify your Keboola Storage API token is correct
Ensure all paths in the configuration are absolute paths
Confirm the virtual environment is properly activated and all dependencies are installed
Cursor AI Setup
To use this server with Cursor AI, you have two options for configuring the transport method: Server-Sent Events (SSE) or Standard I/O (stdio).
Create or edit the Cursor AI configuration file:
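Cursor reads MCP server definitions from an mcp.json file, either globally (`~/.cursor/mcp.json`) or per project (`.cursor/mcp.json` in the project root).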
Add one of the following configurations (or all) based on your preferred transport method:
Option 1: Using Server-Sent Events (SSE)
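A sketch of an SSE entry, assuming the server listens on localhost port 8000 and exposes an /sse endpoint (adjust to however you start the server):

```json
{
  "mcpServers": {
    "keboola": {
      "url": "http://localhost:8000/sse"
    }
  }
}
```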
Option 2a: Using Standard I/O (stdio)
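A stdio entry mirrors the Claude Desktop configuration; the module name, --transport, and --api-url options are the same assumptions as above:

```json
{
  "mcpServers": {
    "keboola": {
      "command": "/path/to/keboola-mcp-server/.venv/bin/python",
      "args": [
        "-m", "keboola_mcp_server",
        "--transport", "stdio",
        "--api-url", "https://connection.YOUR_REGION.keboola.com"
      ],
      "env": {
        "KBC_STORAGE_TOKEN": "your-keboola-storage-token",
        "KBC_WORKSPACE_SCHEMA": "your-workspace-schema"
      }
    }
  }
}
```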
Option 2b: Using WSL Standard I/O (wsl stdio)
When running the MCP server from Windows Subsystem for Linux (WSL) with Cursor AI, use the configuration sketched below.
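A sketch, assuming wsl.exe is on PATH and the server module is named keboola_mcp_server (both assumptions; adjust to your setup):

```json
{
  "mcpServers": {
    "keboola": {
      "command": "wsl.exe",
      "args": [
        "bash",
        "-c",
        "source /wsl_path/to/keboola-mcp-server/.env && /wsl_path/to/keboola-mcp-server/.venv/bin/python -m keboola_mcp_server --transport stdio"
      ]
    }
  }
}
```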
where the /wsl_path/to/keboola-mcp-server/.env file contains the environment variables:
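For example:

```bash
export KBC_STORAGE_TOKEN="your-keboola-storage-token"
export KBC_WORKSPACE_SCHEMA="your-workspace-schema"
```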
Replace:
/path/to/keboola-mcp-server with your actual path to the cloned repository
YOUR_REGION with your Keboola region (e.g., north-europe.azure). If your project URL is simply connection.keboola.com (no region prefix), omit this part entirely
your-keboola-storage-token with your Keboola Storage API token
your-workspace-schema with your Snowflake schema or BigQuery dataset of your workspace
After updating the configuration:
Restart Cursor AI
If you use the SSE transport, make sure to start your MCP server first. You can do so by running the following
in the activated virtual environment where you installed the server:
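A sketch of the start command; the --transport and --api-url flags are assumptions, so verify them with the server's --help. KBC_STORAGE_TOKEN and KBC_WORKSPACE_SCHEMA must be set in the shell that runs it:

```bash
# hypothetical flags; check `python -m keboola_mcp_server --help`
python -m keboola_mcp_server --transport sse --api-url https://connection.YOUR_REGION.keboola.com
```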
Cursor AI should automatically detect your MCP server and enable it.
BigQuery support
If your Keboola project uses the BigQuery backend, you will need to set the GOOGLE_APPLICATION_CREDENTIALS environment variable
in addition to KBC_STORAGE_TOKEN and KBC_WORKSPACE_SCHEMA.
Go to your Keboola BigQuery workspace and display its credentials (click the Connect button).
Download the credentials file to your local disk. It is a plain JSON file.
Set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the full path of the downloaded JSON credentials file.
This will give your MCP server instance permissions to access your BigQuery workspace in Google Cloud.
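For example, in the shell that launches the server (or in the env section of your MCP client configuration):

```bash
export GOOGLE_APPLICATION_CREDENTIALS="/full/path/to/credentials.json"
```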
Available Tools
The server provides the following tools for interacting with Keboola Connection: