A Model Context Protocol (MCP) server that provides read-only access to the Hugging Face Hub APIs. This server allows LLMs like Claude to interact with Hugging Face's models, datasets, spaces, papers, and collections.
Components
Resources
The server exposes popular Hugging Face resources:
Custom hf:// URI scheme for accessing resources
Models with hf://model/{model_id} URIs
Datasets with hf://dataset/{dataset_id} URIs
Spaces with hf://space/{space_id} URIs
All resources have descriptive names and a JSON content type
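For example, the model `bert-base-uncased` would be read via `hf://model/bert-base-uncased`, and the dataset `squad` via `hf://dataset/squad` (these IDs are illustrative; any public Hub ID follows the same pattern).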
Prompts
The server provides two prompt templates:
compare-models: Generates a comparison between multiple Hugging Face models
summarize-paper: Summarizes a research paper from Hugging Face
Tools
The server implements several tool categories:
Model Tools
Dataset Tools
Space Tools
Paper Tools
Collection Tools
Configuration
The server requires no configuration, but supports optional Hugging Face authentication:
Set the HF_TOKEN environment variable to your Hugging Face API token for higher API rate limits
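For example, the token can be exported in the shell that launches the server; a minimal sketch (the token value shown is a placeholder):

```sh
# Placeholder token; create a real one at https://huggingface.co/settings/tokens
export HF_TOKEN=hf_xxxxxxxxxxxxxxxxxxxx
```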
Quickstart
Install
Installing via Smithery
To install huggingface-mcp-server for Claude Desktop automatically via Smithery:
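A typical Smithery CLI invocation is sketched below; the package name is assumed from this project's name, so confirm it against the Smithery listing:

```sh
# Install the server into Claude Desktop via Smithery
# (package name assumed; check smithery.ai for the exact listing)
npx -y @smithery/cli install huggingface-mcp-server --client claude
```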
Claude Desktop
On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
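Add an entry under mcpServers in that file. A minimal sketch, assuming the server is run with uv from a local checkout (adjust the directory path and entry-point name to match your installation):

```json
{
  "mcpServers": {
    "huggingface": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/huggingface-mcp-server",
        "run",
        "huggingface-mcp-server"
      ]
    }
  }
}
```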
Development
Building and Publishing
To prepare the package for distribution:
Sync dependencies and update lockfile:
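Assuming the project is managed with uv (the publish step below references UV_PUBLISH_* variables), this is typically:

```sh
uv sync
```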
Build package distributions:
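With uv, the build step is:

```sh
uv build
```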
This will create source and wheel distributions in the dist/ directory.
Publish to PyPI:
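Again with uv:

```sh
uv publish
```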
Note: You'll need to set PyPI credentials via environment variables or command flags:
Token: --token or UV_PUBLISH_TOKEN
Or username/password: --username/UV_PUBLISH_USERNAME and --password/UV_PUBLISH_PASSWORD
Debugging
Since MCP servers run over stdio, debugging can be challenging. For the best debugging
experience, we strongly recommend using the MCP Inspector.
You can launch the MCP Inspector via `npm` with this command:
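A typical invocation, assuming the server lives in a local checkout and is run with uv (adjust the directory and entry-point name to your setup):

```sh
npx @modelcontextprotocol/inspector \
  uv --directory /path/to/huggingface-mcp-server run huggingface-mcp-server
```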
Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.
Example Prompts for Claude
When using this server with Claude, try these example prompts:
"Search for BERT models on Hugging Face with less than 100 million parameters"
"Find the most popular datasets for text classification on Hugging Face"
"What are today's featured AI research papers on Hugging Face?"
"Summarize the paper with arXiv ID 2307.09288 using the Hugging Face MCP server"
"Compare the Llama-3-8B and Mistral-7B models from Hugging Face"
"Show me the most popular Gradio spaces for image generation"
"Find collections created by TheBloke that include Mixtral models"
Troubleshooting
If you encounter issues with the server:
Check server logs in Claude Desktop:
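On macOS the logs usually live under ~/Library/Logs/Claude/; a quick way to follow the MCP-related ones is:

```sh
# Follow MCP server logs (default macOS location; on Windows they are typically under %APPDATA%\Claude\logs)
tail -n 20 -f ~/Library/Logs/Claude/mcp*.log
```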
For API rate limiting errors, consider adding a Hugging Face API token
Make sure your machine has internet connectivity to reach the Hugging Face API
If a particular tool is failing, try accessing the same data through the Hugging Face website to verify it exists