MongoDB Lens
Contents
- Quick Start
- Features
- Installation
- Configuration
- Client Setup
- Data Protection
- Tutorial
- Test Suite
- Disclaimer
- Support
Quick Start
- Install MongoDB Lens
- Configure MongoDB Lens
- Set up your MCP Client (e.g. Claude Desktop, Cursor, etc)
- Explore your MongoDB databases with natural language queries
Features
- Tools
- Resources
- Prompts
- Other
Tools
- `add-connection-alias`: Add a new MongoDB connection alias
- `aggregate-data`: Execute aggregation pipelines
- `analyze-query-patterns`: Analyze live queries and suggest optimizations
- `analyze-schema`: Automatically infer collection schemas
- `bulk-operations`: Perform multiple operations efficiently (requires confirmation for destructive operations)
- `clear-cache`: Clear memory caches to ensure fresh data
- `collation-query`: Find documents with language-specific collation rules
- `compare-schemas`: Compare schemas between two collections
- `connect-mongodb`: Connect to a different MongoDB URI
- `connect-original`: Connect back to the original MongoDB URI used at startup
- `count-documents`: Count documents matching specified criteria
- `create-collection`: Create new collections with custom options
- `create-database`: Create a new database with option to switch to it
- `create-index`: Create new indexes for performance optimization
- `create-timeseries`: Create time series collections for temporal data
- `create-user`: Create new database users with specific roles
- `current-database`: Show the current database context
- `delete-document`: Delete documents matching specified criteria (requires confirmation)
- `distinct-values`: Extract unique values for any field
- `drop-collection`: Remove collections from the database (requires confirmation)
- `drop-database`: Drop a database (requires confirmation)
- `drop-index`: Remove indexes from collections (requires confirmation)
- `drop-user`: Remove database users (requires confirmation)
- `explain-query`: Analyze query execution plans
- `export-data`: Export query results in JSON or CSV format
- `find-documents`: Run queries with filters, projections, and sorting
- `generate-schema-validator`: Generate JSON Schema validators
- `geo-query`: Perform geospatial queries with various operators
- `get-stats`: Retrieve database or collection statistics
- `gridfs-operation`: Manage large files with GridFS buckets
- `insert-document`: Insert one or more documents into collections
- `list-collections`: Explore collections in the current database
- `list-connections`: View all available MongoDB connection aliases
- `list-databases`: View all accessible databases
- `rename-collection`: Rename existing collections (requires confirmation when dropping targets)
- `shard-status`: View sharding configuration for databases and collections
- `text-search`: Perform full-text search across text-indexed fields
- `transaction`: Execute multiple operations in a single ACID transaction
- `update-document`: Update documents matching specified criteria
- `use-database`: Switch to a specific database context
- `validate-collection`: Check for data inconsistencies
- `watch-changes`: Monitor real-time changes to collections
Resources
- `collection-indexes`: Index information for a collection
- `collection-schema`: Schema information for a collection
- `collection-stats`: Performance statistics for a collection
- `collection-validation`: Validation rules for a collection
- `collections`: List of collections in the current database
- `database-triggers`: Database change streams and event triggers configuration
- `database-users`: Database users and roles in the current database
- `databases`: List of all accessible databases
- `performance-metrics`: Real-time performance metrics and profiling data
- `replica-status`: Replica set status and configuration
- `server-status`: Server status information
- `stored-functions`: Stored JavaScript functions in the current database
Prompts
- `aggregation-builder`: Step-by-step creation of aggregation pipelines
- `backup-strategy`: Customized backup and recovery recommendations
- `data-modeling`: Expert advice on MongoDB schema design for specific use cases
- `database-health-check`: Comprehensive database health assessment and recommendations
- `index-recommendation`: Get personalized index suggestions based on query patterns
- `migration-guide`: Step-by-step MongoDB version migration plans
- `mongo-shell`: Generate MongoDB shell commands with explanations
- `multi-tenant-design`: Design MongoDB multi-tenant database architecture
- `query-builder`: Interactive guidance for constructing MongoDB queries
- `query-optimizer`: Optimization recommendations for slow queries
- `schema-analysis`: Detailed collection schema analysis with recommendations
- `schema-versioning`: Manage schema evolution in MongoDB applications
- `security-audit`: Database security analysis and improvement recommendations
- `sql-to-mongodb`: Convert SQL queries to MongoDB aggregation pipelines
Other Features
- Overview
- New Database Metadata
Other Features: Overview
- [Config File](#configuration-config-file): Custom configuration via `~/.mongodb-lens.[jsonc|json]`
- [Env Var Overrides](#configuration-environment-variable-overrides): Override config settings via `process.env.CONFIG_*`
- [Confirmation System](#data-protection-confirmation-for-destructive-operations): Two-step verification for destructive operations
- [Multiple Connections](#configuration-multiple-mongodb-connections): Define and switch between named URI aliases
- [Component Disabling](#disabling-tools): Selectively disable tools, prompts or resources
- Connection Resilience: Auto-reconnection with exponential backoff
- Query Safeguards: Configurable limits and performance protections
- Error Handling: Comprehensive JSONRPC error codes and messages
- Schema Inference: Efficient schema analysis with intelligent sampling
- Credential Protection: Connection string password obfuscation in logs
- Memory Management: Auto-monitoring and cleanup for large operations
- Smart Caching: Optimized caching for schema, indexes, fields and collections
- Backwards Compatible: Support both modern and legacy MongoDB versions
Other Features: New Database Metadata
When MongoDB Lens creates a new database, it inserts a `metadata` collection into it. This collection stores a single document of contextual information, serving as a permanent record of the database's origin while ensuring the new and otherwise empty database persists in MongoDB's storage system. Once you no longer need it, you can remove the `metadata` collection via the `drop-collection` tool:
- "Drop the new database's metadata collection"<br>
  <sup> Uses `drop-collection` tool (with confirmation)</sup>
Installation
- NPX (Easiest)
- Docker Hub
- Node.js from Source
- Docker from Source
- Installation Verification
- Older MongoDB Versions
Installation: NPX
[!TIP]<br> If you encounter permissions errors with `npx`, try running `npx clear-npx-cache` prior to running `npx -y mongodb-lens` (this clears the cache and re-downloads the package).
Installation: Docker Hub
[!NOTE]<br> Docker Hub requires Docker installed and running on your system.
Installation: Node.js from Source
- Clone the MongoDB Lens repository:<br>
- Navigate to the cloned repository directory:<br>
- Ensure Node.js is installed:<br>
- Install Node.js dependencies:<br>
- Start the server:<br>
Installation: Docker from Source
[!NOTE]<br> Docker from source requires Docker installed and running on your system.
- Clone the MongoDB Lens repository:<br>
- Navigate to the cloned repository directory:<br>
- Ensure Docker is installed:<br>
- Build the Docker image:<br>
- Run the container:<br>
Installation Verification
Installation: Older MongoDB Versions
If your MongoDB deployment is version `< 4.0`, the MongoDB Node.js driver used by the latest version of MongoDB Lens will not be compatible: driver versions `4.0.0` and above require MongoDB `4.0` or higher. For older deployments, use a driver from the `3.x` series (e.g. `3.7.4`, which is compatible with MongoDB `3.6`).
Older MongoDB Versions: Running from Source
- Clone the MongoDB Lens repository:<br>
- Navigate to the cloned repository directory:<br>
- Modify `package.json`:<br>
- Install Node.js dependencies:<br>
- Start MongoDB Lens:<br>
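The `package.json` modification pins the MongoDB driver to the `3.x` series. A sketch of the relevant dependency entry, assuming the driver dependency is the standard `mongodb` package:

```json
{
  "dependencies": {
    "mongodb": "3.7.4"
  }
}
```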
[!NOTE]<br> You may also need to revert this commit to add back the `useNewUrlParser` and `useUnifiedTopology` MongoDB configuration options.
Older MongoDB Versions: Using NPX or Docker
Configuration
- MongoDB Connection String
- Config File
- Config File Generation
- Multiple MongoDB Connections
- Environment Variable Overrides
- Cross-Platform Environment Variables
Configuration: MongoDB Connection String
- Local connection:<br>
mongodb://localhost:27017
- Connection to `mydatabase` with credentials from the `admin` database:<br>
  `mongodb://username:password@hostname:27017/mydatabase?authSource=admin`
- Connection to `mydatabase` with various other options:<br>
  `mongodb://hostname:27017/mydatabase?retryWrites=true&w=majority`
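As a sketch, a connection string like the examples above can be assembled programmatically; note that special characters in credentials must be percent-encoded (hostname, credentials, and database name below are placeholders):

```javascript
// Assemble a MongoDB connection string from its parts.
const user = encodeURIComponent('username')
const pass = encodeURIComponent('p@ssw0rd') // '@' must be percent-encoded
const options = new URLSearchParams({ authSource: 'admin', retryWrites: 'true' })
const uri = `mongodb://${user}:${pass}@hostname:27017/mydatabase?${options}`
console.log(uri)
```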
Configuration: Config File
[!NOTE]<br> The config file is optional. MongoDB Lens will run with default settings if no config file is provided.
[!TIP]<br> You only need to include the settings you want to customize in the config file. MongoDB Lens will use default settings for any omitted values.
[!TIP]<br> MongoDB Lens supports both `.json` and `.jsonc` (JSON with comments) config file formats.
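A minimal `~/.mongodb-lens.jsonc` overriding only the connection string might look like this:

```jsonc
{
  // Only include the settings you want to customize
  "mongoUri": "mongodb://localhost:27017"
}
```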
MongoDB Lens looks for `~/.mongodb-lens.jsonc` first, then falls back to `~/.mongodb-lens.json` if the former doesn't exist. To use a custom config file location, set the `CONFIG_PATH` environment variable to the desired file path.
Configuration: Config File Generation
To generate a config file pre-populated with default settings, run the `config:create` script; by default the generated file is written to `~/.mongodb-lens.jsonc`.
Config File Generation: Custom Path
To generate the config file at a custom location, set the `CONFIG_PATH` environment variable.
- If `CONFIG_PATH` has no file extension, it's treated as a directory and `.mongodb-lens.jsonc` is appended.
- If `CONFIG_PATH` ends with `.json` (not `.jsonc`), comments are removed from the generated file.
Configuration: Multiple MongoDB Connections
To define multiple connections, set the `mongoUri` config setting to an object of alias-URI pairs:
- The first URI in the list (e.g. `main`) becomes the default connection at startup
- You can switch connections using natural language: "Connect to backup" or "Connect to atlas"
- The original syntax still works: "Connect to mongodb://localhost:27018"
- The `list-connections` tool shows all available connection aliases
[!NOTE]<br> When using the command-line argument to specify a connection, you can use either a full MongoDB URI or an alias defined in your configuration file.
[!TIP]<br> To add connection aliases at runtime, use the `add-connection-alias` tool.
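For example, the alias form of `mongoUri` might look like this (the URIs are placeholders; `main`, `backup`, and `atlas` are the aliases used in the examples above):

```jsonc
{
  "mongoUri": {
    "main": "mongodb://localhost:27017",   // first entry becomes the default at startup
    "backup": "mongodb://localhost:27018",
    "atlas": "mongodb+srv://user:pass@cluster0.example.mongodb.net/mydatabase"
  }
}
```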
Configuration: Environment Variable Overrides
Each config file setting can be overridden with a corresponding `CONFIG_`-prefixed environment variable, where the setting's camelCase name maps to `SCREAMING_SNAKE_CASE` (e.g. `mongoUri` → `CONFIG_MONGO_URI`, `disableDestructiveOperationTokens` → `CONFIG_DISABLE_DESTRUCTIVE_OPERATION_TOKENS`).
- For boolean settings, use string values `'true'` or `'false'`.
- For numeric settings, use string representations.
- For nested objects or arrays, use JSON strings.
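This is not the actual MongoDB Lens implementation, but a sketch of how such `CONFIG_*` overrides can be applied, following the rules above:

```javascript
// Merge CONFIG_* environment variables over a config object.
function applyEnvOverrides(config, env) {
  const out = { ...config }
  for (const [key, value] of Object.entries(env)) {
    if (!key.startsWith('CONFIG_')) continue
    // e.g. CONFIG_MONGO_URI -> mongoUri
    const setting = key
      .slice('CONFIG_'.length)
      .toLowerCase()
      .replace(/_([a-z0-9])/g, (_, c) => c.toUpperCase())
    try {
      out[setting] = JSON.parse(value) // booleans, numbers, JSON objects/arrays
    } catch {
      out[setting] = value // plain strings (e.g. connection URIs)
    }
  }
  return out
}

const merged = applyEnvOverrides(
  { mongoUri: 'mongodb://localhost:27017' },
  {
    CONFIG_MONGO_URI: 'mongodb://localhost:27018',
    CONFIG_DISABLE_DESTRUCTIVE_OPERATION_TOKENS: 'true'
  }
)
console.log(merged)
```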
Configuration: Cross-Platform Environment Variables
To set environment variables consistently across Windows, macOS, and Linux, use `cross-env`:
- Install `cross-env` globally:<br>
- Prefix any NPX or Node.js environment variables in this document's examples with `cross-env`:<br>
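For example (the environment variable and value shown are illustrative):

```shell
npm install -g cross-env
cross-env CONFIG_MONGO_URI='mongodb://localhost:27017' npx -y mongodb-lens
```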
Client Setup
- Claude Desktop
- MCP Inspector
- Other MCP Clients
Client Setup: Claude Desktop
- Install Claude Desktop
- Open `claude_desktop_config.json` (create it if it doesn't exist):
- Add the MongoDB Lens server configuration as per configuration options
- Restart Claude Desktop
- Start a conversation with Claude about your MongoDB data
Claude Desktop Configuration Options
- Option 1: NPX (Recommended)
- Option 2: Docker Hub Image
- Option 3: Local Node.js Installation
- Option 4: Local Docker Image
- Replace `mongodb://your-connection-string` with your MongoDB connection string, or omit it to use the default `mongodb://localhost:27017`.
- To use a custom config file, set the `CONFIG_PATH` environment variable.
- To include environment variables:
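As a sketch, an NPX-based entry in `claude_desktop_config.json` might look like this (the connection string and `CONFIG_PATH` value are placeholders):

```json
{
  "mcpServers": {
    "mongodb-lens": {
      "command": "npx",
      "args": ["-y", "mongodb-lens", "mongodb://localhost:27017"],
      "env": {
        "CONFIG_PATH": "/path/to/custom/config.json"
      }
    }
  }
}
```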
Client Setup: MCP Inspector
[!NOTE]<br> MCP Inspector starts a proxy server on port 3000 and web client on port 5173.
- Run MCP Inspector:<br>
- Open MCP Inspector: http://localhost:5173
Client Setup: Other MCP Clients
Data Protection
- Read-Only User Accounts
- Working with Database Backups
- Data Flow Considerations
- Confirmation for Destructive Operations
- Disabling Destructive Operations
Data Protection: Read-Only User Accounts
To limit MongoDB Lens to read-only access, connect it as a MongoDB user with the built-in `read` role scoped to the database(s) you're targeting; in the MongoDB shell, you'd create such a user with `db.createUser`, granting it only the `read` role on the target database.
Data Protection: Working with Database Backups
To explore a backup copy instead of your live data, first dump your database using `mongodump`. Next, spin up a fresh MongoDB instance (e.g. on a different port like `27018`) and restore the backup there using `mongorestore`. Once it's running, point MongoDB Lens at the backup instance's connection string (e.g. `mongodb://localhost:27018/mydatabase`).
Data Protection: Data Flow Considerations
- How Your Data Flows Through the System
- Protecting Sensitive Data with Projection
- Connection Aliases and Passwords
- Local Setup for Maximum Safety
Data Flow Considerations: How Your Data Flows Through the System
[!NOTE]<br> While this example uses a local MongoDB instance, the same principles apply to remote MongoDB instances.
- You submit a request<br><sup> e.g. "Show me all users older than 30"</sup>
- Your client sends the request to the remote LLM<br><sup> The LLM provider receives your exact words along with a list of available MCP tools and their parameters.</sup>
- The remote LLM interprets your request<br><sup> It determines your intent and instructs the client to use a specific MCP tool with appropriate parameters.</sup>
- The client asks MongoDB Lens to run the tool<br><sup> This occurs locally on your machine via stdio.</sup>
- MongoDB Lens queries your MongoDB database
- MongoDB Lens retrieves your MongoDB query results
- MongoDB Lens sends the data back to the client<br><sup> The client receives results formatted by MongoDB Lens.</sup>
- The client forwards the data to the remote LLM<br><sup> The LLM provider sees the exact data returned by MongoDB Lens.</sup>
- The remote LLM processes the data<br><sup> It may summarize or format the results further.</sup>
- The remote LLM sends the final response to the client<br><sup> The client displays the answer to you.</sup>
Data Flow Considerations: Protecting Sensitive Data with Projection
When retrieving data with tools like `find-documents`, `aggregate-data`, or `export-data`, use projection to specify which fields to include or exclude in query results, ensuring sensitive information stays local.
- "Show me all users older than 30, but use projection to hide their passwords."<br>
  <sup> Uses `find-documents` tool with projection</sup>
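Under the hood, a request like the one above might translate into a `find-documents` call combining a filter with an exclusion projection (the field names are illustrative; the syntax is standard MongoDB projection):

```json
{
  "filter": { "age": { "$gt": 30 } },
  "projection": { "password": 0 }
}
```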
Data Flow Considerations: Connection Aliases and Passwords
When using the `add-connection-alias` tool, avoid adding aliases for URIs that contain passwords if you're using a remote LLM provider. Since your request is sent to the LLM, any password in the URI could be exposed. Instead, define password-bearing aliases in the MongoDB Lens config file, where they remain local and are not transmitted to the LLM.
Data Flow Considerations: Local Setup for Maximum Safety
Data Protection: Confirmation for Destructive Operations
- First tool invocation: Returns a 4-digit confirmation token that expires after 5 minutes
- Second tool invocation: Executes the operation if provided with the valid token
- `drop-user`: Remove a database user
- `drop-index`: Remove an index (potential performance impact)
- `drop-database`: Permanently delete a database
- `drop-collection`: Delete a collection and all its documents
- `delete-document`: Delete one or multiple documents
- `bulk-operations`: When including delete operations
- `rename-collection`: When the target collection exists and will be dropped
[!NOTE]<br> If you're working in a controlled environment where data loss is acceptable, you can configure MongoDB Lens to bypass confirmation and perform destructive operations immediately.
Bypassing Confirmation for Destructive Operations
Set `CONFIG_DISABLE_DESTRUCTIVE_OPERATION_TOKENS` to `true` to execute destructive operations immediately without confirmation:
[!WARNING]<br> Disabling confirmation tokens removes an important safety mechanism. It's strongly recommended to only use this option in controlled environments where data loss is acceptable, such as development or testing. Disable at your own risk.
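For example, with the NPX installation method (adapt the invocation to however you run MongoDB Lens):

```shell
CONFIG_DISABLE_DESTRUCTIVE_OPERATION_TOKENS='true' npx -y mongodb-lens
```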
Data Protection: Disabling Destructive Operations
- Disabling Tools
- High-Risk Tools
- Medium-Risk Tools