
Created Apr 22, 2025

MemGPT MCP Server

A TypeScript-based MCP server that implements a memory system for LLMs. It provides tools for chatting with different LLM providers while maintaining conversation history.

Features

Tools

  • chat - Send a message to the current LLM provider
  • get_memory - Retrieve conversation history
  • clear_memory - Clear conversation history
  • use_provider - Switch between different LLM providers
  • use_model - Switch to a different model for the current provider
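As a sketch of how a client invokes one of these tools, here is an MCP tools/call request for the chat tool over JSON-RPC; the argument name ("message") is an assumption and may differ in the actual schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "chat",
    "arguments": { "message": "Hello!" }
  }
}
```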

Development

Install dependencies:
Build the server:
For development with auto-rebuild:
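The steps above typically map to npm scripts like the following for a TypeScript MCP server; check package.json for the actual script names:

```shell
# Assumed script names for a typical TypeScript MCP server.
npm install        # install dependencies
npm run build      # compile TypeScript and build the server
npm run watch      # development with auto-rebuild
```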

Installation

To use with Claude Desktop, add the server config:
On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
On Windows: %APPADATA%/Claude/claude_desktop_config.json
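A minimal config entry might look like the following; the server key ("memgpt") and the build output path are assumptions, so adjust them to match your checkout:

```json
{
  "mcpServers": {
    "memgpt": {
      "command": "node",
      "args": ["/path/to/memgpt-server/build/index.js"],
      "env": {
        "OPENAI_API_KEY": "sk-...",
        "ANTHROPIC_API_KEY": "sk-ant-..."
      }
    }
  }
}
```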

Environment Variables

  • OPENAI_API_KEY - Your OpenAI API key
  • ANTHROPIC_API_KEY - Your Anthropic API key
  • OPENROUTER_API_KEY - Your OpenRouter API key

Debugging

Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector:
The Inspector will provide a URL to access debugging tools in your browser.
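The Inspector is usually launched against the built server entry point; the build path here is an assumption:

```shell
# Run the MCP Inspector against the built server.
npx @modelcontextprotocol/inspector node build/index.js
```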

Recent Updates

Claude 3 and 3.5 Series Support (March 2024)

  • Added support for the latest Claude models

Unlimited Memory Retrieval

  • Added support for retrieving unlimited conversation history
  • Use { "limit": null } with the get_memory tool to retrieve all stored memories
  • Use { "limit": n } to retrieve the n most recent memories
  • Default limit is 10 if not specified
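The limit semantics above can be sketched as follows; this is a hypothetical illustration of the behavior, not the server's actual implementation:

```typescript
// Sketch of get_memory's limit handling: null → everything,
// a number → the n most recent entries, default → 10.
type Message = { role: "user" | "assistant"; content: string };

const DEFAULT_LIMIT = 10; // per the README, the default limit is 10

function getMemory(
  history: Message[],
  limit: number | null = DEFAULT_LIMIT
): Message[] {
  // slice(-limit) keeps the most recent `limit` entries.
  return limit === null ? [...history] : history.slice(-limit);
}
```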