AutoMem MCP
AutoMem MCP is an MCP client that provides persistent memory for AI assistants. By connecting to the AutoMem graph-vector memory service, MCP-compatible AI platforms such as Claude, Cursor, and GitHub Copilot can remember a user's decisions, coding style, and preferences across sessions and devices, enabling consistent, personalized interaction.
Rating: 2.5
Downloads: 5.7K
What is AutoMem MCP?
AutoMem MCP is a Model Context Protocol server that connects your AI assistant to an advanced memory system. Traditional AI conversations start from scratch each time; AutoMem lets the AI remember your coding style, decision-making logic, and project preferences, maintaining consistency across all conversations. It acts as a long-term memory for the AI, storing and retrieving information through a graph-vector architecture so your assistant can genuinely learn your work habits and thinking patterns.
How to use AutoMem MCP?
Using AutoMem is straightforward:
1. Deploy the AutoMem memory service (locally or in the cloud).
2. Install the MCP client in your AI tool.
3. Configure the connection parameters.
4. Start a conversation; the AI automatically remembers and learns.
It supports all MCP-compatible platforms, including Claude Desktop, Cursor IDE, Claude Code, ChatGPT, and GitHub Copilot.
Use cases
• Software development: let AI remember your coding conventions and project architecture
• Decision support: personalized suggestions based on your decision history
• Team collaboration: share team preferences and best practices
• Learning assistant: track learning progress and knowledge acquisition over the long term
• Personalized AI: build an intelligent assistant that truly understands your habits
Main features
Persistent memory
AI memory is permanently saved and synchronized across sessions and devices. There's no need to repeatedly explain the same concepts, and the AI always remembers your preferences and decisions.
Graph-vector architecture
Combines the relational modeling of graph databases with the semantic search of vector embeddings, implementing 11 types of memory relationships to support multi-hop reasoning and context-aware retrieval.
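A toy sketch of this idea (invented data and function names, not AutoMem's actual code): rank memories by vector similarity, then expand the top hits along graph edges to pull in related memories.

```python
import math

# Toy memory store: each memory has an embedding vector and text.
memories = {
    "m1": {"vec": [1.0, 0.0], "text": "prefers snake_case naming"},
    "m2": {"vec": [0.0, 1.0], "text": "project uses FastAPI"},
    "m3": {"vec": [0.6, 0.8], "text": "endpoints return JSON errors"},
}
# Graph edges between related memories (e.g. a RELATES_TO link).
edges = {"m2": ["m3"]}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def recall(query_vec, k=1, hops=1):
    # Vector step: take the k most similar memories as seeds.
    ranked = sorted(memories,
                    key=lambda m: cosine(memories[m]["vec"], query_vec),
                    reverse=True)
    found = set(ranked[:k])
    # Graph step: expand the seeds along relationship edges.
    for _ in range(hops):
        for m in list(found):
            found.update(edges.get(m, []))
    return found

print(recall([0.0, 1.0]))  # FastAPI memory plus its linked neighbor
```

The vector step finds what the query is about; the graph step brings in memories that are linked to the match even when they are not semantically similar to the query.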
Cross - platform support
Supports all major AI platforms: Claude Desktop, Cursor IDE, Claude Code, ChatGPT, GitHub Copilot, OpenAI Codex, etc.
Multi - hop reasoning
Answers complex questions, such as 'What is the occupation of Amanda's sister?', by expanding entities and following chains of related memories.
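As a toy illustration (all names invented), multi-hop reasoning amounts to following relation edges through an entity graph:

```python
# Toy entity graph: (entity, relation) -> entity.
graph = {
    ("Amanda", "sister"): "Beth",
    ("Beth", "occupation"): "architect",
}

def multi_hop(start, relations):
    """Follow a chain of relations from a starting entity."""
    node = start
    for rel in relations:
        node = graph[(node, rel)]
    return node

print(multi_hop("Amanda", ["sister", "occupation"]))  # architect
```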
Context - aware coding
Based on the current programming language, project type, and file content, it intelligently recalls the most relevant coding patterns and best practices.
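A minimal sketch of context-aware recall, assuming memories carry tags for language and project type (the tags and scoring are invented for illustration):

```python
# Memories tagged with language / project hints.
memories = [
    {"text": "wrap fallible calls in Result and map errors", "tags": {"rust", "backend"}},
    {"text": "prefer list comprehensions over map/filter", "tags": {"python"}},
    {"text": "use conventional commit messages", "tags": {"git"}},
]

def recall_for_context(context_tags):
    # Score each memory by how many context tags it shares; return the best match.
    def score(m):
        return len(m["tags"] & context_tags)
    return max(memories, key=score)

# Editing a Python file in a scripts folder:
print(recall_for_context({"python", "scripts"})["text"])
```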
Remote MCP support
Supports remote connection via HTTP/SSE, enabling ChatGPT, Claude Web, mobile devices, etc. to use the AutoMem memory system.
Advantages
True long - term memory: Memory is permanently saved without session limitations
Cross - platform consistency: Use the same memory across all AI tools
Personalized experience: AI learns your unique style and preferences
Research-informed: builds on recent memory research such as HippoRAG 2
Open - source and self - hosted: Full control over data at low cost
Easy to deploy: Supports multiple deployment methods such as local, Railway, and Docker
Limitations
Requires additional deployment of the memory service (but simple deployment options are provided)
Initial setup requires technical configuration (a wizard is provided to simplify the process)
Memory quality depends on usage frequency and patterns
Requires AI platforms to support the MCP protocol
How to use
Deploy the memory service
Choose a deployment method: local development (free) or Railway cloud (recommended for production). Local deployment only requires running make dev; cloud deployment is one click.
Install the MCP client
Choose the installation method for your AI platform. For Claude Desktop, download the .mcpb bundle; for other platforms, use the npx command.
Configure the connection
Set the AutoMem service endpoint (http://localhost:8001 for local deployments) and the API key (required for cloud). The setup wizard generates the configuration automatically.
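For illustration, a Claude Desktop entry might look like the following sketch. The mcpServers layout is Claude Desktop's standard MCP configuration format, but the package name and environment variable names here are placeholders, not AutoMem's documented values; check the installation guide for the real ones.

```json
{
  "mcpServers": {
    "automem": {
      "command": "npx",
      "args": ["-y", "automem-mcp-client"],
      "env": {
        "AUTOMEM_ENDPOINT": "http://localhost:8001",
        "AUTOMEM_API_KEY": "<your-api-key>"
      }
    }
  }
}
```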
Start using
In AI conversations, memories will be automatically stored and recalled. You can also manually call memory operations.
Usage examples
Learn personal coding style
AI learns your coding preferences through multiple code reviews, such as error - handling patterns, naming conventions, and code structures.
Technical decision support
When facing technical selections, AI provides suggestions that match your thinking pattern based on your past decision records and reasons.
Project context memory
AI remembers the project's architectural decisions, technical debts, team agreements, etc., providing consistent context - aware assistance.
Multi - hop reasoning query
Answer complex questions through entity relationship chains, demonstrating the powerful reasoning ability of graph memory.
Frequently Asked Questions
Is AutoMem MCP free?
Is my data secure?
Which AI platforms are supported?
Do I need programming knowledge to set it up?
How much storage space does the memory take up?
How can I delete or modify memories?
Does it support team collaboration?
What if the memory service goes down?
Related resources
GitHub repository
MCP client source code and documentation
AutoMem service repository
Memory backend service, including deployment guide
Installation guide
Detailed platform installation and configuration instructions
Discord community
Get help, share feedback, and join the community
NPM package
MCP client NPM package page
Railway deployment
One-click cloud deployment of the memory service
Research paper
HippoRAG 2 paper (basis of the graph-vector memory architecture)

Notion Api MCP
Certified
A Python-based MCP Server that provides advanced to-do list management and content organization functions through the Notion API, enabling seamless integration between AI models and Notion.
Python
21.5K
4.5 points

Markdownify MCP
Markdownify is a multi-functional file conversion service that supports converting multiple formats such as PDFs, images, audio, and web page content into Markdown format.
TypeScript
34.5K
5 points

Gitlab MCP Server
Certified
The GitLab MCP server is a project based on the Model Context Protocol that provides a comprehensive toolset for interacting with GitLab accounts, including code review, merge request management, CI/CD configuration, and other functions.
TypeScript
24.7K
4.3 points

Duckduckgo MCP Server
Certified
The DuckDuckGo Search MCP Server provides web search and content scraping services for LLMs such as Claude.
Python
72.5K
4.3 points

Figma Context MCP
Framelink Figma MCP Server is a server that provides access to Figma design data for AI programming tools (such as Cursor). By simplifying the Figma API response, it helps AI more accurately achieve one-click conversion from design to code.
TypeScript
64.6K
4.5 points

Unity
Certified
UnityMCP is a Unity editor plugin that implements the Model Context Protocol (MCP), providing seamless integration between Unity and AI assistants, including real-time state monitoring, remote command execution, and logging.
C#
32.3K
5 points

Minimax MCP Server
The MiniMax Model Context Protocol (MCP) is an official server that supports interaction with powerful text-to-speech, video/image generation APIs, and is suitable for various client tools such as Claude Desktop and Cursor.
Python
49.3K
4.8 points

Gmail MCP Server
A Gmail automatic authentication MCP server designed for Claude Desktop, supporting Gmail management through natural language interaction, including complete functions such as sending emails, label management, and batch operations.
TypeScript
21.1K
4.5 points






