MCP Neuralmemory
MCP-KG-Memory is an MCP server with a long-term memory layer built on a knowledge graph, designed to solve the context-forgetting problem of AI programming assistants. It persistently stores project goals, constraints, strategies, and user preferences in Neo4j, enables semantic retrieval and active learning, and gives the AI assistant continuous memory and context awareness.
Rating: 2.5 points
Downloads: 6.8K
What is MCP-KG-Memory?
MCP-KG-Memory is a long-term memory system based on a knowledge graph, specifically designed for AI programming assistants (such as Cursor, Windsurf, Antigravity, etc.). It solves the problem of AI assistants starting 'from scratch' in each conversation. By persistently storing project-related context information, the AI can remember:
• Project goals and progress status
• Architectural decisions and coding constraints
• Past successful strategies and failed experiences
• The user's coding preferences and styles
• Semantic relationships between code files
It's like a 'brain' for the AI, enabling the assistant to make more intelligent decisions based on historical experience.
How to use MCP-KG-Memory?
Using MCP-KG-Memory is simple:
1. Install and configure the server
2. Add the MCP configuration in your AI editor
3. The AI assistant will automatically use the memory tools
4. When a conversation starts, the AI queries relevant historical context
5. When code is modified, the system automatically tracks the change and updates the knowledge graph
The entire process is transparent to the user: just talk to the AI assistant as usual, and it will automatically use the memory system to provide more accurate assistance.
Applicable scenarios
MCP-KG-Memory is particularly suitable for the following scenarios:
• Long-term project development: requires the AI to remember decisions made weeks or even months ago
• Team collaboration: ensures the AI understands the overall architecture and constraints of the project
• Complex refactoring: avoids repeating past mistakes and reuses successful strategies
• Personalized coding: allows the AI to learn and adapt to your coding style preferences
• New member onboarding: enables quick understanding of the project's history and context
Main features
Active context injection (kg_autopilot)
Every time a new task starts, the AI will automatically query the memory system to obtain relevant project goals, historical decisions, successful strategies, and failed experiences, ensuring that it works based on the complete context.
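As a rough illustration, the context bundle assembled at task start might look like the sketch below. The tool name kg_autopilot comes from the docs above, but the data shape and matching logic here are purely illustrative assumptions, not the project's actual implementation:

```python
# Illustrative sketch of a kg_autopilot-style context query.
# The payload shape and keyword matching are assumptions.

def kg_autopilot(task: str, memory: dict) -> dict:
    """Collect goals, decisions, and failures relevant to `task`."""
    words = set(task.lower().split())

    def relevant(items):
        # Keep any memory item that shares a word with the task description.
        return [i for i in items if any(w in i.lower() for w in words)]

    return {
        "goals": relevant(memory["goals"]),
        "decisions": relevant(memory["decisions"]),
        "failed_strategies": relevant(memory["failures"]),
    }

memory = {
    "goals": ["Ship the auth module by Q3", "Migrate CI to GitHub Actions"],
    "decisions": ["Use JWT for auth tokens"],
    "failures": ["In-memory auth sessions lost state on restart"],
}

context = kg_autopilot("auth refactor", memory)
print(context["decisions"])  # ['Use JWT for auth tokens']
```

In the real system, this query runs against the Neo4j graph rather than an in-memory dictionary, but the idea is the same: every new task starts with relevant history already loaded.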
Semantic graph search
This is not just keyword matching: retrieval traverses the knowledge graph's relationship network. For example, searching for 'authentication' also returns all related nodes, such as user models, JWT utilities, and security constraints.
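The expansion from one keyword hit to its related nodes can be sketched as a bounded graph traversal. This is a minimal toy model, not the project's actual retrieval code; the node names mirror the 'authentication' example above:

```python
# Toy sketch: expand a keyword hit into related nodes by following
# relationship edges, up to a fixed number of hops.

from collections import deque

# Toy graph: node -> list of directly related nodes.
graph = {
    "authentication": ["user model", "JWT utility"],
    "user model": ["database schema"],
    "JWT utility": ["security constraints"],
    "security constraints": [],
    "database schema": [],
}

def graph_search(start: str, depth: int = 2) -> set:
    """Return every node reachable from `start` within `depth` hops."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, d = queue.popleft()
        if d == depth:
            continue  # do not expand past the hop limit
        for neighbor in graph.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append((neighbor, d + 1))
    return seen

print(sorted(graph_search("authentication")))
```

In the real system the traversal is a Cypher query against Neo4j, but the effect is the same: one search term pulls in the whole related neighborhood.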
Strategic learning ability
The system will learn from conversations:
• Implicit learning: Infer your work patterns and strategies
• Result tracking: Record which approaches succeeded and which failed
• Pattern recognition: Discover commonly used coding patterns and architectural decisions
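The "result tracking" idea above can be sketched as a simple outcome ledger. The class and method names here are illustrative assumptions, not the project's API:

```python
# Illustrative sketch of result tracking: record each strategy's
# outcomes so historically successful approaches can be preferred.

from collections import defaultdict

class StrategyTracker:
    def __init__(self):
        # strategy name -> list of outcomes (True = success)
        self.outcomes = defaultdict(list)

    def record(self, strategy: str, success: bool) -> None:
        self.outcomes[strategy].append(success)

    def success_rate(self, strategy: str) -> float:
        results = self.outcomes[strategy]
        return sum(results) / len(results) if results else 0.0

tracker = StrategyTracker()
tracker.record("incremental refactor", True)
tracker.record("incremental refactor", True)
tracker.record("big-bang rewrite", False)
print(tracker.success_rate("incremental refactor"))  # 1.0
```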
Automatic change tracking (kg_track_changes)
When the AI assistant modifies code files, the system will automatically record these changes and link them to relevant project goals, establishing the association between code implementation and design intent.
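A minimal sketch of that linkage, where each recorded change becomes an edge between a file and a project goal. The names below are illustrative assumptions, not the actual kg_track_changes implementation:

```python
# Sketch of change tracking: each recorded file change is linked to
# the project goal it serves, so design intent stays connected to code.

from dataclasses import dataclass, field

@dataclass
class ChangeLog:
    # (file, summary, goal) records forming edges in the knowledge graph
    entries: list = field(default_factory=list)

    def track_change(self, file: str, summary: str, goal: str) -> None:
        """Record a code change and link it to a project goal."""
        self.entries.append({"file": file, "summary": summary, "goal": goal})

    def changes_for_goal(self, goal: str) -> list:
        """All files that were changed in service of `goal`."""
        return [e["file"] for e in self.entries if e["goal"] == goal]

log = ChangeLog()
log.track_change("auth/jwt.py", "added token refresh", "Ship the auth module")
log.track_change("docs/api.md", "documented endpoints", "Improve docs")
print(log.changes_for_goal("Ship the auth module"))  # ['auth/jwt.py']
```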
User preference learning
The system will learn your coding preferences, such as whether you like the SOLID principle, Clean Architecture, specific naming conventions, etc., and maintain consistency in future suggestions.
Advantages
Solves the context-forgetting problem: the AI no longer starts 'from scratch' in each conversation.
Improves efficiency: reduces the time spent repeatedly explaining the project background.
Maintains consistency: ensures the AI's suggestions comply with the project's architectural constraints.
Learning ability: the system continuously learns and improves from interactions.
Visualization support: provides a visualization interface for the knowledge graph, making the project structure easy to grasp.
Limitations
Initial configuration required: you need to set up a Neo4j database and an API key.
Learning curve: you need to understand basic knowledge-graph concepts.
Resource consumption: running the Neo4j database requires some system resources.
Privacy considerations: all project information is stored in a local database.
Editor support: the editor must support the MCP protocol.
How to use
Install MCP-KG-Memory
Install the package using pipx or pip, and run the setup wizard to complete the initial configuration.
Configure the editor
Add the kg-memory server configuration to the MCP configuration file of the AI editor.
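MCP editor configuration files are typically JSON. A sketch of what the kg-memory entry might look like follows; the command name, port, and environment variable names below are assumptions for illustration, not the project's documented values — check the project's README and your editor's MCP documentation for the exact schema:

```json
{
  "mcpServers": {
    "kg-memory": {
      "command": "kg-memory-server",
      "env": {
        "NEO4J_URI": "bolt://localhost:7687",
        "NEO4J_PASSWORD": "<your-password>"
      }
    }
  }
}
```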
Add system prompts
Add rules in the editor to let the AI assistant automatically use the memory tool.
Start using
Talk to the AI assistant as usual, and it will automatically query and update the memory system.
Usage examples
New feature development
When you start developing a new feature, the AI will automatically query relevant project goals, technical constraints, and past similar experiences.
Problem debugging
When encountering a problem, the AI can query solutions and precautions for past similar problems.
Code refactoring
When refactoring code, the AI understands the dependency relationships between modules and the reasons for historical changes.
Frequently Asked Questions
Do I need to run the Neo4j database all the time?
Is my project information secure?
Which AI editors are supported?
How can I view the stored knowledge graph?
What should I do if I want to reset the memory?
Related resources
GitHub repository
Project source code and latest updates
Model Context Protocol
Official documentation of the MCP protocol
Neo4j documentation
Guide to using the Neo4j graph database
Google AI Studio
Get the Gemini API key

Markdownify MCP
Markdownify is a multi-functional file conversion service that supports converting multiple formats such as PDFs, images, audio, and web page content into Markdown format.
TypeScript
34.2K
5 points

Gitlab MCP Server
Certified
The GitLab MCP server is a project based on the Model Context Protocol that provides a comprehensive toolset for interacting with GitLab accounts, including code review, merge request management, CI/CD configuration, and other functions.
TypeScript
24.4K
4.3 points

Duckduckgo MCP Server
Certified
The DuckDuckGo Search MCP Server provides web search and content scraping services for LLMs such as Claude.
Python
71.7K
4.3 points

Notion Api MCP
Certified
A Python-based MCP Server that provides advanced to-do list management and content organization functions through the Notion API, enabling seamless integration between AI models and Notion.
Python
20.4K
4.5 points

Unity
Certified
UnityMCP is a Unity editor plugin that implements the Model Context Protocol (MCP), providing seamless integration between Unity and AI assistants, including real-time state monitoring, remote command execution, and logging features.
C#
31.0K
5 points

Figma Context MCP
Framelink Figma MCP Server provides AI programming tools (such as Cursor) with access to Figma design data. By simplifying Figma API responses, it helps AI perform one-click conversion from design to code more accurately.
TypeScript
64.3K
4.5 points

Minimax MCP Server
The MiniMax Model Context Protocol (MCP) is an official server that supports interaction with powerful text-to-speech, video/image generation APIs, and is suitable for various client tools such as Claude Desktop and Cursor.
Python
47.4K
4.8 points

Gmail MCP Server
A Gmail automatic authentication MCP server designed for Claude Desktop, supporting Gmail management through natural language interaction, including complete functions such as sending emails, label management, and batch operations.
TypeScript
22.0K
4.5 points
