MCP Neuralmemory

MCP-KG-Memory is an MCP server with a knowledge-graph-based long-term memory layer that addresses the context-forgetting problem of AI programming assistants. It persistently stores project goals, constraints, strategies, and user preferences in Neo4j, supports semantic retrieval and active learning, and gives the AI assistant continuous memory and context awareness.
2.5 points
6.8K

What is MCP-KG-Memory?

MCP-KG-Memory is a knowledge-graph-based long-term memory system designed for AI programming assistants (such as Cursor, Windsurf, and Antigravity). It solves the problem of AI assistants starting 'from scratch' in every conversation. By persistently storing project-related context, the AI can remember:
• Project goals and progress status
• Architectural decisions and coding constraints
• Past successful strategies and failed attempts
• The user's coding preferences and style
• Semantic relationships between code files
It acts as a 'brain' for the AI, enabling the assistant to make better-informed decisions based on historical experience.

How to use MCP-KG-Memory?

Using MCP-KG-Memory is simple:
1. Install and configure the server
2. Add the MCP configuration to your AI editor
3. The AI assistant automatically uses the memory tools
4. At the start of a conversation, the AI queries relevant historical context
5. When code is modified, the system automatically tracks the change and updates the knowledge graph
The entire process is transparent to the user: talk to the AI assistant as usual, and it automatically uses the memory system to provide more accurate assistance.
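The steps above run over MCP's standard JSON-RPC transport. As a rough sketch, a client asking the memory server for context might send a `tools/call` request like this (the `kg_autopilot` tool name comes from the feature list; the `task` argument name is an assumption):

```python
import json

def make_tool_call(tool_name, arguments, request_id=1):
    """Build a JSON-RPC 2.0 `tools/call` request, the shape MCP clients
    send over the stdio transport."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# At the start of a task, the editor could request relevant context:
request = make_tool_call("kg_autopilot", {"task": "add JWT refresh tokens"})
print(request)
```

In practice the editor builds and sends these requests for you; this only illustrates what crosses the wire.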

Applicable scenarios

MCP-KG-Memory is particularly suitable for the following scenarios:
• Long-term project development: the AI must remember decisions made weeks or even months ago
• Team collaboration: the AI understands the project's overall architecture and constraints
• Complex refactoring: avoid repeating past mistakes and reuse successful strategies
• Personalized coding: the AI learns and adapts to your coding style preferences
• Onboarding new members: quickly understand project history and context

Main features

Active context injection (kg_autopilot)
Every time a new task starts, the AI automatically queries the memory system for relevant project goals, historical decisions, successful strategies, and failure cases, ensuring it works from complete context.
Semantic graph search
This is not mere keyword matching but intelligent retrieval through the knowledge graph's relationship network. For example, a search for 'authentication' also returns all related nodes, such as the user model, JWT utilities, and security constraints.
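The relationship-driven retrieval can be sketched as a bounded graph traversal. This toy example uses an in-memory adjacency map with assumed node names; the real server stores the graph in Neo4j and traverses relationships there:

```python
from collections import deque

# Toy knowledge graph: node -> related nodes (shape and names are
# illustrative; the real data lives in Neo4j).
GRAPH = {
    "authentication": ["user_model", "jwt_utils", "security_constraints"],
    "user_model": ["database_schema"],
    "jwt_utils": ["token_expiry_policy"],
    "security_constraints": [],
    "database_schema": [],
    "token_expiry_policy": [],
}

def related_nodes(start, max_depth=2):
    """Return every node reachable from `start` within `max_depth` hops,
    in breadth-first order."""
    seen, queue, results = {start}, deque([(start, 0)]), []
    while queue:
        node, depth = queue.popleft()
        if depth == max_depth:
            continue
        for neighbor in GRAPH.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                results.append(neighbor)
                queue.append((neighbor, depth + 1))
    return results

# A query for 'authentication' surfaces both direct and indirect context:
print(related_nodes("authentication"))
# → ['user_model', 'jwt_utils', 'security_constraints',
#    'database_schema', 'token_expiry_policy']
```

The depth bound is what keeps a query for one concept from pulling in the entire graph.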
Strategic learning ability
The system learns from conversations:
• Implicit learning: infers your work patterns and strategies
• Result tracking: records which approaches succeeded and which failed
• Pattern recognition: discovers commonly used coding patterns and architectural decisions
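The result-tracking idea reduces to recording outcomes per strategy and querying success rates later. A minimal sketch (class and method names are assumptions, not the server's actual API):

```python
from collections import defaultdict

class StrategyTracker:
    """Minimal sketch of result tracking: record whether a strategy
    succeeded and report a per-strategy success rate."""

    def __init__(self):
        self.outcomes = defaultdict(lambda: {"success": 0, "failure": 0})

    def record(self, strategy, succeeded):
        key = "success" if succeeded else "failure"
        self.outcomes[strategy][key] += 1

    def success_rate(self, strategy):
        counts = self.outcomes[strategy]
        total = counts["success"] + counts["failure"]
        return counts["success"] / total if total else 0.0

tracker = StrategyTracker()
tracker.record("incremental refactor", True)
tracker.record("incremental refactor", True)
tracker.record("big-bang rewrite", False)
print(tracker.success_rate("incremental refactor"))  # → 1.0
print(tracker.success_rate("big-bang rewrite"))      # → 0.0
```

The real system would persist these outcomes as graph nodes so the rates survive across sessions.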
Automatic change tracking (kg_track_changes)
When the AI assistant modifies code files, the system automatically records the changes and links them to the relevant project goals, establishing the association between code implementation and design intent.
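The essence of change tracking is a record linking a file edit to the goal it serves. This sketch uses an in-memory list and hypothetical field names; the real server persists such records as Neo4j nodes and relationships:

```python
import datetime

# Assumed record shape for illustration only.
changes = []

def track_change(file_path, summary, goal):
    """Record a code change and link it to the project goal it serves."""
    entry = {
        "file": file_path,
        "summary": summary,
        "goal": goal,
        "at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }
    changes.append(entry)
    return entry

def changes_for_goal(goal):
    """Answer the reverse question: which edits served this goal?"""
    return [c for c in changes if c["goal"] == goal]

track_change("auth/jwt.py", "add refresh-token rotation", "harden authentication")
track_change("auth/session.py", "shorten session TTL", "harden authentication")
print(len(changes_for_goal("harden authentication")))  # → 2
```

Because each change carries its goal, the AI can later explain *why* a file looks the way it does, not just *what* changed.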
User preference learning
The system learns your coding preferences, such as whether you favor SOLID principles, Clean Architecture, or particular naming conventions, and keeps future suggestions consistent with them.
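One simple way to model preference learning is to count style signals observed in conversations and report the dominant choice per category. A sketch under that assumption (not the server's actual mechanism):

```python
from collections import Counter

class PreferenceLearner:
    """Count observed style signals and report the dominant preference
    per category (illustrative model, not the real implementation)."""

    def __init__(self):
        self.signals = Counter()

    def observe(self, category, value):
        self.signals[(category, value)] += 1

    def preference(self, category):
        candidates = {v: n for (c, v), n in self.signals.items() if c == category}
        return max(candidates, key=candidates.get) if candidates else None

learner = PreferenceLearner()
learner.observe("architecture", "Clean Architecture")
learner.observe("architecture", "Clean Architecture")
learner.observe("architecture", "layered")
learner.observe("naming", "snake_case")
print(learner.preference("architecture"))  # → Clean Architecture
```

With such counts stored in the graph, the assistant can bias future suggestions toward the styles you actually use.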
Advantages
Solves context forgetting: The AI no longer starts 'from scratch' in each conversation.
Improves efficiency: Less time is spent repeatedly explaining the project background.
Maintains consistency: The AI's suggestions stay within the project's architecture constraints.
Learns over time: The system continuously learns and improves from interactions.
Visualization support: A knowledge-graph visualization interface gives an intuitive view of the project structure.
Limitations
Requires initial configuration: You need to set up a Neo4j database and an API key.
Learning curve: Basic knowledge-graph concepts are needed to use it effectively.
Resource consumption: Running the Neo4j database consumes noticeable system resources.
Privacy considerations: All project information is stored in a local database.
Editor support: The editor must support the MCP protocol.

How to use

Install MCP-KG-Memory
Install the package using pipx or pip, and run the setup wizard to complete the initial configuration.
Configure the editor
Add the kg-memory server configuration to the MCP configuration file of the AI editor.
Add system prompts
Add rules in the editor to let the AI assistant automatically use the memory tool.
Start using
Talk to the AI assistant as usual, and it will automatically query and update the memory system.

Usage examples

New feature development
When you start developing a new feature, the AI will automatically query relevant project goals, technical constraints, and past similar experiences.
Problem debugging
When you hit a problem, the AI can look up how similar past problems were solved and what pitfalls were encountered.
Code refactoring
When refactoring code, the AI understands the dependency relationships between modules and the reasons for historical changes.

Frequently Asked Questions

Do I need to run the Neo4j database all the time?
Is my project information secure?
Which AI editors are supported?
How can I view the stored knowledge graph?
What should I do if I want to reset the memory?

Related resources

GitHub repository
Project source code and latest updates
Model Context Protocol
Official documentation of the MCP protocol
Neo4j documentation
Guide to using the Neo4j graph database
Google AI Studio
Get the Gemini API key

Installation

Copy the following configuration into your MCP client. Two variants are shown: the first uses LLM_MODE=gemini_direct, while the second names a model explicitly via LLM_MODEL.
{
  "mcpServers": {
    "kg-memory": {
      "command": "/path/to/your/venv/bin/python",
      "args": [
        "-m",
        "kg_mcp",
        "--transport",
        "stdio"
      ],
      "env": {
        "NEO4J_URI": "bolt://127.0.0.1:7687",
        "NEO4J_USER": "neo4j",
        "NEO4J_PASSWORD": "YOUR_NEO4J_PASSWORD",
        "GEMINI_API_KEY": "YOUR_GOOGLE_AI_STUDIO_KEY",
        "LLM_MODE": "gemini_direct",
        "KG_MCP_TOKEN": "your-secure-token",
        "LOG_LEVEL": "INFO"
      }
    }
  }
}

{
  "mcpServers": {
    "kg-memory": {
      "command": "/path/to/venv/bin/python",
      "args": ["-m", "kg_mcp", "--transport", "stdio"],
      "env": {
        "NEO4J_URI": "bolt://127.0.0.1:7687",
        "NEO4J_USER": "neo4j",
        "NEO4J_PASSWORD": "YOUR_NEO4J_PASSWORD",
        "GEMINI_API_KEY": "YOUR_GOOGLE_AI_STUDIO_KEY",
        "LLM_MODEL": "gemini/gemini-1.5-flash",
        "KG_MCP_TOKEN": "your-secure-token"
      }
    }
  }
}
Note: your API key is sensitive information; do not share it with anyone.

Alternatives

Airweave
Airweave is an open-source context retrieval layer for AI agents and RAG systems. It connects and synchronizes data from various applications, tools, and databases, and provides relevant, real-time, multi-source contextual information to AI agents through a unified search interface.
Python
5.9K
5 points
Vestige
Vestige is an AI memory engine based on cognitive science. By implementing 29 neuroscience modules such as prediction error gating, FSRS-6 spaced repetition, and memory dreaming, it provides long-term memory capabilities for AI. It includes a 3D visualization dashboard and 21 MCP tools, runs completely locally, and does not require the cloud.
Rust
4.5K
4.5 points
Better Icons
An MCP server and CLI tool that provides search and retrieval of over 200,000 icons, supports more than 150 icon libraries, and helps AI assistants and developers quickly obtain and use icons.
TypeScript
6.7K
4.5 points
Assistant Ui
assistant-ui is an open-source TypeScript/React library for quickly building production-grade AI chat interfaces, providing composable UI components, streaming responses, accessibility, and more, with support for multiple AI backends and models.
TypeScript
6.2K
5 points
Apify MCP Server
The Apify MCP Server is a tool based on the Model Context Protocol (MCP) that allows AI assistants to extract data from websites such as social media, search engines, and e-commerce through thousands of ready-to-use crawlers, scrapers, and automation tools (Apify Actors). It supports OAuth and Skyfire proxy payment and can be integrated into MCP clients such as Claude and VS Code through HTTPS endpoints or local stdio.
TypeScript
7.5K
5 points
Rsdoctor
Rsdoctor is a build analysis tool specifically designed for the Rspack ecosystem, fully compatible with webpack. It provides visual build analysis, multi-dimensional performance diagnosis, and intelligent optimization suggestions to help developers improve build efficiency and engineering quality.
TypeScript
10.4K
5 points
Next Devtools MCP
The Next.js development tools MCP server provides Next.js development tools and utilities for AI programming assistants such as Claude and Cursor, including runtime diagnostics, development automation, and document access functions.
TypeScript
9.7K
5 points
Testkube
Testkube is a test orchestration and execution framework for cloud-native applications, providing a unified platform to define, run, and analyze tests. It supports existing testing tools and Kubernetes infrastructure.
Go
6.5K
5 points
Markdownify MCP
Markdownify is a multi-functional file conversion service that supports converting multiple formats such as PDFs, images, audio, and web page content into Markdown format.
TypeScript
34.2K
5 points
Gitlab MCP Server
Certified
The GitLab MCP server is a project based on the Model Context Protocol that provides a comprehensive toolset for interacting with GitLab accounts, including code review, merge request management, CI/CD configuration, and other functions.
TypeScript
24.4K
4.3 points
Duckduckgo MCP Server
Certified
The DuckDuckGo Search MCP Server provides web search and content scraping services for LLMs such as Claude.
Python
71.7K
4.3 points
Notion Api MCP
Certified
A Python-based MCP Server that provides advanced to-do list management and content organization functions through the Notion API, enabling seamless integration between AI models and Notion.
Python
20.4K
4.5 points
Unity
Certified
UnityMCP is a Unity editor plugin that implements the Model Context Protocol (MCP), providing seamless integration between Unity and AI assistants, including real-time state monitoring, remote command execution, and logging.
C#
31.0K
5 points
Figma Context MCP
Framelink Figma MCP Server is a server that provides access to Figma design data for AI programming tools (such as Cursor). By simplifying the Figma API response, it helps AI more accurately achieve one-click conversion from design to code.
TypeScript
64.3K
4.5 points
Minimax MCP Server
The MiniMax Model Context Protocol (MCP) is an official server that supports interaction with powerful text-to-speech, video/image generation APIs, and is suitable for various client tools such as Claude Desktop and Cursor.
Python
47.4K
4.8 points
Gmail MCP Server
A Gmail automatic authentication MCP server designed for Claude Desktop, supporting Gmail management through natural language interaction, including complete functions such as sending emails, label management, and batch operations.
TypeScript
22.0K
4.5 points