SmartMemory

SmartMemory is an MCP server that provides structured memory for LLMs. It converts conversation content into knowledge graphs through natural dialogue, enabling AI assistants to learn and apply business rules.
2.5 points
7.4K

What is SmartMemory?

SmartMemory is an innovative MCP (Model Context Protocol) server that gives your favorite AI assistants (such as Claude and Gemini) long-term memory and logical reasoning capabilities. It acts like an AI 'brain': it learns facts and business rules from conversations and organizes them into a structured knowledge graph, so the AI can remember information and perform logical deductions.

How to use SmartMemory?

You can use SmartMemory in two main ways:

1. **Conversation Mode (Brain)**: Integrate it as an MCP server into an AI client (such as Claude Desktop) so it learns rules naturally during chat.
2. **Supervision Mode (Factory)**: Deploy the full Web dashboard to extract rules in batches from documents (such as PDFs) and visualize the knowledge graph.

Both modes support rapid Docker deployment without complex Python environment configuration.

Applicable Scenarios

SmartMemory is particularly suitable for the following scenarios:

- **Individual Users**: People who want their AI assistant to remember personal preferences, habits, and important facts.
- **Business Teams**: Teams that need the AI to learn company policies, business processes, and rules.
- **Knowledge Management**: Extracting structured knowledge from large numbers of documents and linking it together.
- **Education and Training**: Building domain knowledge graphs to support learning and decision-making.
- **Developers**: Adding memory and reasoning capabilities to AI applications.

Main Features

Structured Memory Storage
Convert conversation content into an RDF knowledge graph, storing entities, attributes, and relationships as triples for structured memory.
Natural Dialogue Learning
Let the AI learn new facts and rules through daily conversations without manual programming or complex configuration.
Logical Reasoning Ability
Conduct logical deductions based on learned rules and facts to answer questions that require reasoning (e.g., 'Can Bob vote?').
Rule Verification and Confirmation
Confirm with the user before adding new rules to ensure the accuracy and reliability of the rules learned by the AI.
Multi-mode Deployment
Support both MCP server mode and Web dashboard mode to meet the needs of different usage scenarios.
Document Intelligent Extraction
Automatically extract business rules from documents such as PDFs and build knowledge graphs in batches.
Visualized Knowledge Graph
Intuitively view and manage entities and relationships in the knowledge graph through the Web dashboard.
Support for Multiple LLM Providers
Support multiple LLM providers such as OpenAI, Anthropic, Google Gemini, and local Ollama.
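The "Structured Memory Storage" feature above rests on one idea: every fact becomes a (subject, predicate, object) triple, as in RDF. The following is a minimal sketch of that data model in plain Python, for illustration only; SmartMemory itself uses a real RDF store, and these names are not its API.

```python
# Illustrative triple store: facts as (subject, predicate, object),
# queried by pattern matching where None acts as a wildcard.
# This sketches the RDF data model, not SmartMemory's actual storage.

class TripleStore:
    def __init__(self):
        self.triples = set()

    def add(self, s, p, o):
        self.triples.add((s, p, o))

    def query(self, s=None, p=None, o=None):
        """Return every triple matching the pattern (None = wildcard)."""
        return {
            t for t in self.triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)
        }

memory = TripleStore()
memory.add("Bob", "hasAge", "25")          # entity -> attribute
memory.add("Bob", "isCitizenOf", "USA")    # entity -> relationship
memory.add("Alice", "worksWith", "Bob")

facts_about_bob = memory.query(s="Bob")    # everything known about Bob
```

In a real RDF graph, a SPARQL triple-pattern query plays the same role as `query()` here.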
Advantages
🔄 **Seamless Integration**: As an MCP server, it can be easily integrated into mainstream AI clients such as Claude Desktop and Gemini.
🧠 **Intelligent Learning**: Learn through natural dialogue, allowing you to teach the AI new knowledge without a technical background.
📊 **Structured Storage**: Store knowledge in the form of a graph for easy querying, updating, and reasoning.
🐳 **Simple Deployment**: Provide a Docker image for one-click deployment without complex environment configuration.
🔧 **Flexible Configuration**: Support multiple LLM providers and choose the most suitable model for your needs.
👥 **Team Collaboration**: The supervision mode lets teams share knowledge graphs and rule libraries.
📈 **Scalability**: Based on open standards (RDF/SPARQL), it is easy to extend and integrate.
Limitations
⚠️ **Experimental Nature**: Currently in the proof-of-concept stage; not recommended for production environments.
📚 **Learning Curve**: You need to understand the basic concepts of knowledge graphs for the best results.
⚡ **Performance Dependence**: The reasoning speed depends on the LLM response time and graph complexity.
🔍 **Rule Verification**: Manual confirmation of new rules is required, which may increase interaction costs.
💾 **Storage Management**: The knowledge graph needs ongoing management and maintenance as it grows.
🌐 **Network Requirements**: A stable network connection is required when using cloud-based LLMs.
🛠️ **Configuration Requirements**: Advanced features require some technical configuration knowledge.

How to Use

Select the Usage Mode
Choose the conversation mode (for personal use) or the supervision mode (for teams/batch processing) according to your needs. The conversation mode is suitable for daily chats, while the supervision mode is suitable for document processing and visualization.
Rapid Docker Deployment (Recommended)
Use the Docker image for rapid deployment without installing a Python environment. Configure the MCP client to point to the SmartMemory container.
Configure the MCP Client
Add the SmartMemory server configuration to the configuration file of the AI client (such as Claude Desktop).
Configure the LLM Provider
Configure the LLM provider (OpenAI, Ollama, etc.) and API key through environment variables or the Web dashboard.
Start Conversation Learning
Restart the AI client and start chatting with the AI. The AI will automatically learn the facts and rules you mention and request confirmation when necessary.
Use the Supervision Mode (Optional)
Access the Web dashboard (localhost:8080), upload documents to extract rules, visualize the knowledge graph, and manage learned rules.

Usage Examples

Personal Knowledge Management
Let the AI remember information about your friends and colleagues and their relationships to build a personal social knowledge graph.
Business Rule Learning
Teach the AI company policies and work processes to answer questions from new employees.
Eligibility Reasoning and Judgment
Based on known facts and rules, reason whether someone meets specific conditions.
Document Rule Extraction
Extract rules in batches from PDF policy documents to build a complete business knowledge base.
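The "Eligibility Reasoning and Judgment" example above ("Can Bob vote?") boils down to applying a rule to known facts. Below is a hedged sketch of that pattern with a hypothetical rule and hypothetical facts; in SmartMemory the rule would be learned from conversation or documents and live in the knowledge graph, with the LLM driving the deduction.

```python
# Hypothetical rule, hard-coded for illustration only:
# a person can vote if they are a citizen and at least 18 years old.
# SmartMemory would learn such a rule rather than hard-code it.

facts = {
    "Bob":   {"age": 25, "citizen": True},
    "Carol": {"age": 16, "citizen": True},
}

def can_vote(person):
    f = facts.get(person, {})
    return bool(f.get("citizen")) and f.get("age", 0) >= 18
```

Given these facts, the rule concludes that Bob can vote and Carol cannot.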

Frequently Asked Questions

Does SmartMemory require programming knowledge?
Which AI clients are supported?
Where is the data stored? Is it secure?
Can the learned rules be deleted or modified?
Is a paid LLM API required?
What are the limitations on the size of the knowledge graph?
What if there are rule conflicts?
Can the knowledge graph be exported?

Related Resources

5-Minute Quick Start Guide
A quick-start guide for new users, covering the simplest deployment and usage steps
Troubleshooting Guide
Solutions to common problems to help solve issues in deployment and usage
Complete Configuration Reference
Detailed description of all configuration options, including environment variables and configuration files
Explanation of the Neuro-symbolic Architecture
Technical principle: How to combine LLMs and knowledge graphs to achieve reasoning
Overview of the Technical Architecture
Description of the overall system architecture design and technology stack
Document Index
A complete index of all documents for easy searching of specific topics
Deployment Guide
Advanced deployment guide, including cloud deployment and team deployment
GitHub Code Repository
Project source code and latest version
Docker Image
Official Docker image repository
MCP Protocol Official Website
Official documentation and specifications of the Model Context Protocol

Installation

Copy the following configuration into your client
{
  "mcpServers": {
    "smart-memory": {
      "command": "docker",
      "args": ["run", "--rm", "-i", "ghcr.io/mauriceisrael/smart-memory:latest"]
    }
  }
}

Alternatively, for a local (non-Docker) installation, point the client at the project's virtual environment:

{
  "mcpServers": {
    "smartmemory": {
      "command": "/absolute/path/to/SmartMemory/venv/bin/python",
      "args": ["-m", "smart_memory.server"]
    }
  }
}
Note: your API key is sensitive information; do not share it with anyone.

Alternatives

Next Devtools MCP
The Next.js development tools MCP server provides Next.js development tools and utilities for AI programming assistants such as Claude and Cursor, including runtime diagnostics, development automation, and document access functions.
TypeScript
7.4K
5 points
Praisonai
PraisonAI is a production-ready multi-AI agent framework with self-reflection capabilities, designed to create AI agents to automate the solution of various problems from simple tasks to complex challenges. It simplifies the construction and management of multi-agent LLM systems by integrating PraisonAI agents, AG2, and CrewAI into a low-code solution, emphasizing simplicity, customization, and effective human-machine collaboration.
Python
6.5K
5 points
Haiku.rag
Haiku RAG is an intelligent retrieval-augmented generation system built on LanceDB, Pydantic AI, and Docling. It supports hybrid search, re-ranking, Q&A agents, and multi-agent research workflows, and provides local-first document processing and MCP server integration.
Python
4.6K
5 points
Blueprint MCP
Blueprint MCP is a chart generation tool based on the Arcade ecosystem. It uses technologies such as Nano Banana Pro to automatically generate visual charts such as architecture diagrams and flowcharts by analyzing codebases and system architectures, helping developers understand complex systems.
Python
8.6K
4 points
Klavis
Klavis AI is an open-source project that provides a simple and easy-to-use MCP (Model Context Protocol) service on Slack, Discord, and Web platforms. It includes various functions such as report generation, YouTube tools, and document conversion, supporting non-technical users and developers to use AI workflows.
TypeScript
16.2K
5 points
Devtools Debugger MCP
The Node.js Debugger MCP server provides complete debugging capabilities based on the Chrome DevTools protocol, including breakpoint setting, stepping execution, variable inspection, and expression evaluation.
TypeScript
10.5K
4 points
Mcpjungle
MCPJungle is a self-hosted MCP gateway used to centrally manage and proxy multiple MCP servers, providing a unified tool access interface for AI agents.
Go
0
4.5 points
Cipher
Cipher is an open-source memory layer framework designed for programming AI agents. It integrates with various IDEs and AI coding assistants through the MCP protocol, providing core functions such as automatic memory generation, team memory sharing, and dual-system memory management.
TypeScript
0
5 points
Markdownify MCP
Markdownify is a multi-functional file conversion service that supports converting multiple formats such as PDFs, images, audio, and web page content into Markdown format.
TypeScript
30.8K
5 points
Gitlab MCP Server
Certified
The GitLab MCP server is a project based on the Model Context Protocol that provides a comprehensive toolset for interacting with GitLab accounts, including code review, merge request management, CI/CD configuration, and other functions.
TypeScript
20.0K
4.3 points
Notion Api MCP
Certified
A Python-based MCP Server that provides advanced to-do list management and content organization functions through the Notion API, enabling seamless integration between AI models and Notion.
Python
18.2K
4.5 points
Duckduckgo MCP Server
Certified
The DuckDuckGo Search MCP Server provides web search and content scraping services for LLMs such as Claude.
Python
58.9K
4.3 points
Figma Context MCP
Framelink Figma MCP Server is a server that provides access to Figma design data for AI programming tools (such as Cursor). By simplifying the Figma API response, it helps AI more accurately achieve one-click conversion from design to code.
TypeScript
56.1K
4.5 points
Unity
Certified
UnityMCP is a Unity editor plugin that implements the Model Context Protocol (MCP), providing seamless integration between Unity and AI assistants, including real-time state monitoring, remote command execution, and logging functions.
C#
26.5K
5 points
Gmail MCP Server
A Gmail automatic authentication MCP server designed for Claude Desktop, supporting Gmail management through natural language interaction, including complete functions such as sending emails, label management, and batch operations.
TypeScript
18.1K
4.5 points
Minimax MCP Server
The MiniMax Model Context Protocol (MCP) is an official server that supports interaction with powerful text-to-speech, video/image generation APIs, and is suitable for various client tools such as Claude Desktop and Cursor.
Python
39.2K
4.8 points
AIBase
Zhiqi Future, Your AI Solution Think Tank
© 2026 AIBase