Vestige
Vestige is an AI memory engine grounded in cognitive science. It implements 29 neuroscience-inspired modules, including prediction-error gating, FSRS-6 spaced repetition, and memory dreaming, to give AI assistants long-term memory. It ships with a 3D visualization dashboard and 21 MCP tools, runs entirely locally, and requires no cloud services.
Rating: 4.5
Downloads: 5.5K
What is Vestige?
Vestige is an AI cognitive memory engine that addresses a core problem: AI assistants forget information between conversations. Unlike a traditional vector database, Vestige draws on 130 years of memory research and implements the core mechanisms of human memory: prediction-error gating decides what to store, spaced repetition decides what to retain, and memory consolidation connects related concepts in the background.
How to use Vestige?
Vestige runs as an MCP server and integrates with AI assistants such as Claude, Cursor, and VS Code. After installation, the AI assistant uses Vestige's memory functions automatically: saving important information, retrieving relevant context, and creating reminders. Users can also explore the AI's memory network through the 3D dashboard.
Use Cases
Vestige is particularly suited to AI interactions that need long-term memory: software development (remembering code preferences and architectural decisions), learning assistants (remembering progress and difficult points), personal assistants (remembering habits and preferences), and research assistants (connecting related concepts and discoveries).
Main Features
Cognitive Memory System
Based on neuroscience memory mechanisms: prediction-error gating filters redundant information, FSRS-6 spaced repetition manages memory decay, and memory consolidation connects related concepts in the background.
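As a rough illustration of prediction-error gating, the sketch below admits a new memory only when it is sufficiently novel relative to what is already stored. It uses token overlap as a stand-in for a real embedding comparison; every name and threshold here is an assumption for illustration, not Vestige's implementation:

```python
def surprise(new_text: str, stored: list[str]) -> float:
    """Crude prediction-error proxy: 1 minus the best token overlap
    with any stored memory. A real engine would compare embeddings."""
    new_tokens = set(new_text.lower().split())
    if not stored or not new_tokens:
        return 1.0
    best = max(
        len(new_tokens & set(s.lower().split())) / len(new_tokens)
        for s in stored
    )
    return 1.0 - best

def should_store(new_text: str, stored: list[str], threshold: float = 0.4) -> bool:
    """Gate storage on surprise: only sufficiently novel input is kept."""
    return surprise(new_text, stored) >= threshold

memories = ["user prefers tabs over spaces", "project uses PostgreSQL 16"]
should_store("user prefers tabs over spaces", memories)   # duplicate, low surprise
should_store("deploys run on Fly.io every Friday", memories)  # novel, high surprise
```

The gate is what keeps the store from bloating: repeated facts produce low surprise and are dropped instead of duplicated.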
3D Memory Dashboard
Real-time 3D visualization of the AI's memory network: Three.js and WebGL render memory nodes, connections, and activation patterns, with live updates over WebSocket.
Universal MCP Integration
Integrates with all mainstream AI assistants and IDEs through the Model Context Protocol: Claude Desktop, Cursor, VS Code, JetBrains, Windsurf, etc.
HyDE Intelligent Search
Hypothetical Document Embeddings (HyDE) query expansion converts a user query into multiple semantic variants, significantly improving recall on conceptual queries.
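The HyDE idea can be sketched as follows: instead of embedding only the raw query, draft hypothetical passages that would answer it and search with those too. The generator below is a trivial stub standing in for an LLM call, and the "embedding" is a toy bag of words; only the overall flow reflects HyDE, none of it is Vestige's code:

```python
def hypothetical_docs(query: str) -> list[str]:
    # Stand-in for an LLM call: HyDE has a model draft short passages
    # that *answer* the query, then embeds those drafts.
    return [
        f"A note explaining {query} in practical terms.",
        f"A summary of decisions related to {query}.",
    ]

def embed(text: str) -> set[str]:
    # Toy "embedding": a bag of lowercased tokens. Real systems use a
    # dense vector model; the set stands in so the sketch stays runnable.
    return set(text.lower().split())

def search(query: str, corpus: list[str]) -> str:
    # Score each document against every variant (plus the raw query)
    # and return the best match.
    variants = [embed(d) for d in hypothetical_docs(query)] + [embed(query)]
    def score(doc: str) -> float:
        tokens = embed(doc)
        return max(len(tokens & v) / (len(v) or 1) for v in variants)
    return max(corpus, key=score)
```

Because the variants contain answer-like vocabulary the raw query lacks, conceptual queries match stored memories they would otherwise miss.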
Memory Dreaming
Simulates the memory consolidation process during sleep, automatically replaying, connecting, and strengthening important memories to discover hidden patterns and associations.
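One way to picture this consolidation pass: periodically scan stored memories offline and create links between those that share enough concepts. The sketch below is a hypothetical simplification (token sharing in place of semantic similarity), not Vestige's dreaming algorithm:

```python
from itertools import combinations
from collections import defaultdict

def dream(memories: list[str], min_shared: int = 2) -> dict[tuple[int, int], int]:
    """Offline consolidation pass: link memories that share at least
    `min_shared` tokens. Returns edge weights keyed by memory-index pairs;
    a real engine would also strengthen frequently co-activated memories."""
    links: dict[tuple[int, int], int] = defaultdict(int)
    bags = [set(m.lower().split()) for m in memories]
    for (i, a), (j, b) in combinations(enumerate(bags), 2):
        shared = len(a & b)
        if shared >= min_shared:
            links[(i, j)] = shared
    return dict(links)
```

Run over the whole store in the background, a pass like this surfaces associations no single conversation created explicitly.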
100% Local Operation
All data processing is completed locally. The embedding model, vector search, and memory storage all run on the user's device without cloud transmission.
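The local retrieval step can be pictured as a pure-Python sketch: memories and their embedding vectors live on the user's device, and lookup is a cosine-similarity scan with no network calls. The data layout and names below are illustrative assumptions, not Vestige's storage format:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    # Cosine similarity between two vectors; 0.0 if either is zero-length.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def nearest(query_vec: list[float], index: list[tuple[str, list[float]]]) -> str:
    # index: (memory_text, embedding_vector) pairs held entirely on-device.
    return max(index, key=lambda item: cosine(query_vec, item[1]))[0]
```

Everything the scan touches stays in local memory or on local disk, which is what makes the privacy guarantee possible.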
Advantages
Intelligent memory management: Based on cognitive science principles, only important and new information is stored to avoid memory bloat.
Long-term memory retention: A spaced-repetition algorithm ensures important memories decay more slowly over time.
Context-aware retrieval: The 7-stage search pipeline weighs semantic, temporal, associative, and competitive factors.
Visualized insights: The 3D dashboard allows users to intuitively understand the AI's memory structure and state.
Privacy protection: Runs completely locally, and sensitive data does not leave the device.
Wide compatibility: Supports all mainstream AI assistants and development environments through the MCP protocol.
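To make the slow-decay claim above concrete: FSRS-family schedulers model recall probability with a forgetting curve in which higher stability means slower decay. The constants below give the classic curve shape where recall falls to 90% after `stability` days; they are illustrative, not Vestige's actual FSRS-6 parameters:

```python
def retrievability(days_elapsed: float, stability: float) -> float:
    """Illustrative FSRS-style forgetting curve. `stability` is the
    interval (in days) at which recall probability drops to 90%.
    Constants here are NOT Vestige's FSRS-6 parameters."""
    return (1.0 + days_elapsed / (9.0 * stability)) ** -1.0
```

A memory reinforced to stability 20 retains far more retrievability after a month than one at stability 5, which is exactly the "important memories decay more slowly" behavior.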
Limitations
Resource consumption: The embedding model and vector search require moderate CPU/GPU resources.
Learning curve: You need to understand the basic memory concepts to get full value from all features.
Initial setup: The first run downloads the embedding model (about 130 MB).
Platform limitations: Some advanced features (such as Metal GPU acceleration) are only available on specific platforms.
How to Use
Install Vestige
Download the binary for your operating system, or install via npm. macOS users can install with a single curl command.
Configure AI Assistant
Add Vestige as an MCP server; configuration details vary slightly between AI assistants.
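For Claude Desktop, an MCP server entry typically follows the shape below (in `claude_desktop_config.json`). The command name and arguments here are assumptions for illustration; check Vestige's integration guide for the exact values on your platform:

```json
{
  "mcpServers": {
    "vestige": {
      "command": "vestige",
      "args": ["mcp"]
    }
  }
}
```

Other clients (Cursor, VS Code, JetBrains, Windsurf) use the same server/command/args concept with their own config file locations.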
Start Using
After installation, the AI assistant uses Vestige's memory functions automatically. You can ask the AI to remember information or retrieve memories in natural language.
View Dashboard
The 3D memory dashboard opens automatically when Vestige starts; view the AI's memory network and state in the browser.
Usage Examples
Remember Development Preferences
Let the AI remember your coding style, tool preferences, and project settings so it automatically applies that context in new conversations.
Project Context Memory
Save project - specific decisions, architectural designs, and known issues to ensure that the AI maintains a consistent understanding throughout the project cycle.
Learning Progress Tracking
Record difficult points, mastered concepts, and topics to be studied during the learning process. The AI can adjust the teaching strategy according to your progress.
Intelligent Reminder Creation
Create context-based reminders that are automatically triggered when specific conditions are met.
Frequently Asked Questions
What is the difference between Vestige and an ordinary vector database?
What should I do if I get a 'command not found' error after installation?
What should I do if the embedding model download fails?
How can I make the AI automatically use Vestige?
What should I do if the dashboard fails to load?
Which AI assistants does Vestige support?
Where is the data stored? How to back it up?
Will Vestige affect the performance of the AI assistant?
Related Resources
Official GitHub Repository
Source code, issue tracking, and the latest version
Scientific Principle Documentation
Details the neuroscience and cognitive principles behind Vestige
Configuration Guide
CLI commands, environment variables, and advanced configuration options
Integration Guide
Detailed integration steps for each IDE and AI assistant
Frequently Asked Questions
More than 30 common questions and their solutions
MCP Protocol Official Website
Official documentation and specifications of the Model Context Protocol

Notion Api MCP
Certified
A Python-based MCP Server that provides advanced to-do list management and content organization functions through the Notion API, enabling seamless integration between AI models and Notion.
Python
20.2K
4.5 points

Gitlab MCP Server
Certified
The GitLab MCP server is a project based on the Model Context Protocol that provides a comprehensive toolset for interacting with GitLab accounts, including code review, merge request management, CI/CD configuration, and other functions.
TypeScript
24.2K
4.3 points

Markdownify MCP
Markdownify is a multi-functional file conversion service that supports converting multiple formats such as PDFs, images, audio, and web page content into Markdown format.
TypeScript
35.2K
5 points

Duckduckgo MCP Server
Certified
The DuckDuckGo Search MCP Server provides web search and content scraping services for LLMs such as Claude.
Python
72.3K
4.3 points

Unity
Certified
UnityMCP is a Unity editor plugin that implements the Model Context Protocol (MCP), providing seamless integration between Unity and AI assistants, including real-time state monitoring, remote command execution, and log functions.
C#
31.0K
5 points

Figma Context MCP
Framelink Figma MCP Server is a server that provides access to Figma design data for AI programming tools (such as Cursor). By simplifying the Figma API response, it helps AI more accurately achieve one-click conversion from design to code.
TypeScript
64.2K
4.5 points

Gmail MCP Server
A Gmail automatic authentication MCP server designed for Claude Desktop, supporting Gmail management through natural language interaction, including complete functions such as sending emails, label management, and batch operations.
TypeScript
21.0K
4.5 points

Context7
Context7 MCP is a service that provides real-time, version-specific documentation and code examples for AI programming assistants. It is directly integrated into prompts through the Model Context Protocol to solve the problem of LLMs using outdated information.
TypeScript
97.8K
4.7 points



