MemDx
MemDx is a local, private memory-layer extension for VS Code/Cursor. It deploys Qdrant and Ollama through Docker to provide persistent project-context memory for AI assistants, with semantic search and Cursor Agent integration.

What is MemDx Memory Layer?

MemDx is a memory extension designed for developers and AI assistants. It acts like a smart notebook that remembers important information about your project, such as architectural decisions, code rules, and project structure. When you collaborate with an AI assistant (e.g., Cursor's Agent), MemDx provides persistent context, allowing the AI to 'remember' previously discussed content without repeated explanations.

How to use MemDx?

MemDx is designed to be simple to use:
1) Install the extension and run the setup wizard.
2) Select code or text, right-click, and save it as a memory.
3) Search saved memories using natural language.
4) Configure the Cursor AI integration so the Agent can automatically use your memory library.
The entire process runs completely locally to protect your privacy.

Use cases

MemDx is particularly suitable for the following scenarios: maintaining consistency in long-term project development, sharing project knowledge in team collaboration, recording architectural decisions in complex systems, understanding project rules when learning a new codebase, and keeping persistent context for in-depth collaboration with AI assistants.

Main Features

Persistent Context
Save and recall project decisions, rules, and architecture across sessions, giving the AI assistant 'long-term memory'.
100% Local and Private
Run locally using Docker (Qdrant + Ollama). All data stays on your device, ensuring privacy and security.
Semantic Search
Search for memories using natural language without the need for exact keyword matching. It understands the intention and meaning of the query.
Cursor Agent Integration
Seamlessly integrate with Cursor's Agent (Composer) through the MCP protocol. The AI assistant can automatically read and write memories.
Zero-Configuration Setup
The built-in setup wizard automatically handles all configuration, including starting Docker containers and initializing services.
Smart Indexing
Automatically index README.md, .cursorrules, and the project structure with one click to quickly establish an initial memory library.
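At its core, the semantic search described above ranks stored memories by the similarity of their embedding vectors to the query's embedding, rather than by keyword overlap. A minimal sketch of that idea, using toy hand-made vectors in place of real model embeddings (a real setup would obtain them from an embedding model such as one served by Ollama; the data and dimensions here are illustrative only):

```typescript
// Minimal sketch of semantic search via cosine similarity.
// Embeddings are toy 3-d vectors standing in for real model output.

type Memory = { text: string; embedding: number[] };

// Cosine similarity: dot product of the vectors divided by the
// product of their lengths; 1 means same direction, 0 means unrelated.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Return the topK memories most similar to the query embedding.
function search(query: number[], memories: Memory[], topK = 2): Memory[] {
  return [...memories]
    .sort((x, y) => cosine(query, y.embedding) - cosine(query, x.embedding))
    .slice(0, topK);
}

const memories: Memory[] = [
  { text: "We use PostgreSQL for persistence", embedding: [0.9, 0.1, 0.0] },
  { text: "API routes live under src/routes", embedding: [0.1, 0.9, 0.1] },
  { text: "Database migrations run via Flyway", embedding: [0.8, 0.2, 0.1] },
];

// A query embedding pointing in the "database" direction ranks the
// two database-related notes above the routing note.
const results = search([1, 0, 0], memories);
console.log(results.map(m => m.text));
```

This is why a query like "how do we store data" can surface a memory that never contains those exact words: nearby vectors, not matching keywords, determine relevance.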
Advantages
Runs completely locally. Data never leaves your device, providing excellent privacy.
Deeply integrated with Cursor AI, enhancing the AI assistant's context understanding ability.
Simple setup. Wizard-style configuration lowers the barrier to entry.
Supports natural language search, making it more intuitive to find memories.
Isolates memories across projects to avoid information confusion.
Limitations
Requires installing Docker Desktop, which occupies some system resources.
Needs to download the AI model for the first use, which may take a few minutes.
Currently mainly targets the VS Code/Cursor environment, with limited support for other editors.
Local storage capacity is limited by the device and is not suitable for storing a large amount of data.
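The Docker footprint noted in the limitations above typically amounts to two containers: Qdrant (vector database) and Ollama (local embedding model). A hypothetical docker-compose.yml sketching that pair, using the public images and their default ports; MemDx's setup wizard manages its own equivalent, so this is illustrative rather than the extension's actual configuration:

```yaml
services:
  qdrant:
    image: qdrant/qdrant
    ports:
      - "6333:6333"          # Qdrant REST API
    volumes:
      - qdrant_data:/qdrant/storage
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"        # Ollama HTTP API
    volumes:
      - ollama_data:/root/.ollama
volumes:
  qdrant_data:
  ollama_data:
```

The named volumes keep vectors and downloaded models across container restarts, which is what makes the memory "persistent" rather than session-scoped.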

How to Use

Installation and Setup
Install MemDx from the VS Code/Cursor extension marketplace. On first run, click 'Setup Now' or run the 'MemDx: Quick Setup Wizard' command, then enter a user ID and project ID to complete initialization.
Add Memory
Select the text (code, comments, decisions, etc.) you want to save in the editor. Right-click and select 'MemDx: Add Memory from Selection', or add it from the command palette.
Search Memories
Use the command palette to search saved memories. Describe what you want to find in natural language, and the system returns semantically relevant memory entries.
Configure Cursor AI Integration
To allow Cursor's Agent to automatically use memories, you need to copy the MCP configuration and paste it into Cursor's MCP settings. After that, the AI assistant can read and write your memory library.
Index Project
Quickly establish a project memory library: save README.md, .cursorrules, and project-structure information to memory with one click.
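For the Cursor integration step above, Cursor reads MCP servers from its MCP settings file (e.g. ~/.cursor/mcp.json). The entry MemDx expects comes from the extension's own 'copy MCP configuration' action; the server name, command, and path below are placeholders illustrating the general shape, not MemDx's actual values:

```json
{
  "mcpServers": {
    "memdx": {
      "command": "node",
      "args": ["/path/to/memdx-mcp-server.js"]
    }
  }
}
```

Once registered, the Agent can call the server's memory tools over stdio without any further manual steps.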

Usage Examples

Record Architectural Decisions
After team discussions, save important architectural decisions to memory to ensure that all members and AI assistants can remember these decisions.
Understand Project Specifications
Save the project's code specifications, naming conventions, and best practices as memories to help new members and AI assistants quickly adapt to the project.
Maintain Context Across Sessions
Maintain the continuity of discussions across multiple development sessions to avoid repeating explanations of the same concepts and decisions.
Quickly Get Started with a New Project
When joining a new project, use the smart indexing function to quickly understand the project structure and important documents.

Frequently Asked Questions

Does MemDx require an internet connection?
How can I ensure the privacy of my data?
What should I do if I encounter a 'Vector Dimension Error'?
Which AI models does MemDx support?
Can I use MemDx in multiple projects?
How can I back up my memory data?
Will MemDx affect the performance of VS Code?

Related Resources

MemDx GitHub Repository
The source code and latest version of MemDx
Model Context Protocol Documentation
The official documentation and specifications of the MCP protocol
Docker Desktop Download
The Docker environment required to run MemDx
Cursor Editor Official Website
The AI code editor that MemDx is mainly integrated with
MemDx Usage Tutorial Video
Video tutorial on the installation and use of MemDx


Alternatives

Airweave
Airweave is an open-source context retrieval layer for AI agents and RAG systems. It connects and synchronizes data from various applications, tools, and databases, and provides relevant, real-time, multi-source contextual information to AI agents through a unified search interface.
Python · 6.1K · 5 points

Vestige
Vestige is an AI memory engine based on cognitive science. By implementing 29 neuroscience modules such as prediction-error gating, FSRS-6 spaced repetition, and memory dreaming, it provides long-term memory capabilities for AI. It includes a 3D visualization dashboard and 21 MCP tools, runs completely locally, and does not require the cloud.
Rust · 5.5K · 4.5 points

Better Icons
An MCP server and CLI tool that provides search and retrieval of over 200,000 icons, supports more than 150 icon libraries, and helps AI assistants and developers quickly obtain and use icons.
TypeScript · 6.7K · 4.5 points

Assistant Ui
assistant-ui is an open-source TypeScript/React library for quickly building production-grade AI chat interfaces, providing composable UI components, streaming responses, accessibility, and more, and supporting multiple AI backends and models.
TypeScript · 7.4K · 5 points

Apify MCP Server
The Apify MCP Server is a tool based on the Model Context Protocol (MCP) that allows AI assistants to extract data from websites such as social media, search engines, and e-commerce through thousands of ready-to-use crawlers, scrapers, and automation tools (Apify Actors). It supports OAuth and Skyfire proxy payment and can be integrated into MCP clients such as Claude and VS Code through HTTPS endpoints or local stdio.
TypeScript · 7.6K · 5 points

Rsdoctor
Rsdoctor is a build analysis tool specifically designed for the Rspack ecosystem, fully compatible with webpack. It provides visual build analysis, multi-dimensional performance diagnosis, and intelligent optimization suggestions to help developers improve build efficiency and engineering quality.
TypeScript · 10.5K · 5 points

Next Devtools MCP
The Next.js development tools MCP server provides Next.js development tools and utilities for AI programming assistants such as Claude and Cursor, including runtime diagnostics, development automation, and document access functions.
TypeScript · 10.8K · 5 points

Testkube
Testkube is a test orchestration and execution framework for cloud-native applications, providing a unified platform to define, run, and analyze tests. It supports existing testing tools and Kubernetes infrastructure.
Go · 6.5K · 5 points

Notion Api MCP (Certified)
A Python-based MCP server that provides advanced to-do list management and content organization functions through the Notion API, enabling seamless integration between AI models and Notion.
Python · 20.4K · 4.5 points

Markdownify MCP
Markdownify is a multi-functional file conversion service that supports converting multiple formats such as PDFs, images, audio, and web page content into Markdown format.
TypeScript · 34.3K · 5 points

Gitlab MCP Server (Certified)
The GitLab MCP server is a project based on the Model Context Protocol that provides a comprehensive toolset for interacting with GitLab accounts, including code review, merge request management, CI/CD configuration, and other functions.
TypeScript · 25.5K · 4.3 points

Duckduckgo MCP Server (Certified)
The DuckDuckGo Search MCP Server provides web search and content scraping services for LLMs such as Claude.
Python · 72.9K · 4.3 points

Unity (Certified)
UnityMCP is a Unity editor plugin that implements the Model Context Protocol (MCP), providing seamless integration between Unity and AI assistants, including real-time state monitoring, remote command execution, and log functions.
C# · 32.2K · 5 points

Figma Context MCP
Framelink Figma MCP Server is a server that provides access to Figma design data for AI programming tools (such as Cursor). By simplifying the Figma API response, it helps AI more accurately achieve one-click conversion from design to code.
TypeScript · 64.4K · 4.5 points

Gmail MCP Server
A Gmail automatic-authentication MCP server designed for Claude Desktop, supporting Gmail management through natural language interaction, including complete functions such as sending emails, label management, and batch operations.
TypeScript · 21.0K · 4.5 points

Context7
Context7 MCP is a service that provides real-time, version-specific documentation and code examples for AI programming assistants. It is directly integrated into prompts through the Model Context Protocol to solve the problem of LLMs using outdated information.
TypeScript · 98.2K · 4.7 points
© 2026 AIBase