MCP Context Server

A high-performance MCP server that provides persistent multimodal context storage for LLM agents, supports thread isolation, metadata filtering, full-text search, and semantic search, and is compatible with SQLite and PostgreSQL backends.

What is the MCP Context Server?

The MCP Context Server is a context storage server specifically designed for AI agents. It allows different AI agents (such as Claude Code, LangGraph, etc.) to share and access information such as historical conversations, images, and metadata in the same task. It's like providing a shared memory bank for AI agents, enabling them to collaborate on tasks.

How to use the MCP Context Server?

Using it is straightforward:
1. Install the server on your AI client (such as Claude Code).
2. Store and retrieve context through the provided tools.
3. Share context between agents via the same thread_id.
The server supports multiple database backends, from zero-configuration SQLite to production-grade PostgreSQL.

Use cases

• Multi-agent collaboration: multiple AI agents share context while working on a complex task.
• Long-term conversations: extended dialogues that require remembering earlier exchanges.
• Multimodal tasks: tasks that need to process both text and images.
• Knowledge management: a retrievable knowledge base for AI agents.
• Development collaboration: AI assistants in a development team share project context.

Main features

Multimodal storage
Supports storing both text and image content. AI agents can save and retrieve visual information such as screenshots, charts, and interface designs.
Threaded context management
Organizes context through thread_id. Different agents in the same task can share the same context, and contexts between different tasks are isolated.
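For illustration, a store_context tool call tagged with a shared thread_id might look like the sketch below. The argument names (thread_id, content, metadata, tags) follow the features described on this page but are assumptions, not the server's published schema:

```json
{
  "tool": "store_context",
  "arguments": {
    "thread_id": "webapp-build-42",
    "content": "Login API finalized: POST /auth/login returns a token pair.",
    "metadata": {"agent": "backend", "status": "done"},
    "tags": ["api", "auth"]
  }
}
```

Any agent that later searches under thread_id "webapp-build-42" would see this entry; agents working under other thread_ids would not.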
Flexible metadata system
Supports arbitrary JSON metadata. Fields such as task status, priority, and assignee can be stored, and 15 filtering operators enable precise queries.
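As a hypothetical sketch of metadata filtering (the operator names "gte" and "ne" are illustrative; consult the metadata filtering guide below for the 15 operators the server actually supports):

```json
{
  "tool": "search_context",
  "arguments": {
    "thread_id": "webapp-build-42",
    "metadata_filters": [
      {"key": "priority", "operator": "gte", "value": 2},
      {"key": "status", "operator": "ne", "value": "done"}
    ]
  }
}
```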
Multiple search methods
Provides three search modes: full-text search (keyword matching), semantic search (meaning similarity), and hybrid search (combining both) to meet different retrieval needs.
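A hybrid search request could be sketched as follows (the "mode" and "limit" parameter names are assumptions):

```json
{
  "tool": "search_context",
  "arguments": {
    "query": "authentication error handling",
    "mode": "hybrid",
    "thread_id": "webapp-build-42",
    "limit": 5
  }
}
```

Full-text mode would match the literal keywords, while semantic mode could also surface entries about, say, login failures that never use the word "authentication"; hybrid combines both rankings.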
Multiple database support
Supports two database backends, SQLite (zero-configuration) and PostgreSQL (production-grade), which can be selected according to requirements.
Batch operations
Supports batch storage, update, and deletion of context to improve processing efficiency. It supports atomic operations to ensure data consistency.
Date range filtering
Supports filtering context by creation time using the ISO 8601 format, making it easy to find relevant information along the timeline.
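A date-range query might be sketched like this, using ISO 8601 timestamps as described above (the created_after/created_before parameter names are assumptions):

```json
{
  "tool": "search_context",
  "arguments": {
    "thread_id": "webapp-build-42",
    "created_after": "2025-01-01T00:00:00Z",
    "created_before": "2025-01-31T23:59:59Z"
  }
}
```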
Tag management
Supports adding tags to context for easy classification and organization. It supports tag filtering to quickly find relevant context.
Advantages
High-performance design: Supports high-concurrency access and optimizes database query performance.
Easy to integrate: Seamlessly integrates with mainstream MCP clients such as Claude Code and LangGraph.
Flexible configuration: Supports multiple database backends and search methods to meet different scenario requirements.
Production-ready: Includes complete error handling, logging, and monitoring functions.
Open source and free: Based on the MIT license, it can be freely used and modified.
Limitations
Requires an MCP client: Must be used in conjunction with a client that supports the MCP protocol.
Semantic search requires additional configuration: Ollama and an embedding model need to be installed.
Image storage has size limitations: The default maximum size for a single image is 10MB, and the maximum total request size is 100MB.
PostgreSQL configuration is relatively complex: The database needs to be installed and configured separately.

How to use

Install the server
Add the MCP Context Server to your AI client. Taking Claude Code as an example, it can be added through the command line or a configuration file.
Configure environment variables
Configure environment variables as needed, such as selecting the database backend and enabling search features. These can be set in the .mcp.json file.
Store context
Use the store_context tool to store conversation context, which can include text, images, metadata, and tags.
Retrieve context
Use the search_context or specific search tools to find previously stored context.
Manage context
Use update_context to update content, delete_context to delete unnecessary context, and list_threads to view all threads.
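Putting the management tools together, a hypothetical maintenance pass might issue calls like these (the context_id argument name and values are assumptions for illustration):

```json
[
  {"tool": "list_threads", "arguments": {}},
  {"tool": "update_context", "arguments": {"context_id": 123, "metadata": {"status": "resolved"}}},
  {"tool": "delete_context", "arguments": {"context_id": 456}}
]
```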

Usage examples

Multi-agent collaborative development
Multiple AI agents collaborate to develop a web application: the front-end agent handles UI design, the back-end agent handles API development, and the testing agent verifies functionality. All of them access the project context through a shared thread_id.
Long-term technical discussion
Have a long-term technical discussion with an AI assistant, covering multiple related topics. It's necessary to remember previous discussion content to avoid repetition.
Document analysis and summarization
Upload screenshots of technical documents and let the AI assistant analyze the content and extract key information.
Task management and tracking
Use metadata to manage the status, priority, and responsible person of development tasks.

Frequently Asked Questions

What's the difference between the MCP Context Server and ordinary chat history?
Do I need to install a database?
What's the difference between semantic search and full-text search?
How to share context between different AI agents?
What are the limitations of image storage?
How to back up my context data?
Can I use SQLite and PostgreSQL simultaneously?
How to view the server's running status?

Related resources

GitHub repository
Source code, issue tracking, and contribution guidelines
PyPI package page
Python package release page with version history and installation statistics
MCP protocol documentation
Official specification documentation for the Model Context Protocol
Claude Code MCP guide
Detailed guide on how to use the MCP server in Claude Code
Semantic search configuration guide
Detailed instructions on how to configure and use the semantic search function
Docker deployment guide
Complete guide on deploying a production environment using Docker
Metadata filtering guide
Detailed examples of metadata addition, updating, and filtering

Installation

Copy one of the following configurations into your MCP client.

Default configuration (SQLite backend, zero configuration):
{
  "mcpServers": {
    "context-server": {
      "type": "stdio",
      "command": "uvx",
      "args": ["--python", "3.12", "mcp-context-server"],
      "env": {}
    }
  }
}

SQLite with explicit environment variables:

{
  "mcpServers": {
    "context-server": {
      "type": "stdio",
      "command": "uvx",
      "args": ["--python", "3.12", "mcp-context-server"],
      "env": {
        "LOG_LEVEL": "${LOG_LEVEL:-INFO}",
        "DB_PATH": "${DB_PATH:-~/.mcp/context_storage.db}",
        "MAX_IMAGE_SIZE_MB": "${MAX_IMAGE_SIZE_MB:-10}",
        "MAX_TOTAL_SIZE_MB": "${MAX_TOTAL_SIZE_MB:-100}"
      }
    }
  }
}

Local PostgreSQL backend with semantic search enabled:

{
  "mcpServers": {
    "context-server": {
      "type": "stdio",
      "command": "uvx",
      "args": ["--python", "3.12", "mcp-context-server"],
      "env": {
        "STORAGE_BACKEND": "postgresql",
        "POSTGRESQL_HOST": "localhost",
        "POSTGRESQL_USER": "postgres",
        "POSTGRESQL_PASSWORD": "postgres",
        "POSTGRESQL_DATABASE": "mcp_context",
        "ENABLE_SEMANTIC_SEARCH": "true"
      }
    }
  }
}

Supabase PostgreSQL via a direct connection string:

{
  "mcpServers": {
    "context-server": {
      "type": "stdio",
      "command": "uvx",
      "args": ["--python", "3.12", "mcp-context-server"],
      "env": {
        "STORAGE_BACKEND": "postgresql",
        "POSTGRESQL_CONNECTION_STRING": "postgresql://postgres:your-actual-password@db.[PROJECT_REF].supabase.co:5432/postgres"
      }
    }
  }
}

Supabase direct connection with individual parameters and semantic search:

{
  "mcpServers": {
    "context-server": {
      "type": "stdio",
      "command": "uvx",
      "args": ["--python", "3.12", "mcp-context-server"],
      "env": {
        "STORAGE_BACKEND": "postgresql",
        "POSTGRESQL_HOST": "db.[PROJECT_REF].supabase.co",
        "POSTGRESQL_PORT": "5432",
        "POSTGRESQL_USER": "postgres",
        "POSTGRESQL_PASSWORD": "your-actual-password",
        "POSTGRESQL_DATABASE": "postgres",
        "ENABLE_SEMANTIC_SEARCH": "true"
      }
    }
  }
}

Supabase via the connection pooler (connection string):

{
  "mcpServers": {
    "context-server": {
      "type": "stdio",
      "command": "uvx",
      "args": ["--python", "3.12", "mcp-context-server"],
      "env": {
        "STORAGE_BACKEND": "postgresql",
        "POSTGRESQL_CONNECTION_STRING": "postgresql://postgres.[PROJECT-REF]:your-actual-password@aws-0-[REGION].pooler.supabase.com:5432/postgres"
      }
    }
  }
}

Supabase pooler with individual parameters and semantic search:

{
  "mcpServers": {
    "context-server": {
      "type": "stdio",
      "command": "uvx",
      "args": ["--python", "3.12", "mcp-context-server"],
      "env": {
        "STORAGE_BACKEND": "postgresql",
        "POSTGRESQL_HOST": "aws-0-[REGION].pooler.supabase.com",
        "POSTGRESQL_PORT": "5432",
        "POSTGRESQL_USER": "postgres.[PROJECT-REF]",
        "POSTGRESQL_PASSWORD": "your-actual-password",
        "POSTGRESQL_DATABASE": "postgres",
        "ENABLE_SEMANTIC_SEARCH": "true"
      }
    }
  }
}

Supabase direct connection string with semantic search enabled:

{
  "mcpServers": {
    "context-server": {
      "type": "stdio",
      "command": "uvx",
      "args": ["--python", "3.12", "mcp-context-server"],
      "env": {
        "STORAGE_BACKEND": "postgresql",
        "POSTGRESQL_CONNECTION_STRING": "postgresql://postgres:your-actual-password@db.[PROJECT_REF].supabase.co:5432/postgres",
        "ENABLE_SEMANTIC_SEARCH": "true"
      }
    }
  }
}
Note: your database password is sensitive information; do not share it or commit it to version control.

Alternatives

Airweave
Airweave is an open-source context retrieval layer for AI agents and RAG systems. It connects to and synchronizes data from various applications, tools, and databases, and provides relevant, real-time, multi-source contextual information to AI agents through a unified search interface.
Python
14.2K
5 points
Vestige
Vestige is an AI memory engine based on cognitive science. By implementing 29 neuroscience-inspired modules such as prediction-error gating, FSRS-6 spaced repetition, and memory dreaming, it provides long-term memory capabilities for AI. It includes a 3D visualization dashboard and 21 MCP tools, runs completely locally, and does not require the cloud.
Rust
9.7K
4.5 points
Moltbrain
MoltBrain is a long-term memory layer plugin designed for OpenClaw, MoltBook, and Claude Code, capable of automatically learning and recalling project context, providing intelligent search, observation recording, analysis statistics, and persistent storage functions.
TypeScript
9.3K
4.5 points
Bm.md
A feature-rich Markdown typesetting tool that supports multiple style themes and platform adaptation, providing real-time editing preview, image export, and API integration capabilities
TypeScript
16.2K
5 points
Security Detections MCP
Security Detections MCP is a server based on the Model Context Protocol that allows LLMs to query a unified security detection rule database covering Sigma, Splunk ESCU, Elastic, and KQL formats. The latest version 3.0 is upgraded to an autonomous detection engineering platform that can automatically extract TTPs from threat intelligence, analyze coverage gaps, generate SIEM-native format detection rules, run tests, and verify. The project includes over 71 tools, 11 pre-built workflow prompts, and a knowledge graph system, supporting multiple SIEM platforms.
TypeScript
6.4K
4 points
Paperbanana
Python
9.2K
5 points
Better Icons
An MCP server and CLI tool that provides search and retrieval of over 200,000 icons, supports more than 150 icon libraries, and helps AI assistants and developers quickly obtain and use icons.
TypeScript
8.5K
4.5 points
Assistant Ui
assistant-ui is an open-source TypeScript/React library for quickly building production-grade AI chat interfaces, providing composable UI components, streaming responses, accessibility support, and compatibility with multiple AI backends and models.
TypeScript
9.1K
5 points
Markdownify MCP
Markdownify is a multi-functional file conversion service that supports converting multiple formats such as PDFs, images, audio, and web page content into Markdown format.
TypeScript
37.5K
5 points
Notion Api MCP
Certified
A Python-based MCP Server that provides advanced to-do list management and content organization functions through the Notion API, enabling seamless integration between AI models and Notion.
Python
23.8K
4.5 points
Gitlab MCP Server
Certified
The GitLab MCP server is a project based on the Model Context Protocol that provides a comprehensive toolset for interacting with GitLab accounts, including code review, merge request management, CI/CD configuration, and other functions.
TypeScript
26.1K
4.3 points
Duckduckgo MCP Server
Certified
The DuckDuckGo Search MCP Server provides web search and content scraping services for LLMs such as Claude.
Python
78.6K
4.3 points
Figma Context MCP
Framelink Figma MCP Server provides access to Figma design data for AI programming tools (such as Cursor). By simplifying the Figma API response, it helps AI more accurately achieve one-click conversion from design to code.
TypeScript
68.3K
4.5 points
Unity
Certified
UnityMCP is a Unity editor plugin that implements the Model Context Protocol (MCP), providing seamless integration between Unity and AI assistants, including real-time state monitoring, remote command execution, and logging.
C#
36.7K
5 points
Gmail MCP Server
A Gmail automatic authentication MCP server designed for Claude Desktop, supporting Gmail management through natural language interaction, including complete functions such as sending emails, label management, and batch operations.
TypeScript
23.0K
4.5 points
Context7
Context7 MCP is a service that provides real-time, version-specific documentation and code examples for AI programming assistants. It is directly integrated into prompts through the Model Context Protocol to solve the problem of LLMs using outdated information.
TypeScript
104.0K
4.7 points
AIBase
Zhiqi Future, Your AI Solution Think Tank
© 2026 AIBase