MCP Server Opensearch


This project implements an MCP server backed by OpenSearch. It provides semantic memory storage and retrieval for LLM applications such as Claude, and supports connecting AI tools to data sources through a standard protocol.

What is the MCP server?

The MCP server is an open-protocol tool that allows large language models (LLMs) to integrate with external data sources. With this OpenSearch-based MCP server implementation, you can store and retrieve semantic memories in a distributed search engine.

How to use the MCP server?

You can set up and run the MCP server in just a few steps. First, install the necessary dependencies, then configure the connection to your OpenSearch instance. Finally, start storing and retrieving memories.

Applicable scenarios

Suitable for applications that require efficient storage and retrieval of large amounts of text data, such as intelligent customer service, personalized recommendation systems, or knowledge graph construction.

Main features

Store memories
Store text data in the OpenSearch database for subsequent retrieval.
Retrieve memories
Retrieve relevant memory records from the OpenSearch database based on query conditions.
Support for asynchronous operations
Utilize the OpenSearch asynchronous client to improve data processing efficiency.
Advantages
Powerful distributed search capabilities
Easy to integrate with other systems
Efficient memory management mechanism
Limitations
Requires certain hardware resources to deploy the OpenSearch cluster
May be too complex for small-scale projects
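The "Support for asynchronous operations" feature above can be sketched roughly as follows. This is a minimal sketch, not the project's actual code: the index name, the `text` field schema, and the client wiring are illustrative assumptions, with `client` expected to be an AsyncOpenSearch instance from opensearch-py[async].

```python
import asyncio

async def store_memory(client, index: str, text: str) -> str:
    # `client` is assumed to be an AsyncOpenSearch instance from
    # opensearch-py[async]; the single-field document schema is illustrative.
    response = await client.index(index=index, body={"text": text})
    return response["_id"]

async def retrieve_memories(client, index: str, query: str, size: int = 5) -> list:
    # Full-text match against the assumed `text` field.
    response = await client.search(
        index=index,
        body={"query": {"match": {"text": query}}, "size": size},
    )
    return [hit["_source"]["text"] for hit in response["hits"]["hits"]]

# Coroutines like these can be scheduled concurrently, e.g.:
# await asyncio.gather(store_memory(c, "memories", "note A"),
#                      store_memory(c, "memories", "note B"))
```

Because both operations are coroutines, many store and search requests can be in flight at once, which is the efficiency gain the feature list refers to.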

How to use

Install dependencies
Ensure that Python and the uv tool are installed, then use pip to install the required dependency packages.
Start the server
Run the script to start the MCP server and specify the OpenSearch URL and index name.
Test the connection
Verify that the MCP server is successfully connected to the OpenSearch instance.
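The start-and-verify steps above might look like the following in practice. This is a sketch under assumptions: the URL, default port 9200, and connection keyword arguments mirror opensearch-py conventions and are not values taken from this project.

```python
from urllib.parse import urlparse

def parse_opensearch_url(url: str) -> dict:
    """Turn an OpenSearch URL into connection kwargs for opensearch-py."""
    parts = urlparse(url)
    return {
        "hosts": [{"host": parts.hostname, "port": parts.port or 9200}],
        "use_ssl": parts.scheme == "https",
    }

def check_connection(client) -> bool:
    # `client` is an opensearch-py OpenSearch instance; ping() returns
    # True when the cluster is reachable.
    return client.ping()

# Hypothetical local defaults:
config = parse_opensearch_url("http://localhost:9200")
# client = OpenSearch(**config)
# assert check_connection(client)
```

The commented-out lines require a running OpenSearch instance; `ping()` is the simplest way to confirm the server can actually reach the cluster before storing memories.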

Usage examples

Example of storing memories
Store a piece of text in OpenSearch for subsequent retrieval.
Example of retrieving memories
Retrieve relevant information from the stored memories based on specific keywords.
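The two examples above can be sketched as plain OpenSearch request bodies. The single `text` field schema and the `memories` index name are assumptions for illustration; the project's real document schema may differ.

```python
from typing import Optional

def build_store_request(text: str, metadata: Optional[dict] = None) -> dict:
    """Document body for indexing one memory (assumed schema: a `text` field)."""
    doc = {"text": text}
    if metadata:
        doc["metadata"] = metadata
    return doc

def build_retrieve_request(keywords: str, size: int = 5) -> dict:
    """Full-text search body matching the stored `text` field."""
    return {"query": {"match": {"text": keywords}}, "size": size}

# With an opensearch-py client these bodies would be sent as, e.g.:
# client.index(index="memories", body=build_store_request("Claude likes tea"))
# client.search(index="memories", body=build_retrieve_request("tea"))
```

Separating body construction from transport keeps the request shapes easy to test without a live cluster.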

Frequently Asked Questions

Why can't I install opensearch-py[async] in my environment?
How do I verify that the MCP server is working properly?

Related resources

MCP official website
Understand the basic knowledge of the MCP protocol and its application scenarios.
OpenSearch official documentation
Get detailed technical documentation about OpenSearch.
GitHub repository
Access the project's source code and more examples.

Installation

Copy the configuration command into your MCP client.
Note: your key is sensitive information; do not share it with anyone.
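An MCP client configuration entry typically looks roughly like the following JSON. The server name, command, arguments, and environment variable names here are illustrative assumptions, not this project's actual values; consult the project's own instructions for the real command.

```json
{
  "mcpServers": {
    "opensearch-memory": {
      "command": "uv",
      "args": ["run", "mcp-server-opensearch"],
      "env": {
        "OPENSEARCH_URL": "http://localhost:9200",
        "OPENSEARCH_INDEX": "memories"
      }
    }
  }
}
```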

Alternatives

Airweave
Airweave is an open-source context retrieval layer for AI agents and RAG systems. It connects and synchronizes data from various applications, tools, and databases, and provides relevant, real-time, multi-source contextual information to AI agents through a unified search interface.
Python
7.1K
5 points
Vestige
Vestige is an AI memory engine based on cognitive science. By implementing 29 neuroscience modules such as prediction error gating, FSRS-6 spaced repetition, and memory dreaming, it provides long-term memory capabilities for AI. It includes a 3D visualization dashboard and 21 MCP tools, runs completely locally, and does not require the cloud.
Rust
5.6K
4.5 points
Better Icons
An MCP server and CLI tool that provides search and retrieval of over 200,000 icons, supports more than 150 icon libraries, and helps AI assistants and developers quickly obtain and use icons.
TypeScript
6.8K
4.5 points
Haiku.rag
Haiku RAG is an intelligent retrieval-augmented generation system built on LanceDB, Pydantic AI, and Docling. It supports hybrid search, re-ranking, Q&A agents, and multi-agent research processes, and provides local-first document processing and MCP server integration.
Python
10.2K
5 points
Claude Context
Claude Context is an MCP plugin that provides in-depth context of the entire codebase for AI programming assistants through semantic code search. It supports multiple embedding models and vector databases to achieve efficient code retrieval.
TypeScript
16.7K
5 points
Acemcp
Acemcp is an MCP server for codebase indexing and semantic search, supporting automatic incremental indexing, multi-encoding file processing, .gitignore integration, and a Web management interface, helping developers quickly search for and understand code context.
Python
18.0K
5 points
MCP
The official Microsoft MCP server provides AI assistants with search and access to the latest Microsoft technical documentation.
15.0K
5 points
Cipher
Cipher is an open-source memory layer framework designed for programming AI agents. It integrates with various IDEs and AI coding assistants through the MCP protocol, providing core functions such as automatic memory generation, team memory sharing, and dual-system memory management.
TypeScript
0
5 points
Notion Api MCP
Certified
A Python-based MCP Server that provides advanced to-do list management and content organization functions through the Notion API, enabling seamless integration between AI models and Notion.
Python
20.4K
4.5 points
Markdownify MCP
Markdownify is a multi-functional file conversion service that supports converting multiple formats such as PDFs, images, audio, and web page content into Markdown format.
TypeScript
35.4K
5 points
Duckduckgo MCP Server
Certified
The DuckDuckGo Search MCP Server provides web search and content scraping services for LLMs such as Claude.
Python
72.2K
4.3 points
Gitlab MCP Server
Certified
The GitLab MCP server is a project based on the Model Context Protocol that provides a comprehensive toolset for interacting with GitLab accounts, including code review, merge request management, CI/CD configuration, and other functions.
TypeScript
24.6K
4.3 points
Unity
Certified
UnityMCP is a Unity editor plugin that implements the Model Context Protocol (MCP), providing seamless integration between Unity and AI assistants, including real-time state monitoring, remote command execution, and log functions.
C#
32.2K
5 points
Figma Context MCP
Framelink Figma MCP Server is a server that provides access to Figma design data for AI programming tools (such as Cursor). By simplifying the Figma API response, it helps AI more accurately achieve one-click conversion from design to code.
TypeScript
65.5K
4.5 points
Gmail MCP Server
A Gmail automatic authentication MCP server designed for Claude Desktop, supporting Gmail management through natural language interaction, including complete functions such as sending emails, label management, and batch operations.
TypeScript
22.1K
4.5 points
Context7
Context7 MCP is a service that provides real-time, version-specific documentation and code examples for AI programming assistants. It is directly integrated into prompts through the Model Context Protocol to solve the problem of LLMs using outdated information.
TypeScript
98.5K
4.7 points
AIBase
Zhiqi Future, Your AI Solution Think Tank
© 2026 AIBase