MCP Rag Server

mcp-rag-server is a Retrieval-Augmented Generation (RAG) server based on the Model Context Protocol (MCP). It provides relevant context for connected LLMs by indexing project documents. It uses ChromaDB and Ollama for local storage and embedding generation, supports multiple file formats, and can be quickly deployed using Docker.

What is MCP RAG Server?

MCP RAG Server is a server based on the Model Context Protocol (MCP), specifically designed to enhance the capabilities of large language models. It provides relevant context information for LLMs by automatically indexing your project files, enabling them to generate more accurate and targeted responses.

How to use MCP RAG Server?

You can easily deploy the server and its dependencies (ChromaDB and Ollama) using Docker Compose. After deployment, your MCP client (such as the VS Code plugin) can connect to the server and automatically gain document retrieval capabilities.
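A Compose deployment of the three services might look like the sketch below. The image names, ports, and environment variable names here are assumptions for illustration, not the project's actual docker-compose.yml.

```yaml
services:
  chromadb:
    image: chromadb/chroma        # vector store (image name assumed)
    ports:
      - "8000:8000"
  ollama:
    image: ollama/ollama          # local embedding runtime
    ports:
      - "11434:11434"
  mcp-rag-server:
    build: .                      # build the server from this repository
    depends_on:
      - chromadb
      - ollama
    environment:
      CHROMA_URL: http://chromadb:8000   # variable names are assumptions
      OLLAMA_URL: http://ollama:11434
```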

Use cases

It is particularly suitable for developers and teams who need to provide project-specific knowledge bases for locally running LLMs, enhancing the accuracy of model responses while maintaining data privacy.

Main features

Automatic indexing
Automatically scan the project directory and index supported file types (.txt, .md, code files, etc.)
Smart chunking
Perform hierarchical chunking on Markdown files, distinguishing between text and code blocks
Local processing
Use local ChromaDB to store vectors and Ollama to generate embeddings, ensuring data privacy
MCP integration
Provide RAG capabilities as a standard MCP tool, and can be seamlessly integrated with various MCP clients
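The hierarchical Markdown chunking described above can be sketched roughly as follows. This is a minimal illustration of separating fenced code blocks from prose and splitting prose by headings, not the server's actual implementation.

```python
import re

def chunk_markdown(text):
    """Split Markdown into chunks, keeping fenced code blocks separate
    from prose so each chunk type can be embedded appropriately."""
    chunks = []
    # Split on fenced code blocks, keeping the fences via the capture group.
    parts = re.split(r"(```.*?```)", text, flags=re.DOTALL)
    for part in parts:
        part = part.strip()
        if not part:
            continue
        if part.startswith("```"):
            chunks.append({"type": "code", "content": part})
        else:
            # Further split prose at headings to keep chunks topical.
            for section in re.split(r"\n(?=#)", part):
                if section.strip():
                    chunks.append({"type": "text", "content": section.strip()})
    return chunks

doc = "# Title\nIntro paragraph.\n```python\nprint('hi')\n```\n## Usage\nMore text."
for c in chunk_markdown(doc):
    print(c["type"])
```

Real implementations also track heading hierarchy and enforce a maximum chunk size; this sketch shows only the text/code separation the feature list describes.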
Advantages
Designed specifically for the MCP ecosystem, easy to integrate
Local-first design to protect data privacy
Automatically index project files, reducing configuration work
Built on Genkit, making it easy to extend
Limitations
Currently, the chunking process for code files is relatively basic
Does not support complex file formats such as PDF
Performance benchmark data is not yet complete

How to use

Install Docker
Ensure that Docker Desktop or Docker Engine is installed
Clone the repository
Get the server source code
Start the service
Use Docker Compose to start the server and its dependencies
Download the embedding model
You need to download the default embedding model when running for the first time
Configure the client
Configure your MCP client to connect to this server
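In shell form, the steps above might look like the following. The repository URL and embedding model name are placeholders, not values confirmed by this page; check the project README for the actual ones.

```sh
# 1. Clone the repository (URL is a placeholder)
git clone https://github.com/<owner>/mcp-rag-server.git
cd mcp-rag-server

# 2. Start the server and its dependencies (ChromaDB, Ollama)
docker compose up -d

# 3. Download the default embedding model on first run
#    (model name is an assumption)
docker compose exec ollama ollama pull nomic-embed-text
```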

Usage examples

Code document query
When developers ask about specific APIs in the project, the server can automatically provide relevant document fragments
Project knowledge retrieval
Answer questions about project architecture and design decisions
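Conceptually, both examples rest on embedding similarity: document fragments and the query are embedded, and the nearest fragments are returned as context. The toy sketch below uses hand-made vectors and cosine similarity purely to illustrate the idea; the real server gets embeddings from Ollama and stores them in ChromaDB.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy "embeddings" for indexed fragments (hand-made for illustration).
index = {
    "auth API docs":      [0.9, 0.1, 0.0],
    "build instructions": [0.1, 0.8, 0.1],
    "architecture notes": [0.0, 0.2, 0.9],
}

def retrieve(query_vec, k=1):
    """Return the k fragment names most similar to the query embedding."""
    ranked = sorted(index.items(),
                    key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return [name for name, _ in ranked[:k]]

# A query about the auth API should surface the auth fragment.
print(retrieve([0.85, 0.15, 0.05]))  # → ['auth API docs']
```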

Frequently Asked Questions

Which file types will be indexed?
How to exclude certain directories from being indexed?
Can the embedding model be changed?
Where is the data stored?

Related resources

Model Context Protocol official website
Official documentation of the MCP protocol
Google Genkit
Documentation of the Genkit framework
ChromaDB official website
Documentation of the vector database
Ollama official website
Local LLM running environment
GitHub repository
Project source code

Installation

Copy the following command into your client's configuration.
Note: your key is sensitive information; do not share it with anyone.
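A client configuration entry might look like the following sketch. The server name, URL, and port are assumptions for illustration, not this project's documented values; consult the project README for the real entry.

```json
{
  "mcpServers": {
    "mcp-rag-server": {
      "url": "http://localhost:3000"
    }
  }
}
```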

Alternatives

Airweave
Airweave is an open-source context retrieval layer for AI agents and RAG systems. It connects and synchronizes data from various applications, tools, and databases, and provides relevant, real-time, multi-source contextual information to AI agents through a unified search interface.
Python
6.1K
5 points
Vestige
Vestige is an AI memory engine based on cognitive science. By implementing 29 neuroscience modules such as prediction error gating, FSRS-6 spaced repetition, and memory dreaming, it provides long-term memory capabilities for AI. It includes a 3D visualization dashboard and 21 MCP tools, runs completely locally, and does not require the cloud.
Rust
5.5K
4.5 points
Better Icons
An MCP server and CLI tool that provides search and retrieval of over 200,000 icons, supports more than 150 icon libraries, and helps AI assistants and developers quickly obtain and use icons.
TypeScript
6.7K
4.5 points
Assistant Ui
assistant-ui is an open-source TypeScript/React library for quickly building production-grade AI chat interfaces, providing composable UI components, streaming responses, accessibility, etc., and supporting multiple AI backends and models.
TypeScript
7.4K
5 points
Apify MCP Server
The Apify MCP Server is a tool based on the Model Context Protocol (MCP) that allows AI assistants to extract data from websites such as social media, search engines, and e-commerce through thousands of ready-to-use crawlers, scrapers, and automation tools (Apify Actors). It supports OAuth and Skyfire proxy payment and can be integrated into MCP clients such as Claude and VS Code through HTTPS endpoints or local stdio.
TypeScript
7.6K
5 points
Rsdoctor
Rsdoctor is a build analysis tool specifically designed for the Rspack ecosystem, fully compatible with webpack. It provides visual build analysis, multi-dimensional performance diagnosis, and intelligent optimization suggestions to help developers improve build efficiency and engineering quality.
TypeScript
10.5K
5 points
Next Devtools MCP
The Next.js development tools MCP server provides Next.js development tools and utilities for AI programming assistants such as Claude and Cursor, including runtime diagnostics, development automation, and document access functions.
TypeScript
10.8K
5 points
Testkube
Testkube is a test orchestration and execution framework for cloud-native applications, providing a unified platform to define, run, and analyze tests. It supports existing testing tools and Kubernetes infrastructure.
Go
6.5K
5 points
Notion Api MCP
Certified
A Python-based MCP Server that provides advanced to-do list management and content organization functions through the Notion API, enabling seamless integration between AI models and Notion.
Python
20.4K
4.5 points
Gitlab MCP Server
Certified
The GitLab MCP server is a project based on the Model Context Protocol that provides a comprehensive toolset for interacting with GitLab accounts, including code review, merge request management, CI/CD configuration, and other functions.
TypeScript
25.5K
4.3 points
Markdownify MCP
Markdownify is a multi-functional file conversion service that supports converting multiple formats such as PDFs, images, audio, and web page content into Markdown format.
TypeScript
35.4K
5 points
Duckduckgo MCP Server
Certified
The DuckDuckGo Search MCP Server provides web search and content scraping services for LLMs such as Claude.
Python
72.2K
4.3 points
Unity
Certified
UnityMCP is a Unity editor plugin that implements the Model Context Protocol (MCP), providing seamless integration between Unity and AI assistants, including real-time state monitoring, remote command execution, and logging.
C#
32.2K
5 points
Figma Context MCP
Framelink Figma MCP Server is a server that provides access to Figma design data for AI programming tools (such as Cursor). By simplifying the Figma API response, it helps AI more accurately achieve one-click conversion from design to code.
TypeScript
64.4K
4.5 points
Gmail MCP Server
A Gmail automatic authentication MCP server designed for Claude Desktop, supporting Gmail management through natural language interaction, including complete functions such as sending emails, label management, and batch operations.
TypeScript
21.0K
4.5 points
Minimax MCP Server
The MiniMax Model Context Protocol (MCP) is an official server that supports interaction with powerful text-to-speech, video/image generation APIs, and is suitable for various client tools such as Claude Desktop and Cursor.
Python
47.7K
4.8 points