Ragdocs
A RAG service built on the Qdrant vector database and Ollama/OpenAI embeddings, providing semantic document search and management.
2.5 points
8.9K

What is the RagDocs MCP Server?

RagDocs MCP is a tool for managing and searching documents. It uses advanced embedding technology and vector databases to achieve efficient semantic search. Whether deployed locally or used in the cloud, it can help you quickly find the information you need.

How to use the RagDocs MCP Server?

You can start using the RagDocs MCP Server in just a few steps: install it, configure environment variables, start the service, and then add, query, and delete documents through the API.

Use Cases

RagDocs MCP is particularly suitable for enterprises, developers, and researchers who need efficient document management, such as organizing technical documents and building knowledge bases.

Main Features

Add Documents
Supports uploading documents and assigning metadata to them for easy subsequent management and retrieval.
Semantic Search
Quickly locate relevant content through natural language queries without the need for exact keyword matching.
Document List and Organization
View stored documents by category or chronological order, supporting pagination and sorting.
Delete Documents
Easily remove documents that are no longer needed to keep the database tidy.
Support for Multiple Embedding Models
Compatible with both Ollama (free) and OpenAI (paid) embedding methods to meet different needs.
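Semantic search compares embedding vectors rather than matching keywords. The following is a minimal illustration of the underlying idea only, not RagDocs' actual code; the toy three-dimensional "embeddings" stand in for the hundreds of dimensions a real Ollama or OpenAI model produces:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy embeddings: the query vector points in roughly the same
# direction as the setup guide, so it ranks highest.
query = [0.9, 0.1, 0.0]
docs = {
    "qdrant setup guide": [0.8, 0.2, 0.1],
    "holiday recipes":    [0.0, 0.1, 0.9],
}
best = max(docs, key=lambda name: cosine_similarity(query, docs[name]))
print(best)  # → qdrant setup guide
```

In a real deployment, Qdrant performs this nearest-neighbor comparison at scale over the stored document vectors.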
Advantages
Powerful semantic search that improves working efficiency.
Flexible choice of embedding models to suit diverse needs.
Open-source and easy to integrate into existing systems.
Supports both local deployment and cloud services, helping protect data privacy.
A free option (Ollama) keeps initial costs low.
Limitations
Large-scale document sets may require significant hardware resources.
The OpenAI embedding service incurs usage fees.
Depends on external services such as Qdrant; functionality may be affected if the network connection is interrupted.

How to Use

Install the RagDocs MCP Server
Run the following command to install the RagDocs MCP CLI tool globally: `npm install -g @mcpservers/ragdocs`.
Configure Environment Variables
Set the necessary environment variables, such as the Qdrant address and the embedding model type.
Start the Server
Start the RagDocs MCP service using Node.js: `node @mcpservers/ragdocs`.
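The three steps above amount to setting a couple of environment variables and launching the package with Node.js. As a sketch, the equivalent MCP client configuration can be generated programmatically; the variable names come from the Installation section of this page, and the values are placeholders:

```python
import json

# Environment expected by the server (names taken from the sample
# configurations on this page; values are placeholders).
env = {
    "QDRANT_URL": "http://127.0.0.1:6333",
    "EMBEDDING_PROVIDER": "ollama",  # or "openai", plus OPENAI_API_KEY
}

# The server is started with Node.js, as in the client configuration.
command = ["node", "@mcpservers/ragdocs"]

# Render the equivalent MCP client configuration block.
client_config = {
    "mcpServers": {
        "ragdocs": {
            "command": command[0],
            "args": command[1:],
            "env": env,
        }
    }
}
print(json.dumps(client_config, indent=2))
```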

Usage Examples

Example 1: Add a Document
Demonstrate how to add a new document to the RagDocs MCP Server.
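MCP servers receive tool invocations as JSON-RPC `tools/call` requests over stdio. The sketch below shows what such a request could look like; the tool name `add_document` and its argument names are assumptions for illustration, so check the server's `tools/list` response for the real schema:

```python
import json

# Hypothetical MCP "tools/call" request for adding a document.
# Tool and argument names are illustrative, not the documented API.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "add_document",
        "arguments": {
            "url": "https://example.com/guide.html",
            "content": "RagDocs stores documents as embedded vectors in Qdrant.",
            "metadata": {"category": "tutorial", "author": "docs-team"},
        },
    },
}
# The MCP stdio transport sends one JSON object per line.
line = json.dumps(request)
print(line)
```

In practice an MCP client (such as Claude Desktop) builds and sends these messages for you; the payload is shown only to make the document-plus-metadata shape concrete.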
Example 2: Search for Documents
Show how to find specific documents through semantic search.
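A semantic search follows the same `tools/call` pattern with a natural-language query. Again, the tool name `search_documents` and its parameters are illustrative assumptions rather than the server's documented API:

```python
import json

# Hypothetical semantic-search request: a natural-language query,
# with "limit" capping how many of the closest matches come back.
request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "search_documents",
        "arguments": {
            "query": "how do I configure the Qdrant URL?",
            "limit": 5,
        },
    },
}
print(json.dumps(request, indent=2))
```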

Frequently Asked Questions

How to choose an embedding model?
Does it support custom filter conditions?
How to back up my document data?

Related Resources

Official Documentation
Detailed installation guides and technical documentation.
Qdrant Official Website
Learn more about the Qdrant vector database.
Ollama GitHub
Explore the specific implementation of the Ollama embedding model.

Installation

Copy one of the following configurations into your MCP client. Local Qdrant with Ollama embeddings:
{
  "mcpServers": {
    "ragdocs": {
      "command": "node",
      "args": ["@mcpservers/ragdocs"],
      "env": {
        "QDRANT_URL": "http://127.0.0.1:6333",
        "EMBEDDING_PROVIDER": "ollama"
      }
    }
  }
}

Qdrant Cloud with an API key:
{
  "mcpServers": {
    "ragdocs": {
      "command": "node",
      "args": ["@mcpservers/ragdocs"],
      "env": {
        "QDRANT_URL": "https://your-cluster-url.qdrant.tech",
        "QDRANT_API_KEY": "your-qdrant-api-key",
        "EMBEDDING_PROVIDER": "ollama"
      }
    }
  }
}

Local Qdrant with OpenAI embeddings:
{
  "mcpServers": {
    "ragdocs": {
      "command": "node",
      "args": ["@mcpservers/ragdocs"],
      "env": {
        "QDRANT_URL": "http://127.0.0.1:6333",
        "EMBEDDING_PROVIDER": "openai",
        "OPENAI_API_KEY": "your-api-key"
      }
    }
  }
}
Note: your API key is sensitive information; do not share it with anyone.

Alternatives

Airweave
Airweave is an open-source context retrieval layer for AI agents and RAG systems. It connects and synchronizes data from various applications, tools, and databases, and provides relevant, real-time, multi-source contextual information to AI agents through a unified search interface.
Python
14.7K
5 points
Vestige
Vestige is an AI memory engine based on cognitive science. By implementing 29 neuroscience modules such as prediction error gating, FSRS-6 spaced repetition, and memory dreaming, it provides long-term memory capabilities for AI. It includes a 3D visualization dashboard and 21 MCP tools, runs completely locally, and does not require the cloud.
Rust
6.1K
4.5 points
Moltbrain
MoltBrain is a long-term memory layer plugin designed for OpenClaw, MoltBook, and Claude Code, capable of automatically learning and recalling project context, providing intelligent search, observation recording, analysis statistics, and persistent storage functions.
TypeScript
6.1K
4.5 points
Better Icons
An MCP server and CLI tool that provides search and retrieval of over 200,000 icons, supports more than 150 icon libraries, and helps AI assistants and developers quickly obtain and use icons.
TypeScript
8.7K
4.5 points
Haiku.rag
Haiku RAG is an intelligent retrieval-augmented generation system built on LanceDB, Pydantic AI, and Docling. It supports hybrid search, re-ranking, Q&A agents, multi-agent research processes, and provides local-first document processing and MCP server integration.
Python
15.4K
5 points
Claude Context
Claude Context is an MCP plugin that provides in-depth context of the entire codebase for AI programming assistants through semantic code search. It supports multiple embedding models and vector databases to achieve efficient code retrieval.
TypeScript
30.6K
5 points
Acemcp
Acemcp is an MCP server for codebase indexing and semantic search, supporting automatic incremental indexing, multi-encoding file processing, .gitignore integration, and a Web management interface, helping developers quickly search for and understand code context.
Python
25.0K
5 points
MCP
The official Microsoft MCP server provides AI assistants with search and access to the latest Microsoft technical documentation.
15.3K
5 points
Notion Api MCP
Certified
A Python-based MCP Server that provides advanced to-do list management and content organization functions through the Notion API, enabling seamless integration between AI models and Notion.
Python
22.7K
4.5 points
Markdownify MCP
Markdownify is a multi-functional file conversion service that supports converting multiple formats such as PDFs, images, audio, and web page content into Markdown format.
TypeScript
36.0K
5 points
Duckduckgo MCP Server
Certified
The DuckDuckGo Search MCP Server provides web search and content scraping services for LLMs such as Claude.
Python
75.3K
4.3 points
Gitlab MCP Server
Certified
The GitLab MCP server is a project based on the Model Context Protocol that provides a comprehensive toolset for interacting with GitLab accounts, including code review, merge request management, CI/CD configuration, and other functions.
TypeScript
25.8K
4.3 points
Unity
Certified
UnityMCP is a Unity editor plugin that implements the Model Context Protocol (MCP), providing seamless integration between Unity and AI assistants, including real-time state monitoring, remote command execution, and log functions.
C#
35.3K
5 points
Figma Context MCP
Framelink Figma MCP Server is a server that provides access to Figma design data for AI programming tools (such as Cursor). By simplifying the Figma API response, it helps AI more accurately achieve one-click conversion from design to code.
TypeScript
67.0K
4.5 points
Gmail MCP Server
A Gmail automatic authentication MCP server designed for Claude Desktop, supporting Gmail management through natural language interaction, including complete functions such as sending emails, label management, and batch operations.
TypeScript
23.1K
4.5 points
Context7
Context7 MCP is a service that provides real-time, version-specific documentation and code examples for AI programming assistants. It is directly integrated into prompts through the Model Context Protocol to solve the problem of LLMs using outdated information.
TypeScript
100.6K
4.7 points
AIBase
Zhiqi Future, Your AI Solution Think Tank
© 2026 AIBase