Discover Top MCP Servers - Improve Your AI Workflows

One-Stop MCP Server & Client Integration - 121,231 Services Listed


Found a total of 42 results related to

MCP Reasoner
MCP Reasoner is a reasoning-enhancement tool for Claude Desktop that provides two search strategies, Beam Search and MCTS, and improves complex problem solving through experimental algorithms.
JavaScript
10.1K
3.5 points
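Beam search, one of the two strategies this entry mentions, keeps only the top-k partial solutions at each step. The sketch below is a generic toy illustration, not MCP Reasoner's actual implementation; `expand` and `score` are hypothetical stand-ins for a step generator and an evaluator.

```python
# Toy beam search over candidate "reasoning paths".
# expand() and score() are illustrative stubs, not MCP Reasoner's real logic.

def expand(path):
    # Each step is just a digit appended to the path in this toy example.
    return [path + [d] for d in range(3)]

def score(path):
    # Pretend higher digit sums mean more promising paths.
    return sum(path)

def beam_search(start, beam_width=2, depth=3):
    beam = [start]
    for _ in range(depth):
        candidates = [p for path in beam for p in expand(path)]
        # Keep only the beam_width highest-scoring partial paths.
        beam = sorted(candidates, key=score, reverse=True)[:beam_width]
    return beam[0]

best = beam_search([])
print(best)  # -> [2, 2, 2]
```

A wider beam trades compute for a lower chance of discarding the path that would have scored best later, which is why such tools expose the beam width as a parameter.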
RAT (Retrieval Augmented Thinking)
A two-stage AI dialogue service that combines DeepSeek reasoning with Claude generation
TypeScript
27.2K
3 points
Pageindex MCP
PageIndex MCP is a reasoning-based, vectorless RAG system. Through the MCP protocol it exposes a tree-structured index of documents to LLMs, letting platforms such as Claude retrieve information from PDF documents through structural reasoning, like a human expert, without a vector database.
TypeScript
10.2K
3 points
Easy MCP
EasyMCP is a TypeScript library that simplifies the development of Model Context Protocol (MCP) servers. It provides a concise API and experimental decorator support, automatically handling type inference and template configuration.
TypeScript
8.9K
3 points
Omni Nli
Omni-NLI is a self-hosted, multi-interface (REST and MCP) server focused on natural language inference: it verifies whether one text supports, contradicts, or is neutral toward another, helping reduce AI hallucination and improve application reliability.
Python
5.2K
2.5 points
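Natural language inference assigns one of three labels to a premise–hypothesis pair. The toy labeler below only illustrates that label scheme; a real NLI system such as Omni-NLI uses a trained model, and `toy_nli` with its negation heuristic is entirely hypothetical.

```python
# Toy three-way NLI labeler illustrating the entailment/contradiction/neutral
# scheme. A real NLI server uses a trained model, not string matching.

def toy_nli(premise: str, hypothesis: str) -> str:
    p = premise.lower().split()
    h = hypothesis.lower().split()
    negators = {"not", "never", "no"}
    p_neg = bool(negators & set(p))
    h_neg = bool(negators & set(h))
    # Compare content words with negation words stripped out.
    if [w for w in p if w not in negators] == [w for w in h if w not in negators]:
        return "entailment" if p_neg == h_neg else "contradiction"
    return "neutral"

print(toy_nli("the cat is asleep", "the cat is asleep"))      # entailment
print(toy_nli("the cat is asleep", "the cat is not asleep"))  # contradiction
print(toy_nli("the cat is asleep", "the dog barked"))         # neutral
```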
Allora MCP Server
The Allora MCP Server is a Model Context Protocol (MCP) implementation that retrieves machine learning inference data from the Allora network, giving AI systems seamless access to Allora prediction market data.
TypeScript
9.2K
2.5 points
Deepseek R1
An MCP server implementation for Deepseek R1, supporting integration with Claude Desktop and providing powerful language model inference services.
TypeScript
10.5K
2.5 points
MCP Server Deepseek R1
An MCP server implementation for Deepseek R1, supporting the Node.js environment and providing powerful language model inference services.
TypeScript
9.0K
2.5 points
MCP Agent Tool Adapter
The MCP Agent Tool Adapter project enables modular tool invocation through the MCP protocol, supports two agent frameworks, Google ADK and LangGraph, and provides dynamic inference and tool planning capabilities.
Python
10.4K
2.5 points
MCP Llm Sandbox
mcp-scaffold is a development sandbox for verifying Model Context Protocol (MCP) servers, with support for local LLMs (such as LLaMA 7B) and cloud inference, including a chat interface and a reference architecture.
TypeScript
8.2K
2.5 points
Deepseek Thinker MCP
The Deepseek Thinker MCP Server is an MCP service that exposes DeepSeek's reasoning output, supports both the OpenAI API and local Ollama mode, and can be integrated into AI clients.
TypeScript
8.4K
2.5 points
Groq MCP Server
Groq MCP Server is a service that provides fast model inference through the Model Context Protocol (MCP), supporting various functions such as text generation, voice conversion, image analysis, and batch processing.
Python
8.9K
2.5 points
MCP Server Replicate
A FastMCP server implementation built on the Replicate API, focused on resource-based access to AI model inference, with particular strength in image generation.
Python
8.8K
2.5 points
Ghostshell
The Subconscious AI MCP Server is a Model Context Protocol tool that lets users run AI-driven conjoint experiments through AI assistants such as Claude and Cursor. It uses causal inference and synthetic population data for decision analysis and provides a REST API and real-time updates.
Python
6.4K
2.5 points
Chain Of Draft
An MCP server implementing the Chain of Draft inference method, significantly reducing token usage while maintaining accuracy
Python
7.7K
2.5 points
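Chain of Draft is a prompting strategy that keeps each intermediate reasoning step terse (a few words rather than a full sentence), which is how it cuts token usage. The prompt builder below is a generic sketch of that idea; the wording and the `chain_of_draft_prompt` function are illustrative, not this server's actual prompt.

```python
# Generic Chain-of-Draft style prompt builder: request minimal per-step
# drafts instead of verbose chain-of-thought sentences to save tokens.
# The exact wording here is an assumption, not this MCP server's prompt.

def chain_of_draft_prompt(question: str, max_words_per_step: int = 5) -> str:
    return (
        "Think step by step, but keep a minimal draft for each step, "
        f"with at most {max_words_per_step} words per step. "
        "Return the final answer after '####'.\n\n"
        f"Question: {question}"
    )

print(chain_of_draft_prompt("What is 17 * 24?"))
```

The word cap per step is the lever: tightening it reduces tokens further but risks dropping information the final answer needs.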
Auto Causal Inference
Auto Causal Inference uses large language models (LLMs) to automate causal inference. Users only specify the treatment and outcome variables, and the system completes the full analysis: variable role identification, causal graph construction, effect estimation, and model validation. The project provides two agent architectures (LangGraph and MCP) and is particularly suited to causal analysis in banking scenarios.
Python
7.9K
2.5 points
Forge MCP Server
Forge MCP Server is a tool that automatically optimizes PyTorch models into high-performance CUDA/Triton kernels through 32 parallel AI agents, increasing inference speed by up to 14x and supporting multiple MCP-compatible AI programming assistants.
TypeScript
8.1K
2.5 points
Rlm
The RLM MCP Server is a large-context processing tool based on the Recursive Language Model pattern. It lets Claude Code process texts of over 10 million tokens via external variables, avoiding feeding the massive content directly into the prompt. Through load, chunk, sub-query, and aggregate stages it supports automatic analysis and programmatic execution, and it can connect to the Claude API or a local Ollama instance for free inference.
Python
5.1K
2.5 points
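The load → chunk → sub-query → aggregate flow described above can be sketched as a plain pipeline. Everything below is a toy stand-in: `answer_chunk` is a stub where the real server would call an LLM (Claude API or local Ollama), and counting a keyword substitutes for an actual sub-query.

```python
# Sketch of a recursive-language-model style pipeline: chunk a large text,
# run a sub-query per chunk, aggregate the partial answers.

def chunk(text: str, size: int = 20) -> list[str]:
    # Fixed-size character chunks; a real system would chunk more carefully.
    return [text[i:i + size] for i in range(0, len(text), size)]

def answer_chunk(keyword: str, piece: str) -> int:
    # Stub sub-query: a real implementation asks an LLM about this piece.
    return piece.count(keyword)

def aggregate(partials: list[int]) -> int:
    # Stub aggregation: a real RLM merges partial answers into a final one.
    return sum(partials)

def rlm_query(keyword: str, corpus: str) -> int:
    return aggregate([answer_chunk(keyword, p) for p in chunk(corpus)])

corpus = "token soup " * 10
# Naive chunking splits some matches across chunk boundaries (8 found, not 10),
# which is exactly why real pipelines overlap chunks or merge at boundaries.
print(rlm_query("token", corpus))  # -> 8
```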
Clarifai MCP Server Local
This project is an unofficial Clarifai MCP server that acts as a local bridge to connect to the Clarifai API, supporting functions such as image generation, inference, and search, and interacting with the client through the standard MCP protocol.
Go
5.1K
2.5 points
Deepseek Reasoner
A DeepSeek inference service project built by Claude
Python
10.4K
2.5 points
Deepseek Claude MCP Server
Enhances Claude's ability to handle complex tasks by integrating DeepSeek R1's reasoning engine
Python
9.4K
2.5 points
Deepseek R1 Reasoner
A locally run intelligent agent system that combines a reasoning model with a tool-calling model
TypeScript
7.5K
2.5 points
Ai00 Rwkv Server
AI00 RWKV Server is an efficient inference API server based on the RWKV language model, supporting Vulkan acceleration and OpenAI-compatible interfaces.
Rust
9.0K
2 points
Fluent MCP
Fluent MCP is a modern framework for building Model Context Protocol (MCP) servers with intelligent reasoning capabilities. It supports AI integration, tool separation, and offloading of complex reasoning, using a dual-layer LLM architecture for efficiency.
Python
9.6K
2 points
Sharp
SHARP is an AI model developed by Apple Research, capable of quickly converting a single 2D photo into a 3D Gaussian Splat representation, achieving real-time conversion from photos to interactive 3D scenes, with an inference time of less than one second.
Python
7.7K
2 points
Moyu6027 Deepseek MCP Server
DeepSeek MCP Server enhances Claude's reasoning capabilities by integrating DeepSeek R1's advanced reasoning engine, enabling it to handle complex multi-step reasoning tasks.
Python
8.9K
2 points
Web Llm MCP Server
A local LLM inference MCP server based on Playwright and Web-LLM, realizing text generation, chat interaction, and model management functions through browser automation.
TypeScript
6.3K
2 points
Eshoplite
eShopLite is a lightweight .NET-based eCommerce platform that offers features such as semantic search, cloud platform integration, and intelligent inference models to help developers quickly build and scale online stores.
C#
7.6K
2 points
Harshj23 Deepseek Claude MCP Server
Enhances Claude's ability to handle complex reasoning tasks by integrating DeepSeek R1's reasoning engine, providing an efficient and accurate multi-step reasoning solution.
Python
6.7K
2 points
Deepseek R1 Reasoning
The DeepSeek MCP server enhances Claude's reasoning ability by integrating the advanced inference engine of DeepSeek R1, enabling it to handle complex multi-step reasoning tasks.
Python
9.3K
2 points
Deepseek MCP Server
The DeepSeek MCP Server enhances Claude's reasoning capabilities by integrating DeepSeek R1's advanced reasoning engine, enabling it to handle complex multi-step reasoning tasks.
Python
6.5K
2 points
Langgraph MCP Agents
The MCP Agent Tool Adapter implements modular tool invocation through the MCP protocol and supports dynamic reasoning and tool use with two agent frameworks, Google ADK and LangGraph.
Python
10.5K
2 points
X402engine MCP
x402engine-mcp is an MCP server that gives AI agents HTTP 402 micropayment access to 38 pay-per-use APIs, supporting payment in USDC and USDm and covering services such as LLM inference, image generation, code execution, audio processing, and blockchain data.
JavaScript
6.4K
2 points
Vllm
vLLM is an efficient and easy-to-use LLM inference and serving library that supports multiple model architectures and optimization techniques, providing high-performance LLM services.
Python
14.3K
2 points
Minirag MCP
MiniRAG-MCP is an MCP server wrapper built around the MiniRAG project, aiming to provide efficient and reliable Retrieval Augmented Generation (RAG) services for intelligent agent processes on local devices through client-managed LLM inference sampling.
Python
8.4K
2 points
RAT Retrieval Augmented Thinking MCP
An MCP service that combines the reasoning capabilities of DeepSeek R1 with the generation capabilities of Claude 3.5
TypeScript
9.1K
2 points
Gemini Thinking MCP
An MCP server implementation that provides the Gemini language model's reasoning capabilities to Claude Desktop
TypeScript
9.3K
2 points
MCP Scaffold
mcp-scaffold is a sandbox environment for verifying the Model Context Protocol (MCP) server, supporting local and cloud LLM inference, providing a chat interface and reference architecture.
TypeScript
9.0K
2 points
Intelliglow AI Voice MCP IoT Platform
IntelliGlow is a smart lighting system based on the MCP protocol that controls real smart bulbs through an AI assistant, supports voice commands, AI inference, and direct hardware control, and enables natural language interaction and intelligent lighting management.
Python
5.5K
2 points
Sensei MCP
Sensei MCP is a multi-role engineering tutor system that integrates 64 professional AI roles and provides engineering standards and suggestions through collaborative guidance. It can inject relevant engineering specifications before Claude's inference, supports multiple file types and context awareness, and has session memory and team collaboration functions.
Python
9.5K
2 points
AIBase
Zhiqi Future, Your AI Solution Think Tank
© 2026 AIBase