Discover Top MCP Servers - Improve Your AI Workflows

One-Stop MCP Server & Client Integration - 121,231 Services Listed


Found a total of 21 results.

Easy MCP
EasyMCP is a TypeScript library that simplifies the development of Model Context Protocol (MCP) servers. It provides a concise API and experimental decorator support, automatically handling type inference and template configuration.
TypeScript
8.8K
3 points
Allora MCP Server
The Allora MCP Server is an implementation of the Model Context Protocol (MCP) that retrieves machine learning inference data from the Allora network, enabling AI systems to seamlessly access Allora prediction-market data.
TypeScript
10.2K
2.5 points
Deepseek R1
An MCP server implementation for Deepseek R1, supporting integration with Claude Desktop and providing powerful language model inference services.
TypeScript
10.5K
2.5 points
MCP Llm Sandbox
mcp-scaffold is a development sandbox for verifying Model Context Protocol (MCP) servers, providing support for local LLMs (such as LLaMA 7B) and cloud inference, and including a chat interface and reference architecture.
TypeScript
8.1K
2.5 points
MCP Server Deepseek R1
An MCP server implementation for Deepseek R1, supporting the Node.js environment and providing powerful language model inference services.
TypeScript
8.9K
2.5 points
Groq MCP Server
Groq MCP Server is a service that provides fast model inference through the Model Context Protocol (MCP), supporting various functions such as text generation, voice conversion, image analysis, and batch processing.
Python
8.9K
2.5 points
Ghostshell
The Subconscious AI MCP Server is a tool based on the Model Context Protocol that allows users to run AI-driven conjoint experiments through AI assistants such as Claude and Cursor. It uses causal inference and synthetic population data for decision-making analysis, and provides a REST API and real-time updates.
Python
7.3K
2.5 points
MCP Server Replicate
A FastMCP server implementation built on the Replicate API, focused on providing resource-based access to AI model inference, with particular strength in image generation.
Python
9.8K
2.5 points
Auto Causal Inference
Auto Causal Inference is a project that uses large language models (LLMs) to automate causal inference. Users only need to specify the treatment and outcome variables, and the system completes the full analysis pipeline: variable role identification, causal graph construction, effect estimation, and model validation. The project provides two agent architectures (LangGraph and MCP) and is particularly suited to causal analysis in banking scenarios.
Python
7.7K
2.5 points
Rlm
The RLM MCP Server is a large-scale context-processing tool based on the Recursive Language Model pattern. It allows Claude Code to process texts of over 10 million tokens via external variables, avoiding feeding the full content into the prompt directly. Through loading, chunking, sub-querying, and aggregation, it supports automatic analysis and programmatic execution, and can connect to the Claude API or a local Ollama instance for free inference.
Python
3.9K
2.5 points
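The load → chunk → sub-query → aggregate flow described above can be sketched in plain TypeScript. This is an illustrative sketch of the general pattern, not the RLM server's actual API; `chunkText`, `recursiveQuery`, and the example callbacks are hypothetical names.

```typescript
// Hypothetical sketch of the chunk-and-aggregate pattern used to process
// contexts too large for a single prompt. Names are illustrative.

// Split a large text into fixed-size chunks (characters here stand in
// for tokens).
function chunkText(text: string, chunkSize: number): string[] {
  const chunks: string[] = [];
  for (let i = 0; i < text.length; i += chunkSize) {
    chunks.push(text.slice(i, i + chunkSize));
  }
  return chunks;
}

// Run a sub-query over each chunk, then aggregate the partial results.
// In the real pattern, subQuery would call an LLM (e.g. the Claude API
// or a local Ollama instance); here it is any function over a chunk.
function recursiveQuery<T>(
  text: string,
  chunkSize: number,
  subQuery: (chunk: string) => T,
  aggregate: (partials: T[]) => T,
): T {
  const partials = chunkText(text, chunkSize).map(subQuery);
  return aggregate(partials);
}

// Example: count occurrences of a word across a text too large for one
// prompt. The chunk size is aligned to the 23-character repeat unit so
// no match is split across a chunk boundary (a real implementation
// would use overlapping chunks instead).
const bigText = "alpha beta alpha gamma ".repeat(1000);
const count = recursiveQuery(
  bigText,
  230,
  (chunk) => (chunk.match(/alpha/g) ?? []).length,
  (parts) => parts.reduce((a, b) => a + b, 0),
);
console.log(count); // prints 2000
```

A production version would overlap chunk boundaries and cap recursion depth, since naive fixed-size splitting can cut a relevant passage in half.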
Forge MCP Server
Forge MCP Server is a tool that automatically optimizes PyTorch models into high-performance CUDA/Triton kernels through 32 parallel AI agents, increasing inference speed by up to 14x and supporting multiple MCP-compatible AI programming assistants.
TypeScript
7.9K
2.5 points
Deepseek R1 Reasoner
A locally running intelligent agent system that combines inference models with tool-calling models.
TypeScript
6.4K
2.5 points
Ai00 Rwkv Server
AI00 RWKV Server is an efficient inference API server based on the RWKV language model, supporting Vulkan acceleration and OpenAI-compatible interfaces.
Rust
8.9K
2 points
Fluent MCP
Fluent MCP is a modern framework for building Model Context Protocol (MCP) servers with intelligent inference capabilities. It supports AI integration and tool separation, offloads complex inference, and uses a dual-layer LLM architecture for efficient inference.
Python
9.5K
2 points
Sharp
SHARP is an AI model developed by Apple Research, capable of quickly converting a single 2D photo into a 3D Gaussian Splat representation, achieving real-time conversion from photos to interactive 3D scenes, with an inference time of less than one second.
Python
7.6K
2 points
Web Llm MCP Server
A local LLM inference MCP server based on Playwright and Web-LLM, realizing text generation, chat interaction, and model management functions through browser automation.
TypeScript
6.2K
2 points
Eshoplite
eShopLite is a lightweight .NET-based eCommerce platform that offers features such as semantic search, cloud platform integration, and intelligent inference models to help developers quickly build and scale online stores.
C#
8.5K
2 points
Vllm
vLLM is an efficient and easy-to-use LLM inference and serving library that supports multiple model architectures and optimization techniques, delivering high-performance LLM services.
Python
14.2K
2 points
Gemini Thinking MCP
An MCP server implementation that exposes the Gemini language model's inference capabilities to Claude Desktop.
TypeScript
9.2K
2 points
MCP Scaffold
mcp-scaffold is a sandbox environment for verifying the Model Context Protocol (MCP) server, supporting local and cloud LLM inference, providing a chat interface and reference architecture.
TypeScript
9.0K
2 points
Massive Context MCP
An MCP server based on the recursive language model pattern that processes ultra-large-scale contexts (over 10 million tokens) through chunking, sub-queries, and local inference, supporting automatic analysis, code execution, and security filtering.
Python
4.2K
2 points
AIBase
Zhiqi Future, Your AI Solution Think Tank
© 2026 AIBase