Discover Top MCP Servers - Improve Your AI Workflows

One-Stop MCP Server & Client Integration - 121,231 Services Listed


Found a total of 16 results.

Easy MCP
EasyMCP is a TypeScript library that simplifies the development of Model Context Protocol (MCP) servers. It provides a concise API and experimental decorator support, automatically handling type inference and template configuration.
TypeScript
5.9K
3 points
Allora MCP Server
The Allora MCP Server is an implementation based on the Model Context Protocol (MCP) that fetches machine learning inference data from the Allora network, enabling AI systems to seamlessly access Allora prediction market data.
TypeScript
9.0K
2.5 points
Deepseek R1
An MCP server implementation for Deepseek R1, supporting integration with Claude Desktop and providing powerful language model inference services.
TypeScript
6.2K
2.5 points
MCP Server Deepseek R1
An MCP server implementation for Deepseek R1, supporting the Node.js environment and providing powerful language model inference services.
TypeScript
8.9K
2.5 points
MCP Llm Sandbox
mcp-scaffold is a development sandbox for validating Model Context Protocol (MCP) servers, supporting both local LLMs (such as LLaMA 7B) and cloud inference, and including a chat interface and reference architecture.
TypeScript
7.4K
2.5 points
Groq MCP Server
Groq MCP Server is a service that provides fast model inference through the Model Context Protocol (MCP), supporting various functions such as text generation, voice conversion, image analysis, and batch processing.
Python
9.8K
2.5 points
MCP Server Replicate
A FastMCP server implementation based on the Replicate API, providing resource-based access to AI model inference, with a particular focus on image generation.
Python
8.0K
2.5 points
Auto Causal Inference
Auto Causal Inference is a project that uses large language models (LLMs) to automate causal inference. Users only need to specify the treatment and outcome variables, and the system completes the full analysis pipeline: variable role identification, causal graph construction, effect estimation, and model validation. The project offers two agent architectures (LangGraph and MCP) and is particularly suited to causal analysis in banking scenarios.
Python
7.9K
2.5 points
Deepseek R1 Reasoner
A locally running intelligent agent system that combines reasoning models with tool-calling models
TypeScript
6.2K
2.5 points
Fluent MCP
Fluent MCP is a modern framework for building Model Context Protocol (MCP) servers with intelligent reasoning capabilities. It supports AI integration, tool separation, and offloading of complex reasoning, using a dual-layer LLM architecture for efficient inference.
Python
7.7K
2 points
Ai00 Rwkv Server
AI00 RWKV Server is an efficient inference API server based on the RWKV language model, supporting Vulkan acceleration and OpenAI-compatible interfaces.
Rust
6.7K
2 points
Web Llm MCP Server
A local LLM inference MCP server based on Playwright and Web-LLM, implementing text generation, chat interaction, and model management via browser automation.
TypeScript
5.5K
2 points
Eshoplite
eShopLite is a lightweight .NET-based eCommerce platform that offers features such as semantic search, cloud platform integration, and intelligent inference models to help developers quickly build and scale online stores.
C#
8.1K
2 points
Vllm
vLLM is an efficient and easy-to-use LLM inference and serving library that supports multiple model architectures and optimization techniques, providing high-performance LLM services.
Python
11.9K
2 points
Gemini Thinking MCP
An MCP server implementation that exposes the Gemini language model's reasoning capabilities to Claude Desktop
TypeScript
6.7K
2 points
MCP Scaffold
mcp-scaffold is a sandbox environment for validating Model Context Protocol (MCP) servers, supporting local and cloud LLM inference and providing a chat interface and reference architecture.
TypeScript
7.1K
2 points
AIBase
Zhiqi Future, Your AI Solution Think Tank
© 2025 AIBase