Discover Top MCP Servers - Improve Your AI Workflows
One-Stop MCP Server & Client Integration - 121,231 Services Listed
Found a total of 7 results.

Deepseek R1
An MCP server implementation for Deepseek R1, supporting integration with Claude Desktop and providing powerful language model inference services.
TypeScript
9.4K
2.5 points

MCP Server Deepseek R1
An MCP server implementation for Deepseek R1, supporting the Node.js environment and providing powerful language model inference services.
TypeScript
8.8K
2.5 points
Auto Causal Inference
Auto Causal Inference is a project that uses large language models (LLMs) to perform causal inference automatically. Users only need to specify the treatment and outcome variables, and the system completes the full-process analysis: variable role identification, causal graph construction, effect estimation, and model validation. The project provides two agent architectures (LangGraph and MCP) and is particularly suited to causal analysis in banking scenarios.
Python
7.7K
2.5 points

Rlm
The RLM MCP Server is a large-scale context-processing tool based on the Recursive Language Model pattern. It lets Claude Code process texts of over 10 million tokens via external variables, avoiding feeding massive content directly into the prompt. Through a load, chunk, sub-query, and aggregate workflow, it supports automatic analysis and programmatic execution, and can connect to the Claude API or a local Ollama instance for free inference.
Python
3.9K
2.5 points
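The load → chunk → sub-query → aggregate workflow behind this recursive-language-model pattern can be sketched roughly as follows. This is a minimal illustration, not the server's actual code: `answer_chunk` is a hypothetical placeholder standing in for a real model call (e.g. the Claude API or a local Ollama model), and the chunk size is arbitrary.

```python
def chunk(text: str, size: int) -> list[str]:
    """Split the loaded text into fixed-size chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def answer_chunk(question: str, chunk_text: str) -> str:
    # Placeholder sub-query: a real server would send the question
    # plus this chunk to an LLM (Claude API or local Ollama).
    return f"partial answer from {len(chunk_text)} chars"

def recursive_query(question: str, text: str, size: int = 1000) -> str:
    # Answer the question against each chunk independently ...
    partials = [answer_chunk(question, c) for c in chunk(text, size)]
    combined = "\n".join(partials)
    # ... then aggregate, recursing until the combined partial
    # answers fit within a single chunk of context.
    if len(combined) > size:
        return recursive_query(question, combined, size)
    return combined

result = recursive_query("What is discussed?", "x" * 3500, size=1000)
```

The key point is that the full 10M-token text never enters a single prompt; only chunk-sized pieces and their aggregated summaries do.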

Ai00 Rwkv Server
AI00 RWKV Server is an efficient inference API server based on the RWKV language model, supporting Vulkan acceleration and OpenAI-compatible interfaces.
Rust
9.9K
2 points

Gemini Thinking MCP
An MCP server implementation that provides the inference capabilities of the Gemini language model to Claude Desktop.
TypeScript
8.2K
2 points

Massive Context MCP
An MCP server based on the recursive language model pattern, which processes ultra-large-scale contexts (over 10 million tokens) through chunking, sub-queries, and local inference, supporting automatic analysis, code execution, and security filtering.
Python
4.2K
2 points