
Model Context Protocol (MCP) Demo with LangChain MCP Adapters and Ollama

Model Context Protocol (MCP) is an open-source protocol designed to let large language models (LLMs) interact seamlessly with external tools, services, and data sources through a standardized interface, simplifying integration and extending AI capabilities.

What is an MCP Server?

The MCP Server acts as a translator layer that allows LLMs to interact with external services (like math operations or weather data) in a standardized way. It converts service functionality into a format that LLMs can understand and use.
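As an illustration, a minimal math server can be written with the FastMCP helper from the official MCP Python SDK. This is a sketch: the tool names (add, multiply) and the file name math_server.py are examples, not anything required by the protocol.

```python
# math_server.py -- minimal MCP server exposing two math tools (illustrative sketch)
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Math")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

@mcp.tool()
def multiply(a: int, b: int) -> int:
    """Multiply two numbers."""
    return a * b

if __name__ == "__main__":
    # Serve over stdio so an MCP client can launch this script as a subprocess
    mcp.run(transport="stdio")
```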

How to use an MCP Server?

1. Start the desired MCP server (math or weather).
2. Connect your LLM application via the MCP client.
3. The server advertises its available tools.
4. Your LLM can now use these tools through standardized requests.
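A rough single-service client illustrating steps 2 to 4, using langchain-mcp-adapters with a local Ollama model through langchain-ollama. The file name math_server.py, the model name llama3.1, and the example prompt are assumptions, not fixed by the demo.

```python
# client.py -- connect an Ollama-backed agent to the math MCP server (sketch; names are assumptions)
import asyncio

from langchain_mcp_adapters.tools import load_mcp_tools
from langchain_ollama import ChatOllama
from langgraph.prebuilt import create_react_agent
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the math server as a subprocess and talk to it over stdio
server_params = StdioServerParameters(command="python", args=["math_server.py"])

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()                    # MCP handshake
            tools = await load_mcp_tools(session)         # tools advertised by the server
            agent = create_react_agent(ChatOllama(model="llama3.1"), tools)
            reply = await agent.ainvoke(
                {"messages": [{"role": "user", "content": "What is (3 + 5) x 12?"}]}
            )
            print(reply["messages"][-1].content)

if __name__ == "__main__":
    asyncio.run(main())
```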

Use Cases

- Building AI assistants that need real-world data
- Creating workflows that combine LLMs with existing APIs
- Developing applications where LLMs need to perform calculations
- Integrating LLMs with business databases securely

Key Features

- Standardized Interface: provides a consistent way for LLMs to interact with diverse services, similar to how REST standardized web services.
- Multi-Service Support: can connect to multiple services simultaneously (demonstrated in multiclient.py; see the sketch after this list).
- LangChain Integration: works seamlessly with the LangChain framework through langchain-mcp-adapters.
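For the multi-service case, langchain-mcp-adapters provides a MultiServerMCPClient that aggregates tools from several servers into one list. The sketch below assumes the math server runs over stdio and a weather server is reachable over SSE at localhost:8000; exact usage differs slightly across adapter versions (older releases use the client as an async context manager), so treat this as indicative rather than definitive.

```python
# multiclient.py (sketch) -- aggregate tools from several MCP servers at once
# Server commands/URLs and the model name below are assumptions; adjust to your setup.
import asyncio

from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_ollama import ChatOllama
from langgraph.prebuilt import create_react_agent

async def main() -> None:
    client = MultiServerMCPClient(
        {
            "math": {"command": "python", "args": ["math_server.py"], "transport": "stdio"},
            "weather": {"url": "http://localhost:8000/sse", "transport": "sse"},
        }
    )
    tools = await client.get_tools()   # tools from both servers, flattened into one list
    agent = create_react_agent(ChatOllama(model="llama3.1"), tools)
    reply = await agent.ainvoke(
        {"messages": [{"role": "user", "content": "What is the weather in Taipei?"}]}
    )
    print(reply["messages"][-1].content)

if __name__ == "__main__":
    asyncio.run(main())
```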

Pros and Cons

Advantages
- Simplifies integration of LLMs with real-world tools
- Reduces maintenance overhead through a standardized protocol
- Future-proof architecture that supports new services
- Open-source and extensible

Limitations
- Requires setup of server components
- Currently has limited pre-built integrations
- Performance depends on underlying service response times

Getting Started

1. Install requirements: ensure you have Python installed along with the required packages (LangChain, langchain-mcp-adapters, Ollama).
2. Start servers: launch the desired MCP servers in separate terminal windows (a sketch of a second, SSE-based weather server follows this list).
3. Run client: execute either the single-service client or the multi-service client.
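To run a second service alongside the math server, a weather server can be exposed over SSE so the multi-service client can reach it by URL. This is a minimal sketch: the get_weather tool is a stub rather than a real weather lookup, and the port (8000) assumes the SDK's default SSE settings.

```python
# weather_server.py -- stub weather server exposed over SSE (illustrative sketch)
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Weather")

@mcp.tool()
async def get_weather(location: str) -> str:
    """Return a weather report for the given location (stubbed; swap in a real API call)."""
    return f"It is always sunny in {location}"

if __name__ == "__main__":
    # SSE transport listens on an HTTP port (8000 by default) instead of stdio
    mcp.run(transport="sse")
```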

Example Scenarios

- Financial Calculation: an AI assistant helping with shopping calculations.
- Travel Planning: checking the weather for trip planning.
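For the shopping scenario, the request might look like the snippet below, dropped into the async main() of the client sketch earlier; the prompt and amounts are hypothetical.

```python
# Inside the async main() of the client sketch above (hypothetical prompt)
question = {"messages": [{"role": "user",
                          "content": "I bought 3 notebooks at 45 each and 2 pens at 12 each. What is the total?"}]}
reply = await agent.ainvoke(question)   # the model calls add/multiply on the math server via MCP
print(reply["messages"][-1].content)
```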

Frequently Asked Questions

Do I need to modify my LLM to use MCP?
Can I add my own custom services?
Is MCP secure for sensitive data?

Additional Resources

- LangChain MCP Adapters: library for using MCP with LangChain
- Ollama Documentation: for managing local LLM models
- MCP Protocol Discussion: community forum for MCP development
Featured MCP Servers

- Markdownify MCP (TypeScript, 1.7K, rating 5.0): a versatile file-conversion service that converts PDFs, images, audio, and other formats, as well as web pages, into Markdown.
- Baidu Map (Python, 695, rating 4.5, verified): the first map service in China compatible with the MCP protocol, offering ten standardized APIs such as geocoding and route planning, with quick integration from Python and TypeScript so agents can use map features.
- Firecrawl MCP Server (TypeScript, 3.8K, rating 5.0): a Model Context Protocol server that integrates Firecrawl's web-scraping capabilities, providing rich web scraping, search, and content-extraction features.
- Sequential Thinking MCP Server (Python, 245, rating 4.5): a structured-thinking server built on the MCP protocol that helps break down complex problems and generate summaries by defining thinking stages.
- Notion API MCP (Python, 111, rating 4.5, verified): a Python-based MCP server that provides advanced to-do management and content organization through the Notion API, enabling seamless integration between AI models and Notion.
- EdgeOne Pages MCP Server (TypeScript, 243, rating 4.8): a service that quickly deploys HTML content to EdgeOne Pages via the MCP protocol and returns a public URL.
- Context7 (TypeScript, 5.2K, rating 4.7): a service that supplies AI coding assistants with up-to-date, version-specific documentation and code examples, injected directly into prompts via the Model Context Protocol to counter LLMs' use of outdated information.
- Magic MCP (JavaScript, 1.7K, rating 5.0): Magic Component Platform (MCP) is an AI-driven UI component generator that helps developers quickly create modern UI components from natural-language descriptions, with support for multiple IDE integrations.