MCP Weather Server Demo


🚀 MCP Tool Invocation Capability Introduction

This document provides an in-depth introduction to how MCP (Model Context Protocol) integrates with AI models, and compares two typical integration architectures.

🔌 How MCP Integrates with AI Models

In the MCP (Model Context Protocol) architecture, tool calling is a core capability, but that does not mean the model performs every operation itself. The actual integration is a collaborative process carried out jointly by the model and the client.

❓ Does the model have to support Function Call to use MCP?

Not necessarily. Tool-calling capability is required to integrate with MCP, but it does not have to come from the model itself: something in the stack must be able to trigger tool calls, yet the trigger does not need to be in the model's native tool_call format.
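The two trigger styles can be sketched side by side. The OpenAI-style tool_call shape and the get_weather tool below are assumptions for illustration, not part of any specific API:

```python
import json

# A structured tool_call as a natively supporting model might emit it
# (OpenAI-style shape; the exact format is an assumption for illustration).
native_trigger = {
    "tool_calls": [{
        "id": "call_1",
        "type": "function",
        "function": {
            "name": "get_weather",
            "arguments": json.dumps({"latitude": 24.16, "longitude": 120.64}),
        },
    }]
}

# The same intent expressed as plain text by a model without native
# Function Call support; the client must recognize and parse this itself.
text_trigger = "Let me check the weather: get_weather(latitude=24.16, longitude=120.64)"

args = json.loads(native_trigger["tool_calls"][0]["function"]["arguments"])
print(args)  # {'latitude': 24.16, 'longitude': 120.64}
```

Both forms convey the same intent; the difference is only in who has to do the parsing work — the model's runtime or the client.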

🆚 Comparison of Two Typical Integration Architectures

✅ Model natively supports Function Call (e.g., GPT-4-turbo + 5ire)

  • Model Environment: GPT-4-turbo
  • Client Platform: 5ire (connects GPT models to external tools such as MCP)
  • Model Function Call Support: ✅ Supported
  • Who executes the Function Call: 5ire (executes automatically according to the tool_call instructions generated by the model)
  • Can it integrate with MCP: ✅ Yes
  • Process Description:
    1. The client (5ire) sends the user's request and the description of available tools (function schema) to the model.
    2. The model can automatically generate a structured tool_call based on the tool description.
    3. The client (5ire) is responsible for receiving and parsing this tool_call, and then executing the corresponding tool (interacting with MCP or directly calling the API).
    4. The client sends the execution result back to the model.
    5. The model generates the final response based on the result.

In this mode, the interaction between the model and MCP/tools is highly automated, and the process is relatively transparent to the user.
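The five steps above can be sketched as a minimal client loop. This is a self-contained simulation, not 5ire's actual implementation: fake_model stands in for a real chat API call, and get_weather with its return value is hypothetical:

```python
import json

# Hypothetical local tool the client can execute on the model's behalf.
def get_weather(latitude: float, longitude: float) -> dict:
    return {"latitude": latitude, "longitude": longitude, "temp_c": 22.5}

TOOLS = {"get_weather": get_weather}

def fake_model(messages):
    """Stand-in for the model API: emits a structured tool_call the
    first time, then a final answer once a tool result is present."""
    tool_msgs = [m for m in messages if m["role"] == "tool"]
    if tool_msgs:
        result = json.loads(tool_msgs[-1]["content"])
        return {"content": f"It is {result['temp_c']} °C there."}
    return {"tool_calls": [{"id": "call_1", "function": {
        "name": "get_weather",
        "arguments": json.dumps({"latitude": 24.16, "longitude": 120.64})}}]}

def run(user_request: str) -> str:
    messages = [{"role": "user", "content": user_request}]
    while True:
        reply = fake_model(messages)                  # steps 1-2: request + schema in, tool_call out
        if "tool_calls" not in reply:
            return reply["content"]                   # step 5: final response
        for call in reply["tool_calls"]:              # step 3: client parses and executes
            fn = TOOLS[call["function"]["name"]]
            args = json.loads(call["function"]["arguments"])
            result = fn(**args)
            messages.append({"role": "tool",          # step 4: result goes back to the model
                             "tool_call_id": call["id"],
                             "content": json.dumps(result)})

print(run("What's the weather at 24.16, 120.64?"))  # It is 22.5 °C there.
```

The key point is that the model only ever produces and consumes messages; the client owns the loop and every actual tool execution.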

✅ Model does not support Function Call (e.g., DeepSeek R1 + Cline for VSCode)

  • Model Environment: DeepSeek R1 (assuming it doesn't natively support Function Call)
  • Client Platform: Cline (an extension plugin for VSCode, used as an intelligent client)
  • Model Function Call Support: ❌ Not supported
  • Who executes the Function Call: Cline (infers intent from the model's text output, then converts and executes the call)
  • Can it integrate with MCP: ✅ Yes
  • Process Description:
    1. Before sending the user's request to the model, the client (Cline) injects descriptions of the available tools into the prompt, usually telling the model in natural language or a fixed format: "You can use get_weather(latitude, longitude) to query the weather".
    2. The model receives the prompt containing the user's request and the tool descriptions.
    3. After understanding the request, the model decides to use a tool, but it cannot emit a structured tool_call. Instead, it produces indicative text, for example: "Okay, let me check the weather: get_weather(latitude = 24.16, longitude = 120.64)".
    4. The client (Cline) parses this text and identifies the intent and parameters of the get_weather(...) call.
    5. The client (Cline) converts the parameters into the appropriate format and executes the tool call (e.g., by sending a request to the API).
    6. Once the tool has run, the client receives the response and returns it to the model.

In this mode, the model cannot emit structured tool calls itself, but the client plugin bridges it to external tools by parsing its text output.
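Step 4 can be sketched as a small parser. This is a simplified assumption about how such parsing might work (a naive regex that only handles numeric parameters), not Cline's actual mechanism:

```python
import re

def parse_tool_call(text: str):
    """Extract a name(key=value, ...) pattern from model output.
    Assumes numeric parameter values; returns (tool_name, params),
    or None if no call-like pattern is found."""
    m = re.search(r"(\w+)\(([^)]*)\)", text)
    if not m:
        return None
    name, arg_str = m.group(1), m.group(2)
    params = {}
    for pair in arg_str.split(","):
        key, _, value = pair.partition("=")
        params[key.strip()] = float(value.strip())
    return name, params

reply = "Okay, let me check the weather: get_weather(latitude = 24.16, longitude = 120.64)"
print(parse_tool_call(reply))
# ('get_weather', {'latitude': 24.16, 'longitude': 120.64})
```

A production client would use a stricter output format (e.g., asking the model for JSON or XML-tagged blocks) rather than free-text pattern matching, since regexes are fragile against paraphrasing.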

📊 Comparison of Model Integration Methods

| Property | Model natively supports Function Call (e.g., GPT-4-turbo) | Model does not support Function Call (e.g., DeepSeek R1) |
| --- | --- | --- |
| Function Call support | ✅ | ❌ |
| Tool invocation method | The model generates structured tool_call instructions, which the client executes | The model cannot generate tool_call; the client plugin infers intent from text, then converts and executes |
| Representative combination | GPT-4-turbo + 5ire | DeepSeek R1 + Cline for VSCode |
| Advantages | Highly automated; the model drives tool calls directly | Flexible choice of model, with the client plugin supplying the tool-calling capability |
| Applicable scenarios | Rapid deployment, simplified development | Cost-effective; suited to integrating existing models |

📝 Summary

The MCP protocol supports multiple ways for models to interact with external tools; which one to choose depends on the model's capabilities and the application's requirements. If the model natively supports Function Call, its tool-calling capability can be used directly; if not, a client plugin can bridge the gap.

© 2025 AIbase