# MCP Tool Invocation Capability Introduction
This document provides an in-depth introduction to how the MCP (Model Context Protocol) connects to AI models, and compares two typical integration architectures.
## Integration Mechanism between MCP and AI Models
In the MCP (Model Context Protocol) architecture, tool calling is a core capability, but that does not mean every operation is performed by the model itself. The actual integration is a collaborative process carried out jointly by the model and the client.
### Does the model have to support Function Call to use MCP?
The answer: the ability to trigger tool calls is required for working with MCP, but it does not have to come from the model natively. In other words, something in the loop must be able to signal that a tool should be called, yet that trigger does not need to be expressed in the model's native tool_call format.
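To make the distinction concrete, the Python snippet below contrasts the two trigger styles. The tool schema and the structured tool_call follow the widely used OpenAI-style function-calling format; field names vary between providers, so treat this as an illustrative sketch rather than a normative layout.

```python
# A tool description (function schema) that the client advertises to the model.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Query the current weather for a coordinate",
        "parameters": {
            "type": "object",
            "properties": {
                "latitude": {"type": "number"},
                "longitude": {"type": "number"},
            },
            "required": ["latitude", "longitude"],
        },
    },
}

# 1) A model with native Function Call support returns a structured tool_call:
native_tool_call = {
    "id": "call_001",
    "type": "function",
    "function": {
        "name": "get_weather",
        "arguments": '{"latitude": 24.16, "longitude": 120.64}',
    },
}

# 2) A model without native Function Call can only signal the same intent as
#    plain text, which the client must parse itself:
plain_text_trigger = "Okay, let me check the weather: get_weather(latitude=24.16, longitude=120.64)"
```

Either way, it is the client, not the model, that ultimately executes the tool; the difference is only how the model expresses its intent.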
## Comparison of Two Typical Integration Architectures
### Model natively supports Function Call (e.g., GPT-4-turbo + 5ire)
- Model Environment: GPT-4-turbo
- Client Platform: 5ire (connects GPT models to external tools such as MCP)
- Model Function Call Support: Supported
- Who executes the Function Call: 5ire (executes automatically according to the tool_call instructions generated by the model)
- Can it work with MCP: Yes
- Process Description:
  1. The client (5ire) sends the user's request and the descriptions of the available tools (function schemas) to the model.
  2. The model automatically generates a structured tool_call based on the tool descriptions.
  3. The client (5ire) receives and parses this tool_call, then executes the corresponding tool (interacting with MCP or calling the API directly).
  4. The client sends the execution result back to the model.
  5. The model generates the final response based on the result.
In this mode, the interaction between the model and MCP/tools is highly automated, and the process is relatively transparent to the user.
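A minimal sketch of this loop, assuming an OpenAI-compatible Python client; call_mcp_tool is a hypothetical helper standing in for the client's bridge to an MCP server, and the model name is only an example.

```python
import json

from openai import OpenAI  # any OpenAI-compatible client; 5ire plays this client role in the flow above

client = OpenAI()


def call_mcp_tool(name: str, arguments: dict) -> str:
    """Hypothetical helper: in a real client this would forward the call to an MCP server."""
    raise NotImplementedError


def run_with_native_function_call(user_request: str, tools: list) -> str:
    messages = [{"role": "user", "content": user_request}]

    # Step 1: send the user's request plus the available tool schemas to the model.
    response = client.chat.completions.create(
        model="gpt-4-turbo", messages=messages, tools=tools
    )
    message = response.choices[0].message

    # Steps 2-3: if the model emitted structured tool_call instructions,
    # the client parses them and executes the tools (e.g. via MCP).
    if message.tool_calls:
        messages.append(message)
        for tool_call in message.tool_calls:
            result = call_mcp_tool(
                tool_call.function.name, json.loads(tool_call.function.arguments)
            )
            # Step 4: send the execution result back to the model.
            messages.append(
                {"role": "tool", "tool_call_id": tool_call.id, "content": result}
            )
        # Step 5: the model generates the final response based on the results.
        response = client.chat.completions.create(
            model="gpt-4-turbo", messages=messages
        )
        message = response.choices[0].message

    return message.content
```

The key point is that the model only decides to call the tool; parsing the tool_call and executing it remains the client's job, exactly as 5ire does.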
### Model does not support Function Call (e.g., DeepSeek R1 + Cline for VSCode)
- Model Environment: DeepSeek R1 (assuming it does not natively support Function Call)
- Client Platform: Cline (a VSCode extension acting as an intelligent client)
- Model Function Call Support: Not supported
- Who executes the Function Call: Cline (infers the intent from the model's text output, then converts and executes the call)
- Can it work with MCP: Yes
- Process Description:
  1. Before sending the user's request to the model, the client (Cline) injects the descriptions of the available tools into the prompt, usually in natural language or a specific format, for example: "You can use get_weather(latitude, longitude) to query the weather."
  2. The model receives the prompt containing both the user's request and the tool descriptions.
  3. After understanding the request, the model decides to use a tool, but it cannot generate a structured tool_call. Instead, it produces indicative text, for example: "Okay, let me check the weather: get_weather(latitude = 24.16, longitude = 120.64)".
  4. The client (Cline) parses this text and identifies the intent and parameters of get_weather(...).
  5. The client (Cline) converts the parameters into an appropriate format and executes the corresponding tool call (such as sending a request to the API).
  6. After the tool is executed, the client receives the response and returns it to the model.
In this mode, the model cannot emit structured tool calls itself, but it can still interact with external tools through the client plugin.
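The sketch below shows one way such a client could implement the text-based flow: inject the tool description into the prompt, then parse the model's free-text reply. The prompt wording, the regex, and the get_weather example are assumptions for illustration, not Cline's actual implementation.

```python
import re

# Step 1: tool description injected into the prompt before it is sent to the model.
TOOL_PROMPT = (
    "You can use get_weather(latitude, longitude) to query the weather. "
    "When you want to call it, write the call out explicitly, e.g. "
    "get_weather(latitude=24.16, longitude=120.64)."
)

# Matches a get_weather(...) call inside the model's free-text reply.
CALL_PATTERN = re.compile(
    r"get_weather\(\s*latitude\s*=\s*(?P<lat>-?\d+(?:\.\d+)?)\s*,"
    r"\s*longitude\s*=\s*(?P<lon>-?\d+(?:\.\d+)?)\s*\)"
)


def build_prompt(user_request: str) -> str:
    # Combine the tool description with the user's request.
    return f"{TOOL_PROMPT}\n\nUser request: {user_request}"


def extract_tool_call(model_text: str) -> dict | None:
    # Step 4: parse the model's indicative text and recover the intent and parameters.
    match = CALL_PATTERN.search(model_text)
    if match is None:
        return None
    return {
        "name": "get_weather",
        "arguments": {"latitude": float(match["lat"]), "longitude": float(match["lon"])},
    }


# Example: the model replies with plain text instead of a structured tool_call.
reply = "Okay, let me check the weather: get_weather(latitude=24.16, longitude=120.64)"
print(extract_tool_call(reply))
# -> {'name': 'get_weather', 'arguments': {'latitude': 24.16, 'longitude': 120.64}}
```

A real client would support many tools and more robust parsing, but the division of labour stays the same: the model expresses intent in text, and the client turns it into an actual call.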
## Comparison of Model Integration Methods
| Property | Model natively supports Function Call (e.g., GPT-4-turbo) | Model does not support Function Call (e.g., DeepSeek R1) |
| --- | --- | --- |
| Function Call Support | Supported | Not supported |
| Tool Invocation Method | The model generates structured tool_call instructions, which the client executes | The model cannot generate tool_call; the client parses its text output, then converts and executes the call |
| Representative Combination | GPT-4-turbo + 5ire | DeepSeek R1 + Cline for VSCode |
| Advantages | Highly automated; the model directly drives tool calls | More flexible choice of models; the client plugin supplements the tool-calling capability |
| Applicable Scenarios | Rapid deployment, simplified development | Cost-effective; suitable for secondary development and integration of existing models |
## Summary
The MCP protocol provides multiple ways for models to interact with external tools; which one to choose depends on the model's capabilities and the specific application requirements. If the model natively supports Function Call, its tool-calling capability can be used directly; if not, the integration can still be achieved through a client plugin that parses the model's text output.