# LangChain MCP Client
This simple Model Context Protocol (MCP) client demonstrates the use of LangChain ReAct agents with MCP server tools.
## Quick Start
The LangChain MCP Client is designed to seamlessly integrate with MCP servers and leverage LangChain ReAct agents. It offers several key features:
- Smoothly connect to any MCP server.
- Flexibly select any LangChain-compatible LLM.
- Interact via a CLI that supports dynamic conversations.
## Features
### Convert to LangChain Tools

The client uses the utility function `convert_mcp_to_langchain_tools()`. This function handles the simultaneous initialization of multiple specified MCP servers and converts their available tools into a list of LangChain-compatible tools (`List[BaseTool]`).
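As a rough sketch of how such a conversion utility is typically used (the server configuration shown here is hypothetical, and the exact return value of `convert_mcp_to_langchain_tools()` may differ between versions of `langchain_mcp_tools`):

```python
import asyncio

from langchain_mcp_tools import convert_mcp_to_langchain_tools


async def main() -> None:
    # Hypothetical MCP server configuration using the usual "command + args"
    # convention; replace it with the servers you actually want to run.
    mcp_servers = {
        "filesystem": {
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-filesystem", "."],
        },
    }

    # Start the servers and convert their tools into LangChain BaseTool objects.
    # The utility is expected to also return an async cleanup callback.
    tools, cleanup = await convert_mcp_to_langchain_tools(mcp_servers)
    print([tool.name for tool in tools])

    # Shut down the MCP server sessions when finished.
    await cleanup()


asyncio.run(main())
```

The resulting `tools` list can then be passed directly to a LangChain ReAct agent.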
## Installation
Python 3.11 or higher is required.

```bash
pip install langchain_mcp_client
```
## Documentation

### Configuration
Create a `.env` file containing all the API keys required to access your LLM.
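For example, a `.env` file might look like the following; the variable names are assumptions that depend on which provider you configure, so include only the keys your LLM actually needs:

```dotenv
# Hypothetical example; add only the keys required by your configured LLM.
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
```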
Configure the LangChain LLM, the MCP servers, and example prompts in the `llm_mcp_config.json5` file:
- LLM Configuration: Set the LangChain LLM parameters.
- MCP Servers: Specify the MCP servers to connect to.
- Example Queries: Define example queries for invoking MCP server tools. Pressing Enter at the prompt submits one of these example queries.
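The sketch below only illustrates the general shape such a JSON5 file could take; the key names (`llm`, `mcp_servers`, `example_queries`) and values are assumptions based on the description above, and the authoritative schema is the `llm_mcp_config.json5` shipped with the project.

```json5
{
  // Hypothetical LLM settings; the exact keys depend on the project's schema.
  llm: {
    provider: "anthropic",
    model: "claude-3-5-haiku-latest",
  },

  // MCP servers to launch and connect to.
  mcp_servers: {
    filesystem: {
      command: "npx",
      args: ["-y", "@modelcontextprotocol/server-filesystem", "."],
    },
  },

  // Example queries offered at the CLI prompt.
  example_queries: [
    "List the files in the current directory.",
  ],
}
```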
## Usage Examples

### Basic Usage
Here is an example with the Jupyter MCP Server:
Check the `llm_mcp_config.json5` configuration (the command used to launch the server depends on whether you are running Linux, macOS, or Windows).
```bash
# Start JupyterLab.
make jupyterlab

# Launch the CLI.
make cli
```
Here is an example prompt:

```
Create many variants of a matplotlib example
```
## License
The initial code for this project is from hideya/mcp-client-langchain-py (MIT License) and from `langchain_mcp_tools` (MIT License).