🚀 MCP LLM Bridge

A bridge connecting the Model Context Protocol (MCP) server with OpenAI-compatible Large Language Models (LLMs). It primarily supports the OpenAI API and is also compatible with local endpoints implementing the OpenAI API specification.

This implementation provides a bidirectional protocol conversion layer between MCP and OpenAI's function-calling interface. It translates MCP tool specifications into OpenAI function schemas and maps the resulting function calls back to MCP tool execution. As a result, any OpenAI-compatible language model, whether cloud-hosted or running locally (such as via Ollama), can use MCP-compatible tools through a standardized interface.
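To make the two conversion directions concrete, here is a minimal sketch. The function names and exact field mapping are illustrative assumptions, not the bridge's actual internals; the key observation is that MCP's JSON Schema `inputSchema` maps naturally onto OpenAI's JSON Schema `parameters` field.

```python
import json

# Illustrative sketch of the two conversion directions the bridge performs.
# Names and field mappings are assumptions, not the bridge's real internals.

def mcp_tool_to_openai(tool: dict) -> dict:
    """MCP tool spec -> entry for the OpenAI `tools` parameter."""
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            # MCP's JSON Schema `inputSchema` maps directly onto OpenAI's
            # `parameters` field, which is also JSON Schema.
            "parameters": tool.get("inputSchema",
                                   {"type": "object", "properties": {}}),
        },
    }

def openai_call_to_mcp(tool_call: dict) -> tuple[str, dict]:
    """OpenAI tool call -> (MCP tool name, arguments dict) for execution."""
    fn = tool_call["function"]
    return fn["name"], json.loads(fn["arguments"] or "{}")

# Forward direction: advertise an MCP tool to the LLM.
spec = {
    "name": "query",
    "description": "Run a read-only SQL query",
    "inputSchema": {
        "type": "object",
        "properties": {"sql": {"type": "string"}},
        "required": ["sql"],
    },
}
openai_tool = mcp_tool_to_openai(spec)

# Reverse direction: route the model's tool call back to MCP.
name, args = openai_call_to_mcp({
    "id": "call_1",
    "type": "function",
    "function": {"name": "query",
                 "arguments": '{"sql": "SELECT * FROM products"}'},
})
print(openai_tool["function"]["name"], name, args)
```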

For more information about MCP, refer to the Model Context Protocol documentation.

🚀 Quick Start

Installation

# Install uv (Python package and environment manager)
curl -LsSf https://astral.sh/uv/install.sh | sh

# Clone the repo and install the bridge into a virtual environment
git clone https://github.com/bartolli/mcp-llm-bridge.git
cd mcp-llm-bridge
uv venv
source .venv/bin/activate
uv pip install -e .

# Create a test SQLite database
python -m mcp_llm_bridge.create_test_db

Configuration

OpenAI (Primary Support)

Create a .env file:

OPENAI_API_KEY=your_key
OPENAI_MODEL=gpt-4o # or another OpenAI model that supports tool calling

⚠️ Important Note

If you store the key in .env, reactivate the virtual environment so the variable is picked up: source .venv/bin/activate

Then configure the bridge in src/mcp_llm_bridge/main.py:

config = BridgeConfig(
    mcp_server_params=StdioServerParameters(
        command="uvx",
        args=["mcp-server-sqlite", "--db-path", "test.db"],
        env=None
    ),
    llm_config=LLMConfig(
        api_key=os.getenv("OPENAI_API_KEY"),
        model=os.getenv("OPENAI_MODEL", "gpt-4o"),
        base_url=None
    )
)
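The same pattern works for any MCP server that can be launched over stdio. As a hypothetical (untested) variant, the SQLite server could be swapped for the reference fetch server; only `mcp_server_params` changes, while the LLM configuration stays the same.

```python
# Hypothetical variant: swap the SQLite server for another stdio MCP server.
# Only mcp_server_params changes; the LLM configuration is unchanged.
config = BridgeConfig(
    mcp_server_params=StdioServerParameters(
        command="uvx",
        args=["mcp-server-fetch"],  # reference MCP fetch server
        env=None
    ),
    llm_config=LLMConfig(
        api_key=os.getenv("OPENAI_API_KEY"),
        model=os.getenv("OPENAI_MODEL", "gpt-4o"),
        base_url=None
    )
)
```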

Additional Endpoint Support

The bridge also works with any endpoint implementing the OpenAI API specification:

Ollama
llm_config=LLMConfig(
    api_key="Not required",
    model="mistral-nemo:12b-instruct-2407-q8_0",
    base_url="http://localhost:11434/v1"
)

💡 Usage Tip

In testing, I found that mistral-nemo:12b-instruct-2407-q8_0 handles complex queries better than smaller local models.

LM Studio
llm_config=LLMConfig(
    api_key="Not required",
    model="local-model",
    base_url="http://localhost:1234/v1"
)

I haven't tested this configuration, but it should theoretically work.

💻 Usage Examples

Basic Usage

python -m mcp_llm_bridge.main

# Example question: What is the most expensive product in the database?
# Enter 'quit' or press Ctrl+C to exit
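The actual schema is whatever create_test_db sets up. Purely as an illustration of the kind of SQL a model's tool call might run to answer that question, here is a hypothetical products table in SQLite (table and column names are assumptions, not the real test database):

```python
import sqlite3

# Hypothetical illustration of the query behind "What is the most expensive
# product?". The table name and columns are assumptions, not the actual
# schema created by create_test_db.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (name TEXT, price REAL)")
conn.executemany("INSERT INTO products VALUES (?, ?)",
                 [("Widget", 9.99), ("Gadget", 19.99), ("Gizmo", 4.99)])
row = conn.execute(
    "SELECT name, price FROM products ORDER BY price DESC LIMIT 1"
).fetchone()
print(row)  # → ('Gadget', 19.99)
```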

🧪 Running Tests

Install the package with test dependencies:

uv pip install -e ".[test]"

Then run the tests:

python -m pytest -v tests/

📄 License

MIT

🤝 Contribution Guidelines

If you'd like to contribute, please refer to the project repository: https://github.com/bartolli/mcp-llm-bridge

Alternatives

Vestige (Rust, 4.5K, 4.5 points)
Vestige is an AI memory engine grounded in cognitive science. It implements 29 neuroscience-inspired modules, such as prediction-error gating, FSRS-6 spaced repetition, and memory dreaming, to give AI long-term memory. It includes a 3D visualization dashboard and 21 MCP tools, runs entirely locally, and requires no cloud services.

Better Icons (TypeScript, 6.6K, 4.5 points)
An MCP server and CLI tool that searches and retrieves over 200,000 icons across more than 150 icon libraries, helping AI assistants and developers quickly find and use icons.

Assistant Ui (TypeScript, 7.2K, 5 points)
assistant-ui is an open-source TypeScript/React library for quickly building production-grade AI chat interfaces, providing composable UI components, streaming responses, and accessibility, with support for multiple AI backends and models.

Apify MCP Server (TypeScript, 7.4K, 5 points)
The Apify MCP Server is a Model Context Protocol (MCP) tool that lets AI assistants extract data from websites such as social media, search engines, and e-commerce sites through thousands of ready-to-use crawlers, scrapers, and automation tools (Apify Actors). It supports OAuth and Skyfire agent payments and integrates with MCP clients such as Claude and VS Code through HTTPS endpoints or local stdio.

Rsdoctor (TypeScript, 9.3K, 5 points)
Rsdoctor is a build analysis tool for the Rspack ecosystem, fully compatible with webpack. It provides visual build analysis, multi-dimensional performance diagnostics, and intelligent optimization suggestions to help developers improve build efficiency and engineering quality.

Next Devtools MCP (TypeScript, 9.7K, 5 points)
An MCP server that provides Next.js development tools and utilities to AI coding assistants such as Claude and Cursor, including runtime diagnostics, development automation, and documentation access.

Testkube (Go, 6.5K, 5 points)
Testkube is a test orchestration and execution framework for cloud-native applications, providing a unified platform to define, run, and analyze tests. It works with existing testing tools and Kubernetes infrastructure.

MCP Windbg (Python, 10.5K, 5 points)
An MCP server that integrates AI models with WinDbg/CDB for analyzing Windows crash dumps and remote debugging, supporting natural-language interaction to execute debugging commands.

Notion Api MCP (Certified; Python, 20.2K, 4.5 points)
A Python-based MCP server that provides advanced to-do management and content organization through the Notion API, enabling seamless integration between AI models and Notion.

Markdownify MCP (TypeScript, 34.2K, 5 points)
Markdownify is a versatile file-conversion service that converts PDFs, images, audio, and web content into Markdown.

Gitlab MCP Server (Certified; TypeScript, 25.3K, 4.3 points)
A Model Context Protocol project that provides a comprehensive toolset for interacting with GitLab accounts, including code review, merge request management, and CI/CD configuration.

Duckduckgo MCP Server (Certified; Python, 71.6K, 4.3 points)
The DuckDuckGo Search MCP Server provides web search and content scraping for LLMs such as Claude.

Unity (Certified; C#, 31.0K, 5 points)
UnityMCP is a Unity editor plugin implementing the Model Context Protocol (MCP), providing seamless integration between Unity and AI assistants, including real-time state monitoring, remote command execution, and logging.

Figma Context MCP (TypeScript, 64.2K, 4.5 points)
Framelink Figma MCP Server gives AI coding tools (such as Cursor) access to Figma design data. By simplifying Figma API responses, it helps AI convert designs to code more accurately in one step.

Gmail MCP Server (TypeScript, 21.0K, 4.5 points)
A Gmail MCP server with automatic authentication, designed for Claude Desktop, that supports managing Gmail through natural language, including sending email, label management, and batch operations.

Minimax MCP Server (Python, 48.4K, 4.8 points)
The official MiniMax Model Context Protocol (MCP) server, providing access to powerful text-to-speech and video/image generation APIs from client tools such as Claude Desktop and Cursor.