Codex MCP Go

An MCP server implemented in Go that wraps the OpenAI Codex CLI, enabling AI clients to call Codex for code generation, debugging, and repair.
2.5 points
7.9K

What is Codex MCP Go?

Codex MCP Go is an MCP (Model Context Protocol) server written in Go. It acts as an interpreter that lets the AI coding assistant you already use (such as Gemini or Claude) understand and call another powerful coding expert: OpenAI's Codex CLI. In short, it lets different AIs collaborate: your main AI (such as Gemini) handles planning and creativity, while Codex serves as a specialist 'code task force' that carries out complex code generation, debugging, and repair tasks.

How to use Codex MCP Go?

Getting started takes three steps:

1. **Install the prerequisite tool**: make sure the OpenAI `codex` command-line tool is installed on your computer.
2. **Install this server**: run it with a single `npx` command, or download a compiled binary.
3. **Configure your AI client**: in the AI coding tool you use (such as KiloCode, Roo Code, or Cursor), add this server as an MCP tool.

Once configuration is complete, your AI assistant can call Codex directly during a conversation.

Applicable scenarios

Codex MCP Go is particularly useful in the following situations:

- **Complex code generation**: when you need a complete function or module scaffolded.
- **Tricky bug fixing**: when the code logic is convoluted and the main AI struggles to locate the root cause.
- **Code review and optimization**: when you want a second 'expert perspective' to review code quality and suggest improvements.
- **Following precise instructions**: when the code must strictly conform to specific coding standards or implementation details.

Main features

AI collaboration bridge
Seamlessly connects your main AI (planner) with Codex (executor), enabling a collaborative programming workflow that complements each other's strengths.
Conversation context management
Supports multi-round conversations. The main AI can save the conversation ID and conduct continuous and in-depth discussions with Codex on the same complex issue.
Secure sandbox control
Provides various security policies (such as read-only mode and workspace write mode) to prevent Codex from accidentally modifying files and ensure code security.
High performance and easy deployment
Compiled from Go into a single executable, it starts quickly, uses few resources, and requires no runtime environment. It works out of the box.
Complete log tracking
Optionally capture Codex's full reasoning and tool-call logs to help you understand its decision-making process.
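The sandbox policies described above map onto the Codex CLI's own sandbox settings. As a sketch (flag names follow the Codex CLI; verify against `codex --help` for your installed version):

```bash
# Run Codex non-interactively with a read-only sandbox (the safest default)
codex exec --sandbox read-only "Explain the bug in main.go"

# Allow writes inside the current workspace only
codex exec --sandbox workspace-write "Fix the failing test in parser_test.go"
```

The MCP server exposes these policies so the calling AI can choose the appropriate permission level per task.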
Advantages
**Powerful combination**: Combines the macro planning ability of the main AI (such as Gemini) with the precise execution ability of Codex at the micro code level.
**Improved efficiency**: assigns the hardest coding tasks to the most suitable expert (Codex), reducing trial-and-error and stalls by the main AI.
**Safe and controllable**: Through the sandbox strategy, you can control the permissions of Codex. The default read-only mode is very safe.
**Easy to integrate**: As a standard MCP server, it can be easily integrated into many popular AI programming clients.
**Simple to deploy**: Provides one-click running with `npx` and a single-file binary, which is extremely user-friendly.
Limitations
**Double dependency**: You need to install and configure both the Codex CLI and this MCP server.
**Extra cost**: Using the Codex CLI may incur OpenAI API call fees (depending on your Codex configuration).
**Learning curve**: users must understand the 'main AI plans, Codex executes' collaboration model and configure the client's system prompt to get the best results.
**Network requirements**: Calling Codex requires a stable network connection.

How to use

Install prerequisite tools (Codex CLI)
First, you need to install the official OpenAI Codex command-line tool on your computer. Open the terminal and run the following command:
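The install command itself is missing from this page; the official OpenAI Codex CLI is typically installed globally via npm (verify against the Codex CLI README for your platform):

```bash
# Install the OpenAI Codex CLI globally
npm install -g @openai/codex

# Confirm the installation succeeded
codex --version
```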
Run the Codex MCP Go server
You don't need to install the Go environment. The simplest way is to use the `npx` command, which will automatically download and run the latest version:
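Assuming the package name shown in the configuration below (`@zenfun510/codex-mcp-go`), a one-off run looks like this:

```bash
# Download (if needed) and start the MCP server in one step
npx -y @zenfun510/codex-mcp-go
```

In practice you rarely run this by hand: the MCP client launches the server itself using the same command from its configuration.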
Configure your AI client
According to the AI programming tool you are using, add the MCP server configuration. Here is a general JSON configuration example (applicable to Roo Code, KiloCode, etc.):
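The configuration block was stripped from this part of the page; the full example appears in the Installation section. For convenience, the `npx`-based variant looks like this:

```json
{
  "mcpServers": {
    "codex": {
      "command": "npx",
      "args": ["-y", "@zenfun510/codex-mcp-go"],
      "env": {
        "OPENAI_API_KEY": "your-api-key"
      }
    }
  }
}
```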
Configure the system prompt (key step)
To let your main AI know how to collaborate with Codex, you need to give it collaboration rules. Expert-mode files are provided for different clients; import the one that matches yours:

- KiloCode users: import `codex-engineer-kilocode.yaml`
- Roo Code users: import `codex-engineer-roocode.yaml`
- Cline users: import `codex-engineer-cline.yaml`

After importing, your main AI will automatically follow the 'plan - consult - implement - review' collaboration flow.
Start collaborative programming
Once configuration is complete, restart your AI client. From then on, when you raise complex programming requirements, your main AI will proactively call the Codex tool to help complete them.

Usage examples

Example 1: Requirement analysis and plan formulation
When the user puts forward a vague or complex requirement, the main AI can first ask Codex to help with detailed analysis and plan formulation.
Example 2: Securely obtain a code prototype (Diff patch)
The main AI needs to implement a function but is not sure about the best way to write it. It can ask Codex to provide a 'code blueprint' for reference.
Example 3: Code review and optimization suggestions
After the main AI completes part of the code writing, it can immediately ask Codex for a 'peer review'.
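Under MCP, requests like these reach the server as `tools/call` messages. A hypothetical call for the review scenario might look like the following (the tool name `codex` comes from the configuration below; the argument names here are illustrative assumptions, not this project's actual schema):

```json
{
  "method": "tools/call",
  "params": {
    "name": "codex",
    "arguments": {
      "prompt": "Review this function for correctness and suggest optimizations: ...",
      "sandbox": "read-only"
    }
  }
}
```

The main AI constructs such a call automatically; the server forwards the prompt to the Codex CLI and streams the result back.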

Frequently Asked Questions

Do I need to pay to use Codex MCP Go?
Why doesn't my AI assistant call Codex after configuration?
Is it safe to use Codex? Will it modify my files randomly?
Can I use it in other tools besides KiloCode/Roo Code?
What's the difference between using Codex MCP Go and directly using the Codex CLI?

Related resources

Project GitHub repository
Get the latest source code, report issues, or participate in contributions.
OpenAI Codex CLI official repository
Understand and install the core component that this tool depends on.
Model Context Protocol (MCP) official website
Understand the technical details and ecosystem of the MCP protocol.
Inspiration source: Python version codexmcp
This project was originally inspired by this. Thanks to its author.

Installation

Copy the following configuration into your client. The first block runs the server via `npx`; the second runs a locally downloaded binary:
{
  "mcpServers": {
    "codex": {
      "command": "npx",
      "args": ["-y", "@zenfun510/codex-mcp-go"],
      "env": {
        "OPENAI_API_KEY": "your-api-key"
      }
    }
  }
}

{
  "mcpServers": {
    "codex": {
      "command": "/path/to/codex-mcp-go",
      "args": [],
      "env": {
        "OPENAI_API_KEY": "your-api-key"
      }
    }
  }
}
Note: your API key is sensitive information; do not share it with anyone.

Alternatives

V
Vestige
Vestige is an AI memory engine based on cognitive science. It implements 29 neuroscience-inspired modules such as prediction-error gating, FSRS-6 spaced repetition, and memory dreaming to provide long-term memory for AI. It includes a 3D visualization dashboard and 21 MCP tools, runs entirely locally, and requires no cloud.
Rust
9.3K
4.5 points
M
Moltbrain
MoltBrain is a long-term memory layer plugin designed for OpenClaw, MoltBook, and Claude Code, capable of automatically learning and recalling project context, providing intelligent search, observation recording, analysis statistics, and persistent storage functions.
TypeScript
9.9K
4.5 points
B
Bm.md
A feature-rich Markdown typesetting tool that supports multiple style themes and platform adaptation, providing real-time editing preview, image export, and API integration capabilities
TypeScript
15.7K
5 points
S
Security Detections MCP
Security Detections MCP is a server based on the Model Context Protocol that allows LLMs to query a unified security detection rule database covering Sigma, Splunk ESCU, Elastic, and KQL formats. The latest version 3.0 is upgraded to an autonomous detection engineering platform that can automatically extract TTPs from threat intelligence, analyze coverage gaps, generate SIEM-native format detection rules, run tests, and verify. The project includes over 71 tools, 11 pre-built workflow prompts, and a knowledge graph system, supporting multiple SIEM platforms.
TypeScript
6.7K
4 points
P
Paperbanana
Python
9.9K
5 points
B
Better Icons
An MCP server and CLI tool that provides search and retrieval of over 200,000 icons, supports more than 150 icon libraries, and helps AI assistants and developers quickly obtain and use icons.
TypeScript
9.6K
4.5 points
A
Assistant Ui
assistant-ui is an open-source TypeScript/React library for quickly building production-grade AI chat interfaces, providing composable UI components, streaming responses, accessibility features, and more, with support for multiple AI backends and models.
TypeScript
8.8K
5 points
A
Apify MCP Server
The Apify MCP Server is a tool based on the Model Context Protocol (MCP) that allows AI assistants to extract data from websites such as social media, search engines, and e-commerce through thousands of ready-to-use crawlers, scrapers, and automation tools (Apify Actors). It supports OAuth and Skyfire proxy payment and can be integrated into MCP clients such as Claude and VS Code through HTTPS endpoints or local stdio.
TypeScript
10.6K
5 points
D
Duckduckgo MCP Server
Certified
The DuckDuckGo Search MCP Server provides web search and content scraping services for LLMs such as Claude.
Python
81.1K
4.3 points
M
Markdownify MCP
Markdownify is a multi-functional file conversion service that supports converting multiple formats such as PDFs, images, audio, and web page content into Markdown format.
TypeScript
39.0K
5 points
N
Notion Api MCP
Certified
A Python-based MCP Server that provides advanced to-do list management and content organization functions through the Notion API, enabling seamless integration between AI models and Notion.
Python
23.7K
4.5 points
G
Gitlab MCP Server
Certified
The GitLab MCP server is a project based on the Model Context Protocol that provides a comprehensive toolset for interacting with GitLab accounts, including code review, merge request management, CI/CD configuration, and other functions.
TypeScript
27.2K
4.3 points
U
Unity
Certified
UnityMCP is a Unity editor plugin that implements the Model Context Protocol (MCP), providing seamless integration between Unity and AI assistants, including real-time state monitoring, remote command execution, and logging.
C#
38.1K
5 points
F
Figma Context MCP
Framelink Figma MCP Server is a server that provides AI programming tools (such as Cursor) with access to Figma design data. By simplifying the Figma API response, it helps AI convert designs to code more accurately in one click.
TypeScript
70.4K
4.5 points
G
Gmail MCP Server
A Gmail automatic authentication MCP server designed for Claude Desktop, supporting Gmail management through natural language interaction, including complete functions such as sending emails, label management, and batch operations.
TypeScript
24.9K
4.5 points
M
Minimax MCP Server
The MiniMax Model Context Protocol (MCP) is an official server that provides access to powerful text-to-speech and video/image generation APIs, and works with a variety of client tools such as Claude Desktop and Cursor.
Python
56.1K
4.8 points
AIBase
Zhiqi Future, Your AI Solution Think Tank
© 2026 AIBase