Nano Agent

Nano Agent is an experimental, minimal engineering-agent MCP server that supports multiple LLM providers, built to test and compare the agentic capabilities of cloud and local LLMs in terms of performance, speed, and cost. The project includes a multi-model evaluation system, a nested agent architecture, and a unified tool interface, and supports providers such as OpenAI, Anthropic, and Ollama.
2.5 points
7.0K

What is the Nano Agent MCP Server?

Nano Agent is an experimental, minimal engineering-agent MCP server that supports multiple AI model providers, including OpenAI, Anthropic, and local Ollama. It lets users test and compare how different AI models perform on engineering tasks through a unified interface.

How to use the Nano Agent?

It can be used in three ways: 1) direct interaction through the command-line interface (CLI); 2) invocation from MCP clients such as Claude Code; 3) multi-model parallel evaluation using the HOP/LOP pattern.
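For the second mode, an MCP client launches the server over stdio and calls its tools. Below is a minimal sketch using the official mcp Python SDK; the tool name prompt_nano_agent and its argument names are assumptions for illustration, not confirmed API, so check the output of list_tools for the real schema.

# Minimal sketch: connect to the nano-agent MCP server over stdio from Python.
# Assumes the `mcp` package is installed and `nano-agent` is on PATH.
# The tool name and arguments below are assumptions, not confirmed API.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    server = StdioServerParameters(command="nano-agent", args=[])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("exposed tools:", [t.name for t in tools.tools])
            # Hypothetical tool call; use the listed tool names and schemas instead.
            result = await session.call_tool(
                "prompt_nano_agent",
                {"agentic_prompt": "Read README.md and summarize it", "model": "gpt-5-mini"},
            )
            print(result.content)

asyncio.run(main())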

Applicable scenarios

Suitable for developers who need to compare the performance of different AI models, engineers researching AI agent capabilities, and users who want to run models locally for engineering tasks.

Main features

Multi-model provider support
Supports OpenAI (GPT-5), Anthropic (Claude), and local Ollama models through a unified OpenAI SDK interface (see the sketch after this list).
Built-in engineering tools
Provides basic engineering tools such as file reading and writing, directory listing, and file editing, supporting automated task execution.
Parallel model evaluation
Using the HOP/LOP pattern, up to 9 different models can be tested simultaneously on the same task.
Cost tracking
Automatically calculates and compares the usage costs of different models to help users make cost-effective choices.
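The unified interface works because both cloud and local providers can be reached through the same OpenAI SDK code path by pointing it at a different base URL; Ollama, for example, exposes an OpenAI-compatible endpoint locally. A minimal sketch of the idea follows; the model names are illustrative and not taken from the project.

# Minimal sketch: one OpenAI-SDK code path, two providers.
# Ollama serves an OpenAI-compatible API at http://localhost:11434/v1 by default.
# Model names are illustrative; substitute the ones you actually run.
import os
from openai import OpenAI

providers = {
    "openai": OpenAI(api_key=os.environ["OPENAI_API_KEY"]),
    "ollama": OpenAI(base_url="http://localhost:11434/v1", api_key="ollama"),  # key is ignored locally
}
models = {"openai": "gpt-4o-mini", "ollama": "gpt-oss:20b"}

for name, client in providers.items():
    resp = client.chat.completions.create(
        model=models[name],
        messages=[{"role": "user", "content": "List the files a coding agent should read first in a new repo."}],
    )
    print(name, resp.choices[0].message.content[:120])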
Advantages
Unified interface supports multiple AI model providers
Can run a 120B-parameter model locally on a Mac (M4 Max)
Provides detailed performance, speed, and cost comparison data
Simple installation and usage process
Limitations
Currently mainly focused on engineering tasks, with limited generality
Local large models require high-performance hardware support
Some advanced features require an API key

How to use

Installation preparation
Install Astral UV, Claude Code, and Ollama, and obtain the necessary API keys.
Environment configuration
Copy the .env configuration file and fill in the API keys and other parameters (see the check sketch after these steps).
Global installation
Run the installation script to make nano-agent globally available.
Configure the MCP client
Set up the .mcp.json file to connect MCP clients such as Claude Code.
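Before starting the server, it is worth confirming the environment is actually populated. The sketch below uses the standard variable names read by the OpenAI and Anthropic SDKs; that nano-agent reads exactly these names from .env is an assumption.

# Minimal sketch: load .env and check that provider keys are present.
# OPENAI_API_KEY / ANTHROPIC_API_KEY are the standard SDK variable names;
# that nano-agent expects exactly these is an assumption.
import os
from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv()  # reads .env from the current directory
for key in ("OPENAI_API_KEY", "ANTHROPIC_API_KEY"):
    print(f"{key}: {'set' if os.getenv(key) else 'MISSING'}")
# Ollama runs locally and does not need an API key.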

Usage examples

Basic file operations
Let the agent create, edit, and read files
Code analysis
Let the agent analyze Python code
Multi-model comparison
Compare how different models perform on the same task (see the sketch below)
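At its core, the comparison workflow runs the same prompt against several models and records tokens, latency, and cost. Here is a minimal sketch of that loop, independent of nano-agent's own HOP/LOP evaluation harness; the per-million-token prices are placeholders to replace with current provider pricing.

# Minimal sketch: same task, several models, compare speed and rough cost.
# Prices are placeholders per 1M tokens (input, output), not real pricing.
import time
from openai import OpenAI

client = OpenAI()
PRICE_PER_1M = {"gpt-4o-mini": (0.15, 0.60), "gpt-4o": (2.50, 10.00)}  # placeholders
task = "Write a Python function that reverses the lines of a text file."

for model, (p_in, p_out) in PRICE_PER_1M.items():
    start = time.perf_counter()
    resp = client.chat.completions.create(
        model=model, messages=[{"role": "user", "content": task}]
    )
    elapsed = time.perf_counter() - start
    usage = resp.usage
    cost = usage.prompt_tokens / 1e6 * p_in + usage.completion_tokens / 1e6 * p_out
    print(f"{model}: {elapsed:.1f}s, {usage.total_tokens} tokens, ~${cost:.4f}")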

Frequently Asked Questions

What hardware configuration is required to run local large models?
How to add a new AI model provider?
Why is the cost of Claude Opus particularly high?
Can the agent tools be customized?

Related resources

Astral UV Installation Guide
Installation documentation for the Python package management tool UV
Claude Code Documentation
Official documentation for Anthropic Claude Code
Ollama Official Website
Tool for running large models locally
Demo Video
Demonstration of using Nano Agent
GitHub Repository
Project source code

Installation

Copy one of the following configurations into your MCP client. The first assumes nano-agent has been installed globally; the second runs the server from the repository checkout with uv.
{
  "mcpServers": {
    "nano-agent": {
      "command": "nano-agent",
      "args": []
    }
  }
}

{
  "mcpServers": {
    "nano-agent": {
      "command": "uv",
      "args": ["--directory", "apps/nano_agent_mcp_server", "run", "nano-agent"]
    }
  }
}
Note: your API key is sensitive information; do not share it with anyone.

Alternatives

Acemcp
Acemcp is an MCP server for codebase indexing and semantic search, supporting automatic incremental indexing, multi-encoding file processing, .gitignore integration, and a Web management interface, helping developers quickly search for and understand code context.
Python
9.2K
5 points
Blueprint MCP
Blueprint MCP is a chart generation tool based on the Arcade ecosystem. It uses technologies such as Nano Banana Pro to automatically generate visual charts such as architecture diagrams and flowcharts by analyzing codebases and system architectures, helping developers understand complex systems.
Python
8.1K
4 points
MCP Agent Mail
MCP Agent Mail is a mail-based coordination layer designed for AI programming agents, providing identity management, message sending and receiving, file reservation, and search functions, supporting asynchronous collaboration and conflict avoidance among multiple agents.
Python
8.4K
5 points
Klavis
Klavis AI is an open-source project that provides simple, easy-to-use MCP (Model Context Protocol) services on Slack, Discord, and the web. It includes functions such as report generation, YouTube tools, and document conversion, letting both non-technical users and developers use AI workflows.
TypeScript
13.9K
5 points
MCP
The official Microsoft MCP server gives AI assistants search and access to the latest Microsoft technical documentation.
11.9K
5 points
Aderyn
Aderyn is an open-source Solidity smart contract static analysis tool written in Rust, which helps developers and security researchers discover vulnerabilities in Solidity code. It supports Foundry and Hardhat projects, can generate reports in multiple formats, and provides a VSCode extension.
Rust
10.6K
5 points
Devtools Debugger MCP
The Node.js Debugger MCP server provides complete debugging capabilities based on the Chrome DevTools protocol, including breakpoint setting, stepping execution, variable inspection, and expression evaluation.
TypeScript
8.9K
4 points
Scrapling
Scrapling is an adaptive web scraping library that can automatically learn website changes and re-locate elements. It supports multiple scraping methods and AI integration, providing high-performance parsing and a developer-friendly experience.
Python
10.7K
5 points
Notion Api MCP
Certified
A Python-based MCP Server that provides advanced to-do list management and content organization functions through the Notion API, enabling seamless integration between AI models and Notion.
Python
17.5K
4.5 points
Markdownify MCP
Markdownify is a multi-functional file conversion service that supports converting multiple formats such as PDFs, images, audio, and web page content into Markdown format.
TypeScript
28.3K
5 points
Gitlab MCP Server
Certified
The GitLab MCP server is a project based on the Model Context Protocol that provides a comprehensive toolset for interacting with GitLab accounts, including code review, merge request management, CI/CD configuration, and other functions.
TypeScript
17.4K
4.3 points
Duckduckgo MCP Server
Certified
The DuckDuckGo Search MCP Server provides web search and content scraping services for LLMs such as Claude.
Python
54.4K
4.3 points
Figma Context MCP
Framelink Figma MCP Server is a server that provides access to Figma design data for AI programming tools (such as Cursor). By simplifying the Figma API response, it helps AI more accurately achieve one-click conversion from design to code.
TypeScript
51.0K
4.5 points
Unity
Certified
UnityMCP is a Unity editor plugin that implements the Model Context Protocol (MCP), providing seamless integration between Unity and AI assistants, including real-time state monitoring, remote command execution, and log functions.
C#
23.0K
5 points
Minimax MCP Server
The MiniMax MCP server is an official server providing interaction with powerful text-to-speech and video/image generation APIs, suitable for client tools such as Claude Desktop and Cursor.
Python
35.4K
4.8 points
Context7
Context7 MCP is a service that provides real-time, version-specific documentation and code examples for AI programming assistants. It is directly integrated into prompts through the Model Context Protocol to solve the problem of LLMs using outdated information.
TypeScript
75.3K
4.7 points