Nano Agent

Nano Agent is an experimental, lightweight engineering-agent MCP server that supports multiple LLM providers, used to test and compare the agentic capabilities of cloud and local LLMs in terms of performance, speed, and cost. The project includes a multi-model evaluation system, a nested agent architecture, and a unified tool interface, and supports providers such as OpenAI, Anthropic, and Ollama.
2.5 points
6.2K

What is the Nano Agent MCP Server?

Nano Agent is an experimental, lightweight engineering-agent MCP server that supports multiple AI model providers such as OpenAI, Anthropic, and local Ollama. It allows users to test and compare the performance of different AI models on engineering tasks through a unified interface.

How to use the Nano Agent?

It can be used in three ways: 1) Direct interaction through the command-line interface (CLI); 2) Invocation through MCP clients such as Claude Code; 3) Multi-model parallel evaluation using the HOP/LOP mode.
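For the MCP-client path, the server can also be driven programmatically rather than from Claude Code. The sketch below uses the official mcp Python SDK to launch nano-agent over stdio and call its agent tool. The tool name prompt_nano_agent and its argument name are assumptions based on the project description, not confirmed from the repository; list_tools() shows what the server actually exposes.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the globally installed server over stdio, matching the
    # .mcp.json configuration shown in the Installation section below.
    params = StdioServerParameters(command="nano-agent", args=[])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover the tools the server actually exposes.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])
            # Assumed tool name and argument -- check the list_tools()
            # output for the real schema before relying on this.
            result = await session.call_tool(
                "prompt_nano_agent",
                {"agentic_prompt": "List the files in the current directory."},
            )
            print(result.content)


asyncio.run(main())

Listing the tools first keeps the sketch honest: whatever names and schemas the server reports are what a client should call.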

Applicable scenarios

Suitable for developers who need to compare the performance of different AI models, engineers who study AI agent capabilities, and users who want to run AI models locally for engineering tasks.

Main features

Multi-model provider support
Supports OpenAI (GPT-5), Anthropic (Claude), and local Ollama models through a unified OpenAI SDK interface (see the sketch after this list).
Built-in engineering tools
Provides basic engineering tools such as file reading and writing, directory listing, and file editing, supporting automated task execution.
Parallel model evaluation
The HOP/LOP mode can test up to 9 different models on the same task simultaneously.
Cost tracking
Automatically calculates and compares the usage costs of different models to help users make cost-effective choices.
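The unified-interface and cost-tracking features above rest on one pattern: each provider is reached through an OpenAI-compatible endpoint, and the token usage returned with every response feeds the cost comparison. The sketch below illustrates that pattern with the OpenAI Python SDK rather than nano-agent's internal code; the model names and the local Ollama endpoint are assumptions about a typical setup.

import os

from openai import OpenAI

# One client per provider; Ollama exposes an OpenAI-compatible API locally,
# so the same SDK and the same call shape work for both.
providers = {
    "openai": OpenAI(api_key=os.environ["OPENAI_API_KEY"]),
    "ollama": OpenAI(base_url="http://localhost:11434/v1", api_key="ollama"),
}
models = {"openai": "gpt-4o-mini", "ollama": "gpt-oss:20b"}  # illustrative names

for name, client in providers.items():
    response = client.chat.completions.create(
        model=models[name],
        messages=[{"role": "user", "content": "Read README.md and summarize it."}],
    )
    usage = response.usage
    print(name, "->", response.choices[0].message.content)
    # Prompt/completion token counts are the raw input to any
    # per-model cost comparison.
    print("  tokens:", usage.prompt_tokens, usage.completion_tokens)

The HOP/LOP evaluation runs this kind of comparison across up to 9 models concurrently rather than in a simple loop.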
Advantages
Unified interface supports multiple AI model providers
Can run large models with 120B parameters on a local Mac (M4 Max)
Provides detailed performance, speed, and cost comparison data
Simple installation and usage process
Limitations
Currently mainly focused on engineering tasks, with limited generality
Local large models require high-performance hardware support
Some advanced features require an API key

How to use

Installation preparation
Install Astral UV, Claude Code, and Ollama, and obtain the necessary API keys.
Environment configuration
Copy the .env configuration file and fill in the API keys and other parameters (a sketch follows these steps).
Global installation
Run the installation script to make nano-agent globally available.
Configure the MCP client
Set up the .mcp.json file to connect MCP clients such as Claude Code.
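A sketch of the .env step, assuming the standard variable names read by the OpenAI and Anthropic SDKs; check the repository's sample file for the exact set nano-agent expects.

# .env (sketch; confirm variable names against the project's sample file)
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
# Ollama runs locally and typically needs no API key; the host is shown only for illustration.
OLLAMA_HOST=http://localhost:11434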

Usage examples

Basic file operations
Let the agent create, edit, and read files
Code analysis
Let the agent analyze Python code
Multi-model comparison
Compare the performance of different models on the same task

Frequently Asked Questions

What hardware configuration is required to run local large models?
How to add a new AI model provider?
Why is the cost of Claude Opus particularly high?
Can the agent tools be customized?

Related resources

Astral UV Installation Guide
Installation documentation for the Python package management tool UV
Claude Code Documentation
Official documentation for Anthropic Claude Code
Ollama Official Website
Tool for running large models locally
Demo Video
Demonstration of using Nano Agent
GitHub Repository
Project source code

Installation

Copy one of the following configurations into your MCP client. The first assumes nano-agent is installed globally; the second runs the server from the apps/nano_agent_mcp_server directory with uv.
{
  "mcpServers": {
    "nano-agent": {
      "command": "nano-agent",
      "args": []
    }
  }
}

{
  "mcpServers": {
    "nano-agent": {
      "command": "uv",
      "args": ["--directory", "apps/nano_agent_mcp_server", "run", "nano-agent"]
    }
  }
}
Note: your API key is sensitive information; do not share it with anyone.

Alternatives

K
Klavis
Klavis AI is an open-source project that provides a simple and easy-to-use MCP (Model Context Protocol) service on Slack, Discord, and Web platforms. It includes functions such as report generation, YouTube tools, and document conversion, enabling both non-technical users and developers to use AI workflows.
TypeScript
9.4K
5 points
M
MCP
The official Microsoft MCP server provides AI assistants with search and access to the latest Microsoft technical documentation.
9.1K
5 points
A
Aderyn
Aderyn is an open-source Solidity smart contract static analysis tool written in Rust, which helps developers and security researchers discover vulnerabilities in Solidity code. It supports Foundry and Hardhat projects, can generate reports in multiple formats, and provides a VSCode extension.
Rust
4.9K
5 points
D
Devtools Debugger MCP
The Node.js Debugger MCP server provides complete debugging capabilities based on the Chrome DevTools protocol, including breakpoint setting, stepping execution, variable inspection, and expression evaluation.
TypeScript
6.5K
4 points
S
Scrapling
Scrapling is an adaptive web scraping library that can automatically learn website changes and re-locate elements. It supports multiple scraping methods and AI integration, providing high-performance parsing and a developer-friendly experience.
Python
7.9K
5 points
M
Mcpjungle
MCPJungle is a self-hosted MCP gateway used to centrally manage and proxy multiple MCP servers, providing a unified tool access interface for AI agents.
Go
0
4.5 points
C
Cipher
Cipher is an open-source memory layer framework designed for programming AI agents. It integrates with various IDEs and AI coding assistants through the MCP protocol, providing core functions such as automatic memory generation, team memory sharing, and dual-system memory management.
TypeScript
0
5 points
N
Nexus
Nexus is an AI tool aggregation gateway that supports connecting multiple MCP servers and LLM providers, providing tool search, execution, and model routing functions through a unified endpoint, and supporting security authentication and rate limiting.
Rust
0
4 points
N
Notion Api MCP
Certified
A Python-based MCP Server that provides advanced to-do list management and content organization functions through the Notion API, enabling seamless integration between AI models and Notion.
Python
14.8K
4.5 points
M
Markdownify MCP
Markdownify is a multi-functional file conversion service that supports converting multiple formats such as PDFs, images, audio, and web page content into Markdown format.
TypeScript
24.8K
5 points
G
Gitlab MCP Server
Certified
The GitLab MCP server is a project based on the Model Context Protocol that provides a comprehensive toolset for interacting with GitLab accounts, including code review, merge request management, CI/CD configuration, and other functions.
TypeScript
15.6K
4.3 points
D
Duckduckgo MCP Server
Certified
The DuckDuckGo Search MCP Server provides web search and content scraping services for LLMs such as Claude.
Python
44.6K
4.3 points
U
Unity
Certified
UnityMCP is a Unity editor plugin that implements the Model Context Protocol (MCP), providing seamless integration between Unity and AI assistants, including real-time state monitoring, remote command execution, and logging functions.
C#
20.3K
5 points
F
Figma Context MCP
Framelink Figma MCP Server is a server that provides access to Figma design data for AI programming tools (such as Cursor). By simplifying the Figma API response, it helps AI more accurately achieve one-click conversion from design to code.
TypeScript
45.7K
4.5 points
G
Gmail MCP Server
A Gmail automatic authentication MCP server designed for Claude Desktop, supporting Gmail management through natural language interaction, including complete functions such as sending emails, label management, and batch operations.
TypeScript
15.0K
4.5 points
M
Minimax MCP Server
The MiniMax Model Context Protocol (MCP) server is an official server that provides interaction with powerful text-to-speech and video/image generation APIs, and works with client tools such as Claude Desktop and Cursor.
Python
29.4K
4.8 points