Tome
Tome is a macOS application (support for Windows and Linux is coming soon) developed by the Runebook team. It aims to simplify the use of local large language models (LLMs) with MCP servers. By integrating Ollama and managing MCP servers, it lets users start chatting with MCP-powered models quickly, without dealing with complex configuration.
Rating: 2.5 points
Downloads: 8.3K
What is Tome?
Tome is an application designed specifically for macOS (support for Windows and Linux is coming soon). It aims to simplify the use of local large language models (LLMs) with MCP servers. Developed by the Runebook team, it provides an intuitive interface for managing MCP servers without the need to manually handle complex configurations or command-line tools.
How to use Tome?
After installing Tome and Ollama, simply paste an MCP server command (e.g., `uvx mcp-server-fetch`) into the MCP tab of Tome to connect quickly and start interacting with the model.
Use cases
Suitable for developers, researchers, and anyone interested in local LLMs and MCP technology, especially non-technical users who want to avoid complex configuration.
Main Features
One-click MCP server management
Automatically handles the installation and configuration of MCP servers, so there is no need to run uv/npm manually or edit JSON files.
Ollama integration
Supports local or remote Ollama instances for flexible model connection configuration (see the sketch after this feature list).
User-friendly interface
A simple chat interface that lowers the technical barrier, making the app approachable for non-technical users.
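For the remote Ollama option mentioned above, here is a minimal sketch of what that setup can look like, assuming the standard Ollama CLI (which listens on port 11434 by default). The exact connection setting inside Tome may be named differently, and the model tag is illustrative:

```
# On the machine that will host the models, make Ollama reachable from
# other devices by binding to all interfaces (default port 11434):
OLLAMA_HOST=0.0.0.0:11434 ollama serve

# Pull a tool-calling model on that machine (tag is illustrative):
ollama pull qwen2.5:14b

# In Tome, point the Ollama connection at the remote instance, e.g.
#   http://<ollama-host>:11434
```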
Advantages
Completely local-first, giving users full control over their data
Can be used without programming or configuration experience
Supports quick switching between different MCP servers and models
Limitations
Currently only supports macOS (other platforms are coming soon)
Relies on Ollama as the backend
There may be stability issues in the technical preview version
How to Use
Install necessary software
Download and install Tome and Ollama. Ollama can be a local or remote instance.
Install a model that supports tool calls
In Ollama, install a model that supports tool calls, such as Qwen2.5 7B or 14B (the sketch after these steps shows the corresponding terminal commands).
Add an MCP server
In the MCP tab of Tome, paste the MCP server command (e.g., `uvx mcp-server-fetch`) and install it.
Start chatting
Interact with your MCP-powered model in the chat interface and try asking it to perform tasks such as fetching information from the web.
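The steps above map onto just a couple of terminal commands. A minimal sketch, assuming Ollama and uv/uvx are already installed and that the Qwen2.5 tags are still current (check the Ollama model library):

```
# Step 2: pull a model that supports tool calls (tag is illustrative)
ollama pull qwen2.5:14b

# Step 3: this is the command you paste into Tome's MCP tab. Run standalone,
# it starts the reference Fetch server, which speaks MCP over stdio;
# press Ctrl+C to stop it.
uvx mcp-server-fetch
```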
Usage Examples
Get the top story on Hacker News
Ask the model to get the current top story on Hacker News through the Fetch MCP server
Local data query
After configuring an appropriate MCP server, query information in local files or databases (see the sketch below for example server commands)
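For the local data query example, the exact command depends on which MCP server you install. A hedged sketch using two commonly cited servers; verify the package names in the MCP server repository linked below, and note that the paths are illustrative:

```
# Filesystem access via the npm-based reference server (path is illustrative):
npx -y @modelcontextprotocol/server-filesystem ~/Documents

# SQLite queries via a Python-based server run with uvx (path is illustrative):
uvx mcp-server-sqlite --db-path ~/data/example.db
```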
Frequently Asked Questions
Which operating systems does Tome support?
How much RAM do I need to run larger models?
How can I report issues or make suggestions?
Related Resources
Tome download page
Get the latest version of the Tome application
Ollama official website
Download and manage local LLM models
Runebook Discord community
Join the community to get help and share experiences
MCP server repository
Explore more available MCP servers

GitLab MCP Server
Certified
The GitLab MCP server is a Model Context Protocol project that provides a comprehensive toolset for interacting with GitLab accounts, including code review, merge request management, and CI/CD configuration.
TypeScript
16.6K
4.3 points

Notion API MCP
Certified
A Python-based MCP server that provides advanced to-do list management and content organization through the Notion API, enabling seamless integration between AI models and Notion.
Python
14.8K
4.5 points

DuckDuckGo MCP Server
Certified
The DuckDuckGo Search MCP Server provides web search and content scraping services for LLMs such as Claude.
Python
43.9K
4.3 points

Markdownify MCP
Markdownify is a multi-functional file conversion service that converts multiple formats, such as PDFs, images, audio, and web page content, into Markdown.
TypeScript
23.5K
5 points

Unity
Certified
UnityMCP is a Unity editor plugin that implements the Model Context Protocol (MCP), providing seamless integration between Unity and AI assistants, including real-time state monitoring, remote command execution, and logging.
C#
19.2K
5 points

Figma Context MCP
Framelink Figma MCP Server is a server that provides access to Figma design data for AI programming tools (such as Cursor). By simplifying Figma API responses, it helps AI achieve more accurate one-click conversion from design to code.
TypeScript
44.4K
4.5 points

MiniMax MCP Server
The MiniMax Model Context Protocol (MCP) server is an official server that provides access to powerful text-to-speech and video/image generation APIs, and works with various client tools such as Claude Desktop and Cursor.
Python
30.2K
4.8 points

Gmail MCP Server
An auto-authenticating Gmail MCP server designed for Claude Desktop, supporting Gmail management through natural language interaction, including sending emails, label management, batch operations, and more.
TypeScript
14.8K
4.5 points

