This project provides multiple Model Context Protocol (MCP) servers, including web search, weather query, and memory storage, enabling AI models to integrate external tools and data sources.

What is an MCP server?

An MCP server is a tool server based on the Model Context Protocol standard, enabling AI models to access external data sources and tools. It's like giving AI models 'hands and eyes', allowing them to search the web, obtain weather information, store memories, etc.

How to use these servers?

Run the servers as Docker containers, then configure the connection in an AI application (such as LocalAI). The servers communicate with AI models through a standard protocol, providing real-time tool invocation.
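As a sketch of what that client-side configuration can look like: the `mcpServers` key below follows a convention used by MCP clients such as Claude Desktop, and the image name `example/mcp-web-search` is a hypothetical placeholder, not this project's actual image — consult the project repository for the real names.

```json
{
  "mcpServers": {
    "web-search": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "example/mcp-web-search:latest"]
    }
  }
}
```

The client launches the container on demand and talks to the server over its standard input and output.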

Applicable scenarios

Suitable for any scenario that calls for extending an AI model's capabilities: information retrieval, personalized memory, real-time data acquisition, and more. It is particularly well suited to chatbots, intelligent assistants, and knowledge Q&A systems.

Main Features

Web Search
Search the web in real time using the DuckDuckGo search engine, with a configurable number of search results
Weather Query
Get current weather and multi-day forecasts for cities worldwide, including temperature, wind speed, and weather descriptions
Persistent Memory
Store and retrieve information across sessions, with support for adding, searching, and deleting memory entries, plus timestamp tracking
Containerized Deployment
All servers ship as Docker images, supporting rapid deployment and multi-architecture operation
Data Validation
Strict JSON Schema validation ensures the integrity and correctness of input and output data
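To illustrate the kind of schema such validation relies on — the field names below are hypothetical examples, not taken from the project — a search tool's input might be described like this:

```json
{
  "type": "object",
  "properties": {
    "query": { "type": "string", "description": "Search terms" },
    "max_results": { "type": "integer", "minimum": 1, "default": 5 }
  },
  "required": ["query"]
}
```

A request missing `query`, or passing a non-integer `max_results`, would be rejected before the tool runs.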
Advantages
Ready to use: pre-configured Docker images for rapid deployment
Standardized: Follows the MCP protocol and is compatible with various AI clients
Lightweight: Each server has a focused function and consumes few resources
Scalable: Easy to add new tools and functions
Persistent: The memory server supports cross - session data storage
Limitations
Network dependency: Search and weather services require an Internet connection
Limited scope: each server performs a single, focused function
Configuration requirements: Basic Docker and configuration knowledge is required
Data privacy: Memory server data is stored in local files

How to Use

Select the required server
Choose the appropriate server according to your needs: search, weather, or memory function
Run the Docker container
Use the provided Docker command to start the server container
Configure the AI client
Configure the server connection in LocalAI or other MCP clients
Start using
The AI model can now call the tools provided by the server through natural language
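Under the hood, MCP tool invocation uses JSON-RPC 2.0 messages. The Python sketch below builds the kind of `tools/call` request an MCP client sends on the model's behalf; the tool name `web_search` and its arguments are hypothetical and depend on the tools the server actually exposes.

```python
import json

def make_tool_call(request_id, tool_name, arguments):
    """Build a JSON-RPC 2.0 request in the shape MCP uses for tool calls."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Hypothetical tool name and arguments; real names depend on the server's tool list.
request = make_tool_call(1, "web_search", {"query": "MCP specification", "max_results": 5})
print(json.dumps(request, indent=2))
```

The server replies with a JSON-RPC response containing the tool's result, which the client feeds back to the model.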

Usage Examples

Real-time information retrieval
When an AI model needs the latest information, it automatically calls the search server to obtain real - time data
Travel planning assistant
Provide weather information for the destination when helping users plan a trip
Personalized service
Remember user preferences and provide a personalized experience
Knowledge update
When the information in the AI knowledge base is outdated, search for the latest information to supplement it

Frequently Asked Questions

Do MCP servers require payment?
Do these servers support Chinese?
How to ensure data privacy?
Can multiple servers be run simultaneously?
What is the performance of the servers?
How to add a custom server?

Related Resources

Project code repository
Complete source code and development documentation
Model Context Protocol official website
Official documentation and specifications of the MCP protocol
LocalAI documentation
How to configure and use MCP servers in LocalAI
Docker image repository
Download Docker images for all servers
MCP SDK documentation
Go language MCP development toolkit

Installation

Copy the following command into your client to complete the configuration.
Note: your key is sensitive information; do not share it with anyone.

Alternatives

Airweave
Airweave is an open-source context retrieval layer for AI agents and RAG systems. It connects and synchronizes data from various applications, tools, and databases, and provides relevant, real-time, multi-source contextual information to AI agents through a unified search interface.
Python
14.6K
5 points
Better Icons
An MCP server and CLI tool that provides search and retrieval of over 200,000 icons, supports more than 150 icon libraries, and helps AI assistants and developers quickly obtain and use icons.
TypeScript
8.5K
4.5 points
Assistant Ui
assistant-ui is an open-source TypeScript/React library for quickly building production-grade AI chat interfaces, providing composable UI components, streaming responses, accessibility, and more, with support for multiple AI backends and models.
TypeScript
7.0K
5 points
Next Devtools MCP
The Next.js development tools MCP server provides Next.js development tools and utilities for AI programming assistants such as Claude and Cursor, including runtime diagnostics, development automation, and document access functions.
TypeScript
16.7K
5 points
Praisonai
PraisonAI is a production-ready multi-AI agent framework with self-reflection capabilities, designed to create AI agents to automate the solution of various problems from simple tasks to complex challenges. It simplifies the construction and management of multi-agent LLM systems by integrating PraisonAI agents, AG2, and CrewAI into a low-code solution, emphasizing simplicity, customization, and effective human-machine collaboration.
Python
15.9K
5 points
Haiku.rag
Haiku RAG is an intelligent retrieval-augmented generation system built on LanceDB, Pydantic AI, and Docling. It supports hybrid search, re-ranking, Q&A agents, and multi-agent research workflows, and provides local-first document processing and MCP server integration.
Python
15.4K
5 points
Claude Context
Claude Context is an MCP plugin that gives AI programming assistants in-depth context of an entire codebase through semantic code search. It supports multiple embedding models and vector databases for efficient code retrieval.
TypeScript
30.4K
5 points
Acemcp
Acemcp is an MCP server for codebase indexing and semantic search, supporting automatic incremental indexing, multi-encoding file processing, .gitignore integration, and a Web management interface, helping developers quickly search for and understand code context.
Python
25.0K
5 points
Notion Api MCP
Certified
A Python-based MCP Server that provides advanced to-do list management and content organization functions through the Notion API, enabling seamless integration between AI models and Notion.
Python
21.5K
4.5 points
Markdownify MCP
Markdownify is a multi-functional file conversion service that supports converting multiple formats such as PDFs, images, audio, and web page content into Markdown format.
TypeScript
36.9K
5 points
Gitlab MCP Server
Certified
The GitLab MCP server is a project based on the Model Context Protocol that provides a comprehensive toolset for interacting with GitLab accounts, including code review, merge request management, CI/CD configuration, and other functions.
TypeScript
25.7K
4.3 points
Duckduckgo MCP Server
Certified
The DuckDuckGo Search MCP Server provides web search and content scraping services for LLMs such as Claude.
Python
76.0K
4.3 points
Unity
Certified
UnityMCP is a Unity editor plugin that implements the Model Context Protocol (MCP), providing seamless integration between Unity and AI assistants, including real-time state monitoring, remote command execution, and logging.
C#
34.9K
5 points
Figma Context MCP
Framelink Figma MCP Server provides AI programming tools (such as Cursor) with access to Figma design data. By simplifying Figma API responses, it helps AI convert designs to code more accurately in one click.
TypeScript
67.2K
4.5 points
Context7
Context7 MCP is a service that provides real-time, version-specific documentation and code examples for AI programming assistants. It is directly integrated into prompts through the Model Context Protocol to solve the problem of LLMs using outdated information.
TypeScript
99.8K
4.7 points
Minimax MCP Server
The MiniMax MCP server is an official server that provides access to powerful text-to-speech and video/image generation APIs, and works with client tools such as Claude Desktop and Cursor.
Python
50.3K
4.8 points