Adc MCP Project
An enterprise-level AI assistant system based on the Model Context Protocol, with intelligent server selection, text analysis, code review, sentiment analysis, and knowledge management functions, providing an aesthetically pleasing Web interface.
Rating: 2.5 points
Downloads: 7.4K
What is the MCP Enterprise AI Assistant?
This is an enterprise-level AI assistant system built on the Model Context Protocol (MCP) standard. Through intelligent routing, the system automatically assigns user requests to the most suitable professional server and provides four core functions: text analysis, code review, sentiment analysis, and knowledge management. It supports both local AI models (Ollama) and cloud-based AI services, and comes with a complete Web interface and API.
How to use the MCP Enterprise AI Assistant?
The system offers two usage methods: 1) direct interaction through the Web interface, or 2) integration into other applications via the API. Users simply enter a question or upload a document in the Web interface, and the system automatically recognizes the task type and routes it to the corresponding professional server. Both natural language queries and structured requests are supported.
Applicable Scenarios
Suitable for scenarios such as enterprise document processing, code quality inspection, customer feedback analysis, and knowledge base queries. It is particularly well suited to enterprise environments that need multiple professional AI services to work together, such as software development teams, customer support departments, and content creation teams.
Main Features
Standard MCP Protocol Implementation
Fully compliant with the JSON-RPC 2.0 and Model Context Protocol standards, supporting standard MCP methods such as initialize, tools/list, and tools/call to ensure compatibility with other MCP systems.
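To make the request shape concrete, here is a minimal Python sketch of a JSON-RPC 2.0 call. Only the method names above (initialize, tools/list, tools/call) come from the description; the /mcp endpoint path and the text_analysis tool name are assumptions for illustration.

```python
# Minimal sketch of a JSON-RPC 2.0 request to an MCP-style endpoint.
# The /mcp path and the "text_analysis" tool name are hypothetical; only the
# methods initialize, tools/list, and tools/call come from the description above.
import json
import urllib.request

def mcp_call(method, params=None, request_id=1, url="http://localhost:5000/mcp"):
    """Send one JSON-RPC 2.0 request and return the parsed response."""
    payload = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": method,
        "params": params or {},
    }
    request = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read().decode("utf-8"))

# Discover the available tools, then invoke one of them.
tools = mcp_call("tools/list", request_id=1)
summary = mcp_call(
    "tools/call",
    {"name": "text_analysis", "arguments": {"text": "Summarize this document."}},
    request_id=2,
)
```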
Intelligent Server Routing
An AI-driven intelligent routing system that automatically analyzes the content of user requests and selects the most suitable professional server (text analysis, code review, sentiment analysis, knowledge management) for processing.
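The routing logic itself is not detailed in this overview. The sketch below only approximates content-based server selection with simple keyword matching; it is an illustration, not the project's actual AI-driven implementation.

```python
# Hypothetical illustration of content-based routing across the four servers;
# the real system uses an AI model rather than fixed keyword rules.
ROUTES = {
    "code_review": ["def ", "class ", "function", "bug", "refactor"],
    "sentiment_analysis": ["feel", "angry", "happy", "complaint", "review"],
    "knowledge_management": ["policy", "process", "how do we", "documentation"],
}

def select_server(request_text: str) -> str:
    """Pick the server whose keywords best match the request; default to text analysis."""
    text = request_text.lower()
    scores = {
        server: sum(keyword in text for keyword in keywords)
        for server, keywords in ROUTES.items()
    }
    best_server, best_score = max(scores.items(), key=lambda item: item[1])
    return best_server if best_score > 0 else "text_analysis"

print(select_server("Please check this Python function for bugs"))  # code_review
```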
Text Analysis Service
Provides functions such as text summarization, entity extraction, and content classification, supporting long-document processing and multi-language text analysis.
Code Review Service
Automated code quality analysis, detecting potential bugs and providing improvement suggestions, supporting multiple programming languages (Python, JavaScript, Java, etc.).
Sentiment Analysis Service
Advanced sentiment detection and emotion analysis, providing sentiment scores and detailed sentiment classification, suitable for scenarios such as customer feedback and social media content analysis.
Knowledge Management Service
Document Q&A and information retrieval functions, supporting knowledge base queries and document content understanding, helping enterprises manage and utilize internal knowledge resources.
Aesthetically Pleasing Web Interface
A professional Web user interface that presents analysis results in a structured manner, supports real-time interaction and history viewing, and provides an intuitive user experience.
Multi-AI Model Support
Supports local Ollama models and Azure OpenAI services, allowing flexible configuration of different AI models according to requirements to balance performance and cost.
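The exact configuration mechanism is not documented here; the sketch below shows one common pattern for switching between an Ollama backend and Azure OpenAI. All environment variable names and defaults are illustrative assumptions, not the project's documented settings.

```python
# Hypothetical backend selection between a local Ollama model and Azure OpenAI;
# variable names and defaults are illustrative, not taken from the project.
import os

def load_model_config() -> dict:
    backend = os.getenv("AI_BACKEND", "ollama")  # "ollama" or "azure"
    if backend == "ollama":
        return {
            "backend": "ollama",
            "base_url": os.getenv("OLLAMA_URL", "http://localhost:11434"),
            "model": os.getenv("OLLAMA_MODEL", "llama3.2:3b"),
        }
    return {
        "backend": "azure",
        "endpoint": os.environ["AZURE_OPENAI_ENDPOINT"],
        "api_key": os.environ["AZURE_OPENAI_API_KEY"],
        "deployment": os.getenv("AZURE_OPENAI_DEPLOYMENT", "gpt-4o-mini"),
    }
```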
Advantages
Modular architecture: Four professional servers perform their respective duties, improving processing accuracy.
Intelligent routing: Automatically selects the most suitable server without manual specification by the user.
Local deployment: Supports local Ollama AI models to protect data privacy.
Standard protocol: Based on the MCP standard, easy to expand and integrate.
User-friendly: Provides a complete Web interface, lowering the barrier to entry.
Flexible configuration: Supports multiple AI models and server configurations.
Limitations
Resource requirements: Local operation requires sufficient computing resources (memory, CPU).
Model limitations: The performance of local models may be inferior to that of large-scale cloud models.
Initial configuration: Requires the installation and configuration of multiple components (Python, Ollama, etc.).
Concurrency limitations: The processing capacity of a single server is limited, and expansion is required in high - concurrency scenarios.
Knowledge scope: The knowledge management service depends on the pre-loaded knowledge base content.
How to Use
Environment Preparation
Install Python 3.8+, Ollama, and Git to ensure that the system meets the operating requirements.
Download AI Models
Use Ollama to download the required AI model (llama3.2:3b is recommended), for example: ollama pull llama3.2:3b.
Get Project Code
Clone the project code from GitHub to the local machine.
Install Dependencies
Install the Python dependencies so that all required libraries are available.
Start the System
Use the provided script to start all servers and the Web application with one click.
Access the Web Interface
Open a browser and access http://localhost:5000 to start using the AI assistant.
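Once the servers are running, a quick reachability check against the address above can confirm the Web application started correctly. Everything except the http://localhost:5000 URL is an illustrative assumption.

```python
# Quick reachability check for the local Web interface started in the previous step.
import urllib.request

try:
    with urllib.request.urlopen("http://localhost:5000", timeout=5) as response:
        print("Web interface is up, HTTP status:", response.status)
except OSError as exc:
    print("Web interface not reachable yet:", exc)
```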
Usage Examples
Technical Document Summarization
A user has a long technical document and needs to quickly understand its main content.
Code Quality Inspection
A developer needs to check if there are potential issues in newly written Python code.
Customer Feedback Analysis
A product manager needs to analyze the sentiment and main concerns in customer feedback (a sample request for this scenario appears after these examples).
Knowledge Base Query
A new employee needs to learn about the company's project management process.
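As an illustration, the customer feedback scenario above could be expressed as a JSON-RPC 2.0 tools/call payload like the sketch below. The sentiment_analysis tool name and the argument keys are assumptions rather than documented parameters.

```python
# Hypothetical JSON-RPC 2.0 payload for the customer-feedback scenario;
# the tool name and argument keys are illustrative, not documented by the project.
import json

payload = {
    "jsonrpc": "2.0",
    "id": 3,
    "method": "tools/call",
    "params": {
        "name": "sentiment_analysis",
        "arguments": {
            "text": "The new dashboard is confusing and support took three days to reply."
        },
    },
}
print(json.dumps(payload, indent=2))
```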
Frequently Asked Questions
Does the system need to be connected to the Internet?
Which programming languages are supported for code review?
How to handle long documents?
Can the AI models be customized?
What is the system's response speed?
How to expand new functional servers?
How to ensure data privacy?
Does it support multiple users using it simultaneously?
Related Resources
Model Context Protocol Official Documentation
Official specifications and standard documents for the MCP protocol.
Ollama Official Website
A platform for running AI models locally that supports multiple open-source models.
Project GitHub Repository
The source code and latest version of this project.
Python Official Website
Official download and learning resources for the Python programming language.
Flask Web Framework Documentation
The Python Web framework used for the Web interface of this project.
Llama Model Introduction
Introduction to the open-source Llama series of AI models by Meta.

Gitlab MCP Server
Certified
The GitLab MCP server is a project based on the Model Context Protocol that provides a comprehensive toolset for interacting with GitLab accounts, including code review, merge request management, CI/CD configuration, and other functions.
TypeScript
18.0K
4.3 points

Notion Api MCP
Certified
A Python-based MCP Server that provides advanced to-do list management and content organization functions through the Notion API, enabling seamless integration between AI models and Notion.
Python
17.4K
4.5 points

Markdownify MCP
Markdownify is a multi-functional file conversion service that supports converting multiple formats such as PDFs, images, audio, and web page content into Markdown format.
TypeScript
26.4K
5 points

Duckduckgo MCP Server
Certified
The DuckDuckGo Search MCP Server provides web search and content scraping services for LLMs such as Claude.
Python
53.3K
4.3 points

Unity
Certified
UnityMCP is a Unity editor plugin that implements the Model Context Protocol (MCP), providing seamless integration between Unity and AI assistants, including real-time state monitoring, remote command execution, and logging functions.
C#
23.2K
5 points

Figma Context MCP
Framelink Figma MCP Server is a server that provides access to Figma design data for AI programming tools (such as Cursor). By simplifying the Figma API response, it helps AI more accurately achieve one-click conversion from design to code.
TypeScript
51.0K
4.5 points

Gmail MCP Server
A Gmail automatic authentication MCP server designed for Claude Desktop, supporting Gmail management through natural language interaction, including complete functions such as sending emails, label management, and batch operations.
TypeScript
18.1K
4.5 points

Minimax MCP Server
The MiniMax Model Context Protocol (MCP) server is an official server that supports interaction with powerful text-to-speech and video/image generation APIs, and works with various client tools such as Claude Desktop and Cursor.
Python
36.1K
4.8 points
