MCP-RAG: Agentic AI Orchestration for Business Analytics
This project is a lightweight demonstration that combines the Model Context Protocol (MCP) with Retrieval-Augmented Generation (RAG) to orchestrate multi-agent AI workflows for business analysis, offering a practical way to streamline business data processing and analysis.
Quick Start
Prerequisites
- Python 3.8+
- Google Gemini API key (free tier available) - for future LLM integration
- Basic understanding of MCP and RAG concepts
Installation
1. Clone the repository:

git clone https://github.com/ANSH-RIYAL/MCP-RAG.git
cd MCP-RAG

2. Install dependencies:

pip install -r requirements.txt

3. Set up environment variables:

For the Gemini API (default):

export LLM_MODE="gemini"
export GEMINI_API_KEY="your-gemini-api-key"

For a custom localhost API:

export LLM_MODE="custom"
export CUSTOM_API_URL="http://localhost:8000"
export CUSTOM_API_KEY="your-api-key"  # Optional
Quick Demo
Run the demonstration script to see both MCP servers in action:
python main.py
Features
- MCP-Based Coordination: Multiple specialized servers work together seamlessly to handle different aspects of business analysis.
- Business Analytics Tools: Statistical tools such as mean, standard deviation, correlation, and linear regression enable in-depth data analysis.
- RAG Knowledge Base: Business terms, policies, and analysis guidelines can be retrieved to provide context-aware information.
- Modular Design: New tools can be added and LLM backends swapped without significant code changes.
- Natural Language Interface: Users can ask questions in natural language, like "What's the average earnings from Q1?", making the system accessible to non-technical users.
Usage Examples
LLM Backend Selection
Option 1: Google Gemini API (Default)
export LLM_MODE="gemini"
export GEMINI_API_KEY="your-gemini-api-key"
python main.py
Option 2: Custom Localhost API
export LLM_MODE="custom"
export CUSTOM_API_URL="http://localhost:8000"
export CUSTOM_API_KEY="your-api-key" # Optional
python main.py
Conversation Scenarios
Run the conversation scenarios to see real-world usage examples:
python test_scenarios.py
Business Analytics Tools
- Data Exploration: Get dataset information and sample data.
- Statistical Analysis: Calculate mean, standard deviation with filtering.
- Correlation Analysis: Find relationships between variables.
- Predictive Modeling: Use linear regression for forecasting.
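Under the hood, these statistical tools reduce to straightforward pandas operations. A minimal sketch of how calculate_mean's filtering might work (the DataFrame below is illustrative, not the project's actual sample data):

```python
import pandas as pd

def calculate_mean(df, column, filter_column=None, filter_value=None):
    """Mean of a numeric column, optionally filtered on another column."""
    if filter_column is not None:
        df = df[df[filter_column] == filter_value]
    return float(df[column].mean())

# Illustrative data mirroring the sample dataset's shape
df = pd.DataFrame({
    "quarter": ["Q1-2024", "Q1-2024", "Q2-2024"],
    "earnings": [100000.00, 103333.34, 120000.00],
})
print(calculate_mean(df, "earnings", "quarter", "Q1-2024"))  # 101666.67
```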
RAG Knowledge Retrieval
- Term Definitions: Look up business concepts.
- Policy Information: Retrieve company procedures.
- Analysis Guidelines: Get context for data interpretation.
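These retrieval tools can start as a simple keyed lookup over the knowledge base. A minimal sketch of get_business_terms (the dictionary entries are examples; the real server loads them from data/business_knowledge.txt):

```python
# Example entries; the actual server loads these from the knowledge base file
BUSINESS_TERMS = {
    "earnings": "Total revenue generated by a department or company in a given period",
    "profit margin": "Percentage of revenue that remains as profit after expenses",
}

def get_business_terms(term: str) -> str:
    """Exact-match lookup; a production server might use fuzzy or embedding search."""
    return BUSINESS_TERMS.get(term.lower(), f"No definition found for '{term}'")

print(get_business_terms("Earnings"))
```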
Documentation
Scenarios & Use Cases
Scenario 1: Sales Analysis
Manager: "What's the average earnings from Q1?"
MCP-RAG System:
1. Analytics Server: calculate_mean(column='earnings', filter_column='quarter', filter_value='Q1-2024')
→ Mean of earnings: 101666.67
2. RAG Server: get_business_terms(term='earnings')
→ Earnings: Total revenue generated by a department or company in a given period
3. Response: "Average earnings for Q1-2024: $101,667"
Scenario 2: Performance Correlation
Manager: "What's the correlation between sales and expenses?"
MCP-RAG System:
1. Analytics Server: calculate_correlation(column1='sales', column2='expenses')
→ Correlation between sales and expenses: 0.923
2. Response: "Correlation: 0.923 (strong positive relationship)"
Scenario 3: Predictive Modeling
Manager: "Build a model to predict earnings from sales and employees"
MCP-RAG System:
1. Analytics Server: linear_regression(target_column='earnings', feature_columns=['sales', 'employees'])
→ Linear Regression Results:
Target: earnings
Features: ['sales', 'employees']
Intercept: 15000.00
sales coefficient: 0.45
employees coefficient: 1250.00
R-squared: 0.987
2. Response: "Model created with R² = 0.987"
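The regression step needs nothing beyond ordinary least squares. A sketch of how a linear_regression tool could be implemented with NumPy (the function signature here is illustrative, not the project's exact API):

```python
import numpy as np

def linear_regression(X, y):
    """Ordinary least squares: returns (intercept, coefficients, R-squared)."""
    X, y = np.asarray(X, dtype=float), np.asarray(y, dtype=float)
    A = np.column_stack([np.ones(len(X)), X])   # prepend intercept column
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    residuals = y - A @ beta
    r2 = 1 - residuals @ residuals / np.sum((y - y.mean()) ** 2)
    return beta[0], beta[1:], r2

# Perfectly linear toy data: y = 2 + 3*x
intercept, coefs, r2 = linear_regression([[1], [2], [3], [4]], [5, 8, 11, 14])
print(round(intercept, 2), round(coefs[0], 2), round(r2, 3))  # 2.0 3.0 1.0
```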
Scenario 4: Business Knowledge
Manager: "What does profit margin mean?"
MCP-RAG System:
1. RAG Server: get_business_terms(term='profit margin')
→ Profit Margin: Percentage of revenue that remains as profit after expenses, calculated as (earnings - expenses) / earnings
2. Response: "Profit Margin: Percentage of revenue that remains as profit after expenses"
Scenario 5: Policy Information
Manager: "What are the budget allocation policies?"
MCP-RAG System:
1. RAG Server: get_company_policies(policy_type='budget')
→ Budget Allocation: Marketing gets 25% of total budget, Engineering gets 30%, Sales gets 45%
2. Response: "Budget Allocation: Marketing gets 25%, Engineering gets 30%, Sales gets 45%"
Customization Guide
For Your Organization
Step 1: Replace Sample Data
- Update Business Data: Replace data/sample_business_data.csv with your actual data. Ensure columns are numeric for analysis tools, add any categorical columns for filtering, and include time-based columns for trend analysis.
- Update Knowledge Base: Replace data/business_knowledge.txt with your organization's business terms and definitions, company policies and procedures, and analysis guidelines and best practices.
Step 2: Add Custom Analytics Tools
File to modify: src/servers/business_analytics_server.py
- Add New Tools: In the handle_list_tools() function (around line 29), add new tools to the tools list.
- Implement Tool Logic: In the handle_call_tool() function (around line 140), add the corresponding handler.
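For example, adding a hypothetical calculate_median tool needs a schema entry and a handler. The sketch below shows the schema as a plain dict for readability (the server itself registers it via mcp.types.Tool with the same fields); the tool name and handler are illustrative, not part of the project:

```python
import statistics

# Schema to append to the tools list in handle_list_tools()
median_tool = {
    "name": "calculate_median",
    "description": "Median of a numeric column",
    "inputSchema": {
        "type": "object",
        "properties": {"column": {"type": "string"}},
        "required": ["column"],
    },
}

# Logic dispatched from handle_call_tool() when name == "calculate_median"
def handle_median(data, arguments):
    return f"Median: {statistics.median(data[arguments['column']])}"

print(handle_median({"sales": [10, 20, 30]}, {"column": "sales"}))  # Median: 20
```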
Step 3: Extend RAG Capabilities
File to modify: src/servers/rag_server.py
- Add New Knowledge Sources: Modify the load_business_knowledge() function (around line 25) to include database connections, document processing, and API integrations.
- Add New RAG Tools: In the handle_list_tools() function (around line 50), add new tools.
- Implement RAG Tool Logic: In the handle_call_tool() function (around line 90), add the handler.
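Before wiring in databases or APIs, load_business_knowledge can remain a simple file parser. A sketch assuming one "Term: definition" entry per line (the exact format of data/business_knowledge.txt is an assumption here):

```python
def load_business_knowledge(path: str) -> dict:
    """Parse 'Term: definition' lines into a lowercase-keyed lookup dict."""
    knowledge = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            if ":" in line:
                term, definition = line.split(":", 1)
                knowledge[term.strip().lower()] = definition.strip()
    return knowledge
```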
Step 4: Integrate LLM Backend
File to create: src/servers/llm_server.py (new file)
- Using the Existing LLM Client: The FlexibleRAGAgent in src/core/gemini_rag_agent.py already supports the Google Gemini API and a custom localhost API (OpenAI-compatible format).
- Create a Custom LLM Server (optional): If you need a dedicated MCP server for LLM operations, create a new server as shown in the example code.
- Add to requirements.txt:
openai>=1.0.0
google-genai>=0.3.0
httpx>=0.24.0
Step 5: Add New Data Sources
Files to modify: src/servers/business_analytics_server.py and src/servers/rag_server.py
- Database Connectors: Add tools to connect to various databases such as PostgreSQL, MySQL, SQLite, MongoDB, Redis, and data warehouses.
- API Integrations: Connect to business systems like CRM systems, marketing platforms, and financial systems.
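A database-backed tool can start as small as a read-only query helper. A sketch using the standard-library sqlite3 module (the function name is illustrative; other engines would swap in their own drivers):

```python
import sqlite3

def query_database(db_path: str, sql: str) -> list:
    """Run a read-only query and return all rows."""
    with sqlite3.connect(db_path) as conn:
        return conn.execute(sql).fetchall()

print(query_database(":memory:", "SELECT 1 + 1"))  # [(2,)]
```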
Current Tool Implementations
- Business Analytics Tools (src/servers/business_analytics_server.py): calculate_mean, calculate_std, calculate_correlation, linear_regression, get_data_info.
- RAG Tools (src/servers/rag_server.py): get_business_terms, get_company_policies, search_business_knowledge.
- LLM Integration (src/core/llm_client.py): FlexibleRAGAgent, LLMClient, tool calling, and conversation management.
Modular Architecture Benefits
- Swap Components: Replace any server without affecting others.
- Add Capabilities: Plug in new tools without rewriting existing code.
- Scale Independently: Run different servers on different machines.
- Customize Per Use Case: Use only the tools you need.
Example Extensions
Adding Sentiment Analysis
File to create: src/servers/sentiment_analysis_server.py
# sentiment_analysis_server.py (sketch)
@server.call_tool()
async def handle_call_tool(name: str, arguments: dict) -> list[TextContent]:
    if name == "analyze_sentiment":
        # Integrate with a sentiment analysis API on arguments["text"]
        # and return sentiment scores and insights as TextContent
        ...
Adding Forecasting
File to modify: src/servers/business_analytics_server.py
# Add to handle_list_tools() function
Tool(
name="time_series_forecast",
description="Forecast future values using time series analysis",
inputSchema={
"type": "object",
"properties": {
"column": {"type": "string"},
"periods": {"type": "integer"}
}
}
)
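The schema above only declares the tool; the handler still needs forecasting logic. A deliberately naive linear-trend sketch (a real implementation would use a proper time-series model such as ARIMA or exponential smoothing):

```python
def time_series_forecast(values, periods):
    """Extend the series by its average step per period (naive linear trend)."""
    step = (values[-1] - values[0]) / (len(values) - 1)
    return [values[-1] + step * i for i in range(1, periods + 1)]

print(time_series_forecast([100, 110, 120], 2))  # [130.0, 140.0]
```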
Adding Document Processing
File to create: src/servers/document_processor_server.py
# document_processor_server.py (sketch)
@server.call_tool()
async def handle_call_tool(name: str, arguments: dict) -> list[TextContent]:
    if name == "process_document":
        # Extract text from PDFs, Word docs, etc. at arguments["file_path"]
        # and add the extracted text to the knowledge base
        ...
Architecture
Project Structure
MCP-RAG/
├── data/
│   ├── sample_business_data.csv          # Business dataset for analysis
│   └── business_knowledge.txt            # RAG knowledge base
├── src/
│   └── servers/
│       ├── business_analytics_server.py  # Statistical analysis tools
│       └── rag_server.py                 # Knowledge retrieval tools
├── main.py                               # Demo and orchestration script
├── test_scenarios.py                     # Conversation scenarios
├── requirements.txt                      # Dependencies
└── README.md                             # This file
Key Components
- Business Analytics Server: An MCP server providing statistical analysis tools.
- RAG Server: An MCP server for business knowledge retrieval.
- Orchestration Layer: Coordinates between servers and LLM (future).
- Data Layer: Contains sample business data and knowledge base.
Configuration
Environment Variables
| Variable | Description | Default |
|---|---|---|
| LLM_MODE | LLM backend mode: "gemini" or "custom" | gemini |
| GEMINI_API_KEY | Gemini API key for LLM integration | None |
| GEMINI_MODEL | Gemini model name | gemini-2.0-flash-exp |
| CUSTOM_API_URL | Custom localhost API URL | http://localhost:8000 |
| CUSTOM_API_KEY | Custom API key (optional) | None |
Sample Data
The system includes quarterly business data (sales, marketing, engineering metrics across 4 quarters) and a business knowledge base (terms, policies, and analysis guidelines).
Use Cases
For Business Leaders
- No-Code Analytics: Ask natural language questions about business data.
- Quick Insights: Get statistical analysis without technical expertise.
- Context-Aware Reports: Combine data analysis with business knowledge.
For Data Teams
- Modular Architecture: Easy to add new analysis tools.
- LLM Integration: Ready for natural language query processing.
- Extensible Framework: Build custom agents for specific needs.
For AI Engineers
- MCP Protocol: Learn modern AI orchestration patterns.
- RAG Implementation: Understand knowledge retrieval systems.
- Agentic Design: Build multi-agent AI workflows.
Technical Details
Future Enhancements
Planned Features
- [ ] LLM Integration: Connect with Gemini, OpenAI, or local models.
- [ ] Natural Language Queries: Process complex business questions.
- [ ] Advanced Analytics: Time series analysis, clustering, forecasting.
- [ ] Web Interface: User-friendly dashboard for non-technical users.
- [ ] Real-time Data: Connect to live data sources.
- [ ] Custom Knowledge Bases: Upload company-specific documents.
Integration Possibilities
- Local LLM API: Serve open-source models through a local, OpenAI-compatible endpoint.
- Database Connectors: Connect to SQL databases, data warehouses.
- API Integrations: Salesforce, HubSpot, Google Analytics.
- Document Processing: PDF, DOCX, email analysis.
Contributing
This is a foundation for building agentic AI systems. Contributions are welcome, including adding new analysis tools, expanding the knowledge base, integrating different LLM models, and improving documentation.
License
This project is under the MIT License. You are free to use and modify it for your own projects!
Related Projects
- Local LLM API: Run open-source LLMs locally.
- MCP Protocol: Official documentation
Ready to build your own agentic AI system? Start with this foundation and extend it for your specific needs. The modular design makes it easy to add new capabilities while maintaining clean architecture.
#AgenticAI #MCP #RAG #BusinessAnalytics #OpenSourceAI







