MCP Py Prompt Cleaner

This is a prompt optimization server based on the Model Context Protocol (MCP). It uses AI to clean and enhance raw prompts, making them clearer, more actionable, and more effective. It supports both local and cloud LLMs and provides quality scoring and retry strategies.

What is MCP Prompt Cleaner?

MCP Prompt Cleaner is an AI-based prompt optimization tool designed to clean and enhance the raw prompts you provide. It automatically identifies vague, incomplete, or unclear prompts and converts them into clear, specific, executable instructions, improving the response quality of AI assistants.

How to use MCP Prompt Cleaner?

Using it is simple: send your original prompt to the tool and it will automatically analyze it and return an optimized version. You can also provide additional context or specify an optimization mode (general or code) to get more accurate results.
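As a rough illustration, a call to the tool might pass arguments like the following. The parameter names (prompt, context, mode) are inferred from the description above and may differ in the actual implementation:

# Hypothetical arguments for the clean_prompt tool; names are inferred, not confirmed.
arguments = {
    "prompt": "make my script faster",          # the raw prompt to optimize
    "context": "a pandas-based ETL pipeline",   # optional extra context
    "mode": "code",                             # "general" or "code"
}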

Applicable Scenarios

It is useful in any scenario where you interact with an AI assistant, especially when your prompt is not clear enough, you need more specific guidance, or you want a higher-quality response. It is particularly suited to programming tasks, content creation, research analysis, and other fields that require precise instructions.

Main Features

AI Intelligent Enhancement
Uses large language models to automatically improve the clarity and specificity of prompts, identify vague wording, and add necessary details
Context-Aware Processing
Supports supplying additional context so the optimization better matches your specific needs
Mode-Specific Optimization
Offers two optimization modes, 'general' and 'code', applying different optimization strategies to different scenarios
Quality Assessment
Returns a quality score and detailed feedback for the optimized prompt so you can gauge the improvement
Intelligent Retry Mechanism
Dual retry strategy: network-layer retries handle connection issues, while content-layer retries ensure the quality of the AI output (see the sketch after this list)
Flexible Configuration
Supports local LLMs (such as LMStudio) and cloud LLMs (such as OpenAI) with simple configuration
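
The following is a minimal sketch, in Python, of how such a dual retry strategy could be structured. It is not the project's actual implementation: the default endpoint, the quality check, and the payload format are illustrative assumptions (LLM_API_ENDPOINT and LLM_MODEL mirror the environment variables shown in the Installation section).

import os
import time

import requests

# Assumed defaults; the env var names mirror those in the Installation section below.
LLM_API_ENDPOINT = os.environ.get("LLM_API_ENDPOINT", "http://localhost:1234/v1/chat/completions")
LLM_MODEL = os.environ.get("LLM_MODEL", "gpt-4")

def call_llm(payload: dict, max_network_retries: int = 3) -> dict:
    """Network-layer retry: re-send the request on connection or HTTP errors, with backoff."""
    for attempt in range(max_network_retries):
        try:
            resp = requests.post(LLM_API_ENDPOINT, json=payload, timeout=30)
            resp.raise_for_status()
            return resp.json()
        except requests.RequestException:
            if attempt == max_network_retries - 1:
                raise
            time.sleep(2 ** attempt)  # simple exponential backoff: 1s, 2s, 4s, ...

def looks_good(cleaned: str) -> bool:
    """Placeholder quality check; the real server presumably scores the output with the LLM itself."""
    return len(cleaned.split()) >= 10

def clean_prompt_with_retry(prompt: str, max_content_retries: int = 2) -> str:
    """Content-layer retry: regenerate when the cleaned prompt fails the quality check."""
    cleaned = prompt
    for _ in range(max_content_retries + 1):
        payload = {
            "model": LLM_MODEL,
            "messages": [{"role": "user", "content": f"Rewrite this prompt to be clearer and more specific:\n{prompt}"}],
        }
        data = call_llm(payload)
        cleaned = data["choices"][0]["message"]["content"]
        if looks_good(cleaned):
            break
    return cleaned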
Advantages
Significantly improve the response quality of AI assistants and reduce the number of back-and-forth clarifications
Support local and cloud LLMs to protect privacy while maintaining flexibility
Provide detailed quality feedback to help users learn how to write better prompts
Simple configuration; no complex setup is needed to get started
Open-source project, customizable and extensible
Limitations
Depends on external LLM services and requires an Internet connection (except for local LLMs)
Highly specialized domain terminology may require additional context
Optimization adds latency, so it is not suited to scenarios with strict real-time requirements
The free version may have usage limitations, and advanced features may require payment

How to Use

Installation and Configuration
Choose the local LLM or cloud LLM configuration according to your needs. Local LLMs do not require an API key, while cloud LLMs require the corresponding API key.
Start the Server
Run the main program to start the MCP server, which will automatically handle connections and protocol communication.
Configure the Client
Add the server configuration in Claude Desktop or other MCP clients.
Use the Tool
Call the clean_prompt tool through the client, passing your original prompt and optional parameters (see the example below).
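
A minimal sketch of step 4 using the official MCP Python SDK over stdio. The clean_prompt tool name comes from the description above; the argument names are assumptions and may differ in the actual server.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch the server the same way the client configuration below does.
    server = StdioServerParameters(command="python", args=["main.py"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Argument names are assumed from the tool description.
            result = await session.call_tool(
                "clean_prompt",
                arguments={"prompt": "make my script faster", "mode": "code"},
            )
            print(result.content)

asyncio.run(main())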

Usage Examples

Programming Task Optimization
Convert vague programming requests into specific and executable code tasks
Content Creation Guidance
Expand simple creative ideas into a complete creative outline
Data Analysis Request
Convert vague data analysis requirements into clear analysis steps
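
Purely as an illustration of the kind of transformation described above (the actual rewritten prompt depends on the configured LLM):

# Illustrative only: the real output depends on the configured LLM.
raw_prompt = "fix my code"
cleaned_prompt = (
    "Review the following Python function for bugs, list each issue you find with a short "
    "explanation, and return a corrected version with inline comments."
)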

Frequently Asked Questions

Do I need to pay to use it?
Which AI models are supported?
How long does the optimization process take?
How to ensure privacy and security?
Can I customize the optimization rules?

Related Resources

GitHub Repository
Project source code and latest version
MCP Protocol Documentation
Official specification of the Model Context Protocol
LMStudio Official Website
Local LLM runtime environment
Prompt Engineering Guide
Official guide on how to write effective prompts

Installation

Copy the appropriate configuration into your MCP client. The first block runs the server with a local LLM (no API key required); the second targets a cloud LLM and passes the API key, endpoint, and model via environment variables.
{
  "mcpServers": {
    "mcp-prompt-cleaner": {
      "command": "python",
      "args": ["main.py"]
    }
  }
}

{
  "mcpServers": {
    "mcp-prompt-cleaner": {
      "command": "python",
      "args": ["main.py"],
      "env": {
        "LLM_API_KEY": "your-api-key-here",
        "LLM_API_ENDPOINT": "https://api.openai.com/v1/chat/completions",
        "LLM_MODEL": "gpt-4"
      }
    }
  }
}
Note: your API key is sensitive information; do not share it with anyone.

Alternatives

Acemcp
Acemcp is an MCP server for codebase indexing and semantic search, supporting automatic incremental indexing, multi-encoding file processing, .gitignore integration, and a Web management interface, helping developers quickly search for and understand code context.
Python
8.4K
5 points
Blueprint MCP
Blueprint MCP is a chart generation tool based on the Arcade ecosystem. Using technologies such as Nano Banana Pro, it automatically generates visual diagrams, including architecture diagrams and flowcharts, by analyzing codebases and system architectures, helping developers understand complex systems.
Python
7.5K
4 points
MCP Agent Mail
MCP Agent Mail is a mail-based coordination layer designed for AI programming agents, providing identity management, message sending and receiving, file reservation, and search functions, supporting asynchronous collaboration and conflict avoidance among multiple agents.
Python
8.4K
5 points
Klavis
Klavis AI is an open-source project that provides a simple, easy-to-use MCP (Model Context Protocol) service on Slack, Discord, and the Web. It includes functions such as report generation, YouTube tools, and document conversion, enabling both non-technical users and developers to use AI workflows.
TypeScript
12.5K
5 points
MCP
The official Microsoft MCP server provides AI assistants with search and access to the latest Microsoft technical documentation
11.7K
5 points
Aderyn
Aderyn is an open-source Solidity smart contract static analysis tool written in Rust, which helps developers and security researchers discover vulnerabilities in Solidity code. It supports Foundry and Hardhat projects, can generate reports in multiple formats, and provides a VSCode extension.
Rust
10.6K
5 points
Devtools Debugger MCP
The Node.js Debugger MCP server provides complete debugging capabilities based on the Chrome DevTools protocol, including breakpoint setting, stepping execution, variable inspection, and expression evaluation.
TypeScript
9.9K
4 points
Scrapling
Scrapling is an adaptive web scraping library that can automatically learn website changes and re-locate elements. It supports multiple scraping methods and AI integration, providing high-performance parsing and a developer-friendly experience.
Python
10.6K
5 points
Notion Api MCP
Certified
A Python-based MCP Server that provides advanced to-do list management and content organization functions through the Notion API, enabling seamless integration between AI models and Notion.
Python
17.5K
4.5 points
Markdownify MCP
Markdownify is a multi-functional file conversion service that supports converting multiple formats such as PDFs, images, audio, and web page content into Markdown format.
TypeScript
28.1K
5 points
Gitlab MCP Server
Certified
The GitLab MCP server is a project based on the Model Context Protocol that provides a comprehensive toolset for interacting with GitLab accounts, including code review, merge request management, CI/CD configuration, and other functions.
TypeScript
18.2K
4.3 points
Duckduckgo MCP Server
Certified
The DuckDuckGo Search MCP Server provides web search and content scraping services for LLMs such as Claude.
Python
53.1K
4.3 points
Figma Context MCP
Framelink Figma MCP Server is a server that provides access to Figma design data for AI programming tools (such as Cursor). By simplifying the Figma API response, it helps AI more accurately achieve one-click conversion from design to code.
TypeScript
50.4K
4.5 points
Unity
Certified
UnityMCP is a Unity editor plugin that implements the Model Context Protocol (MCP), providing seamless integration between Unity and AI assistants, including real-time state monitoring, remote command execution, and log functions.
C#
23.8K
5 points
Gmail MCP Server
An auto-authenticating Gmail MCP server designed for Claude Desktop, supporting Gmail management through natural-language interaction, including sending emails, label management, batch operations, and more.
TypeScript
18.1K
4.5 points
Context7
Context7 MCP is a service that provides real-time, version-specific documentation and code examples for AI programming assistants. It is directly integrated into prompts through the Model Context Protocol to solve the problem of LLMs using outdated information.
TypeScript
74.5K
4.7 points