Xcatcher MCP Manifest

Xcatcher is a high-performance X (Twitter) data scraping service based on the remote MCP protocol. It supports pay-as-you-go billing in USDC on the Base and Solana chains via the x402 protocol, provides an OpenAPI specification for easy AI-agent integration, and focuses on rapidly collecting the latest posts from large numbers of users.

What is Xcatcher MCP?

Xcatcher is a remote MCP server designed for AI agents. Its core function is to efficiently scrape, at large scale, the latest posts of specified users on X (Twitter). It adopts an 'agent-first' architecture and exposes tool-call interfaces over the Streamable HTTP transport, so AI agents can easily create scraping tasks, monitor their status, and download results.
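
As a rough illustration of what that looks like on the wire, the sketch below sends a raw JSON-RPC `tools/list` request over Streamable HTTP. The endpoint URL is an assumption (take the real one from the official documentation), and a production agent would normally use an MCP SDK, which also handles the `initialize` handshake and session header omitted here.

```python
# Minimal sketch of a raw MCP request over Streamable HTTP.
# Assumptions: the endpoint URL below is hypothetical, and a real session
# would first send an "initialize" request and echo back the returned
# Mcp-Session-Id header; both are omitted here for brevity.
import requests

MCP_URL = "https://api.xcatcher.example/mcp"  # hypothetical endpoint
API_KEY = "xc_live_xxx"                       # your Xcatcher key

headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
    # Streamable HTTP servers may reply with plain JSON or an SSE stream.
    "Accept": "application/json, text/event-stream",
}

# JSON-RPC 2.0 request asking which tools the server exposes.
payload = {"jsonrpc": "2.0", "id": 1, "method": "tools/list", "params": {}}

resp = requests.post(MCP_URL, json=payload, headers=headers, timeout=30)
print(resp.status_code)
print(resp.text)  # expect create_crawl_task, get_task_status, ... in the result
```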

How to use Xcatcher MCP?

Using Xcatcher involves three main steps: 1) register or log in via the API to obtain a key; 2) use the key to create a scraping task (points are required); 3) poll the task status and download the result file (XLSX format) once the task completes. If your points are insufficient, the system guides you through a USDC top-up via the x402 protocol.

Use cases

Xcatcher is well suited to scenarios that require real-time or near-real-time monitoring of a large number of X accounts, such as brand sentiment monitoring, competitor analysis, KOL activity tracking, market trend discovery, news event follow-up, and supplying AI agents with up-to-date social media data.

Main features

Large-scale latest post scraping
Optimized for fetching the latest posts from a large number of X accounts. Two modes are supported: 'normal' (a quick snapshot of the latest posts, recommended for monitoring) and 'deep' (a more thorough collection of a user's historical posts).
x402 pay-as-you-go
An innovative payment protocol that supports small, per-use top-ups in USDC on the Base or Solana chain. When points are insufficient, the system returns a payment quote; after paying, you can retry the task immediately.
Remote MCP (Streamable HTTP)
A remote server fully compatible with the Model Context Protocol. AI agents can call tools through the standard JSON-RPC over HTTP protocol to create tasks, query status, and download results.
OpenAPI specification and Google ADK integration
Provides a complete OpenAPI 3.0 specification for easy developer integration, along with end-to-end Google ADK (Agent Development Kit) examples to lower the barrier to entry for AI agent development.
Agent-friendly toolset
Provides a streamlined and focused core toolset: `create_crawl_task` (create a task), `get_task_status` (query status), `get_result_download_url` (get the download link), `x402_topup` (top up), and `cancel_task` (cancel a task).
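
All of these tools share the same JSON-RPC `tools/call` envelope, so a small helper like the sketch below is enough for an agent to drive the whole workflow (it is reused in the step-by-step sketches further down). The endpoint URL and the `task_id` argument name are assumptions to check against the OpenAPI specification, and a plain JSON (non-SSE) response is assumed for simplicity.

```python
# Illustrative helper wrapping the MCP "tools/call" envelope.
# Assumptions: the endpoint URL is hypothetical, argument names must be
# checked against the OpenAPI spec, and a plain JSON (non-SSE) response
# is assumed for simplicity.
import requests

MCP_URL = "https://api.xcatcher.example/mcp"   # hypothetical endpoint
HEADERS = {
    "Authorization": "Bearer xc_live_xxx",     # your Xcatcher key
    "Content-Type": "application/json",
    "Accept": "application/json, text/event-stream",
}

def call_tool(name: str, arguments: dict, request_id: int = 1) -> dict:
    """Send a JSON-RPC 'tools/call' request and return the parsed response."""
    payload = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }
    resp = requests.post(MCP_URL, json=payload, headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json()

# Example: cancel a task that is no longer needed ("task_id" is assumed).
print(call_tool("cancel_task", {"task_id": "task_123"}))
```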
Advantages
Pay-as-you-go with transparent costs: Using the x402 protocol, you pay USDC only for the points you actually consume, with no large pre-deposits or subscriptions.
Designed for agents: The tool interfaces are simple and the error codes are explicit (e.g., 402 payment required, 409 result not ready), making it easy for AI agents to branch on outcomes and control the workflow.
High throughput and low latency: Optimized for the 'latest post' scraping scenario, the 'normal' mode can quickly process a large number of accounts.
Multi-chain payment support: Supports USDC payments on both the Base and Solana chains, letting users choose based on preference and transaction costs.
Easy to integrate: Provides OpenAPI specifications, cURL examples, and Google ADK examples to reduce the integration difficulty.
Limitations
Focused on the latest content: Designed mainly for fetching the latest posts; in-depth historical scraping ('deep' mode) may be slower or more costly.
Requires on-chain payment: Although payment is flexible, you need a Base or Solana wallet holding USDC, which can be a hurdle for users unfamiliar with cryptocurrencies.
Fixed result file format: The default output is XLSX; if you need CSV, you must convert it yourself.
Authentication dependency: All API calls (including result downloads) require a valid Bearer token, so keys must be managed carefully.

How to use

Get an API key
First, register an account and obtain an API key (in the format `xc_live_xxx`) for authentication. All subsequent requests must carry this key as a Bearer token in the HTTP Authorization header.
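
A minimal sketch of attaching the key to every request; the environment-variable name is only an illustration, not an official convention.

```python
# Keep the key out of source code and attach it to every request.
# XCATCHER_API_KEY is an illustrative variable name, not an official one.
import os
import requests

API_KEY = os.environ["XCATCHER_API_KEY"]   # e.g. "xc_live_xxx"

session = requests.Session()
session.headers.update({"Authorization": f"Bearer {API_KEY}"})
# Every call made through `session` now carries the Bearer token.
```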
Check the point balance
Creating a scraping task requires consuming points. Before creating a task, it is recommended to query the remaining points of the current account first.
Create a scraping task
Specify the list of X usernames to monitor and the scraping mode (`normal` is recommended), and provide a unique `idempotency_key` to prevent duplicate creation. If your points are sufficient, the task enters the queue.
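
A sketch of this step, reusing the `call_tool` helper from the feature section above. `mode` and `idempotency_key` are named in the documentation text, while `usernames` is an assumed argument name to verify against the OpenAPI specification.

```python
import uuid

# Create a scraping task for a few accounts in "normal" mode.
# "usernames" is an assumed argument name; verify it in the OpenAPI spec.
idempotency_key = str(uuid.uuid4())   # keep this value in case you must retry

task = call_tool("create_crawl_task", {
    "usernames": ["nasa", "github", "verge"],
    "mode": "normal",                  # or "deep" for more historical data
    "idempotency_key": idempotency_key,
})
print(task)   # expect a task identifier if points were sufficient
```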
Handle insufficient points (if necessary)
If points are insufficient, the API returns a 402 status code with a payment quote. After paying the quoted amount in USDC, call the top-up interface with the payment proof (transaction hash or signature), then retry task creation with the same `idempotency_key`.
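
A sketch of that branch, again reusing `call_tool` and the `idempotency_key` from the previous step. How the 402 quote is surfaced and the `payment_proof` argument name are assumptions; the actual USDC payment on Base or Solana happens outside this code.

```python
# Sketch of the insufficient-points branch. How the 402 quote appears and
# the x402_topup argument names are assumptions; paying the quote in USDC
# happens outside this script.
import requests

try:
    task = call_tool("create_crawl_task", {
        "usernames": ["nasa"],
        "mode": "normal",
        "idempotency_key": idempotency_key,
    })
except requests.HTTPError as err:
    if err.response is not None and err.response.status_code == 402:
        quote = err.response.json()   # payment quote: amount, chain, address, ...
        print("Pay this quote in USDC on Base or Solana:", quote)
        tx_hash = "0x..."             # transaction hash or signature after paying
        call_tool("x402_topup", {"payment_proof": tx_hash})
        # Retry with the SAME idempotency_key so no duplicate task is created.
        task = call_tool("create_crawl_task", {
            "usernames": ["nasa"],
            "mode": "normal",
            "idempotency_key": idempotency_key,
        })
    else:
        raise
```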
Poll the task status and download the result
After creating the task, query its status periodically. When the status changes to `completed`, fetch the result file (XLSX format) via the download link.
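
A polling sketch, reusing `call_tool`. The response field names (`status`, `url`), the polling interval, and the overall response shape are assumptions.

```python
# Poll until the task completes, then download the XLSX result.
# Field names ("status", "url") and the response shape are assumptions.
import time
import requests

task_id = "task_123"   # taken from the create_crawl_task response

while True:
    status = call_tool("get_task_status", {"task_id": task_id})
    if status.get("result", {}).get("status") == "completed":
        break
    time.sleep(30)      # a 409 from the download tool means "not ready yet"

link = call_tool("get_result_download_url", {"task_id": task_id})
download_url = link["result"]["url"]   # assumed response shape

resp = requests.get(download_url, timeout=60)
resp.raise_for_status()
with open("xcatcher_result.xlsx", "wb") as f:
    f.write(resp.content)
```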

Usage examples

Case 1: AI agent automatically monitors competitors
An AI agent is scheduled to scrape the latest posts from 10 major competitor X accounts at 10 a.m. every day and generate a daily competitive-intelligence briefing.
Case 2: Event-driven public opinion alert
When a news event breaks, a user quickly sets up real-time monitoring of 50 relevant journalists, official accounts, and KOLs through an AI agent to track how public opinion evolves.
Case 3: Integration into an automated workflow
A marketing team integrates Xcatcher into their automated platform (such as Zapier, n8n, or a custom script) to automatically collect the content of industry leaders every week for content creation inspiration.

Frequently Asked Questions

What is the x402 protocol? Do I have to pay with cryptocurrencies?
What's the difference between the 'normal' mode and the 'deep' mode? Which one should I choose?
After I received a 402 error, paid USDC, and successfully topped up, what should I do next?
Is the result file a public link? How can I ensure the security of my data?
How many accounts can be scraped at one time? Is there a limit?
Besides using the API, how else can I use Xcatcher?

Related resources

Official documentation
The official detailed documentation of Xcatcher, including API references, concept explanations, and update logs.
OpenAPI specification file
A complete OpenAPI 3.0.3 specification YAML file that can be directly imported into Postman, Swagger UI, or used to generate client SDKs.
Glama MCP server list
The Xcatcher MCP server list page on the Glama platform, including installation guides and community information.
Smithery search list
Xcatcher's listing on the Smithery MCP server marketplace.
GitHub repository (Manifest)
A code repository containing this README and the OpenAPI specification.

Installation

Copy the configuration command provided by Xcatcher into your MCP client.
Note: your key is sensitive information; do not share it with anyone.
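
The exact configuration string comes from the official documentation or your account dashboard. Purely as an illustration of the general shape many MCP clients expect for a remote Streamable HTTP server (field names vary by client, and the URL here is hypothetical):

```python
# Illustrative only: the general shape of a remote MCP server entry accepted
# by many clients. Field names vary by client, the URL is hypothetical, and
# the authoritative configuration command is in the official Xcatcher docs.
import json

config_entry = {
    "mcpServers": {
        "xcatcher": {
            "type": "streamable-http",                  # transport hint; some clients infer it
            "url": "https://api.xcatcher.example/mcp",  # hypothetical endpoint
            "headers": {"Authorization": "Bearer xc_live_xxx"},
        }
    }
}
print(json.dumps(config_entry, indent=2))
```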

Alternatives

Praisonai
PraisonAI is a production-ready multi-AI-agent framework with self-reflection capabilities, designed to create AI agents that automate solutions to everything from simple tasks to complex challenges. It simplifies building and managing multi-agent LLM systems by integrating PraisonAI agents, AG2, and CrewAI into a low-code solution that emphasizes simplicity, customization, and effective human-machine collaboration.
Python
4.3K
5 points
Maverick MCP
MaverickMCP is a personal stock-analysis server built on FastMCP 2.0 that provides professional-grade financial data analysis, technical indicator calculation, and portfolio optimization tools to MCP clients such as Claude Desktop. It comes preloaded with data for 520 S&P 500 stocks, supports multiple technical-analysis strategies and parallel processing, and runs locally without complex authentication.
Python
9.0K
4 points
Klavis
Klavis AI is an open-source project that provides a simple, easy-to-use MCP (Model Context Protocol) service on Slack, Discord, and the web. It includes features such as report generation, YouTube tools, and document conversion, enabling both non-technical users and developers to use AI workflows.
TypeScript
14.6K
5 points
Scrapling
Scrapling is an adaptive web scraping library that automatically learns website changes and re-locates elements. It supports multiple scraping methods and AI integration, providing high-performance parsing and a developer-friendly experience.
Python
12.8K
5 points
Apple Health MCP
An MCP server for querying Apple Health data via SQL, implemented based on DuckDB for efficient analysis, supporting natural language queries and automatic report generation.
TypeScript
11.0K
4.5 points
MCP Server Airbnb
Certified
An MCP service for searching Airbnb listings and querying listing details.
TypeScript
15.8K
4 points
Firecrawl MCP Server
The Firecrawl MCP Server is a Model Context Protocol server integrating Firecrawl's web-scraping capabilities, providing rich web-scraping, searching, and content-extraction functions.
TypeScript
91.5K
5 points
Rednote MCP
RedNote MCP is a tool that provides services for accessing Xiaohongshu content. It supports functions such as authentication management, keyword-based note search, and command-line initialization, and can access note content via URL.
TypeScript
13.0K
4.5 points
Notion Api MCP
Certified
A Python-based MCP Server that provides advanced to-do list management and content organization functions through the Notion API, enabling seamless integration between AI models and Notion.
Python
17.6K
4.5 points
Duckduckgo MCP Server
Certified
The DuckDuckGo Search MCP Server provides web search and content scraping services for LLMs such as Claude.
Python
57.9K
4.3 points
Markdownify MCP
Markdownify is a multi-functional file conversion service that supports converting multiple formats such as PDFs, images, audio, and web page content into Markdown format.
TypeScript
29.7K
5 points
Gitlab MCP Server
Certified
The GitLab MCP server is a project based on the Model Context Protocol that provides a comprehensive toolset for interacting with GitLab accounts, including code review, merge request management, CI/CD configuration, and other functions.
TypeScript
19.2K
4.3 points
Unity
Certified
UnityMCP is a Unity editor plugin that implements the Model Context Protocol (MCP), providing seamless integration between Unity and AI assistants, including real-time state monitoring, remote command execution, and logging.
C#
25.0K
5 points
Figma Context MCP
Framelink Figma MCP Server is a server that provides AI programming tools (such as Cursor) with access to Figma design data. By simplifying Figma API responses, it helps AI achieve more accurate one-click conversion from design to code.
TypeScript
54.7K
4.5 points
Gmail MCP Server
A Gmail MCP server with automatic authentication, designed for Claude Desktop. It supports managing Gmail through natural-language interaction, with complete functionality including sending emails, label management, and batch operations.
TypeScript
18.5K
4.5 points
Minimax MCP Server
The MiniMax Model Context Protocol (MCP) server is an official server that provides access to powerful text-to-speech and video/image generation APIs, and works with client tools such as Claude Desktop and Cursor.
Python
39.5K
4.8 points