SEO Crawler MCP
An SEO crawler MCP server for website crawling and SEO analysis. It stores and analyzes website data in a local SQLite database, providing comprehensive technical SEO checks and security audits.
Rating: 2.5 points
Downloads: 0
What is SEO Crawler MCP?
SEO Crawler MCP is a Model Context Protocol server that enables AI assistants (such as Claude) to crawl websites and analyze their technical SEO health. Unlike traditional external services, this tool runs entirely locally, stores all crawled data in an embedded SQLite database, and detects content issues, technical SEO errors, security vulnerabilities, and optimization opportunities through more than 25 professional queries.
How to use SEO Crawler MCP?
You can configure this MCP server in Claude Desktop and then crawl websites and analyze the results with simple natural language commands. For example, tell Claude to 'Crawl https://example.com and analyze SEO issues', and Claude will call the corresponding tools to complete the entire process. You can also run the crawler directly in the terminal and then hand the results to Claude for analysis.
Applicable scenarios
This tool is especially suitable for website administrators, SEO specialists, content creators, and developers. Whether you want to check a site's health regularly, fix technical SEO issues, conduct security audits, or optimize website performance, SEO Crawler MCP can provide professional analysis reports.
Main features
Intelligent website crawling
Uses the Crawlee library for efficient website crawling, with intelligent request scheduling, queue management, automatic retries, and error handling. You can configure the crawl depth, maximum number of pages, and user agent.
Local SQLite storage
All crawled data is stored in a local SQLite database, including HTML content, metadata, response headers, link relationships, and site structure. No external API is required, and the data stays completely private.
More than 25 professional SEO queries
28 built-in professional SQL queries cover five categories: critical issues, content quality, technical SEO, security, and optimization opportunities. Each query comes with detailed impact analysis and repair suggestions.
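As an illustrative sketch of what such a query can look like: the table name and columns below are assumptions for demonstration, not the server's actual schema (that ships with the project). This example flags crawled pages whose meta description is missing or empty:

```python
import sqlite3

# Hypothetical schema; the real crawler's tables may differ.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE pages (
    url TEXT PRIMARY KEY,
    title TEXT,
    meta_description TEXT,
    status_code INTEGER)""")
db.executemany(
    "INSERT INTO pages VALUES (?, ?, ?, ?)",
    [
        ("https://example.com/", "Home", "Welcome to Example.", 200),
        ("https://example.com/about", "About", None, 200),
        ("https://example.com/contact", "Contact", "", 200),
    ],
)

# Flag indexable (200) pages with a missing or empty meta description.
missing = db.execute("""
    SELECT url FROM pages
    WHERE status_code = 200
      AND (meta_description IS NULL OR TRIM(meta_description) = '')
    ORDER BY url
""").fetchall()

for (url,) in missing:
    print(url)
```

The server's built-in queries follow this general pattern: a SQL predicate over the crawl database, returning the affected URLs alongside repair advice.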
Dual usage modes
Supports MCP mode (via an AI assistant) and CLI mode (via the terminal). You can run large-scale crawls in the terminal and then analyze the results in Claude.
Structured analysis report
Generates an easy-to-understand SEO analysis report that organizes issues by priority and category and provides a list of affected URLs with concrete repair suggestions.
Security audit function
Checks for security issues such as missing security headers, insecure external links, and protocol-relative links to help you improve website security.
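A minimal sketch of a missing-security-header check (the header names are standard web security headers; the sample response dict and function are illustrative, not taken from this project):

```python
# Common security headers an audit typically looks for.
EXPECTED = [
    "Strict-Transport-Security",
    "Content-Security-Policy",
    "X-Content-Type-Options",
    "X-Frame-Options",
    "Referrer-Policy",
]

def missing_security_headers(headers: dict) -> list:
    """Return the expected security headers absent from a response."""
    present = {name.lower() for name in headers}  # header names are case-insensitive
    return [h for h in EXPECTED if h.lower() not in present]

# Sample crawled response headers.
crawled = {
    "content-type": "text/html; charset=utf-8",
    "strict-transport-security": "max-age=31536000",
    "x-content-type-options": "nosniff",
}
print(missing_security_headers(crawled))
```

Running the same check against every stored response makes it easy to report which pages lack which headers.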
Advantages
Runs entirely locally, no external API required, ensuring data privacy
Seamlessly integrated with AI assistants, operable via natural language
Supports crawling large websites (thousands of pages)
Provides detailed repair suggestions, not just issue detection
Dual usage modes, flexibly adaptable to different workflows
Based on the mature Crawlee library, stable and reliable
Limitations
Requires a Node.js environment, which may be a hurdle for non-technical users
Large-scale crawling may consume significant system resources
Lacks browser performance metrics such as Core Web Vitals
Cannot detect JavaScript-rendered content (unless using Playwright)
Requires manual configuration of Claude Desktop
Analysis results depend on the integrity of the crawled data
How to use
Installation and configuration
First, make sure Claude Desktop is installed. Then edit the Claude Desktop configuration file and add the SEO Crawler MCP server configuration.
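Claude Desktop reads MCP servers from its `claude_desktop_config.json` under an `mcpServers` key. The entry below is a placeholder sketch — the server name, command, and path are assumptions; check the project's README for the actual values:

```json
{
  "mcpServers": {
    "seo-crawler": {
      "command": "node",
      "args": ["/path/to/seo-crawler-mcp/dist/index.js"]
    }
  }
}
```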
Restart Claude Desktop
After saving the configuration file, restart Claude Desktop for the configuration to take effect. After restarting, four SEO tools will be available.
Start using
In the Claude chat window, use natural language commands to crawl and analyze websites. Claude will automatically call the corresponding tools.
Terminal usage (optional)
For large websites, you can run the crawl directly in the terminal and then provide the result path to Claude for analysis.
Usage examples
Complete SEO audit
Conduct a comprehensive technical SEO check on the website to discover all possible issues affecting rankings.
Security - specific check
Focus on the website's security configuration, checking for missing security headers and potential vulnerabilities.
Content quality analysis
Check the website's content quality for issues such as duplicate titles, missing descriptions, and thin content.
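Two of these checks are easy to picture in miniature. The sample records below are made up, and the 300-word thin-content threshold is a common rule of thumb, not a value taken from this project:

```python
from collections import defaultdict

# Sample crawl records (url, title, word_count); in the real server
# these would come from the SQLite crawl database.
pages = [
    ("https://example.com/", "Example Store", 850),
    ("https://example.com/shoes", "Example Store", 120),
    ("https://example.com/hats", "Hats | Example", 45),
]

# Duplicate titles: group URLs by title, keep groups larger than one.
by_title = defaultdict(list)
for url, title, _ in pages:
    by_title[title].append(url)
duplicate_titles = {t: urls for t, urls in by_title.items() if len(urls) > 1}

# Thin content: pages under an assumed 300-word threshold.
thin = [url for url, _, words in pages if words < 300]

print(duplicate_titles)
print(thin)
```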
Terminal crawling + Claude analysis
For large websites, first run the crawl in the terminal and then analyze the results in Claude.
Frequently Asked Questions
Do I need programming knowledge to use this tool?
Where is the crawled data stored? Is it secure?
Can this tool detect JavaScript-rendered content?
What problems may occur when crawling large websites?
How is this tool different from online SEO tools?
Do I need to pay for Claude Desktop?
Related resources
GitHub repository
Source code, issue tracking, and contribution guidelines
MCP server registry
Official MCP server registry to learn more about MCP servers
How to add an MCP server to Claude Desktop
Detailed configuration tutorial for beginners
Claude Desktop beginners guide
Learn the basic usage of Claude Desktop
Crawlee documentation
Understand the functions and configuration options of the underlying crawling library
Apache 2.0 license
The open-source license used by this project

GitLab MCP Server
Certified
The GitLab MCP server is a project based on the Model Context Protocol that provides a comprehensive toolset for interacting with GitLab accounts, including code review, merge request management, CI/CD configuration, and other functions.
TypeScript
24.4K
4.3 points

Notion API MCP
Certified
A Python-based MCP Server that provides advanced to-do list management and content organization functions through the Notion API, enabling seamless integration between AI models and Notion.
Python
20.4K
4.5 points

DuckDuckGo MCP Server
Certified
The DuckDuckGo Search MCP Server provides web search and content scraping services for LLMs such as Claude.
Python
71.7K
4.3 points

Markdownify MCP
Markdownify is a multi-functional file conversion service that supports converting multiple formats such as PDFs, images, audio, and web page content into Markdown format.
TypeScript
35.3K
5 points

Unity
Certified
UnityMCP is a Unity editor plugin that implements the Model Context Protocol (MCP), providing seamless integration between Unity and AI assistants, including real-time state monitoring, remote command execution, and logging.
C#
32.1K
5 points

Figma Context MCP
Framelink Figma MCP Server provides AI programming tools (such as Cursor) with access to Figma design data. By simplifying Figma API responses, it helps AI convert designs to code more accurately in one click.
TypeScript
65.4K
4.5 points

Gmail MCP Server
A Gmail automatic authentication MCP server designed for Claude Desktop, supporting Gmail management through natural language interaction, including complete functions such as sending emails, label management, and batch operations.
TypeScript
22.0K
4.5 points

Context7
Context7 MCP is a service that provides real-time, version-specific documentation and code examples for AI programming assistants. It is directly integrated into prompts through the Model Context Protocol to solve the problem of LLMs using outdated information.
TypeScript
98.1K
4.7 points
