Web-LLM MCP Server
A local LLM inference MCP server built on Playwright and Web-LLM that provides text generation, chat interaction, and model management through browser automation.
Rating: 2 points
Downloads: 5.1K
What is the Web-LLM MCP Server?
This is a service that runs a local large language model (LLM) in the browser through Playwright, allowing users to perform text generation, chat interaction, and related operations via a web interface. It uses the @mlc-ai/web-llm library for efficient in-browser inference.
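Concretely, the mechanism can be pictured as a Node-side Playwright script driving a page that bundles web-llm. The sketch below is illustrative only: the page URL, the window.generate() helper, and the screenshot path are assumptions, not this server's actual interface.

```typescript
// Sketch: Node-side Playwright driving browser-side Web-LLM inference.
import { chromium } from "playwright";

async function main() {
  const browser = await chromium.launch({ headless: true });
  const page = await browser.newPage();

  // Load the local page that bundles @mlc-ai/web-llm (URL assumed).
  await page.goto("http://localhost:3000");

  // Run inference entirely inside the browser; no external API is called.
  const reply = await page.evaluate(async (prompt) => {
    // window.generate is a hypothetical helper exposed by the page bundle.
    return await (window as any).generate(prompt);
  }, "Summarize what an MCP server does in one sentence.");
  console.log(reply);

  // Capture the interface for debugging (file name assumed).
  await page.screenshot({ path: "webllm-debug.png" });

  await browser.close();
}

main().catch(console.error);
```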
How to use the Web-LLM MCP Server?
Start the server and call the provided tool interfaces to interact with the LLM. The tools cover text generation, chat conversations, model switching, and more, which makes the server well suited to development and testing scenarios.
Applicable Scenarios
Suitable for developers and researchers who need to run local LLMs in the browser, and for application users who want to quickly test different models.
Main Features
Browser-side LLM Inference
Runs a local LLM in the browser and completes text generation tasks without relying on external APIs.
Playwright Integration
Controls the browser through Playwright to interact seamlessly with the LLM.
Multi-model Support
Supports multiple prebuilt models, such as Llama, Phi, and Gemma, and allows switching between them at any time (see the sketch after this feature list).
Real-time Chat Interaction
Provides a chat interface that supports multi-round conversations with the LLM.
Status Monitoring and Screenshots
Exposes the LLM's current status information and captures interface screenshots for debugging.
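The features above map closely onto the @mlc-ai/web-llm API that runs inside the page. A minimal sketch of that browser-side code follows; the model IDs are examples from web-llm's prebuilt list and may not match the models this server ships with.

```typescript
// Sketch of browser-side inference with @mlc-ai/web-llm (runs in the page, not in Node).
import { CreateMLCEngine } from "@mlc-ai/web-llm";

async function demo() {
  // Download (on first use) and initialize a model; this is the slow first run.
  const engine = await CreateMLCEngine("Llama-3.1-8B-Instruct-q4f32_1-MLC", {
    initProgressCallback: (report) => console.log(report.text),
  });

  // OpenAI-style chat completion, executed locally via WebGPU.
  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: "Hello! What can you do?" }],
  });
  console.log(reply.choices[0].message.content);

  // Switching models re-initializes the engine (see Limitations below).
  await engine.reload("Phi-3.5-mini-instruct-q4f16_1-MLC");
}

demo();
```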
Advantages
Runs a local LLM directly in the browser, with no external API and no internet connection needed after the initial model download
Supports multiple models, making it easy to compare their output
Easy to integrate into existing applications
Provides an intuitive user interface and interaction method
Limitations
Downloading model files for the first time may take a long time
Model switching requires re-initialization, increasing waiting time
Has non-trivial hardware requirements (a WebGPU-capable GPU and sufficient memory)
Not suitable for large-scale deployment scenarios
How to Use
Install Dependencies
First, install the dependency packages required for the project.
Install the Browser
Ensure that the Chromium browser used by Playwright is installed.
Start the Server
Run the main program to start the MCP server.
Use the Tools
Interact with the LLM by calling the tool functions the server provides, as sketched below.
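As an illustration of the last step, this is how a TypeScript MCP client could spawn the server and discover its tools. The launch command (node dist/index.js) is an assumption; check the repository for the actual entry point and install commands.

```typescript
// Sketch: connecting to the Web-LLM MCP server from a TypeScript MCP client.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function run() {
  // Spawn the server over stdio (command and arguments assumed).
  const transport = new StdioClientTransport({
    command: "node",
    args: ["dist/index.js"],
  });

  const client = new Client(
    { name: "webllm-demo", version: "1.0.0" },
    { capabilities: {} }
  );
  await client.connect(transport);

  // List the tools the server exposes (text generation, chat, model switching, ...).
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  await client.close();
}

run().catch(console.error);
```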
Usage Examples
Generate an Article
The user wants to generate an article about the development of artificial intelligence.
Multi-round Conversations
The user wants to have multi-round conversations with the AI to discuss a certain topic.
Model Switching
The user wants to compare the output of different models. All three examples are sketched as tool calls below.
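The tool names and argument shapes used here (generate_text, chat, switch_model) are hypothetical placeholders for illustration; use the names returned by listTools() for the real server.

```typescript
// Sketch of the three usage examples as MCP tool calls (tool names assumed).
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function examples() {
  const transport = new StdioClientTransport({ command: "node", args: ["dist/index.js"] });
  const client = new Client(
    { name: "webllm-examples", version: "1.0.0" },
    { capabilities: {} }
  );
  await client.connect(transport);

  // 1. Generate an article about the development of artificial intelligence.
  const article = await client.callTool({
    name: "generate_text",
    arguments: { prompt: "Write a short article about the development of artificial intelligence." },
  });
  console.log(article);

  // 2. Multi-round conversation on a single topic.
  await client.callTool({ name: "chat", arguments: { message: "Let's discuss in-browser inference." } });
  await client.callTool({ name: "chat", arguments: { message: "What are its main trade-offs?" } });

  // 3. Switch to a different model to compare results (model ID assumed).
  await client.callTool({
    name: "switch_model",
    arguments: { model: "Phi-3.5-mini-instruct-q4f16_1-MLC" },
  });

  await client.close();
}

examples().catch(console.error);
```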
Frequently Asked Questions
Why is the first run so slow?
Does it support custom models?
Can it run in non-headless mode?
How to get the help documentation?
Related Resources
Official Documentation
Detailed documentation and usage instructions for @mlc-ai/web-llm.
GitHub Repository
Get the source code and the latest version updates.
Tutorial Video
Watch the video tutorial to learn how to use the server.

Notion Api MCP
Certified
A Python-based MCP Server that provides advanced to-do list management and content organization functions through the Notion API, enabling seamless integration between AI models and Notion.
Python
14.8K
4.5 points

Markdownify MCP
Markdownify is a multi-functional file conversion service that supports converting multiple formats such as PDFs, images, audio, and web page content into Markdown format.
TypeScript
23.8K
5 points

GitLab MCP Server
Certified
The GitLab MCP server is a project based on the Model Context Protocol that provides a comprehensive toolset for interacting with GitLab accounts, including code review, merge request management, CI/CD configuration, and other functions.
TypeScript
15.7K
4.3 points

DuckDuckGo MCP Server
Certified
The DuckDuckGo Search MCP Server provides web search and content scraping services for LLMs such as Claude.
Python
45.2K
4.3 points

Unity
Certified
UnityMCP is a Unity editor plugin that implements the Model Context Protocol (MCP), providing seamless integration between Unity and AI assistants, including real-time state monitoring, remote command execution, and logging.
C#
20.3K
5 points

Figma Context MCP
Framelink Figma MCP Server provides access to Figma design data for AI programming tools (such as Cursor). By simplifying the Figma API response, it helps AI more accurately achieve one-click conversion from design to code.
TypeScript
45.1K
4.5 points

Gmail MCP Server
A Gmail MCP server with automatic authentication, designed for Claude Desktop. It supports managing Gmail through natural language interaction, including sending emails, label management, and batch operations.
TypeScript
16.0K
4.5 points

MiniMax MCP Server
The MiniMax Model Context Protocol (MCP) server is an official server that provides access to powerful text-to-speech and video/image generation APIs, and works with various client tools such as Claude Desktop and Cursor.
Python
30.7K
4.8 points
