Web-LLM MCP Server

A local LLM inference MCP server built on Playwright and Web-LLM, providing text generation, chat interaction, and model management through browser automation.

What is the Web-LLM MCP Server?

This service runs a local large language model (LLM) in the browser through Playwright, letting users perform text generation, chat interaction, and similar operations via a web interface. It uses the @mlc-ai/web-llm library for efficient in-browser inference.

How to use the Web-LLM MCP Server?

Start the server and call the provided tool interfaces to interact with the LLM. Available functions include text generation, chat conversation, and model switching, which makes the server well suited to development and testing.
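Under MCP, each of these functions is exposed as a tool that the client invokes with a JSON-RPC `tools/call` request. The sketch below builds such a request for a hypothetical `chat` tool with a `message` argument; the tool and parameter names are assumptions, so consult the server's `tools/list` response for the real schema.

```typescript
// Build a JSON-RPC 2.0 request for an MCP tool call.
// Tool name and arguments are illustrative, not this server's confirmed schema.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

function buildToolCall(
  id: number,
  name: string,
  args: Record<string, unknown>
): ToolCallRequest {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

// Example: ask the (hypothetical) "chat" tool a question.
const request = buildToolCall(1, "chat", {
  message: "Summarize the history of AI in two sentences.",
});
console.log(JSON.stringify(request));
```

The client serializes this object onto the server's stdio or HTTP transport; the response carries the model's output in the tool result.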

Applicable Scenarios

Suitable for developers and researchers who need to run local LLMs in the browser, and for users who want to quickly test different models.

Main Features

Browser-side LLM Inference
Runs a local LLM in the browser and completes text generation tasks without relying on external APIs.
Playwright Integration
Uses Playwright to automate the browser, enabling programmatic interaction with the LLM.
Multi-model Support
Supports multiple pre-trained models (such as Llama, Phi, and Gemma) and can switch between them at any time.
Real-time Chat Interaction
Provides a chat interface that supports multi-round conversations with the LLM.
Status Monitoring and Screenshots
Reports the LLM's current status and captures interface screenshots for debugging.
Advantages
Runs a local LLM directly in the browser; after the initial model download, no internet connection is needed
Supports multiple models, making it easy to compare their output
Easy to integrate into existing applications
Provides an intuitive user interface and interaction model
Limitations
The initial model download can take a long time
Switching models requires re-initialization, which adds waiting time
Places non-trivial demands on hardware resources
Not suitable for large-scale deployment
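The re-initialization cost on model switches can be made explicit in client-side bookkeeping: switching forces a fresh engine load, while repeated calls on the same model reuse it. A minimal TypeScript sketch, with model IDs that merely follow web-llm's naming style and are not taken from this server:

```typescript
// Track the currently loaded model and count (re)initializations,
// mirroring the limitation that every model switch pays a full reload.
class ModelSession {
  private current: string | null = null;
  initCount = 0;

  // Load `modelId` only if it differs from the active model.
  // Returns true when a (costly) initialization was required.
  ensureLoaded(modelId: string): boolean {
    if (this.current === modelId) return false; // reuse, no reload
    this.current = modelId;
    this.initCount++; // stands in for the expensive re-initialization
    return true;
  }
}

const session = new ModelSession();
session.ensureLoaded("Llama-3.1-8B-Instruct-q4f16_1-MLC"); // first load
session.ensureLoaded("Llama-3.1-8B-Instruct-q4f16_1-MLC"); // reused
session.ensureLoaded("Phi-3.5-mini-instruct-q4f16_1-MLC"); // switch: reload
console.log(session.initCount); // 2
```

Batching all requests for one model before switching to the next keeps `initCount`, and therefore total waiting time, low.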

How to Use

Install Dependencies
First, install the project's dependency packages.
Install the Browser
Ensure that Chromium is installed (Playwright can download it for you).
Start the Server
Run the main program to start the MCP server.
Use the Tools
Interact with the LLM by calling the provided tool functions.
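Assuming a standard Node.js project layout (the package manager and entry point are guesses, since the repository details are not shown here), the steps above map to commands like:

```shell
npm install                      # 1. install project dependencies
npx playwright install chromium  # 2. fetch the Chromium build Playwright controls
node index.js                    # 3. start the MCP server (entry point assumed)
# 4. call the exposed tools (generate, chat, switch model) from your MCP client
```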

Usage Examples

Generate an Article
Generate long-form text, for example an article on the development of artificial intelligence.
Multi-round Conversations
Hold a multi-round conversation with the model to discuss a topic in depth.
Model Switching
Switch between models to compare how each one handles the same task.
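For the multi-round case, the client typically keeps the running message history and sends the full context with each turn. A sketch of that bookkeeping, using the common chat-completion role convention (an illustration, not this server's confirmed API):

```typescript
// Accumulate a multi-round conversation so later turns see full context.
type Role = "system" | "user" | "assistant";
interface ChatMessage { role: Role; content: string; }

class Conversation {
  readonly messages: ChatMessage[] = [];

  constructor(systemPrompt?: string) {
    if (systemPrompt) this.messages.push({ role: "system", content: systemPrompt });
  }

  addUserTurn(content: string): void {
    this.messages.push({ role: "user", content });
  }

  addAssistantReply(content: string): void {
    this.messages.push({ role: "assistant", content });
  }
}

const convo = new Conversation("You are a concise assistant.");
convo.addUserTurn("What is WebGPU?");
convo.addAssistantReply("A browser API for GPU compute and graphics.");
convo.addUserTurn("How does web-llm use it?");
console.log(convo.messages.length); // 4
```

On each turn the whole `messages` array is passed to the chat tool, which is what lets the model refer back to earlier rounds.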

Frequently Asked Questions

Why is the first run so slow?
Does it support custom models?
Can it run in non-headless mode?
How do I get the help documentation?

Related Resources

Official Documentation
Learn detailed information and usage methods of @mlc-ai/web-llm.
GitHub Repository
Get the source code and the latest version updates.
Tutorial Video
Watch the video tutorial to learn how to use the server.

Installation

Copy the following command into your client's configuration.
Note: your key is sensitive information; do not share it with anyone.
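The exact command is not reproduced here. For a local stdio MCP server, a typical client configuration looks something like the following; the server name, command, and path are assumptions, not values taken from this project:

```json
{
  "mcpServers": {
    "web-llm": {
      "command": "node",
      "args": ["/path/to/web-llm-mcp-server/index.js"]
    }
  }
}
```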

Alternatives

Vestige
Vestige is an AI memory engine based on cognitive science. By implementing 29 neuroscience modules such as prediction error gating, FSRS-6 spaced repetition, and memory dreaming, it provides long-term memory capabilities for AI. It includes a 3D visualization dashboard and 21 MCP tools, runs completely locally, and does not require the cloud.
Rust
10.6K
4.5 points
Moltbrain
MoltBrain is a long-term memory layer plugin designed for OpenClaw, MoltBook, and Claude Code, capable of automatically learning and recalling project context, providing intelligent search, observation recording, analysis statistics, and persistent storage functions.
TypeScript
9.2K
4.5 points
Bm.md
A feature-rich Markdown typesetting tool that supports multiple style themes and platform adaptation, providing real-time editing preview, image export, and API integration capabilities
TypeScript
14.9K
5 points
Security Detections MCP
Security Detections MCP is a server based on the Model Context Protocol that allows LLMs to query a unified security detection rule database covering Sigma, Splunk ESCU, Elastic, and KQL formats. The latest version 3.0 is upgraded to an autonomous detection engineering platform that can automatically extract TTPs from threat intelligence, analyze coverage gaps, generate SIEM-native format detection rules, run tests, and verify. The project includes over 71 tools, 11 pre-built workflow prompts, and a knowledge graph system, supporting multiple SIEM platforms.
TypeScript
7.8K
4 points
Paperbanana
Python
9.1K
5 points
Better Icons
An MCP server and CLI tool that provides search and retrieval of over 200,000 icons, supports more than 150 icon libraries, and helps AI assistants and developers quickly obtain and use icons.
TypeScript
9.7K
4.5 points
Assistant Ui
assistant-ui is an open-source TypeScript/React library for quickly building production-grade AI chat interfaces, providing composable UI components, streaming responses, accessibility, etc., and supporting multiple AI backends and models.
TypeScript
10.0K
5 points
Apify MCP Server
The Apify MCP Server is a tool based on the Model Context Protocol (MCP) that allows AI assistants to extract data from websites such as social media, search engines, and e-commerce through thousands of ready-to-use crawlers, scrapers, and automation tools (Apify Actors). It supports OAuth and Skyfire proxy payment and can be integrated into MCP clients such as Claude and VS Code through HTTPS endpoints or local stdio.
TypeScript
9.0K
5 points
Markdownify MCP
Markdownify is a multi-functional file conversion service that supports converting multiple formats such as PDFs, images, audio, and web page content into Markdown format.
TypeScript
38.2K
5 points
Gitlab MCP Server
Certified
The GitLab MCP server is a project based on the Model Context Protocol that provides a comprehensive toolset for interacting with GitLab accounts, including code review, merge request management, CI/CD configuration, and other functions.
TypeScript
28.5K
4.3 points
Duckduckgo MCP Server
Certified
The DuckDuckGo Search MCP Server provides web search and content scraping services for LLMs such as Claude.
Python
81.9K
4.3 points
Notion Api MCP
Certified
A Python-based MCP Server that provides advanced to-do list management and content organization functions through the Notion API, enabling seamless integration between AI models and Notion.
Python
24.1K
4.5 points
Unity
Certified
UnityMCP is a Unity editor plugin that implements the Model Context Protocol (MCP), providing seamless integration between Unity and AI assistants, including real-time state monitoring, remote command execution, and log functions.
C#
38.6K
5 points
Figma Context MCP
Framelink Figma MCP Server is a server that provides access to Figma design data for AI programming tools (such as Cursor). By simplifying the Figma API response, it helps AI more accurately achieve one-click conversion from design to code.
TypeScript
69.9K
4.5 points
Gmail MCP Server
A Gmail automatic authentication MCP server designed for Claude Desktop, supporting Gmail management through natural language interaction, including complete functions such as sending emails, label management, and batch operations.
TypeScript
24.1K
4.5 points
Minimax MCP Server
The MiniMax Model Context Protocol (MCP) is an official server that supports interaction with powerful text-to-speech, video/image generation APIs, and is suitable for various client tools such as Claude Desktop and Cursor.
Python
55.5K
4.8 points