LM Studio MCP Server

The LM Studio MCP Server is a service based on the Model Context Protocol that allows AI assistants to remotely manage models through the LM Studio API, including viewing, loading, and unloading them.

What is the LM Studio MCP Server?

The LM Studio MCP Server is a connection bridge that allows AI assistants (such as Claude Desktop) to communicate with your local LM Studio software. Through this server, AI assistants can view which models you have downloaded, load models into memory, unload models to free up resources, and obtain detailed information about the models. In simple terms, it enables your AI assistant to directly manage the AI models on your computer as conveniently as local operations.
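Under the hood, this kind of bridge ultimately talks to LM Studio's local REST API; LM Studio exposes an OpenAI-compatible endpoint, GET /v1/models, that lists downloaded models. The sketch below shows how a "view models" operation reduces to parsing that response. The helper and type names are illustrative, not the MCP server's actual API surface.

```typescript
// Minimal sketch, assuming LM Studio's OpenAI-compatible endpoint
// GET http://<host>:<port>/v1/models, which returns { data: [{ id, ... }] }.
// Helper names are illustrative, not the MCP server's real tool names.

interface ModelEntry {
  id: string;      // model identifier, e.g. "llama-3.2-1b-instruct"
  object: string;  // always "model" in OpenAI-style responses
}

interface ModelList {
  object: string;  // "list"
  data: ModelEntry[];
}

// Reduce a /v1/models-style response to the list of model identifiers.
function listModelIds(body: ModelList): string[] {
  return body.data.map((m) => m.id);
}

// Example payload shaped like LM Studio's response (values illustrative):
const sampleResponse: ModelList = {
  object: "list",
  data: [
    { id: "llama-3.2-1b-instruct", object: "model" },
    { id: "qwen2.5-coder-7b-instruct", object: "model" },
  ],
};

console.log(listModelIds(sampleResponse));
// → [ 'llama-3.2-1b-instruct', 'qwen2.5-coder-7b-instruct' ]
```

An assistant-facing "list models" tool would fetch this endpoint and hand the parsed identifiers back to the model as its tool result.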

How to use the LM Studio MCP Server?

Using the LM Studio MCP Server takes three steps:
1. Make sure LM Studio is running and its local server is enabled.
2. Add the MCP server to your AI assistant's configuration (such as Claude Desktop).
3. Send instructions through the AI assistant to manage your models.
No code is required; only minimal configuration.
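The configuration step sets LMSTUDIO_HOST and LMSTUDIO_PORT (see the Installation section). A minimal sketch of how such a server could derive its base URL from those variables and probe LM Studio is below; the function names are hypothetical, and the probe assumes LM Studio's default port 1234 and its OpenAI-compatible /v1/models endpoint.

```typescript
// Build the LM Studio base URL from the same environment variables used in
// the Installation configs, falling back to LM Studio's defaults.
function lmStudioBaseUrl(env: Record<string, string | undefined>): string {
  const host = env.LMSTUDIO_HOST ?? "127.0.0.1";
  const port = env.LMSTUDIO_PORT ?? "1234";
  return `http://${host}:${port}`;
}

// Probe the server: a reachable LM Studio instance answers GET /v1/models.
async function healthCheck(env: Record<string, string | undefined>): Promise<boolean> {
  try {
    const res = await fetch(`${lmStudioBaseUrl(env)}/v1/models`);
    return res.ok;
  } catch {
    return false; // LM Studio not running, or the local server is disabled
  }
}

console.log(lmStudioBaseUrl({})); // → http://127.0.0.1:1234
```

If the health check fails, the first thing to verify is step 1: that LM Studio's local server is actually enabled and listening on the configured port.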

Applicable Scenarios

This server is particularly suited to the following scenarios:
• Quickly switching between different AI models.
• Managing local models directly through an AI assistant.
• Monitoring the memory usage of models.
• Automating the model loading and unloading process.
• Using different specialized models across multiple projects.

Main Features

Health Check
Verify the connection status with the LM Studio server to ensure normal communication.
View Model Library
Display all the downloaded AI models in your LM Studio, including model names, sizes, and architecture information.
View Loaded Models
Show the models currently running in memory to understand resource usage.
Load Model
Load the specified model into memory, and you can customize the identifier and configuration parameters.
Unload Model
Remove the specified model from memory to free up system resources.
Get Model Details
View the detailed information of the loaded model, including configuration parameters and functional features.
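The "View Loaded Models" feature above can be sketched as a filter over LM Studio's enhanced REST API (GET /api/v0/models), which reports a per-model state field in its beta documentation. Treat the exact field shape and the function name here as assumptions, not the server's verified implementation.

```typescript
// Hedged sketch: filter an /api/v0/models-style listing down to the models
// currently loaded in memory. The `state` field is taken from LM Studio's
// beta REST API docs; the exact response shape is an assumption.
interface V0Model {
  id: string;
  state: "loaded" | "not-loaded";
  type?: string; // e.g. "llm" or "embeddings"
}

function loadedModelIds(models: V0Model[]): string[] {
  return models.filter((m) => m.state === "loaded").map((m) => m.id);
}

// Illustrative data:
const models: V0Model[] = [
  { id: "llama-3.2-1b-instruct", state: "loaded", type: "llm" },
  { id: "qwen2.5-coder-7b-instruct", state: "not-loaded", type: "llm" },
];

console.log(loadedModelIds(models)); // → [ 'llama-3.2-1b-instruct' ]
```

The same listing is what the "Unload Model" feature would consult before freeing memory: only identifiers that come back as loaded are candidates for unloading.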
Advantages
No manual operation required: Manage models through the AI assistant without opening the LM Studio interface.
Resource optimization: Unload unused models at any time to free RAM and VRAM.
Flexible configuration: Supports custom model identifiers and parameter configurations.
Status monitoring: View model loading and connection status in real time.
Cross-platform support: Supports Docker deployment and can run in different environments.
Limitations
Depends on LM Studio: The LM Studio software must be installed and running first.
Local connection: Usually needs to run on the same computer or network as LM Studio.
Learning curve: Basic configuration and commands must be understood.
Model compatibility: Only supports model formats that LM Studio supports.

How to Use

Install LM Studio
First, ensure that you have installed the LM Studio software and enabled the local server function. In the settings of LM Studio, find and enable the 'Local Server' option.
Configure Claude Desktop
Open the Claude Desktop configuration file and add the LM Studio MCP server entry. You can run the server via npx, a local development checkout, or Docker.
Start and Use
Restart Claude Desktop, and then you can manage your models through conversations. You can ask Claude to list models, load specific models, etc.

Usage Examples

Quickly Switch Between Different Models
When you need to use different models for different tasks, you can quickly switch without manually operating the LM Studio interface.
Model Resource Management
When you find that your computer's memory is insufficient, you can check which models are occupying resources and unload the unnecessary models.
Batch Model Operations
Before starting work, load all the required models at once to improve work efficiency.

Frequently Asked Questions

Do I need to install Node.js to use it?
Does LM Studio have to be on the same computer as the MCP server?
What should I do if the model fails to load?
Can I load multiple models simultaneously?
How do I know if a model supports specific functions?

Related Resources

LM Studio Official Website
Download the LM Studio software and obtain official documentation.
Model Context Protocol Documentation
Understand the technical specifications and standards of the MCP protocol.
GitHub Repository
View the source code, submit issues, and participate in development.
Docker Hub Image
Obtain the Docker image and view the container configuration.

Installation

Copy one of the following configurations into your MCP client. The three variants run the server via npx, a local development checkout, and Docker, respectively:
{
  "mcpServers": {
    "lmstudio": {
      "command": "npx",
      "args": ["@portertech/lm-studio-mcp-server"],
      "env": {
        "LMSTUDIO_HOST": "127.0.0.1",
        "LMSTUDIO_PORT": "1234"
      }
    }
  }
}

{
  "mcpServers": {
    "lmstudio": {
      "command": "npx",
      "args": ["tsx", "/path/to/lm-studio-mcp-server/src/index.ts"],
      "env": {
        "LMSTUDIO_HOST": "127.0.0.1",
        "LMSTUDIO_PORT": "1234"
      }
    }
  }
}

{
  "mcpServers": {
    "lmstudio": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "LMSTUDIO_HOST=127.0.0.1",
        "-e",
        "LMSTUDIO_PORT=1234",
        "portertech/lm-studio-mcp-server:latest"
      ]
    }
  }
}

Alternatives

V
Vestige
Vestige is an AI memory engine based on cognitive science. By implementing 29 neuroscience-inspired modules such as prediction-error gating, FSRS-6 spaced repetition, and memory dreaming, it provides long-term memory capabilities for AI. It includes a 3D visualization dashboard and 21 MCP tools, runs completely locally, and does not require the cloud.
Rust
10.6K
4.5 points
M
Moltbrain
MoltBrain is a long-term memory layer plugin designed for OpenClaw, MoltBook, and Claude Code, capable of automatically learning and recalling project context, providing intelligent search, observation recording, analysis statistics, and persistent storage functions.
TypeScript
9.2K
4.5 points
B
Bm.md
A feature-rich Markdown typesetting tool that supports multiple style themes and platform adaptation, providing real-time editing preview, image export, and API integration capabilities
TypeScript
14.9K
5 points
S
Security Detections MCP
Security Detections MCP is a server based on the Model Context Protocol that allows LLMs to query a unified security detection rule database covering Sigma, Splunk ESCU, Elastic, and KQL formats. The latest version 3.0 is upgraded to an autonomous detection engineering platform that can automatically extract TTPs from threat intelligence, analyze coverage gaps, generate SIEM-native format detection rules, run tests, and verify. The project includes over 71 tools, 11 pre-built workflow prompts, and a knowledge graph system, supporting multiple SIEM platforms.
TypeScript
7.8K
4 points
P
Paperbanana
Python
9.1K
5 points
B
Better Icons
An MCP server and CLI tool that provides search and retrieval of over 200,000 icons, supports more than 150 icon libraries, and helps AI assistants and developers quickly obtain and use icons.
TypeScript
9.7K
4.5 points
A
Assistant Ui
assistant-ui is an open-source TypeScript/React library for quickly building production-grade AI chat interfaces, providing composable UI components, streaming responses, accessibility, and more, and supporting multiple AI backends and models.
TypeScript
10.0K
5 points
A
Apify MCP Server
The Apify MCP Server is a tool based on the Model Context Protocol (MCP) that allows AI assistants to extract data from websites such as social media, search engines, and e-commerce through thousands of ready-to-use crawlers, scrapers, and automation tools (Apify Actors). It supports OAuth and Skyfire proxy payment and can be integrated into MCP clients such as Claude and VS Code through HTTPS endpoints or local stdio.
TypeScript
9.0K
5 points
M
Markdownify MCP
Markdownify is a multi-functional file conversion service that supports converting multiple formats such as PDFs, images, audio, and web page content into Markdown format.
TypeScript
38.2K
5 points
N
Notion Api MCP
Certified
A Python-based MCP Server that provides advanced to-do list management and content organization functions through the Notion API, enabling seamless integration between AI models and Notion.
Python
24.1K
4.5 points
D
Duckduckgo MCP Server
Certified
The DuckDuckGo Search MCP Server provides web search and content scraping services for LLMs such as Claude.
Python
80.7K
4.3 points
G
Gitlab MCP Server
Certified
The GitLab MCP server is a project based on the Model Context Protocol that provides a comprehensive toolset for interacting with GitLab accounts, including code review, merge request management, CI/CD configuration, and other functions.
TypeScript
27.6K
4.3 points
U
Unity
Certified
UnityMCP is a Unity editor plugin that implements the Model Context Protocol (MCP), providing seamless integration between Unity and AI assistants, including real-time state monitoring, remote command execution, and logging.
C#
38.6K
5 points
F
Figma Context MCP
Framelink Figma MCP Server is a server that provides AI programming tools (such as Cursor) with access to Figma design data. By simplifying Figma API responses, it helps AI convert designs to code more accurately in one click.
TypeScript
69.9K
4.5 points
C
Context7
Context7 MCP is a service that provides real-time, version-specific documentation and code examples for AI programming assistants. It is directly integrated into prompts through the Model Context Protocol to solve the problem of LLMs using outdated information.
TypeScript
107.8K
4.7 points
G
Gmail MCP Server
A Gmail automatic authentication MCP server designed for Claude Desktop, supporting Gmail management through natural language interaction, including complete functions such as sending emails, label management, and batch operations.
TypeScript
25.1K
4.5 points
AIBase
Zhiqi Future, Your AI Solution Think Tank
© 2026 AIBase