LacyLights MCP Server
An MCP (Model Context Protocol) server that offers AI-powered theatrical lighting design capabilities for the LacyLights system.
Features
Fixture Management
get_fixture_inventory - Query available lighting fixtures and their capabilities
analyze_fixture_capabilities - Analyze specific fixtures for color mixing, positioning, effects, etc.
Scene Generation
generate_scene - Generate lighting scenes based on script context and design preferences
analyze_script - Extract lighting-relevant information from theatrical scripts
optimize_scene - Optimize existing scenes for energy efficiency, dramatic impact, etc.
Cue Management
create_cue_sequence - Create sequences of lighting cues from existing scenes
generate_act_cues - Generate complete cue suggestions for theatrical acts
optimize_cue_timing - Optimize cue timing for smooth transitions or dramatic effect
analyze_cue_structure - Analyze and recommend improvements to cue lists
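Each of these tools is exposed through MCP's standard tools/call request. For illustration only, a client invoking analyze_fixture_capabilities sends a JSON-RPC message roughly like the one below; the argument name is an assumption, not the tool's actual input schema.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "analyze_fixture_capabilities",
    "arguments": {
      "fixtureId": "example-fixture-id"
    }
  }
}
```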
Installation
- Install dependencies:
npm install
- Set up environment variables, then edit .env with your settings (see Configuration below):
cp .env.example .env
- Build the project:
npm run build
ChromaDB Setup (Optional)
The MCP server currently uses an in-memory pattern storage system for simplicity. If you want to use ChromaDB for persistent vector storage and more advanced RAG capabilities:
Option 1: Docker (Recommended)
docker-compose up -d chromadb
Verify that ChromaDB is responding:
curl http://localhost:8000/api/v2/heartbeat
Option 2: Local Installation
pip install chromadb
chroma run --host localhost --port 8000
Then update your .env file:
CHROMA_HOST=localhost
CHROMA_PORT=8000
Important Note
The current implementation works without ChromaDB using built-in lighting patterns. ChromaDB enhances the system with vector similarity search for more sophisticated pattern matching.
Configuration
Required Environment Variables
OPENAI_API_KEY - OpenAI API key for AI-powered lighting generation
LACYLIGHTS_GRAPHQL_ENDPOINT - GraphQL endpoint for your lacylights-node backend (default: http://localhost:4000/graphql)
Running the Server
Make sure your lacylights-node backend is running first. For development:
npm run dev
For production, build and then start the compiled server:
npm run build
npm start
You should see:
RAG service initialized with in-memory patterns
LacyLights MCP Server running on stdio
Optional Environment Variables
CHROMA_HOST - ChromaDB host for RAG functionality (default: localhost)
CHROMA_PORT - ChromaDB port (default: 8000)
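Putting it together, a minimal .env for local development might look like this (values are placeholders; the ChromaDB entries are only needed if you enable ChromaDB):

```
OPENAI_API_KEY=your_openai_api_key_here
LACYLIGHTS_GRAPHQL_ENDPOINT=http://localhost:4000/graphql

# Optional, only if ChromaDB is enabled
CHROMA_HOST=localhost
CHROMA_PORT=8000
```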
Usage Examples
Running the Server
Development mode:
npm run dev
Production mode (after npm run build):
npm start
Integration with Claude
Add this server to your Claude configuration:
{
"mcpServers": {
"lacylights": {
"command": "/usr/local/bin/node",
"args": ["/Users/bernard/src/lacylights/lacylights-mcp/run-mcp.js"],
"env": {
"OPENAI_API_KEY": "your_openai_api_key_here",
"LACYLIGHTS_GRAPHQL_ENDPOINT": "http://localhost:4000/graphql"
}
}
}
}
Important Note
If the above doesn't work, you may need to specify the exact path to your Node.js 18+ installation. You can find it with:
which node
Usage Tip
Use the absolute path to run-mcp.js in your configuration. This wrapper ensures proper CommonJS module loading.
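For reference, a wrapper of this kind is usually just a small launcher that loads the compiled server; the sketch below is illustrative and may differ from the actual run-mcp.js shipped in this repository.

```js
// Illustrative CommonJS launcher (the real run-mcp.js may differ).
// A dynamic import() lets a CommonJS entry point load the compiled ES module build.
import("./dist/index.js").catch((err) => {
  console.error("Failed to start LacyLights MCP server:", err);
  process.exit(1);
});
```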
Basic Usage
Generate a Scene
Use the generate_scene tool to create lighting for:
- Scene: "Lady Macbeth's sleepwalking scene"
- Script Context: "A dark castle at night, Lady Macbeth enters carrying a candle, tormented by guilt"
- Mood: "mysterious"
- Color Palette: ["deep blue", "pale white", "cold"]
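In an MCP client, this corresponds to a generate_scene tool call whose arguments might look roughly like the following; the parameter names here are illustrative, and the authoritative input schema lives in src/tools/scene-tools.ts.

```json
{
  "name": "generate_scene",
  "arguments": {
    "sceneDescription": "Lady Macbeth's sleepwalking scene",
    "scriptContext": "A dark castle at night, Lady Macbeth enters carrying a candle, tormented by guilt",
    "mood": "mysterious",
    "colorPalette": ["deep blue", "pale white", "cold"]
  }
}
```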
Analyze a Script
Use the analyze_script tool with the full text of Act 1 of Macbeth to:
- Extract all lighting cues
- Suggest scenes for each moment
- Identify key lighting moments
Create Cue Sequence
Use the create_cue_sequence tool to create a cue list for Act 1 using the scenes generated from script analysis.
Technical Details
AI-Powered Features
Script Analysis with RAG
- Analyzes theatrical scripts to extract lighting-relevant information
- Uses vector embeddings to match script contexts with lighting patterns
- Provides intelligent suggestions based on dramatic context
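The core idea is vector similarity: script excerpts and stored lighting patterns are embedded, and the closest patterns guide generation. The sketch below shows the principle only; the server's actual rag-service.ts may be organized differently.

```typescript
// Illustrative embedding-based pattern matching (not the actual rag-service.ts).
interface LightingPattern {
  description: string; // e.g. "moonlit night, cold and still"
  embedding: number[]; // vector produced by an embedding model
}

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank stored patterns against the embedding of a script excerpt.
function findClosestPatterns(query: number[], patterns: LightingPattern[], topK = 3): LightingPattern[] {
  return [...patterns]
    .sort((p, q) => cosineSimilarity(query, q.embedding) - cosineSimilarity(query, p.embedding))
    .slice(0, topK);
}
```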
Intelligent Scene Generation
- Creates detailed DMX values for fixtures based on artistic intent
- Considers fixture capabilities and positioning
- Applies color theory and lighting design principles
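Conceptually, a generated scene is a set of per-fixture DMX channel values (0-255). The shape below is a hypothetical illustration; the real definitions live in src/types/lighting.ts.

```typescript
// Hypothetical scene shape for illustration; see src/types/lighting.ts for the real types.
interface FixtureValue {
  fixtureId: string;
  channelValues: number[]; // one 0-255 value per DMX channel of the fixture
}

interface GeneratedScene {
  name: string;        // e.g. "Lady Macbeth - Sleepwalking"
  description: string; // the artistic intent behind the look
  fixtureValues: FixtureValue[];
}
```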
Cue Timing Optimization
- Analyzes cue sequences for optimal timing
- Considers dramatic pacing and technical constraints
- Provides multiple optimization strategies
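As a simplified illustration of one such strategy, a "smooth transitions" pass might enforce a minimum fade time across a cue list; the field names below are assumptions, not the server's actual types.

```typescript
// Hypothetical timing pass: guarantee a minimum fade for smoother transitions.
interface Cue {
  name: string;
  fadeInTime: number;  // seconds
  fadeOutTime: number; // seconds
  followTime?: number; // auto-follow delay, if any
}

function smoothTransitions(cues: Cue[], minFade = 1.0): Cue[] {
  return cues.map((cue) => ({
    ...cue,
    fadeInTime: Math.max(cue.fadeInTime, minFade),
    fadeOutTime: Math.max(cue.fadeOutTime, minFade),
  }));
}
```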
Development
Project Structure
src/
├── tools/              # MCP tool implementations
│   ├── fixture-tools.ts
│   ├── scene-tools.ts
│   └── cue-tools.ts
├── services/           # Core services
│   ├── graphql-client.ts
│   ├── rag-service.ts
│   └── ai-lighting.ts
├── types/              # TypeScript type definitions
│   └── lighting.ts
└── index.ts            # MCP server entry point
Adding New Tools
- Create the tool implementation in the appropriate file under src/tools/
- Add the tool definition to src/index.ts in the ListToolsRequestSchema handler
- Add the tool handler in the CallToolRequestSchema handler (see the sketch below)
- Update this README with tool documentation
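A minimal sketch of steps 2 and 3, using a hypothetical my_new_tool, might look like this; adapt the existing handlers in src/index.ts rather than copying it verbatim.

```typescript
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { ListToolsRequestSchema, CallToolRequestSchema } from "@modelcontextprotocol/sdk/types.js";

// "server" refers to the MCP Server instance already created in src/index.ts.
declare const server: Server;

server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [
    // ...existing tool definitions...
    {
      name: "my_new_tool", // hypothetical tool for illustration
      description: "Describe what the tool does for the AI client",
      inputSchema: {
        type: "object",
        properties: { sceneId: { type: "string" } },
        required: ["sceneId"],
      },
    },
  ],
}));

server.setRequestHandler(CallToolRequestSchema, async (request) => {
  if (request.params.name === "my_new_tool") {
    const { sceneId } = request.params.arguments as { sceneId: string };
    // ...delegate to the corresponding implementation under src/tools/...
    return { content: [{ type: "text", text: `Handled scene ${sceneId}` }] };
  }
  throw new Error(`Unknown tool: ${request.params.name}`);
});
```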
Testing
npm test
Integration with LacyLights
This MCP server is designed to work with the existing LacyLights system:
- lacylights-node - Provides GraphQL API for fixture and scene management
- lacylights-fe - Frontend for manual lighting control and visualization
The MCP server acts as an AI layer that enhances the existing system with intelligent automation and design assistance.
Troubleshooting
Common Issues
- Module import errors
  - The server uses ES modules with cross-fetch for GraphQL requests
  - Ensure your Node.js version is 18+, as specified in package.json
- GraphQL connection errors
  - Verify your lacylights-node backend is running on port 4000
  - Check the LACYLIGHTS_GRAPHQL_ENDPOINT environment variable
- OpenAI API errors
  - Ensure your OPENAI_API_KEY is set in the .env file
  - Verify the API key has access to GPT-4
- MCP connection errors in Claude
  - Make sure to use the run-mcp.js wrapper script, not dist/index.js directly
  - Use the full absolute path in your Claude configuration
  - Restart Claude after updating the MCP configuration
- "Unexpected token ?" error
  - This means Claude is using an old Node.js version (< 14)
  - Update your config to use the full path to your Node.js installation; on macOS with Homebrew: "command": "/opt/homebrew/bin/node"
  - On other systems, find your node path with: which node
Dependencies
The simplified implementation uses:
- Direct fetch requests instead of Apollo Client for better ESM compatibility
- In-memory pattern storage instead of ChromaDB (can be upgraded later)
- Cross-fetch polyfill for Node.js fetch support
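For illustration, a direct GraphQL request over fetch looks roughly like the sketch below; the server's real client lives in src/services/graphql-client.ts, and the query passed in would be an actual LacyLights operation.

```typescript
import fetch from "cross-fetch";

// Minimal sketch of a direct GraphQL request without Apollo Client.
async function queryGraphQL<T>(query: string, variables?: Record<string, unknown>): Promise<T> {
  const endpoint = process.env.LACYLIGHTS_GRAPHQL_ENDPOINT ?? "http://localhost:4000/graphql";
  const response = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query, variables }),
  });
  const { data, errors } = await response.json();
  if (errors?.length) {
    throw new Error(errors[0].message);
  }
  return data as T;
}
```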
License
MIT