Ctx Sys
ctx-sys is a local-first hybrid retrieval-augmented generation (RAG) tool designed for AI programming assistants. It parses the codebase with a tree-sitter parser and combines semantic vector search, keyword retrieval, and graph relationship traversal to give AI assistants a deep understanding of the project. All operations run locally to protect code privacy.
Rating: 2.5 points
Downloads: 3.1K
What is ctx-sys?
ctx-sys is a local-first code intelligence and retrieval system designed for AI programming assistants. It addresses the problem that AI assistants cannot view an entire codebase because of context window limits. By indexing your code, understanding the relationships between symbols, and retrieving the right context when the AI needs it, ctx-sys acts like an intelligent librarian, giving the AI assistant a deep understanding of your project.
How to use ctx-sys?
Using ctx-sys is straightforward: first, install and start Ollama. Then initialize and index your project. Finally, connect ctx-sys to your AI assistant (such as Claude Desktop or Cursor). Once connected, your AI assistant can access your codebase through 12 tools for intelligent search and context retrieval.
Applicable scenarios
ctx-sys is particularly suitable for the following scenarios:
1. Large codebases that an AI assistant cannot understand all at once.
2. Complex projects that require cross-file understanding of code relationships.
3. Long-term development projects where you want the AI assistant to remember earlier conversations.
4. Projects with code privacy requirements where the code must not leave the local environment.
Main features
Hybrid RAG retrieval
Combines three retrieval methods: vector search, keyword search, and graph traversal, and uses reciprocal rank fusion technology to provide the most relevant results.
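Reciprocal rank fusion (RRF) merges ranked lists from different retrievers by scoring each document as a sum of reciprocal ranks. A minimal sketch of the standard technique (the file names and the constant k=60 here are illustrative, not taken from ctx-sys):

```python
def reciprocal_rank_fusion(rankings, k=60):
    """Fuse ranked result lists: each doc scores sum of 1/(k + rank)."""
    scores = {}
    for ranking in rankings:
        for rank, doc in enumerate(ranking, start=1):
            scores[doc] = scores.get(doc, 0.0) + 1.0 / (k + rank)
    # Highest fused score first
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical results from the three retrievers
vector_results = ["auth.py", "db.py", "utils.py"]
keyword_results = ["db.py", "auth.py", "readme.md"]
graph_results = ["auth.py", "models.py"]

fused = reciprocal_rank_fusion([vector_results, keyword_results, graph_results])
# "auth.py" ranks first: it appears near the top of all three lists
```

Documents that rank well across several retrievers rise to the top even if no single retriever ranked them first, which is why RRF is a common fusion choice for hybrid search.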
Local-first architecture
All processing happens locally; your code never leaves your machine. Embeddings and summaries are generated with Ollama, keeping your data private and secure.
Code-aware parsing
Uses the tree-sitter AST parser to extract functions, classes, imports, and relationships, truly understanding the code structure rather than simple text matching.
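ctx-sys uses tree-sitter for multi-language parsing; as a simplified illustration of the same idea (AST-based symbol extraction rather than text matching), Python's built-in ast module can pull functions, classes, and imports out of source code:

```python
import ast

source = '''
import os

def load_config(path):
    return {}

class Indexer:
    def index(self, files):
        pass
'''

tree = ast.parse(source)
# Walk the syntax tree and collect symbols by node type
functions = [n.name for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)]
classes = [n.name for n in ast.walk(tree) if isinstance(n, ast.ClassDef)]
imports = [a.name for n in ast.walk(tree)
           if isinstance(n, ast.Import) for a in n.names]
# functions: ['load_config', 'index']; classes: ['Indexer']; imports: ['os']
```

A structural index like this knows that `index` is a method of `Indexer`, something a plain keyword search over text cannot distinguish.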
MCP compatibility
Compatible with any client that supports the Model Context Protocol, including mainstream AI programming assistants such as Claude Desktop, Claude Code, and Cursor.
Multi-language support
Supports multiple programming languages such as TypeScript/JavaScript, Python, Rust, Go, Java, C/C++, C#, and document formats such as Markdown and HTML.
Conversation memory management
Manages conversation sessions, stores messages, and tracks decisions, allowing the AI assistant to remember previous conversation content.
Advantages
Complete data localization to protect code privacy and security.
Deep understanding of code structure, not just text matching.
Support for multiple programming languages and document formats.
Seamless integration with mainstream AI programming assistants.
Intelligent retrieval combines multiple technologies for more accurate results.
Support for conversation memory and project status management.
Limitations
Requires installing and running Ollama locally, which consumes system resources.
Indexing a large codebase for the first time may take a long time.
Requires certain technical knowledge for configuration and troubleshooting.
Currently mainly targeted at developers and technical teams.
Relatively limited support for non-code documents.
How to use
Install ctx-sys and Ollama
First, install the ctx-sys global package, then install and start the Ollama service. Ollama generates the code embeddings and summaries.
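A sketch of this step, assuming ctx-sys is distributed as an npm global package; the package and model names are assumptions, so check the official documentation for the exact values:

```shell
npm install -g ctx-sys        # hypothetical package name
ollama serve &                # start the Ollama service in the background
ollama pull nomic-embed-text  # pull an embedding model (model choice is an assumption)
```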
Initialize and index the project
Enter your project directory, run the initialization command to create a configuration file, and then run the indexing command to analyze your codebase.
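The steps above might look like the following; the subcommand names are hypothetical, based on the described workflow rather than the actual CLI:

```shell
cd ~/projects/my-app
ctx-sys init     # hypothetical: create a configuration file in the project root
ctx-sys index    # hypothetical: parse and index the codebase
```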
Configure the AI assistant connection
Add ctx-sys as an MCP server to your AI assistant configuration. The configuration methods for different assistants vary slightly.
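For Claude Desktop, MCP servers are registered under the `mcpServers` key of `claude_desktop_config.json`. A sketch, assuming the server is launched with a `ctx-sys serve` command (the command and arguments are assumptions; consult the official documentation):

```json
{
  "mcpServers": {
    "ctx-sys": {
      "command": "ctx-sys",
      "args": ["serve"]
    }
  }
}
```

Other MCP clients such as Cursor use a similar JSON structure in their own configuration files.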
Start using the AI assistant
Restart your AI assistant. Now it can access your codebase and answer questions about the project structure, implementation details, etc.
Usage examples
Understand complex project structure
When you join a new project, you can use ctx-sys to help the AI assistant quickly understand the project architecture and key components.
Find the implementation of a specific function
When you need to find the implementation code of a specific function (such as user authentication), ctx-sys can help quickly locate the relevant code.
Debugging and error troubleshooting
When encountering an error, you can use ctx-sys to find the relevant error handling code and possible solutions.
Code refactoring suggestions
When refactoring code, ctx-sys can help the AI assistant understand code dependencies and avoid breaking existing functionality.
Frequently Asked Questions
Will ctx-sys send my code to the cloud?
What kind of hardware configuration do I need?
Which AI programming assistants are supported?
How long does the indexing process take?
How to update the indexed code?
Can documents and comments be indexed?
Related resources
Official documentation
Complete ctx-sys documentation, including detailed configuration guides and API references.
GitHub repository
Source code, issue tracking, and contribution guidelines.
Technical white paper
In-depth understanding of the architecture design and technical implementation of ctx-sys.
Ollama official website
Local large language model running environment, a dependency component of ctx-sys.
Model Context Protocol
MCP protocol specification. ctx-sys communicates with AI assistants based on this protocol.
Community discussion
Communicate with other users, share experiences, and solve problems.

Gitlab MCP Server
Certified
The GitLab MCP server is a project based on the Model Context Protocol that provides a comprehensive toolset for interacting with GitLab accounts, including code review, merge request management, CI/CD configuration, and other functions.
TypeScript
24.4K
4.3 points

Notion Api MCP
Certified
A Python-based MCP Server that provides advanced to-do list management and content organization functions through the Notion API, enabling seamless integration between AI models and Notion.
Python
20.4K
4.5 points

Duckduckgo MCP Server
Certified
The DuckDuckGo Search MCP Server provides web search and content scraping services for LLMs such as Claude.
Python
71.8K
4.3 points

Markdownify MCP
Markdownify is a multi-functional file conversion service that supports converting multiple formats such as PDFs, images, audio, and web page content into Markdown format.
TypeScript
34.3K
5 points

Unity
Certified
UnityMCP is a Unity editor plugin that implements the Model Context Protocol (MCP), providing seamless integration between Unity and AI assistants, including real-time state monitoring, remote command execution, and logging functions.
C#
31.1K
5 points

Figma Context MCP
Framelink Figma MCP Server is a server that provides access to Figma design data for AI programming tools (such as Cursor). By simplifying the Figma API response, it helps AI more accurately achieve one-click conversion from design to code.
TypeScript
65.4K
4.5 points

Gmail MCP Server
A Gmail automatic authentication MCP server designed for Claude Desktop, supporting Gmail management through natural language interaction, including complete functions such as sending emails, label management, and batch operations.
TypeScript
21.0K
4.5 points

Context7
Context7 MCP is a service that provides real-time, version-specific documentation and code examples for AI programming assistants. It is directly integrated into prompts through the Model Context Protocol to solve the problem of LLMs using outdated information.
TypeScript
96.8K
4.7 points
