Jinni
Jinni is an efficient tool for providing project context to large language models. Instead of having the model read files one at a time, it bundles the relevant files together with their metadata.
3.5 points
9.2K

What is Jinni?

Jinni is an intelligent project context extraction tool designed specifically for AI programming assistants. It automatically scans your code project, extracts all relevant file contents, and provides them to large language models (LLMs) in a structured form, giving the AI a clear picture of the project as a whole.

How to use Jinni?

Simply configure your AI development environment (such as Cursor, Roo, or Claude), and Jinni works in the background. When the AI needs to understand the project, it automatically obtains the complete code context.

Applicable scenarios

It is particularly suitable for maintaining complex codebases, helping new team members get familiar with a project quickly, and carrying out cross-file refactoring. It lets the AI assistant understand your project the way a senior developer would.

Main features

Intelligent context extraction
Automatically identifies and extracts the project's source code files while excluding irrelevant content such as binaries and logs
Seamless MCP integration
Integrates deeply with mainstream AI development tools through the Model Context Protocol
Custom filtering rules
Supports .gitignore-style rules to precisely control which files are included or excluded (see the example after this list)
Complete metadata
Provides metadata such as file path, size, and modification time to help understand file relationships
Seamless WSL support
Automatically handles Windows/WSL path conversion for worry-free cross-platform development
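
The custom filtering rules mentioned above use .gitignore-style syntax. The sketch below is illustrative only; the exact rule file Jinni reads and its precedence semantics are defined in the project's documentation, and the file names and patterns here are hypothetical.

# Skip generated and vendored content
node_modules/
dist/
*.log
# A leading ! negates a pattern, as in .gitignore
!dist/openapi.json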
Advantages
Get a complete project view with one click without manually viewing each file
The intelligent filtering mechanism ensures that only relevant code files are included
Ready to use with mainstream AI programming tools
Supports large-scale projects and automatically handles path and encoding issues
Limitations
For extremely large projects, you may need to adjust the inclusion rules to avoid overly long context
First-time setup requires some simple configuration of your development environment
Binary/non-text files are not included by default (this can be changed through configuration)

How to use

Install Jinni
Install the Jinni package via pip or uv
Configure the MCP client
Add Jinni server settings to the configuration file of your AI development tool
Specify the project root directory (optional)
For security, you can restrict Jinni to access only specific directories (see the configuration sketch after this list)
Start using
Restart your development environment, and now the AI can request to read the project context
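
For the optional root restriction in step 3, here is a minimal configuration sketch. It assumes that the --root flag shown in the Installation section below can also be passed as an argument to the uvx-launched server; the path is a placeholder.

{
    "mcpServers": {
        "jinni": {
            "command": "uvx",
            // Placeholder path: replace with the absolute path to your project root
            "args": ["jinni-server", "--root", "/absolute/path/to/project"]
        }
    }
}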

Usage examples

New members getting familiar with the project
Newly joined developers use AI to quickly understand the project structure and main modules
Cross-file refactoring
Ensure consistency when refactoring code that affects multiple files
Dependency analysis
Sort out the module dependencies in a complex project

Frequently Asked Questions

Why can't the AI sometimes see some of my files?
How to limit the directory scope that Jinni can access?
How to handle Windows/WSL paths?
What should I do if the project is too large and the context exceeds the limit?

Related resources

Official GitHub repository
Get the latest version and source code
MCP protocol documentation
Understand the technical details of the Model Context Protocol
Cursor integration guide
How to configure Jinni in Cursor

Installation

Copy the following configuration into your MCP client
{
    "mcpServers": {
        "jinni": {
            "command": "uvx",
            "args": ["jinni-server"]
        }
    }
}

{
    "mcpServers": {
        "jinni": {
            // Adjust the Python path if needed, or ensure the correct environment is active
            "command": "python",
            "args": ["-m", "jinni.server"]
            // Optionally constrain the server to read only within one tree (recommended for security):
            // "args": ["-m", "jinni.server", "--root", "/absolute/path/to/repo"]
        }
    }
}

Alternatives

Klavis
Klavis AI is an open-source project that provides a simple and easy-to-use MCP (Model Context Protocol) service on Slack, Discord, and Web platforms. It includes functions such as report generation, YouTube tools, and document conversion, and lets both non-technical users and developers use AI workflows.
TypeScript
8.1K
5 points
MCP
The official Microsoft MCP server gives AI assistants search and access to the latest Microsoft technical documentation
9.9K
5 points
Aderyn
Aderyn is an open-source Solidity smart contract static analysis tool written in Rust, which helps developers and security researchers discover vulnerabilities in Solidity code. It supports Foundry and Hardhat projects, can generate reports in multiple formats, and provides a VSCode extension.
Rust
5.9K
5 points
Devtools Debugger MCP
The Node.js Debugger MCP server provides complete debugging capabilities based on the Chrome DevTools protocol, including breakpoint setting, stepping execution, variable inspection, and expression evaluation.
TypeScript
5.4K
4 points
Scrapling
Scrapling is an adaptive web scraping library that can automatically learn website changes and re-locate elements. It supports multiple scraping methods and AI integration, providing high-performance parsing and a developer-friendly experience.
Python
8.9K
5 points
Mcpjungle
MCPJungle is a self-hosted MCP gateway used to centrally manage and proxy multiple MCP servers, providing a unified tool access interface for AI agents.
Go
0
4.5 points
Cipher
Cipher is an open-source memory layer framework designed for programming AI agents. It integrates with various IDEs and AI coding assistants through the MCP protocol, providing core functions such as automatic memory generation, team memory sharing, and dual-system memory management.
TypeScript
0
5 points
Nexus
Nexus is an AI tool aggregation gateway that supports connecting multiple MCP servers and LLM providers, providing tool search, execution, and model routing functions through a unified endpoint, and supporting security authentication and rate limiting.
Rust
0
4 points
Gitlab MCP Server
Certified
The GitLab MCP server is a project based on the Model Context Protocol that provides a comprehensive toolset for interacting with GitLab accounts, including code review, merge request management, CI/CD configuration, and other functions.
TypeScript
16.6K
4.3 points
Notion Api MCP
Certified
A Python-based MCP Server that provides advanced to-do list management and content organization functions through the Notion API, enabling seamless integration between AI models and Notion.
Python
14.8K
4.5 points
Markdownify MCP
Markdownify is a multi-functional file conversion service that supports converting multiple formats such as PDFs, images, audio, and web page content into Markdown format.
TypeScript
23.5K
5 points
Duckduckgo MCP Server
Certified
The DuckDuckGo Search MCP Server provides web search and content scraping services for LLMs such as Claude.
Python
45.0K
4.3 points
Figma Context MCP
Framelink Figma MCP Server is a server that provides access to Figma design data for AI programming tools (such as Cursor). By simplifying the Figma API response, it helps AI more accurately achieve one-click conversion from design to code.
TypeScript
45.4K
4.5 points
Unity
Certified
UnityMCP is a Unity editor plugin that implements the Model Context Protocol (MCP), providing seamless integration between Unity and AI assistants, including real-time state monitoring, remote command execution, and log functions.
C#
19.2K
5 points
Gmail MCP Server
A Gmail automatic authentication MCP server designed for Claude Desktop, supporting Gmail management through natural language interaction, including complete functions such as sending emails, label management, and batch operations.
TypeScript
14.8K
4.5 points
Minimax MCP Server
The MiniMax Model Context Protocol (MCP) is an official server that supports interaction with powerful text-to-speech, video/image generation APIs, and is suitable for various client tools such as Claude Desktop and Cursor.
Python
29.3K
4.8 points