- Initialize project configuration
- Switch between different rules for context generation
- Select files for inclusion in context
- Select files for outline generation
- Generate and copy context to clipboard
- Generate project instructions for LLMs
- Process LLM file requests
- List files modified since last context generation
- Generate outlines for code files
- Extract code implementations requested by LLMs
- MCP server for direct LLM integration
LLM Context is a powerful tool that helps developers efficiently share relevant code and text from their projects with Large Language Models. It leverages smart file selection using .gitignore patterns and provides both a streamlined clipboard workflow and direct LLM integration through the Model Context Protocol (MCP). The tool offers rule-based customization that enables easy switching between different tasks like code review and documentation. It includes smart code outlining to help LLMs understand project structure and supports extracting specific code implementations on demand. Whether you're using Claude Desktop via MCP or any other LLM chat interface via clipboard, LLM Context optimizes your AI-assisted development workflow.
LLM Context helps developers quickly inject relevant content from code or text projects into Large Language Model chat interfaces. It provides two main workflows: direct integration with Claude Desktop through the Model Context Protocol (MCP), and a clipboard-based workflow for any other LLM chat interface.
The tool is particularly effective for projects that fit within an LLM's context window and works well with interfaces that have persistent context like Claude Projects and Custom GPTs.
Install LLM Context using uv:

```sh
uv tool install "llm-context>=0.3.0"
```

To upgrade to the latest version:

```sh
uv tool upgrade llm-context
```
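To confirm the installation, uv's own tooling can list the installed tool and its executables (a quick check, not required):

```sh
uv tool list   # should show llm-context along with its lc-* executables
```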
To integrate LLM Context with Claude Desktop via the Model Context Protocol, add the following to your `claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "CyberChitta": {
      "command": "uvx",
      "args": ["--from", "llm-context", "lc-mcp"]
    }
  }
}
```
For optimal results, combine initial context through Claude's Project Knowledge UI with dynamic code access via MCP. This provides both comprehensive understanding and access to the latest changes.
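If Claude Desktop has trouble launching the server, one way to check that the configured command resolves on your machine is to run it directly. Note that an MCP server communicates over stdio, so when started by hand it will simply wait for input:

```sh
uvx --from llm-context lc-mcp   # same command as in the config above; stop with Ctrl+C
```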
For using LLM Context with any LLM chat interface:
1. Navigate to your project's root directory
2. Initialize the repository: `lc-init` (only needed once)
3. Select files: `lc-sel-files`
4. (Optional) Review the selected files in `.llm-context/curr_ctx.yaml`
5. Generate context: `lc-context` (with optional flags: `-p` for prompt, `-u` for user notes)
6. Paste the generated context into your preferred LLM interface; use `lc-context -p` to include instructions
7. When the LLM requests additional files, run `lc-clip-files` (see the session sketch below)
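Put together, a typical clipboard session might look like the following sketch (the project path is illustrative):

```sh
cd ~/projects/my-app   # illustrative project path
lc-init                # one-time setup; creates the .llm-context/ directory
lc-sel-files           # select files for inclusion
lc-context -p          # generate the context (with prompt instructions) and copy it to the clipboard
# ...paste into your LLM chat; when the LLM asks for additional files:
lc-clip-files          # process the LLM's file request
```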
- `lc-init`: Initialize project configuration
- `lc-set-rule <name>`: Switch rules (system rules are prefixed with "lc-")
- `lc-sel-files`: Select files for inclusion
- `lc-sel-outlines`: Select files for outline generation
- `lc-context [-p] [-u] [-f FILE]`: Generate and copy context
  - `-p`: Include prompt instructions
  - `-u`: Include user notes
  - `-f FILE`: Write to output file
- `lc-prompt`: Generate project instructions for LLMs
- `lc-clip-files`: Process LLM file requests
- `lc-changed`: List files modified since last context generation
- `lc-outlines`: Generate outlines for code files
- `lc-clip-implementations`: Extract code implementations requested by LLMs (doesn't support C/C++)

LLM Context uses a rules system based on Markdown files with YAML front matter, which lets you create different profiles for different use cases.
Rules can be switched using the `lc-set-rule` command.
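For example, switching to a system rule and regenerating the context might look like the following sketch (the rule name is illustrative; system rules are prefixed with "lc-"):

```sh
lc-set-rule lc-code            # illustrative system rule name
lc-sel-files                   # re-select files under the new rule
lc-sel-outlines                # optionally select files for outline generation
lc-context -u -f context.md    # include user notes and also write the context to context.md
```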
The tool uses `.gitignore` patterns for intelligent file selection, making it easy to include only relevant files in your context.
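As a rough illustration (the ignore entries are hypothetical), anything matched by your project's `.gitignore` stays out of the generated context:

```sh
# For a typical Node.js project, .gitignore might contain entries such as:
#   node_modules/
#   dist/
#   *.log
lc-sel-files   # selects only files not excluded by these patterns
```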
For more detailed documentation, refer to the User Guide.