LLM Context MCP Server

Developer Tools · Python
Share code with LLMs via Model Context Protocol or clipboard with smart file selection and rule-based customization
Available Tools

  • lc-init: Initialize project configuration
  • lc-set-rule: Switch between different rules for context generation
  • lc-sel-files: Select files for inclusion in context
  • lc-sel-outlines: Select files for outline generation
  • lc-context: Generate and copy context to clipboard
  • lc-prompt: Generate project instructions for LLMs
  • lc-clip-files: Process LLM file requests
  • lc-changed: List files modified since last context generation
  • lc-outlines: Generate outlines for code files
  • lc-clip-implementations: Extract code implementations requested by LLMs
  • lc-mcp: MCP server for direct LLM integration

LLM Context is a powerful tool that helps developers efficiently share relevant code and text from their projects with Large Language Models. It leverages smart file selection using .gitignore patterns and provides both a streamlined clipboard workflow and direct LLM integration through the Model Context Protocol (MCP). The tool offers rule-based customization that enables easy switching between different tasks like code review and documentation. It includes smart code outlining to help LLMs understand project structure and supports extracting specific code implementations on demand. Whether you're using Claude Desktop via MCP or any other LLM chat interface via clipboard, LLM Context optimizes your AI-assisted development workflow.

Overview

LLM Context helps developers quickly inject relevant content from code or text projects into Large Language Model chat interfaces. It provides two main workflows:

  1. Direct LLM Integration via Model Context Protocol (MCP) with Claude Desktop
  2. Clipboard Workflow for use with any LLM chat interface

The tool is particularly effective for projects that fit within an LLM's context window and works well with interfaces that have persistent context like Claude Projects and Custom GPTs.

Installation

Install LLM Context using uv:

uv tool install "llm-context>=0.3.0"

To upgrade to the latest version:

uv tool upgrade llm-context
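
To confirm the installation, you can list the tools managed by uv; llm-context should appear together with its installed version:

uv tool list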

Using with Claude Desktop (MCP)

To integrate LLM Context with Claude Desktop via the Model Context Protocol:

  1. Add the following configuration to your 'claude_desktop_config.json' file:
{
  "mcpServers": {
    "CyberChitta": {
      "command": "uvx",
      "args": ["--from", "llm-context", "lc-mcp"]
    }
  }
}
  2. Start working with your project by either:
    • Saying to Claude: "I would like to work with my project"
    • Or directly specifying: "I would like to work with my project /path/to/your/project"
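
Note: the location of claude_desktop_config.json depends on your platform; it is typically ~/Library/Application Support/Claude/claude_desktop_config.json on macOS and %APPDATA%\Claude\claude_desktop_config.json on Windows.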

For optimal results, combine initial context through Claude's Project Knowledge UI with dynamic code access via MCP. This provides both comprehensive understanding and access to the latest changes.

CLI Workflow

For using LLM Context with any LLM chat interface:

  1. Navigate to your project's root directory

  2. Initialize repository: lc-init (only needed once)

  3. Select files: lc-sel-files

  4. (Optional) Review selected files in .llm-context/curr_ctx.yaml

  5. Generate context: lc-context (with optional flags: -p for prompt, -u for user notes)

  6. Paste the generated context into your preferred LLM interface:

    • Project Knowledge (Claude Pro): Paste into knowledge section
    • GPT Knowledge (Custom GPTs): Paste into knowledge section
    • Regular chats: Use lc-context -p to include instructions
  7. When the LLM requests additional files:

    • Copy the file list from the LLM
    • Run lc-clip-files
    • Paste the contents back to the LLM
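
As a concrete example, a first session in a new project might look like the following (the project path is illustrative):

cd /path/to/your/project
lc-init
lc-sel-files
lc-context -p

Then paste the clipboard contents into your LLM chat and continue from step 6 above.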

Core Commands

  • lc-init: Initialize project configuration
  • lc-set-rule <name>: Switch rules (system rules are prefixed with "lc-")
  • lc-sel-files: Select files for inclusion
  • lc-sel-outlines: Select files for outline generation
  • lc-context [-p] [-u] [-f FILE]: Generate and copy context (see the example after this list)
    • -p: Include prompt instructions
    • -u: Include user notes
    • -f FILE: Write to output file
  • lc-prompt: Generate project instructions for LLMs
  • lc-clip-files: Process LLM file requests
  • lc-changed: List files modified since last context generation
  • lc-outlines: Generate outlines for code files
  • lc-clip-implementations: Extract code implementations requested by LLMs (doesn't support C/C++)
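
For example, to include prompt instructions and user notes and also write the generated context to a file (the file name is arbitrary, and the flags listed above should be combinable in a single invocation):

lc-context -p -u -f context.md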

Advanced Features

Rule-Based Customization

LLM Context uses a rules system based on Markdown files with YAML front matter, which allows you to create different profiles for different use cases:

  • System rules (prefixed with "lc-") provide default functionality
  • User-defined rules can be created independently or extend existing rules

Rules can be switched using the lc-set-rule command.
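
As a rough illustration only, a user-defined rule is a Markdown file whose YAML front matter describes the rule and whose body carries instructions for the LLM; the field name below is a placeholder rather than the real schema, which is documented in the User Guide:

---
description: Focus reviews on correctness and style
---
Review the selected files and point out bugs, unclear naming, and missing tests.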

Smart File Selection

The tool uses .gitignore patterns for intelligent file selection, making it easy to include only relevant files in your context.
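
In practice, this means paths already excluded by your .gitignore (build outputs, dependency directories, editor caches, and so on) are left out of the selected file set automatically.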

Code Navigation Features

  1. Smart Code Outlines: Automatically generates outlines highlighting important definitions to help LLMs understand your codebase structure
  2. Definition Implementation Extraction: Extract full implementations of specific definitions requested by LLMs after they review code outlines
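
A typical outline-first session might look like this sketch, assuming lc-clip-implementations reads the LLM's request from the clipboard in the same way lc-clip-files does:

lc-sel-files
lc-sel-outlines
lc-context -p

When the LLM asks to see specific definitions, copy its request, run lc-clip-implementations, and paste the result back into the chat.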

For more detailed documentation, refer to the User Guide.
