
Minima MCP Server

Knowledge & Memory · Python
On-premises conversational RAG with configurable containers
Available Tools

minima_query

Query your local documents using the Minima RAG system.

Parameter: query
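
For illustration, an MCP tools/call request for this tool could carry a payload like the one below. The question text is a made-up example; only the tool name and the query parameter come from this page:

{
  "name": "minima_query",
  "arguments": {
    "query": "What does our onboarding document say about VPN access?"
  }
}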

Minima is an open-source Retrieval-Augmented Generation (RAG) system that operates on-premises through configurable containers. It allows users to query their local documents using various LLM interfaces while maintaining data privacy and security. The system supports multiple operation modes, including fully local installation, ChatGPT integration, and Anthropic Claude integration through MCP.

Overview

Minima is a powerful on-premises RAG system that lets you search and query your local documents using AI. It offers three distinct operation modes to suit different needs:

  1. Isolated Installation: Run completely on-premises with containers, free from external dependencies. All neural networks (LLM, reranker, embedding) operate on your cloud or PC, ensuring your data remains secure.

  2. ChatGPT Integration: Query your local documents from the ChatGPT app or web interface through a custom GPT. The indexer runs on your cloud or local PC, while ChatGPT serves as the primary LLM.

  3. Anthropic Claude Integration: Use the Anthropic Claude app to query your local documents via MCP. The indexer runs on your local PC, while Claude serves as the primary LLM.

Installation

Prerequisites

  • Docker and Docker Compose
  • Python 3.10 or higher (for MCP usage)
  • uv package installed (for MCP usage)

Setting Up Environment Variables

  1. Create a .env file in the project's root directory based on the provided .env.sample.
  2. Configure the following variables:
LOCAL_FILES_PATH=/path/to/your/documents/
EMBEDDING_MODEL_ID=sentence-transformers/all-mpnet-base-v2
EMBEDDING_SIZE=768

For fully local installation, also add:

OLLAMA_MODEL=qwen2:0.5b
RERANKER_MODEL=BAAI/bge-reranker-base

For ChatGPT integration, add:

USER_ID=your-email@example.com
PASSWORD=your-password

Installation Methods

Docker Compose (Manual Installation)

Choose one of the following commands based on your preferred mode:

  • Fully Local Installation:

    docker compose -f docker-compose-ollama.yml --env-file .env up --build
    
  • ChatGPT Integration:

    docker compose -f docker-compose-chatgpt.yml --env-file .env up --build
    
  • MCP Integration (for Anthropic Claude):

    docker compose -f docker-compose-mcp.yml --env-file .env up --build
    

Smithery Installation (for MCP usage)

For automatic installation with Claude Desktop:

npx -y @smithery/cli install minima --client claude

Accessing Minima

  • Fully Local Installation: Change into the electron directory (cd electron), then run npm install and npm start to launch the Minima Electron app.
  • Web Interface: Access the chat UI at http://localhost:3000
  • ChatGPT Integration: Copy the one-time password (OTP) from the terminal where you launched Docker and use it with the Minima GPT.
  • Claude Integration: Configure Claude Desktop by adding Minima to your Claude configuration file.

Configuration Details

Environment Variables Explained

  • LOCAL_FILES_PATH: Root folder for indexing (recursive). Supports .pdf, .xls, .docx, .txt, .md, .csv files.
  • EMBEDDING_MODEL_ID: Embedding model to use (Sentence Transformer models).
  • EMBEDDING_SIZE: Embedding dimension provided by the model.
  • OLLAMA_MODEL: Ollama LLM model ID (for local installation).
  • RERANKER_MODEL: Reranker model (BAAI rerankers recommended).
  • USER_ID: Email for ChatGPT authentication.
  • PASSWORD: Password for ChatGPT authentication.
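
As a quick sanity check before launching a container, the per-mode requirements above can be sketched in Python. The groupings below are inferred from this page's setup instructions, and missing_vars is a hypothetical helper, not part of Minima:

```python
import os

# Variables each mode needs, as inferred from the setup instructions above.
REQUIRED = {
    "local":   ["LOCAL_FILES_PATH", "EMBEDDING_MODEL_ID", "EMBEDDING_SIZE",
                "OLLAMA_MODEL", "RERANKER_MODEL"],
    "chatgpt": ["LOCAL_FILES_PATH", "EMBEDDING_MODEL_ID", "EMBEDDING_SIZE",
                "USER_ID", "PASSWORD"],
    "mcp":     ["LOCAL_FILES_PATH", "EMBEDDING_MODEL_ID", "EMBEDDING_SIZE"],
}

def missing_vars(mode: str, env=None) -> list[str]:
    """Return the required variables that are unset or empty for the given mode."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED[mode] if not env.get(name)]
```

Running missing_vars("local") against your shell environment before docker compose up can catch a forgotten variable early.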

Claude Desktop Configuration

To use Minima with Claude Desktop, add the following to your Claude configuration file, located on macOS at ~/Library/Application Support/Claude/claude_desktop_config.json:

{
  "mcpServers": {
    "minima": {
      "command": "uv",
      "args": [
        "--directory",
        "/path_to_cloned_minima_project/mcp-server",
        "run",
        "minima"
      ]
    }
  }
}

Replace /path_to_cloned_minima_project with the actual path to your cloned Minima repository.
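
If you script your Claude Desktop setup, the entry above can be generated rather than hand-edited. A minimal sketch: the add_minima helper is hypothetical, and only the JSON shape comes from this page:

```python
import json
from copy import deepcopy

def add_minima(config: dict, project_dir: str) -> dict:
    """Return a copy of the claude_desktop_config.json contents with the
    Minima server entry from above inserted under mcpServers."""
    cfg = deepcopy(config)
    cfg.setdefault("mcpServers", {})["minima"] = {
        "command": "uv",
        "args": ["--directory", f"{project_dir}/mcp-server", "run", "minima"],
    }
    return cfg

# Example with a hypothetical checkout path:
print(json.dumps(add_minima({}, "/Users/me/src/minima"), indent=2))
```

Using deepcopy keeps any servers already present in the file untouched while adding the minima entry.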

Supported File Types

Minima can index and search through the following file types:

  • PDF (.pdf)
  • Excel (.xls)
  • Word (.docx)
  • Text (.txt)
  • Markdown (.md)
  • CSV (.csv)

All documents within the specified LOCAL_FILES_PATH folder and its subfolders will be indexed automatically.
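
The selection rule above can be sketched as a short Python helper. This mirrors the documented behavior (recursive walk, extension filter) but is not Minima's actual indexer code, and find_indexable is a hypothetical name:

```python
from pathlib import Path

# File extensions Minima indexes, per the list above.
INDEXABLE = {".pdf", ".xls", ".docx", ".txt", ".md", ".csv"}

def find_indexable(root: str) -> list[Path]:
    """Recursively collect files under LOCAL_FILES_PATH that Minima would index."""
    return sorted(
        p for p in Path(root).rglob("*")
        if p.is_file() and p.suffix.lower() in INDEXABLE
    )
```

Running this against your planned LOCAL_FILES_PATH shows at a glance which files the indexer will pick up.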

Related MCPs

Knowledge Graph Memory
Knowledge & Memory · TypeScript

A persistent memory system using a local knowledge graph

MemoryMesh
Knowledge & Memory · TypeScript

A knowledge graph server for structured memory persistence in AI models

Cognee
Knowledge & Memory · Python

Knowledge management and retrieval system with code graph capabilities

About Model Context Protocol

Model Context Protocol (MCP) allows AI models to access external tools and services, extending their capabilities beyond their training data.
