Prometheus Metrics MCP Server

Monitoring · Python
Query and analyze Prometheus metrics through standardized interfaces
Available Tools

list_metrics

Lists available metrics in the Prometheus server, optionally filtered by a pattern

get_metric_metadata

Retrieves metadata for a specific metric

query

Executes an instant PromQL query at a specific time

query_range

Executes a PromQL query over a range of time with a specified step interval
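
The four tools mirror Prometheus's standard HTTP API. The sketch below is illustrative only (it is not part of the server's code) and assumes an unauthenticated Prometheus instance at http://localhost:9090, queried with the requests library:

import requests

PROM = "http://localhost:9090"  # assumed local, unauthenticated Prometheus

# list_metrics: metric names are the values of the __name__ label
names = requests.get(f"{PROM}/api/v1/label/__name__/values").json()["data"]

# get_metric_metadata: type, help text, and unit for a single metric
meta = requests.get(f"{PROM}/api/v1/metadata", params={"metric": "up"}).json()["data"]

# query: instant PromQL evaluation at a single point in time
instant = requests.get(f"{PROM}/api/v1/query",
                       params={"query": "up", "time": "2024-01-01T00:00:00Z"}).json()

# query_range: PromQL evaluated over a window at a fixed step interval
window = requests.get(f"{PROM}/api/v1/query_range",
                      params={"query": "rate(http_requests_total[5m])",
                              "start": "2024-01-01T00:00:00Z",
                              "end": "2024-01-01T01:00:00Z",
                              "step": "60s"}).json()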

Prometheus MCP Server provides AI assistants with direct access to your Prometheus metrics and query capabilities. This integration allows AI systems to execute PromQL queries, discover available metrics, and analyze time-series data through standardized Model Context Protocol interfaces. With support for authentication methods including basic auth and bearer tokens, this server enables secure access to your monitoring data. The containerized deployment options make it easy to integrate with existing infrastructure while maintaining isolation and security.

Overview

Prometheus MCP Server enables AI assistants to interact with your Prometheus metrics system through a standardized interface. This allows AI systems to query metrics, analyze trends, and help troubleshoot issues by accessing your monitoring data.

Installation

You can install and run the Prometheus MCP Server in either of two ways:

Method 1: Local Installation with UV

  1. Clone the repository:

    git clone https://github.com/pab1it0/prometheus-mcp-server.git
    cd prometheus-mcp-server
    
  2. Install dependencies using UV:

    curl -LsSf https://astral.sh/uv/install.sh | sh
    uv venv
    source .venv/bin/activate  # On Unix/macOS
    .venv\Scripts\activate     # On Windows
    uv pip install -e .
    
  3. Create a .env file based on the template:

    PROMETHEUS_URL=http://your-prometheus-server:9090
    
    # Optional authentication (use as needed)
    PROMETHEUS_USERNAME=your_username
    PROMETHEUS_PASSWORD=your_password
    # OR
    PROMETHEUS_TOKEN=your_token
    
    # Optional for multi-tenant setups
    ORG_ID=your_organization_id
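
Before wiring the server into a client, it can be useful to confirm that the values in .env actually reach your Prometheus instance. The snippet below is a hypothetical standalone check (not part of the repository) and assumes the variables from .env have been exported into your shell environment; it calls Prometheus's build-info endpoint using the requests library:

import os
import requests

url = os.environ["PROMETHEUS_URL"]

# Use basic auth if configured in the environment, otherwise connect anonymously
auth = None
if os.getenv("PROMETHEUS_USERNAME"):
    auth = (os.environ["PROMETHEUS_USERNAME"], os.environ["PROMETHEUS_PASSWORD"])

resp = requests.get(f"{url}/api/v1/status/buildinfo", auth=auth, timeout=5)
resp.raise_for_status()
print("Connected to Prometheus", resp.json()["data"]["version"])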
    

Method 2: Using Docker

You can use the pre-built Docker image:

docker pull ghcr.io/pab1it0/prometheus-mcp-server:latest

Or build it locally:

docker build -t prometheus-mcp-server .

Run the container with your environment variables:

docker run -it --rm \
  -e PROMETHEUS_URL=http://your-prometheus-server:9090 \
  -e PROMETHEUS_USERNAME=your_username \
  -e PROMETHEUS_PASSWORD=your_password \
  ghcr.io/pab1it0/prometheus-mcp-server:latest

Configuration

The server is configured through the following environment variables:

  • PROMETHEUS_URL: The URL of your Prometheus server (required)
  • Authentication (optional, choose one if needed):
    • Basic auth: PROMETHEUS_USERNAME and PROMETHEUS_PASSWORD
    • Bearer token: PROMETHEUS_TOKEN
  • For multi-tenant setups (optional): ORG_ID
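
How these options could translate into outgoing requests is sketched below. This is an illustration rather than the server's actual code; in particular, sending ORG_ID as the X-Scope-OrgID header follows the convention of multi-tenant Prometheus-compatible backends (Cortex, Mimir, Thanos) and is an assumption to verify against the project's documentation:

import os

def build_request_options() -> dict:
    """Sketch: map the environment variables above onto requests-style options."""
    options = {"auth": None, "headers": {}}

    # Basic auth and bearer token are alternatives; set only one of them.
    if os.getenv("PROMETHEUS_USERNAME") and os.getenv("PROMETHEUS_PASSWORD"):
        options["auth"] = (os.environ["PROMETHEUS_USERNAME"],
                           os.environ["PROMETHEUS_PASSWORD"])
    elif os.getenv("PROMETHEUS_TOKEN"):
        options["headers"]["Authorization"] = f"Bearer {os.environ['PROMETHEUS_TOKEN']}"

    # Assumed header for multi-tenant setups (Cortex/Mimir/Thanos convention)
    if os.getenv("ORG_ID"):
        options["headers"]["X-Scope-OrgID"] = os.environ["ORG_ID"]

    return options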

Client Configuration

To use this MCP server with an AI assistant like Claude Desktop, add the server configuration to your client configuration file.

For Local Installation:

{
  "mcpServers": {
    "prometheus": {
      "command": "uv",
      "args": [
        "--directory",
        "/full/path/to/prometheus-mcp-server",
        "run",
        "src/prometheus_mcp_server/main.py"
      ],
      "env": {
        "PROMETHEUS_URL": "http://your-prometheus-server:9090",
        "PROMETHEUS_USERNAME": "your_username",
        "PROMETHEUS_PASSWORD": "your_password"
      }
    }
  }
}

If you encounter Error: spawn uv ENOENT in Claude Desktop, you may need to specify the full path to uv or set NO_UV=1 in the configuration.

For Docker Installation:

{
  "mcpServers": {
    "prometheus": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "-i",
        "-e", "PROMETHEUS_URL",
        "-e", "PROMETHEUS_USERNAME",
        "-e", "PROMETHEUS_PASSWORD",
        "ghcr.io/pab1it0/prometheus-mcp-server:latest"
      ],
      "env": {
        "PROMETHEUS_URL": "http://your-prometheus-server:9090",
        "PROMETHEUS_USERNAME": "your_username",
        "PROMETHEUS_PASSWORD": "your_password"
      }
    }
  }
}

Usage

Once configured, your AI assistant can interact with your Prometheus metrics using natural language. The assistant can:

  • Execute PromQL queries
  • List available metrics
  • Get metadata for specific metrics
  • View instant query results
  • View range query results with different step intervals

Simply ask the assistant questions about your metrics or request specific queries to be executed.
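
For the last two points, the difference between an instant and a range query is the shape of the result: an instant query returns one sample per series, while a range query returns a list of [timestamp, value] pairs spaced by the step interval. The snippet below is an illustration against the Prometheus HTTP API directly (assuming an unauthenticated server at http://localhost:9090 and an http_requests_total metric), showing how the step changes the resolution:

import requests
from datetime import datetime, timedelta, timezone

PROM = "http://localhost:9090"  # assumed local, unauthenticated Prometheus
end = datetime.now(timezone.utc)
start = end - timedelta(hours=1)

# The same query over the same hour at 1-minute and 5-minute resolution
for step in ("60s", "300s"):
    data = requests.get(f"{PROM}/api/v1/query_range", params={
        "query": "rate(http_requests_total[5m])",
        "start": start.isoformat(),
        "end": end.isoformat(),
        "step": step,
    }).json()["data"]["result"]
    # Each series carries [timestamp, value] pairs spaced `step` apart
    samples = len(data[0]["values"]) if data else 0
    print(f"step={step}: {samples} samples per series")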

Related MCPs

Axiom Query
Monitoring · Go

Query your Axiom data using APL (Axiom Processing Language)

Sentry Issue Analyzer
Monitoring · Python

Retrieve and analyze issues from Sentry.io

Raygun
Monitoring · TypeScript

Access and manage Raygun crash reporting and user monitoring data

About Model Context Protocol

Model Context Protocol (MCP) allows AI models to access external tools and services, extending their capabilities beyond their training data.
