ZenML Integration MCP Server

Developer Tools · Python
Connect to ZenML MLOps and LLMOps pipelines
Available Tools

get_users

Retrieve information about users in the ZenML server

get_stacks

Retrieve information about stacks in the ZenML server

get_pipelines

Retrieve information about pipelines in the ZenML server

get_pipeline_runs

Retrieve information about pipeline runs in the ZenML server

get_pipeline_steps

Retrieve information about pipeline steps in the ZenML server

get_services

Retrieve information about services in the ZenML server

get_stack_components

Retrieve information about stack components in the ZenML server

get_flavors

Retrieve information about flavors in the ZenML server

get_run_templates

Retrieve information about pipeline run templates in the ZenML server

get_schedules

Retrieve information about schedules in the ZenML server

get_artifacts

Retrieve metadata about data artifacts in the ZenML server

get_service_connectors

Retrieve information about service connectors in the ZenML server

get_step_code

Retrieve code for a specific pipeline step

get_step_logs

Retrieve logs for a specific pipeline step (if run on a cloud-based stack)

most_recent_runs

Retrieve the most recent pipeline runs from the ZenML server

trigger_pipeline_run

Trigger a new pipeline run using an existing run template

The ZenML Integration provides a seamless connection between AI assistants and your ZenML MLOps platform. Access information about your ML pipelines, runs, artifacts, and services directly through AI interfaces like Claude Desktop or Cursor. This integration enables you to monitor, analyze, and even trigger new pipeline runs without leaving your AI assistant.

Overview

The ZenML Integration allows AI assistants to interact with your ZenML MLOps platform through the Model Context Protocol (MCP). This integration provides access to critical information about your machine learning pipelines, runs, artifacts, and services, enabling you to monitor and manage your ML workflows directly through AI interfaces.

Prerequisites

Before setting up the ZenML Integration, you'll need:

  1. Access to a ZenML Cloud server (sign up for a free trial at ZenML Cloud if needed)
  2. The uv package manager installed locally (install it via the official installer script, or with Homebrew on macOS)
  3. A local copy of the ZenML MCP repository

Installation

Step 1: Clone the Repository

First, clone the ZenML MCP repository to your local machine:

git clone https://github.com/zenml-io/mcp-zenml.git

Step 2: Configure Your MCP File

Create an MCP configuration file with the following structure:

{
    "mcpServers": {
        "zenml": {
            "command": "/path/to/uv",
            "args": ["run", "/full/path/to/zenml_server.py"],
            "env": {
                "LOGLEVEL": "INFO",
                "NO_COLOR": "1",
                "PYTHONUNBUFFERED": "1",
                "PYTHONIOENCODING": "UTF-8",
                "ZENML_STORE_URL": "https://your-zenml-server-url.com",
                "ZENML_STORE_API_KEY": "your-api-key-here"
            }
        }
    }
}

Replace the following placeholders:

  • /path/to/uv: The full path to your uv executable
  • /full/path/to/zenml_server.py: The full path to the zenml_server.py file in the cloned repository
  • https://your-zenml-server-url.com: Your ZenML server URL (e.g., https://d534d987a-zenml.cloudinfra.zenml.io)
  • your-api-key-here: Your ZenML API key (consider using a service account for this purpose)
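A common failure mode is leaving one of these placeholders in place or using a relative path. As a sanity check before restarting your AI assistant, you can run a small script like the following against your config. The check_mcp_config helper is purely illustrative (it is not part of ZenML or MCP):

```python
import os

# Hypothetical helper (not part of ZenML or MCP): flags the most common
# mistakes in the config above -- unreplaced placeholders and relative paths.
def check_mcp_config(config: dict) -> list:
    problems = []
    server = config["mcpServers"]["zenml"]
    env = server.get("env", {})
    for value in [server["command"], *server.get("args", [])]:
        if value.startswith(("/path/to", "/full/path")):
            problems.append("placeholder path not replaced: " + value)
    if not os.path.isabs(server["command"]):
        problems.append("command is not an absolute path: " + server["command"])
    if "your-zenml-server-url" in env.get("ZENML_STORE_URL", ""):
        problems.append("ZENML_STORE_URL still contains the placeholder")
    if env.get("ZENML_STORE_API_KEY") in (None, "", "your-api-key-here"):
        problems.append("ZENML_STORE_API_KEY is empty or still the placeholder")
    return problems

# Running the check against the unedited template reports every placeholder:
template = {
    "mcpServers": {
        "zenml": {
            "command": "/path/to/uv",
            "args": ["run", "/full/path/to/zenml_server.py"],
            "env": {
                "ZENML_STORE_URL": "https://your-zenml-server-url.com",
                "ZENML_STORE_API_KEY": "your-api-key-here",
            },
        }
    }
}
for problem in check_mcp_config(template):
    print("WARNING:", problem)
```

An empty result means the obvious mistakes are ruled out; it does not verify that the URL or API key are actually valid.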

Step 3: Set Up with Your AI Assistant

For Claude Desktop:

  1. Install Claude Desktop
  2. Open Claude Desktop and go to Settings → Developer
  3. Click "Edit Config" to open your configuration file
  4. Paste your MCP configuration into this file
  5. Restart Claude Desktop

For a better experience with tool outputs, go to Settings → Profile and add the following to your preferences:

When using zenml tools which return JSON strings and you're asked a question, you might want to consider using markdown tables to summarize the results or make them easier to view!

For Cursor:

  1. Install Cursor
  2. In your repository, create a .cursor folder
  3. Inside this folder, create a mcp.json file with your MCP configuration
  4. Open Cursor settings and enable the ZenML server
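The Cursor steps above can be sketched from a shell, assuming you run the commands from your repository root. The configuration body is the same template from Step 2; replace the paths, URL, and API key with your real values:

```shell
# Run from the root of your repository.
mkdir -p .cursor

# Write the MCP configuration (same template as in Step 2).
cat > .cursor/mcp.json <<'EOF'
{
    "mcpServers": {
        "zenml": {
            "command": "/path/to/uv",
            "args": ["run", "/full/path/to/zenml_server.py"],
            "env": {
                "ZENML_STORE_URL": "https://your-zenml-server-url.com",
                "ZENML_STORE_API_KEY": "your-api-key-here"
            }
        }
    }
}
EOF

# Confirm the file is valid JSON before opening Cursor.
python3 -m json.tool .cursor/mcp.json > /dev/null && echo "mcp.json OK"
```

If the final command does not print "mcp.json OK", fix the JSON syntax before enabling the server in Cursor's settings.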

Usage

Once configured, you can interact with your ZenML server through your AI assistant. You can ask questions about:

  • Users and teams
  • Stacks and stack components
  • Pipelines and pipeline runs
  • Pipeline steps and their logs
  • Services and service connectors
  • Artifacts and their metadata
  • Schedules and pipeline run templates

You can also trigger new pipeline runs if you have run templates configured.

Troubleshooting

  • If you encounter connection issues, verify your ZenML server URL and API key
  • Ensure the paths to uv and zenml_server.py are correct and absolute
  • Check that your ZenML API key has the necessary permissions
  • For Cursor, the server might show a red error indicator even when it is working correctly; test it by asking about your ZenML resources

Feedback and Support

This integration is an experimental beta release. Join the ZenML Slack community to share your experience and help improve the integration.

About Model Context Protocol

Model Context Protocol (MCP) allows AI models to access external tools and services, extending their capabilities beyond their training data.
