
Model Context Protocol CLI MCP Server

A powerful command-line interface for interacting with Model Context Protocol servers
Available Tools

chat

Start a conversational interface with direct LLM interaction and automated tool usage

interactive

Start a command-driven interface for direct server operations

cmd

Run in Unix-friendly mode for scriptable automation and pipelines

tools list

List all available tools from connected servers

tools call

Call a specific tool on a connected server

provider

Display or manage LLM providers and their configurations

model

Display or change the current LLM model

conversation

Show, filter, or export conversation history

toolhistory

Show history of tool calls in the current session

MCP CLI provides a feature-rich command-line interface for seamless communication with Large Language Models through the Model Context Protocol. Built on top of the CHUK-MCP protocol library, it offers multiple operational modes including chat, interactive, and command modes for different use cases. The CLI supports various LLM providers such as OpenAI, Anthropic, and Ollama, with robust tool management and conversation tracking capabilities.

Overview

The Model Context Protocol CLI (MCP CLI) is a comprehensive command-line interface designed to interact with MCP servers. It provides a bridge between users and Large Language Models (LLMs) through various operational modes, supporting multiple providers and offering advanced conversation management features.

Installation

Prerequisites

  • Python 3.11 or higher
  • API keys for providers you plan to use:
    • OpenAI: Set OPENAI_API_KEY environment variable
    • Anthropic: Set ANTHROPIC_API_KEY environment variable
    • Ollama: Local Ollama installation
  • Server configuration file (default: server_config.json)
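A minimal server_config.json might look like the following sketch. The sqlite entry and the uvx launcher are illustrative assumptions; substitute the command and arguments for the servers you actually run:

```json
{
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": ["mcp-server-sqlite", "--db-path", "./test.db"]
    }
  }
}
```

Each key under mcpServers becomes a name you can pass to --server.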

Install from Source

  1. Clone the repository:
git clone https://github.com/chrishayuk/mcp-cli
cd mcp-cli
  2. Install the package with development dependencies:
pip install -e ".[cli,dev]"
  3. Verify installation:
mcp-cli --help

Alternative Installation with UV

If you prefer using UV for dependency management:

# Install UV if not already installed
pip install uv

# Install dependencies
uv sync --reinstall

# Run using UV
uv run mcp-cli --help

Usage

Global Command-line Arguments

These options are available for all modes and commands:

  • --server: Specify the server(s) to connect to (comma-separated for multiple)
  • --config-file: Path to server configuration file (default: server_config.json)
  • --provider: LLM provider to use (openai, anthropic, ollama, default: openai)
  • --model: Specific model to use (provider-dependent defaults)
  • --disable-filesystem: Disable filesystem access (defaults to true, so filesystem access is off unless overridden)

Operational Modes

1. Chat Mode

Chat mode provides a natural language interface for interacting with LLMs:

# Default mode
mcp-cli

# Explicit chat mode
mcp-cli chat --server sqlite

# With specific provider and model
mcp-cli chat --server sqlite --provider openai --model gpt-4o

2. Interactive Mode

Interactive mode provides a command-driven shell interface:

mcp-cli interactive --server sqlite

3. Command Mode

Command mode provides a Unix-friendly interface for automation:

mcp-cli cmd --server sqlite [options]
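As a sketch of the Unix-style pattern: because command mode reads input and writes plain output, it composes with standard pipes and redirection. The --prompt and --output flags below are assumptions for illustration; run mcp-cli cmd --help to confirm the exact option names:

```
# Ask a one-shot question and write the reply to a file
# (flag names are illustrative; verify with `mcp-cli cmd --help`)
mcp-cli cmd --server sqlite --prompt "List all tables" --output tables.txt

# Compose with other tools via stdin/stdout
cat question.txt | mcp-cli cmd --server sqlite > answer.txt
```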

4. Direct Commands

Run individual commands without entering an interactive mode:

# List available tools
mcp-cli tools list --server sqlite

# Call a specific tool
mcp-cli tools call --server sqlite

Chat Mode Commands

In chat mode, use these slash commands:

General Commands

  • /help: Show available commands
  • /help <command>: Show detailed help for a specific command
  • /quickhelp or /qh: Display a quick reference of common commands
  • exit or quit: Exit chat mode

Provider and Model Commands

  • /provider or /p: Display or manage LLM providers
  • /model or /m: Display or change the current model
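For example, during a chat session you might switch the active provider and model with these slash commands (the provider and model names here are illustrative):

```
/provider ollama
/model llama3.2
```

Running /provider or /model with no argument displays the current setting.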

Tool Commands

  • /tools: Display all available tools with their server information
  • /toolhistory or /th: Show history of tool calls in the current session

Conversation Commands

  • /conversation or /ch: Show the conversation history
  • /save <filename>: Save the current conversation to a file

Troubleshooting

CLI Argument Format Issue

If you encounter a "Missing argument 'KWARGS'" error, use one of these approaches:

  1. Use the equals sign format:
mcp-cli tools call --server=sqlite
mcp-cli chat --server=sqlite --provider=ollama --model=llama3.2
  2. Add a double-dash (--) after the command:
mcp-cli tools call -- --server sqlite
mcp-cli chat -- --server sqlite --provider ollama --model llama3.2

Advanced Features

Multiple Provider Support

The CLI supports various LLM providers:

  • OpenAI: gpt-4o-mini, gpt-4o, gpt-4-turbo, etc.
  • Ollama: llama3.2, qwen2.5-coder, etc.
  • Anthropic: claude-3-opus, claude-3-sonnet, etc.

You can configure multiple providers and switch between them during sessions.

Conversation Management

  • Track complete conversation history
  • Filter and view specific message ranges
  • Export conversations to JSON for debugging or analysis
  • Compact conversations to reduce token usage

Tool System

  • Automatic discovery of server-provided tools
  • Server-aware tool execution
  • Tool call history tracking and analysis
  • Support for complex, multi-step tool chains

