MCP CLI is a feature-rich command-line interface for seamless communication with Large Language Models through the Model Context Protocol (MCP). Built on top of the CHUK-MCP protocol library, it bridges users and MCP servers through multiple operational modes (chat, interactive, and command) for different use cases, and supports several LLM providers (OpenAI, Anthropic, and Ollama) with robust tool management and conversation tracking.

Key commands:

- `chat`: Start a conversational interface with direct LLM interaction and automated tool usage
- `interactive`: Start a command-driven interface for direct server operations
- `cmd`: Run in Unix-friendly mode for scriptable automation and pipelines
- `tools list`: List all available tools from connected servers
- `tools call`: Call a specific tool on a connected server
- `provider`: Display or manage LLM providers and their configurations
- `model`: Display or change the current LLM model
- `conversation`: Show, filter, or export conversation history
- `toolhistory`: Show the history of tool calls in the current session
To run the CLI you will need:

- For OpenAI: the `OPENAI_API_KEY` environment variable
- For Anthropic: the `ANTHROPIC_API_KEY` environment variable
- A server configuration file (default: `server_config.json`)

Clone the repository, install the package, and verify the installation:

```bash
git clone https://github.com/chrishayuk/mcp-cli
cd mcp-cli
pip install -e ".[cli,dev]"
mcp-cli --help
```
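Before running against a hosted provider, export the corresponding API key (Ollama runs locally and needs no key):

```bash
# Placeholder values; substitute your real keys.
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
```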
If you prefer using UV for dependency management:
```bash
# Install UV if not already installed
pip install uv

# Install dependencies
uv sync --reinstall

# Run using UV
uv run mcp-cli --help
```
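The CLI also needs to know which MCP servers to launch. A minimal `server_config.json` sketch, assuming the standard MCP server-configuration format; the `sqlite` entry and its `uvx` launch command are illustrative:

```bash
# Write a minimal config with a single SQLite MCP server entry.
cat > server_config.json <<'EOF'
{
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": ["mcp-server-sqlite", "--db-path", "test.db"]
    }
  }
}
EOF
```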
These options are available for all modes and commands:
- `--server`: Specify the server(s) to connect to (comma-separated for multiple)
- `--config-file`: Path to the server configuration file (default: `server_config.json`)
- `--provider`: LLM provider to use (`openai`, `anthropic`, or `ollama`; default: `openai`)
- `--model`: Specific model to use (provider-dependent defaults)
- `--disable-filesystem`: Disable filesystem access (default: true)

Chat mode provides a natural language interface for interacting with LLMs:
```bash
# Default mode
mcp-cli

# Explicit chat mode
mcp-cli chat --server sqlite

# With specific provider and model
mcp-cli chat --server sqlite --provider openai --model gpt-4o
```
Interactive mode provides a command-driven shell interface:
```bash
mcp-cli interactive --server sqlite
```
Command mode provides a Unix-friendly interface for automation:
```bash
mcp-cli cmd --server sqlite [options]
```
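Because command mode is built for pipelines, its output composes with ordinary shell tools. A sketch using only the global options documented above; it assumes `cmd` writes its result to stdout (check `mcp-cli cmd --help` for the mode-specific flags):

```bash
# Run a one-shot command and save the result for later processing.
mcp-cli cmd --server sqlite --provider openai --model gpt-4o-mini > result.txt
wc -l result.txt
```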
Run individual commands without entering an interactive mode:
```bash
# List available tools
mcp-cli tools list --server sqlite

# Call a specific tool
mcp-cli tools call --server sqlite
```
In chat mode, use these slash commands:

- `/help`: Show available commands
- `/help <command>`: Show detailed help for a specific command
- `/quickhelp` or `/qh`: Display a quick reference of common commands
- `exit` or `quit`: Exit chat mode
- `/provider` or `/p`: Display or manage LLM providers
- `/model` or `/m`: Display or change the current model
- `/tools`: Display all available tools with their server information
- `/toolhistory` or `/th`: Show history of tool calls in the current session
- `/conversation` or `/ch`: Show the conversation history
- `/save <filename>`: Save the current conversation to a file

If you encounter a "Missing argument 'KWARGS'" error, use one of these approaches:

Use `=` syntax for options:

```bash
mcp-cli tools call --server=sqlite
mcp-cli chat --server=sqlite --provider=ollama --model=llama3.2
```

Or add a double dash (`--`) after the command:

```bash
mcp-cli tools call -- --server sqlite
mcp-cli chat -- --server sqlite --provider ollama --model llama3.2
```
The CLI supports various LLM providers:

- OpenAI: `gpt-4o-mini`, `gpt-4o`, `gpt-4-turbo`, etc.
- Ollama: `llama3.2`, `qwen2.5-coder`, etc.
- Anthropic: `claude-3-opus`, `claude-3-sonnet`, etc.

You can configure multiple providers and switch between them during sessions.