MCPHost is a powerful CLI application that serves as a host in the Model Context Protocol (MCP) architecture, enabling Large Language Models to interact with external tools and services. It supports multiple AI models including Claude 3.5 Sonnet, Ollama models, Google Gemini, and OpenAI-compatible models, providing a unified interface for tool discovery and integration. With MCPHost, language models can access external tools and data sources, maintain consistent context across interactions, and execute commands safely. The application supports multiple concurrent MCP servers, dynamic tool discovery, and configurable message history for context management.
MCPHost is a command-line interface (CLI) application that enables Large Language Models (LLMs) to interact with external tools through the Model Context Protocol (MCP). It acts as a host in the MCP client-server architecture, managing connections between LLMs and tool servers.
Before installing MCPHost, ensure you have:

- Go installed (MCPHost is installed via `go install`)
- An API key for your chosen cloud provider (Anthropic, Google, or an OpenAI-compatible service), or a local Ollama installation if you plan to use local models
Install MCPHost using Go:

```bash
go install github.com/mark3labs/mcphost@latest
```
Set up the necessary environment variables based on your chosen model:

```bash
# For Anthropic (Claude) models
export ANTHROPIC_API_KEY='your-api-key'

# For Google Gemini models
export GOOGLE_API_KEY='your-api-key'
```

For Ollama models, pull the model you want and make sure the Ollama server is running:

```bash
ollama pull mistral
ollama serve
```
MCPHost automatically creates a configuration file at `~/.mcp.json` if it doesn't exist. You can specify a custom location using the `--config` flag.
MCPHost supports two types of MCP servers: local servers launched as subprocesses (declared with `command` and `args`, as in the first example below) and remote servers reached over HTTP via Server-Sent Events (declared with `url` and optional `headers`, as in the second example). Both are listed under the `mcpServers` key.
```json
{
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": [
        "mcp-server-sqlite",
        "--db-path",
        "/tmp/foo.db"
      ]
    },
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/tmp"
      ]
    }
  }
}
```
```json
{
  "mcpServers": {
    "server_name": {
      "url": "http://some_host:8000/sse",
      "headers": [
        "Authorization: Bearer my-token"
      ]
    }
  }
}
```
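Because `mcpServers` is a JSON object keyed by server name, both kinds of entries can live in the same config file. A minimal sketch in Python that writes such a combined file (the server name `remote` is illustrative; the entries mirror the examples above):

```python
import json
import os
import tempfile

# Combine a local (command/args) server and an SSE (url/headers) server
# in one MCPHost config; the shape mirrors the examples above.
config = {
    "mcpServers": {
        "sqlite": {
            "command": "uvx",
            "args": ["mcp-server-sqlite", "--db-path", "/tmp/foo.db"],
        },
        "remote": {  # hypothetical name for the SSE server
            "url": "http://some_host:8000/sse",
            "headers": ["Authorization: Bearer my-token"],
        },
    }
}

# Write to a temp path for demonstration; point --config at this file,
# or write to ~/.mcp.json to use the default location.
path = os.path.join(tempfile.gettempdir(), "mcp.json")
with open(path, "w") as f:
    json.dump(config, f, indent=2)
```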
You can specify a custom system prompt using the `--system-prompt` flag. Create a JSON file with your instructions:
```json
{
  "systemPrompt": "You're a helpful assistant specialized in data analysis."
}
```
Then use it with:

```bash
mcphost --system-prompt ./my-system-prompt.json
```
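A quick way to sanity-check the prompt file before passing it to `mcphost` (a sketch in Python, assuming only that the file contains a top-level `systemPrompt` string as shown above; `check_system_prompt` is a hypothetical helper, not part of MCPHost):

```python
import json

def check_system_prompt(path: str) -> str:
    """Validate a system prompt file of the shape shown above."""
    with open(path) as f:
        data = json.load(f)
    prompt = data.get("systemPrompt")
    if not isinstance(prompt, str) or not prompt.strip():
        raise ValueError(f"{path}: missing or empty 'systemPrompt' string")
    return prompt
```

Running this before launching MCPHost catches malformed JSON or a misspelled key early, with a clearer error than the CLI might give.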
MCPHost supports various models that can be specified using the `--model` (or `-m`) flag:

- `anthropic:claude-3-5-sonnet-latest`
- `openai:gpt-4`
- `ollama:modelname`
- `google:gemini-2.0-flash`
Example commands:

```bash
# Use Ollama with Qwen model
mcphost -m ollama:qwen2.5:3b

# Use OpenAI's GPT-4
mcphost -m openai:gpt-4

# Use OpenAI-compatible model
mcphost --model openai:<your-model-name> \
  --openai-url <your-base-url> \
  --openai-api-key <your-api-key>
```
MCPHost offers several flags for advanced configuration:

- `--anthropic-url string`: Base URL for the Anthropic API (defaults to api.anthropic.com)
- `--anthropic-api-key string`: Anthropic API key
- `--config string`: Config file location (default is `$HOME/.mcp.json`)
- `--system-prompt string`: System prompt file location
- `--debug`: Enable debug mode for troubleshooting

For questions or discussions about MCPHost, join the community on Discord.