Unichat MCP Server provides a unified interface to multiple AI providers through the Model Context Protocol, letting you send requests to OpenAI, MistralAI, Anthropic, xAI, Google AI, DeepSeek, Alibaba, and Inception from a single consistent interface. Requests go through either a general-purpose tool or predefined prompts for common code-related tasks. The server requires an API key for the selected provider to function.
You can install Unichat MCP Server in two ways.

For manual installation, add the server to your Claude Desktop configuration file:

On macOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
"mcpServers": {
"unichat-mcp-server": {
"command": "uvx",
"args": [
"unichat-mcp-server"
],
"env": {
"UNICHAT_MODEL": "SELECTED_UNICHAT_MODEL",
"UNICHAT_API_KEY": "YOUR_UNICHAT_API_KEY"
}
}
}
For automatic installation via Smithery:
npx -y @smithery/cli install unichat-mcp-server --client claude
You need to configure the following environment variables:
UNICHAT_MODEL: The model you want to use. A list of supported models can be found in the unichat models.py file.
UNICHAT_API_KEY: Your API key for the selected model's provider.

Example configuration for using OpenAI's GPT-4o-mini:
"env": {
"UNICHAT_MODEL": "gpt-4o-mini",
"UNICHAT_API_KEY": "YOUR_OPENAI_API_KEY"
}
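
The env block in the configuration above is passed to the server process as ordinary environment variables. If you run the server outside Claude Desktop, a few lines of Python (shown here only as a sanity check, not part of the package) confirm the variables are visible to the process:

import os

# The env block from claude_desktop_config.json ends up as plain environment
# variables in the server process; these are the two the server expects.
for name in ("UNICHAT_MODEL", "UNICHAT_API_KEY"):
    status = "set" if os.environ.get(name) else "MISSING"
    print(f"{name}: {status}")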
Once installed, you can use Unichat through the unichat tool, which sends messages to the configured AI provider.
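
If you want to exercise the server outside Claude Desktop, you can drive it over stdio with the MCP Python SDK. The sketch below is illustrative: the command, environment variables, and model name mirror the configuration above, but the messages-style argument payload is an assumption; the schema printed by list_tools() is the authoritative reference.

import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the server the same way Claude Desktop does, passing the Unichat
# settings alongside the existing environment (uvx still needs PATH).
server_params = StdioServerParameters(
    command="uvx",
    args=["unichat-mcp-server"],
    env={
        **os.environ,
        "UNICHAT_MODEL": "gpt-4o-mini",
        "UNICHAT_API_KEY": "YOUR_OPENAI_API_KEY",
    },
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Print the advertised tools and their input schemas.
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, tool.inputSchema)

            # Call the unichat tool. The argument shape below is an assumption;
            # use the schema printed above as the source of truth.
            result = await session.call_tool(
                "unichat",
                arguments={
                    "messages": [
                        {"role": "system", "content": "You are a helpful assistant."},
                        {"role": "user", "content": "Say hello from Unichat."},
                    ]
                },
            )
            print(result)

asyncio.run(main())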
For debugging issues with the MCP server, you can use the MCP Inspector:
npx @modelcontextprotocol/inspector uv --directory PATH_TO_YOUR_PROJECT_DIRECTORY/unichat-mcp-server run unichat-mcp-server
This will display a URL you can access in your browser to begin debugging.