Relays a question to the configured AI Chat Provider and returns the response.
This MCP server connects Claude to any OpenAI SDK-compatible chat completion API, including OpenAI, Perplexity, Groq, xAI, PyroPrompts, and more. It implements the Model Context Protocol, so Claude Desktop and other MCP-compatible clients such as LibreChat can communicate with external AI services through a standardized interface, letting you query different models for specific tasks without leaving your Claude environment.
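Under the hood, the relay is thin: each server instance reads its provider settings from the AI_CHAT_* environment variables described below, exposes a chat tool over the Model Context Protocol, and forwards the question to the provider's chat completions endpoint via the OpenAI SDK. The TypeScript sketch below is a simplified illustration of that flow, not the actual implementation; the tool name "chat", its input schema, and the fallback model are assumptions made for the example.

import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { CallToolRequestSchema, ListToolsRequestSchema } from "@modelcontextprotocol/sdk/types.js";
import OpenAI from "openai";

// Provider settings supplied through the MCP server configuration (env block).
const { AI_CHAT_KEY, AI_CHAT_NAME, AI_CHAT_MODEL, AI_CHAT_BASE_URL } = process.env;

// One OpenAI SDK client pointed at whichever OpenAI-compatible endpoint was configured.
const client = new OpenAI({ apiKey: AI_CHAT_KEY, baseURL: AI_CHAT_BASE_URL });

const server = new Server(
  { name: "any-chat-completions-sketch", version: "0.1.0" },
  { capabilities: { tools: {} } }
);

// Advertise a single relay tool (the name and schema here are illustrative).
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [
    {
      name: "chat",
      description: `Relays a question to ${AI_CHAT_NAME ?? "the configured provider"} and returns the response`,
      inputSchema: {
        type: "object",
        properties: {
          content: { type: "string", description: "The question to relay" },
        },
        required: ["content"],
      },
    },
  ],
}));

// Forward the question to the chat completions endpoint and return the first choice's text.
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  const question = String(request.params.arguments?.content ?? "");
  const completion = await client.chat.completions.create({
    model: AI_CHAT_MODEL ?? "gpt-4o", // fallback model is illustrative only
    messages: [{ role: "user", content: question }],
  });
  return {
    content: [{ type: "text", text: completion.choices[0]?.message?.content ?? "" }],
  };
});

// Serve over stdio so Claude Desktop (or any MCP client) can spawn and talk to it.
await server.connect(new StdioServerTransport());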
The easiest way to install this MCP server is using npx. You'll need to add the server configuration to your Claude Desktop config file:
On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
Add the following configuration to your claude_desktop_config.json file:
{
"mcpServers": {
"chat-openai": {
"command": "npx",
"args": [
"@pyroprompts/any-chat-completions-mcp"
],
"env": {
"AI_CHAT_KEY": "YOUR_OPENAI_API_KEY",
"AI_CHAT_NAME": "OpenAI",
"AI_CHAT_MODEL": "gpt-4o",
"AI_CHAT_BASE_URL": "https://api.openai.com/v1"
}
}
}
}
If you prefer to clone the repository and build it yourself:
Clone the repository:
git clone https://github.com/pyroprompts/any-chat-completions-mcp.git
Install dependencies:
npm install
Build the server:
npm run build
Add the server to your Claude Desktop configuration:
{
"mcpServers": {
"chat-openai": {
"command": "node",
"args": [
"/path/to/any-chat-completions-mcp/build/index.js"
],
"env": {
"AI_CHAT_KEY": "YOUR_OPENAI_API_KEY",
"AI_CHAT_NAME": "OpenAI",
"AI_CHAT_MODEL": "gpt-4o",
"AI_CHAT_BASE_URL": "https://api.openai.com/v1"
}
}
}
}
You can also install this MCP server automatically using Smithery:
npx -y @smithery/cli install any-chat-completions-mcp-server --client claude
The MCP server is configured through environment variables:
AI_CHAT_KEY: Your API key for the service
AI_CHAT_NAME: Display name for the service (e.g., "OpenAI", "Perplexity")
AI_CHAT_MODEL: The model to use (e.g., "gpt-4o", "sonar")
AI_CHAT_BASE_URL: Base URL for the API endpoint
You can configure multiple AI providers by adding multiple entries to your configuration, each pointing to the same MCP server but with different environment variables:
{
"mcpServers": {
"chat-pyroprompts": {
"command": "npx",
"args": [
"@pyroprompts/any-chat-completions-mcp"
],
"env": {
"AI_CHAT_KEY": "YOUR_PYROPROMPTS_KEY",
"AI_CHAT_NAME": "PyroPrompts",
"AI_CHAT_MODEL": "ash",
"AI_CHAT_BASE_URL": "https://api.pyroprompts.com/openaiv1"
}
},
"chat-perplexity": {
"command": "npx",
"args": [
"@pyroprompts/any-chat-completions-mcp"
],
"env": {
"AI_CHAT_KEY": "YOUR_PERPLEXITY_KEY",
"AI_CHAT_NAME": "Perplexity",
"AI_CHAT_MODEL": "sonar",
"AI_CHAT_BASE_URL": "https://api.perplexity.ai"
}
}
}
}
To use this MCP server with LibreChat, add the following to the mcpServers section of your librechat.yaml configuration:
chat-perplexity:
  type: stdio
  command: npx
  args:
    - -y
    - "@pyroprompts/any-chat-completions-mcp"
  env:
    AI_CHAT_KEY: "YOUR_PERPLEXITY_API_KEY"
    AI_CHAT_NAME: Perplexity
    AI_CHAT_MODEL: sonar
    AI_CHAT_BASE_URL: "https://api.perplexity.ai"
    PATH: '/usr/local/bin:/usr/bin:/bin'
Since MCP servers communicate over stdio, debugging can be challenging. You can use the MCP Inspector, available as a package script in the cloned repository:
npm run inspector
This will provide a URL to access debugging tools in your browser.
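As an alternative to the Inspector, you can also drive the server directly with a small MCP client over stdio, which is handy for confirming which tools it advertises before wiring it into Claude Desktop or LibreChat. The TypeScript sketch below uses the @modelcontextprotocol/sdk client; the environment values are placeholders, and PATH is passed through because the spawned npx process needs it (mirroring the LibreChat example above).

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server over stdio, exactly as an MCP client would.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["@pyroprompts/any-chat-completions-mcp"],
  env: {
    PATH: process.env.PATH ?? "",            // npx needs PATH when env is overridden
    AI_CHAT_KEY: "YOUR_OPENAI_API_KEY",      // placeholder credentials
    AI_CHAT_NAME: "OpenAI",
    AI_CHAT_MODEL: "gpt-4o",
    AI_CHAT_BASE_URL: "https://api.openai.com/v1",
  },
});

const client = new Client(
  { name: "debug-client", version: "0.1.0" },
  { capabilities: {} }
);

await client.connect(transport);

// Print the tools the server advertises; use these names with callTool() when testing further.
const { tools } = await client.listTools();
for (const tool of tools) {
  console.log(tool.name, "-", tool.description);
}

await client.close();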