This MCP server integrates Claude with OpenAI's language models. It acts as a bridge that lets Claude make direct API calls to OpenAI models such as GPT-4, so you can use both AI platforms side by side: compare responses, draw on each platform's specialized capabilities, or build hybrid workflows that combine the two without leaving your Claude environment.
The OpenAI MCP Server allows you to query OpenAI models directly from Claude using the Model Context Protocol (MCP). This integration enables you to leverage OpenAI's models alongside Claude, creating a powerful multi-model workflow.
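Conceptually, the server is a thin wrapper: it registers an OpenAI query as an MCP tool, and Claude calls that tool over the protocol. The sketch below is not the repository's implementation; it assumes the official `mcp` Python SDK's FastMCP helper and the `openai` client, and the tool name is illustrative.

```python
# Conceptual sketch of the bridge -- not the repository's actual code.
from mcp.server.fastmcp import FastMCP
from openai import OpenAI

mcp = FastMCP("openai-server")
client = OpenAI()  # reads OPENAI_API_KEY from the environment


@mcp.tool()
def ask_openai(query: str, model: str = "gpt-4") -> str:
    """Forward a prompt to an OpenAI model and return its reply."""
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": query}],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    mcp.run()  # serves over stdio so Claude Desktop can spawn it
```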
git clone https://github.com/pierrebrunelle/mcp-server-openai
cd mcp-server-openai
pip install -e .
Add the following configuration to your claude_desktop_config.json file:
{
  "mcpServers": {
    "openai-server": {
      "command": "python",
      "args": ["-m", "src.mcp_server_openai.server"],
      "env": {
        "PYTHONPATH": "/path/to/your/mcp-server-openai",
        "OPENAI_API_KEY": "your-openai-api-key-here"
      }
    }
  }
}
Make sure to:
- Replace /path/to/your/mcp-server-openai with the actual path to where you cloned the repository
- Replace your-openai-api-key-here with your actual OpenAI API key

Once configured, you can use the OpenAI MCP Server directly from Claude. The server handles the communication between Claude and OpenAI's API, allowing you to query OpenAI models and receive their responses without leaving your conversation.
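If you want to exercise the server end to end outside Claude, you could drive it with the MCP Python client. This is a sketch under assumptions: it uses the official `mcp` SDK's stdio client, and the tool name "ask-openai" is a guess, so list the tools first and adjust.

```python
# Illustrative client-side check -- assumes the official `mcp` SDK;
# the tool name "ask-openai" is a guess, so the script lists tools first.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="python",
    args=["-m", "src.mcp_server_openai.server"],
    env={
        "PYTHONPATH": "/path/to/your/mcp-server-openai",
        "OPENAI_API_KEY": "your-openai-api-key-here",
    },
)


async def main():
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [t.name for t in tools.tools])
            result = await session.call_tool("ask-openai", {"query": "Say hello"})
            print(result.content)


asyncio.run(main())
```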
To verify that your installation is working correctly, you can run the included tests:
# Run tests from project root
pytest -v test_openai.py -s
A successful test will show output similar to:
Testing OpenAI API call...
OpenAI Response: Hello! I'm doing well, thank you for asking...
PASSED
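The repository's test file is not reproduced here, but a minimal test along these lines would exercise the same path; the prompt, model name, and structure are assumptions, not the actual contents of test_openai.py.

```python
# test_openai.py -- illustrative only; the repository's real test may differ.
import os

import pytest
from openai import OpenAI


@pytest.mark.skipif(not os.environ.get("OPENAI_API_KEY"), reason="OPENAI_API_KEY not set")
def test_openai_api_call():
    """Send a trivial prompt to confirm the API key and network path work."""
    print("Testing OpenAI API call...")
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": "Hello, how are you?"}],
    )
    answer = response.choices[0].message.content
    print(f"OpenAI Response: {answer}")
    assert answer  # a non-empty reply means the round trip succeeded
```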
If you encounter issues: