
Wolfram Alpha MCP Server

Knowledge & Memory · Python
Connect your chat to Wolfram Alpha's computational intelligence
Available Tools

query_wolfram_alpha

Query the Wolfram Alpha API with a question or computation and return the results

Parameters: query (the question or computation to send to Wolfram Alpha)

This MCP server provides a direct connection to the Wolfram Alpha API, allowing your AI assistant to leverage Wolfram's powerful computational knowledge engine. It enables your assistant to perform complex calculations, solve mathematical problems, access scientific data, and answer factual queries with precision by tapping into Wolfram Alpha's vast knowledge base. Similar to the "!wa" bang command in DuckDuckGo search, this integration enhances your assistant's capabilities with accurate, computation-based responses for technical, scientific, and mathematical questions.
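Under the hood, the tool's query maps onto a call to Wolfram Alpha's Full Results API. The following is a minimal, illustrative sketch of that kind of request, not the server's actual implementation; it assumes the requests library, an API key in the WOLFRAM_API_KEY environment variable, and an illustrative function name:

import os
import requests

def ask_wolfram(query: str) -> str:
    """Send a query to the Wolfram Alpha Full Results API and return plain-text pods."""
    response = requests.get(
        "https://api.wolframalpha.com/v2/query",
        params={
            "appid": os.environ["WOLFRAM_API_KEY"],  # your Wolfram Alpha AppID
            "input": query,
            "output": "json",
        },
        timeout=30,
    )
    response.raise_for_status()
    queryresult = response.json()["queryresult"]
    # Each pod (e.g. "Result", "Decimal approximation") holds plain-text subpods.
    lines = []
    for pod in queryresult.get("pods", []):
        for subpod in pod.get("subpods", []):
            if subpod.get("plaintext"):
                lines.append(f"{pod['title']}: {subpod['plaintext']}")
    return "\n".join(lines)

print(ask_wolfram("integrate x^2 sin(x)"))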

Overview

The Wolfram Alpha MCP server connects your AI assistant to Wolfram Alpha's computational intelligence engine, providing access to a vast knowledge base for mathematical, scientific, and factual queries.

Prerequisites

Before using this MCP server, you need to:

  1. Obtain a Wolfram Alpha API key (AppID) from the Wolfram Alpha Developer Portal
  2. Have Python installed on your system

Installation

Option 1: Install from PyPI

pip install mcp-wolfram-alpha

Option 2: Install from Source

  1. Clone the repository:
git clone https://github.com/SecretiveShell/MCP-wolfram-alpha.git
cd MCP-wolfram-alpha
  2. Install the package:
pip install -e .

Option 3: Use Docker

docker pull ghcr.io/secretiveshell/mcp-wolfram-alpha:latest
docker run -e WOLFRAM_API_KEY=your-api-key -p 8000:8000 ghcr.io/secretiveshell/mcp-wolfram-alpha:latest

Configuration

You must set the WOLFRAM_API_KEY environment variable to your Wolfram Alpha API key. This MCP server was tested with the Full Results API, but a simpler Wolfram Alpha API might also work.

Add the following configuration to your MCP client (Claude Desktop, Cursor, etc.):

{
    "mcpServers": {
        "MCP-wolfram-alpha": {
            "command": "uv",
            "args": [
                "--directory",
                "PATH_TO_YOUR_PROJECT_DIRECTORY",
                "run",
                "MCP-wolfram-alpha"
            ],
            "env": {
                "WOLFRAM_API_KEY": "your-app-id"
            }
        }
    }
}

Replace PATH_TO_YOUR_PROJECT_DIRECTORY with the full path to where you installed the MCP server, and your-app-id with your actual Wolfram Alpha API key.
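If you installed the package from PyPI (Option 1) rather than from source, you can skip the project directory and launch the installed package directly, for example with uvx. The entry-point name below is an assumption based on the PyPI package name; verify what your installation actually exposes:

{
    "mcpServers": {
        "MCP-wolfram-alpha": {
            "command": "uvx",
            "args": ["mcp-wolfram-alpha"],
            "env": {
                "WOLFRAM_API_KEY": "your-app-id"
            }
        }
    }
}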

Usage

Once configured, your AI assistant can use Wolfram Alpha to answer computational questions. You can prompt your assistant with queries like:

  • "Calculate the derivative of x^2 * sin(x)"
  • "What is the distance from Earth to Mars right now?"
  • "Solve the equation 3x^2 + 2x - 5 = 0"
  • "What is the boiling point of mercury?"
  • "Convert 150 pounds to kilograms"

The assistant will use the Wolfram Alpha API to provide accurate, computation-based answers.
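You can also exercise the tool programmatically, without an AI assistant in the loop. The sketch below assumes the official mcp Python SDK and the same uv-based launch shown in the Configuration section; the directory placeholder and query are illustrative:

import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch the server over stdio, mirroring the MCP client configuration above.
    server = StdioServerParameters(
        command="uv",
        args=["--directory", "PATH_TO_YOUR_PROJECT_DIRECTORY", "run", "MCP-wolfram-alpha"],
        # Pass the environment through so WOLFRAM_API_KEY (and PATH, for uv) reach the server.
        env=dict(os.environ),
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "query_wolfram_alpha",
                arguments={"query": "Solve the equation 3x^2 + 2x - 5 = 0"},
            )
            for item in result.content:
                if item.type == "text":
                    print(item.text)

asyncio.run(main())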

Debugging

For debugging purposes, you can use the mcp-cli inspector tool (@wong2/mcp-cli):

  1. Create a config.json file with your server configuration
  2. Run:
npx @wong2/mcp-cli -c ./config.json

This allows you to test the MCP server directly without going through an AI assistant.
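mcp-cli accepts the same mcpServers layout used above, so a minimal config.json can look like this (same placeholders as before):

{
    "mcpServers": {
        "MCP-wolfram-alpha": {
            "command": "uv",
            "args": ["--directory", "PATH_TO_YOUR_PROJECT_DIRECTORY", "run", "MCP-wolfram-alpha"],
            "env": {
                "WOLFRAM_API_KEY": "your-app-id"
            }
        }
    }
}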

Related MCPs

Knowledge Graph Memory
Knowledge & Memory · TypeScript

A persistent memory system using a local knowledge graph

MemoryMesh
Knowledge & Memory · TypeScript

A knowledge graph server for structured memory persistence in AI models

Cognee
Knowledge & Memory · Python

Knowledge management and retrieval system with code graph capabilities

About Model Context Protocol

Model Context Protocol (MCP) allows AI models to access external tools and services, extending their capabilities beyond their training data.
