
Deepseek Thinker MCP Server

Developer Tools · TypeScript
Access Deepseek's reasoning process and chain-of-thought capabilities
Available Tools

get-deepseek-thinker

Perform reasoning using the Deepseek model

Parameter: originPrompt (string), the user's original prompt to reason about
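
For illustration, a call to this tool passes a single argument; the prompt text below is only an example:

{
  "originPrompt": "Why does binary search run in O(log n) time?"
}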

Deepseek Thinker provides access to the reasoning capabilities of Deepseek's language models through the Model Context Protocol. It captures the model's chain-of-thought process, allowing AI clients like Claude Desktop to leverage Deepseek's reasoning abilities. The server supports two operational modes: connecting to the Deepseek API service with your API key, or running locally through Ollama for those who prefer to keep processing on their own machine. This flexibility makes it suitable for both cloud-based and privacy-focused workflows.

Overview

Deepseek Thinker is a Model Context Protocol (MCP) server that provides access to Deepseek's reasoning capabilities. It allows MCP-enabled AI clients like Claude Desktop to leverage Deepseek's chain-of-thought processes for enhanced reasoning.
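
Outside of Claude Desktop, the server can also be driven programmatically. Below is a minimal sketch using the @modelcontextprotocol/sdk TypeScript client; the API key, base URL, and prompt are placeholders, and the snippet assumes an ESM context where top-level await is available:

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server over stdio, the same way an MCP client such as Claude Desktop would.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "deepseek-thinker-mcp"],
  env: { API_KEY: "<Your API Key>", BASE_URL: "<Your Base URL>" },
});

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Ask Deepseek to reason about a prompt and print the returned chain-of-thought content.
const result = await client.callTool({
  name: "get-deepseek-thinker",
  arguments: { originPrompt: "Why does binary search run in O(log n) time?" },
});
console.log(result.content);

await client.close();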

Installation

You can install and run Deepseek Thinker in several ways:

Using npx (Recommended)

The simplest way to use Deepseek Thinker is through npx. Add the following configuration to your AI client's configuration file:

For OpenAI API Mode:

{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "npx",
      "args": [
        "-y",
        "deepseek-thinker-mcp"
      ],
      "env": {
        "API_KEY": "<Your API Key>",
        "BASE_URL": "<Your Base URL>"
      }
    }
  }
}
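
If you are pointing the server at Deepseek's own OpenAI-compatible endpoint, BASE_URL is typically https://api.deepseek.com; any other OpenAI-compatible gateway that serves a Deepseek reasoning model should also work.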

For Ollama Mode:

{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "npx",
      "args": [
        "-y",
        "deepseek-thinker-mcp"
      ],
      "env": {
        "USE_OLLAMA": "true"
      }
    }
  }
}
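
Ollama mode assumes a local Ollama instance is running and that a Deepseek reasoning model has already been pulled, for example:

    ollama pull deepseek-r1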

Local Installation

For a local installation, you'll need to:

  1. Clone the repository:

    git clone https://github.com/ruixingshi/deepseek-thinker-mcp.git
    
  2. Install dependencies:

    cd deepseek-thinker-mcp
    npm install
    
  3. Build the project:

    npm run build
    
  4. Configure your AI client to use the local installation:

    {
      "mcpServers": {
        "deepseek-thinker": {
          "command": "node",
          "args": [
            "/your-path/deepseek-thinker-mcp/build/index.js"
          ],
          "env": {
            "API_KEY": "<Your API Key>",
            "BASE_URL": "<Your Base URL>"
          }
        }
      }
    }
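
To sanity-check the build outside an MCP client, you can also start the server directly from the project directory; it speaks MCP over stdio, so it will simply wait for a client to connect (placeholders as above):

    API_KEY="<Your API Key>" BASE_URL="<Your Base URL>" node build/index.js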
    

Configuration

Environment Variables

  • API_KEY: Your API key for the OpenAI-compatible service (for example, your Deepseek API key); required for OpenAI API mode
  • BASE_URL: The base URL of the OpenAI-compatible API endpoint (required for OpenAI API mode)
  • USE_OLLAMA: Set to "true" to use Ollama mode instead of OpenAI API mode
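
Roughly speaking, these variables decide which backend the server talks to. The sketch below is illustrative only, not the server's actual source, and assumes the standard openai npm package plus Ollama's default OpenAI-compatible endpoint on localhost:11434:

import OpenAI from "openai";

// Illustrative mode selection: Ollama locally, otherwise the configured OpenAI-compatible API.
const useOllama = process.env.USE_OLLAMA === "true";

const client = new OpenAI({
  baseURL: useOllama ? "http://localhost:11434/v1" : process.env.BASE_URL,
  apiKey: useOllama ? "ollama" : process.env.API_KEY, // Ollama ignores the key
});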

Troubleshooting

Request Timeout Error

If you encounter an error like "MCP error -32001: Request timed out", this typically happens when:

  • The Deepseek API responds too slowly
  • The reasoning output is very long, causing the MCP request to time out

Try simplifying your query or checking your network connection if you encounter this issue.

Related MCPs

Apple Shortcuts
Developer Tools · JavaScript

Control Apple Shortcuts automations from AI assistants

Clojars Dependency Lookup
Developer Tools · JavaScript

Fetch dependency information from Clojars, the Clojure community's artifact repository

Simple Timeserver
Developer Tools · Python

Provides Claude with current time and timezone information

About Model Context Protocol

Model Context Protocol (MCP) allows AI models to access external tools and services, extending their capabilities beyond their training data.
