
A MCP provider for Deepseek reasoning content to MCP-enabled AI Clients.




What is Deepseek Thinker MCP Server?

Deepseek Thinker MCP Server is a Model Context Protocol provider that exposes Deepseek's thought processes, sourced either from the Deepseek API service or from a local Ollama server. It supports two modes of operation, OpenAI API mode and Ollama local mode, capturing Deepseek's reasoning process and returning it as structured text responses.

Documentation

Deepseek Thinker MCP Server


An MCP (Model Context Protocol) server that provides Deepseek reasoning content to MCP-enabled AI clients, such as Claude Desktop. It supports access to Deepseek's thought processes from the Deepseek API service or from a local Ollama server.

Core Features

  • 🤖 Dual Mode Support
    • OpenAI API mode support
    • Ollama local mode support
  • 🎯 Focused Reasoning
    • Captures Deepseek's thinking process
    • Provides reasoning output

Available Tools

get-deepseek-thinker

  • Description: Perform reasoning using the Deepseek model
  • Input Parameters:
    • originPrompt (string): User's original prompt
  • Returns: Structured text response containing the reasoning process
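Under the hood, an MCP client invokes a tool like this with a standard JSON-RPC `tools/call` request. A minimal sketch of that message shape, assuming the documented tool name and parameter (the prompt text and request id are illustrative):

```typescript
// Sketch: the JSON-RPC request an MCP client sends to invoke
// get-deepseek-thinker. Only originPrompt is documented as input.
const callToolRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "get-deepseek-thinker",
    arguments: {
      // User's original prompt, per the tool's input parameters.
      originPrompt: "Why does binary search require sorted input?",
    },
  },
};

console.log(JSON.stringify(callToolRequest));
```

The server's reply carries the reasoning process as structured text in the standard MCP tool-result format.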

Environment Configuration

OpenAI API Mode

Set the following environment variables:

API_KEY=<Your OpenAI API Key>
BASE_URL=<API Base URL>
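For example, the server can be launched from a shell with these variables set. A sketch (the key is a placeholder, and the base URL shown is an assumption; use your provider's actual endpoint):

```shell
# Point the server at an OpenAI-compatible endpoint.
export API_KEY="<your-api-key>"
export BASE_URL="https://api.deepseek.com"  # assumed endpoint; substitute your own
npx -y deepseek-thinker-mcp
```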

Ollama Mode

Set the following environment variable:

USE_OLLAMA=true
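With this flag set, the server talks to a local Ollama instance instead of a remote API, so no API key is needed. A sketch of launching it this way (assumes Ollama is already running locally with a Deepseek model pulled):

```shell
# Run the MCP server in Ollama local mode.
export USE_OLLAMA=true
npx -y deepseek-thinker-mcp
```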

Usage

Integration with an AI client, such as Claude Desktop

Add the following configuration to your claude_desktop_config.json:

{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "npx",
      "args": [
        "-y",
        "deepseek-thinker-mcp"
      ],
      "env": {
        "API_KEY": "<Your API Key>",
        "BASE_URL": "<Your Base URL>"
      }
    }
  }
}

Using Ollama Mode

{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "npx",
      "args": [
        "-y",
        "deepseek-thinker-mcp"
      ],
      "env": {
        "USE_OLLAMA": "true"
      }
    }
  }
}
Local Server Configuration
{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "node",
      "args": [
        "/your-path/deepseek-thinker-mcp/build/index.js"
      ],
      "env": {
        "API_KEY": "<Your API Key>",
        "BASE_URL": "<Your Base URL>"
      }
    }
  }
}

Development Setup

# Install dependencies
npm install

# Build project
npm run build

# Run service
node build/index.js
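For interactive testing of the built server, one option is the MCP Inspector, a separate debugging tool from the MCP project (not mentioned in this README, so treat it as an optional extra):

```shell
# Launch the MCP Inspector against the locally built server.
npx @modelcontextprotocol/inspector node build/index.js
```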

FAQ

Response like this: "MCP error -32001: Request timed out"

This error occurs when the Deepseek API responds too slowly, or when the reasoning content is too long, causing the MCP request to exceed the client's timeout.

Tech Stack

  • TypeScript
  • @modelcontextprotocol/sdk
  • OpenAI API
  • Ollama
  • Zod (parameter validation)

License

This project is licensed under the MIT License. See the LICENSE file for details.


Links & Status

Repository: github.com
Hosted: No
Global: No
Official: No

Project Info

Created At: Aug 07, 2025
Updated At: Aug 07, 2025
Author: ruixingshi
Category: AI Services
License: MIT License
Tags:
development documentation public