
Deepseek R1 MCP Server

A Model Context Protocol (MCP) server implementation for the Deepseek R1 language model.

Quick Start

# Installing manually
git clone https://github.com/66julienmartin/MCP-server-Deepseek_R1.git deepseek-r1-mcp
cd deepseek-r1-mcp
npm install

# Set up environment
cp .env.example .env
# Then add your API key to .env

# Build and run
npm run build

Prerequisites

  • Node.js (v18 or higher)
  • npm
  • Claude Desktop
  • Deepseek API key

Model Selection

By default, this server uses the DeepSeek-R1 model. To use DeepSeek-V3 instead, change the model name in src/index.ts:

// For DeepSeek-R1 (default)
model: "deepseek-reasoner"
// For DeepSeek-V3 model:
model: "deepseek-chat"
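If you would rather not edit the source each time, the same choice could be driven by an environment variable. A minimal sketch, assuming a hypothetical DEEPSEEK_MODEL variable that this server does not currently document:

```typescript
// Hypothetical sketch: select the model from an environment variable.
// DEEPSEEK_MODEL is an assumed name, not part of this server's configuration.
const MODELS: Record<string, string> = {
  r1: "deepseek-reasoner", // DeepSeek-R1 (default)
  v3: "deepseek-chat",     // DeepSeek-V3
};

// Fall back to DeepSeek-R1 when the variable is unset or unrecognized.
function resolveModel(requested?: string): string {
  return MODELS[requested ?? "r1"] ?? MODELS["r1"];
}

console.log(resolveModel());     // deepseek-reasoner
console.log(resolveModel("v3")); // deepseek-chat
```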

Project Structure

deepseek-r1-mcp/
├── src/
│   └── index.ts          # Main server implementation
├── build/                # Compiled files
│   └── index.js
├── LICENSE
├── README.md
├── package.json
├── package-lock.json
└── tsconfig.json

Configuration

  1. Create a .env file:

DEEPSEEK_API_KEY=your-api-key-here

  2. Update Claude Desktop configuration:
{
  "mcpServers": {
    "deepseek_r1": {
      "command": "node",
      "args": ["/path/to/deepseek-r1-mcp/build/index.js"],
      "env": {
        "DEEPSEEK_API_KEY": "your-api-key"
      }
    }
  }
}
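Because the API key can arrive either from .env or from the "env" block of the Claude Desktop configuration, a startup check that fails fast with a clear message is useful. A minimal sketch (the function name requireApiKey is illustrative, not taken from this server's source):

```typescript
// Hypothetical fail-fast check for the API key at startup.
function requireApiKey(env: Record<string, string | undefined>): string {
  const key = env["DEEPSEEK_API_KEY"];
  if (!key || key.trim() === "") {
    throw new Error(
      "DEEPSEEK_API_KEY is not set. Add it to .env or to the " +
        '"env" block of the Claude Desktop configuration.'
    );
  }
  return key;
}

console.log(requireApiKey({ DEEPSEEK_API_KEY: "your-api-key" })); // your-api-key
```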

Development

npm run dev # Watch mode
npm run build # Build for production

Features

  • Advanced text generation with Deepseek R1 (8192 token context window)
  • Configurable parameters (max_tokens, temperature)
  • Robust error handling with detailed error messages
  • Full MCP protocol support
  • Claude Desktop integration
  • Support for both DeepSeek-R1 and DeepSeek-V3 models

API Usage

{
  "name": "deepseek_r1",
  "arguments": {
    "prompt": "Your prompt here",
    "max_tokens": 8192, // Maximum tokens to generate
    "temperature": 0.2 // Controls randomness
  }
}
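One way such arguments might be validated before the API call is sketched below. Only the defaults (8192 tokens, temperature 0.2) come from this documentation; the specific range checks are assumptions for illustration:

```typescript
interface DeepseekR1Args {
  prompt: string;
  max_tokens?: number;  // defaults to 8192 per the docs
  temperature?: number; // defaults to 0.2 per the docs
}

// Hypothetical validation; the range limits are assumptions, not the
// server's actual implementation.
function validateArgs(args: DeepseekR1Args): Required<DeepseekR1Args> {
  const max_tokens = args.max_tokens ?? 8192;
  const temperature = args.temperature ?? 0.2;
  if (!args.prompt) throw new Error("prompt is required");
  if (!Number.isInteger(max_tokens) || max_tokens < 1 || max_tokens > 8192) {
    throw new Error("max_tokens must be an integer between 1 and 8192");
  }
  if (temperature < 0 || temperature > 2) {
    throw new Error("temperature must be between 0.0 and 2.0");
  }
  return { prompt: args.prompt, max_tokens, temperature };
}

console.log(validateArgs({ prompt: "Your prompt here" }));
// { prompt: 'Your prompt here', max_tokens: 8192, temperature: 0.2 }
```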

The Temperature Parameter

The default temperature is 0.2. DeepSeek recommends setting it according to your specific use case:

USE CASE                        TEMPERATURE   EXAMPLE
Coding / Math                   0.0           Code generation, mathematical calculations
Data Cleaning / Data Analysis   1.0           Data processing tasks
General Conversation            1.3           Chat and dialogue
Translation                     1.3           Language translation
Creative Writing / Poetry       1.5           Story writing, poetry generation
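When building prompts programmatically, these recommendations can be encoded as a small lookup helper. A sketch; the category keys below are invented for illustration:

```typescript
// Hypothetical lookup of DeepSeek's recommended temperatures (per the table above).
const RECOMMENDED_TEMPERATURE: Record<string, number> = {
  coding_math: 0.0,
  data_cleaning_analysis: 1.0,
  general_conversation: 1.3,
  translation: 1.3,
  creative_writing: 1.5,
};

// Fall back to the server default (0.2) for unknown use cases.
function temperatureFor(useCase: string): number {
  return RECOMMENDED_TEMPERATURE[useCase] ?? 0.2;
}

console.log(temperatureFor("creative_writing")); // 1.5
```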

Error Handling

The server provides detailed error messages for common issues:

  • API authentication errors
  • Invalid parameters
  • Rate limiting
  • Network issues
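As an illustration of how such failures might be turned into user-facing messages, here is a sketch mapping HTTP status codes to the categories above. The mapping is an assumption, not this server's actual implementation:

```typescript
// Hypothetical mapping from HTTP status codes to the error categories above.
function describeError(status: number): string {
  switch (status) {
    case 401:
    case 403:
      return "API authentication error: check that DEEPSEEK_API_KEY is valid";
    case 400:
    case 422:
      return "Invalid parameters: check prompt, max_tokens, and temperature";
    case 429:
      return "Rate limited by the DeepSeek API: retry after a short delay";
    default:
      return `Network or server issue (HTTP ${status})`;
  }
}

console.log(describeError(429)); // Rate limited by the DeepSeek API: retry after a short delay
```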

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

MIT

Server Config

{
  "mcpServers": {
    "deepseek_r1-server": {
      "command": "npx",
      "args": [
        "deepseek_r1"
      ]
    }
  }
}

Links & Status

Repository: github.com
Hosted: No
Global: No
Official: No

Project Info

Created At: May 23, 2025
Updated At: Aug 07, 2025
Author: 66julienmartin
Category: community
License: MIT
Tags: development, documentation, public