# XiYan MCP Server
XiYan MCP Server is a Model Context Protocol (MCP) server that allows users to fetch data from databases using natural language queries. It supports various models including general LLMs and the SOTA text-to-SQL model, XiYanSQL. The server can operate in both remote and local modes, ensuring high security and flexibility in database interactions.
- Fetch data by natural language through XiYanSQL
- Support general LLMs (GPT, qwen-max) and the Text-to-SQL SOTA model
- Support pure local mode (high security!)
- Support MySQL and PostgreSQL
- List available tables as resources
- Read table contents
## Preview

### Architecture
There are two ways to integrate this server in your project, as shown below:
The left is the remote mode, which is the default. It requires an API key to access the xiyanSQL-qwencoder-32B model from the service provider (see Configuration).
The other is the local mode, which is more secure. It does not require an API key.
The following figure illustrates the performance of the XiYan MCP server as measured by the MCPBench benchmark. The XiYan MCP server demonstrates superior performance compared to both the MySQL MCP server and the PostgreSQL MCP server, achieving a lead of 2-22 percentage points. The detailed experiment results can be found at MCPBench and the report "Evaluation Report on MCP Servers".
### Tools Preview
The tool `get_data` provides a natural language interface for retrieving data from a database. The server converts the input natural language into SQL using a built-in model and calls the database to return the query results.
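For instance, a query might be translated as follows (a hypothetical exchange; the table name and generated SQL are illustrative and depend on your schema and model):

```sql
-- Natural language input: "How many orders were placed in 2024?"
-- SQL generated by the built-in model (illustrative):
SELECT COUNT(*) FROM orders WHERE YEAR(order_date) = 2024;
-- The server executes this against the configured database and returns the results.
```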
The `{dialect}://{table_name}` resource returns a portion of sample data from the specified table for model reference.
The `{dialect}://` resource lists the names of the current databases.
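As an illustration, for a MySQL database the resources might look like this (the table name is hypothetical):

```
mysql://          → lists the names of the current databases/tables
mysql://orders    → returns sample rows from the `orders` table
```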
## Installation

### Installing from pip
Python 3.11+ is required.
You can install the server through pip, and it will install the latest version:

```shell
pip install xiyan-mcp-server
```
If you want the development version, you can install it from the source code on GitHub.
## Configuration

You can set the transport protocol to `stdio` or `sse`.

#### STDIO

For the stdio protocol, you can set the mcp config like this:

```yaml
mcp:
  transport: "stdio"
```

#### SSE

For the sse protocol, you can set the mcp config as below:

```yaml
mcp:
  transport: "sse"
  port: 8000
  log_level: "INFO"
```

The default port is 8000. You can change the port if needed.
The default log level is ERROR. We recommend setting the log level to INFO for more detailed information.
Other configurations such as `debug`, `host`, `sse_path`, and `message_path` can be customized as well, but normally you don't need to modify them.
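For example, to expose the SSE server on all interfaces on a different port, you could override `host` and `port` (a sketch using the optional fields mentioned above; the defaults are usually fine):

```yaml
mcp:
  transport: "sse"
  host: "0.0.0.0"   # listen on all interfaces instead of localhost only
  port: 8001        # non-default port
  log_level: "INFO"
```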
### LLM Configuration
`name` is the name of the model to use, `key` is the API key of the model, and `url` is the API endpoint of the model. We support the following models.

| versions | general LLMs (GPT, qwen-max) | SOTA model by Modelscope | SOTA model by DashScope | Local LLMs |
|---|---|---|---|---|
| description | basic, easy to use | best performance, stable, recommended | best performance, for trial | slow, high security |
| name | the official model name (e.g. `gpt-3.5-turbo`, `qwen-max`) | `XGenerationLab/XiYanSQL-QwenCoder-32B-2412` | `xiyansql-qwencoder-32b` | `xiyansql-qwencoder-3b` |
| key | the API key of the service provider (e.g. OpenAI, Alibaba Cloud) | the API key of Modelscope | the API key via email | `""` |
| url | the endpoint of the service provider (e.g. `https://api.openai.com/v1`) | `https://api-inference.modelscope.cn/v1/` | | |
#### General LLMs

If you want to use a general LLM from Alibaba, e.g. Qwen-max, you can use the following config:

```yaml
model:
  name: "qwen-max"
  key: "YOUR KEY"
  url: "https://dashscope.aliyuncs.com/compatible-mode/v1"
database:
```

#### Text-to-SQL SOTA model
We recommend XiYanSQL-QwenCoder-32B (https://github.com/XGenerationLab/XiYanSQL-QwenCoder), the SOTA model in text-to-SQL; see the [Bird benchmark](https://bird-bench.github.io/).
There are two ways to use the model. You can use either of them.
(1) [Modelscope](https://www.modelscope.cn/models/XGenerationLab/XiYanSQL-QwenCoder-32B-2412), (2) Alibaba Cloud DashScope.
##### (1) Modelscope version
You need to apply for an API-inference `key` from Modelscope: https://www.modelscope.cn/docs/model-service/API-Inference/intro
Then you can use the following config:
```yaml
model:
  name: "XGenerationLab/XiYanSQL-QwenCoder-32B-2412"
  key: ""
  url: "https://api-inference.modelscope.cn/v1/"
```

##### (2) DashScope version

We deployed the model on Alibaba Cloud DashScope, so you need to set the following environment variables:
Send me your email to get the key. ( [email protected] )
In the email, please attach the following information:
- name: "YOUR NAME"
- email: "YOUR EMAIL"
- organization: "your college, company, or organization"
We will send you a key based on your email, and you can fill the key into the yml file.
The key will expire after 1 month or 200 queries, or under other legal restrictions.
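Besides the `model` section, the yml file also needs a `database` section with the connection details. A minimal sketch for MySQL follows; the exact field names (`host`, `port`, `user`, `password`, `database`) are assumptions based on typical connection configs, so check the project documentation:

```yaml
database:
  host: "localhost"         # database host (assumed field name)
  port: 3306                # MySQL default port
  user: "root"
  password: "YOUR PASSWORD"
  database: "YOUR DATABASE"
  dialect: "mysql"          # see the note below on PostgreSQL
```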
Note that `dialect` should be `postgresql` for PostgreSQL.

## Launch

### Server Launch
If you want to launch the server with sse, run the following command in a terminal:

```shell
YML=path/to/yml python -m xiyan_mcp_server
```

Then you should see the information on http://localhost:8000/sse in your browser (by default; change this if your MCP server runs on another host/port).
Otherwise, if you use the stdio transport protocol, you usually declare the MCP server command in the specific MCP application instead of launching it in a terminal.
However, you can still debug with this command if needed.
## Client Settings

### Claude Desktop
Add this to your Claude Desktop config file (see the Claude Desktop config example).
Please note that the Python command here requires the complete path to the Python executable (/xxx/python); otherwise, the Python interpreter cannot be found. You can determine this path by using the command which python. The same applies to other applications as well.
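A sketch of such an entry, combining the launch command above with the `YML` environment variable (the paths are placeholders, and the `xiyan-mcp-server` key name is an arbitrary label):

```json
{
  "mcpServers": {
    "xiyan-mcp-server": {
      "command": "/xxx/python",
      "args": ["-m", "xiyan_mcp_server"],
      "env": {
        "YML": "path/to/yml"
      }
    }
  }
}
```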
Claude Desktop currently does not support the SSE transport protocol.
Add an `env` entry: the key is `YML` and the value is the path to your yml file.
See the Witsy config example.
## It Does Not Work!
Contact us:

- Ding Group (钉钉群)
- Follow me on Weibo
## Other Related Links
## Citation

If you find our work helpful, feel free to cite us:
```bibtex
@article{XiYanSQL,
      title={XiYan-SQL: A Novel Multi-Generator Framework For Text-to-SQL},
      author={Yifu Liu and Yin Zhu and Yingqi Gao and Zhiling Luo and Xiaoxia Li and Xiaorong Shi and Yuntao Hong and Jinyang Gao and Yu Li and Bolin Ding and Jingren Zhou},
      year={2025},
      eprint={2507.04701},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2507.04701},
}

@article{xiyansql_pre,
      title={A Preview of XiYan-SQL: A Multi-Generator Ensemble Framework for Text-to-SQL},
      author={Yingqi Gao and Yifu Liu and Xiaoxia Li and Xiaorong Shi and Yin Zhu and Yiming Wang and Shiqi Li and Wei Li and Yuntao Hong and Zhiling Luo and Jinyang Gao and Liyu Mou and Yu Li},
      year={2024},
      journal={arXiv preprint arXiv:2411.08599},
      url={https://arxiv.org/abs/2411.08599},
      primaryClass={cs.AI}
}
```