🚀 ⚡️ locust-mcp-server
A Model Context Protocol (MCP) server implementation for running Locust load tests. This server enables seamless integration of Locust load testing capabilities with AI-powered development environments.
✨ Features
Simple integration with Model Context Protocol framework
Support for headless and UI modes
Configurable test parameters (users, spawn rate, runtime)
Easy-to-use API for running Locust load tests
Real-time test execution output
HTTP/HTTPS protocol support out of the box
Custom task scenarios support
🔧 Prerequisites
Before you begin, ensure you have the following installed:

- Python 3 (Locust is a Python package)
- Locust (`pip install locust`)
Set up environment variables (optional):
Create a `.env` file in the project root:

```
LOCUST_HOST=http://localhost:8089  # Default host for your tests
LOCUST_USERS=3                     # Default number of users
LOCUST_SPAWN_RATE=1                # Default user spawn rate
LOCUST_RUN_TIME=10s                # Default test duration
```
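These variables only supply defaults; a server would typically fall back to built-in values when they are unset. As a minimal sketch (the loading logic below is an assumption for illustration, not this project's actual implementation), the defaults could be resolved like this:

```python
import os

# Read each LOCUST_* variable, falling back to the documented default
# when it is not set in the environment or .env file.
host = os.getenv("LOCUST_HOST", "http://localhost:8089")
users = int(os.getenv("LOCUST_USERS", "3"))
spawn_rate = int(os.getenv("LOCUST_SPAWN_RATE", "1"))
run_time = os.getenv("LOCUST_RUN_TIME", "10s")

print(host, users, spawn_rate, run_time)
```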
🚀 Getting Started
Create a Locust test script (e.g., hello.py):
```python
import time

from locust import HttpUser, task, between

class QuickstartUser(HttpUser):
    wait_time = between(1, 5)

    @task
    def hello_world(self):
        self.client.get("/hello")
        self.client.get("/world")

    @task(3)
    def view_items(self):
        for item_id in range(10):
            self.client.get(f"/item?id={item_id}", name="/item")
            time.sleep(1)

    def on_start(self):
        self.client.post("/login", json={"username": "foo", "password": "bar"})
```
Configure the MCP server in your favorite MCP client (Claude Desktop, Cursor, Windsurf, and more):
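MCP clients generally register servers under an `mcpServers` key in their settings file. The fragment below is only a sketch of that general shape; the server name, command, and path are hypothetical placeholders, not this project's actual values:

```json
{
  "mcpServers": {
    "locust": {
      "command": "uv",
      "args": ["run", "/path/to/locust-mcp-server/main.py"]
    }
  }
}
```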