Atla MCP Server
[!CAUTION]
This repository was archived on July 21, 2025. The Atla API is no longer active.
An MCP server implementation providing a standardized interface for LLMs to interact with the Atla API for state-of-the-art LLMJ evaluation.
Learn more about Atla here. Learn more about the Model Context Protocol here.
Available Tools
evaluate_llm_response: Evaluate an LLM's response to a prompt against a given evaluation criterion. This function uses an Atla evaluation model under the hood and returns a dictionary containing a score for the model's response and a textual critique with feedback on it.
evaluate_llm_response_on_multiple_criteria: Evaluate an LLM's response to a prompt across multiple evaluation criteria. This function uses an Atla evaluation model under the hood and returns a list of dictionaries, each containing an evaluation score and critique for a given criterion; a sketch of the result shape follows below.
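For illustration only, a single evaluation result has roughly the following shape. The field names and values here are assumptions made for the example, not a documented schema:

# Hypothetical shape of one evaluation result (field names and values assumed).
{
    "score": "4",
    "critique": "The response answers the question but does not support its claims with an example.",
}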
Usage
To use the MCP server, you will need an Atla API key. You can find your existing API key here or create a new one here.
Installation
We recommend using uv to manage the Python environment. See here for installation instructions.
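For example, uv can typically be installed with its standalone installer or via pip; check the uv documentation for the current recommended method for your platform:

curl -LsSf https://astral.sh/uv/install.sh | sh   # standalone installer (macOS/Linux)
pip install uv                                    # or install via pip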
Manually running the server
Once you have uv installed and have your Atla API key, you can manually run the MCP server using uvx (which is provided by uv):
ATLA_API_KEY=<your-api-key> uvx atla-mcp-server
Connecting to the server
Having issues or need help connecting to another client? Feel free to open an issue or contact us!
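For clients that are configured through a JSON file (for example, Claude Desktop or Cursor), an entry along the following lines should work. The exact file location and surrounding structure depend on the client, and the server name is just a label:

{
  "mcpServers": {
    "atla-mcp-server": {
      "command": "uvx",
      "args": ["atla-mcp-server"],
      "env": {
        "ATLA_API_KEY": "<your-api-key>"
      }
    }
  }
}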
OpenAI Agents SDK
For more details on using the OpenAI Agents SDK with MCP servers, refer to the official documentation.
Install the OpenAI Agents SDK:
pip install openai-agents
Use the OpenAI Agents SDK to connect to the server:
import asyncio
import os

from agents import Agent
from agents.mcp import MCPServerStdio


async def main() -> None:
    # Launch the Atla MCP server as a subprocess and connect to it over stdio,
    # forwarding the Atla API key through the subprocess environment.
    async with MCPServerStdio(
        params={
            "command": "uvx",
            "args": ["atla-mcp-server"],
            "env": {"ATLA_API_KEY": os.environ.get("ATLA_API_KEY")},
        }
    ) as atla_mcp_server:
        ...


asyncio.run(main())
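As a sketch of what might go in place of the ... above, the connected server can be passed to an Agent via its mcp_servers parameter and run with Runner (also imported from agents). The agent name, instructions, and prompt here are illustrative, not part of this repository:

        # Replaces the `...` above; also requires `from agents import Runner`.
        # Agent name, instructions, and prompt are illustrative placeholders.
        agent = Agent(
            name="atla-eval-assistant",
            instructions="Answer the user, then evaluate your answer with the Atla tools.",
            mcp_servers=[atla_mcp_server],
        )
        result = await Runner.run(agent, "What is the Model Context Protocol?")
        print(result.final_output)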