
MCP Server

XeroML provides an MCP (Model Context Protocol) server that exposes your traces, scores, prompts, and datasets as tools and resources for AI coding agents. This lets you ask an AI assistant questions about your LLM application’s behavior directly from your editor.

What the MCP Server Exposes

  • Traces — search and inspect production traces
  • Scores — read evaluation scores and aggregate metrics
  • Prompts — fetch and compare prompt versions
  • Datasets — browse dataset items
  • Users — query per-user usage data
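Under the hood, each of these is surfaced as an MCP tool that your agent invokes over JSON-RPC 2.0 (the transport MCP is built on). As a rough illustration, the sketch below constructs the request shape a client sends for a tool call. The tool name `search_traces` and its arguments are hypothetical placeholders, not confirmed names from the XeroML server — your agent discovers the real tool names via the server's `tools/list` response.

```typescript
// Sketch of an MCP tools/call request (JSON-RPC 2.0).
// "search_traces" and its arguments are hypothetical; actual tool
// names come from the server's tools/list response.
const request = {
  jsonrpc: "2.0" as const,
  id: 1,
  method: "tools/call",
  params: {
    name: "search_traces", // hypothetical tool name
    arguments: { limit: 10, scoreName: "accuracy", maxScore: 0.5 },
  },
};

// The client serializes this and sends it over the configured
// transport (stdio, in the config above).
console.log(JSON.stringify(request));
```

Your editor's MCP client handles this plumbing for you; the shape is shown only to clarify what "exposed as tools" means in practice.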

Setup

Add the XeroML MCP server to your editor’s MCP configuration:

Claude Code:

# Install the XeroML MCP server
npx @xeroml/mcp-server init

Or add manually to your MCP config:

{
  "mcpServers": {
    "xeroml": {
      "command": "npx",
      "args": ["@xeroml/mcp-server"],
      "env": {
        "XEROML_PUBLIC_KEY": "pk-xm-...",
        "XEROML_SECRET_KEY": "sk-xm-...",
        "XEROML_BASE_URL": "https://cloud.xeroml.com"
      }
    }
  }
}

Cursor / other MCP clients: Add the same JSON configuration to your editor’s MCP settings file.
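For Cursor specifically, the server entry typically lives in a `.cursor/mcp.json` file in your project root (or `~/.cursor/mcp.json` to make it available in all projects). A sketch under that assumption, with placeholder keys:

{
  "mcpServers": {
    "xeroml": {
      "command": "npx",
      "args": ["@xeroml/mcp-server"],
      "env": {
        "XEROML_PUBLIC_KEY": "pk-xm-...",
        "XEROML_SECRET_KEY": "sk-xm-...",
        "XEROML_BASE_URL": "https://cloud.xeroml.com"
      }
    }
  }
}

Restart your editor (or reload the MCP configuration) after saving so the new server is picked up.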

Example Queries

Once connected, you can ask your AI coding agent:

  • “Show me the 10 most recent traces with low accuracy scores”
  • “What’s the average token cost per trace in the last 7 days?”
  • “Compare the helpfulness scores for prompt versions 3 and 4”
  • “Find traces from user X that resulted in negative feedback”
  • “What are the most common failure patterns in this week’s traces?”

LLM Connections

The MCP server also exposes LLM Connection management. LLM Connections determine which models are available to the XeroML Playground and Prompt Experiments, and they are configured in your project settings.

Navigate to Project Settings → LLM Connections to add connections for:

  • OpenAI
  • Anthropic
  • Azure OpenAI
  • Google Vertex AI
  • Any OpenAI-compatible endpoint