Prompt Management
XeroML Prompt Management gives teams a central store for LLM prompts. Instead of hardcoding prompts in your application code, you store them in XeroML where they can be versioned, tested in the Playground, and deployed to production independently of code deployments.
Why Prompt Management?
Separation of concerns
With prompts in code, changing a system instruction requires a code review, a CI run, and a deployment. With XeroML, non-technical team members can iterate on prompt text directly in the UI, and changes deploy instantly via a label update; no engineering involvement is required for a text change.
Performance with no latency cost
The XeroML SDK caches prompts client-side. After the first fetch, prompts are served from memory, as fast as a local variable read. This means you get the benefits of centralized management with no per-request network overhead beyond that initial fetch.
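The caching behavior described above can be sketched as a simple read-through cache. The class and function names here are illustrative, not the actual XeroML SDK API:

```python
import time

class PromptCache:
    """Illustrative read-through cache: fetch once, then serve from memory."""

    def __init__(self, fetch_fn, ttl_seconds=60):
        self._fetch = fetch_fn   # network call, invoked only on a cache miss
        self._ttl = ttl_seconds
        self._store = {}         # prompt name -> (prompt text, fetched_at)

    def get(self, name):
        entry = self._store.get(name)
        if entry and time.time() - entry[1] < self._ttl:
            return entry[0]      # cache hit: a plain in-memory read
        prompt = self._fetch(name)
        self._store[name] = (prompt, time.time())
        return prompt

# Usage: the first get() hits the "server"; later calls are memory reads.
calls = []
def fake_fetch(name):
    calls.append(name)
    return f"You are a helpful assistant for {name}."

cache = PromptCache(fake_fetch)
cache.get("support-bot")
cache.get("support-bot")
assert calls == ["support-bot"]  # fetched exactly once
```

A TTL is included so that stale prompts are eventually re-fetched; real SDKs typically also refresh in the background so reads never block.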
Traceability
Every trace records which prompt version was used to generate it. You can compare quality metrics across prompt versions directly in the dashboard — no manual correlation required.
Core Concepts
Versions
Every time you save a prompt, XeroML creates a new immutable version, numbered sequentially (1, 2, 3, …). Versions are permanent records of what the prompt looked like at that point in time.
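Conceptually, version history is an append-only log: saving never mutates an earlier version. A minimal sketch (the class is hypothetical, not the SDK):

```python
class PromptVersions:
    """Illustrative append-only version store: each save yields version 1, 2, 3, ..."""

    def __init__(self):
        self._versions = []   # index i holds version i + 1; entries are never changed

    def save(self, text):
        self._versions.append(text)
        return len(self._versions)   # the new version number

    def get(self, version):
        return self._versions[version - 1]

store = PromptVersions()
v1 = store.save("Summarize the text.")
v2 = store.save("Summarize the text in three bullet points.")
assert (v1, v2) == (1, 2)
assert store.get(1) == "Summarize the text."  # old versions remain intact
```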
Labels
Labels are pointers to specific versions. You deploy a prompt by pointing the `production` label at a version. Your application always fetches whatever version `production` points to, so repointing the label is all that's needed to deploy a new prompt.
Built-in labels:
- `production` – the currently deployed version
- `latest` – always points to the most recent version (auto-updated)
- Custom labels – `staging`, `experiment-a`, `v2-test`, anything you define
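The label mechanism amounts to a mapping from label names to version numbers; "deploying" is repointing one entry. A minimal sketch under that assumption (the prompt texts and helper names are made up):

```python
versions = {
    1: "You are a support agent.",
    2: "You are a friendly support agent. Cite docs when possible.",
}

# Labels are just pointers into the version history.
labels = {"latest": 2, "production": 1}

def deploy(label, version):
    """'Deploying' is nothing more than repointing a label."""
    labels[label] = version

def get_prompt(label="production"):
    return versions[labels[label]]

assert get_prompt() == "You are a support agent."
deploy("production", 2)   # no code change, no application redeploy
assert get_prompt() == versions[2]
```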
Prompt Types
- Text prompts — a single string, for use as a system message or template
- Chat prompts — an array of messages with roles, for multi-turn conversation setups
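The two prompt types above have distinct shapes: a text prompt is one string, while a chat prompt is a list of role/content messages. The sketch below assumes a `{{variable}}` placeholder syntax for illustration; check the XeroML docs for the actual templating rules:

```python
# A text prompt is a single string, often containing template variables.
text_prompt = "Summarize the following article: {{article}}"

# A chat prompt is a list of messages with roles, for multi-turn setups.
chat_prompt = [
    {"role": "system", "content": "You are a concise summarizer."},
    {"role": "user", "content": "Summarize: {{article}}"},
]

def compile_text(template, **variables):
    """Fill {{var}} placeholders; the placeholder syntax here is illustrative."""
    for key, value in variables.items():
        template = template.replace("{{" + key + "}}", value)
    return template

assert compile_text(text_prompt, article="LLMs are...") == (
    "Summarize the following article: LLMs are..."
)
```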
Getting Started
→ Prompt Management: Get Started — create your first prompt and fetch it in your application
→ Data Model — understand types, variables, labels, and caching in depth
Key Features
| Feature | Description |
|---|---|
| Version Control & Labels | Deploy prompts without code changes |
| Caching | Client-side caching so prompt reads are served from memory after the first fetch |
| Link to Traces | Correlate prompt versions with trace quality metrics |
| Playground | Test prompts interactively before deployment |