# Integrations
XeroML integrates natively with the most popular LLM frameworks and libraries. Native integrations automatically capture traces without manual instrumentation — just install the wrapper package and replace your import.
For frameworks not listed here, you can use the OpenTelemetry endpoint with any OTEL-compatible instrumentation library.
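For a native integration, the only code change is typically the import line. The package and module names below are illustrative placeholders, not XeroML's actual package names; see the linked integration page for the exact import:

```diff
-from openai import OpenAI
+from xeroml.openai import OpenAI  # hypothetical wrapper package; check the OpenAI integration page

 client = OpenAI()
 # Calls made through this client are now traced automatically.
```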
## Model Providers
| Integration | Language | Link |
|---|---|---|
| OpenAI | Python | /integrations/model-providers/openai-py |
| OpenAI | JS/TS | /integrations/model-providers/openai-js |
| Ollama | Python, JS/TS | /integrations/model-providers/ollama |
## Frameworks
| Integration | Language | Link |
|---|---|---|
| LangChain | Python | /integrations/frameworks/langchain |
| LangChain | JS/TS | /integrations/frameworks/langchain |
| Vercel AI SDK | JS/TS | /integrations/frameworks/vercel-ai-sdk |
| LlamaIndex | Python | /integrations/frameworks/llamaindex |
| CrewAI | Python | /integrations/frameworks/crewai |
| AutoGen | Python | /integrations/frameworks/autogen |
| Google ADK | Python | /integrations/frameworks/google-adk |
## Gateways
| Integration | Link |
|---|---|
| LiteLLM | /integrations/gateways/litellm |
## Protocol-Based
| Integration | Link |
|---|---|
| OpenTelemetry | /integrations/native/opentelemetry |
## Don’t See Your Framework?
XeroML supports any framework that can export OpenTelemetry spans. If your framework has an OTEL integration, configure it to export to XeroML’s OTEL endpoint.
Alternatively, use the Python SDK or TypeScript SDK for manual instrumentation — they work with any framework.
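As a sketch, most OTEL-instrumented apps can be redirected to a new collector purely through the standard OTLP exporter environment variables, with no code changes. The endpoint URL and auth header below are placeholders, not XeroML's actual values; see the OpenTelemetry integration page for the real endpoint and header format:

```shell
# Standard OTLP exporter settings, read by all OTEL SDKs.
# Endpoint and header values are placeholders.
export OTEL_EXPORTER_OTLP_ENDPOINT="https://otel.xeroml.example.com"
export OTEL_EXPORTER_OTLP_HEADERS="authorization=Bearer <your-api-key>"
```

Because these variables are part of the OpenTelemetry specification, they work with any OTEL-compatible framework or instrumentation library, not just the ones listed above.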