
Integrations

XeroML integrates natively with the most popular LLM frameworks and libraries. Native integrations automatically capture traces without manual instrumentation — just install the wrapper package and replace your import.
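To illustrate the wrapper idea, here is a minimal stand-in sketch of how such a package can capture traces without changing call sites. `FakeClient`, `traced`, and the span fields are all hypothetical illustrations, not XeroML's actual API:

```python
import functools
import time

class FakeClient:
    """Stand-in for a provider SDK client (hypothetical)."""
    def complete(self, prompt):
        return f"echo: {prompt}"

def traced(client):
    """Wrap a client so each call records a trace record (sketch only)."""
    spans = []
    original = client.complete

    @functools.wraps(original)
    def wrapper(prompt):
        start = time.time()
        result = original(prompt)
        spans.append({
            "input": prompt,
            "output": result,
            "duration_s": time.time() - start,
        })
        return result

    # Shadow the bound method on the instance; call sites are unchanged.
    client.complete = wrapper
    client.spans = spans
    return client

client = traced(FakeClient())
client.complete("hello")
```

The real wrapper packages apply the same shadowing at import time, which is why replacing the import is the only change your code needs.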

For frameworks not listed here, you can use the OpenTelemetry endpoint with any OTEL-compatible instrumentation library.

Model Providers

| Integration | Language | Link |
| --- | --- | --- |
| OpenAI | Python | /integrations/model-providers/openai-py |
| OpenAI | JS/TS | /integrations/model-providers/openai-js |
| Ollama | Python/TS | /integrations/model-providers/ollama |

Frameworks

| Integration | Language | Link |
| --- | --- | --- |
| LangChain | Python | /integrations/frameworks/langchain |
| LangChain | JS/TS | /integrations/frameworks/langchain |
| Vercel AI SDK | JS/TS | /integrations/frameworks/vercel-ai-sdk |
| LlamaIndex | Python | /integrations/frameworks/llamaindex |
| CrewAI | Python | /integrations/frameworks/crewai |
| AutoGen | Python | /integrations/frameworks/autogen |
| Google ADK | Python | /integrations/frameworks/google-adk |

Gateways

| Integration | Link |
| --- | --- |
| LiteLLM | /integrations/gateways/litellm |

Protocol-Based

| Integration | Link |
| --- | --- |
| OpenTelemetry | /integrations/native/opentelemetry |

Don’t See Your Framework?

XeroML supports any framework that can export OpenTelemetry spans. If your framework has an OTEL integration, configure it to export to XeroML’s OTEL endpoint.
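Pointing an existing OTLP exporter at a collector usually comes down to the standard OpenTelemetry environment variables, which any OTEL SDK reads at startup. The endpoint URL below is a placeholder, not a documented XeroML address; use the value from your XeroML project settings:

```python
import os

# Standard OTLP exporter settings (OpenTelemetry specification).
# The endpoint and key shown are placeholders, not real XeroML values.
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "https://otel.xeroml.example"
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = "Authorization=Bearer <your-api-key>"
```

These variables can equally be set in a shell profile or deployment manifest instead of in code.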

See the OpenTelemetry integration guide (/integrations/native/opentelemetry) for endpoint setup details.

Alternatively, use the Python SDK or TypeScript SDK for manual instrumentation — they work with any framework.