LangChain

XeroML provides a native LangChain integration via a CallbackHandler that automatically captures all LangChain events as structured observations.

Installation:

pip install xeroml langchain-openai

Usage:

from xeroml import CallbackHandler
from langchain_openai import ChatOpenAI
from langchain.schema import HumanMessage
handler = CallbackHandler()
llm = ChatOpenAI(model="gpt-4o", callbacks=[handler])
response = llm.invoke([HumanMessage(content="What is XeroML?")])

With chains:

from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate
template = PromptTemplate(
    input_variables=["product"],
    template="Write a tagline for {product}.",
)
chain = LLMChain(llm=llm, prompt=template)
result = chain.run("XeroML")

Adding user/session context:

from xeroml import CallbackHandler, propagate_attributes
from langchain_openai import ChatOpenAI
from langchain.schema import HumanMessage

def handle_request(message: str, user_id: str, session_id: str) -> str:
    handler = CallbackHandler()
    llm = ChatOpenAI(model="gpt-4o", callbacks=[handler])
    with propagate_attributes(user_id=user_id, session_id=session_id):
        return llm.invoke([HumanMessage(content=message)]).content
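Context managers like propagate_attributes are commonly built on Python's contextvars, so the attributes follow the current execution context (including async tasks) without being threaded through every call. The sketch below illustrates that general pattern only; it is an assumption, not XeroML's actual implementation, and current_attributes is a hypothetical helper:

```python
from contextlib import contextmanager
from contextvars import ContextVar

# Context-local store for the attributes in effect; empty outside any block.
_attributes: ContextVar[dict] = ContextVar("attributes", default={})

@contextmanager
def propagate_attributes(**attrs):
    """Merge attrs into the context for the duration of the with-block."""
    token = _attributes.set({**_attributes.get(), **attrs})
    try:
        yield
    finally:
        _attributes.reset(token)  # restore the previous attribute set

def current_attributes() -> dict:
    """What a handler would read when tagging an observation."""
    return dict(_attributes.get())

with propagate_attributes(user_id="u-1", session_id="s-9"):
    print(current_attributes())  # {'user_id': 'u-1', 'session_id': 's-9'}
print(current_attributes())      # {}
```

Because the store is context-local rather than a module-level global, concurrent requests each see only their own user and session attributes.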

What Gets Captured

The LangChain integration captures:

  • Chain start/end with inputs and outputs
  • LLM call inputs (prompt), outputs (completion), and token usage
  • Tool calls and their results
  • Agent reasoning steps
  • Retriever queries and retrieved documents
  • Errors at any step
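Conceptually, a callback handler is just an object whose hook methods the framework invokes at each lifecycle event, and each invocation becomes one structured observation. The toy sketch below shows that pattern in isolation; the hook and field names are illustrative, not XeroML's or LangChain's real API:

```python
# Toy illustration of callback-based capture: a runner fires lifecycle
# hooks and the handler records each one as a structured observation.
class RecordingHandler:
    def __init__(self):
        self.observations = []

    def on_chain_start(self, inputs):
        self.observations.append({"event": "chain_start", "inputs": inputs})

    def on_llm_end(self, completion, tokens):
        self.observations.append(
            {"event": "llm_end", "completion": completion, "tokens": tokens}
        )

    def on_error(self, error):
        self.observations.append({"event": "error", "error": repr(error)})

def run_chain(handler, inputs):
    """Stand-in for a chain run: fires hooks around a fake LLM call."""
    handler.on_chain_start(inputs)
    try:
        completion = f"Tagline for {inputs['product']}!"  # fake LLM output
        handler.on_llm_end(completion, tokens=7)
        return completion
    except Exception as exc:
        handler.on_error(exc)
        raise

handler = RecordingHandler()
run_chain(handler, {"product": "XeroML"})
print([o["event"] for o in handler.observations])  # ['chain_start', 'llm_end']
```

The same shape explains the list above: every event type (chain, LLM, tool, agent, retriever, error) maps to a hook, and the handler turns each hook call into an observation.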