Integrate Kalibr into your Python applications for automatic tracing, observability, and cost tracking across LLM SDKs.
From PyPI:

```shell
pip install kalibr
```

Verify the installation:

```shell
python -c "import kalibr; print(kalibr.__version__)"
```
Kalibr instruments supported SDKs automatically, provided it is imported before them:

```python
import kalibr  # must be imported FIRST

from openai import OpenAI
from anthropic import Anthropic
import google.generativeai as genai
```
| Provider | Package | Status |
|---|---|---|
| OpenAI | openai >= 1.0.0 | ✅ Full |
| Anthropic | anthropic >= 0.18.0 | ✅ Full |
| Google | google-generativeai >= 0.3.0 | ✅ Full |
```python
import kalibr

from openai import OpenAI

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello"}],
)

print(response.choices[0].message.content)
print("Trace ID:", kalibr.get_trace_id())
```
```python
import kalibr

from fastapi import FastAPI
from openai import OpenAI

app = FastAPI()
client = OpenAI()

@app.post("/chat")
async def chat(message: str):
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": message}],
    )
    return {"response": response.choices[0].message.content}
```
Result: each HTTP request → parent span, each LLM call → child span, all linked by the same trace ID.
```python
import kalibr

from openai import OpenAI
from anthropic import Anthropic

def compare_models(prompt: str):
    openai_client = OpenAI()
    anthropic_client = Anthropic()

    gpt_resp = openai_client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    claude_resp = anthropic_client.messages.create(
        model="claude-3-haiku-20240307",
        max_tokens=1024,  # required by the Anthropic Messages API
        messages=[{"role": "user", "content": prompt}],
    )
    return gpt_resp.choices[0].message.content, claude_resp.content[0].text
```
Trace your own functions:

```python
from kalibr import trace

@trace(operation="query", provider="postgres", model="v1")
def query_user(user_id: str):
    return db.fetch_user(user_id)
```
Parameters: `operation`, `provider`, and `model` label the span emitted for the decorated function.
```python
import kalibr

# Get the current trace ID
current_trace_id = kalibr.get_trace_id()

# Get the parent span ID
parent_span_id = kalibr.get_parent_span_id()

# Generate a new trace ID
new_id = kalibr.new_trace_id()

# Force a specific trace context
with kalibr.trace_context(trace_id="custom-trace-123"):
    client.chat.completions.create(...)
```
| Variable | Default | Description |
|---|---|---|
| KALIBR_AUTO_INSTRUMENT | true | Enable auto-instrumentation |
| KALIBR_SERVICE_NAME | kalibr | Service name in spans |
| CLICKHOUSE_HOST | localhost | ClickHouse host |
| MONGO_URL | mongodb://localhost:27017 | Mongo metadata store |
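These variables can be set in the shell before starting the application; a minimal sketch (the service name and hostnames below are placeholder examples, not defaults):

```shell
# Example Kalibr environment configuration.
# "billing-api", "clickhouse.internal", and "mongo.internal" are placeholders.
export KALIBR_AUTO_INSTRUMENT=true
export KALIBR_SERVICE_NAME=billing-api
export CLICKHOUSE_HOST=clickhouse.internal
export MONGO_URL=mongodb://mongo.internal:27017
```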
Import order matters — always import kalibr before model SDKs.
Check active instrumentation:
```python
from kalibr import get_instrumented_providers

print(get_instrumented_providers())
# Expected: ['openai', 'anthropic', 'google']
```
If spans don't appear:

- Confirm `/tmp/kalibr_otel_spans.jsonl` exists and is non-empty
- Check the OTel bridge container is running: `docker ps | grep otel-bridge`
- Fix file permissions if needed: `sudo chmod 666 /tmp/kalibr_otel_spans.jsonl`
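The first check can be scripted; a small sketch using the span file path from the troubleshooting steps above:

```shell
# Print the most recent exported spans, or a notice if none have been written yet
SPAN_FILE=/tmp/kalibr_otel_spans.jsonl
if [ -s "$SPAN_FILE" ]; then
  tail -n 5 "$SPAN_FILE"
else
  echo "no spans exported yet: $SPAN_FILE is missing or empty"
fi
```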