SDK Integration Guide

Integrate Kalibr into your Python applications for automatic tracing, observability, and cost tracking across LLM SDKs.

Installation

From PyPI:

pip install kalibr

Verify installation:

python -c "import kalibr; print(kalibr.__version__)"

Auto-Instrumentation

Kalibr instruments supported SDKs automatically, as long as it is imported before them:

import kalibr  # must be imported FIRST
from openai import OpenAI
from anthropic import Anthropic
import google.generativeai as genai
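
This import-order requirement is typical of libraries that instrument at import time by wrapping (monkeypatching) SDK client methods before your code grabs references to them. A conceptual sketch of that mechanism; `instrument`, `DummyClient`, and `captured_spans` are illustrative names, not Kalibr internals:

```python
import functools

captured_spans = []  # stand-in for a real span exporter

def instrument(cls, method_name):
    """Replace a method with a wrapper that records a span-like dict per call."""
    original = getattr(cls, method_name)

    @functools.wraps(original)
    def wrapper(self, *args, **kwargs):
        result = original(self, *args, **kwargs)
        captured_spans.append({"operation": method_name, "status": "ok"})
        return result

    setattr(cls, method_name, wrapper)

class DummyClient:
    """Stands in for an SDK client such as OpenAI()."""
    def create(self, prompt):
        return f"echo: {prompt}"

instrument(DummyClient, "create")          # must happen before clients are used
resp = DummyClient().create("Hello")       # traced transparently
```

If a provider SDK is imported and its client constructed before the patch runs, calls bypass the wrapper, which is why kalibr must come first.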

Supported SDKs

Provider    Package               Version      Status
OpenAI      openai                >= 1.0.0     ✅ Full
Anthropic   anthropic             >= 0.18.0    ✅ Full
Google      google-generativeai   >= 0.3.0     ✅ Full

Integration Examples

1. Simple Script

import kalibr
from openai import OpenAI

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello"}]
)

print(response.choices[0].message.content)
print("Trace ID:", kalibr.get_trace_id())

2. FastAPI App

import kalibr
from fastapi import FastAPI
from openai import OpenAI

app = FastAPI()
client = OpenAI()

@app.post("/chat")
async def chat(message: str):
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": message}]
    )
    return {"response": response.choices[0].message.content}

Result: each HTTP request becomes a parent span, each LLM call a child span, and all of them are linked by the same trace ID.
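
This kind of parent/child linkage is conventionally propagated with contextvars, Python's standard mechanism for per-request ambient state in async servers. A simplified sketch of the idea (not Kalibr's actual implementation):

```python
import contextvars
import uuid

# Ambient trace ID, visible to everything running in the same context.
trace_id_var = contextvars.ContextVar("trace_id", default=None)

def start_request():
    """Parent span: assign a fresh trace ID for this HTTP request."""
    tid = uuid.uuid4().hex
    trace_id_var.set(tid)
    return tid

def llm_call():
    """Child span: inherits the trace ID from the ambient context."""
    return trace_id_var.get()

parent_tid = start_request()
child_tid = llm_call()  # same trace ID, no explicit passing required
```

Because each asyncio task gets its own copy of the context, concurrent requests keep distinct trace IDs without any locking.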

3. Multi-Provider Workflow

import kalibr
from openai import OpenAI
from anthropic import Anthropic

def compare_models(prompt: str):
    openai_client = OpenAI()
    anthropic_client = Anthropic()
    
    gpt_resp = openai_client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}]
    )
    
    claude_resp = anthropic_client.messages.create(
        model="claude-3-haiku-20240307",
        max_tokens=1024,
        messages=[{"role": "user", "content": prompt}]
    )
    
    return gpt_resp.choices[0].message.content, claude_resp.content[0].text

Custom Spans

Trace your own functions:

from kalibr import trace

@trace(operation="query", provider="postgres", model="v1")
def query_user(user_id: str):
    return db.fetch_user(user_id)

Parameters:

  • operation: logical name (e.g., "query", "upsert")
  • provider: system name (e.g., "postgres", "redis", "internal-service")
  • model: version or variant identifier (optional)
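
A decorator with this signature can be pictured as a wrapper that runs the function and attaches the three parameters to a recorded span. The `recorded` list below is an illustrative stand-in for Kalibr's real exporter:

```python
import functools

recorded = []  # stand-in for the span exporter

def trace(operation, provider, model=None):
    """Decorator factory: tag each call of the wrapped function with span metadata."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            result = fn(*args, **kwargs)
            recorded.append({
                "operation": operation,
                "provider": provider,
                "model": model,
                "function": fn.__name__,
            })
            return result
        return wrapper
    return decorator

@trace(operation="query", provider="postgres", model="v1")
def query_user(user_id):
    return {"id": user_id}

user = query_user("u-42")
```

The wrapped function's return value passes through unchanged; only the metadata is captured on the side.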

Context Utilities

import kalibr

# Get current trace ID
current_trace_id = kalibr.get_trace_id()

# Get parent span ID
parent_span_id = kalibr.get_parent_span_id()

# Generate new trace ID
new_id = kalibr.new_trace_id()

# Force a specific trace context
with kalibr.trace_context(trace_id="custom-trace-123"):
    client.chat.completions.create(...)
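
A trace_context of this shape is naturally a context manager over a contextvar: set the forced trace ID on entry, restore the previous one on exit. A minimal sketch under that assumption (not Kalibr's internals):

```python
import contextvars
from contextlib import contextmanager

trace_id_var = contextvars.ContextVar("trace_id", default=None)

@contextmanager
def trace_context(trace_id):
    """Force a specific trace ID inside the block, restoring the prior one after."""
    token = trace_id_var.set(trace_id)
    try:
        yield trace_id
    finally:
        trace_id_var.reset(token)

with trace_context("custom-trace-123"):
    inside = trace_id_var.get()   # "custom-trace-123"
outside = trace_id_var.get()      # back to the previous value (None here)
```

Using `ContextVar.reset(token)` rather than re-setting the old value makes nesting safe: each exit restores exactly the state its entry saw.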

Configuration

Variable                 Default                     Description
KALIBR_AUTO_INSTRUMENT   true                        Enable auto-instrumentation
KALIBR_SERVICE_NAME      kalibr                      Service name in spans
CLICKHOUSE_HOST          localhost                   ClickHouse host
MONGO_URL                mongodb://localhost:27017   MongoDB metadata store
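
Configuration of this shape is usually resolved once at startup from the environment, falling back to the defaults above. A sketch of how that resolution might look; `load_config` is an illustrative name, not a Kalibr API:

```python
import os

def load_config(env=None):
    """Resolve settings from environment variables, using the documented defaults."""
    env = os.environ if env is None else env
    return {
        "auto_instrument": env.get("KALIBR_AUTO_INSTRUMENT", "true").lower() == "true",
        "service_name": env.get("KALIBR_SERVICE_NAME", "kalibr"),
        "clickhouse_host": env.get("CLICKHOUSE_HOST", "localhost"),
        "mongo_url": env.get("MONGO_URL", "mongodb://localhost:27017"),
    }

# Only the overridden variable changes; everything else keeps its default.
cfg = load_config(env={"KALIBR_SERVICE_NAME": "billing-api"})
```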

Troubleshooting

Import order matters: always import kalibr before the provider SDKs.

Check active instrumentation:

from kalibr import get_instrumented_providers
print(get_instrumented_providers())
# Expected: ['openai', 'anthropic', 'google']

If spans don't appear:

  • Confirm /tmp/kalibr_otel_spans.jsonl exists and is non-empty
  • Check docker ps | grep otel-bridge
  • Fix permissions if needed: sudo chmod 666 /tmp/kalibr_otel_spans.jsonl
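
The exists-and-non-empty check can be scripted rather than done by hand. A small helper, demonstrated against a throwaway file instead of the real /tmp/kalibr_otel_spans.jsonl; the function name is illustrative:

```python
import os
import tempfile

def spans_file_ok(path):
    """Return True if the span file exists and is non-empty."""
    return os.path.isfile(path) and os.path.getsize(path) > 0

# Demo against a temporary file so the check is reproducible anywhere.
with tempfile.NamedTemporaryFile(suffix=".jsonl", delete=False) as f:
    f.write(b'{"span": "demo"}\n')

ok = spans_file_ok(f.name)                 # file exists with content
missing = spans_file_ok(f.name + ".absent")  # no such file
os.unlink(f.name)
```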