You own the architecture. You define the goals, the paths, and the success contracts. Kalibr observes outcomes and improves routing over time. Get to working routing in about five minutes.
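A success contract is just a predicate over raw model output: it returns True or False, and that boolean is all the router learns from. A hedged sketch of two contracts you might write (illustrative helper names, not part of the SDK): a loose keyword check like the one in the quickstart below, and a stricter structured-output check.

```python
import json

# Loose contract: the output merely mentions the field we asked for.
def contains_company(out: str) -> bool:
    return "company" in out.lower()

# Strict contract: the output must be valid JSON with a non-empty "company".
def valid_json_with_company(out: str) -> bool:
    try:
        return bool(json.loads(out).get("company"))
    except (json.JSONDecodeError, AttributeError):
        return False

contains_company("Company: Stripe")               # loose check passes
valid_json_with_company('{"company": "Stripe"}')  # strict check passes
valid_json_with_company("not json")               # strict check fails safely
```

How strict to make the contract is a design choice: a loose check reports more successes but teaches the router less about real quality.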
Your Tenant ID is assigned when you create an account. Create an API key separately under Settings > API Keys. The key is shown only once, at creation; copy it before closing the dialog.
Python
pip install kalibr openai anthropic
KALIBR_API_KEY=sk_... KALIBR_TENANT_ID=user_... OPENAI_API_KEY=sk-... ANTHROPIC_API_KEY=sk-ant-...
import kalibr must be the first import in the file where your LLM client is created. This is how the SDK patches the client at import time.
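The import-order requirement follows from the generic monkey-patching pattern: whatever the patch wraps must be wrapped before any client instance is created. A plain-Python sketch of that pattern, using a stand-in FakeClient rather than a real provider SDK (this is an illustration of the technique, not Kalibr's actual code):

```python
# Illustrative only: FakeClient stands in for a provider SDK client.
calls = []

class FakeClient:
    def completion(self, prompt):
        return f"echo: {prompt}"

def _instrument(cls):
    original = cls.completion
    def wrapped(self, prompt):
        calls.append(prompt)           # record the call for later analysis
        return original(self, prompt)  # then delegate to the untouched method
    cls.completion = wrapped

# In a real SDK this runs as a side effect of the import itself, which is
# why import order matters: clients created before patching are not wrapped.
_instrument(FakeClient)

client = FakeClient()
result = client.completion("hi")  # behaves normally, but the call is recorded
```

Because the patch replaces the method on the class, every client created afterwards is observed with no further code changes.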
import kalibr # must be first
from kalibr import Router
router = Router(
goal="extract_company",
paths=["gpt-4o-mini", "claude-haiku-3-5"],
success_when=lambda out: "company" in out.lower()
)
response = router.completion(
messages=[{"role": "user", "content": "Extract company: Hi from Stripe."}]
)
router.report(success=True)

Node.js (TypeScript)
npm install @kalibr/sdk openai
KALIBR_API_KEY=sk_... KALIBR_TENANT_ID=user_... OPENAI_API_KEY=sk-...
import { Router } from "@kalibr/sdk";
const router = new Router({
goal: "extract_company",
paths: ["gpt-4o-mini", "claude-haiku-3-5"],
successWhen: (out) => out.toLowerCase().includes("company"),
});
const res = await router.completion([{ role: "user", content: "Extract company: Hi from Stripe." }]);
await router.report(true);

LangChain
pip install kalibr[langchain] langchain langchain-openai langchain-anthropic
KALIBR_API_KEY=sk_... KALIBR_TENANT_ID=user_... OPENAI_API_KEY=sk-... ANTHROPIC_API_KEY=sk-ant-...
router.as_langchain() returns a drop-in LangChain LLM. Use it anywhere you would use ChatOpenAI or ChatAnthropic.
from kalibr import Router
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
router = Router(goal="summarize_text", paths=["gpt-4o-mini", "claude-haiku-3-5"])
chain = (
ChatPromptTemplate.from_template("Summarize: {text}")
| router.as_langchain() # drop-in LangChain LLM
| StrOutputParser()
)
result = chain.invoke({"text": "Your document..."})
router.report(success=len(result) > 50)

CrewAI
pip install kalibr[crewai] crewai openai anthropic
Requires Python 3.10 or higher.
KALIBR_API_KEY=sk_... KALIBR_TENANT_ID=user_... OPENAI_API_KEY=sk-... ANTHROPIC_API_KEY=sk-ant-...
Pass router.as_langchain() as the agent's llm. That is the only change required.
from kalibr import Router
from crewai import Agent, Task, Crew
router = Router(goal="research_task", paths=["gpt-4o-mini", "claude-sonnet-4-5"])
researcher = Agent(
role="Researcher", goal="Find accurate information",
backstory="Research assistant.",
llm=router.as_langchain(), # only change required
)
task = Task(description="...", expected_output="...", agent=researcher)
result = Crew(agents=[researcher], tasks=[task]).kickoff()
router.report(success=len(str(result)) > 100)

OpenAI Agents SDK
pip install kalibr[openai-agents] openai-agents
KALIBR_API_KEY=sk_... KALIBR_TENANT_ID=user_... OPENAI_API_KEY=sk-...
get_policy() returns the current recommended model for a goal. Pass it directly to your Agent.
from kalibr import Router, get_policy
from agents import Agent, Runner
router = Router(goal="agent_task", paths=["gpt-4o", "gpt-4o-mini"])
policy = get_policy(goal="agent_task")
agent = Agent(name="Assistant", instructions="...", model=policy["recommended_model"])
result = Runner.run_sync(agent, "Your task")
router.report(success=len(result.final_output) > 0)

Hugging Face
pip install kalibr huggingface_hub
KALIBR_API_KEY=sk_... KALIBR_TENANT_ID=user_... HF_API_TOKEN=hf_... # optional, avoids free-tier rate limits
All 17 HuggingFace task types are supported. Manual instrumentation is required: HuggingFace is not patched automatically by import kalibr. Use router.completion() directly or call kalibr.auto_instrument(["huggingface"]) explicitly.
import kalibr
kalibr.auto_instrument(["huggingface"]) # HF not auto-patched — must be explicit
from kalibr import Router
router = Router(
goal="transcribe_calls",
paths=["openai/whisper-large-v3", "facebook/seamless-m4t-v2-large"],
success_when=lambda out: len(out) > 50
)
result = router.execute(task="automatic_speech_recognition", input_data=audio_bytes)
router.report(success=True)

DeepSeek
pip install kalibr openai # DeepSeek uses the OpenAI SDK
KALIBR_API_KEY=sk_... KALIBR_TENANT_ID=user_... DEEPSEEK_API_KEY=sk-... # from platform.deepseek.com
Use deepseek-chat and deepseek-reasoner as path names. Mix them with other providers freely.
import kalibr
from kalibr import Router
router = Router(
goal="classify_icp",
paths=["gpt-4o-mini", "deepseek-chat", "claude-haiku-3-5"]
)
response = router.completion(messages=[{"role": "user", "content": "..."}])
router.report(success=True)

Voice (ElevenLabs, Deepgram, OpenAI Audio)
pip install kalibr[voice] # ElevenLabs + Deepgram + OpenAI Audio
KALIBR_API_KEY=sk_... KALIBR_TENANT_ID=user_... OPENAI_API_KEY=sk-... ELEVENLABS_API_KEY=... DEEPGRAM_API_KEY=...
import kalibr # patches ElevenLabs + Deepgram
from kalibr import Router
tts = Router(goal="narrate", paths=["eleven_multilingual_v2", "tts-1"])
audio = tts.synthesize("Hello from Kalibr.", voice="alloy").audio
stt = Router(goal="transcribe_calls", paths=["whisper-1", "nova-2"])
transcript = stt.transcribe(audio_bytes, audio_duration_seconds=30.0).text

Verify your setup
Run this after your first execution. It confirms the SDK is installed, credentials are valid, and traces are arriving.
kalibr verify
The first successful trace activates Thompson Sampling. Routing improves with every run from that point.
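Thompson Sampling, in its textbook Beta-Bernoulli form, keeps a Beta posterior over each path's success rate, draws one sample per posterior on every request, and routes to the highest sample; each reported outcome then updates the chosen path's posterior. A self-contained sketch of that standard algorithm with simulated outcomes (an illustration of the technique, not Kalibr's implementation):

```python
import random

# Beta(1, 1) prior per path: first entry counts successes, second failures.
posteriors = {"gpt-4o-mini": [1, 1], "claude-haiku-3-5": [1, 1]}

def choose_path() -> str:
    # Sample a plausible success rate from each posterior; route to the best.
    samples = {p: random.betavariate(a, b) for p, (a, b) in posteriors.items()}
    return max(samples, key=samples.get)

def report(path: str, success: bool) -> None:
    posteriors[path][0 if success else 1] += 1

random.seed(0)
# Simulate outcomes: one path succeeds 90% of the time, the other 40%.
true_rates = {"gpt-4o-mini": 0.9, "claude-haiku-3-5": 0.4}
for _ in range(500):
    path = choose_path()
    report(path, random.random() < true_rates[path])

# Trial counts per path: the stronger path accumulates most of the traffic,
# while the weaker one still gets occasional exploratory calls.
picks = {p: a + b - 2 for p, (a, b) in posteriors.items()}
```

The fixed seed makes this simulation deterministic; in production the posterior updates come from real report() outcomes rather than simulated ones, which is why the first trace is what activates routing.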