Path 1

Developer integration

You own the architecture. You define the goals, the paths, and the success contracts. Kalibr observes outcomes and improves routing over time. Get to working routing in about five minutes.

Using a coding agent? If Claude Code, Codex, or Cursor is writing the integration, give it this one prompt and your credentials. It reads the full setup reference and handles everything.

Read https://kalibr.systems/llms.txt and integrate Kalibr into this project.

Then skip to the Verify step.

1. Create an API key

Your Tenant ID is assigned when you create an account. You need to create an API key separately in Settings > API Keys. The key is only shown once on creation. Copy it before closing the dialog.

KALIBR_API_KEY
sk_... (create in Settings > API Keys)
KALIBR_TENANT_ID
user_... (from your account, shown in Settings)

Python

2. Install the SDK

terminal
pip install kalibr openai anthropic

3. Set credentials

.env
KALIBR_API_KEY=sk_...
KALIBR_TENANT_ID=user_...
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...

4. Replace your LLM call

import kalibr must be the first import in the file where your LLM client is created. This is how the SDK patches the client at import time.

agent.py
import kalibr            # must be first
from kalibr import Router

router = Router(
    goal="extract_company",
    paths=["gpt-4o-mini", "claude-haiku-3-5"],
    success_when=lambda out: "company" in out.lower()
)
response = router.completion(
    messages=[{"role": "user", "content": "Extract company: Hi from Stripe."}]
)
router.report(success=True)
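The success_when predicate is an ordinary Python callable, so a contract can be as strict as the task demands. A sketch of a tighter contract that requires parseable JSON with a non-empty company field (the extraction_succeeded helper is illustrative, not part of the Kalibr API):

```python
import json

def extraction_succeeded(out: str) -> bool:
    """Stricter success contract: output must be JSON with a non-empty 'company'."""
    try:
        data = json.loads(out)
    except (json.JSONDecodeError, TypeError):
        return False
    # Guard against valid-but-non-object JSON like "123" or "[]".
    return isinstance(data, dict) and bool(data.get("company"))
```

The predicate would plug in as Router(..., success_when=extraction_succeeded), replacing the substring check shown above.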
TypeScript

2. Install the SDK

terminal
npm install @kalibr/sdk openai

3. Set credentials

.env
KALIBR_API_KEY=sk_...
KALIBR_TENANT_ID=user_...
OPENAI_API_KEY=sk-...

4. Replace your LLM call

agent.ts
import { Router } from "@kalibr/sdk";

const router = new Router({
  goal: "extract_company",
  paths: ["gpt-4o-mini", "claude-haiku-3-5"],
  successWhen: (out) => out.toLowerCase().includes("company"),
});
const res = await router.completion([{ role: "user", content: "Extract company: Hi from Stripe." }]);
await router.report(true);
LangChain

2. Install the SDK

terminal
pip install kalibr[langchain] langchain langchain-openai langchain-anthropic

3. Set credentials

.env
KALIBR_API_KEY=sk_...
KALIBR_TENANT_ID=user_...
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...

4. Swap in the Kalibr LLM

router.as_langchain() returns a drop-in LangChain LLM. Use it anywhere you would use ChatOpenAI or ChatAnthropic.

chain.py
from kalibr import Router
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

router = Router(goal="summarize_text", paths=["gpt-4o-mini", "claude-haiku-3-5"])
chain = (
    ChatPromptTemplate.from_template("Summarize: {text}")
    | router.as_langchain()    # drop-in LangChain LLM
    | StrOutputParser()
)
result = chain.invoke({"text": "Your document..."})
router.report(success=len(result) > 50)
CrewAI

2. Install the SDK

terminal
pip install kalibr[crewai] crewai openai anthropic

Requires Python 3.10 or higher.

3. Set credentials

.env
KALIBR_API_KEY=sk_...
KALIBR_TENANT_ID=user_...
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...

4. Pass router.as_langchain() as your agent LLM

That is the only change required.

crew.py
from kalibr import Router
from crewai import Agent, Task, Crew

router = Router(goal="research_task", paths=["gpt-4o-mini", "claude-sonnet-4-5"])
researcher = Agent(
    role="Researcher", goal="Find accurate information",
    backstory="Research assistant.",
    llm=router.as_langchain(),    # only change required
)
task   = Task(description="...", expected_output="...", agent=researcher)
result = Crew(agents=[researcher], tasks=[task]).kickoff()
router.report(success=len(str(result)) > 100)
OpenAI Agents SDK

2. Install the SDK

terminal
pip install kalibr[openai-agents] openai-agents

3. Set credentials

.env
KALIBR_API_KEY=sk_...
KALIBR_TENANT_ID=user_...
OPENAI_API_KEY=sk-...

4. Ask Kalibr which model to use

get_policy() returns the current recommended model for a goal. Pass it directly to your Agent.

agent.py
from kalibr import Router, get_policy
from agents import Agent, Runner

router = Router(goal="agent_task", paths=["gpt-4o", "gpt-4o-mini"])
policy = get_policy(goal="agent_task")
agent  = Agent(name="Assistant", instructions="...", model=policy["recommended_model"])
result = Runner.run_sync(agent, "Your task")
router.report(success=len(result.final_output) > 0)
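get_policy() returns a dict, so a defensive pattern is to fall back to a configured path when no recommendation exists yet, for instance before the first traces arrive. The pick_model helper below is an illustrative sketch, not part of the Kalibr API:

```python
def pick_model(policy, paths):
    """Prefer the policy's recommended model; fall back to the first configured path.

    Assumes the policy dict shape shown above ({"recommended_model": ...});
    an empty or missing policy yields the first path.
    """
    model = (policy or {}).get("recommended_model")
    return model if model in paths else paths[0]
```

Usage: model=pick_model(get_policy(goal="agent_task"), ["gpt-4o", "gpt-4o-mini"]) in place of policy["recommended_model"], which raises KeyError when no recommendation exists.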
HuggingFace

2. Install the SDK

terminal
pip install kalibr huggingface_hub

3. Set credentials

.env
KALIBR_API_KEY=sk_...
KALIBR_TENANT_ID=user_...
HF_API_TOKEN=hf_...  # optional, avoids free-tier rate limits

4. Route between HuggingFace models

All 17 HuggingFace task types are supported. Manual instrumentation is required: HuggingFace is not patched automatically by import kalibr. Use router.completion() directly, or call kalibr.auto_instrument(["huggingface"]) explicitly.

agent.py
import kalibr
kalibr.auto_instrument(["huggingface"])  # HF not auto-patched — must be explicit
from kalibr import Router

router = Router(
    goal="transcribe_calls",
    paths=["openai/whisper-large-v3", "facebook/seamless-m4t-v2-large"],
    success_when=lambda out: len(out) > 50
)
result = router.execute(task="automatic_speech_recognition", input_data=audio_bytes)
router.report(success=True)
DeepSeek

2. Install the SDK

terminal
pip install kalibr openai  # DeepSeek uses the OpenAI SDK

3. Set credentials

.env
KALIBR_API_KEY=sk_...
KALIBR_TENANT_ID=user_...
DEEPSEEK_API_KEY=sk-...  # from platform.deepseek.com

4. Route between DeepSeek models

Use deepseek-chat and deepseek-reasoner as path names. Mix them with other providers freely.

agent.py
import kalibr
from kalibr import Router

router = Router(
    goal="classify_icp",
    paths=["gpt-4o-mini", "deepseek-chat", "claude-haiku-3-5"]
)
response = router.completion(messages=[{"role": "user", "content": "..."}])
router.report(success=True)
Voice

2. Install the SDK

terminal
pip install kalibr[voice]   # ElevenLabs + Deepgram + OpenAI Audio

3. Set credentials

.env
KALIBR_API_KEY=sk_...
KALIBR_TENANT_ID=user_...
OPENAI_API_KEY=sk-...
ELEVENLABS_API_KEY=...
DEEPGRAM_API_KEY=...

4. Route TTS and STT

voice_agent.py
import kalibr            # patches ElevenLabs + Deepgram
from kalibr import Router

tts = Router(goal="narrate", paths=["eleven_multilingual_v2", "tts-1"])
audio = tts.synthesize("Hello from Kalibr.", voice="alloy").audio

stt = Router(goal="transcribe_calls", paths=["whisper-1", "nova-2"])
transcript = stt.transcribe(audio_bytes, audio_duration_seconds=30.0).text

5. Verify

Run this after your first execution. It confirms the SDK is installed, credentials are valid, and traces are arriving.

terminal
kalibr verify

The first successful trace activates Thompson Sampling. Routing improves with every run from that point.
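Thompson Sampling treats each path as a bandit arm: it keeps a Beta posterior over each path's success rate, samples from every posterior, and routes to the highest draw, so paths that keep succeeding win more traffic. A minimal self-contained sketch of the mechanic (pure Python, independent of the Kalibr SDK; the simulated success rates are illustrative):

```python
import random

random.seed(0)  # deterministic for the illustration

class ThompsonRouter:
    """Beta-Bernoulli Thompson Sampling over a set of paths (illustrative sketch)."""

    def __init__(self, paths):
        # Beta(1, 1) prior: one pseudo-success and one pseudo-failure per path.
        self.posteriors = {p: [1, 1] for p in paths}

    def choose(self):
        # Sample a plausible success rate for each path; route to the highest draw.
        draws = {p: random.betavariate(a, b) for p, (a, b) in self.posteriors.items()}
        return max(draws, key=draws.get)

    def report(self, path, success):
        # Each reported outcome sharpens that path's posterior.
        a, b = self.posteriors[path]
        self.posteriors[path] = [a + 1, b] if success else [a, b + 1]

router = ThompsonRouter(["gpt-4o-mini", "claude-haiku-3-5"])
for _ in range(200):
    path = router.choose()
    # Simulated outcomes: one path succeeds 90% of the time, the other 40%.
    success = random.random() < (0.9 if path == "gpt-4o-mini" else 0.4)
    router.report(path, success)
```

After a couple hundred simulated runs, nearly all traffic flows to the stronger path, which is the behavior the reported successes and failures drive in production.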

Next

Path 2: Agent as orchestrator

How routing works