Intelligent model routing for LLM applications
v0.5.0 · Multi-provider

LLM applications need different models for different tasks. Providers have varying strengths, costs differ wildly, and failures happen. LLM Router solves this with automatic routing and fallback.
```
Your App → LLM Router → Best Model for Task
                ↓
OpenRouter / OpenAI / Anthropic / Google / Ollama
```
- OpenRouter, OpenAI, Anthropic, Google AI, and local Ollama in one fallback chain.
- Auto-detects the task type: code, tools, reasoning, conversation.
- Auto-disables failing models and retries them after recovery.
- Real-time USD cost estimation per request and in total.
- Define your own categories and model chains via the API.
- OpenAI-compatible API: just change the base URL.
Mix providers in your fallback chain. If OpenAI fails → try Anthropic → fall back to local Ollama.
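The fallback idea combined with the circuit breaker can be sketched in a few lines of Python. This is a hypothetical illustration, not the router's actual code; `call`, `failures`, and the failure threshold are made-up names for this sketch:

```python
# Hypothetical sketch: walk the fallback chain, skipping providers whose
# circuit is open (too many recent failures).
def complete(prompt, chain, call, failures, threshold=3):
    for provider in chain:
        if failures.get(provider, 0) >= threshold:
            continue  # circuit open: skip a provider that keeps failing
        try:
            return call(provider, prompt)
        except Exception:
            failures[provider] = failures.get(provider, 0) + 1  # record failure
    raise RuntimeError("all providers in the chain failed")

# Simulate an OpenAI outage: the request falls through to Anthropic.
failures = {}
def fake_call(provider, prompt):
    if provider == "openai":
        raise RuntimeError("simulated outage")
    return f"{provider} answered: {prompt}"

print(complete("Hello", ["openai", "anthropic", "ollama"], fake_call, failures))
```

In the real service, the circuit also closes again after a recovery window, which is what "auto-retry after recovery" refers to.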
```shell
# Install & Run
git clone https://github.com/AkashaBot/llm-router.git
cd llm-router/service
pip install -r requirements.txt
export OPENROUTER_API_KEY=sk-or-v1-...
uvicorn main:app --host 0.0.0.0 --port 3456
```
```shell
# Use
curl http://localhost:3456/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model":"router","messages":[{"role":"user","content":"Hello"}]}'
```
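Because the endpoint is OpenAI-compatible, the same request can be made from Python with only the standard library. A minimal sketch; the port assumes the uvicorn command above:

```python
import json
import urllib.request

def build_payload(content):
    # The "router" model name tells the service to pick a model itself.
    return {"model": "router",
            "messages": [{"role": "user", "content": content}]}

def chat(content, url="http://localhost:3456/v1/chat/completions"):
    req = urllib.request.Request(
        url,
        data=json.dumps(build_payload(content)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    return body["choices"][0]["message"]["content"]
```

Any OpenAI SDK works the same way: point its base URL at `http://localhost:3456/v1` and use `router` as the model name.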
| Category | Detection | Default Models |
|---|---|---|
| tools | Request has `tools` array | kimi-k2.5 → glm-5 → gpt-4o-mini |
| code | Keywords: python, function... | glm-5 → gpt-4o-mini → kimi-k2.5 |
| reasoning | Keywords: why, how, explain... | kimi-k2.5 → glm-5 → gpt-4o-mini |
| conversation | Short messages, greetings | glm-5 → gpt-4o-mini |
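The detection column above reads as a priority check: tools first, then keyword matching, with conversation as the fallback. A hypothetical sketch of that logic; the keyword sets and ordering here are illustrative, not the service's exact heuristics:

```python
# Illustrative keyword sets; the router's real lists may differ.
CODE_KEYWORDS = {"python", "function", "class", "bug"}
REASONING_KEYWORDS = {"why", "how", "explain"}

def detect_category(request):
    # A request carrying a tools array is routed as a tool call first.
    if request.get("tools"):
        return "tools"
    words = set()
    for message in request.get("messages", []):
        words |= set(message["content"].lower().split())
    if words & CODE_KEYWORDS:
        return "code"
    if words & REASONING_KEYWORDS:
        return "reasoning"
    return "conversation"  # short messages, greetings, everything else
```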
| Endpoint | Description |
|---|---|
| POST /v1/chat/completions | OpenAI-compatible chat |
| GET /health | Health check |
| GET /metrics | Usage, cost, circuit-breaker state |
| GET /providers | Configured providers |
| POST /config/category | Add a custom category |
| POST /circuit-breaker/reset-all | Reset all circuits |
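Adding a custom category is a single POST to `/config/category`. The payload field names below (`name`, `keywords`, `models`) are assumptions for illustration; check the service's actual schema before relying on them:

```python
import json
import urllib.request

def category_payload(name, keywords, models):
    # Assumed field names; verify against the real /config/category schema.
    return {"name": name, "keywords": keywords, "models": models}

def add_category(name, keywords, models,
                 url="http://localhost:3456/config/category"):
    req = urllib.request.Request(
        url,
        data=json.dumps(category_payload(name, keywords, models)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```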
Configure the router as a custom provider in `~/.openclaw/openclaw.json`:
```json
{
  "models": {
    "providers": {
      "router": {
        "baseUrl": "http://localhost:3456",
        "api": "openai-completions",
        "models": [{"id": "router"}]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {"primary": "router/router"}
    }
  }
}
```
Agents automatically get resilient routing with fallback chains.