🔮 LLM Router

Intelligent model routing for LLM applications

v0.5.0 - Multi-provider

The Problem

LLM applications need different models for different tasks. Providers have varying strengths, costs differ wildly, and failures happen. LLM Router solves this with automatic routing and fallback.

Your App → LLM Router → Best Model for Task
                              ↓
        OpenRouter / OpenAI / Anthropic / Google / Ollama

Features

🔀 Multi-provider

OpenRouter, OpenAI, Anthropic, Google AI, and local Ollama in one fallback chain.

🎯 Smart Routing

Auto-detect task type: code, tools, reasoning, conversation.

⚡ Circuit Breaker

Auto-disable failing models, auto-retry after recovery.

💰 Cost Tracking

Real-time USD cost estimation per request and total.

โš™๏ธ Customizable

Define your own categories and model chains via API.

🔌 Drop-in Compatible

OpenAI-compatible API. Just change the base URL.

Providers

OpenRouter · OpenAI · Anthropic · Google AI · Ollama (local)

Mix providers in your fallback chain. If OpenAI fails → try Anthropic → fall back to local Ollama.
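The fallback-plus-circuit-breaker behaviour can be pictured with a short sketch. This is illustrative only: the model names, failure threshold, cooldown, and call interface are assumptions for the example, not the service's actual code.

```python
import time

# Assumed example chain; the real chain is whatever you configure.
CHAIN = ["openai/gpt-4o-mini", "anthropic/claude", "ollama/llama3"]
FAILURE_LIMIT = 3        # open the circuit after this many consecutive failures
COOLDOWN_SECONDS = 60    # retry a tripped model after this long

failures = {m: 0 for m in CHAIN}
tripped_at = {}

def available(model, now=None):
    """A model is available unless its circuit is open and still cooling down."""
    now = time.time() if now is None else now
    if failures[model] < FAILURE_LIMIT:
        return True
    return now - tripped_at[model] >= COOLDOWN_SECONDS

def complete(request, call):
    """Try each model in the chain until one succeeds."""
    for model in CHAIN:
        if not available(model):
            continue
        try:
            result = call(model, request)
            failures[model] = 0          # success closes the circuit
            return result
        except Exception:
            failures[model] += 1
            if failures[model] >= FAILURE_LIMIT:
                tripped_at[model] = time.time()
    raise RuntimeError("all models in the chain are unavailable")
```

The point of the cooldown is that a tripped model is skipped entirely (no wasted request latency) until its recovery window elapses, at which point one probe request is allowed through.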

Quick Start

# Install & Run
git clone https://github.com/AkashaBot/llm-router.git
cd llm-router/service
pip install -r requirements.txt
export OPENROUTER_API_KEY=sk-or-v1-...
uvicorn main:app --host 0.0.0.0 --port 3456

# Use
curl http://localhost:3456/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model":"router","messages":[{"role":"user","content":"Hello"}]}'
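The same request can be made from Python with nothing but the standard library. A minimal sketch, assuming only the endpoint and payload shown in the curl example above:

```python
import json
import urllib.request

def build_request(prompt, base_url="http://localhost:3456"):
    """Build an OpenAI-style chat completion request for the router."""
    payload = {"model": "router",
               "messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def chat(prompt):
    """Send the request; requires the router from the Quick Start to be running."""
    with urllib.request.urlopen(build_request(prompt)) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]
```

Because the API is OpenAI-compatible, any OpenAI client SDK pointed at this base URL should work the same way.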

Routing Logic

Category      Detection                          Default Models
tools         Request has tools array            kimi-k2.5 → glm-5 → gpt-4o-mini
code          Keywords: python, function...      glm-5 → gpt-4o-mini → kimi-k2.5
reasoning     Keywords: why, how, explain...     kimi-k2.5 → glm-5 → gpt-4o-mini
conversation  Short messages, greetings          glm-5 → gpt-4o-mini
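The detection rules in the table above can be sketched as follows. This is an illustration of the routing idea only; the service's real keyword lists, tie-breaking, and ordering may differ.

```python
# Keyword sets are assumed examples, not the router's actual lists.
CODE_KEYWORDS = {"python", "function", "class", "bug"}
REASONING_KEYWORDS = {"why", "how", "explain"}

def detect_category(request):
    """Map an OpenAI-style chat request to one of the routing categories."""
    if request.get("tools"):                      # tools array present
        return "tools"
    words = set()
    for message in request.get("messages", []):
        words |= set(str(message.get("content", "")).lower().split())
    if words & CODE_KEYWORDS:
        return "code"
    if words & REASONING_KEYWORDS:
        return "reasoning"
    return "conversation"                         # short messages, greetings
```

Checking the tools array first matters: a request that mentions "python" but carries a tools array still needs a tool-calling-capable chain.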

API Endpoints

Endpoint                          Description
POST /v1/chat/completions         OpenAI-compatible chat
GET  /health                      Health check
GET  /metrics                     Usage, cost, circuit breaker
GET  /providers                   Configured providers
POST /config/category             Add custom category
POST /circuit-breaker/reset-all   Reset all circuits
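A custom category registered via POST /config/category might carry a payload like the one below. The field names and shape here are a guess for illustration only; check the service's own schema before relying on them.

```json
{
  "name": "translation",
  "keywords": ["translate", "translation"],
  "models": ["glm-5", "gpt-4o-mini"]
}
```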

For OpenClaw Agents

Configure as a custom provider in ~/.openclaw/openclaw.json:

{
  "models": {
    "providers": {
      "router": {
        "baseUrl": "http://localhost:3456",
        "api": "openai-completions",
        "models": [{"id": "router"}]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {"primary": "router/router"}
    }
  }
}

Agents automatically get resilient routing with fallback chains.

View on GitHub · OpenClaw Guide