Requirements
- Target platform: OpenClaw
- Install method: Manual import
- Extraction: Extract archive
- Prerequisites: OpenClaw
- Primary doc: SKILL.md
Cost-optimizing model router for OpenClaw. Automatically routes each request to the cheapest capable Claude model (Haiku/Sonnet/Opus) using weighted scoring.
Hand the extracted package to your coding agent with a concrete install brief instead of figuring it out manually.

Fresh install:

> I downloaded a skill package from Yavira. Read SKILL.md from the extracted folder and install it by following the included instructions. Then review README.md for any prerequisites, environment setup, or post-install checks. Tell me what you changed and call out any manual steps you could not complete.

Upgrade:

> I downloaded an updated skill package from Yavira. Read SKILL.md from the extracted folder, compare it with my current installation, and upgrade it while preserving any custom configuration unless the package docs explicitly say otherwise. Then review README.md for any prerequisites, environment setup, or post-install checks. Summarize what changed and any follow-up checks I should run.
A zero-dependency proxy that sits between OpenClaw and the Anthropic API, routing each request to the cheapest capable model using a 14-dimension weighted scorer (<1ms overhead).
Run the install script to set up everything automatically:

```bash
bash "$(dirname "$0")/scripts/install.sh"
```

This will:
- Copy server.js and config.json to ~/.openclaw/workspace/router/
- Create and start a systemd service (iblai-router) on port 8402
- Register iblai-router/auto as an OpenClaw model provider

After install, iblai-router/auto is available anywhere OpenClaw accepts a model ID.
Verify the service is up and inspect routing stats:

```bash
curl -s http://127.0.0.1:8402/health | jq .
curl -s http://127.0.0.1:8402/stats | jq .
```
Set iblai-router/auto as the model for any scope:

| Scope | How |
| --- | --- |
| Cron job | Set model to `iblai-router/auto` in job config |
| Subagents | `agents.defaults.subagents.model = "iblai-router/auto"` |
| Per-session | `/model iblai-router/auto` |
| All sessions | `agents.defaults.model.primary = "iblai-router/auto"` |

Tip: Keep the main interactive session on a fixed model (e.g. Opus). Use the router for cron jobs, subagents, and background tasks where cost savings compound.
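For example, the all-sessions and subagent settings above might look like this in an OpenClaw config file (the exact file location and surrounding structure are assumptions; the keys come from the scope settings above):

```json
{
  "agents": {
    "defaults": {
      "model": { "primary": "iblai-router/auto" },
      "subagents": { "model": "iblai-router/auto" }
    }
  }
}
```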
All config lives in ~/.openclaw/workspace/router/config.json and hot-reloads on save; no restart needed.
Change the models per tier:

```json
{
  "models": {
    "LIGHT": "claude-3-5-haiku-20241022",
    "MEDIUM": "claude-sonnet-4-20250514",
    "HEAVY": "claude-opus-4-20250514"
  }
}
```
Set apiBaseUrl to route through OpenRouter:

```json
{
  "models": {
    "LIGHT": "openai/gpt-4.1-mini",
    "MEDIUM": "openai/gpt-4.1",
    "HEAVY": "openai/o3"
  },
  "apiBaseUrl": "https://openrouter.ai/api/v1"
}
```

Update the API key in the systemd service when switching providers, then `systemctl daemon-reload && systemctl restart iblai-router`.
Keyword lists control which tier handles a request:
- simpleKeywords, relayKeywords → push toward LIGHT (cheap)
- imperativeVerbs, codeKeywords, agenticKeywords → push toward MEDIUM
- technicalKeywords, reasoningKeywords, domainKeywords → push toward HEAVY (capable)

Tune boundaries and weights in config.json to match your workload. See the full README for details.
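A weighted keyword scorer of this shape could map a prompt to a tier as follows. This is an illustrative sketch, not the router's server.js: the keyword lists, weights, and boundary values here are invented for the example, and the real config.json defines many more dimensions.

```javascript
// Hypothetical scorer: negative weights pull toward LIGHT,
// positive weights push toward HEAVY. All values are illustrative.
const config = {
  simpleKeywords:    { words: ["hello", "thanks", "summarize"],       weight: -2 },
  codeKeywords:      { words: ["function", "refactor", "bug"],        weight: 1 },
  reasoningKeywords: { words: ["prove", "architecture", "tradeoff"],  weight: 3 },
  boundaries: { light: 0, heavy: 4 }, // tier cutoffs on the total score
};

function scorePrompt(prompt) {
  const text = prompt.toLowerCase();
  let score = 0;
  for (const { words, weight } of [
    config.simpleKeywords,
    config.codeKeywords,
    config.reasoningKeywords,
  ]) {
    for (const w of words) if (text.includes(w)) score += weight;
  }
  return score;
}

function pickTier(prompt) {
  const s = scorePrompt(prompt);
  if (s <= config.boundaries.light) return "LIGHT";
  if (s >= config.boundaries.heavy) return "HEAVY";
  return "MEDIUM";
}

// pickTier("thanks, summarize this")          -> "LIGHT"
// pickTier("refactor this function")          -> "MEDIUM"
// pickTier("prove this architecture tradeoff") -> "HEAVY"
```

Widening the gap between the `light` and `heavy` boundaries sends more traffic to the MEDIUM tier; raising keyword weights makes individual hits more decisive.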
```bash
bash "$(dirname "$0")/scripts/uninstall.sh"
```

Stops the service, removes the systemd unit, and deletes router files. Reminder: switch any workloads using iblai-router/auto back to a direct model first.