llm-supervisor-agent
Stars: 1,933 · Forks: 367 · Updated: March 4, 2026
SKILL.md
LLM Supervisor
Automatically switches OpenClaw between cloud and local Ollama models when rate limits occur.
Features
- Detects rate-limit / overload errors from Anthropic/OpenAI
- Switches to a local Ollama fallback model
- Requires explicit confirmation before local code generation
- Supports manual override commands
Commands
- /llm status
- /llm switch cloud
- /llm switch local
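The supervisor's core behavior, detecting a rate-limit or overload error and switching the active backend, can be sketched as follows. This is a minimal illustration; the class name, method names, and error markers are assumptions, not the skill's actual API.

```python
class LLMSupervisor:
    """Hypothetical sketch: tracks the active backend and falls back
    to a local Ollama model when a cloud provider reports overload."""

    # Substrings commonly seen in Anthropic/OpenAI rate-limit errors
    # (illustrative, not an exhaustive or official list).
    RATE_LIMIT_MARKERS = ("rate_limit", "overloaded", "429", "529")

    def __init__(self, cloud_model="cloud-default", local_model="qwen2.5:7b"):
        self.cloud_model = cloud_model
        self.local_model = local_model
        self.active = "cloud"

    def status(self):
        # Mirrors what "/llm status" might report.
        model = self.cloud_model if self.active == "cloud" else self.local_model
        return f"active={self.active} model={model}"

    def switch(self, target):
        # Mirrors "/llm switch cloud" / "/llm switch local".
        if target not in ("cloud", "local"):
            raise ValueError(f"unknown target: {target}")
        self.active = target

    def handle_error(self, message):
        # On a rate-limit/overload error, fall back to the local model.
        # Returns True if a switch happened.
        if any(marker in message.lower() for marker in self.RATE_LIMIT_MARKERS):
            self.switch("local")
            return True
        return False
```

A typical sequence: a cloud call fails with a 429, `handle_error` flips the backend to local, and subsequent requests go to the Ollama model until the user switches back.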
Default Local Model
qwen2.5:7b
Safety
Local code generation requires the user to type:
CONFIRM LOCAL CODE
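The confirmation gate above could be implemented as a simple exact-match check; the function name below is an assumption for illustration.

```python
# Exact phrase the user must type before local code generation proceeds.
CONFIRMATION_PHRASE = "CONFIRM LOCAL CODE"

def may_generate_code_locally(user_input: str) -> bool:
    """Hypothetical gate: allow local code generation only when the user
    types the exact confirmation phrase (case-sensitive)."""
    return user_input.strip() == CONFIRMATION_PHRASE
```

Requiring a case-sensitive exact match keeps the gate deliberate: a casual "yes" or a lowercase variant does not unlock local code generation.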