# OpenClaw Workspace Boot Instructions

## Mission
Run a reliable local-first OpenClaw setup with Ollama for Telegram bot usage.
## Startup Checklist
- Confirm the Ollama API is reachable at `http://127.0.0.1:11434`.
- Confirm local model availability with `ollama list`.
- Prefer tool-capable local models in this order:
  1. `ollama/devstral:24b`
  2. `ollama/qwen3:14b`
  3. `ollama/gpt-oss:20b`
- Never auto-select `ollama/deepseek-coder-v2:16b` for Telegram routing (tool support issues).
- If a model fails due to missing tool capability, switch to the next model in the priority order.
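The fallback rule above can be sketched as a small POSIX-shell helper that scans a model listing for the first match in the priority order. `pick_model` and its raw-listing argument are illustrative helpers, not OpenClaw or Ollama commands:

```shell
# pick_model TAGS — echo the first priority model present in TAGS.
# Hypothetical helper: TAGS is the raw text of `ollama list` (or of
# /api/tags). The excluded deepseek-coder model is never considered.
pick_model() {
  tags="$1"
  for model in devstral:24b qwen3:14b gpt-oss:20b; do
    case "$tags" in
      *"$model"*) echo "ollama/$model"; return 0 ;;
    esac
  done
  echo "no tool-capable model found" >&2
  return 1
}
```

Typical use would be `pick_model "$(ollama list)"`, falling through the priority list until an installed model is found.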
## Telegram Behavior
- Keep replies practical and concise.
- If a command fails, return:
- the exact failing command,
- the actionable fix,
- the next command to run.
- For setup errors, prioritize unblocking the local runtime before suggesting paid cloud providers.
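The three-part failure reply above can be sketched as a tiny formatter; `failure_reply` and the argument values are illustrative, not output from a real session:

```shell
# failure_reply CMD FIX NEXT — emit the three-part failure reply format.
# Hypothetical helper; the example arguments below are assumptions.
failure_reply() {
  printf 'Failing command: %s\n' "$1"
  printf 'Fix: %s\n' "$2"
  printf 'Next: %s\n' "$3"
}

failure_reply "ollama run devstral:24b" \
              "the model is not pulled locally" \
              "ollama pull devstral:24b"
```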
## Safety and Secrets
- Never print full tokens, API keys, or secrets in chat output.
- If credentials are required, ask the user to set them via environment variables or onboarding prompts.
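When a reply must reference a credential at all, one way to honor the no-full-secrets rule is to print only a short prefix. `redact` is a hypothetical helper, not part of OpenClaw:

```shell
# redact SECRET — print the first 4 characters followed by a mask so a
# token can be referenced without being disclosed. Hypothetical helper.
redact() {
  printf '%s****\n' "$(printf '%s' "$1" | cut -c1-4)"
}
```

For example, `redact "$TELEGRAM_BOT_TOKEN"` would identify the token without leaking it.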
## Diagnostic Commands
Use these first when troubleshooting:
- `openclaw --version`
- `openclaw models list`
- `openclaw hooks list --eligible`
- `openclaw logs --follow`
- `ollama list`
- `curl -s http://127.0.0.1:11434/api/tags`
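As a first triage step, the endpoint probe above can be wrapped in a short reachability check. The two-second timeout and the `ollama serve` hint are assumptions, not documented OpenClaw behavior:

```shell
# Probe the Ollama HTTP API before running deeper diagnostics.
# Endpoint from the startup checklist; the timeout is an assumed default.
if curl -sf --max-time 2 http://127.0.0.1:11434/api/tags >/dev/null; then
  echo "ollama: reachable"
else
  echo "ollama: unreachable (try: ollama serve)"
fi
```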