# OpenClaw Workspace Boot Instructions

## Mission

Run a reliable local-first OpenClaw setup with Ollama for Telegram bot usage.

## Startup Checklist

1. Confirm the Ollama API is reachable at `http://127.0.0.1:11434`.
2. Confirm local model availability with `ollama list`.
3. Prefer tool-capable local models in this order:
   - `ollama/devstral:24b`
   - `ollama/qwen3:14b`
   - `ollama/gpt-oss:20b`
4. Never auto-select `ollama/deepseek-coder-v2:16b` for Telegram routing (unreliable tool-call support).
5. If a model fails due to missing tool-call capability, fall back to the next model in the priority order.

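The checklist's fallback logic can be sketched as a small shell helper. This is an illustrative sketch, not part of OpenClaw: the function name `pick_model` is invented, and it assumes `ollama list` prints bare model names (without the `ollama/` prefix) at the start of each line.

```shell
# Sketch: choose the first preferred model that appears in `ollama list`
# output, read on stdin. Prints the chosen model id in ollama/<name> form.
pick_model() {
  local available model
  available="$(cat)"
  # Priority order from steps 3 and 5 above; deepseek-coder-v2 is
  # intentionally excluded per step 4.
  for model in devstral:24b qwen3:14b gpt-oss:20b; do
    if printf '%s\n' "$available" | grep -q "^${model}"; then
      echo "ollama/${model}"
      return 0
    fi
  done
  echo "no tool-capable model installed" >&2
  return 1
}
```

Usage: `ollama list | pick_model`.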
## Telegram Behavior

- Keep replies practical and concise.
- If a command fails, return:
  - the exact failing command,
  - the actionable fix,
  - the next command to run.
- For setup errors, prioritize unblocking the local runtime before suggesting paid cloud providers.

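The three-part failure reply above can be sketched as a formatting helper. The function name and reply layout are assumptions for illustration, not an OpenClaw API.

```shell
# Sketch: assemble the failure reply from its three required parts
# (failing command, actionable fix, next command to run).
format_failure_reply() {
  local cmd="$1" fix="$2" next="$3"
  printf 'Command failed: %s\nFix: %s\nNext: %s\n' "$cmd" "$fix" "$next"
}
```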
## Safety and Secrets

- Never print full tokens, API keys, or secrets in chat output.
- If credentials are required, ask the user to set them via environment variables or onboarding prompts.

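One way to honor both rules is to check for a credential without ever echoing its value. This is a sketch under stated assumptions: `require_secret` and the variable name `DEMO_TOKEN` are hypothetical, chosen for illustration.

```shell
# Sketch: confirm a credential is set in the environment without
# printing its value. Takes the variable *name*, not the secret itself.
require_secret() {
  local name="$1"
  if [ -z "$(printenv "$name")" ]; then
    echo "Set ${name} via an environment variable; its value will not be echoed." >&2
    return 1
  fi
  echo "${name} is set (value hidden)"
}
```

Usage: `require_secret DEMO_TOKEN` before starting the bot.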
## Diagnostic Commands

Use these first when troubleshooting:

```bash
openclaw --version
openclaw models list
openclaw hooks list --eligible
openclaw logs --follow
ollama list
curl -s http://127.0.0.1:11434/api/tags
```
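The `/api/tags` response is JSON with a top-level `models` array of `{"name": ...}` objects, which is awkward to read raw. A small helper can extract just the model names; the function name `list_tag_names` is an assumption, and the sketch assumes `python3` is available so there is no hard dependency on `jq`.

```shell
# Sketch: print one model name per line from an Ollama /api/tags
# response read on stdin.
list_tag_names() {
  python3 -c 'import json, sys
for m in json.load(sys.stdin).get("models", []):
    print(m["name"])'
}
```

Usage: `curl -s http://127.0.0.1:11434/api/tags | list_tag_names`.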