OpenClaw
AI coding assistant with a Web UI. Point it at BeansAI as a Custom Provider.
Overview
OpenClaw is an AI coding assistant that supports both a Web UI and a CLI. It lets you configure any OpenAI- or Anthropic-compatible endpoint as a Custom Provider. For BeansAI, the most reliable setup is OpenAI-compatible mode with gpt-5.5 as the default model.
Prerequisites
Make sure node --version reports v22 or higher before installing.
Installation
npm install -g openclaw@latest

Configuration
Step 1 — Write the config file first.
Write the file below to ~/.openclaw/openclaw.json. On native Windows, use %USERPROFILE%\.openclaw\openclaw.json.
{
  "agents": {
    "defaults": {
      "model": {
        "primary": "custom-api-beansai-dev/gpt-5.5"
      }
    }
  },
  "models": {
    "providers": {
      "custom-api-beansai-dev": {
        "baseUrl": "https://api.beansai.dev/v1",
        "apiKey": "sk-beans-...",
        "api": "openai-completions",
        "models": [
          {
            "id": "gpt-5.5",
            "name": "GPT-5.5",
            "contextWindow": 200000,
            "maxTokens": 8192
          },
          {
            "id": "gpt-5.1",
            "name": "GPT-5.1",
            "contextWindow": 400000,
            "maxTokens": 128000
          },
          {
            "id": "gpt-5.2",
            "name": "GPT-5.2",
            "contextWindow": 400000,
            "maxTokens": 128000
          },
          {
            "id": "gpt-5.3-codex-spark",
            "name": "GPT-5.3 Codex Spark",
            "contextWindow": 128000,
            "maxTokens": 16384
          }
        ]
      }
    }
  }
}

For gpt-5.5, set contextWindow to 200000 and maxTokens to 8192. api must be openai-completions so OpenClaw uses the OpenAI-compatible /chat/completions endpoint, and agents.defaults.model.primary keeps gpt-5.5 selected by default. The name field inside models[] is required by the current schema, so do not omit it.
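Before handing the file to OpenClaw, you can sanity-check the wiring yourself. The sketch below is plain Python over the file layout shown in this guide (it is not the official schema validator); it confirms that agents.defaults.model.primary resolves to a model actually declared under models.providers:

```python
import json
from pathlib import Path


def check_primary_model(config: dict) -> str:
    """Confirm agents.defaults.model.primary points at a model that is
    actually declared under models.providers. Returns the model id."""
    primary = config["agents"]["defaults"]["model"]["primary"]
    provider_id, _, model_id = primary.rpartition("/")
    provider = config["models"]["providers"][provider_id]
    declared = {m["id"] for m in provider["models"]}
    if model_id not in declared:
        raise ValueError(f"{model_id!r} is not declared for {provider_id!r}")
    return model_id


if __name__ == "__main__":
    path = Path.home() / ".openclaw" / "openclaw.json"
    try:
        cfg = json.loads(path.read_text())
        print("primary model OK:", check_primary_model(cfg))
    except FileNotFoundError:
        print("no config found at", path)
```

If this passes but validation still fails, the problem is likely a field this sketch does not look at.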
Step 2 — Validate the file before launching OpenClaw.
openclaw config validate

Step 3 — Run onboarding only after validation passes.
openclaw onboard --install-daemon

If you prefer to configure everything interactively instead of writing JSON first, answer the wizard like this:
- Install daemon? yes
- Onboarding mode: pick the first option
- Model/auth provider: Custom Provider (Any OpenAI or Anthropic compatible endpoint)
- API Base URL: https://api.beansai.dev/v1
- API Key: your BeansAI key (starts with sk-beans-)
- Endpoint compatibility: OpenAI-compatible (Uses /chat/completions)
- Model ID: gpt-5.5
- Endpoint ID: accept the default
- Model alias (optional): e.g. gpt55
- QuickStart / Skills / Hooks: all can be skipped on first run
- How do you want to hatch your bot? Open the Web UI
Windows
OpenClaw supports native Windows, but the project currently recommends WSL2 for the smoothest coding workflow. If you stay on native Windows, keep using PowerShell / CMD with the same commands above.
openclaw onboard --install-daemon will try to register a background task first; if Windows blocks that path, OpenClaw falls back to a Startup-based launch on the next sign-in.
Basic usage
After onboarding, the Web UI opens automatically in your browser. Start a new conversation, pick your configured model, and begin chatting. OpenClaw tracks sessions and costs just like BeansAI's console.
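Under the hood, the openai-completions setting means OpenClaw talks to BeansAI through standard /chat/completions POSTs. If you want to smoke-test the key and base URL outside OpenClaw, here is a minimal stdlib-only sketch; the base URL is the one from the config above, and the key is a placeholder you must replace:

```python
import json
import urllib.request

BASE_URL = "https://api.beansai.dev/v1"  # baseUrl from the config above
API_KEY = "sk-beans-..."                 # replace with your real key


def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build the same kind of /chat/completions call OpenClaw issues
    in OpenAI-compatible mode."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


if __name__ == "__main__":
    req = build_chat_request("gpt-5.5", "Reply with the word pong.")
    # Uncomment to actually send the smoke test:
    # with urllib.request.urlopen(req) as resp:
    #     print(json.loads(resp.read())["choices"][0]["message"]["content"])
    print("would POST to", req.full_url)
```

If this round-trips, any remaining problem is on the OpenClaw side rather than with your key or base URL.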
# Step 1: Install OpenClaw
npm install -g openclaw@latest
# Step 2: Validate config
openclaw config validate
# Step 3: Initialize
openclaw onboard --install-daemon
# If you use the interactive wizard instead of pasting the JSON above,
# configure it like this:
# - Install daemon: yes
# - Onboarding mode: first option
# - Model/auth provider: Custom Provider (Any OpenAI or Anthropic compatible endpoint)
# - API Base URL: https://api.beansai.dev/v1
# - API Key: sk-beans-...
# - Endpoint compatibility: OpenAI-compatible (Uses /chat/completions)
# - Model ID: gpt-5.5
# - Model alias (optional): gpt55
# - Hatch bot: Open the Web UI
# Step 4: Start OpenClaw
openclaw

Tips
- Add more models by appending to the `models` array — any compatible model ID from the BeansAI catalog works.
- First run should be a short sanity chat — confirm the key, base URL and model ID are right before tackling a real task. If `openclaw` refuses to start, rerun `openclaw config validate` first.
- In this OpenAI-compatible setup, `baseUrl` in the JSON should keep the `/v1` suffix.
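The first tip can be scripted. The helper below is a hypothetical sketch over the JSON layout shown earlier in this guide (add_model is not an OpenClaw API); it appends a catalog entry to a provider and skips duplicate ids:

```python
def add_model(config: dict, provider_id: str, model_id: str, name: str,
              context_window: int, max_tokens: int) -> dict:
    """Append a model entry (schema as shown in this guide) to a
    provider's models array, skipping duplicates by id."""
    models = config["models"]["providers"][provider_id]["models"]
    if any(m["id"] == model_id for m in models):
        return config  # already present, leave the config unchanged
    models.append({
        "id": model_id,
        "name": name,
        "contextWindow": context_window,
        "maxTokens": max_tokens,
    })
    return config
```

After rewriting the file, rerun openclaw config validate before restarting OpenClaw.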