
Codex CLI

Two small config files that point OpenAI's official Codex CLI at BeansAI.

Overview

On startup, Codex reads ~/.codex/config.toml and ~/.codex/auth.json. Declare BeansAI as a custom model_provider there and the CLI picks it up on its own.
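Before editing anything, a quick shell check (a generic sketch, nothing Codex-specific) shows which of the two files already exist:

```shell
# Sketch: report whether each file Codex expects is present.
for f in ~/.codex/config.toml ~/.codex/auth.json; do
  if [ -f "$f" ]; then echo "found: $f"; else echo "missing: $f"; fi
done
```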

Installation

shell
npm install -g @openai/codex

Configuration

~/.codex/config.toml
model_provider = "BeansAI"
model = "gpt-5.4"
review_model = "gpt-5.4"
model_reasoning_effort = "high"
disable_response_storage = true
network_access = "enabled"
model_context_window = 1000000
model_auto_compact_token_limit = 900000

[model_providers.BeansAI]
name = "BeansAI"
base_url = "https://api.beansai.dev/v1"
wire_api = "responses"
requires_openai_auth = true
~/.codex/auth.json
{
  "OPENAI_API_KEY": "sk-beans-..."
}

macOS / Linux

bash
mkdir -p ~/.codex
cat > ~/.codex/config.toml <<'EOF'
model_provider = "BeansAI"
model = "gpt-5.4"
review_model = "gpt-5.4"
model_reasoning_effort = "high"
disable_response_storage = true
network_access = "enabled"
model_context_window = 1000000
model_auto_compact_token_limit = 900000

[model_providers.BeansAI]
name = "BeansAI"
base_url = "https://api.beansai.dev/v1"
wire_api = "responses"
requires_openai_auth = true
EOF
cat > ~/.codex/auth.json <<'EOF'
{
  "OPENAI_API_KEY": "sk-beans-..."
}
EOF
echo "Codex CLI configured."

Windows (PowerShell)

powershell
$dir = "$env:USERPROFILE\.codex"
if (!(Test-Path $dir)) { New-Item -ItemType Directory -Path $dir -Force | Out-Null }
@"
model_provider = "BeansAI"
model = "gpt-5.4"
review_model = "gpt-5.4"
model_reasoning_effort = "high"
disable_response_storage = true
network_access = "enabled"
model_context_window = 1000000
model_auto_compact_token_limit = 900000

[model_providers.BeansAI]
name = "BeansAI"
base_url = "https://api.beansai.dev/v1"
wire_api = "responses"
requires_openai_auth = true
"@ | Set-Content "$dir\config.toml" -Encoding UTF8
@"
{
  "OPENAI_API_KEY": "sk-beans-..."
}
"@ | Set-Content "$dir\auth.json" -Encoding UTF8

Basic usage

Run codex in any repository; by default it calls gpt-5.4 through BeansAI. To switch to another OpenAI model from the catalog, change model in config.toml.
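The model switch can be scripted too. This sketch rewrites the model line on a throwaway copy so it is safe to run anywhere; point it at ~/.codex/config.toml for real use (NEW_MODEL is a placeholder, substitute any model BeansAI lists):

```shell
# Sketch: rewrite the `model` line. Operates on a temp copy here;
# target ~/.codex/config.toml for real use. NEW_MODEL is a placeholder value.
NEW_MODEL="gpt-5.4"
cfg=$(mktemp)
printf 'model = "gpt-5.4"\nreview_model = "gpt-5.4"\n' > "$cfg"
# ^model anchors the match, so review_model is left untouched.
sed "s/^model = .*/model = \"$NEW_MODEL\"/" "$cfg" > "$cfg.tmp" && mv "$cfg.tmp" "$cfg"
grep -q "^model = \"$NEW_MODEL\"" "$cfg" && echo "switched to $NEW_MODEL"
rm -f "$cfg"
```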

Tips

  • With disable_response_storage = true, prompts are never retained on OpenAI's servers.
  • wire_api = "responses" uses the Responses API format; BeansAI translates it to chat completions upstream.
  • Keep review_model identical to the main model to avoid cross-model diff inconsistencies.
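With wire_api = "responses", the request bodies Codex sends look roughly like the payload below (a simplified sketch; the fields Codex actually emits are richer). The python3 line only validates the JSON shape locally; the commented curl shows where such a request would go, using a key like the one in auth.json:

```shell
# Sketch of a minimal Responses-API-style body. Local shape check only;
# no network call is made.
payload='{"model": "gpt-5.4", "input": "Say hello"}'
echo "$payload" | python3 -c 'import json, sys; d = json.load(sys.stdin); print("fields:", sorted(d))'
# To actually send it (requires a valid key from auth.json):
# curl -s https://api.beansai.dev/v1/responses \
#   -H "Authorization: Bearer sk-beans-..." \
#   -H "Content-Type: application/json" \
#   -d "$payload"
```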