BeansAI
Model & Pricing Documentation
Codex CLI

One configuration file points the official OpenAI Codex CLI at BeansAI.

Overview

On startup, Codex reads ~/.codex/config.toml and ~/.codex/auth.json. Declare BeansAI as a custom model_provider and the CLI picks it up with no further changes.

Installation

shell
npm install -g @openai/codex
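After installing, a quick sanity check that the binary is reachable can save a confusing first run (the hint message below is just an example, not Codex output):

```shell
# Post-install check: is codex on PATH?
if command -v codex >/dev/null 2>&1; then
  codex --version
else
  echo "codex not found: make sure npm's global bin directory is on PATH"
fi
```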

Configuration

~/.codex/config.toml
model_provider = "OpenAI"
model = "gpt-5.5"
review_model = "gpt-5.5"
model_reasoning_effort = "high"
disable_response_storage = true
network_access = "enabled"
model_context_window = 1000000
model_auto_compact_token_limit = 900000
approval_policy = "never"
sandbox_mode = "workspace-write"
suppress_unstable_features_warning = true

[shell_environment_policy]
inherit = "all"

[features]
image_generation = true
plan_tool = true
apply_patch_freeform = true
view_image_tool = true
hooks = true
memories = true

[windows]
sandbox = "unelevated"

[model_providers.OpenAI]
name = "OpenAI"
base_url = "https://api.beansai.dev/v1"
wire_api = "responses"
requires_openai_auth = true
~/.codex/auth.json
{
  "OPENAI_API_KEY": "sk-beans-..."
}

macOS / Linux

bash
mkdir -p ~/.codex
cat > ~/.codex/config.toml <<'EOF'
model_provider = "OpenAI"
model = "gpt-5.5"
review_model = "gpt-5.5"
model_reasoning_effort = "high"
disable_response_storage = true
network_access = "enabled"
model_context_window = 1000000
model_auto_compact_token_limit = 900000
approval_policy = "never"
sandbox_mode = "workspace-write"
suppress_unstable_features_warning = true

[shell_environment_policy]
inherit = "all"

[features]
image_generation = true
plan_tool = true
apply_patch_freeform = true
view_image_tool = true
hooks = true
memories = true

[windows]
sandbox = "unelevated"

[model_providers.OpenAI]
name = "OpenAI"
base_url = "https://api.beansai.dev/v1"
wire_api = "responses"
requires_openai_auth = true
EOF
cat > ~/.codex/auth.json <<'EOF'
{
  "OPENAI_API_KEY": "sk-beans-..."
}
EOF
echo "Codex CLI configured."

Image Generation

Codex can also generate and edit images directly through BeansAI. The configuration above already enables this: image_generation = true under [features] is the only switch required. Once it is in place, just ask Codex in natural language, for example:

bash
codex "Generate a cyberpunk-style coffee-bean poster"

Windows (PowerShell)

powershell
$enc = [System.Text.UTF8Encoding]::new($false)
$bom = [char]0xFEFF
$dir = "$env:USERPROFILE\.codex"
if (!(Test-Path $dir)) { New-Item -ItemType Directory -Path $dir -Force | Out-Null }
$configContent = (@'
model_provider = "OpenAI"
model = "gpt-5.5"
review_model = "gpt-5.5"
model_reasoning_effort = "high"
disable_response_storage = true
network_access = "enabled"
model_context_window = 1000000
model_auto_compact_token_limit = 900000
approval_policy = "never"
sandbox_mode = "workspace-write"
suppress_unstable_features_warning = true

[shell_environment_policy]
inherit = "all"

[features]
image_generation = true
plan_tool = true
apply_patch_freeform = true
view_image_tool = true
hooks = true
memories = true

[windows]
sandbox = "unelevated"

[model_providers.OpenAI]
name = "OpenAI"
base_url = "https://api.beansai.dev/v1"
wire_api = "responses"
requires_openai_auth = true
'@).TrimStart($bom)
[System.IO.File]::WriteAllText("$dir\config.toml", $configContent, $enc)
$authContent = (@'
{
  "OPENAI_API_KEY": "sk-beans-..."
}
'@).TrimStart($bom)
[System.IO.File]::WriteAllText("$dir\auth.json", $authContent, $enc)
Write-Host "Codex CLI configured."

Basic Usage

Run codex in any repository; by default it calls gpt-5.5 through BeansAI. To switch to any other OpenAI model in the catalog, change the model value in config.toml.
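Editing config.toml by hand works fine, but the switch can also be scripted. A minimal sketch (switch_model and the CODEX_CONFIG override are made-up names for illustration; Codex itself only reads ~/.codex/config.toml):

```shell
# Hypothetical helper: point model and review_model at a new model in-place.
# CODEX_CONFIG is an assumed override used here for testing, not a Codex feature.
switch_model() {
  new="$1"
  cfg="${CODEX_CONFIG:-$HOME/.codex/config.toml}"
  # -i.bak keeps a backup and works with both GNU and BSD sed.
  # The anchored "^model = " pattern deliberately skips model_context_window etc.
  sed -i.bak -E "s|^model = \".*\"|model = \"$new\"|" "$cfg"
  sed -i.bak -E "s|^review_model = \".*\"|review_model = \"$new\"|" "$cfg"
  grep -E '^(model|review_model) = ' "$cfg"
}
```

This keeps review_model in lockstep with the main model, matching the tip below about avoiding cross-model diffs.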

Tips

  • With disable_response_storage = true, prompts are never persisted on OpenAI's servers.
  • wire_api = "responses" uses the Responses API wire format, which BeansAI supports natively end to end; no extra translation layer is needed.
  • Keep review_model the same as the main model to avoid inconsistent diffs across models.
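The wire_api = "responses" setting can be smoke-tested against the provider endpoint directly. A hedged sketch (beans_ping and the BEANSAI_API_KEY variable are made-up names for illustration; the request body assumes the standard Responses API shape, which is not confirmed by these docs):

```shell
# Hypothetical smoke test of the endpoint configured in [model_providers.OpenAI].
beans_ping() {
  if [ -z "${BEANSAI_API_KEY:-}" ]; then
    echo "BEANSAI_API_KEY not set"
    return 1
  fi
  # Minimal Responses-style request; prints the raw JSON reply.
  curl -sS https://api.beansai.dev/v1/responses \
    -H "Authorization: Bearer $BEANSAI_API_KEY" \
    -H "Content-Type: application/json" \
    -d '{"model": "gpt-5.5", "input": "ping"}'
}
```

Export your sk-beans-... key as BEANSAI_API_KEY before calling beans_ping; a JSON reply (rather than an auth error) confirms the base_url and key are wired up correctly.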