The Codex CLI can talk to Firmware by setting a custom base URL. This lets you use models from multiple providers with a single Firmware key.

Acknowledgment

Setup notes are adapted from guidance by AlexGS74. Big thanks for sharing the original walkthrough.

Install

Install the Codex CLI.
npm install -g @openai/codex
Confirm it’s on your PATH.
codex --version

Configure

Create or edit ~/.codex/config.toml and define a provider that points to Firmware.
[model_providers.firmware]
name = "Firmware"
base_url = "https://app.firmware.ai/api/v1"
env_key = "FIRMWARE_API_KEY"

[profiles.firmware]
model_provider = "firmware"
model = "gpt-4o"
Export your API key in your shell.
export FIRMWARE_API_KEY="fw_api_..."
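To skip passing --profile on every run, recent Codex CLI versions let you pick a default profile with a top-level profile key in the same file. A minimal sketch, assuming your CLI version supports it (check your version's config reference):

```toml
# In ~/.codex/config.toml -- assumes a Codex CLI that reads a top-level default profile.
profile = "firmware"
```

With this set, a plain `codex` invocation uses the Firmware provider and model defined above.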

Pick a model

Choose a model ID from Firmware's models list. You can set it in the profile's model field or pass --model per run.

Run

Use the Firmware profile for any session.
codex --profile firmware
Override the model when you need to.
codex --profile firmware --model "claude-opus-4-5"

Troubleshoot

If you get a 401, double-check the FIRMWARE_API_KEY value and confirm it is exported in the shell that launches Codex. If you get a 404, make sure base_url ends with /v1.
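The two checks above can be scripted. A minimal sketch; the base_url value here is copied from the config in this guide, so adjust it if yours differs:

```shell
# Sanity-check the two most common failure modes before digging deeper.
if [ -z "$FIRMWARE_API_KEY" ]; then
  echo "FIRMWARE_API_KEY is not set in this shell -- expect 401s"
fi

base_url="https://app.firmware.ai/api/v1"
case "$base_url" in
  */v1) echo "base_url ends with /v1" ;;
  *)    echo "base_url is missing the /v1 suffix -- expect 404s" ;;
esac
```

If both checks pass and errors persist, confirm the model ID you're using actually appears in Firmware's models list.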

Learn more