OpenAI Codex CLI / AI Coding Tools

OpenAI Codex CLI Ignores Project-Local model_provider Config After PR #20098

Fix Codex CLI ignoring project-level model_provider and model_providers configuration, based on evidence from the upstream GitHub issue.

Category
AI Coding Tools
Error signature
Ignored unsupported project-local config keys in .codex/config.toml: model_provider, model_providers
Quick fix
Compare the failing environment with a known working setup, then change one configuration value at a time.

What this error means

Ignored unsupported project-local config keys in .codex/config.toml: model_provider, model_providers is the warning OpenAI Codex CLI prints when it refuses to apply provider settings found in a project's .codex/config.toml. Treat it as a tool-specific behavior change rather than a generic API error: the keys are read but deliberately skipped, and the session silently falls back to the default OpenAI provider instead of the one you configured.

Why this happens

GitHub issue #22222 reports that Codex CLI 0.130.0 ignores model_provider and model_providers settings in a project-local .codex/config.toml. The regression traces to PR #20098, which evidently narrowed the set of keys honored in project-local config files. When the keys are skipped, Codex falls back to the default OpenAI route and prints the exact warning 'Ignored unsupported project-local config keys'. The reports come from WSL and Linux developers on both API and Pro subscriptions.
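The shape of config that triggers the warning looks roughly like the sketch below. The provider id and every value are illustrative placeholders, not values from the issue; the field names follow the commonly documented Codex provider schema.

```toml
# .codex/config.toml (project-local) -- the two keys reported as ignored
# after PR #20098. All values here are hypothetical placeholders.
model_provider = "my-provider"

[model_providers.my-provider]
name = "My Provider"                     # display name
base_url = "https://example.com/api/v1"  # OpenAI-compatible endpoint
env_key = "MY_PROVIDER_API_KEY"          # env var that holds the API key
```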

Common causes

  1. Codex CLI 0.130.0 or later enforcing the project-local key restriction introduced around PR #20098, so model_provider and model_providers in .codex/config.toml are read but skipped.
  2. Provider settings living only in the project-local file, with nothing in the user-level ~/.codex/config.toml, so the session silently falls back to the default OpenAI route.
  3. A previously working setup upgraded in place, so a config that applied cleanly before the change now triggers the warning.

Quick fixes

  1. Confirm the exact error signature matches Ignored unsupported project-local config keys in .codex/config.toml: model_provider, model_providers.
  2. Check the OpenAI Codex CLI account, local tool state, and provider configuration involved in the failing workflow.
  3. Compare the failing environment with a known working setup, then change one configuration value at a time (a minimal baseline to compare against is sketched below).
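For step 3, a known working setup can be approximated by the smallest user-level config that avoids provider keys entirely. The model name is a placeholder, not a value from the issue.

```toml
# ~/.codex/config.toml -- smallest comparison baseline: no provider keys
# anywhere, so Codex should use its default OpenAI route without warnings.
model = "o4-mini"  # placeholder; use any model available on your plan
```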

Platform/tool-specific checks

  1. Check your Codex CLI version (codex --version). The regression is reported against 0.130.0, so note whether you are on that release or later.
  2. On WSL and Linux, confirm which file Codex is actually reading: the project-local .codex/config.toml versus the user-level ~/.codex/config.toml.
  3. As a workaround, move the model_provider and model_providers entries into the user-level ~/.codex/config.toml; the warning is specifically about project-local keys, so user-level settings are expected to still apply (see the sketch below).
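A sketch of that workaround, reusing the same placeholder provider as above:

```toml
# ~/.codex/config.toml (user-level) -- hypothetical workaround: the same
# provider settings the project-local file carried, relocated here.
model_provider = "my-provider"

[model_providers.my-provider]
name = "My Provider"
base_url = "https://example.com/api/v1"  # OpenAI-compatible endpoint
env_key = "MY_PROVIDER_API_KEY"          # env var that holds the API key
```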

Step-by-step troubleshooting

  1. Capture the exact error message and the command, editor action, or request that triggered it.
  2. Check whether the failure is account/auth, quota/rate, model/provider, local runtime, or deployment configuration.
  3. Review the source evidence below and compare it with your environment.
  4. Apply one change at a time and rerun the smallest failing action (see the sketch after this list).
  5. Keep the working fix documented for the team or deployment environment.
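One way to run step 4 against the config itself is to reintroduce the suspect keys into the user-level file one piece per run, rerunning the failing command after each change. All values below are placeholders.

```toml
# ~/.codex/config.toml -- reintroduce one piece per run; all values are
# hypothetical placeholders.
# Run 2: uncomment this top-level key to switch the active provider,
# then rerun the failing action.
# model_provider = "my-provider"

# Run 1: define the provider table while still using the default route.
[model_providers.my-provider]
name = "My Provider"
base_url = "https://example.com/api/v1"
env_key = "MY_PROVIDER_API_KEY"
```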

How to prevent it

  1. Keep model_provider and model_providers in the user-level ~/.codex/config.toml rather than relying on project-local files, at least until the project-local restriction is relaxed or documented.
  2. Before upgrading Codex CLI past a known-good version, check the release notes for config changes such as PR #20098 and test provider routing in a scratch project.
  3. Watch GitHub issue #22222 for an upstream fix so any workaround can be removed once project-local provider keys are honored again.

Sources checked

Evidence note: GitHub issue #22222 reports Codex CLI 0.130.0 ignoring .codex/config.toml model_provider settings after PR #20098, falling back to the default OpenAI route with the exact warning 'Ignored unsupported project-local config keys'. Reported by WSL/Linux developers on both API and Pro subscriptions.

FAQ

What should I check first?

Start with the exact warning text, Ignored unsupported project-local config keys in .codex/config.toml: model_provider, model_providers, and the smallest action that reproduces it.

Can I ignore this error?

No. The session silently falls back to the default OpenAI route, so requests may run against a provider and account you did not intend. Treat it as a failed OpenAI Codex CLI workflow until the root cause is understood.

Is this guaranteed to have one fix?

No. The imported evidence supports the troubleshooting path above, but tool behavior can vary by account, plan, version, provider, and local configuration.

How do I know the fix worked?

Rerun the same command, editor action, or request. The fix is working when that action completes without Ignored unsupported project-local config keys in .codex/config.toml: model_provider, model_providers and the session uses the provider you configured rather than the default OpenAI route.