OpenAI Codex provider #8281
Conversation
Pull Request Overview
This PR adds support for a new OpenAI Codex provider variant that uses ChatGPT web credentials instead of API keys. The implementation includes UI components, validation, and backend provider logic for the openai-native-codex provider.
Key Changes:
- Added new OpenAI Native Codex provider using local auth.json credentials
- Added UI components and validation for the optional OAuth credentials path
- Integrated the provider into settings, model selection, and request routing logic
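As a rough illustration of the credential flow (not the PR's actual code; the `accessToken` field name and both helper names below are hypothetical), resolving and parsing the local credentials might look like:

```typescript
import * as os from "node:os"
import * as path from "node:path"

// Hypothetical shape: the real auth.json written by the Codex CLI may
// use different field names.
interface CodexAuth {
	accessToken: string
}

// Fall back to the default ~/.codex/auth.json when no path is configured.
export function resolveAuthPath(configured?: string): string {
	return configured && configured.trim() !== ""
		? configured
		: path.join(os.homedir(), ".codex", "auth.json")
}

// Parse raw file contents, failing loudly when the token is absent.
export function parseAuth(raw: string): CodexAuth {
	const parsed = JSON.parse(raw)
	if (typeof parsed.accessToken !== "string") {
		throw new Error("auth.json is missing an access token")
	}
	return parsed
}
```

The point of the design is that no API key is stored in settings; the provider reads whatever the ChatGPT login flow already wrote to disk.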
Reviewed Changes
Copilot reviewed 14 out of 14 changed files in this pull request and generated 2 comments.
| File | Description |
|---|---|
| src/api/providers/openai-native-codex.ts | New provider implementation with auth.json credentials and Codex API integration |
| packages/types/src/providers/openai-codex.ts | Type definitions for Codex models (gpt-5, gpt-5-codex) |
| webview-ui/src/components/settings/providers/OpenAiNativeCodex.tsx | React component for Codex provider settings UI |
| webview-ui/src/i18n/locales/en/settings.json | Added validation text for optional auth path |
| webview-ui/src/components/settings/ApiOptions.tsx | Added provider URL mapping and settings integration |
| Multiple configuration files | Updated provider registrations, model mappings, and type definitions |
I saw that the model codex-mini-latest is missing here. You might also want to add it as an option for people who wish to use it; it's a tuned version of o4-mini, IIRC.
Force-pushed from 47043f3 to 5d63945.
hannesrudolph left a comment
Implemented requested follow-ups: corrected comment to reflect 'medium' default, improved error guidance for auth.json read/parse failures, and added codex-mini-latest model.
done
```ts
// Guard file size before reading to prevent loading unexpectedly large files
const MAX_OAUTH_SIZE = 1_000_000 // 1 MB
try {
	const stat = await fs.stat(resolvedPath)
	if (stat.size > MAX_OAUTH_SIZE) {
		throw new Error(
			t("common:errors.openaiNativeCodex.oauthFileTooLarge", {
				path: resolvedPath,
				size: stat.size,
				max: MAX_OAUTH_SIZE,
			}),
		)
	}
} catch (e: any) {
	// Surface read failure with localized error (e.g., file missing or inaccessible)
	const base = t("common:errors.openaiNativeCodex.oauthReadFailed", {
		path: resolvedPath,
		error: e?.message || String(e),
	})
	throw new Error(base)
}
```
The oauthFileTooLarge error path is currently being caught and re-thrown as oauthReadFailed, which hides the more specific user-facing message. Splitting the fs.stat failure handling from the explicit size check keeps the correct i18n key.
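The masking is a general try/catch pitfall rather than anything provider-specific. A stripped-down sketch (illustrative names only, not the PR's code) reproduces it:

```typescript
// A throw inside the try body is caught by that block's own catch,
// so the specific "file too large" error is replaced by the generic one.
function checkSize(size: number): string {
	try {
		if (size > 100) {
			throw new Error("file too large") // specific, user-facing
		}
		return "ok"
	} catch {
		throw new Error("read failed") // generic; masks the error above
	}
}
```

Calling `checkSize(200)` surfaces `read failed`, never `file too large`, which is exactly the behavior flagged here.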
Suggested change:

```ts
// Guard file size before reading to prevent loading unexpectedly large files
const MAX_OAUTH_SIZE = 1_000_000 // 1 MB
let stat
try {
	stat = await fs.stat(resolvedPath)
} catch (e: any) {
	throw new Error(
		t("common:errors.openaiNativeCodex.oauthReadFailed", {
			path: resolvedPath,
			error: e?.message || String(e),
		}),
	)
}
if (stat.size > MAX_OAUTH_SIZE) {
	throw new Error(
		t("common:errors.openaiNativeCodex.oauthFileTooLarge", {
			path: resolvedPath,
			size: stat.size,
			max: MAX_OAUTH_SIZE,
		}),
	)
}
```
```ts
	vertex: "apiModelId",
	"openai-native": "openAiModelId",
	"openai-native-codex": "apiModelId",
	ollama: "ollamaModelId",
```
For openai-native-codex, modelIdKeysByProvider is set to apiModelId, and OpenAiNativeCodexHandler reads options.apiModelId while the UI writes apiModelId, so the mapping is consistent. Changing it to openAiModelId would likely make Codex ignore the user's model selection (and could break org allowlist checks that read via modelIdKeysByProvider).
Suggested change:

```ts
	ollama: "ollamaModelId",
	"openai-native-codex": "apiModelId",
```
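To see why the mapped key matters, here is a reduced sketch of how such a map is typically consumed; the `selectedModelId` helper and the settings shape are illustrative, not the repository's actual code:

```typescript
interface ProviderSettings {
	apiModelId?: string
	openAiModelId?: string
}

const modelIdKeysByProvider: Record<string, keyof ProviderSettings> = {
	"openai-native": "openAiModelId",
	"openai-native-codex": "apiModelId",
}

// Callers such as allowlist checks resolve the selected model through the
// map, so the mapped key must match the field the handler actually reads.
function selectedModelId(provider: string, settings: ProviderSettings): string | undefined {
	const key = modelIdKeysByProvider[provider]
	return key ? settings[key] : undefined
}
```

If the map pointed Codex at `openAiModelId` while the UI and handler use `apiModelId`, this lookup would silently return `undefined`.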
Review complete. A few correctness/consistency issues remain that should be addressed before merge.
(Quoting the same `fs.stat` size-guard block shown above.)
oauthFileTooLarge is currently masked: the fs.stat() guard throws a size error, but the surrounding try/catch treats it like a read failure and rethrows oauthReadFailed, so users never see the size-specific message.
Suggested change:

```ts
let stat: { size: number }
try {
	stat = await fs.stat(resolvedPath)
} catch (e: any) {
	throw new Error(
		t("common:errors.openaiNativeCodex.oauthReadFailed", {
			path: resolvedPath,
			error: e?.message || String(e),
		}),
	)
}
if (stat.size > MAX_OAUTH_SIZE) {
	throw new Error(
		t("common:errors.openaiNativeCodex.oauthFileTooLarge", {
			path: resolvedPath,
			size: stat.size,
			max: MAX_OAUTH_SIZE,
		}),
	)
}
```
```ts
	bedrock: "apiModelId",
	vertex: "apiModelId",
	"openai-native": "openAiModelId",
	"openai-native-codex": "apiModelId",
```
modelIdKeysByProvider maps openai-native-codex to apiModelId, but the UI also sets apiModelId on every model change for all providers (see ApiOptions), while some flows use provider-specific model id keys to read the selected model. This mismatch can cause allowlist validation to look at a different field than what the Codex provider actually uses, depending on the caller. Suggest aligning Codex with the existing OpenAI-native pattern: use a provider-specific key (e.g. openAiModelId) consistently, or change the UI so Codex does not rely on apiModelId being mirrored.
```ts
const parsed = JSON.parse(data)
// Persist tier when available (parity with openai-native)
if (parsed.response?.service_tier) {
}
```
This if (parsed.response?.service_tier) {} block is empty, so it is dead code and looks like an unfinished implementation. Either remove it or actually persist the tier (if it is meaningful for Codex), but leaving an empty conditional makes the SSE parser harder to reason about.
Suggested change:

```ts
// (intentionally no service_tier handling for Codex)
```
…ing; SSE parser cleanup

- Validation text for OpenAiNativeCodex auth path now reflects optional with default to ~/.codex/auth.json
- Provider Documentation link: map openai-native-codex -> openai to avoid 404
- Remove unused hasContent in SSE parser to satisfy lint
- Verified: cd src && npx vitest run (all tests passed)
…nimal effort; add request timeout; improve auth.json error guidance; update ChatGPT link; clear reasoningEffort on model change

- Remove service_tier handling entirely for Codex (server decides tier)
- Do not auto-default GPT-5 to minimal for Codex; use model/user defaults only
- Add AbortController using getApiRequestTimeout() to prevent hanging requests
- Improve auth.json error messages with Codex CLI guidance
- Update settings link to chatgpt.com
- Clear reasoningEffort on model change for Codex like native OpenAI
…le Codex system prompt and override rationale

- Add i18n keys under common.errors.openaiNativeCodex and use t() in handler
- Explain immutability and strategy where we inject overrides in OpenAiNativeCodexHandler
- Add commentary to codex prompt file describing canonical prompt and override rationale
…lt reasoning effort
…o4-mini) with pricing and caps
…used const, strengthen typing for message transform and auth.json parsing, extract helpers for testability
…e guard, remove previous_response_id, avoid as-any reasoning mutation
…or Codex

- Added error message for oversized OAuth credential files in multiple languages (German, Spanish, French, Hindi, Indonesian, Italian, Japanese, Korean, Dutch, Polish, Portuguese, Russian, Turkish, Vietnamese, Chinese Simplified, Chinese Traditional).
- Updated settings localization to include OAuth credential path descriptions and related information for Codex in multiple languages.
This reverts commit 9c083bc.
- Fix oauthFileTooLarge error being masked by oauthReadFailed catch path
- Remove empty if (parsed.response?.service_tier) {} block in SSE parser
- Remove unused shouldShowMinimalOption helper from ThinkingBudget
- modelIdKeysByProvider mapping confirmed correct (uses apiModelId)
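The request-timeout change described in the commits can be sketched roughly as follows; `withTimeout` is an illustrative helper, not the provider's actual code, and the real implementation wires the abort signal into its fetch call using `getApiRequestTimeout()`:

```typescript
// Abort a long-running task once the timeout elapses; the timer is
// always cleared so it cannot keep the process alive after completion.
async function withTimeout<T>(
	task: (signal: AbortSignal) => Promise<T>,
	timeoutMs: number,
): Promise<T> {
	const controller = new AbortController()
	const timer = setTimeout(() => controller.abort(), timeoutMs)
	try {
		return await task(controller.signal)
	} finally {
		clearTimeout(timer)
	}
}
```

Usage would look like `withTimeout((signal) => fetch(url, { signal }), getApiRequestTimeout())`, so a hung request rejects with an abort error instead of blocking forever.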
Force-pushed from bf0908d to 4275c92.
Closes #8049
Important
Adds OpenAI Codex provider with new models, UI components, and i18n support, integrating it into the existing system.
- Adds `openai-native-codex` provider to `provider-settings.ts` and `index.ts`.
- Adds `openAiNativeCodexSchema` for OAuth credentials in `provider-settings.ts`.
- Adds `OpenAiNativeCodexHandler` in `openai-native-codex.ts` for handling API requests.
- Adds `openAiNativeCodexModels` in `openai-codex.ts` with models like `gpt-5` and `codex-mini-latest`, defaulting to `gpt-5`.
- Adds `OpenAiNativeCodex` component in `ApiOptions.tsx` for UI settings.
- Updates `providerSettingsSchemaDiscriminated` and `providerSettingsSchema` to include `openAiNativeCodexSchema`.
- Updates `common.json` across various locales.

This description was created for 4275c92. You can customize this summary. It will automatically update as commits are pushed.