Conversation

@hannesrudolph
Collaborator

@hannesrudolph hannesrudolph commented Sep 24, 2025

Closes #8049


Important

Adds OpenAI Codex provider with new models, UI components, and i18n support, integrating it into the existing system.

  • Behavior:
    • Adds openai-native-codex provider to provider-settings.ts and index.ts.
    • Introduces openAiNativeCodexSchema for OAuth credentials in provider-settings.ts.
    • Implements OpenAiNativeCodexHandler in openai-native-codex.ts for handling API requests.
  • Models:
    • Defines openAiNativeCodexModels in openai-codex.ts with models like gpt-5 and codex-mini-latest.
    • Sets default model to gpt-5.
  • UI:
    • Adds OpenAiNativeCodex component in ApiOptions.tsx for UI settings.
    • Updates i18n files for multiple languages to include OpenAI Codex settings.
  • Misc:
    • Updates providerSettingsSchemaDiscriminated and providerSettingsSchema to include openAiNativeCodexSchema.
    • Adds error handling and messages for OpenAI Codex in common.json across various locales.
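
The provider/schema wiring summarized above can be sketched as a plain TypeScript discriminated union (names here are illustrative stand-ins, not the actual schemas from provider-settings.ts, which are likely zod-based):

```typescript
// Hypothetical shapes modeled on the PR description; field names are assumptions.
interface OpenAiNativeSettings {
	apiProvider: "openai-native"
	openAiNativeApiKey: string
}

interface OpenAiNativeCodexSettings {
	apiProvider: "openai-native-codex"
	// Optional path to the ChatGPT OAuth credentials file; the PR describes
	// a default of ~/.codex/auth.json when this is omitted.
	openAiNativeCodexOauthPath?: string
}

type ProviderSettings = OpenAiNativeSettings | OpenAiNativeCodexSettings

// Narrowing on the discriminant, as a discriminated schema union would.
function isCodex(s: ProviderSettings): s is OpenAiNativeCodexSettings {
	return s.apiProvider === "openai-native-codex"
}

console.log(isCodex({ apiProvider: "openai-native-codex" })) // true
```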

This description was created by Ellipsis for 4275c92.

Copilot AI review requested due to automatic review settings September 24, 2025 07:04
@dosubot dosubot bot added size:XL This PR changes 500-999 lines, ignoring generated files. Documentation Improvements or additions to documentation labels Sep 24, 2025
@hannesrudolph hannesrudolph changed the title feat(codex): align validation with optional auth.json path; docs mapping; SSE parser cleanup OpenAI Codex provider Sep 24, 2025
Contributor

Copilot AI left a comment


Pull Request Overview

This PR adds support for a new OpenAI Codex provider variant that uses ChatGPT web credentials instead of API keys. The implementation includes UI components, validation, and backend provider logic for the openai-native-codex provider.

Key Changes:

  • Added new OpenAI Native Codex provider using local auth.json credentials
  • Added UI components and validation for the optional OAuth credentials path
  • Integrated the provider into settings, model selection, and request routing logic
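
The "local auth.json credentials" flow above can be sketched roughly as follows; this is a hedged illustration assuming the default path and 1 MB size guard discussed later in this review, and `loadCodexAuth` plus the error strings are hypothetical, not the handler's actual API:

```typescript
import { promises as fs } from "node:fs"
import * as os from "node:os"
import * as path from "node:path"

const MAX_OAUTH_SIZE = 1_000_000 // 1 MB guard before reading the file

// Resolve the credentials path (defaulting to ~/.codex/auth.json), guard the
// file size, then read and parse the JSON credentials.
async function loadCodexAuth(customPath?: string): Promise<Record<string, unknown>> {
	const resolvedPath = customPath ?? path.join(os.homedir(), ".codex", "auth.json")
	let size: number
	try {
		size = (await fs.stat(resolvedPath)).size
	} catch (e) {
		// Read failures (missing/inaccessible file) get their own message.
		throw new Error(`Failed to read OAuth credentials at ${resolvedPath}: ${e}`)
	}
	if (size > MAX_OAUTH_SIZE) {
		// Size violations keep a distinct, size-specific message.
		throw new Error(`OAuth credentials file too large: ${size} bytes (max ${MAX_OAUTH_SIZE})`)
	}
	return JSON.parse(await fs.readFile(resolvedPath, "utf8")) as Record<string, unknown>
}
```

Keeping the stat failure and the size check on separate error paths is exactly the point raised in the review comments below.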

Reviewed Changes

Copilot reviewed 14 out of 14 changed files in this pull request and generated 2 comments.

Summary per file:

  • src/api/providers/openai-native-codex.ts — New provider implementation with auth.json credentials and Codex API integration
  • packages/types/src/providers/openai-codex.ts — Type definitions for Codex models (gpt-5, gpt-5-codex)
  • webview-ui/src/components/settings/providers/OpenAiNativeCodex.tsx — React component for Codex provider settings UI
  • webview-ui/src/i18n/locales/en/settings.json — Added validation text for optional auth path
  • webview-ui/src/components/settings/ApiOptions.tsx — Added provider URL mapping and settings integration
  • Multiple configuration files — Updated provider registrations, model mappings, and type definitions


@hannesrudolph hannesrudolph added the Issue/PR - Triage New issue. Needs quick review to confirm validity and assign labels. label Sep 24, 2025
@hannesrudolph hannesrudolph moved this from Triage to PR [Needs Prelim Review] in Roo Code Roadmap Sep 24, 2025
roomote[bot]

This comment was marked as outdated.

@hannesrudolph hannesrudolph added PR - Needs Preliminary Review and removed Issue/PR - Triage New issue. Needs quick review to confirm validity and assign labels. labels Sep 24, 2025
@RayBytes

I saw that the model codex-mini-latest is missing here; you might also want to add it as an option for people who wish to use it. It's a tuned version of o4-mini, IIRC.

@daniel-lxs daniel-lxs moved this from PR [Needs Prelim Review] to PR [Draft / In Progress] in Roo Code Roadmap Sep 25, 2025
@daniel-lxs daniel-lxs marked this pull request as draft September 25, 2025 20:17
@hannesrudolph hannesrudolph force-pushed the feat/openai-native-codex-polish branch 2 times, most recently from 47043f3 to 5d63945 Compare September 25, 2025 23:09
Collaborator Author

@hannesrudolph hannesrudolph left a comment


Implemented requested follow-ups: corrected comment to reflect 'medium' default, improved error guidance for auth.json read/parse failures, and added codex-mini-latest model.

@hannesrudolph
Collaborator Author

hannesrudolph commented Sep 26, 2025

> I saw that the model codex-mini-latest is missing here, might also want to add it as a option to be used if people wish too, its a tuned version of o4-mini iirc.

done

@github-project-automation github-project-automation bot moved this from Triage to Done in Roo Code Roadmap Dec 23, 2025
@github-project-automation github-project-automation bot moved this from New to Done in Roo Code Roadmap Dec 23, 2025
Comment on lines 138 to 163
```typescript
// Guard file size before reading to prevent loading unexpectedly large files
const MAX_OAUTH_SIZE = 1_000_000 // 1 MB
try {
    const stat = await fs.stat(resolvedPath)
    if (stat.size > MAX_OAUTH_SIZE) {
        throw new Error(
            t("common:errors.openaiNativeCodex.oauthFileTooLarge", {
                path: resolvedPath,
                size: stat.size,
                max: MAX_OAUTH_SIZE,
            }),
        )
    }
} catch (e: any) {
    // Surface read failure with localized error (e.g., file missing or inaccessible)
    const base = t("common:errors.openaiNativeCodex.oauthReadFailed", {
        path: resolvedPath,
        error: e?.message || String(e),
    })
    throw new Error(base)
}
```
Contributor


The oauthFileTooLarge error path is currently being caught and re-thrown as oauthReadFailed, which hides the more specific user-facing message. Splitting the fs.stat failure handling from the explicit size check keeps the correct i18n key.

Suggested change (replacing the block quoted above):

```typescript
// Guard file size before reading to prevent loading unexpectedly large files
const MAX_OAUTH_SIZE = 1_000_000 // 1 MB
let stat
try {
    stat = await fs.stat(resolvedPath)
} catch (e: any) {
    throw new Error(
        t("common:errors.openaiNativeCodex.oauthReadFailed", {
            path: resolvedPath,
            error: e?.message || String(e),
        }),
    )
}
if (stat.size > MAX_OAUTH_SIZE) {
    throw new Error(
        t("common:errors.openaiNativeCodex.oauthFileTooLarge", {
            path: resolvedPath,
            size: stat.size,
            max: MAX_OAUTH_SIZE,
        }),
    )
}
```

Fix it with Roo Code or mention @roomote and request a fix.

```typescript
    vertex: "apiModelId",
    "openai-native": "openAiModelId",
    "openai-native-codex": "apiModelId",
    ollama: "ollamaModelId",
```
Contributor


For openai-native-codex, modelIdKeysByProvider is set to apiModelId, but OpenAiNativeCodexHandler reads options.apiModelId and the UI writes apiModelId. Keeping this mapping as openAiModelId would likely make codex ignore the user's model selection (and could break org allowlist checks that read by modelIdKeysByProvider).

Suggested change

```typescript
    ollama: "ollamaModelId",
    "openai-native-codex": "apiModelId",
```


@hannesrudolph hannesrudolph reopened this Jan 10, 2026
@github-project-automation github-project-automation bot moved this from Done to New in Roo Code Roadmap Jan 10, 2026
@roomote
Contributor

roomote bot commented Jan 10, 2026


Review complete. A few correctness/consistency issues remain that should be addressed before merge.

  • Fix oauthFileTooLarge being masked by the oauthReadFailed catch path in src/api/providers/openai-native-codex.ts.
  • Confirm modelIdKeysByProvider mapping for openai-native-codex aligns with the field the UI/provider actually use (avoid model-id-field mismatches in validation/allowlist).
  • Remove or implement the empty if (parsed.response?.service_tier) {} block in the Codex SSE parser (src/api/providers/openai-native-codex.ts).

Mention @roomote in a comment to request specific changes to this pull request or fix all unresolved issues.
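
The SSE parser cleanup called out above can be illustrated with a minimal sketch of the kind of line handling such a parser needs. The event shapes here are assumptions modeled on OpenAI-style streaming responses, and `parseSseLine` is a hypothetical helper, not the handler's actual code:

```typescript
// Parse one SSE line: only "data: " lines carry payloads, "[DONE]" ends the
// stream, and malformed JSON chunks are skipped rather than thrown.
function parseSseLine(line: string): { text?: string } | undefined {
	if (!line.startsWith("data: ")) return undefined
	const data = line.slice("data: ".length)
	if (data === "[DONE]") return undefined
	try {
		const parsed = JSON.parse(data)
		// Only surface fields the consumer actually uses; an empty conditional
		// like `if (parsed.response?.service_tier) {}` would be dead code.
		if (typeof parsed.delta === "string") return { text: parsed.delta }
		return {}
	} catch {
		return undefined // ignore malformed JSON chunks
	}
}
```

For example, `parseSseLine('data: {"delta":"hi"}')` yields `{ text: "hi" }`, while comment lines and `[DONE]` yield `undefined`.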

Comment on lines 140 to 163
try {
const stat = await fs.stat(resolvedPath)
if (stat.size > MAX_OAUTH_SIZE) {
throw new Error(
t("common:errors.openaiNativeCodex.oauthFileTooLarge", {
path: resolvedPath,
size: stat.size,
max: MAX_OAUTH_SIZE,
}),
)
}
} catch (e: any) {
// Surface read failure with localized error (e.g., file missing or inaccessible)
const base = t("common:errors.openaiNativeCodex.oauthReadFailed", {
path: resolvedPath,
error: e?.message || String(e),
})
throw new Error(base)
}
Contributor


oauthFileTooLarge is currently masked: the fs.stat() guard throws a size error, but the surrounding try/catch treats it like a read failure and rethrows oauthReadFailed, so users never see the size-specific message.

Suggested change (replacing the block quoted above):

```typescript
let stat: { size: number }
try {
    stat = await fs.stat(resolvedPath)
} catch (e: any) {
    throw new Error(
        t("common:errors.openaiNativeCodex.oauthReadFailed", {
            path: resolvedPath,
            error: e?.message || String(e),
        }),
    )
}
if (stat.size > MAX_OAUTH_SIZE) {
    throw new Error(
        t("common:errors.openaiNativeCodex.oauthFileTooLarge", {
            path: resolvedPath,
            size: stat.size,
            max: MAX_OAUTH_SIZE,
        }),
    )
}
```


```typescript
    bedrock: "apiModelId",
    vertex: "apiModelId",
    "openai-native": "openAiModelId",
    "openai-native-codex": "apiModelId",
```
Contributor


modelIdKeysByProvider maps openai-native-codex to apiModelId, but the UI also sets apiModelId on every model change for all providers (see ApiOptions), while some flows use provider-specific model id keys to read the selected model. This mismatch can cause allowlist validation to look at a different field than what the Codex provider actually uses, depending on the caller. Suggest aligning Codex with the existing OpenAI-native pattern: use a provider-specific key (e.g. openAiModelId) consistently, or change the UI so Codex does not rely on apiModelId being mirrored.

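
The mismatch risk described above can be made concrete with a small sketch: if `modelIdKeysByProvider` points at a key the UI never writes for a provider, the lookup silently yields `undefined`. Names mirror the snippet quoted above; `selectedModelId` is a hypothetical helper:

```typescript
type ProviderName = "openai-native" | "openai-native-codex"

// Mapping as quoted in the diff above.
const modelIdKeysByProvider: Record<ProviderName, string> = {
	"openai-native": "openAiModelId",
	"openai-native-codex": "apiModelId",
}

// Read the selected model through the provider-specific key.
function selectedModelId(
	provider: ProviderName,
	settings: Record<string, string | undefined>,
): string | undefined {
	return settings[modelIdKeysByProvider[provider]]
}

// The Codex UI writes apiModelId, so reading through apiModelId finds it:
console.log(selectedModelId("openai-native-codex", { apiModelId: "gpt-5" })) // "gpt-5"
// But if a caller's settings only carry apiModelId while the mapping reads
// openAiModelId (the openai-native case), the lookup comes back undefined.
```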

```typescript
const parsed = JSON.parse(data)
// Persist tier when available (parity with openai-native)
if (parsed.response?.service_tier) {
}
```
Contributor


This if (parsed.response?.service_tier) {} block is empty, so it is dead code and looks like an unfinished implementation. Either remove it or actually persist the tier (if it is meaningful for Codex), but leaving an empty conditional makes the SSE parser harder to reason about.

Suggested change

```typescript
}
// (intentionally no service_tier handling for Codex)
```


…ing; SSE parser cleanup

- Validation text for OpenAiNativeCodex auth path now reflects optional with default to ~/.codex/auth.json
- Provider Documentation link: map openai-native-codex -> openai to avoid 404
- Remove unused hasContent in SSE parser to satisfy lint
- Verified: cd src && npx vitest run (all tests passed)
…nimal effort; add request timeout; improve auth.json error guidance; update ChatGPT link; clear reasoningEffort on model change

- Remove service_tier handling entirely for Codex (server decides tier)
- Do not auto-default GPT-5 to minimal for Codex; use model/user defaults only
- Add AbortController using getApiRequestTimeout() to prevent hanging requests
- Improve auth.json error messages with Codex CLI guidance
- Update settings link to chatgpt.com
- Clear reasoningEffort on model change for Codex like native OpenAI
…le Codex system prompt and override rationale

- Add i18n keys under common.errors.openaiNativeCodex and use t() in handler
- Explain immutability and strategy where we inject overrides in OpenAiNativeCodexHandler
- Add commentary to codex prompt file describing canonical prompt and override rationale
…used const, strengthen typing for message transform and auth.json parsing, extract helpers for testability
…e guard, remove previous_response_id, avoid as-any reasoning mutation
…or Codex

- Added error message for oversized OAuth credential files in multiple languages (German, Spanish, French, Hindi, Indonesian, Italian, Japanese, Korean, Dutch, Polish, Portuguese, Russian, Turkish, Vietnamese, Chinese Simplified, Chinese Traditional).
- Updated settings localization to include OAuth credential path descriptions and related information for Codex in multiple languages.
- Fix oauthFileTooLarge error being masked by oauthReadFailed catch path
- Remove empty if (parsed.response?.service_tier) {} block in SSE parser
- Remove unused shouldShowMinimalOption helper from ThinkingBudget
- modelIdKeysByProvider mapping confirmed correct (uses apiModelId)
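
The AbortController-based request timeout mentioned in the commit messages above can be sketched as a generic wrapper; `API_REQUEST_TIMEOUT_MS` stands in for the real `getApiRequestTimeout()`, and `withTimeout` is an illustrative helper, not the handler's actual implementation:

```typescript
const API_REQUEST_TIMEOUT_MS = 30_000 // stand-in for getApiRequestTimeout()

// Run async work with an abort signal that fires after timeoutMs, and always
// clear the timer so the process can exit promptly.
async function withTimeout<T>(
	work: (signal: AbortSignal) => Promise<T>,
	timeoutMs: number = API_REQUEST_TIMEOUT_MS,
): Promise<T> {
	const controller = new AbortController()
	const timer = setTimeout(() => controller.abort(), timeoutMs)
	try {
		return await work(controller.signal)
	} finally {
		clearTimeout(timer)
	}
}

// Usage with a streaming API request might look like:
//   const res = await withTimeout((signal) => fetch(url, { signal }))
```

Passing the signal through to `fetch` is what actually prevents hanging requests: when the timer fires, the in-flight request rejects with an abort error instead of waiting forever.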
@hannesrudolph hannesrudolph force-pushed the feat/openai-native-codex-polish branch from bf0908d to 4275c92 Compare January 10, 2026 02:49

Labels

  • Documentation — Improvements or additions to documentation
  • Enhancement — New feature or request
  • lgtm — This PR has been approved by a maintainer
  • PR - Draft / In Progress
  • size:XL — This PR changes 500-999 lines, ignoring generated files.
  • UI/UX — UI/UX related or focused

Projects

Archived in project

Development

Successfully merging this pull request may close these issues.

[ENHANCEMENT] Codex CLI provider with local login (no API key)

5 participants