fix: preserve custom model IDs instead of falling back to defaults #11943

Draft
roomote-v0[bot] wants to merge 1 commit into main from fix/custom-model-id-fallback
Conversation


roomote-v0[bot] (Contributor) commented on Mar 17, 2026

Summary

This PR attempts to address Issue #11936. When users configure a custom model ID (e.g., for use with a third-party API gateway), several providers silently replaced it with the default model ID whenever it was not found in the predefined models map. As a result, custom models configured via the UI were ignored.

Problem

The pattern across many providers was:

let id = modelId && modelId in knownModels ? (modelId as KnownModelId) : defaultModelId

This means if a user sets apiModelId to "my-custom-model" and it is not in the predefined models map, both the model ID and the model info fall back to the default -- the user's configured model is completely ignored.
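
A minimal sketch of the failure mode (the model names and the contents of knownModels here are hypothetical, not the actual provider definitions):

```typescript
// Hypothetical known-models map; real providers define much richer metadata.
const knownModels = {
	"default-model": { contextWindow: 200_000 },
} as const

type KnownModelId = keyof typeof knownModels
const defaultModelId: KnownModelId = "default-model"

// The old pattern: an unknown ID silently collapses to the default.
function resolveOldId(modelId?: string): KnownModelId {
	return modelId && modelId in knownModels ? (modelId as KnownModelId) : defaultModelId
}

console.log(resolveOldId("my-custom-model")) // "default-model" -- the custom ID is discarded
console.log(resolveOldId(undefined)) // "default-model"
```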

Fix

Changed to follow the pattern already used by DeepSeek and Moonshot:

const id = modelId ?? defaultModelId
const info = knownModels[id as KnownModelId] ?? knownModels[defaultModelId]

Now:

  • If modelId is provided, it is always used as the model ID in API requests
  • If modelId is not in the known models map, the default model's info (capabilities, pricing, context window, etc.) is used as a sensible fallback
  • If modelId is not provided at all (undefined/null), the default model ID is used as before
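
The three cases above can be sketched as follows (the ModelInfo shape and model names are illustrative assumptions, not the provider's actual types):

```typescript
interface ModelInfo {
	contextWindow: number
}

// Hypothetical known-models map for illustration.
const knownModels: Record<string, ModelInfo> = {
	"default-model": { contextWindow: 200_000 },
}
const defaultModelId = "default-model"

// The new pattern: the user's ID is preserved; only the metadata falls back.
function resolveModel(modelId?: string | null): { id: string; info: ModelInfo } {
	const id = modelId ?? defaultModelId
	const info = knownModels[id] ?? knownModels[defaultModelId]
	return { id, info }
}

// Custom ID: sent to the API as-is, with the default model's metadata as fallback.
console.log(resolveModel("my-custom-model").id) // "my-custom-model"
// Known ID: resolved normally.
console.log(resolveModel("default-model").id) // "default-model"
// No ID configured: default used, as before.
console.log(resolveModel(undefined).id) // "default-model"
```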

Affected Providers

  • base-openai-compatible-provider.ts (affects SambaNova, Fireworks, Baseten, and others extending it)
  • anthropic.ts
  • gemini.ts
  • openai-native.ts
  • openai-codex.ts
  • vertex.ts
  • anthropic-vertex.ts
  • minimax.ts
  • xai.ts

Testing

  • All 789 existing provider tests pass
  • Updated 2 tests that explicitly tested the old fallback-to-default behavior to verify the new preserve-custom-model-id behavior
  • All lints and type checks pass

Feedback and guidance are welcome.


When users configure a custom model ID (e.g. for third-party gateways),
several providers silently replaced it with the default model ID if it
was not found in the predefined models map. This fix ensures the
user-provided model ID is always used as-is, with only the model info
(capabilities, pricing, etc.) falling back to the default when the
custom model is not in the known models map.

Affected providers: Anthropic, Gemini, OpenAI Native, OpenAI Codex,
Vertex, Anthropic-Vertex, Minimax, xAI, and all providers extending
BaseOpenAiCompatibleProvider (SambaNova, Fireworks, Baseten, etc.).

Fixes #11936
