fix: preserve custom model IDs instead of falling back to defaults #11943
Draft
roomote-v0[bot] wants to merge 1 commit into main
Conversation
When users configure a custom model ID (e.g. for third-party gateways), several providers silently replaced it with the default model ID if it was not found in the predefined models map. This fix ensures the user-provided model ID is always used as-is, with only the model info (capabilities, pricing, etc.) falling back to the default when the custom model is not in the known models map. Affected providers: Anthropic, Gemini, OpenAI Native, OpenAI Codex, Vertex, Anthropic-Vertex, Minimax, xAI, and all providers extending BaseOpenAiCompatibleProvider (SambaNova, Fireworks, Baseten, etc.). Fixes #11936
Summary
This PR attempts to address Issue #11936. When users configure a custom model ID (e.g. for use with third-party API gateways), several providers silently replaced it with the default model ID if it was not found in the predefined models map. This meant custom models configured via the UI were ignored.
Problem
The pattern across many providers was:
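The original snippet is not preserved in this page capture; the following is a minimal sketch of that problematic lookup, with illustrative names (models, defaultModelId, getModelOld) rather than the exact identifiers used in the providers:

```typescript
// Sketch of the OLD lookup pattern. All identifiers here are
// illustrative, not the exact names in the provider source files.
interface ModelInfo {
	contextWindow: number
}

const models: Record<string, ModelInfo> = {
	"provider-default-model": { contextWindow: 200_000 },
}
const defaultModelId = "provider-default-model"

function getModelOld(apiModelId?: string): { id: string; info: ModelInfo } {
	// BUG: when the user's ID is not in the predefined map, BOTH the
	// id and the info silently fall back to the default model.
	const id = apiModelId && apiModelId in models ? apiModelId : defaultModelId
	return { id, info: models[id] }
}
```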
This means if a user sets apiModelId to "my-custom-model" and it is not in the predefined models map, both the model ID and the model info fall back to the default -- the user's configured model is completely ignored.

Fix
Changed to follow the pattern already used by DeepSeek and Moonshot:
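That corrected pattern can be sketched as follows; as above, the identifiers (models, defaultModelId, getModel) are illustrative, not the exact names in the codebase:

```typescript
// Sketch of the NEW lookup pattern (DeepSeek/Moonshot style).
// All identifiers here are illustrative.
interface ModelInfo {
	contextWindow: number
}

const models: Record<string, ModelInfo> = {
	"provider-default-model": { contextWindow: 200_000 },
}
const defaultModelId = "provider-default-model"

function getModel(apiModelId?: string): { id: string; info: ModelInfo } {
	// The user-provided ID is always used as-is in API requests; only
	// the model *info* falls back to the default entry when the ID is
	// not in the known models map.
	const id = apiModelId ?? defaultModelId
	const info = models[id] ?? models[defaultModelId]
	return { id, info }
}
```

Using nullish coalescing (??) rather than a truthiness check means undefined/null fall back to the default ID while any provided string is preserved.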
Now:
- If modelId is provided, it is always used as the model ID in API requests
- If modelId is not in the known models map, the default model's info (capabilities, pricing, context window, etc.) is used as a sensible fallback
- If modelId is not provided at all (undefined/null), the default model ID is used as before

Affected Providers
- base-openai-compatible-provider.ts (affects SambaNova, Fireworks, Baseten, and others extending it)
- anthropic.ts
- gemini.ts
- openai-native.ts
- openai-codex.ts
- vertex.ts
- anthropic-vertex.ts
- minimax.ts
- xai.ts

Testing
Feedback and guidance are welcome.