feat: add Novita AI as LLM provider #3109
Conversation
- Add AIProvider_Novita constant to provider list
- Add Novita endpoint URL and API token secret name constants
- Add Novita provider defaults (OpenAI-compatible API with tools and images support)
- Register Novita in schema and config enums

Users can now use Novita AI by setting ai:provider to 'novita' and storing their API key in the NOVITA_API_KEY secret.
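For illustration, a hedged sketch of the resulting user configuration (the `ai:provider` key and value come from the description above; whether additional keys are required is not verified here):

```json
{
  "ai:provider": "novita"
}
```

The API key itself does not go into settings; per the description it is stored in the NOVITA_API_KEY secret.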
Walkthrough
This pull request adds support for Novita as an AI provider, introducing a new provider constant along with endpoint, secret-name, and capability defaults.
Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~12 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
Code Review Summary
Status: No Issues Found | Recommendation: Merge
Files Reviewed: 4
The implementation follows the established patterns for other AI providers (Groq, OpenRouter, NanoGPT) and uses an OpenAI-compatible API endpoint as described in the PR.
Reviewed by minimax-m2.5 · 240,008 tokens
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@pkg/aiusechat/usechat-mode.go`:
- Around line 130-143: When config.Provider equals uctypes.AIProvider_Novita,
force config.APIType to uctypes.APIType_OpenAIChat unconditionally (do not allow
a manual ai:apitype override such as openai-responses) to prevent mismatched
request/endpoint behavior; update the block around
config.Provider/uctypes.AIProvider_Novita to always set config.APIType =
uctypes.APIType_OpenAIChat (overwriting any existing value) and optionally add a
short comment explaining Novita only supports OpenAI chat/completions endpoints.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Repository UI
Review profile: CHILL
Plan: Pro
Run ID: 40b24f61-e396-4799-9fd3-b7779905442b
📒 Files selected for processing (4)
- pkg/aiusechat/uctypes/uctypes.go
- pkg/aiusechat/usechat-mode.go
- pkg/wconfig/settingsconfig.go
- schema/waveai.json
```go
if config.Provider == uctypes.AIProvider_Novita {
	if config.APIType == "" {
		config.APIType = uctypes.APIType_OpenAIChat
	}
	if config.Endpoint == "" {
		config.Endpoint = NovitaChatEndpoint
	}
	if config.APITokenSecretName == "" {
		config.APITokenSecretName = NovitaAPITokenSecretName
	}
	if len(config.Capabilities) == 0 {
		config.Capabilities = []string{uctypes.AICapabilityTools, uctypes.AICapabilityImages}
	}
}
```
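The block above follows a fill-only-if-empty pattern: each default is applied only when the user left the field blank. A minimal standalone sketch of that pattern (type and constant names are simplified stand-ins, not the actual uctypes definitions):

```go
package main

import "fmt"

// Simplified stand-ins for the real uctypes constants (names assumed for illustration).
const (
	providerNovita     = "novita"
	apiTypeOpenAIChat  = "openai-chat"
	novitaChatEndpoint = "https://api.novita.ai/openai"
)

type aiConfig struct {
	Provider string
	APIType  string
	Endpoint string
}

// applyNovitaDefaults fills only the fields the user left empty,
// mirroring the if-empty checks in the block above.
func applyNovitaDefaults(c *aiConfig) {
	if c.Provider != providerNovita {
		return
	}
	if c.APIType == "" {
		c.APIType = apiTypeOpenAIChat
	}
	if c.Endpoint == "" {
		c.Endpoint = novitaChatEndpoint
	}
}

func main() {
	c := aiConfig{Provider: providerNovita}
	applyNovitaDefaults(&c)
	fmt.Println(c.APIType, c.Endpoint)
}
```

Note that a user-supplied value (for example a custom Endpoint) survives untouched, which is exactly what the review below argues is unsafe for APIType.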
🧩 Analysis chain
🌐 Web query:
Novita AI OpenAI-compatible API docs: does Novita support /v1/responses, or is /v1/chat/completions the only supported generation endpoint? Please provide the official documentation URL.
💡 Result:
Novita AI does not support the /v1/responses endpoint. The supported OpenAI-compatible generation endpoints are /v1/chat/completions and /v1/completions; no mention of /v1/responses appears anywhere in the official documentation.
Citations:
- 1: https://novita.ai/docs/api-reference/api-reference-overview (API Reference Overview; lists LLM APIs including Create Chat Completion and Create Completion)
- 2: https://novita.ai/docs/model-api/reference/llm/llm.html and https://novita.ai/docs/guides/llm-api (LLM API Guide; curl examples for /openai/v1/chat/completions and /openai/v1/completions)
- 3: https://novita.ai/docs/api-reference/model-apis-llm-list-models (List Models; compatible with OpenAI /v1/models)
🏁 Script executed:
```shell
# Find definitions of APIType in the codebase
fd -t f "\.go$" | xargs rg "APIType_OpenAI|type APIType|const.*APIType" | head -50
```
Repository: wavetermdev/waveterm · Length of output: 3747

🏁 Script executed:
```shell
# Look for the uctypes package to understand available APIType values
fd -t f "uctypes" -o -t d "uctypes" | head -20
```
Repository: wavetermdev/waveterm · Length of output: 180

🏁 Script executed:
```shell
# Search for how the Novita provider is configured elsewhere in the codebase
rg "AIProvider_Novita" -A 5 -B 5 | head -100
```
Repository: wavetermdev/waveterm · Length of output: 1389

🏁 Script executed:
```shell
# Find the NovitaChatEndpoint definition
rg "NovitaChatEndpoint|NovitaAPITokenSecretName" -A 2 -B 2
```
Repository: wavetermdev/waveterm · Length of output: 1428

🏁 Script executed:
```shell
# Look for endpoint URL patterns and how APIType maps to endpoints
rg "chat/completions|/responses|Endpoint.*APIType" -B 3 -A 3 | head -100
```
Repository: wavetermdev/waveterm · Length of output: 6797

🏁 Script executed:
```shell
# Check usechat-backend.go to see how APIType_OpenAIResponses vs APIType_OpenAIChat are handled
cat pkg/aiusechat/usechat-backend.go | head -200
```
Repository: wavetermdev/waveterm · Length of output: 8505

🏁 Script executed:
```shell
# Look for validation of APIType/endpoint combinations
rg "isValidAzureResourceName|isAzureAPIType|isOpenAIAPIType" -B 2 -A 10
```
Repository: wavetermdev/waveterm · Length of output: 3454

🏁 Script executed:
```shell
# Check if there's any validation or constraint for Novita in particular
rg -A 20 "func.*Novita|Novita.*func" pkg/aiusechat/usechat-mode.go
```
Repository: wavetermdev/waveterm · Length of output: 46

🏁 Script executed:
```shell
# Look at the OpenAI provider configuration to see if it has similar constraints
rg -B 5 -A 15 "AIProvider_OpenAI {" pkg/aiusechat/usechat-mode.go
```
Repository: wavetermdev/waveterm · Length of output: 165

🏁 Script executed:
```shell
# View usechat-mode.go around the Novita section for full context
sed -n '120,200p' pkg/aiusechat/usechat-mode.go
```
Repository: wavetermdev/waveterm · Length of output: 2840
Constrain Novita provider to OpenAI Chat API to prevent endpoint/request format mismatches.
Novita's OpenAI-compatible API only supports the /v1/chat/completions and /v1/completions endpoints, not /v1/responses. The current code defaults to Chat Completions but allows ai:apitype to be manually set to openai-responses, which would cause runtime 4xx failures because the request format would not match the actual endpoint.
Suggested hardening:

```diff
 if config.Provider == uctypes.AIProvider_Novita {
-	if config.APIType == "" {
-		config.APIType = uctypes.APIType_OpenAIChat
-	}
+	// Novita only supports chat-completions; force to prevent APIType/endpoint mismatch.
+	if config.APIType == "" || config.APIType != uctypes.APIType_OpenAIChat {
+		config.APIType = uctypes.APIType_OpenAIChat
+	}
 	if config.Endpoint == "" {
 		config.Endpoint = NovitaChatEndpoint
 	}
 	if config.APITokenSecretName == "" {
 		config.APITokenSecretName = NovitaAPITokenSecretName
 	}
 	if len(config.Capabilities) == 0 {
 		config.Capabilities = []string{uctypes.AICapabilityTools, uctypes.AICapabilityImages}
 	}
 }
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```go
if config.Provider == uctypes.AIProvider_Novita {
	// Novita only supports chat-completions; force to prevent APIType/endpoint mismatch.
	if config.APIType == "" || config.APIType != uctypes.APIType_OpenAIChat {
		config.APIType = uctypes.APIType_OpenAIChat
	}
	if config.Endpoint == "" {
		config.Endpoint = NovitaChatEndpoint
	}
	if config.APITokenSecretName == "" {
		config.APITokenSecretName = NovitaAPITokenSecretName
	}
	if len(config.Capabilities) == 0 {
		config.Capabilities = []string{uctypes.AICapabilityTools, uctypes.AICapabilityImages}
	}
}
```
Summary
Add Novita AI as a new LLM provider. Novita offers an OpenAI-compatible API with competitive pricing and a wide selection of open-source models.
Changes
- API key is read from the NOVITA_API_KEY environment variable

Configuration
- Endpoint: https://api.novita.ai/openai
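Putting the configuration together, here is a hedged sketch of building a chat-completions request against this endpoint. The /v1/chat/completions path is taken from the documentation quoted in the review above; the payload fields are the standard OpenAI-compatible ones, and the model name is a placeholder, not Wave-specific code:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
)

// buildNovitaRequest constructs (but does not send) an OpenAI-compatible
// chat-completions request against the Novita base endpoint.
func buildNovitaRequest(baseURL, apiKey, model, prompt string) (*http.Request, error) {
	payload := map[string]any{
		"model": model,
		"messages": []map[string]string{
			{"role": "user", "content": prompt},
		},
	}
	body, err := json.Marshal(payload)
	if err != nil {
		return nil, err
	}
	req, err := http.NewRequest("POST", baseURL+"/v1/chat/completions", bytes.NewReader(body))
	if err != nil {
		return nil, err
	}
	req.Header.Set("Authorization", "Bearer "+apiKey)
	req.Header.Set("Content-Type", "application/json")
	return req, nil
}

func main() {
	// "some-model" is a placeholder; pick a model from Novita's /v1/models listing.
	req, err := buildNovitaRequest("https://api.novita.ai/openai", os.Getenv("NOVITA_API_KEY"), "some-model", "hello")
	if err != nil {
		fmt.Println("error:", err)
		return
	}
	fmt.Println(req.Method, req.URL.String())
}
```

Using /v1/responses instead of /v1/chat/completions here would fail against Novita, which is exactly the mismatch the review comment asks the provider defaults to rule out.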