
feat: add Novita AI as LLM provider#3109

Open
Alex-wuhu wants to merge 1 commit into wavetermdev:main from Alex-wuhu:novita-integration

Conversation

@Alex-wuhu

Summary

Add Novita AI as a new LLM provider. Novita offers an OpenAI-compatible API with competitive pricing and a wide selection of open-source models.

Changes

  • New Novita provider following existing provider patterns
  • API key configuration via NOVITA_API_KEY environment variable
  • Support for chat, completion, and embedding models

Configuration

NOVITA_API_KEY=your_key_here

Endpoint: https://api.novita.ai/openai
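
A minimal settings sketch to go with the above (the "ai:provider" value is described in this PR; the "ai:model" key and its value are hypothetical illustrations, not taken from this changeset):

```json
{
  "ai:provider": "novita",
  "ai:model": "your-model-id-here"
}
```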

- Add AIProvider_Novita constant to provider list
- Add Novita endpoint URL and API token secret name constants
- Add Novita provider defaults (OpenAI-compatible API with tools and images support)
- Register Novita in schema and config enums

Users can now use Novita AI by setting ai:provider to 'novita' and
storing their API key in the NOVITA_API_KEY secret.
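
The defaults wiring described above can be sketched roughly as follows. This is a simplified, self-contained approximation: the constant names mirror those added by the PR, but the AIConfig struct shape, the capability strings, and the applyNovitaDefaults helper are stand-ins for the real types in pkg/aiusechat, not the actual implementation.

```go
package main

import "fmt"

// Simplified stand-ins for the constants this PR adds; the real definitions
// live in pkg/aiusechat/uctypes and pkg/aiusechat.
const (
	AIProvider_Novita        = "novita"
	APIType_OpenAIChat       = "openai-chat"
	NovitaChatEndpoint       = "https://api.novita.ai/openai"
	NovitaAPITokenSecretName = "NOVITA_API_KEY"
)

// AIConfig is a minimal stand-in for the provider config struct.
type AIConfig struct {
	Provider           string
	APIType            string
	Endpoint           string
	APITokenSecretName string
	Capabilities       []string
}

// applyNovitaDefaults fills in any unset fields with Novita defaults,
// mirroring the pattern used for the other OpenAI-compatible providers.
func applyNovitaDefaults(config *AIConfig) {
	if config.Provider != AIProvider_Novita {
		return
	}
	if config.APIType == "" {
		config.APIType = APIType_OpenAIChat
	}
	if config.Endpoint == "" {
		config.Endpoint = NovitaChatEndpoint
	}
	if config.APITokenSecretName == "" {
		config.APITokenSecretName = NovitaAPITokenSecretName
	}
	if len(config.Capabilities) == 0 {
		config.Capabilities = []string{"tools", "images"}
	}
}

func main() {
	cfg := AIConfig{Provider: AIProvider_Novita}
	applyNovitaDefaults(&cfg)
	fmt.Println(cfg.Endpoint, cfg.APIType)
}
```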
@coderabbitai
Contributor

coderabbitai bot commented Mar 24, 2026

Walkthrough

This pull request adds support for Novita as an AI provider. Changes include introducing a new AIProvider_Novita constant, configuring default settings for Novita provider with endpoint and API token details, and updating schema validation across multiple configuration files to recognize "novita" as a valid provider option.

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~12 minutes

🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)

  • Docstring Coverage — ⚠️ Warning: docstring coverage is 0.00%, below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them to satisfy the coverage threshold.

✅ Passed checks (2 passed)
  • Title check — ✅ Passed: the pull request title accurately summarizes the main change (adding Novita AI as a new LLM provider) and is concise, specific, and clear.
  • Description check — ✅ Passed: the description is directly related to the changeset, providing context about Novita AI, listing the key changes made, and documenting the configuration requirements.


@kilo-code-bot
Contributor

kilo-code-bot bot commented Mar 24, 2026

Code Review Summary

Status: No Issues Found | Recommendation: Merge

Files Reviewed (4 files)
  • pkg/aiusechat/uctypes/uctypes.go - Added novita provider constant
  • pkg/aiusechat/usechat-mode.go - Added endpoint, API key secret name, and provider defaults
  • pkg/wconfig/settingsconfig.go - Added novita to jsonschema enum
  • schema/waveai.json - Added novita to JSON schema enum

The implementation follows the established patterns for other AI providers (Groq, OpenRouter, NanoGPT) and uses an OpenAI-compatible API endpoint as described in the PR.


Reviewed by minimax-m2.5 · 240,008 tokens

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@pkg/aiusechat/usechat-mode.go`:
- Around line 130-143: When config.Provider equals uctypes.AIProvider_Novita,
force config.APIType to uctypes.APIType_OpenAIChat unconditionally (do not allow
a manual ai:apitype override such as openai-responses) to prevent mismatched
request/endpoint behavior; update the block around
config.Provider/uctypes.AIProvider_Novita to always set config.APIType =
uctypes.APIType_OpenAIChat (overwriting any existing value) and optionally add a
short comment explaining Novita only supports OpenAI chat/completions endpoints.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Repository UI

Review profile: CHILL

Plan: Pro

Run ID: 40b24f61-e396-4799-9fd3-b7779905442b

📥 Commits

Reviewing files that changed from the base of the PR and between 2b11043 and fd319e0.

📒 Files selected for processing (4)
  • pkg/aiusechat/uctypes/uctypes.go
  • pkg/aiusechat/usechat-mode.go
  • pkg/wconfig/settingsconfig.go
  • schema/waveai.json

Comment on lines +130 to +143
if config.Provider == uctypes.AIProvider_Novita {
	if config.APIType == "" {
		config.APIType = uctypes.APIType_OpenAIChat
	}
	if config.Endpoint == "" {
		config.Endpoint = NovitaChatEndpoint
	}
	if config.APITokenSecretName == "" {
		config.APITokenSecretName = NovitaAPITokenSecretName
	}
	if len(config.Capabilities) == 0 {
		config.Capabilities = []string{uctypes.AICapabilityTools, uctypes.AICapabilityImages}
	}
}

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🌐 Web query:

Novita AI OpenAI-compatible API docs: does Novita support /v1/responses, or is /v1/chat/completions the only supported generation endpoint? Please provide the official documentation URL.

💡 Result:

Novita AI does not support the /v1/responses endpoint. The supported OpenAI-compatible generation endpoints are /v1/chat/completions and /v1/completions. No mention of /v1/responses appears in the official documentation across multiple pages. Official documentation URLs:
  • API Reference Overview: https://novita.ai/docs/api-reference/api-reference-overview (lists LLM APIs including Create Chat Completion and Create Completion)
  • LLM API Guide: https://novita.ai/docs/model-api/reference/llm/llm.html and https://novita.ai/docs/guides/llm-api (shows curl examples for /openai/v1/chat/completions and /openai/v1/completions)
  • List Models: https://novita.ai/docs/api-reference/model-apis-llm-list-models (compatible with OpenAI /v1/models)


🏁 Script executed:

# Find definitions of APIType in the codebase
fd -t f "\.go$" | xargs rg "APIType_OpenAI|type APIType|const.*APIType" | head -50

Repository: wavetermdev/waveterm

Length of output: 3747


🏁 Script executed:

# Look for the uctypes package to understand available APIType values
fd -t f "uctypes" -o -t d "uctypes" | head -20

Repository: wavetermdev/waveterm

Length of output: 180


🏁 Script executed:

# Search for how Novita provider is configured elsewhere in the codebase
rg "AIProvider_Novita" -A 5 -B 5 | head -100

Repository: wavetermdev/waveterm

Length of output: 1389


🏁 Script executed:

# Find NovitaChatEndpoint definition
rg "NovitaChatEndpoint|NovitaAPITokenSecretName" -A 2 -B 2

Repository: wavetermdev/waveterm

Length of output: 1428


🏁 Script executed:

# Look for endpoint URL patterns and how APIType maps to endpoints
rg "chat/completions|/responses|Endpoint.*APIType" -B 3 -A 3 | head -100

Repository: wavetermdev/waveterm

Length of output: 6797


🏁 Script executed:

# Check the usechat-backend.go to see how APIType_OpenAIResponses vs APIType_OpenAIChat are handled
cat pkg/aiusechat/usechat-backend.go | head -200

Repository: wavetermdev/waveterm

Length of output: 8505


🏁 Script executed:

# Look for validation of APIType/endpoint combinations
rg "isValidAzureResourceName|isAzureAPIType|isOpenAIAPIType" -B 2 -A 10

Repository: wavetermdev/waveterm

Length of output: 3454


🏁 Script executed:

# Check if there's any validation or constraint for Novita in particular
rg -A 20 "func.*Novita|Novita.*func" pkg/aiusechat/usechat-mode.go

Repository: wavetermdev/waveterm

Length of output: 46


🏁 Script executed:

# Look at OpenAI provider configuration to see if they have similar constraints
rg -B 5 -A 15 "AIProvider_OpenAI {" pkg/aiusechat/usechat-mode.go

Repository: wavetermdev/waveterm

Length of output: 165


🏁 Script executed:

# Look for the complete usechat-mode.go file around the Novita section to see full context
sed -n '120,200p' pkg/aiusechat/usechat-mode.go

Repository: wavetermdev/waveterm

Length of output: 2840


Constrain Novita provider to OpenAI Chat API to prevent endpoint/request format mismatches.

Novita's OpenAI-compatible API only supports /v1/chat/completions and /v1/completions endpoints—not /v1/responses. The current code defaults to Chat Completions but allows ai:apitype to be manually set to openai-responses, which would cause runtime 4xx failures if the request formatter doesn't match the actual endpoint.

Suggested hardening
 if config.Provider == uctypes.AIProvider_Novita {
-	if config.APIType == "" {
-		config.APIType = uctypes.APIType_OpenAIChat
-	}
+	// Novita only supports chat-completions; force to prevent APIType/endpoint mismatch.
+	config.APIType = uctypes.APIType_OpenAIChat
 	if config.Endpoint == "" {
 		config.Endpoint = NovitaChatEndpoint
 	}
 	if config.APITokenSecretName == "" {
 		config.APITokenSecretName = NovitaAPITokenSecretName
 	}
 	if len(config.Capabilities) == 0 {
 		config.Capabilities = []string{uctypes.AICapabilityTools, uctypes.AICapabilityImages}
 	}
 }
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
if config.Provider == uctypes.AIProvider_Novita {
	if config.APIType == "" {
		config.APIType = uctypes.APIType_OpenAIChat
	}
	if config.Endpoint == "" {
		config.Endpoint = NovitaChatEndpoint
	}
	if config.APITokenSecretName == "" {
		config.APITokenSecretName = NovitaAPITokenSecretName
	}
	if len(config.Capabilities) == 0 {
		config.Capabilities = []string{uctypes.AICapabilityTools, uctypes.AICapabilityImages}
	}
}
if config.Provider == uctypes.AIProvider_Novita {
	// Novita only supports chat-completions; force to prevent APIType/endpoint mismatch.
	config.APIType = uctypes.APIType_OpenAIChat
	if config.Endpoint == "" {
		config.Endpoint = NovitaChatEndpoint
	}
	if config.APITokenSecretName == "" {
		config.APITokenSecretName = NovitaAPITokenSecretName
	}
	if len(config.Capabilities) == 0 {
		config.Capabilities = []string{uctypes.AICapabilityTools, uctypes.AICapabilityImages}
	}
}
