feat: add MiniMax as a new LLM provider #11367
octo-patch wants to merge 2 commits into continuedev:main
Conversation
Add MiniMax (https://platform.minimax.io) as a new LLM provider with OpenAI-compatible API support.

Changes:
- Add MiniMax LLM provider class extending OpenAI with temperature clamping (must be in (0, 1]) and `response_format` removal
- Register provider in LLMClasses, openai-adapters, and config-types
- Add model info for MiniMax-M2.5 and MiniMax-M2.5-highspeed (204K context, 192K max output)
- Add GUI model selection entries and provider configuration
- Add provider documentation page
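The two request adaptations described above can be sketched as a small body rewrite. This is a hypothetical illustration — the actual class in `core/llm/llms/MiniMax.ts` extends the OpenAI provider, and the function name and clamp floor here are assumptions, not the repo's real API:

```typescript
// Hypothetical sketch of the MiniMax request adaptations.
// MiniMax rejects temperature values outside (0, 1], and does not
// support OpenAI's `response_format` field.

interface ChatBody {
  model: string;
  temperature?: number;
  response_format?: { type: string };
  [key: string]: unknown;
}

function adaptBodyForMiniMax(body: ChatBody): ChatBody {
  const adapted = { ...body };
  if (adapted.temperature !== undefined) {
    // Clamp into (0, 1]: 0 is rejected, so use a small positive floor.
    adapted.temperature = Math.min(Math.max(adapted.temperature, 0.01), 1);
  }
  // response_format is unsupported by the MiniMax API.
  delete adapted.response_format;
  return adapted;
}
```

The exact floor used for the clamp (0.01 here) is an assumption; any small positive value satisfies the (0, 1] constraint.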
I have read the CLA Document and I hereby sign the CLA.

1 out of 2 committers have signed the CLA.
1 issue found across 10 files
Prompt for AI agents (unresolved issues)
Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.
<file name="packages/openai-adapters/src/index.ts">
<violation number="1" location="packages/openai-adapters/src/index.ts:145">
P1: MiniMax is wired to generic `OpenAIApi`, which skips the repo’s MiniMax-specific request fixes (temperature clamping and `response_format` removal), creating a real incompatibility path in adapter-based runtime flows.</violation>
</file>
Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.
I have read the CLA Document and I hereby sign the CLA.
The minimax provider was wired to the generic OpenAIApi via openAICompatible(), which skips MiniMax-specific request fixes. This commit adds a dedicated MiniMaxApi adapter class that overrides modifyChatBody to apply temperature clamping (MiniMax requires temperature in (0.0, 1.0]) and response_format removal, matching the adaptations already present in core/llm/llms/MiniMax.ts.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
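The override pattern this commit describes can be sketched as follows. Note this is an illustrative stub, not the real adapter: the actual OpenAIApi class in `packages/openai-adapters` has a much larger surface, and the `modifyChatBody` signature shown here is assumed from the commit message:

```typescript
// Illustrative stub of the base adapter; the real OpenAIApi in
// openai-adapters does far more than pass the body through.
class OpenAIApi {
  modifyChatBody(body: Record<string, unknown>): Record<string, unknown> {
    return body;
  }
}

// Hypothetical sketch of the MiniMaxApi override described above.
class MiniMaxApi extends OpenAIApi {
  modifyChatBody(body: Record<string, unknown>): Record<string, unknown> {
    const adapted = super.modifyChatBody({ ...body });
    const t = adapted.temperature as number | undefined;
    if (t !== undefined) {
      // MiniMax requires temperature in (0.0, 1.0]; clamp with a
      // small positive floor so 0 is never sent.
      adapted.temperature = Math.min(Math.max(t, 0.01), 1);
    }
    // response_format is unsupported by MiniMax.
    delete adapted.response_format;
    return adapted;
  }
}
```

Because the override delegates to `super.modifyChatBody` first, any generic OpenAI-side body handling still runs before the MiniMax-specific fixes are applied.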
Summary
Changes
New Files
- `core/llm/llms/MiniMax.ts` — Provider class extending OpenAI with:
  - Temperature clamping to the `(0, 1]` range (MiniMax rejects `0`)
  - `response_format` removal (unsupported)
  - Default apiBase of `https://api.minimax.io/v1/`
- `packages/llm-info/src/providers/minimax.ts` — Model metadata
- `docs/customize/model-providers/more/minimax.mdx` — Provider documentation

Modified Files
- `core/llm/llms/index.ts` — Register in LLMClasses
- `packages/openai-adapters/src/index.ts` — Register OpenAI-compatible adapter
- `packages/config-types/src/index.ts` — Add `"minimax"` to provider enum
- `packages/llm-info/src/index.ts` — Add to allModelProviders
- `gui/src/pages/AddNewModel/configs/models.ts` — Model entries
- `gui/src/pages/AddNewModel/configs/providers.ts` — Provider config with API key setup
- `docs/customize/model-providers/overview.mdx` — Add to hosted services table

Models
Test Plan
Summary by cubic
Add MiniMax as an OpenAI-compatible provider with GUI and docs, including `MiniMax-M2.5` and `MiniMax-M2.5-highspeed` (204K context). Adds a dedicated `MiniMaxApi` adapter to enforce MiniMax-specific request rules.

New Features
- New `minimax` provider using `https://api.minimax.io/v1/` via a dedicated `MiniMaxApi` (OpenAI-compatible) adapter with temperature clamping and `response_format` removal.
- Registered in LLMClasses, config types, and `llm-info`.
- Model info for `MiniMax-M2.5` and `MiniMax-M2.5-highspeed`.

Migration
- Configure `provider: minimax` with your API key, or set `MINIMAX_API_KEY`.
- Optionally set `apiBase` to `https://api.minimaxi.com/v1/`.

Written for commit 8104b5b. Summary will update on new commits.
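As a hypothetical illustration of the migration note above, a model entry in a Continue `config.json` might look like the fragment below. The field names (`title`, `provider`, `model`, `apiKey`) are assumptions based on Continue's typical provider configuration shape, not taken from this PR — check the provider documentation page added here for the authoritative format:

```json
{
  "models": [
    {
      "title": "MiniMax-M2.5",
      "provider": "minimax",
      "model": "MiniMax-M2.5",
      "apiKey": "<MINIMAX_API_KEY>"
    }
  ]
}
```

Setting the `MINIMAX_API_KEY` environment variable instead of `apiKey` is described above as an equivalent option.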