
feat: add MiniMax as a new LLM provider #6723

Open
octo-patch wants to merge 1 commit into ChatGPTNextWeb:main from octo-patch:feat/add-minimax-provider

Conversation

@octo-patch

Summary

Add MiniMax as a new LLM provider for NextChat. MiniMax offers an OpenAI-compatible API and capable models, including the MiniMax-M2.5 series with up to a 204K context window.

Changes

  • New files: app/api/minimax.ts (server-side API handler), app/client/platforms/minimax.ts (client platform)
  • Constants: Added MINIMAX_BASE_URL, MiniMax object, ServiceProvider.MiniMax, ModelProvider.MiniMax, ApiPath.MiniMax, and model definitions (MiniMax-M1/M2/M2.5 series)
  • Server config: MINIMAX_API_KEY and MINIMAX_URL environment variables
  • Access store: minimaxUrl, minimaxApiKey state and isValidMiniMax() validation
  • Auth: Server-side API key injection for MiniMax provider
  • Client API: MiniMaxApi class registration and API key resolution
  • Route handler: MiniMax route in the dynamic provider handler
  • Settings UI: MiniMax endpoint and API key configuration fields
  • Locales: English and Chinese translations
  • Documentation: Updated README.md and README_CN.md with MiniMax env vars
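The server-config and access-store changes above can be sketched roughly as follows. The helper shapes (`getMiniMaxConfig`, the `MiniMaxConfig` interface) are illustrative assumptions; only the env var names, `MINIMAX_BASE_URL`, and `isValidMiniMax()` come from the PR description, and the actual implementation may differ:

```typescript
// Illustrative sketch of resolving the MINIMAX_URL / MINIMAX_API_KEY env vars.
// Field and helper names are assumptions; the PR's actual code may differ.
const MINIMAX_BASE_URL = "https://api.minimax.io";

interface MiniMaxConfig {
  minimaxUrl: string;
  minimaxApiKey: string;
}

function getMiniMaxConfig(env: Record<string, string | undefined>): MiniMaxConfig {
  return {
    // a custom MINIMAX_URL overrides the default endpoint
    minimaxUrl: env.MINIMAX_URL || MINIMAX_BASE_URL,
    minimaxApiKey: env.MINIMAX_API_KEY ?? "",
  };
}

// Mirrors the isValidMiniMax() access-store check: the provider is
// considered usable only when an API key has been configured.
function isValidMiniMax(config: MiniMaxConfig): boolean {
  return config.minimaxApiKey.length > 0;
}
```

With this shape, a deployment that sets only `MINIMAX_API_KEY` falls back to the official endpoint, while setting `MINIMAX_URL` points the handler at a proxy.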

Special handling

  • The MiniMax API requires temperature in the half-open interval (0.0, 1.0]; zero values are automatically clamped to 0.01
  • Supports streaming with thinking/reasoning content pass-through
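A minimal sketch of the temperature handling described above. The function name is hypothetical; the PR description only states that zero values are clamped to 0.01 and that the API accepts (0.0, 1.0]:

```typescript
// Hypothetical helper: force temperature into MiniMax's accepted (0.0, 1.0] range.
function clampMiniMaxTemperature(temperature: number): number {
  if (temperature <= 0) return 0.01; // zero (or negative) values get the minimum
  return Math.min(temperature, 1.0); // cap at the interval's upper bound
}
```

For example, a request built with the UI default of `temperature: 0` would be sent to MiniMax with `temperature: 0.01` instead of being rejected.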

Supported Models

  • MiniMax-M1, MiniMax-M1-80k
  • MiniMax-M2, MiniMax-M2-80k
  • MiniMax-M2.5, MiniMax-M2.5-80k
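The model list above might be declared as a simple constant in the constants file; this shape is an assumption, not the PR's exact definition:

```typescript
// Assumed shape of the MiniMax model definitions added to app/constant.ts;
// the real PR may attach extra metadata (provider, context length, etc.).
const minimaxModels = [
  "MiniMax-M1",
  "MiniMax-M1-80k",
  "MiniMax-M2",
  "MiniMax-M2-80k",
  "MiniMax-M2.5",
  "MiniMax-M2.5-80k",
] as const;
```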

Test Plan

  • Verify TypeScript compilation passes (tsc --noEmit)
  • Test MiniMax provider selection in Settings UI
  • Test chat with MiniMax models using API key
  • Verify streaming responses work correctly
  • Verify non-streaming responses work correctly

Add MiniMax (https://api.minimax.io) as a new LLM provider with support
for MiniMax-M1, M2, and M2.5 series models. MiniMax provides an
OpenAI-compatible API with up to a 204K context window.

Changes include:
- New client platform implementation (app/client/platforms/minimax.ts)
- New server-side API handler (app/api/minimax.ts)
- Provider constants, enums, and model definitions
- Server config for MINIMAX_API_KEY and MINIMAX_URL env vars
- Access store with validation and API key management
- Settings UI with endpoint and API key configuration
- Locale strings for English and Chinese
- Route handler registration
- README documentation for both EN and CN

Note: MiniMax API requires temperature in (0.0, 1.0], handled
automatically by clamping zero values to 0.01.
