
feat: add MiniMax as a new LLM provider#6724

Open
octo-patch wants to merge 2 commits into ChatGPTNextWeb:main from octo-patch:feature/add-minimax-provider

Conversation


octo-patch commented Mar 15, 2026

Summary

Add MiniMax as a new LLM provider with support for the latest M2.7 model.

Changes

  • Add MiniMax provider integration (API route, client platform, settings UI)
  • Include MiniMax-M2.7 and MiniMax-M2.7-highspeed as default models
  • Support older models (M1, M2, M2.5) as alternatives
  • Add i18n strings for Chinese and English

Model List

  • MiniMax-M2.7 (default) — Latest flagship model with enhanced reasoning and coding
  • MiniMax-M2.7-highspeed — High-speed version of M2.7 for low-latency scenarios
  • MiniMax-M2.5 / M2.5-80k
  • MiniMax-M2 / M2-80k
  • MiniMax-M1 / M1-80k
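
The model list above might be declared roughly like this in the provider's constants. This is an illustrative sketch only; the interface and constant names are hypothetical, not the PR's actual identifiers:

```typescript
// Hypothetical model table mirroring the PR's model list; the
// MinimaxModel shape and MINIMAX_MODELS name are illustrative.
interface MinimaxModel {
  name: string;
  isDefault: boolean;
}

const MINIMAX_MODELS: MinimaxModel[] = [
  { name: "MiniMax-M2.7", isDefault: true },
  { name: "MiniMax-M2.7-highspeed", isDefault: false },
  { name: "MiniMax-M2.5", isDefault: false },
  { name: "MiniMax-M2.5-80k", isDefault: false },
  { name: "MiniMax-M2", isDefault: false },
  { name: "MiniMax-M2-80k", isDefault: false },
  { name: "MiniMax-M1", isDefault: false },
  { name: "MiniMax-M1-80k", isDefault: false },
];

// Per the PR, the default model is the first entry in the list.
const defaultModel = MINIMAX_MODELS.find((m) => m.isDefault)?.name;
```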

Testing

  • Verified MiniMax API connectivity with M2.7 model
  • Temperature clamping works correctly (MiniMax requires temperature > 0)

Add MiniMax (https://api.minimax.io) as a new LLM provider with support
for MiniMax-M1, M2, and M2.5 series models. MiniMax provides an
OpenAI-compatible API with up to 204K context window.
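
Since the API is OpenAI-compatible, a request to it can be sketched as below. The `/v1/chat/completions` path and payload shape are assumed from the OpenAI convention, not confirmed by this PR; verify against MiniMax's own API docs:

```typescript
// Sketch of building a request for MiniMax's OpenAI-compatible chat
// endpoint. The path and payload follow the OpenAI convention; the
// function name and structure are illustrative, not the PR's code.
function buildMinimaxRequest(apiKey: string, prompt: string) {
  return {
    url: "https://api.minimax.io/v1/chat/completions",
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`,
      },
      body: JSON.stringify({
        model: "MiniMax-M2.7",
        messages: [{ role: "user", content: prompt }],
      }),
    },
  };
}

// Usage: const { url, init } = buildMinimaxRequest(key, "Hello");
//        const res = await fetch(url, init);
```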

Changes include:
- New client platform implementation (app/client/platforms/minimax.ts)
- New server-side API handler (app/api/minimax.ts)
- Provider constants, enums, and model definitions
- Server config for MINIMAX_API_KEY and MINIMAX_URL env vars
- Access store with validation and API key management
- Settings UI with endpoint and API key configuration
- Locale strings for English and Chinese
- Route handler registration
- README documentation for both EN and CN

Note: MiniMax API requires temperature in (0.0, 1.0], handled
automatically by clamping zero values to 0.01.

- Add MiniMax-M2.7 and MiniMax-M2.7-highspeed to model list
- Set MiniMax-M2.7 as default model (first in list)
- Keep all previous models as alternatives
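
The temperature handling described in the note could look like this. The PR only states that zero values are clamped to 0.01; the upper cap at 1.0 here is an extra guard implied by the (0.0, 1.0] interval, and the function name is hypothetical:

```typescript
// Sketch of the clamping the PR describes: MiniMax accepts temperature
// only in (0.0, 1.0]. Zero (or negative) values are lifted to 0.01;
// capping at 1.0 is an assumption from the stated interval, not
// something the PR explicitly claims to do.
function clampMinimaxTemperature(temperature: number): number {
  if (temperature <= 0) return 0.01;
  return Math.min(temperature, 1.0);
}
```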