feat: upgrade MiniMax default model to M2.7#98

Open
octo-patch wants to merge 1 commit into shareAI-lab:main from octo-patch:feature/upgrade-minimax-m27
Conversation

@octo-patch
Summary

  • Add MiniMax-M2.7 and MiniMax-M2.7-highspeed to the model selection list
  • Set MiniMax-M2.7 as the new default model for both international and China mainland endpoints
  • Retain MiniMax-M2.5 as a previous generation alternative

Changes

Updated .env.example to include the latest MiniMax M2.7 models:

  • MiniMax-M2.7 — Latest flagship model with enhanced reasoning and coding capabilities (new default)
  • MiniMax-M2.7-highspeed — High-speed version of M2.7 for low-latency scenarios
  • MiniMax-M2.5 — Kept as previous generation option
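
The relevant part of `.env.example` might look like the following sketch. The variable names `MODEL_ID` and `BASE_URL` and the commented-out alternatives are assumptions for illustration; only the model names and endpoint hosts come from this PR:

```shell
# --- International endpoint (api.minimax.io) ---
# Hypothetical layout; actual .env.example keys may differ.
BASE_URL=https://api.minimax.io

# MiniMax-M2.7: latest flagship model (new default)
MODEL_ID=MiniMax-M2.7
# MiniMax-M2.7-highspeed: low-latency variant of M2.7
# MODEL_ID=MiniMax-M2.7-highspeed
# MiniMax-M2.5: previous generation alternative
# MODEL_ID=MiniMax-M2.5

# --- China mainland endpoint (api.minimaxi.com) ---
# Same model options; uncomment to use the mainland endpoint.
# BASE_URL=https://api.minimaxi.com
# MODEL_ID=MiniMax-M2.7
```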

Why

MiniMax-M2.7 is the latest flagship model with enhanced reasoning and coding capabilities, superseding M2.5 as the recommended default.

Testing

  • Verified all MODEL_ID references updated correctly
  • Both international (api.minimax.io) and China mainland (api.minimaxi.com) sections updated consistently

- Add MiniMax-M2.7 and MiniMax-M2.7-highspeed to model list
- Set MiniMax-M2.7 as default model for both international and China mainland endpoints
- Keep MiniMax-M2.5 as previous generation alternative
@vercel
vercel bot commented Mar 18, 2026

Someone is attempting to deploy a commit to the crazyboym's projects Team on Vercel.

A member of the Team first needs to authorize it.
