
feat: add MiniMax as first-class LLM provider#1964

Open
octo-patch wants to merge 1 commit into microsoft:main from octo-patch:add-minimax-provider

Conversation

@octo-patch

Summary

Add MiniMax AI as a first-class LLM provider, using their OpenAI-compatible API at https://api.minimax.io/v1.

  • Register minimax provider in llms.json with model aliases (large -> MiniMax-M2.7, small -> MiniMax-M2.7-highspeed) and pricing for M2.7/M2.5 model families
  • Add MODEL_PROVIDER_MINIMAX constant and MINIMAX_API_BASE URL in constants.ts
  • Add MINIMAX_API_KEY / MINIMAX_API_BASE environment variable parsing in env.ts
  • Add configuration documentation page at docs/configuration/minimax.mdx
  • Add unit tests (model parsing, provider registration, language model resolution)
  • Add integration tests (env token parsing, custom base URL, missing key error)
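The registration described above can be pictured as a provider entry with the documented base URL and aliases. The shape below is an illustrative sketch only: the exact field names and structure of llms.json are not shown in this PR, and only the provider id, base URL, and alias targets are taken from it.

```typescript
// Illustrative shape of a minimax provider entry. Field names
// (name, baseUrl, aliases) are assumptions; the alias mapping and
// endpoint come from the PR description.
const minimaxProviderEntry = {
  name: "minimax",
  baseUrl: "https://api.minimax.io/v1",
  aliases: {
    large: "MiniMax-M2.7",
    small: "MiniMax-M2.7-highspeed",
  },
};

export { minimaxProviderEntry };
```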

How It Works

MiniMax provides an OpenAI-compatible chat completions API, so the existing LocalOpenAICompatibleModel handler works out of the box. Users configure via:

MINIMAX_API_KEY=eyJh...

Then use models like minimax:MiniMax-M2.7 or minimax:MiniMax-M2.5-highspeed.
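Splitting a provider-prefixed identifier like `minimax:MiniMax-M2.7` is straightforward; here is a minimal sketch of such parsing. The `parseModelId` helper and its return shape are hypothetical, not the repository's actual API.

```typescript
// Hypothetical helper showing how a "provider:model" identifier such as
// "minimax:MiniMax-M2.7" could be split. The project's real parsing
// logic may differ.
interface ParsedModelId {
  provider: string;
  model: string;
}

function parseModelId(id: string): ParsedModelId {
  const sep = id.indexOf(":");
  if (sep === -1) {
    // No explicit provider prefix; leave provider selection to defaults.
    return { provider: "", model: id };
  }
  return { provider: id.slice(0, sep), model: id.slice(sep + 1) };
}
```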

Available Models

| Model | Context | Description |
| --- | --- | --- |
| MiniMax-M2.7 | 1M tokens | Latest flagship model |
| MiniMax-M2.7-highspeed | 1M tokens | Faster variant |
| MiniMax-M2.5 | 204K tokens | Previous generation |
| MiniMax-M2.5-highspeed | 204K tokens | Faster variant |

Test Plan

  • Unit tests: model identifier parsing for minimax models
  • Unit tests: constants and base URL validation
  • Unit tests: provider registered in defaultModelConfigurations aliases
  • Unit tests: resolveLanguageModel returns OpenAI-compatible model
  • Integration tests: parseTokenFromEnv with MINIMAX_API_KEY
  • Integration tests: custom MINIMAX_API_BASE override
  • Integration tests: error when MINIMAX_API_KEY is missing
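The env-parsing behavior these tests cover can be sketched as follows. The function name `readMinimaxConfig`, the error message, and the config shape are illustrative assumptions; only the `MINIMAX_API_KEY` / `MINIMAX_API_BASE` variable names and the default base URL come from the PR.

```typescript
// Illustrative sketch of MiniMax env parsing; the real env.ts may differ.
// Falls back to the documented endpoint when MINIMAX_API_BASE is unset.
const MINIMAX_API_BASE_DEFAULT = "https://api.minimax.io/v1";

interface MinimaxConfig {
  apiKey: string;
  baseUrl: string;
}

function readMinimaxConfig(env: Record<string, string | undefined>): MinimaxConfig {
  const apiKey = env.MINIMAX_API_KEY;
  if (!apiKey) {
    // Matches the "missing key" integration-test scenario.
    throw new Error("MINIMAX_API_KEY is not set");
  }
  // A custom MINIMAX_API_BASE overrides the default endpoint.
  return { apiKey, baseUrl: env.MINIMAX_API_BASE ?? MINIMAX_API_BASE_DEFAULT };
}
```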

Add MiniMax AI as a built-in LLM provider with OpenAI-compatible API.

Changes:
- Add minimax provider entry in llms.json with model aliases and pricing
- Add MODEL_PROVIDER_MINIMAX constant and MINIMAX_API_BASE in constants.ts
- Add MINIMAX_API_KEY/MINIMAX_API_BASE env var parsing in env.ts
- Add configuration docs page (docs/configuration/minimax.mdx)
- Add unit tests and integration tests
