feat: add MiniMax as an alternative LLM provider (M2.7)#11146

Open
octo-patch wants to merge 2 commits into stackblitz:main from octo-patch:feature/add-minimax-provider
Conversation


octo-patch commented Mar 17, 2026

Summary

  • Add MiniMax as an alternative LLM provider via @ai-sdk/openai (OpenAI-compatible API)
  • Default model: MiniMax-M2.7 (latest flagship with enhanced reasoning and coding)
  • Also supports MiniMax-M2.7-highspeed for low-latency scenarios
  • Configurable via DEFAULT_LLM_PROVIDER and MINIMAX_API_KEY environment variables
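To illustrate the configuration surface described above, a minimal `.env` setup selecting MiniMax might look like this (variable names are from the PR description; the key value is a placeholder):

```shell
# .env — sketch of the provider configuration added in this PR
DEFAULT_LLM_PROVIDER=minimax
MINIMAX_API_KEY=your-minimax-api-key
```

When `DEFAULT_LLM_PROVIDER` is unset, the existing default provider presumably remains active, so the change is opt-in.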

Changes

  • Add MiniMax provider factory in model.ts using @ai-sdk/openai with custom baseURL
  • Add MINIMAX_API_KEY and DEFAULT_LLM_PROVIDER support in api-key.ts
  • Add Env type declaration for MINIMAX_API_KEY and DEFAULT_LLM_PROVIDER
  • Update CONTRIBUTING.md with MiniMax provider documentation
  • Add comprehensive unit tests (22 tests) and E2E integration tests (3 tests)
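The provider factory in `model.ts` could be sketched as below. This is a simplified illustration, not the PR's actual code: the type names, the `resolveProvider` helper, and the Anthropic default model string are assumptions; the real implementation wires the resolved config into `@ai-sdk/openai` via its custom `baseURL` option.

```typescript
// Sketch of a provider factory keyed on DEFAULT_LLM_PROVIDER.
// All identifiers here are illustrative, not the PR's actual names.
type ProviderName = "anthropic" | "minimax";

interface ProviderConfig {
  baseURL: string;
  apiKeyEnvVar: string;
  defaultModel: string;
}

const PROVIDERS: Record<ProviderName, ProviderConfig> = {
  anthropic: {
    baseURL: "https://api.anthropic.com/v1",
    apiKeyEnvVar: "ANTHROPIC_API_KEY",
    defaultModel: "claude-3-5-sonnet-latest", // assumed placeholder
  },
  minimax: {
    // OpenAI-compatible endpoint from the PR description
    baseURL: "https://api.minimax.io/v1",
    apiKeyEnvVar: "MINIMAX_API_KEY",
    defaultModel: "MiniMax-M2.7",
  },
};

// Resolve the active provider from the environment, defaulting to Anthropic.
function resolveProvider(
  env: Record<string, string | undefined>,
): ProviderConfig {
  const name = (env.DEFAULT_LLM_PROVIDER ?? "anthropic") as ProviderName;
  const config = PROVIDERS[name];
  if (!config) {
    throw new Error(`Unknown LLM provider: ${name}`);
  }
  return config;
}
```

Keeping the per-provider details in a lookup table means adding another OpenAI-compatible provider later is a one-entry change rather than a new code path.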

Why

MiniMax-M2.7 is the latest flagship model with enhanced reasoning and coding capabilities, available via an OpenAI-compatible API at https://api.minimax.io/v1.

Testing

  • 22 unit tests passing (model.spec.ts + api-key.spec.ts)
  • 3 E2E integration tests passing (basic chat, streaming, highspeed model)

- Add MiniMax-M2.5 and MiniMax-M2.5-highspeed model support via @ai-sdk/openai
- Introduce provider selection via DEFAULT_LLM_PROVIDER env var
- Add MINIMAX_API_KEY environment variable support
- Refactor model.ts to support multiple providers (factory pattern)
- Update stream-text.ts to conditionally apply Anthropic-specific headers
- Add unit tests for model and api-key modules (22 tests)
- Add integration tests for MiniMax API (3 E2E tests)
- Update CONTRIBUTING.md with MiniMax setup instructions
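The `stream-text.ts` change above (conditionally applying Anthropic-specific headers) could look roughly like the sketch below. The function name and the specific beta header are assumptions for illustration; the point is that provider-specific headers are gated on the active provider rather than sent unconditionally.

```typescript
// Sketch: only attach Anthropic-specific request headers when Anthropic
// is the active provider. Header name/value here are illustrative.
function providerHeaders(provider: string): Record<string, string> {
  if (provider === "anthropic") {
    // Example of an Anthropic beta header; the real set lives in stream-text.ts.
    return { "anthropic-beta": "max-tokens-3-5-sonnet-2024-07-15" };
  }
  // OpenAI-compatible providers such as MiniMax get no extra headers.
  return {};
}
```

Without this guard, Anthropic-only headers would be sent to the MiniMax endpoint, which at best are ignored and at worst rejected.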
- Update default model from MiniMax-M2.5 to MiniMax-M2.7
- Update E2E tests to use M2.7 and M2.7-highspeed models
- Update CONTRIBUTING.md documentation
- Keep all previous models as alternatives
octo-patch changed the title from "feat: add MiniMax as an alternative LLM provider" to "feat: add MiniMax as an alternative LLM provider (M2.7)" on Mar 18, 2026