feat: add MiniMax as first-class LLM provider (#619)
Open
octo-patch wants to merge 1 commit into ValueCell-ai:main from
Add MiniMax AI as a dedicated model provider with full integration across the backend, frontend, and configuration system.

Backend changes:
- `MiniMaxProvider` class in `factory.py` using agno's `OpenAILike`, with temperature clamping to the (0.0, 1.0] range
- Provider registered in the `ModelFactory._providers` registry
- MiniMax YAML config with M2.7, M2.7-highspeed, M2.5, and M2.5-highspeed models (all 204K context)
- `config.yaml` registration with the `MINIMAX_API_KEY` env var
- JSON mode detection for the minimax.io base_url in `model.py`
- API key URL mapping in the models router

Frontend changes:
- MiniMax provider icon in `model-providers/`
- Icon registration in `icons.ts` constants
- i18n strings for en, zh_CN, zh_TW, and ja locales

Documentation:
- README (EN/ZH/ZH_Hant/JA) provider list updated
- `CONFIGURATION_GUIDE.md` provider table updated

Tests: 34 tests (28 unit + 6 integration), all passing
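The JSON-mode detection mentioned above can be sketched roughly as follows. This is an illustrative assumption about how a base-URL check might look, not the actual `model.py` code; the function name is hypothetical:

```python
# Illustrative sketch only -- the real check lives in model.py and may differ.
def supports_native_json_mode(base_url: str) -> bool:
    """Guess whether an endpoint accepts OpenAI-style JSON mode.

    MiniMax's OpenAI-compatible API is served from minimax.io, so a simple
    substring check on the configured base URL is enough to flip the flag.
    """
    return "minimax.io" in base_url.lower()
```

A check like this keeps provider detection independent of the provider class itself, so any model configured against a minimax.io endpoint picks up JSON mode automatically.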
Summary
Add MiniMax as a first-class LLM provider in ValueCell, supporting MiniMax-M2.7, M2.7-highspeed, M2.5, and M2.5-highspeed models (all with 204K context length) via the OpenAI-compatible API.
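Since MiniMax is reached through its OpenAI-compatible API, a request can be sketched with the standard chat-completions payload shape. The endpoint path and auth header below are assumptions based on the minimax.io base URL mentioned in this PR; only the model IDs come from the PR itself:

```python
# Sketch of an OpenAI-compatible chat request to MiniMax.
# URL path and header layout are assumptions, not confirmed by this PR.
def build_chat_request(model: str, prompt: str, temperature: float = 0.7) -> dict:
    """Build a standard OpenAI-style chat-completions request payload."""
    return {
        "url": "https://api.minimax.io/v1/chat/completions",  # assumed path
        "headers": {"Authorization": "Bearer $MINIMAX_API_KEY"},
        "body": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            "temperature": temperature,
        },
    }
```

Because the payload is the standard OpenAI shape, existing OpenAI-compatible client code (such as agno's `OpenAILike`) can target MiniMax by swapping the base URL and key.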
Changes
Backend (6 files)
- `MiniMaxProvider` class in `factory.py` — extends `ModelProvider`, uses agno's `OpenAILike` with temperature clamping to the (0.0, 1.0] range
- Provider registered in the `ModelFactory._providers` registry
- MiniMax YAML config (`minimax.yaml`) with 4 models and connection settings
- `config.yaml` registration with the `MINIMAX_API_KEY` env var
- JSON mode detection for the minimax.io base URL in `model.py`
- API key URL mapping in the models router

Frontend (5 files)
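The temperature clamp described above could look like the following minimal sketch. The epsilon value and function name are assumptions, not the `factory.py` implementation; only the (0.0, 1.0] range comes from this PR:

```python
# Sketch of the temperature clamp; epsilon and naming are assumptions.
_MIN_TEMPERATURE = 1e-6  # assumed stand-in for "just above 0.0"

def clamp_minimax_temperature(temperature: float) -> float:
    """Clamp a requested temperature into MiniMax's half-open (0.0, 1.0] range.

    Values at or below zero are bumped to a small positive epsilon (the API
    excludes 0.0); values above 1.0 are capped at 1.0.
    """
    if temperature <= 0.0:
        return _MIN_TEMPERATURE
    return min(temperature, 1.0)
```

Clamping inside the provider class means callers can keep passing OpenAI-style temperatures (e.g. 0.0 or 2.0) without triggering MiniMax API validation errors.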
- MiniMax provider icon in `model-providers/`
- Icon registration in `icons.ts` constants
- i18n strings for en, zh_CN, zh_TW, and ja locales

Documentation (5 files)
- README (EN/ZH/ZH_Hant/JA) provider list updated
- `CONFIGURATION_GUIDE.md` provider table updated

Tests (2 files, 34 tests)

- 34 tests (28 unit + 6 integration), all passing
Configuration
Models
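A plausible shape for the `minimax.yaml` model config is sketched below. The field names are guesses at a typical provider-config layout; only the four model IDs, the 204K context figure, the `MINIMAX_API_KEY` variable, and the minimax.io base URL come from this PR:

```yaml
# Hypothetical sketch of minimax.yaml -- field names are assumptions.
provider: minimax
api_key_env: MINIMAX_API_KEY
base_url: https://api.minimax.io/v1   # assumed endpoint on minimax.io
models:
  - id: MiniMax-M2.7
    context_length: 204000   # "204K" per the PR description
  - id: MiniMax-M2.7-highspeed
    context_length: 204000
  - id: MiniMax-M2.5
    context_length: 204000
  - id: MiniMax-M2.5-highspeed
    context_length: 204000
```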
Test Plan
- Unit tests (`pytest valuecell/adapters/models/tests/test_minimax_provider.py`)
- Integration tests: set `MINIMAX_API_KEY` and verify model creation

20 files changed, 1195 insertions(+), 9 deletions(-)