feat: Add MiniMax Anthropic-compatible preset #153
Change description:
- Add a MiniMax / Anthropic-compatible configuration to the AI console provider presets, with default Base URL https://api.minimaxi.com/anthropic and default model MiniMax-M2.7.
- Support initializing the MiniMax configuration via MINIMAX_API_KEY, MINIMAX_BASE_URL, and MINIMAX_MODEL.
- The model-list endpoint recognizes the MiniMax Anthropic endpoint and, per the MiniMax docs, fetches the model list using Bearer Token authentication.
- Update .env.example with MiniMax placeholder entries only; no real keys are included.
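The environment-variable initialization described above could look roughly like the following sketch. The variable names (MINIMAX_API_KEY, MINIMAX_BASE_URL, MINIMAX_MODEL) and defaults come from the PR description; the preset dict shape and the `build_minimax_preset` helper are assumptions for illustration, not the repository's actual code.

```python
import os

# Defaults stated in the PR description.
DEFAULT_MINIMAX_BASE_URL = "https://api.minimaxi.com/anthropic"
DEFAULT_MINIMAX_MODEL = "MiniMax-M2.7"

def build_minimax_preset() -> dict:
    """Build a minimax-anthropic preset, letting env vars override defaults.

    The dict keys here are hypothetical; the real service may store
    profiles differently.
    """
    return {
        "id": "minimax-anthropic",
        "protocol": "anthropic",
        "api_key": os.getenv("MINIMAX_API_KEY", ""),
        "base_url": os.getenv("MINIMAX_BASE_URL", DEFAULT_MINIMAX_BASE_URL),
        "model": os.getenv("MINIMAX_MODEL", DEFAULT_MINIMAX_MODEL),
    }
```

With no MiniMax env vars set, the preset falls back to the documented Base URL and model, so a placeholder-only `.env.example` still yields a usable default configuration.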
📝 Walkthrough
Adds MiniMax as an Anthropic-compatible LLM provider by adding environment variables, registering a new minimax-anthropic preset, and switching auth-header behavior for MiniMax endpoints.
Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~12 minutes
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
interfaces/api/v1/workbench/llm_control.py (1)
Lines 108-133: ⚠️ Potential issue | 🟠 Major
Seed the active profile's protocol and base URL when falling back.
Right now the fallback only copies `api_key`, so requests that rely on the stored MiniMax profile still default to `protocol='openai'` and an empty base URL. That skips `_is_minimax_anthropic_base()` and sends the wrong auth header unless the caller redundantly supplies the endpoint fields.
🔧 Proposed fix
```diff
 if not candidate.get('api_key'):
     # try the key from the currently active profile as a fallback
     active = _service.get_active_profile()
     if active:
-        candidate['api_key'] = active.api_key
+        candidate.update({
+            'api_key': active.api_key,
+            'protocol': active.protocol,
+            'base_url': active.base_url,
+        })
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@interfaces/api/v1/workbench/llm_control.py` around lines 108-133: when falling back to the active profile in the candidate block (use `_service.get_active_profile()` and the candidate dict), also copy the active profile's `protocol` and `base_url` into candidate (not just `api_key`) so `api_format = (candidate.get('protocol')...)` and `base_url = (candidate.get('base_url')...)` reflect the stored profile; this ensures `_is_minimax_anthropic_base(base_url)` sees the MiniMax endpoint and the correct auth header is selected instead of defaulting to openai/empty base URL.
📒 Files selected for processing (3)
- .env.example
- application/ai/llm_control_service.py
- interfaces/api/v1/workbench/llm_control.py
Change description:

Change type
feat (new feature) / fix (bug fix) / refactor (refactoring, no functional change) / perf (performance optimization) / docs (documentation) / chore (build/toolchain changes)

Architecture impact
domain / application / infrastructure / interfaces / frontend / scripts (delete items that do not apply)

Testing
Backend (python -m uvicorn ...) / Frontend (npm run dev)

Risk notes