Add native OpenRouter model support #232
Co-authored-by: OpenAI Codex <codex@openai.com>
Claude finished @lewtun's task in 3m 27s

Review: PR #232 — Add native OpenRouter model support

No blocking issues — 3 P1. The routing change is small and well-scoped.

P1 — Auth error message doesn't mention OpenRouter: a user picking …

P1 — …
Drive-by — the same approach landed in my local fork while testing OpenRouter + Kimi K2.6 yesterday, so I dug into the Claude-bot review and have concrete diffs for the three P1s if it helps move this along.

P1.1 — auth-error hint (…):

```diff
   " • Anthropic: export ANTHROPIC_API_KEY=sk-...\n"
   " • OpenAI: export OPENAI_API_KEY=sk-...\n"
+  " • OpenRouter: export OPENROUTER_API_KEY=sk-or-...\n"
   " • HF Router: export HF_TOKEN=hf_...\n\n"
```

P1.2 — research sub-agent step-down (…):

```diff
   if main_model.startswith("anthropic/"):
       return "anthropic/claude-sonnet-4-6"
   if main_model.startswith("bedrock/") and "anthropic" in main_model:
       return "bedrock/us.anthropic.claude-sonnet-4-6"
+  if main_model.startswith("openrouter/anthropic/"):
+      return "openrouter/anthropic/claude-sonnet-4-6"
   return main_model
```

Without this, anyone routing Opus 4.7 through OpenRouter pays Opus rates for every research sub-agent — a material cost regression vs. the …

P1.3 — …
Happy to push these as commits on your branch if you'd rather not juggle them, or open a follow-up PR after this merges. Also noticed the branch is …
Summary
- `openrouter/...` model-id handling before the Hugging Face Router fallback
- `openai/...` reserved for direct OpenAI usage and intentionally do not wire `OPENAI_BASE_URL`

Tests

- `uv run --extra dev pytest tests/unit/test_llm_params.py tests/unit/test_cli_rendering.py`
- `uv run ruff check .`
- `uv run ruff format --check .`

Refs #223, #197, #188
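The routing order in the summary (handle `openrouter/...` before the Hugging Face Router fallback) can be illustrated as a prefix dispatch. A minimal sketch, assuming prefix-based provider resolution; `resolve_provider` and the returned labels are hypothetical, not this repo's actual API:

```python
def resolve_provider(model_id: str) -> str:
    """Map a model ID to a provider label by prefix."""
    # Order matters: openrouter/ must be checked before the
    # catch-all Hugging Face Router fallback.
    if model_id.startswith("openrouter/"):
        return "openrouter"
    if model_id.startswith("openai/"):
        # Reserved for direct OpenAI usage; OPENAI_BASE_URL is
        # intentionally not wired through this path.
        return "openai"
    if model_id.startswith("anthropic/"):
        return "anthropic"
    # Anything else falls through to the Hugging Face Router.
    return "hf-router"
```

Putting the `openrouter/` check first is what keeps IDs like `openrouter/anthropic/claude-sonnet-4-6` from being misrouted to the fallback.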