
Add native OpenRouter model support#232

Open
lewtun wants to merge 1 commit into
main from
codex/native-openrouter-models

Conversation


@lewtun lewtun commented May 6, 2026

Summary

  • add native openrouter/... model-id handling before the Hugging Face Router fallback
  • keep openai/... reserved for direct OpenAI usage and intentionally do not wire OPENAI_BASE_URL
  • update CLI model-switching help/docs and ensure OpenRouter ids bypass HF Router catalog lookup

Tests

  • uv run --extra dev pytest tests/unit/test_llm_params.py tests/unit/test_cli_rendering.py
  • uv run ruff check .
  • uv run ruff format --check .

Refs #223, #197, #188

Co-authored-by: OpenAI Codex <codex@openai.com>

github-actions Bot commented May 6, 2026

Claude finished @lewtun's task in 3m 27s


Review: PR #232 — Add native OpenRouter model support

No blocking issues — 3 P1. Routing change is small and well-scoped: openrouter/... short-circuits before the HF Router fallback in both _resolve_llm_params and _print_hf_routing_info, with matching tests. No P0 found in the routing/auth/agent-loop chain. The three items below are quality-of-life misses for the new prefix.
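The routing order under review can be sketched as follows. This is an illustrative stand-in, not the repository's code: `resolve_llm_params` and the `huggingface/` fallback string are assumptions made for the sketch, standing in for the actual `_resolve_llm_params`.

```python
# Sketch of the prefix short-circuit described above (names are illustrative).
def resolve_llm_params(model_id: str) -> dict:
    # openrouter/... ids short-circuit before any catalog lookup and are
    # passed through to LiteLLM verbatim.
    if model_id.startswith("openrouter/"):
        return {"model": model_id}
    # openai/... stays reserved for direct OpenAI usage; note that no
    # api_base override is set here.
    if model_id.startswith("openai/"):
        return {"model": model_id}
    # Everything else falls back to the Hugging Face Router catalog.
    return {"model": f"huggingface/{model_id}"}
```

The order matters: because the OpenRouter check runs first, an id like `openrouter/anthropic/claude-opus-4.7` never reaches the HF Router catalog lookup.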


P1 — Auth error message doesn't mention OpenRouter

agent/core/agent_loop.py:518-520 — when LiteLLM raises 401 for a missing/invalid OpenRouter key, the friendly hint enumerates only Anthropic / OpenAI / HF Router:

"  • Anthropic:   export ANTHROPIC_API_KEY=sk-...\n"
"  • OpenAI:      export OPENAI_API_KEY=sk-...\n"
"  • HF Router:   export HF_TOKEN=hf_...\n\n"

A user picking openrouter/anthropic/claude-opus-4.7 with no OPENROUTER_API_KEY set will hit this branch and get a misleading hint. Add a fourth bullet for OpenRouter so the new prefix has discoverable error guidance.

P1 — _get_research_model doesn't step down openrouter/anthropic/...

agent/tools/research_tool.py:222-229 has explicit Sonnet step-downs for anthropic/... and bedrock/.../anthropic, but the new openrouter/anthropic/... ids fall through to "use the same model" — so a session on openrouter/anthropic/claude-opus-4.7 runs every research sub-agent on Opus too. The PR brought a new shape into a function that's pattern-matching prefixes; the step-down should be extended:

if main_model.startswith("openrouter/anthropic/"):
    return "openrouter/anthropic/claude-sonnet-4-6"

Not a correctness break (the fall-through works), but for a frontier-model-via-OpenRouter user this materially changes per-session cost.

P1 — test_openai_base_url_is_not_forwarded doesn't test what its name (and the README) claims

tests/unit/test_llm_params.py:33-38 and README.md:64-65 together promise that openai/... is "intentionally not wired" to OPENAI_BASE_URL, so a user who has OPENAI_BASE_URL=https://openrouter.ai/api/v1 set in their env won't have OpenAI calls silently rerouted. The test only verifies that _resolve_llm_params returns {"model": "openai/gpt-5.5"} — but LiteLLM's OpenAI adapter honors OPENAI_BASE_URL from the environment regardless of what's in the params dict (no api_base override is set in llm_params.py:187-197). The invariant the README promises isn't actually enforced.

Either (a) tighten the test to assert end-to-end routing — e.g. monkeypatch litellm.acompletion and verify the resolved request URL when OPENAI_BASE_URL is set — or (b) soften the README claim to "we don't pass api_base; LiteLLM still honors OPENAI_BASE_URL from your env, set it deliberately." Right now a reader could reasonably expect protection that isn't there.
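A minimal self-contained sketch of option (a). Since litellm may not be importable in every environment, a local stub stands in for `litellm.acompletion` here; in the real suite you would monkeypatch `litellm.acompletion` itself. Every name below (`resolve_llm_params`, `call_model`, `fake_acompletion`) is illustrative, not the repository's API.

```python
import asyncio
import os
import types

# Stand-in for the litellm module so the sketch runs without the dependency.
litellm_stub = types.SimpleNamespace()

async def fake_acompletion(**kwargs):
    # Record exactly what the call site forwarded to the adapter.
    fake_acompletion.captured = kwargs
    return {"choices": []}

litellm_stub.acompletion = fake_acompletion

def resolve_llm_params(model_id: str) -> dict:
    # Mirrors the PR's behavior: no api_base is ever set for openai/... ids.
    return {"model": model_id}

def call_model(model_id: str) -> dict:
    params = resolve_llm_params(model_id)
    asyncio.run(litellm_stub.acompletion(**params))
    return fake_acompletion.captured

def test_openai_base_url_is_not_forwarded():
    os.environ["OPENAI_BASE_URL"] = "https://openrouter.ai/api/v1"
    captured = call_model("openai/gpt-5.5")
    # This proves the params dict is clean -- but only that. The real LiteLLM
    # adapter still reads OPENAI_BASE_URL from the environment, which is why
    # option (b) (softening the README) may be the more honest fix.
    assert "api_base" not in captured
```

Note the caveat in the final comment: asserting on the captured params alone still doesn't close the gap the review describes, because LiteLLM consults the env var downstream of this dict.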


• Branch: codex/native-openrouter-models


husayni commented May 11, 2026

Drive-by — same approach landed in my local fork while testing OpenRouter+Kimi K2.6 yesterday, so I dug into the Claude-bot review and have concrete diffs for the three P1s if it helps move this along.

P1.1 — auth-error hint (agent/core/agent_loop.py:518-520)

         "  • Anthropic:   export ANTHROPIC_API_KEY=sk-...\n"
         "  • OpenAI:      export OPENAI_API_KEY=sk-...\n"
+        "  • OpenRouter:  export OPENROUTER_API_KEY=sk-or-...\n"
         "  • HF Router:   export HF_TOKEN=hf_...\n\n"

P1.2 — research sub-agent step-down (agent/tools/research_tool.py:222-226)

     if main_model.startswith("anthropic/"):
         return "anthropic/claude-sonnet-4-6"
     if main_model.startswith("bedrock/") and "anthropic" in main_model:
         return "bedrock/us.anthropic.claude-sonnet-4-6"
+    if main_model.startswith("openrouter/anthropic/"):
+        return "openrouter/anthropic/claude-sonnet-4-6"
     return main_model

Without this, anyone routing Opus 4.7 through OpenRouter pays Opus rates for every research sub-agent — material cost regression vs. the anthropic/... path.

P1.3 — test_openai_base_url_is_not_forwarded doesn't enforce the README claim
The test only checks the params dict. LiteLLM's OpenAI adapter honors OPENAI_BASE_URL from env regardless of params, so a user with OPENAI_BASE_URL=https://openrouter.ai/api/v1 already set will have their openai/... calls silently re-routed — the invariant the README promises isn't actually there. Two options:

  • (a) Tighten the test: monkeypatch litellm.acompletion and assert the resolved api_base for openai/gpt-5.5 doesn't equal OPENAI_BASE_URL when the env var is set.
  • (b) Soften the README: "we don't pass api_base; LiteLLM still honors OPENAI_BASE_URL from your environment — set it deliberately."

Happy to push these as commits on your branch if you'd rather not juggle them, or open a follow-up PR after this merges. Also noticed the branch is CONFLICTING against main — happy to rebase if useful.

