feat(provider): add LiteLLM provider support #22008
tobias-weiss-ai-xr wants to merge 1 commit into anomalyco:dev from
Conversation
Adds a litellm entry to the custom() provider map. It uses the bundled @ai-sdk/openai-compatible SDK with a user-configured baseURL, supports the LITELLM_API_KEY env var as well as apiKey in provider options, and autoloads when a baseURL is configured in opencode.json under provider.litellm.
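Based on that description, the relevant opencode.json fragment would look roughly like this (a sketch: the exact option names under provider.litellm are inferred from the PR body, not confirmed against the diff):

```json
{
  "provider": {
    "litellm": {
      "options": {
        "baseURL": "http://localhost:4000",
        "apiKey": "sk-litellm-example"
      }
    }
  }
}
```

If apiKey is omitted here, the PR states the LITELLM_API_KEY environment variable is used instead.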
This PR doesn't fully meet our contributing guidelines and PR template. What needs to be fixed:
Please edit this PR description to address the above within 2 hours, or it will be automatically closed. If you believe this was flagged incorrectly, please let a maintainer know.
The following comment was made by an LLM; it may be inaccurate: Based on my search, I found two potentially related PRs that may be duplicates or should be reviewed together:
These older PRs (#13896, #14468) suggest that LiteLLM provider support may have already been attempted or implemented. You should verify whether PR #22008 is:
This pull request has been automatically closed because it was not updated to meet our contributing guidelines within the 2-hour window. Feel free to open a new pull request that follows our guidelines.
Summary
- Add litellm as a built-in provider option using @ai-sdk/openai-compatible
- Set baseURL to the LiteLLM server address (default http://localhost:4000)

Test plan
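The autoload and key-resolution behaviour described in the PR body can be sketched in isolation as follows (a hypothetical sketch, not opencode's actual provider-map code; resolveLitellmConfig and its types are invented for illustration):

```typescript
// Provider options as they would appear under provider.litellm in opencode.json.
interface LitellmOptions {
  baseURL?: string;
  apiKey?: string;
}

interface ResolvedConfig {
  baseURL: string;
  apiKey?: string;
}

// Returns a config only when baseURL is set (the "autoload" condition from the
// PR body). The API key is taken from provider options first, then from the
// LITELLM_API_KEY environment variable.
function resolveLitellmConfig(
  options: LitellmOptions,
  env: Record<string, string | undefined>,
): ResolvedConfig | undefined {
  if (!options.baseURL) return undefined; // no baseURL -> provider stays disabled
  return {
    baseURL: options.baseURL,
    apiKey: options.apiKey ?? env["LITELLM_API_KEY"],
  };
}
```

The resolved baseURL and apiKey would then be handed to the bundled @ai-sdk/openai-compatible SDK, since a LiteLLM proxy exposes an OpenAI-compatible API.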