This issue is for discussing whether LocalAI should support cloud-based inference providers such as Avian.
See PR #8666, which adds Avian as a cloud LLM provider; the maintainer requested that an issue be opened before that PR proceeds.
Use case:
- Avian is an OpenAI-compatible inference provider offering DeepSeek V3.2, Kimi K2.5, GLM-5, and MiniMax M2.5 models
- Users may want to configure ZeroClaw (or other tools) to route through Avian using LocalAI as the interface (see the sketch after this list)
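
To make the use case concrete, here is a minimal client-side sketch of what such a setup could look like if LocalAI exposed Avian-backed models through its existing OpenAI-compatible API. The base URL, port, and model alias are placeholders chosen for illustration, not values from the PR:

```go
// Minimal sketch: a chat completion request against a LocalAI
// instance's OpenAI-compatible endpoint. Endpoint and model name
// are hypothetical.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	body, err := json.Marshal(map[string]any{
		"model": "deepseek-v3", // hypothetical model alias configured in LocalAI
		"messages": []map[string]string{
			{"role": "user", "content": "Hello"},
		},
	})
	if err != nil {
		panic(err)
	}

	req, err := http.NewRequest(http.MethodPost,
		"http://localhost:8080/v1/chat/completions", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	req.Header.Set("Content-Type", "application/json")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var out map[string]any
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		panic(err)
	}
	fmt.Println(out)
}
```

The point is that the client contract stays the same whether the model runs locally or is proxied to a cloud provider.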
Question for maintainers:
Would LocalAI consider supporting cloud-based inference backends via OpenAI-compatible endpoints? If so, what is the preferred integration approach?
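
For reference, one conceivable integration approach is a thin passthrough that forwards OpenAI-compatible requests to the upstream provider while swapping in the provider's API key. The sketch below only illustrates that idea in Go; the upstream URL and the AVIAN_API_KEY variable are assumptions, and this is not a description of how LocalAI's backends are actually wired:

```go
// Sketch of a passthrough for an OpenAI-compatible cloud provider.
// Assumptions: upstream base URL and AVIAN_API_KEY are hypothetical.
package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
	"os"
)

func main() {
	// Any OpenAI-compatible base URL would work here.
	upstream, err := url.Parse("https://api.avian.io")
	if err != nil {
		log.Fatal(err)
	}

	proxy := httputil.NewSingleHostReverseProxy(upstream)
	director := proxy.Director
	proxy.Director = func(r *http.Request) {
		director(r)
		r.Host = upstream.Host
		// Replace the incoming credential with the provider's key.
		r.Header.Set("Authorization", "Bearer "+os.Getenv("AVIAN_API_KEY"))
	}

	// Forward the OpenAI-compatible surface, e.g. /v1/chat/completions.
	http.Handle("/v1/", proxy)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

A passthrough like this keeps the client contract unchanged, so tools already pointed at LocalAI's /v1 endpoints would work against the cloud provider without modification.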
Related PR: #8666