[Discussion] Add Avian as a cloud LLM inference provider #8877

@avianion

Description

This issue is to discuss whether LocalAI should support cloud-based inference providers like Avian.

See PR #8666 which adds Avian as a cloud LLM provider. The maintainer requested an issue be opened before the PR can proceed.

Use case:

  • Avian is an OpenAI-compatible inference provider offering DeepSeek V3.2, Kimi K2.5, GLM-5, and MiniMax M2.5 models
  • Users may want to configure ZeroClaw (or other tools) to route through Avian using LocalAI as the interface

Question for maintainers:
Would LocalAI consider supporting cloud-based inference backends via OpenAI-compatible endpoints? If so, what is the preferred integration approach?

Related PR: #8666
