Batteries-included LLM client that bundles omnillm-core with all thick providers.
Install:

```sh
go get github.com/plexusone/omnillm
```

```go
import "github.com/plexusone/omnillm"

client, _ := omnillm.NewClient(omnillm.ClientConfig{
	Provider: omnillm.ProviderNameOpenAI,
	APIKey:   os.Getenv("OPENAI_API_KEY"),
})
```

Lightweight implementations using the stdlib `net/http`:
| Provider | Streaming | Tools | JSON Mode |
|---|---|---|---|
| OpenAI | Yes | Yes | Yes |
| Anthropic | Yes | Yes | No |
| Gemini | Yes | No | No |
| X.AI (Grok) | Yes | Yes | Yes |
| GLM (Zhipu) | Yes | Yes | No |
| Kimi (Moonshot) | Yes | No | No |
| Qwen (Alibaba) | Yes | Yes | No |
| Ollama | Yes | Yes | Yes |
Full-featured implementations using official vendor SDKs:
| Provider | Module | Streaming | Tools | JSON Mode |
|---|---|---|---|---|
| OpenAI | omni-openai | Yes | Yes | Yes |
| Anthropic | omnillm-anthropic | Yes | Yes | No |
| Gemini | omnillm-gemini | Yes | No | No |
| Bedrock | omnillm-bedrock | Yes | Yes | No |
Thick providers automatically override thin providers when imported.
| Aspect | Thin (omnillm-core) | Thick (omnillm-*) |
|---|---|---|
| Dependencies | Minimal (stdlib) | Official SDK |
| API Coverage | Core features | Full coverage |
| Retries | Manual | SDK-managed |
| Auth | API key | SDK-managed |
Import only what you need:

```go
import (
	omnillm "github.com/plexusone/omnillm-core"
	_ "github.com/plexusone/omni-openai/omnillm" // only the OpenAI thick provider
)
```

- omnillm-core - Core interfaces and thin providers
- API Reference - GoDoc
License: MIT