# OmniLLM


Batteries-included LLM client that bundles omnillm-core with all thick providers.

## Installation

```sh
go get github.com/plexusone/omnillm
```

## Quick Start

```go
import (
    "log"
    "os"

    "github.com/plexusone/omnillm"
)

client, err := omnillm.NewClient(omnillm.ClientConfig{
    Provider: omnillm.ProviderNameOpenAI,
    APIKey:   os.Getenv("OPENAI_API_KEY"),
})
if err != nil {
    log.Fatal(err)
}
```

## Provider Support

### Thin Providers (omnillm-core)

Lightweight implementations using stdlib `net/http`:

| Provider        | Streaming | Tools | JSON Mode |
|-----------------|-----------|-------|-----------|
| OpenAI          | Yes       | Yes   | Yes       |
| Anthropic       | Yes       | Yes   | No        |
| Gemini          | Yes       | No    | No        |
| X.AI (Grok)     | Yes       | Yes   | Yes       |
| GLM (Zhipu)     | Yes       | Yes   | No        |
| Kimi (Moonshot) | Yes       | No    | No        |
| Qwen (Alibaba)  | Yes       | Yes   | No        |
| Ollama          | Yes       | Yes   | Yes       |

### Thick Providers (Official SDKs)

Full-featured implementations using official vendor SDKs:

| Provider  | Module            | Streaming | Tools | JSON Mode |
|-----------|-------------------|-----------|-------|-----------|
| OpenAI    | omni-openai       | Yes       | Yes   | Yes       |
| Anthropic | omnillm-anthropic | Yes       | Yes   | No        |
| Gemini    | omnillm-gemini    | Yes       | No    | No        |
| Bedrock   | omnillm-bedrock   | Yes       | Yes   | No        |

Thick providers automatically override thin providers when imported.

## Thin vs Thick

| Aspect       | Thin (omnillm-core) | Thick (omnillm-*) |
|--------------|---------------------|-------------------|
| Dependencies | Minimal (stdlib)    | Official SDK      |
| API Coverage | Core features       | Full coverage     |
| Retries      | Manual              | SDK-managed       |
| Auth         | API key             | SDK-managed       |

## Selective Import

Import only what you need:

```go
import (
    omnillm "github.com/plexusone/omnillm-core"
    _ "github.com/plexusone/omni-openai/omnillm" // Only OpenAI thick provider
)
```

## Documentation

## License

MIT
