πŸ€– Agent Core

Website | Docs | Feedback | Discord | X

Agent Core is a reusable Rust library for building AI agents in the Omni ecosystem. It provides provider-agnostic LLM integration, conversation management, a permission system, and plan file management.

Features

  • LLM Provider Abstraction β€” BYOM (Bring Your Own Model) trait with streaming completions, supporting Anthropic, OpenAI, and a unified multi-provider backend
  • Conversation Management β€” Multi-turn conversation state with serialization and persistence
  • Permission System β€” Configurable ask/allow/deny presets, session caching, and TUI dialog integration for agent tool use
  • Plan Management β€” Plan file storage for plan mode with project-local and global directories
  • Core Types β€” Messages, roles, content blocks, tool definitions, streaming events, and usage tracking

Installation

Add to your Cargo.toml:

```toml
[dependencies]
agent-core = { git = "https://github.com/omnidotdev/agent-core" }
```

Usage

LLM Provider

Implement the LlmProvider trait to add support for any LLM backend:

```rust
use agent_core::provider::{CompletionRequest, CompletionStream, LlmProvider};

struct MyProvider;

#[async_trait::async_trait]
impl LlmProvider for MyProvider {
    fn name(&self) -> &'static str {
        "my-provider"
    }

    async fn stream(
        &self,
        request: CompletionRequest,
    ) -> agent_core::error::Result<CompletionStream> {
        // Stream completion events from your LLM
        todo!()
    }
}
```

Built-in Providers

```rust
use agent_core::providers::{AnthropicProvider, OpenAiProvider, UnifiedProvider};
```

Conversation

```rust
use std::path::Path;

use agent_core::conversation::Conversation;

let mut conv = Conversation::with_system("You are a helpful assistant.");
conv.add_user_message("Hello!");
conv.add_assistant_message("Hi there!");

// Persist to disk
conv.save(Path::new("conversation.json"))?;
```

Permissions

```rust
use agent_core::permission::{PermissionActor, PermissionClient};

let (actor, tx) = PermissionActor::new();
let client = PermissionClient::new("session-id".into(), tx);

// Actor runs in background, client is used by tools to request permissions
tokio::spawn(actor.run());
```

Development

```sh
cargo build   # Build
cargo test    # Run tests
cargo clippy  # Lint
```

License

The code in this repository is licensed under MIT, Β© Omni LLC. See LICENSE.md for more information.
