AIChat is an all-in-one LLM platform featuring a powerful CLI tool with Shell Assistant, CMD & REPL modes, RAG, AI Tools & Agents, and a comprehensive web interface with Matrix theming, API marketplace, document processing, and cloud integrations.
**AIChat Web Platform Transformation** - AIChat is evolving from a CLI-only tool into a full-featured web platform while keeping every CLI capability. The transformation includes:
- **Matrix-Themed Web Interface** - Immersive cyberpunk UI with animated rain effects
- **User Authentication & Sessions** - JWT-based auth with persistent sessions
- **API Key Marketplace** - Buy/sell API access with dynamic pricing & intelligent routing
- **Integrated Billing System** - Stripe payments, balance tracking, and automated payouts
- **Web Terminal Emulator** - Full CLI experience in the browser via xterm.js
- **Multi-Provider Chat** - Side-by-side model comparison with real-time cost tracking
- **Cloud Document Processing** - Google Drive & OneDrive integration with RAG
- **Web Search Integration** - DuckDuckGo & Google Search API with citations
- **Analytics Dashboard** - Usage metrics, cost analysis, and PDF reports
Current Status: Foundation complete (13%) - Authentication system and Matrix theme operational. View Epic Progress →
- 20+ LLM Providers - OpenAI, Claude, Gemini, Ollama, and more in one tool
- Shell Assistant - Natural language to shell commands
- Interactive REPL - Tab completion, history search, multi-line input
- RAG Integration - Chat with your documents and codebase
- AI Agents - Build custom GPT-like agents with tools and knowledge
- Function Calling - Connect LLMs to external APIs and services
- Session Management - Persistent conversations with context
**API Marketplace** - Monetize your API keys or find cheaper alternatives
- Dynamic pricing with real-time cost calculations
- Intelligent routing with automatic failover
- Capacity tracking and usage limits
- Revenue sharing with automated payouts
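Conceptually, intelligent routing picks the cheapest healthy listing that still has enough capacity, failing over when the first choice is down or exhausted. A simplified sketch (names and data shapes are hypothetical; the production logic lives in the Rust server):

```python
from dataclasses import dataclass

@dataclass
class Listing:
    provider: str
    price_per_1k_tokens: float   # dynamic price set by the seller
    remaining_capacity: int      # tokens left on this listing
    healthy: bool = True         # toggled by periodic health checks

def route(listings, tokens_needed):
    """Pick the cheapest healthy listing with enough remaining capacity."""
    candidates = [l for l in listings
                  if l.healthy and l.remaining_capacity >= tokens_needed]
    if not candidates:
        raise RuntimeError("no listing can serve this request")
    return min(candidates, key=lambda l: l.price_per_1k_tokens)

listings = [
    Listing("openai", 0.60, 50_000),
    Listing("claude", 0.45, 10_000, healthy=False),  # failed health check
    Listing("deepseek", 0.14, 5_000),                # cheapest, low capacity
]
# deepseek is cheapest but lacks capacity; claude is unhealthy -> openai
print(route(listings, 8_000).provider)  # prints "openai"
```

For small requests that fit the cheapest listing, routing falls back to pure price ordering.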
**Multi-Provider Arena** - Compare LLMs side-by-side
- Simultaneous queries to multiple providers
- Real-time cost comparison per response
- Response quality voting and analytics
- Export conversation comparisons
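Per-response cost comparison reduces to token counts times each provider's per-token rates. A sketch with placeholder prices (the figures below are illustrative only; real prices come from the providers and change over time):

```python
# (input, output) USD per 1K tokens -- placeholder values, not real pricing
PRICES = {
    "openai:gpt-4o": (0.0025, 0.010),
    "claude:claude-3-5-sonnet": (0.003, 0.015),
}

def response_cost(model, prompt_tokens, completion_tokens):
    """Cost of one response given its token usage."""
    p_in, p_out = PRICES[model]
    return prompt_tokens / 1000 * p_in + completion_tokens / 1000 * p_out

for model in PRICES:
    print(model, round(response_cost(model, 1200, 400), 5))
```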
**Cloud Document Hub** - Unified document processing
- Google Drive and OneDrive integration
- Automatic RAG processing for uploaded files
- Citation-aware responses with source links
- Collaborative document discussions
**Analytics & Insights** - Track everything
- Usage metrics across all providers
- Cost analysis and spending trends
- Performance monitoring dashboards
- PDF report generation and scheduling
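The usage-metrics side of the dashboard boils down to grouping per-request records by provider. A minimal sketch with made-up records (the real data comes from the platform's database):

```python
from collections import defaultdict

# Hypothetical usage records; in the platform these come from PostgreSQL.
records = [
    {"provider": "openai",  "tokens": 1200, "cost": 0.007},
    {"provider": "claude",  "tokens": 900,  "cost": 0.0096},
    {"provider": "openai",  "tokens": 300,  "cost": 0.002},
]

totals = defaultdict(lambda: {"tokens": 0, "cost": 0.0})
for r in records:
    t = totals[r["provider"]]
    t["tokens"] += r["tokens"]
    t["cost"] += r["cost"]

for provider, t in sorted(totals.items()):
    print(provider, t["tokens"], round(t["cost"], 4))
```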
**Matrix-Themed Experience** - Cyberpunk aesthetics
- Animated code rain effects
- Terminal-style interface design
- Green phosphor text styling
- Retro-futuristic UI elements
- Rust Developers: `cargo install aichat`
- Homebrew/Linuxbrew Users: `brew install aichat`
- Pacman Users: `pacman -S aichat`
- Windows Scoop Users: `scoop install aichat`
- Android Termux Users: `pkg install aichat`

Pre-built binaries for macOS, Linux, and Windows are available from GitHub Releases; extract the archive and add the `aichat` binary to your `$PATH`.
```sh
# Set your API key (OpenAI example)
export OPENAI_API_KEY="sk-..."

# Ask a question
aichat "Explain Rust ownership"

# Start interactive REPL
aichat

# Use shell assistant
aichat --execute "find all rust files modified today"

# Chat with documents (RAG)
aichat --file docs/ "Summarize the architecture"

# Use a specific model
aichat --model claude:claude-3-5-sonnet "Write a poem"
```

```sh
# Start the web server
aichat --serve

# Or on a specific port
aichat --serve 3000

# Access the platform
# 1. Open http://localhost:8000 in your browser
# 2. Create an account or log in
# 3. Explore the Matrix-themed interface
# 4. Try the marketplace, multi-provider chat, or terminal
```

```sh
# Configure your LLM providers
aichat --list-models          # View available models
aichat --model openai:gpt-4   # Set default model

# Create a custom role
aichat --role .edit           # Opens editor for new role

# Set up an AI agent
aichat --agent my-agent .edit # Configure agent with tools & docs
```

Integrate seamlessly with over 20 leading LLM providers through a unified interface. Supported providers include OpenAI, Claude, Gemini (Google AI Studio), Ollama, Groq, Azure-OpenAI, VertexAI, Bedrock, GitHub Models, Mistral, Deepseek, AI21, XAI Grok, Cohere, Perplexity, Cloudflare, OpenRouter, Ernie, Qianwen, Moonshot, ZhipuAI, MiniMax, Deepinfra, VoyageAI, and any OpenAI-compatible API provider.
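Provider credentials live in AIChat's `config.yaml`. An illustrative fragment is shown below; the exact field names vary per provider, so treat this as a sketch and consult the Configuration Guide for the real schema:

```yaml
model: openai:gpt-4o            # default model as provider:model
clients:
  - type: openai
    api_key: sk-xxx             # or set OPENAI_API_KEY instead
  - type: claude
    api_key: sk-ant-xxx
  - type: openai-compatible     # any OpenAI-compatible endpoint
    name: local
    api_base: http://localhost:11434/v1
```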
Explore powerful command-line functionalities with AIChat's CMD mode.
Experience an interactive Chat-REPL with features like tab autocompletion, multi-line input support, history search, configurable keybindings, and custom REPL prompts.
Elevate your command-line efficiency. Describe your tasks in natural language, and let AIChat transform them into precise shell commands. AIChat intelligently adjusts to your OS and shell environment.
Accept diverse input forms such as stdin, local files and directories, and remote URLs, allowing flexibility in data handling.
| Input | CMD | REPL |
| --- | --- | --- |
| CMD | `aichat hello` | |
| STDIN | `cat data.txt \| aichat` | |
| Last Reply | | `.file %%` |
| Local files | `aichat -f image.png -f data.txt` | `.file image.png data.txt` |
| Local directories | `aichat -f dir/` | `.file dir/` |
| Remote URLs | `aichat -f https://example.com` | `.file https://example.com` |
| External commands | ``aichat -f '`git diff`'`` | ``.file `git diff` `` |
| Combine Inputs | `aichat -f dir/ -f data.txt explain` | `.file dir/ data.txt -- explain` |
Customize roles to tailor LLM behavior, enhancing interaction efficiency and boosting productivity.
A role consists of a prompt and a model configuration.
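Roles can be created from the REPL or saved as files. The fragment below is an illustrative role definition with front-matter overrides; the field names are indicative only, so check the Role Guide for the exact schema:

```
---
model: openai:gpt-4o
temperature: 0.3
---
You are a senior code reviewer. Point out bugs, unsafe patterns,
and missing error handling before commenting on style.
```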
Maintain context-aware conversations through sessions, ensuring continuity in interactions.
(In the demo, the left side uses a session, while the right side does not.)
Streamline repetitive tasks by combining a series of REPL commands into a custom macro.
Integrate external documents into your LLM conversations for more accurate and contextually relevant responses.
Function calling supercharges LLMs by connecting them to external tools and data sources. This unlocks a world of possibilities, enabling LLMs to go beyond their core capabilities and tackle a wider range of tasks.
We have created a new repository https://github.com/sigoden/llm-functions to help you make the most of this feature.
Integrate external tools to automate tasks, retrieve information, and perform actions directly within your workflow.
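Most providers accept tool declarations in the OpenAI-style JSON schema sketched below; the llm-functions repository generates such declarations from annotated scripts. The weather tool here is purely illustrative:

```json
{
  "name": "get_current_weather",
  "description": "Get the current weather for a city",
  "parameters": {
    "type": "object",
    "properties": {
      "city": { "type": "string", "description": "City name, e.g. Berlin" }
    },
    "required": ["city"]
  }
}
```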
AI Agent = Instructions (Prompt) + Tools (Function Calling) + Documents (RAG).
AIChat includes a powerful built-in HTTP server that serves both APIs and the web interface.
```
$ aichat --serve
Chat Completions API: http://127.0.0.1:8000/v1/chat/completions
Embeddings API:       http://127.0.0.1:8000/v1/embeddings
Rerank API:           http://127.0.0.1:8000/v1/rerank
LLM Playground:       http://127.0.0.1:8000/playground
LLM Arena:            http://127.0.0.1:8000/arena?num=2
Web Platform:         http://127.0.0.1:8000/ (Matrix GUI)
```

Web Platform Features:
- Matrix Console Interface - Cyberpunk-themed UI with green code rain animation
- Authentication Portal - Secure login/signup with JWT tokens
- API Marketplace - Browse, purchase, and sell API key access
- Web Terminal - Full-featured terminal emulator with xterm.js
- Multi-Provider Chat - Compare responses from multiple LLMs side-by-side
- Document Manager - Upload from cloud storage (Google Drive, OneDrive)
- Analytics Dashboard - Track usage, costs, and performance metrics
- Billing Portal - Manage payments, balance, and transactions
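The authentication portal issues JWTs. As a rough sketch of what HS256 signing and verification involve (illustrative only; the platform's Rust implementation, secret handling, and claim set will differ):

```python
import base64, hashlib, hmac, json, time

SECRET = b"demo-secret"  # illustrative; a real deployment uses a strong random key

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign(payload: dict) -> str:
    """Build header.payload.signature with HMAC-SHA256."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    msg = f"{header}.{body}".encode()
    sig = b64url(hmac.new(SECRET, msg, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify(token: str) -> dict:
    """Check the signature and expiry, then return the claims."""
    header, body, sig = token.split(".")
    msg = f"{header}.{body}".encode()
    expected = b64url(hmac.new(SECRET, msg, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    claims = json.loads(base64.urlsafe_b64decode(body + "=" * (-len(body) % 4)))
    if claims["exp"] < time.time():
        raise ValueError("expired")
    return claims

token = sign({"sub": "user-42", "exp": time.time() + 3600})
print(verify(token)["sub"])  # prints "user-42"
```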
The LLM Arena is a web-based platform where you can compare different LLMs side-by-side.
Test with curl:

```sh
curl -X POST -H "Content-Type: application/json" -d '{
  "model": "claude:claude-3-5-sonnet-20240620",
  "messages": [{"role": "user", "content": "hello"}],
  "stream": true
}' http://127.0.0.1:8000/v1/chat/completions
```

LLM Playground: a web application to interact with supported LLMs directly from your browser.
LLM Arena: a web platform to compare different LLMs side-by-side.
AIChat supports custom dark and light themes, which highlight response text and code blocks.
- Chat-REPL Guide
- Command-Line Guide
- Role Guide
- Macro Guide
- RAG Guide
- Environment Variables
- Configuration Guide
- Custom Theme
- Custom REPL Prompt
- FAQ
- Web Platform Architecture - Technical overview and architecture
- API Marketplace Guide - Buy/sell API keys
- Web Terminal Usage - Browser-based CLI
- Multi-Provider Chat - Model comparison
- Document Processing - Cloud integrations
- Billing & Payments - Stripe integration
- Analytics Dashboard - Usage metrics
AIChat is built in Rust with a modular architecture:
- CLI Interface (`src/cli.rs`) - Clap-based command parsing
- Client System (`src/client/`) - Multi-provider LLM integration (20+ providers)
- Server (`src/serve.rs`) - Hyper-based HTTP/WebSocket server
- Configuration (`src/config/`) - Roles, sessions, and agents management
- RAG System (`src/rag/`) - Document processing and vector search
- REPL (`src/repl/`) - Interactive chat with syntax highlighting
- Function Calling (`src/function.rs`) - External tool integration
- Backend: Rust (Hyper, Tokio, Serde)
- Database: PostgreSQL/Supabase
- Frontend: Matrix-themed HTML/CSS/JS with WebSockets
- Terminal: xterm.js with WebSocket backend
- Payments: Stripe API integration
- Cloud Storage: Google Drive & OneDrive APIs
- Search: DuckDuckGo & Google Custom Search
Phase 1: Foundation (Complete)
- Matrix-themed web interface with animated code rain
- PostgreSQL database schema and migrations
- JWT authentication and user management system
Phase 2: Marketplace & Payments (In Progress)
- API key marketplace with listing management
- Dynamic pricing engine and capacity tracking
- Intelligent request routing with failover
- Stripe payment integration and billing system
- Balance management and automated payouts
Phase 3: Enhanced UX (Planned)
- xterm.js terminal emulator with WebSocket backend
- Side-by-side multi-provider chat comparison
- Real-time cost tracking and analysis
- Google Drive & OneDrive cloud integration
- Document processing pipeline with RAG
Phase 4: Intelligence & Insights (Planned)
- DuckDuckGo & Google Search integration
- Web search with citation extraction
- Comprehensive analytics dashboard
- Usage metrics and PDF report generation
- System-wide performance monitoring
Track progress in Epic Issue #1 and individual task issues:
- Task 1: Matrix Console Theme Foundation (Complete)
- Task 2: Database Schema & User Management (Complete)
- Task 3: API Key Marketplace Core (#2-6)
- Task 4: Terminal Emulator (#7-11)
- Task 5: Multi-Provider Chat (#12-16)
- Task 6: Document Processing (#17-21)
- Task 7: Web Search Integration (#22-26)
- Task 8: Billing & Payment System (#27-31)
- Task 9: Analytics & Reporting (#32-36)
Total Progress: 13% complete (2 of 9 core tasks) • 35 subtasks planned • ~15 weeks estimated
We welcome contributions! See our Epic roadmap for areas where help is needed. Each feature has detailed subtasks with implementation guides.
```sh
# Clone repository
git clone https://github.com/johnproblems/aichat.git
cd aichat

# Build project
cargo build

# Run tests
cargo test

# Start local server
cargo run -- --serve

# Access web platform
open http://localhost:8000
```

```
aichat/
├── src/                 # Rust source code
│   ├── cli.rs           # CLI interface
│   ├── serve.rs         # HTTP/WebSocket server
│   ├── client/          # Multi-provider LLM clients
│   ├── config/          # Configuration management
│   ├── rag/             # Document processing
│   └── repl/            # Interactive REPL
├── assets/              # Web UI assets
│   ├── index.html       # Matrix-themed interface
│   ├── styles.css       # Cyberpunk styling
│   └── app.js           # Frontend logic
├── migrations/          # Database migrations
├── .claude/             # Claude Code project management
│   ├── epics/           # Epic and task tracking
│   └── prds/            # Product requirements
└── Cargo.toml           # Rust dependencies
```
Copyright (c) 2023-2025 aichat-developers. Copyright (c) 2025 FormalHosting - Web Platform Transformation
AIChat is made available under the terms of either the MIT License or the Apache License 2.0, at your option.
See the LICENSE-APACHE and LICENSE-MIT files for license details.