HapPhi/cachee-examples

Cachee Integration Examples

Official integration examples for Cachee.ai — drop-in RESP proxy that accelerates any Redis-compatible cache with sub-microsecond L1 reads.

What is Cachee?

Cachee is an AI-powered caching proxy that sits in front of your existing cache (Redis, ElastiCache, CloudFlare KV, etc.) and serves hot data from L1 memory in 17 nanoseconds. No code changes — swap one connection string.

  • 17ns GET latency (59,000x faster than Redis)
  • 59M ops/sec throughput
  • 98.1% cache hit rate with ML-powered prediction
  • RESP protocol compatible — works with any Redis client
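Because Cachee speaks RESP on the wire, existing clients need no changes. As a minimal illustration (generic protocol code, not Cachee-specific), here is what a GET command looks like in RESP2 — a command is simply an array of bulk strings:

```python
# RESP2 encoding sketch: the byte stream any Redis client sends for a
# command, and what a RESP-compatible proxy like Cachee parses.

def resp_encode(*args: str) -> bytes:
    out = f"*{len(args)}\r\n".encode()       # array header: element count
    for arg in args:
        data = arg.encode()
        out += f"${len(data)}\r\n".encode()  # bulk string header: byte length
        out += data + b"\r\n"
    return out

print(resp_encode("GET", "user:123:profile"))
# b'*2\r\n$3\r\nGET\r\n$16\r\nuser:123:profile\r\n'
```

Since every command below (Node.js, Python, Go, Rust) reduces to this same wire format, pointing any of those clients at the proxy works without code changes.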

How it works | Benchmarks | Start free

Examples by Language

Node.js

import Redis from 'ioredis'; // or any Redis-compatible client

// Before: connect to Redis
// const redis = new Redis('redis://your-redis:6379');

// After: connect to Cachee (same API, 59,000x faster reads)
const redis = new Redis('redis://cachee-proxy:6380');

const value = await redis.get('user:123:profile');
// 17ns from L1 cache instead of 1ms from Redis

Full Node.js example | Node.js SDK docs

Python

import redis

# Point at Cachee instead of Redis
cache = redis.Redis(host='cachee-proxy', port=6380)

# Same API — sub-microsecond from L1
profile = cache.get('user:123:profile')

Full Python example | Python SDK docs

Go

// assumes the go-redis client (github.com/redis/go-redis/v9)
// and a ctx := context.Background() in scope
client := redis.NewClient(&redis.Options{
    Addr: "cachee-proxy:6380", // Cachee instead of Redis
})

val, err := client.Get(ctx, "user:123:profile").Result()
// 17ns L1 hit

Full Go example | Go SDK docs

Rust

let client = redis::Client::open("redis://cachee-proxy:6380/")?;
let mut con = client.get_connection()?;

let val: String = redis::cmd("GET").arg("user:123:profile").query(&mut con)?;

Full Rust example | Rust integration guide

Integration Patterns

Overlay Mode (Accelerate Existing Cache)

Cachee sits in front of your existing cache as an L1 acceleration layer: your app connects to Cachee, hot keys are served from L1 memory, and misses are forwarded to your backend.

App → Cachee (17ns L1) → Your Redis/ElastiCache/etc (1ms)
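The read path above can be sketched as a read-through cache. This is a conceptual toy, not Cachee's implementation — the class and parameter names are illustrative:

```python
class DictBackend:
    """Stand-in for Redis/ElastiCache in this sketch."""
    def __init__(self, data):
        self.data = data

    def get(self, key):
        return self.data.get(key)

class L1Overlay:
    """Read-through L1 in front of a slower backend (toy illustration)."""
    def __init__(self, backend, max_keys=1_000_000):
        self.backend = backend    # any object with .get(key), e.g. a Redis client
        self.l1 = {}              # in-memory hot set
        self.max_keys = max_keys

    def get(self, key):
        if key in self.l1:                 # L1 hit: served from local memory
            return self.l1[key]
        value = self.backend.get(key)      # miss: forward to the backend
        if value is not None and len(self.l1) < self.max_keys:
            self.l1[key] = value           # promote for future hits
        return value

cache = L1Overlay(DictBackend({"user:123:profile": b'{"name": "Ada"}'}))
cache.get("user:123:profile")   # first read: forwarded to the backend
cache.get("user:123:profile")   # second read: served from L1
```

The real proxy does this at the RESP protocol layer, so the promotion and forwarding are invisible to the client.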

Supported backends: AWS ElastiCache, CloudFlare KV, Redis Cloud, Azure Cache, GCP Memorystore, Upstash, Any Redis

All supported integrations

Sidecar Mode (Kubernetes)

Deploy Cachee as a sidecar container alongside your application pod.

# See helm-charts repo for full Kubernetes deployment
# https://github.com/HapPhi/cachee-helm-charts
containers:
  - name: app
    image: your-app:latest
    env:
      - name: REDIS_URL
        value: "redis://localhost:6380"
  - name: cachee
    image: cachee/proxy:latest
    ports:
      - containerPort: 6380

Kubernetes deployment guide | Helm charts

Docker Compose

services:
  cachee:
    image: cachee/proxy:latest
    ports:
      - "6380:6380"
    environment:
      - CACHEE_UPSTREAM=redis://redis:6379
      - CACHEE_L1_MAX_KEYS=1000000
  redis:
    image: redis:7-alpine
    ports:
      - "6379:6379"

Full Docker Compose example

Use Case Examples

Use Case             Example                  Docs
Trading / HFT        Order lifecycle cache    cachee.ai/trading
AI/ML Inference      KV cache + embeddings    cachee.ai/ai
Gaming               Game state at 128-tick   cachee.ai/gaming
5G / Telecom         MEC edge caching         cachee.ai/5g-telecom
Session Management   Auth session store       cachee.ai/session-scaling

License

MIT
