Official integration examples for Cachee.ai — drop-in RESP proxy that accelerates any Redis-compatible cache with sub-microsecond L1 reads.
Cachee is an AI-powered caching proxy that sits in front of your existing cache (Redis, ElastiCache, Cloudflare KV, etc.) and serves hot data from L1 memory in 17 nanoseconds. No code changes — swap one connection string.
- 17ns GET latency (59,000x faster than Redis)
- 59M ops/sec throughput
- 98.1% cache hit rate with ML-powered prediction
- RESP protocol compatible — works with any Redis client
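RESP compatibility means any Redis client can talk to Cachee unchanged, because every command is just a RESP array of bulk strings on the wire. A minimal sketch of that encoding (illustrative only — not Cachee code, and the function name is made up):

```python
def resp_encode(*args):
    """Encode a command as a RESP array of bulk strings (what any Redis client sends)."""
    out = [f"*{len(args)}\r\n".encode()]
    for a in args:
        data = a.encode() if isinstance(a, str) else a
        out.append(f"${len(data)}\r\n".encode() + data + b"\r\n")
    return b"".join(out)

# GET user:123:profile on the wire:
resp_encode("GET", "user:123:profile")
# → b'*2\r\n$3\r\nGET\r\n$16\r\nuser:123:profile\r\n'
```

Because the wire format is identical, the proxy can answer from L1 or forward the same bytes upstream without the client noticing.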
```javascript
// Before: connect to Redis
const redis = new Redis('redis://your-redis:6379');

// After: connect to Cachee (same API, 59,000x faster reads)
const redis = new Redis('redis://cachee-proxy:6380');

const value = await redis.get('user:123:profile');
// 17ns from L1 cache instead of 1ms from Redis
```

```python
import redis

# Point at Cachee instead of Redis
cache = redis.Redis(host='cachee-proxy', port=6380)

# Same API — sub-microsecond from L1
profile = cache.get('user:123:profile')
```

```go
client := redis.NewClient(&redis.Options{
    Addr: "cachee-proxy:6380", // Cachee instead of Redis
})
val, err := client.Get(ctx, "user:123:profile").Result()
// 17ns L1 hit
```

```rust
let client = redis::Client::open("redis://cachee-proxy:6380/")?;
let mut con = client.get_connection()?;
let val: String = redis::cmd("GET").arg("user:123:profile").query(&mut con)?;
```

Cachee sits in front of your existing cache as an L1 acceleration layer. Your app connects to Cachee; hot data is served from memory, and misses are forwarded to your backend.
App → Cachee (17ns L1) → Your Redis/ElastiCache/etc (1ms)
Supported backends: AWS ElastiCache, Cloudflare KV, Redis Cloud, Azure Cache for Redis, GCP Memorystore, Upstash, and any Redis-compatible server.
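The miss-forwarding behavior above is a classic read-through pattern. A toy sketch of the idea (this models the flow, not Cachee's implementation; class and parameter names are illustrative):

```python
class L1ReadThrough:
    """Toy model of an L1 memory cache in front of a slower backend store."""

    def __init__(self, backend, max_keys=1_000_000):
        self.backend = backend          # e.g. a Redis client; any object with .get()
        self.l1 = {}                    # hot keys served from local memory
        self.max_keys = max_keys

    def get(self, key):
        if key in self.l1:              # L1 hit: answered from memory
            return self.l1[key]
        value = self.backend.get(key)   # L1 miss: forward to the backend
        if value is not None and len(self.l1) < self.max_keys:
            self.l1[key] = value        # populate L1 so the next read is a hit
        return value

# Usage, with a plain dict standing in for the backend cache:
cache = L1ReadThrough(backend={"user:123:profile": b"hot"})
cache.get("user:123:profile")  # first read forwards to the backend
cache.get("user:123:profile")  # second read is an L1 hit
```

The real proxy layers eviction and ML-driven prediction on top of this, but the hit/miss path is the part your application observes.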
Deploy Cachee as a sidecar container in your application pod.
```yaml
# See helm-charts repo for full Kubernetes deployment
# https://github.com/HapPhi/cachee-helm-charts
containers:
  - name: app
    image: your-app:latest
    env:
      - name: REDIS_URL
        value: "redis://localhost:6380"
  - name: cachee
    image: cachee/proxy:latest
    ports:
      - containerPort: 6380
```

Or run the full stack locally with Docker Compose:

```yaml
services:
  cachee:
    image: cachee/proxy:latest
    ports:
      - "6380:6380"
    environment:
      - CACHEE_UPSTREAM=redis://redis:6379
      - CACHEE_L1_MAX_KEYS=1000000
  redis:
    image: redis:7-alpine
    ports:
      - "6379:6379"
```

| Use Case | Example | Docs |
|---|---|---|
| Trading / HFT | Order lifecycle cache | cachee.ai/trading |
| AI/ML Inference | KV cache + embeddings | cachee.ai/ai |
| Gaming | Game state at 128-tick | cachee.ai/gaming |
| 5G / Telecom | MEC edge caching | cachee.ai/5g-telecom |
| Session Management | Auth session store | cachee.ai/session-scaling |
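Both deployments above hand your application the proxy address via `REDIS_URL`. A small sketch of consuming that setting in application code (the fallback URL is an assumption matching the Compose example):

```python
import os
from urllib.parse import urlparse

# The sidecar sets REDIS_URL to the local Cachee port; fall back to the
# Compose default when the variable is unset (fallback value is an assumption)
url = urlparse(os.environ.get("REDIS_URL", "redis://localhost:6380"))
host = url.hostname or "localhost"
port = url.port or 6380
# redis.Redis(host=host, port=port) would now route reads through Cachee
```

Keeping the address in one environment variable is what makes the swap back to plain Redis a one-line change.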
- Website: cachee.ai
- Start Free Trial: cachee.ai/start
- Integrations: cachee.ai/integrations
- Pricing: cachee.ai/pricing
- Blog: cachee.ai/blog
- Benchmarks: github.com/HapPhi/cachee-benchmarks
- Helm Charts: github.com/HapPhi/cachee-helm-charts
MIT