Overview
Implement optional cache compression for large cache values with configurable compression thresholds and algorithms to reduce memory usage and improve storage efficiency, especially for persistent storage adapters.
Background Analysis
Current State
- RunCache stores values as strings in CacheState.value
- Supports persistent storage via adapters (localStorage, IndexedDB, filesystem)
- Has middleware system for value transformation
- No compression currently implemented
Research Findings
- Native Browser Support: CompressionStream/DecompressionStream API available in modern browsers (Chrome 80+, Safari 16.4+; Firefox support varies)
- Library Options: lz-string, lzutf8.js for broader compatibility
- Node.js: Native zlib module available
- Common Algorithms: gzip, deflate, LZ-based algorithms
Implementation Strategy
Phase 1: Core Compression Infrastructure
1.1 Compression Interface Design
Create a flexible compression interface that supports multiple algorithms:
// src/types/compression.ts
export interface CompressionAlgorithm {
name: string;
compress(data: string): Promise<string>;
decompress(data: string): Promise<string>;
isAvailable(): boolean;
}
export interface CompressionConfig {
enabled?: boolean;
algorithm?: 'gzip' | 'deflate' | 'lz-string' | 'auto';
threshold?: number; // Compress values larger than X bytes
level?: number; // Compression level (1-9, where applicable)
}
1.2 Algorithm Implementations
Create concrete implementations for different environments:
Native Browser Implementation (src/compression/native-compression.ts):
export class NativeCompressionAlgorithm implements CompressionAlgorithm {
name = 'native';
constructor(private format: 'gzip' | 'deflate' = 'gzip') {}
async compress(data: string): Promise<string> {
// Use CompressionStream API with base64 encoding
}
async decompress(data: string): Promise<string> {
// Use DecompressionStream API
}
isAvailable(): boolean {
return typeof CompressionStream !== 'undefined';
}
}
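A minimal sketch of how those two stubs could be filled in, assuming base64 is used as the transport encoding so the compressed payload stays a plain string; the helper names below are illustrative, not existing code:

```typescript
// Sketch only: compress/decompress via the native streams API, base64-encoded.
async function gzipCompress(data: string, format: 'gzip' | 'deflate' = 'gzip'): Promise<string> {
  const stream = new Blob([data]).stream().pipeThrough(new CompressionStream(format));
  const bytes = new Uint8Array(await new Response(stream).arrayBuffer());
  let binary = '';
  for (const byte of bytes) binary += String.fromCharCode(byte); // simple, adequate for a sketch
  return btoa(binary);
}

async function gzipDecompress(data: string, format: 'gzip' | 'deflate' = 'gzip'): Promise<string> {
  const bytes = Uint8Array.from(atob(data), (c) => c.charCodeAt(0));
  const stream = new Blob([bytes]).stream().pipeThrough(new DecompressionStream(format));
  return new Response(stream).text();
}
```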
LZ-String Fallback (src/compression/lz-string-compression.ts):
export class LZStringCompressionAlgorithm implements CompressionAlgorithm {
async compress(data: string): Promise<string> {
// Use lz-string library
}
// ... implementation
}
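For the lz-string path, a possible fill-in is sketched below; compressToUTF16/decompressFromUTF16 are chosen here (an assumption) because their output survives string-based storage adapters:

```typescript
import LZString from 'lz-string';

// Sketch only: lz-string is synchronous, so the async interface simply wraps it.
async function lzCompress(data: string): Promise<string> {
  return LZString.compressToUTF16(data);
}

async function lzDecompress(data: string): Promise<string> {
  // Invalid or corrupted input yields null/empty; callers should treat that as a failure.
  return LZString.decompressFromUTF16(data) ?? '';
}
```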
Node.js Implementation (src/compression/node-compression.ts):
export class NodeCompressionAlgorithm implements CompressionAlgorithm {
async compress(data: string): Promise<string> {
// Use Node.js zlib module
}
// ... implementation
}
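On the Node side, the built-in zlib module covers both gzip and deflate; a rough sketch using promisified callbacks, with the configurable level from CompressionConfig passed through (function names are illustrative):

```typescript
import { promisify } from 'node:util';
import { gzip, gunzip } from 'node:zlib';

const gzipAsync = promisify(gzip);
const gunzipAsync = promisify(gunzip);

// Sketch only: base64 keeps the compressed payload as an ordinary string value.
async function nodeCompress(data: string, level = 6): Promise<string> {
  const buffer = await gzipAsync(Buffer.from(data, 'utf8'), { level });
  return buffer.toString('base64');
}

async function nodeDecompress(data: string): Promise<string> {
  const buffer = await gunzipAsync(Buffer.from(data, 'base64'));
  return buffer.toString('utf8');
}
```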
1.3 Compression Manager
Create a manager to handle algorithm selection and fallbacks:
// src/compression/compression-manager.ts
export class CompressionManager {
private algorithm: CompressionAlgorithm;
constructor(config: CompressionConfig) {
this.algorithm = this.selectAlgorithm(config.algorithm);
}
private selectAlgorithm(preferred?: string): CompressionAlgorithm {
// Auto-detect best available algorithm
// Priority: Native -> LZ-String -> None
}
async compressIfNeeded(data: string, threshold: number): Promise<{
data: string;
compressed: boolean;
originalSize: number;
compressedSize: number;
}> {
// Compress only if data size > threshold
}
}
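A rough sketch of the threshold logic behind compressIfNeeded, assuming byte size is measured with TextEncoder so multi-byte characters are counted correctly; the standalone signature is illustrative:

```typescript
// Sketch only: compress when the encoded size exceeds the threshold, and keep
// the original value if compression does not actually shrink it.
async function compressIfNeeded(algorithm: CompressionAlgorithm, data: string, threshold: number) {
  const originalSize = new TextEncoder().encode(data).length;
  if (originalSize < threshold) {
    return { data, compressed: false, originalSize, compressedSize: originalSize };
  }
  const compressedData = await algorithm.compress(data);
  const compressedSize = new TextEncoder().encode(compressedData).length;
  if (compressedSize >= originalSize) {
    // Incompressible input (e.g. already-compressed payloads): store as-is
    return { data, compressed: false, originalSize, compressedSize: originalSize };
  }
  return { data: compressedData, compressed: true, originalSize, compressedSize };
}
```

The manager would also need a matching decompress passthrough to the selected algorithm, since the get path in Phase 2 calls this.compressionManager.decompress(...).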
Phase 2: Integration with Core Cache
2.1 Update CacheState Type
Extend the cache state to track compression metadata:
// src/types/cache-state.ts
export type CacheState = {
value: string;
compressed?: boolean; // New field
originalSize?: number; // New field
compressionRatio?: number; // New field
// ... existing fields
};
2.2 Update CacheConfig
Add compression configuration to cache config:
// src/types/cache-config.ts
export interface CacheConfig {
// ... existing fields
compression?: CompressionConfig;
}
2.3 Modify CacheStore Operations
Update core cache operations to handle compression:
Set Operation (src/core/cache-store.ts):
async set(params: SetParams): Promise<boolean> {
// ... existing validation
// Apply middleware first
let processedValue = await this.middlewareManager.execute(value, context);
// Apply compression if configured
if (this.compressionManager) {
const result = await this.compressionManager.compressIfNeeded(
processedValue,
this.config.compression?.threshold || Infinity
);
processedValue = result.data;
// Update cache state with compression metadata
cacheState.compressed = result.compressed;
cacheState.originalSize = result.originalSize;
cacheState.compressionRatio = result.originalSize / result.compressedSize;
}
// ... rest of set logic
}
Get Operation:
private async getSingle(key: string): Promise<string | undefined> {
const cached = this.cache.get(key);
if (!cached) return undefined;
// ... expiry checks
let value = cached.value;
// Decompress if needed
if (cached.compressed && this.compressionManager) {
value = await this.compressionManager.decompress(value);
}
// Apply middleware
const result = await this.middlewareManager.execute(value, context);
return result;
}
Phase 3: Storage Adapter Integration
3.1 Serialization Updates
Update cache serialization to include compression metadata:
// In CacheStore.serializeCache()
const data: SerializedCacheData = {
version: 2, // Increment version for compression support
timestamp: Date.now(),
entries,
config: {
maxEntries: this.config.maxEntries,
evictionPolicy: this.config.evictionPolicy,
compression: this.config.compression, // New field
},
};
3.2 Migration Strategy
Handle backward compatibility when loading compressed data:
private deserializeCache(data: string): void {
const parsed: SerializedCacheData = JSON.parse(data);
if (parsed.version === 1) {
// Handle legacy data without compression
this.migrateLegacyData(parsed);
} else if (parsed.version === 2) {
// Handle new format with compression
this.loadCompressedData(parsed);
}
}
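migrateLegacyData is referenced above but not spelled out; one possible treatment, assuming serialized entries are keyed by cache key, is to load version-1 entries unchanged and mark them as uncompressed (illustrative only):

```typescript
// Sketch only (would live in CacheStore): version-1 entries were never compressed,
// so migration just fills in the new metadata defaults.
function migrateLegacyData(cache: Map<string, CacheState>, entries: Record<string, CacheState>): void {
  for (const [key, entry] of Object.entries(entries)) {
    cache.set(key, { ...entry, compressed: false });
  }
}
```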
Phase 4: Configuration and API
4.1 Update RunCache API
Add compression configuration methods:
// src/run-cache.ts
export class RunCache {
static async configure(config: CacheConfig): Promise<void> {
await RunCache.ensureInitialized();
await RunCache.instance.configure(config);
}
// New methods for compression management
static async getCompressionStats(): Promise<CompressionStats> {
await RunCache.ensureInitialized();
return RunCache.instance.getCompressionStats();
}
static async optimizeCompression(): Promise<void> {
await RunCache.ensureInitialized();
return RunCache.instance.recompressCache();
}
}
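CompressionStats is referenced here but not defined elsewhere in this plan; a possible shape, purely illustrative:

```typescript
// Illustrative only: fields the stats call could reasonably report.
export interface CompressionStats {
  totalEntries: number;      // all entries currently cached
  compressedEntries: number; // entries stored in compressed form
  originalBytes: number;     // combined size before compression
  compressedBytes: number;   // combined size after compression
  averageRatio: number;      // originalBytes / compressedBytes
  algorithm: string;         // algorithm currently in use
}
```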
4.2 Configuration Examples
Provide clear configuration options:
// Example configurations
const config: CacheConfig = {
compression: {
enabled: true,
algorithm: 'auto', // Auto-select best available
threshold: 1024, // Compress values > 1KB
level: 6, // Default compression level
}
};
// Environment-specific configs
const browserConfig: CacheConfig = {
compression: {
enabled: true,
algorithm: 'gzip', // Use native CompressionStream
threshold: 512,
}
};
const nodeConfig: CacheConfig = {
compression: {
enabled: true,
algorithm: 'gzip', // Use Node.js zlib
threshold: 1024,
level: 9, // High compression for server
}
};
Phase 5: Testing and Validation
5.1 Unit Tests
Create comprehensive tests for compression functionality:
// src/compression/compression.test.ts
describe('Compression', () => {
describe('Algorithm Detection', () => {
it('should prefer native compression when available');
it('should fallback to lz-string when native unavailable');
it('should handle compression errors gracefully');
});
describe('Threshold Behavior', () => {
it('should not compress values below threshold');
it('should compress values above threshold');
it('should track compression ratios');
});
describe('Backward Compatibility', () => {
it('should read uncompressed legacy data');
it('should migrate data format correctly');
});
});
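As an illustration of how the threshold cases might be fleshed out, assuming the CompressionManager API sketched in Phase 1:

```typescript
// Sketch only: exercises the threshold behaviour described above.
it('should not compress values below threshold', async () => {
  const manager = new CompressionManager({ enabled: true, algorithm: 'auto' });
  const result = await manager.compressIfNeeded('tiny value', 1024);
  expect(result.compressed).toBe(false);
  expect(result.data).toBe('tiny value');
});

it('should compress values above threshold', async () => {
  const manager = new CompressionManager({ enabled: true, algorithm: 'auto' });
  const large = 'x'.repeat(10000); // highly repetitive, compresses well
  const result = await manager.compressIfNeeded(large, 1024);
  expect(result.compressed).toBe(true);
  expect(result.compressedSize).toBeLessThan(result.originalSize);
});
```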
5.2 Integration Tests
Test compression with existing features:
// src/compression/integration.test.ts
describe('Compression Integration', () => {
describe('With Middleware', () => {
it('should apply compression after middleware');
it('should decompress before middleware on get');
});
describe('With Storage Adapters', () => {
it('should persist compressed data correctly');
it('should restore compressed data correctly');
});
describe('With Eviction Policies', () => {
it('should calculate memory usage correctly with compression');
});
});
5.3 Performance Tests
Benchmark compression performance:
// src/compression/performance.test.ts
describe('Compression Performance', () => {
it('should measure compression speed vs ratio tradeoffs');
it('should measure memory usage improvements');
it('should test with various data sizes and types');
});
Phase 6: Documentation and Examples
6.1 API Documentation
Update README and API docs with compression features:
## Compression
RunCache supports optional compression to reduce memory usage and storage size:
### Basic Usage
```typescript
import { RunCache } from 'run-cache';
// Enable compression for values larger than 1KB
RunCache.configure({
compression: {
enabled: true,
threshold: 1024,
}
});
```
### Algorithm Selection
```typescript
// Use specific algorithm
RunCache.configure({
compression: {
algorithm: 'gzip', // 'gzip', 'deflate', 'lz-string', or 'auto'
threshold: 512,
level: 6, // Compression level (1-9)
}
});
```
6.2 Migration Guide
Provide guidance for existing users:
```markdown
## Migration to Compression
Compression is backward compatible. Existing caches will work without modification.
### Enabling Compression
1. Update your configuration to enable compression
2. Existing data remains uncompressed until updated
3. New data will be compressed based on threshold settings
### Performance Considerations
- Compression adds CPU overhead but reduces memory/storage
- Tune threshold based on your data patterns
- Monitor compression ratios to optimize settings
```
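For example, the stats call sketched in Phase 4 could back the "monitor compression ratios" advice above; the field names follow the illustrative CompressionStats shape, so treat this as a sketch:

```typescript
// Illustrative: periodically check whether compression is paying off.
const stats = await RunCache.getCompressionStats();
console.log(`compressed ${stats.compressedEntries}/${stats.totalEntries} entries`);

if (stats.averageRatio < 1.2) {
  // Barely shrinking anything: raise the threshold so fewer values are compressed.
  await RunCache.configure({ compression: { enabled: true, threshold: 4096 } });
}
```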
Implementation Dependencies
External Libraries
- lz-string (^1.5.0) - Fallback compression for broader compatibility
- @types/node (dev) - For Node.js zlib types
Browser Compatibility
- Native Compression: Chrome 80+, Safari 16.4+, Firefox 113+
- Fallback (lz-string): All modern browsers + IE9+
- Node.js: All supported versions (native zlib)
File Structure
src/
├── compression/
│ ├── index.ts # Public exports
│ ├── compression-manager.ts # Main compression coordinator
│ ├── algorithms/
│ │ ├── native-compression.ts # Browser CompressionStream
│ │ ├── lz-string-compression.ts # lz-string fallback
│ │ ├── node-compression.ts # Node.js zlib
│ │ └── index.ts # Algorithm exports
│ ├── compression.test.ts # Unit tests
│ ├── integration.test.ts # Integration tests
│ └── performance.test.ts # Performance benchmarks
├── types/
│ └── compression.ts # Compression type definitions
├── core/
│ └── cache-store.ts # Updated with compression
└── run-cache.ts # Updated API
Timeline Estimation
Phase 1 (Week 1-2): Core Infrastructure
- Compression interfaces and algorithm implementations
- Basic compression manager
- Unit tests for algorithms
Phase 2 (Week 2-3): Cache Integration
- Update CacheState and CacheConfig types
- Integrate compression into set/get operations
- Handle compression metadata
Phase 3 (Week 3-4): Storage Integration
- Update serialization format
- Implement migration strategy
- Update storage adapters
Phase 4 (Week 4): API & Configuration
- Update RunCache public API
- Add compression management methods
- Configuration validation
Phase 5 (Week 5): Testing
- Comprehensive test suite
- Performance benchmarks
- Cross-environment testing
Phase 6 (Week 6): Documentation
- API documentation updates
- Usage examples
- Migration guides
Success Metrics
- Functionality: All compression algorithms work correctly across environments
- Compatibility: Backward compatibility maintained, no breaking changes
- Performance: Compression reduces storage size by 30-70% for typical text data
- Reliability: 100% test coverage for compression functionality
- Usability: Simple configuration with sensible defaults
Risk Mitigation
- Browser Compatibility: Graceful fallback to lz-string when native compression unavailable
- Performance Impact: Configurable thresholds to avoid compressing small values
- Data Corruption: Comprehensive error handling and validation
- Breaking Changes: Maintain full backward compatibility with version migration
- Memory Usage: Monitor compression overhead vs. benefits
Future Enhancements
- Additional Algorithms: Brotli support when browser support improves
- Adaptive Compression: Automatically adjust thresholds based on data patterns
- Compression Analytics: Detailed metrics and reporting
- Streaming Compression: For very large values
- Custom Algorithms: Plugin system for user-defined compression methods