Overview
Implement bulk cache operations (setMany(), getMany(), deleteMany()) and transaction support for atomic operations across multiple keys, reducing per-operation overhead and improving performance.
Background Analysis
Current State
- RunCache operations are performed one key at a time
- Each operation involves individual validation, middleware processing, and storage
- No atomic operations across multiple keys
- No bulk operation optimizations
- Individual event emissions for each operation
Performance Issues with Current Approach
- Network Overhead: Multiple storage adapter calls for related operations
- Validation Overhead: Repeated validation logic for similar operations
- Middleware Overhead: Individual middleware execution for each key
- Event Overhead: Individual event emissions vs. batch events
- Lock Contention: Potential race conditions with concurrent operations
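To make these costs concrete, the per-key pattern the current API forces looks roughly like the sketch below (it reuses the existing RunCache.get()/set() call shapes shown in the migration guide later in this document); every iteration pays its own validation, middleware, storage, and event cost:
// Current per-key usage (sketch)
const userIds = ['user:1', 'user:2', 'user:3'];
for (const id of userIds) {
// each call runs validation, middleware, storage, and events independently
const value = await RunCache.get(id);
if (value === undefined) {
await RunCache.set({ key: id, value: 'fallback-value' });
}
}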
Batch Operation Requirements
- Bulk Operations: setMany(), getMany(), deleteMany(), hasMany() (see the API sketch after this list)
- Transaction Support: Atomic operations with rollback capability
- Performance: Significant reduction in overhead for bulk operations
- Consistency: Maintain data consistency across batch operations
- Error Handling: Partial failure handling and reporting
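For reference, this is the public API surface the plan targets; a sketch only, with the exact signatures defined in Phase 4 below:
await RunCache.setMany([
{ key: 'user:1', value: 'John' },
{ key: 'user:2', value: 'Jane' },
]);
const { results } = await RunCache.getMany(['user:1', 'user:2']);
const existence = await RunCache.hasMany(['user:1', 'user:2']);
await RunCache.deleteMany(['user:1', 'user:2']);
await RunCache.transaction(async (tx) => {
tx.set('user:1', 'John');
tx.delete('user:2');
});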
Implementation Strategy
Phase 1: Batch Operation Interfaces
1.1 Batch Operation Types
Define comprehensive types for batch operations:
// src/types/batch-operations.ts
export interface BatchSetEntry {
key: string;
value?: string;
ttl?: number;
autoRefetch?: boolean;
sourceFn?: SourceFn;
tags?: string[];
dependencies?: string[];
}
export interface BatchGetResult {
key: string;
value: string | undefined;
found: boolean;
error?: Error;
}
export interface BatchSetResult {
key: string;
success: boolean;
error?: Error;
}
export interface BatchDeleteResult {
key: string;
deleted: boolean;
existed: boolean;
}
export interface BatchOperationResult<T> {
results: T[];
successCount: number;
errorCount: number;
totalCount: number;
executionTime: number;
errors: { key: string; error: Error }[];
}
export interface BatchOptions {
continueOnError?: boolean; // Continue processing if individual operations fail
maxConcurrency?: number; // Maximum concurrent operations
validateFirst?: boolean; // Validate all entries before processing
atomic?: boolean; // All operations succeed or all fail
}
export interface TransactionOptions {
timeout?: number; // Transaction timeout in milliseconds
retries?: number; // Number of retry attempts
isolationLevel?: 'read-uncommitted' | 'read-committed' | 'serializable';
}
1.2 Transaction Interface
Define transaction support for atomic operations:
// src/types/transactions.ts
export interface Transaction {
id: string;
startTime: number;
operations: TransactionOperation[];
state: 'pending' | 'committed' | 'rolled-back' | 'error';
// Transaction operations
set(key: string, value: string, options?: SetOptions): Transaction;
get(key: string): Promise<string | undefined>;
delete(key: string): Transaction;
// Batch operations within transaction
setMany(entries: BatchSetEntry[]): Transaction;
deleteMany(keys: string[]): Transaction;
// Transaction control
commit(): Promise<TransactionResult>;
rollback(): Promise<void>;
getState(): TransactionState;
}
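// Note: TransactionState (the return type of getState()) is not defined elsewhere in this
// plan. A minimal shape inferred from CacheTransaction.getState() in Phase 2.2 (an
// assumption, not a finalized contract) would be:
export interface TransactionState {
id: string;
state: 'pending' | 'committed' | 'rolled-back' | 'error';
startTime: number;
operationCount: number;
executionTime?: number; // populated only while the transaction is still pending
}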
export interface TransactionOperation {
type: 'set' | 'delete' | 'get';
key: string;
value?: string;
originalValue?: string; // For rollback
options?: any;
}
export interface TransactionResult {
success: boolean;
operationsExecuted: number;
errors: Error[];
executionTime: number;
}
Phase 2: Core Batch Implementation
2.1 Batch Operations Manager
Create a dedicated manager for batch operations:
// src/core/batch-operations-manager.ts
export class BatchOperationsManager {
private cache: Map<string, CacheState>;
private logger: Logger;
private middlewareManager: MiddlewareManager;
private eventSystem: EventSystem;
constructor(
cache: Map<string, CacheState>,
logger: Logger,
middlewareManager: MiddlewareManager,
eventSystem: EventSystem
) {
this.cache = cache;
this.logger = logger;
this.middlewareManager = middlewareManager;
this.eventSystem = eventSystem;
}
async setMany(
entries: BatchSetEntry[],
options: BatchOptions = {}
): Promise<BatchOperationResult<BatchSetResult>> {
const startTime = Date.now();
const results: BatchSetResult[] = [];
const errors: { key: string; error: Error }[] = [];
this.logger.log('info', `Starting batch set operation for ${entries.length} entries`);
// Validate all entries first if requested
if (options.validateFirst) {
const validationErrors = this.validateBatchSetEntries(entries);
if (validationErrors.length > 0 && !options.continueOnError) {
throw new Error(`Validation failed: ${validationErrors.map(e => e.message).join(', ')}`);
}
}
// Process entries with concurrency control
const concurrency = options.maxConcurrency || 10;
const batches = this.createBatches(entries, concurrency);
for (const batch of batches) {
const batchResults = await Promise.allSettled(
batch.map(entry => this.processSingleSetOperation(entry))
);
batchResults.forEach((result, index) => {
const entry = batch[index];
if (result.status === 'fulfilled') {
results.push({ key: entry.key, success: true });
} else {
const error = result.reason;
results.push({ key: entry.key, success: false, error });
errors.push({ key: entry.key, error });
if (!options.continueOnError) {
throw new Error(`Batch operation failed at key ${entry.key}: ${error.message}`);
}
}
});
}
const executionTime = Date.now() - startTime;
const successCount = results.filter(r => r.success).length;
// Emit batch event
this.eventSystem.emitEvent('BATCH_SET', {
operationType: 'setMany',
totalCount: entries.length,
successCount,
errorCount: errors.length,
executionTime,
});
this.logger.log('info', `Batch set completed: ${successCount}/${entries.length} successful`);
return {
results,
successCount,
errorCount: errors.length,
totalCount: entries.length,
executionTime,
errors,
};
}
async getMany(
keys: string[],
options: BatchOptions = {}
): Promise<BatchOperationResult<BatchGetResult>> {
const startTime = Date.now();
const results: BatchGetResult[] = [];
const errors: { key: string; error: Error }[] = [];
this.logger.log('info', `Starting batch get operation for ${keys.length} keys`);
// Group keys by pattern (wildcard vs exact)
const { exactKeys, patternKeys } = this.groupKeysByType(keys);
// Process exact keys in batches
const concurrency = options.maxConcurrency || 20; // Higher for reads
const exactBatches = this.createBatches(exactKeys, concurrency);
for (const batch of exactBatches) {
const batchResults = await Promise.allSettled(
batch.map(key => this.processSingleGetOperation(key))
);
batchResults.forEach((result, index) => {
const key = batch[index];
if (result.status === 'fulfilled') {
const value = result.value;
results.push({ key, value, found: value !== undefined });
} else {
const error = result.reason;
results.push({ key, value: undefined, found: false, error });
errors.push({ key, error });
}
});
}
// Process pattern keys individually (they may return multiple results)
for (const pattern of patternKeys) {
try {
const values = await this.processSingleGetOperation(pattern);
if (Array.isArray(values)) {
// Handle multiple matches from pattern
values.forEach((value, index) => {
results.push({
key: `${pattern}[${index}]`,
value,
found: true
});
});
} else {
results.push({ key: pattern, value: values, found: values !== undefined });
}
} catch (error) {
results.push({ key: pattern, value: undefined, found: false, error });
errors.push({ key: pattern, error });
}
}
const executionTime = Date.now() - startTime;
const successCount = results.filter(r => r.found).length;
this.logger.log('info', `Batch get completed: ${successCount}/${keys.length} found`);
return {
results,
successCount,
errorCount: errors.length,
totalCount: keys.length,
executionTime,
errors,
};
}
async deleteMany(
keys: string[],
options: BatchOptions = {}
): Promise<BatchOperationResult<BatchDeleteResult>> {
const startTime = Date.now();
const results: BatchDeleteResult[] = [];
const errors: { key: string; error: Error }[] = [];
this.logger.log('info', `Starting batch delete operation for ${keys.length} keys`);
// Process deletions in batches
const concurrency = options.maxConcurrency || 15;
const batches = this.createBatches(keys, concurrency);
for (const batch of batches) {
const batchResults = await Promise.allSettled(
batch.map(key => this.processSingleDeleteOperation(key))
);
batchResults.forEach((result, index) => {
const key = batch[index];
if (result.status === 'fulfilled') {
const { deleted, existed } = result.value;
results.push({ key, deleted, existed });
} else {
const error = result.reason;
results.push({ key, deleted: false, existed: false });
errors.push({ key, error });
}
});
}
const executionTime = Date.now() - startTime;
const successCount = results.filter(r => r.deleted).length;
// Emit batch event
this.eventSystem.emitEvent('BATCH_DELETE', {
operationType: 'deleteMany',
totalCount: keys.length,
successCount,
errorCount: errors.length,
executionTime,
});
this.logger.log('info', `Batch delete completed: ${successCount}/${keys.length} deleted`);
return {
results,
successCount,
errorCount: errors.length,
totalCount: keys.length,
executionTime,
errors,
};
}
// Helper methods
private validateBatchSetEntries(entries: BatchSetEntry[]): Error[] {
const errors: Error[] = [];
entries.forEach((entry, index) => {
if (!entry.key || !entry.key.length) {
errors.push(new Error(`Entry ${index}: Empty key provided`));
}
if (entry.sourceFn === undefined && (entry.value === undefined || !entry.value.length)) {
errors.push(new Error(`Entry ${index}: Neither value nor sourceFn provided`));
}
if (entry.autoRefetch && !entry.ttl) {
errors.push(new Error(`Entry ${index}: autoRefetch enabled without ttl`));
}
if (entry.ttl !== undefined && entry.ttl <= 0) {
errors.push(new Error(`Entry ${index}: ttl must be a positive number`));
}
});
return errors;
}
private createBatches<T>(items: T[], batchSize: number): T[][] {
const batches: T[][] = [];
for (let i = 0; i < items.length; i += batchSize) {
batches.push(items.slice(i, i + batchSize));
}
return batches;
}
private groupKeysByType(keys: string[]): { exactKeys: string[]; patternKeys: string[] } {
const exactKeys: string[] = [];
const patternKeys: string[] = [];
keys.forEach(key => {
if (key.includes('*')) {
patternKeys.push(key);
} else {
exactKeys.push(key);
}
});
return { exactKeys, patternKeys };
}
private async processSingleSetOperation(entry: BatchSetEntry): Promise<void> {
// Implement single set operation (similar to existing CacheStore.set)
// But optimized for batch context
}
private async processSingleGetOperation(key: string): Promise<string | string[] | undefined> {
// Implement single get operation optimized for batch context
}
private async processSingleDeleteOperation(key: string): Promise<{ deleted: boolean; existed: boolean }> {
// Implement single delete operation optimized for batch context
}
}
2.2 Transaction Manager
Implement transaction support for atomic operations:
// src/core/transaction-manager.ts
export class TransactionManager {
private transactions: Map<string, CacheTransaction> = new Map();
private cache: Map<string, CacheState>;
private logger: Logger;
constructor(cache: Map<string, CacheState>, logger: Logger) {
this.cache = cache;
this.logger = logger;
}
async beginTransaction(options: TransactionOptions = {}): Promise<Transaction> {
const id = this.generateTransactionId();
const transaction = new CacheTransaction(id, this.cache, this.logger, options);
this.transactions.set(id, transaction);
this.logger.log('info', `Transaction ${id} started`);
// Set timeout if specified
if (options.timeout) {
setTimeout(() => {
if (transaction.getState() === 'pending') {
this.logger.log('warn', `Transaction ${id} timed out`);
transaction.rollback();
}
}, options.timeout);
}
return transaction;
}
getTransaction(id: string): Transaction | undefined {
return this.transactions.get(id);
}
async cleanupTransaction(id: string): Promise<void> {
const transaction = this.transactions.get(id);
if (transaction) {
if (transaction.getState() === 'pending') {
await transaction.rollback();
}
this.transactions.delete(id);
this.logger.log('debug', `Transaction ${id} cleaned up`);
}
}
private generateTransactionId(): string {
return `tx_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`;
}
}
export class CacheTransaction implements Transaction {
public readonly id: string;
public readonly startTime: number;
public operations: TransactionOperation[] = [];
public state: 'pending' | 'committed' | 'rolled-back' | 'error' = 'pending';
private cache: Map<string, CacheState>;
private logger: Logger;
private options: TransactionOptions;
private snapshot: Map<string, CacheState | null> = new Map(); // null = key didn't exist
constructor(
id: string,
cache: Map<string, CacheState>,
logger: Logger,
options: TransactionOptions
) {
this.id = id;
this.startTime = Date.now();
this.cache = cache;
this.logger = logger;
this.options = options;
}
set(key: string, value: string, options: any = {}): Transaction {
if (this.state !== 'pending') {
throw new Error(`Transaction ${this.id} is not in pending state`);
}
// Take snapshot of original value for rollback
if (!this.snapshot.has(key)) {
const originalValue = this.cache.get(key) || null;
this.snapshot.set(key, originalValue);
}
this.operations.push({
type: 'set',
key,
value,
options,
});
return this;
}
async get(key: string): Promise<string | undefined> {
if (this.state !== 'pending') {
throw new Error(`Transaction ${this.id} is not in pending state`);
}
// Check pending operations first
const pendingOp = this.operations
.slice()
.reverse()
.find(op => op.key === key);
if (pendingOp) {
if (pendingOp.type === 'set') {
return pendingOp.value;
} else if (pendingOp.type === 'delete') {
return undefined;
}
}
// Fall back to current cache state
const cached = this.cache.get(key);
return cached?.value;
}
delete(key: string): Transaction {
if (this.state !== 'pending') {
throw new Error(`Transaction ${this.id} is not in pending state`);
}
// Take snapshot for rollback
if (!this.snapshot.has(key)) {
const originalValue = this.cache.get(key) || null;
this.snapshot.set(key, originalValue);
}
this.operations.push({
type: 'delete',
key,
});
return this;
}
setMany(entries: BatchSetEntry[]): Transaction {
entries.forEach(entry => {
this.set(entry.key, entry.value || '', entry);
});
return this;
}
deleteMany(keys: string[]): Transaction {
keys.forEach(key => this.delete(key));
return this;
}
async commit(): Promise<TransactionResult> {
if (this.state !== 'pending') {
throw new Error(`Transaction ${this.id} is not in pending state`);
}
const startTime = Date.now();
const errors: Error[] = [];
let operationsExecuted = 0;
try {
this.logger.log('info', `Committing transaction ${this.id} with ${this.operations.length} operations`);
// Execute all operations
for (const operation of this.operations) {
try {
await this.executeOperation(operation);
operationsExecuted++;
} catch (error) {
errors.push(error);
// Rollback on first error
await this.rollback();
this.state = 'error';
throw new Error(`Transaction ${this.id} failed: ${error.message}`);
}
}
this.state = 'committed';
const executionTime = Date.now() - startTime;
this.logger.log('info', `Transaction ${this.id} committed successfully in ${executionTime}ms`);
return {
success: true,
operationsExecuted,
errors,
executionTime,
};
} catch (error) {
this.state = 'error';
return {
success: false,
operationsExecuted,
errors: [...errors, error],
executionTime: Date.now() - startTime,
};
}
}
async rollback(): Promise<void> {
if (this.state === 'rolled-back') {
return; // Already rolled back
}
this.logger.log('info', `Rolling back transaction ${this.id}`);
// Restore original values
for (const [key, originalValue] of this.snapshot.entries()) {
if (originalValue === null) {
// Key didn't exist, delete it
this.cache.delete(key);
} else {
// Restore original value
this.cache.set(key, originalValue);
}
}
this.state = 'rolled-back';
this.logger.log('info', `Transaction ${this.id} rolled back`);
}
getState(): TransactionState {
return {
id: this.id,
state: this.state,
startTime: this.startTime,
operationCount: this.operations.length,
executionTime: this.state === 'pending' ? Date.now() - this.startTime : undefined,
};
}
private async executeOperation(operation: TransactionOperation): Promise<void> {
switch (operation.type) {
case 'set':
// Execute set operation (reuse logic from CacheStore.set but without validation)
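// A sketch of what this case could do (assumption: CacheState stores the cached string
// under a `value` field, as read by get() above; the real implementation would reuse the
// CacheStore.set logic mentioned in the comment above):
// this.cache.set(operation.key, { ...(this.cache.get(operation.key) ?? {}), value: operation.value } as CacheState);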
break;
case 'delete':
// Execute delete operation
this.cache.delete(operation.key);
break;
default:
throw new Error(`Unknown operation type: ${operation.type}`);
}
}
}
Phase 3: CacheStore Integration
3.1 Update CacheStore for Batch Operations
Integrate batch operations into the main CacheStore class:
// src/core/cache-store.ts (updates)
export class CacheStore {
private batchManager: BatchOperationsManager;
private transactionManager: TransactionManager;
constructor(config: CacheConfig = {}) {
// ... existing constructor logic
// Initialize batch and transaction managers
this.batchManager = new BatchOperationsManager(
this.cache,
this.logger,
this.middlewareManager,
this.eventSystem
);
this.transactionManager = new TransactionManager(this.cache, this.logger);
}
/**
* Set multiple cache entries in a batch operation
*/
async setMany(
entries: BatchSetEntry[],
options: BatchOptions = {}
): Promise<BatchOperationResult<BatchSetResult>> {
this.logger.log('info', `Batch setting ${entries.length} entries`);
return this.batchManager.setMany(entries, options);
}
/**
* Get multiple cache entries in a batch operation
*/
async getMany(
keys: string[],
options: BatchOptions = {}
): Promise<BatchOperationResult<BatchGetResult>> {
this.logger.log('info', `Batch getting ${keys.length} keys`);
return this.batchManager.getMany(keys, options);
}
/**
* Delete multiple cache entries in a batch operation
*/
async deleteMany(
keys: string[],
options: BatchOptions = {}
): Promise<BatchOperationResult<BatchDeleteResult>> {
this.logger.log('info', `Batch deleting ${keys.length} keys`);
return this.batchManager.deleteMany(keys, options);
}
/**
* Check existence of multiple keys in a batch operation
*/
async hasMany(
keys: string[],
options: BatchOptions = {}
): Promise<BatchOperationResult<{ key: string; exists: boolean }>> {
this.logger.log('info', `Batch checking existence of ${keys.length} keys`);
const startTime = Date.now();
const results: { key: string; exists: boolean }[] = [];
const errors: { key: string; error: Error }[] = [];
const concurrency = options.maxConcurrency || 20;
const batches = this.createBatches(keys, concurrency);
for (const batch of batches) {
const batchResults = await Promise.allSettled(
batch.map(key => this.has(key))
);
batchResults.forEach((result, index) => {
const key = batch[index];
if (result.status === 'fulfilled') {
results.push({ key, exists: result.value });
} else {
results.push({ key, exists: false });
errors.push({ key, error: result.reason });
}
});
}
const executionTime = Date.now() - startTime;
const successCount = results.filter(r => r.exists).length;
return {
results,
successCount,
errorCount: errors.length,
totalCount: keys.length,
executionTime,
errors,
};
}
/**
* Begin a new transaction for atomic operations
*/
async beginTransaction(options: TransactionOptions = {}): Promise<Transaction> {
return this.transactionManager.beginTransaction(options);
}
/**
* Execute multiple operations atomically
*/
async transaction<T>(
operations: (tx: Transaction) => Promise<T>,
options: TransactionOptions = {}
): Promise<T> {
const tx = await this.beginTransaction(options);
try {
const result = await operations(tx);
await tx.commit();
return result;
} catch (error) {
await tx.rollback();
throw error;
} finally {
await this.transactionManager.cleanupTransaction(tx.id);
}
}
private createBatches<T>(items: T[], batchSize: number): T[][] {
const batches: T[][] = [];
for (let i = 0; i < items.length; i += batchSize) {
batches.push(items.slice(i, i + batchSize));
}
return batches;
}
}
Phase 4: RunCache API Updates
4.1 Public API for Batch Operations
Expose batch operations through the main RunCache class:
// src/run-cache.ts (updates)
export class RunCache {
/**
* Set multiple cache entries in a batch operation
* @param entries Array of entries to set
* @param options Batch operation options
*/
static async setMany(
entries: BatchSetEntry[],
options: BatchOptions = {}
): Promise<BatchOperationResult<BatchSetResult>> {
await RunCache.ensureInitialized();
return RunCache.instance.setMany(entries, options);
}
/**
* Get multiple cache entries in a batch operation
* @param keys Array of keys to get
* @param options Batch operation options
*/
static async getMany(
keys: string[],
options: BatchOptions = {}
): Promise<BatchOperationResult<BatchGetResult>> {
await RunCache.ensureInitialized();
return RunCache.instance.getMany(keys, options);
}
/**
* Delete multiple cache entries in a batch operation
* @param keys Array of keys to delete
* @param options Batch operation options
*/
static async deleteMany(
keys: string[],
options: BatchOptions = {}
): Promise<BatchOperationResult<BatchDeleteResult>> {
await RunCache.ensureInitialized();
return RunCache.instance.deleteMany(keys, options);
}
/**
* Check existence of multiple keys in a batch operation
* @param keys Array of keys to check
* @param options Batch operation options
*/
static async hasMany(
keys: string[],
options: BatchOptions = {}
): Promise<BatchOperationResult<{ key: string; exists: boolean }>> {
await RunCache.ensureInitialized();
return RunCache.instance.hasMany(keys, options);
}
/**
* Begin a new transaction for atomic operations
* @param options Transaction options
*/
static async beginTransaction(options: TransactionOptions = {}): Promise<Transaction> {
await RunCache.ensureInitialized();
return RunCache.instance.beginTransaction(options);
}
/**
* Execute multiple operations atomically within a transaction
* @param operations Function containing transaction operations
* @param options Transaction options
*/
static async transaction<T>(
operations: (tx: Transaction) => Promise<T>,
options: TransactionOptions = {}
): Promise<T> {
await RunCache.ensureInitialized();
return RunCache.instance.transaction(operations, options);
}
/**
* Batch operations with convenience methods
*/
static batch = {
/**
* Set multiple key-value pairs
*/
async set(
entries: Record<string, string> | Array<[string, string]>,
options: Omit<BatchOptions, 'validateFirst'> = {}
): Promise<BatchOperationResult<BatchSetResult>> {
const batchEntries: BatchSetEntry[] = Array.isArray(entries)
? entries.map(([key, value]) => ({ key, value }))
: Object.entries(entries).map(([key, value]) => ({ key, value }));
return RunCache.setMany(batchEntries, { ...options, validateFirst: true });
},
/**
* Get multiple values by keys
*/
async get(
keys: string[],
options: BatchOptions = {}
): Promise<Record<string, string | undefined>> {
const result = await RunCache.getMany(keys, options);
const record: Record<string, string | undefined> = {};
result.results.forEach(({ key, value }) => {
record[key] = value;
});
return record;
},
/**
* Delete multiple keys
*/
async delete(
keys: string[],
options: BatchOptions = {}
): Promise<number> {
const result = await RunCache.deleteMany(keys, options);
return result.successCount;
},
};
}
Phase 5: Event System Updates
5.1 Batch Events
Add batch operation events to the event system:
// src/types/events.ts (updates)
export const EVENT = {
// ... existing events
BATCH_SET: 'batch_set',
BATCH_GET: 'batch_get',
BATCH_DELETE: 'batch_delete',
BATCH_HAS: 'batch_has',
TRANSACTION_BEGIN: 'transaction_begin',
TRANSACTION_COMMIT: 'transaction_commit',
TRANSACTION_ROLLBACK: 'transaction_rollback',
TRANSACTION_ERROR: 'transaction_error',
} as const;
export interface BatchEventParam {
operationType: 'setMany' | 'getMany' | 'deleteMany' | 'hasMany';
totalCount: number;
successCount: number;
errorCount: number;
executionTime: number;
keys?: string[];
}
export interface TransactionEventParam {
transactionId: string;
operationCount: number;
executionTime: number;
success?: boolean;
error?: Error;
}
5.2 Event Handler Registration
Add event handlers for batch and transaction operations:
// src/run-cache.ts (additional event methods)
export class RunCache {
/**
* Register callback for batch operation events
*/
static async onBatchOperation(callback: (event: BatchEventParam) => void | Promise<void>): Promise<void> {
await RunCache.ensureInitialized();
RunCache.instance.eventSystem.on('BATCH_SET', callback);
RunCache.instance.eventSystem.on('BATCH_GET', callback);
RunCache.instance.eventSystem.on('BATCH_DELETE', callback);
RunCache.instance.eventSystem.on('BATCH_HAS', callback);
}
/**
* Register callback for transaction events
*/
static async onTransaction(callback: (event: TransactionEventParam) => void | Promise<void>): Promise<void> {
await RunCache.ensureInitialized();
RunCache.instance.eventSystem.on('TRANSACTION_BEGIN', callback);
RunCache.instance.eventSystem.on('TRANSACTION_COMMIT', callback);
RunCache.instance.eventSystem.on('TRANSACTION_ROLLBACK', callback);
RunCache.instance.eventSystem.on('TRANSACTION_ERROR', callback);
}
}
Phase 6: Performance Optimizations
6.1 Storage Adapter Optimizations
Optimize storage adapters for batch operations:
// src/types/storage-adapter.ts (updates)
export interface BatchStorageAdapter extends StorageAdapter {
/**
* Save multiple namespace data in a single operation
*/
saveBatch?(data: { namespace: string; data: string }[]): Promise<void>;
/**
* Load multiple namespace data in a single operation
*/
loadBatch?(namespaces: string[]): Promise<(string | null)[]>;
/**
* Clear multiple namespaces in a single operation
*/
clearBatch?(namespaces: string[]): Promise<void>;
}
// src/storage/indexed-db-adapter.ts (batch optimizations)
export class IndexedDBAdapter implements BatchStorageAdapter {
async saveBatch(data: { namespace: string; data: string }[]): Promise<void> {
const db = await this.initDB();
return new Promise<void>((resolve, reject) => {
const transaction = db.transaction(
data.map(d => this.getNamespaceStore(d.namespace)),
'readwrite'
);
let completed = 0;
const total = data.length;
data.forEach(({ namespace, data: cacheData }) => {
const store = transaction.objectStore(this.getNamespaceStore(namespace));
const request = store.put({ id: this.storageKey, data: cacheData });
request.onsuccess = () => {
completed++;
if (completed === total) {
resolve();
}
};
request.onerror = () => reject(request.error);
});
transaction.onerror = () => reject(transaction.error);
});
}
async loadBatch(namespaces: string[]): Promise<(string | null)[]> {
const db = await this.initDB();
return new Promise<(string | null)[]>((resolve, reject) => {
const transaction = db.transaction(
namespaces.map(ns => this.getNamespaceStore(ns)),
'readonly'
);
const results: (string | null)[] = new Array(namespaces.length);
let completed = 0;
namespaces.forEach((namespace, index) => {
const store = transaction.objectStore(this.getNamespaceStore(namespace));
const request = store.get(this.storageKey);
request.onsuccess = () => {
results[index] = request.result?.data || null;
completed++;
if (completed === namespaces.length) {
resolve(results);
}
};
request.onerror = () => reject(request.error);
});
transaction.onerror = () => reject(transaction.error);
});
}
}
6.2 Memory Optimization
Implement memory-efficient batch processing:
// src/core/batch-memory-manager.ts
export class BatchMemoryManager {
private maxMemoryUsage: number;
private currentMemoryUsage: number = 0;
constructor(maxMemoryUsage: number = 100 * 1024 * 1024) { // 100MB default
this.maxMemoryUsage = maxMemoryUsage;
}
async processBatchWithMemoryLimit<T, R>(
items: T[],
processor: (batch: T[]) => Promise<R[]>,
estimateSize: (item: T) => number,
maxBatchSize: number = 1000
): Promise<R[]> {
const results: R[] = [];
let currentBatch: T[] = [];
let currentBatchSize = 0;
for (const item of items) {
const itemSize = estimateSize(item);
// Check if adding this item would exceed memory or batch size limits
if (
(currentBatchSize + itemSize > this.maxMemoryUsage) ||
(currentBatch.length >= maxBatchSize)
) {
// Process current batch
if (currentBatch.length > 0) {
const batchResults = await processor(currentBatch);
results.push(...batchResults);
// Reset batch
currentBatch = [];
currentBatchSize = 0;
}
}
currentBatch.push(item);
currentBatchSize += itemSize;
}
// Process remaining items
if (currentBatch.length > 0) {
const batchResults = await processor(currentBatch);
results.push(...batchResults);
}
return results;
}
estimateSetEntrySize(entry: BatchSetEntry): number {
let size = 0;
size += (entry.key?.length || 0) * 2; // UTF-16
size += (entry.value?.length || 0) * 2;
size += (entry.tags?.join('').length || 0) * 2;
size += (entry.dependencies?.join('').length || 0) * 2;
size += 200; // Overhead estimate
return size;
}
estimateKeySize(key: string): number {
return key.length * 2 + 50; // UTF-16 + overhead
}
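// Usage sketch (assumes the caller already holds `entries: BatchSetEntry[]` and a
// BatchOperationsManager instance named `batchManager`, both hypothetical here):
// const memory = new BatchMemoryManager(50 * 1024 * 1024);
// const results = await memory.processBatchWithMemoryLimit(
//   entries,
//   (chunk) => batchManager.setMany(chunk).then((r) => r.results),
//   (entry) => memory.estimateSetEntrySize(entry),
//   500
// );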
}
Phase 7: Testing and Benchmarks
7.1 Comprehensive Test Suite
Create extensive tests for batch operations:
// src/batch/batch-operations.test.ts
describe('Batch Operations', () => {
let cacheStore: CacheStore;
beforeEach(async () => {
cacheStore = await CacheStore.create();
});
afterEach(() => {
cacheStore.flush();
});
describe('setMany', () => {
it('should set multiple entries successfully', async () => {
const entries: BatchSetEntry[] = [
{ key: 'key1', value: 'value1' },
{ key: 'key2', value: 'value2' },
{ key: 'key3', value: 'value3' },
];
const result = await cacheStore.setMany(entries);
expect(result.successCount).toBe(3);
expect(result.errorCount).toBe(0);
expect(result.totalCount).toBe(3);
// Verify values were set
for (const entry of entries) {
const value = await cacheStore.get(entry.key);
expect(value).toBe(entry.value);
}
});
it('should handle partial failures with continueOnError', async () => {
const entries: BatchSetEntry[] = [
{ key: 'valid1', value: 'value1' },
{ key: '', value: 'invalid' }, // Empty key should fail
{ key: 'valid2', value: 'value2' },
];
const result = await cacheStore.setMany(entries, { continueOnError: true });
expect(result.successCount).toBe(2);
expect(result.errorCount).toBe(1);
expect(result.errors).toHaveLength(1);
expect(result.errors[0].key).toBe('');
});
it('should validate all entries before processing', async () => {
const entries: BatchSetEntry[] = [
{ key: 'key1', value: 'value1' },
{ key: '', value: 'invalid' },
{ key: 'key3', value: 'value3' },
];
await expect(
cacheStore.setMany(entries, { validateFirst: true, continueOnError: false })
).rejects.toThrow('Validation failed');
});
it('should respect concurrency limits', async () => {
const entries: BatchSetEntry[] = Array.from({ length: 100 }, (_, i) => ({
key: `key${i}`,
value: `value${i}`,
}));
const startTime = Date.now();
const result = await cacheStore.setMany(entries, { maxConcurrency: 5 });
const endTime = Date.now();
expect(result.successCount).toBe(100);
expect(endTime - startTime).toBeGreaterThan(0); // Should take some time due to concurrency limit
});
});
describe('getMany', () => {
beforeEach(async () => {
// Set up test data
const entries: BatchSetEntry[] = [
{ key: 'user:1', value: 'John' },
{ key: 'user:2', value: 'Jane' },
{ key: 'user:3', value: 'Bob' },
{ key: 'product:1', value: 'Widget' },
{ key: 'product:2', value: 'Gadget' },
];
await cacheStore.setMany(entries);
});
it('should get multiple entries successfully', async () => {
const keys = ['user:1', 'user:2', 'product:1'];
const result = await cacheStore.getMany(keys);
expect(result.successCount).toBe(3);
expect(result.results).toHaveLength(3);
const valueMap = new Map(result.results.map(r => [r.key, r.value]));
expect(valueMap.get('user:1')).toBe('John');
expect(valueMap.get('user:2')).toBe('Jane');
expect(valueMap.get('product:1')).toBe('Widget');
});
it('should handle non-existent keys', async () => {
const keys = ['user:1', 'nonexistent', 'user:2'];
const result = await cacheStore.getMany(keys);
expect(result.totalCount).toBe(3);
const resultMap = new Map(result.results.map(r => [r.key, r]));
expect(resultMap.get('user:1')?.found).toBe(true);
expect(resultMap.get('nonexistent')?.found).toBe(false);
expect(resultMap.get('user:2')?.found).toBe(true);
});
it('should handle wildcard patterns', async () => {
const keys = ['user:*', 'product:1'];
const result = await cacheStore.getMany(keys);
expect(result.results.length).toBeGreaterThan(2); // Should include multiple user entries
const productResult = result.results.find(r => r.key === 'product:1');
expect(productResult?.value).toBe('Widget');
});
});
describe('deleteMany', () => {
beforeEach(async () => {
const entries: BatchSetEntry[] = [
{ key: 'temp:1', value: 'data1' },
{ key: 'temp:2', value: 'data2' },
{ key: 'temp:3', value: 'data3' },
{ key: 'keep:1', value: 'keep1' },
];
await cacheStore.setMany(entries);
});
it('should delete multiple entries successfully', async () => {
const keys = ['temp:1', 'temp:2', 'temp:3'];
const result = await cacheStore.deleteMany(keys);
expect(result.successCount).toBe(3);
// Verify deletions
for (const key of keys) {
const exists = await cacheStore.has(key);
expect(exists).toBe(false);
}
// Verify other keys remain
const keepExists = await cacheStore.has('keep:1');
expect(keepExists).toBe(true);
});
it('should handle non-existent keys gracefully', async () => {
const keys = ['temp:1', 'nonexistent', 'temp:2'];
const result = await cacheStore.deleteMany(keys);
expect(result.totalCount).toBe(3);
const resultMap = new Map(result.results.map(r => [r.key, r]));
expect(resultMap.get('temp:1')?.deleted).toBe(true);
expect(resultMap.get('nonexistent')?.deleted).toBe(false);
expect(resultMap.get('temp:2')?.deleted).toBe(true);
});
});
describe('hasMany', () => {
beforeEach(async () => {
const entries: BatchSetEntry[] = [
{ key: 'exists:1', value: 'data1' },
{ key: 'exists:2', value: 'data2' },
];
await cacheStore.setMany(entries);
});
it('should check existence of multiple keys', async () => {
const keys = ['exists:1', 'nonexistent', 'exists:2'];
const result = await cacheStore.hasMany(keys);
expect(result.totalCount).toBe(3);
const resultMap = new Map(result.results.map(r => [r.key, r.exists]));
expect(resultMap.get('exists:1')).toBe(true);
expect(resultMap.get('nonexistent')).toBe(false);
expect(resultMap.get('exists:2')).toBe(true);
});
});
});
// src/batch/transactions.test.ts
describe('Transactions', () => {
let cacheStore: CacheStore;
beforeEach(async () => {
cacheStore = await CacheStore.create();
});
afterEach(() => {
cacheStore.flush();
});
describe('Basic Transaction Operations', () => {
it('should commit transaction successfully', async () => {
const result = await cacheStore.transaction(async (tx) => {
tx.set('key1', 'value1');
tx.set('key2', 'value2');
return 'success';
});
expect(result).toBe('success');
// Verify changes were applied
expect(await cacheStore.get('key1')).toBe('value1');
expect(await cacheStore.get('key2')).toBe('value2');
});
it('should rollback transaction on error', async () => {
// Set initial value
await cacheStore.set({ key: 'existing', value: 'original' });
await expect(
cacheStore.transaction(async (tx) => {
tx.set('existing', 'modified');
tx.set('new', 'value');
throw new Error('Intentional error');
})
).rejects.toThrow('Intentional error');
// Verify rollback
expect(await cacheStore.get('existing')).toBe('original');
expect(await cacheStore.get('new')).toBeUndefined();
});
it('should handle transaction isolation', async () => {
const tx = await cacheStore.beginTransaction();
tx.set('isolated', 'tx-value');
// Value should not be visible outside transaction yet
expect(await cacheStore.get('isolated')).toBeUndefined();
// But should be visible within transaction
expect(await tx.get('isolated')).toBe('tx-value');
await tx.commit();
// Now should be visible globally
expect(await cacheStore.get('isolated')).toBe('tx-value');
});
it('should support batch operations in transactions', async () => {
await cacheStore.transaction(async (tx) => {
const entries: BatchSetEntry[] = [
{ key: 'batch:1', value: 'value1' },
{ key: 'batch:2', value: 'value2' },
{ key: 'batch:3', value: 'value3' },
];
tx.setMany(entries);
tx.deleteMany(['batch:2']);
return 'completed';
});
expect(await cacheStore.get('batch:1')).toBe('value1');
expect(await cacheStore.get('batch:2')).toBeUndefined();
expect(await cacheStore.get('batch:3')).toBe('value3');
});
});
describe('Transaction Timeout', () => {
it('should timeout long-running transactions', async () => {
const tx = await cacheStore.beginTransaction({ timeout: 100 });
tx.set('key', 'value');
// Wait longer than timeout
await new Promise(resolve => setTimeout(resolve, 150));
// Transaction should be automatically rolled back
expect(tx.getState().state).toBe('rolled-back');
});
});
});
7.2 Performance Benchmarks
Create benchmarks to measure batch operation performance:
// src/batch/performance.test.ts
describe('Batch Performance', () => {
let cacheStore: CacheStore;
beforeEach(async () => {
cacheStore = await CacheStore.create();
});
afterEach(() => {
cacheStore.flush();
});
describe('Performance Comparisons', () => {
it('should outperform individual operations for large datasets', async () => {
const count = 1000;
const entries: BatchSetEntry[] = Array.from({ length: count }, (_, i) => ({
key: `perf:${i}`,
value: `value${i}`,
}));
// Measure individual operations
const individualStart = Date.now();
for (const entry of entries) {
await cacheStore.set(entry);
}
const individualTime = Date.now() - individualStart;
// Clear cache for batch test
cacheStore.flush();
// Measure batch operations
const batchStart = Date.now();
await cacheStore.setMany(entries);
const batchTime = Date.now() - batchStart;
console.log(`Individual: ${individualTime}ms, Batch: ${batchTime}ms`);
console.log(`Improvement: ${(individualTime / batchTime).toFixed(2)}x faster`);
// Batch should be significantly faster
expect(batchTime).toBeLessThan(individualTime * 0.5); // At least 2x faster
});
it('should handle memory efficiently with large batches', async () => {
const largeValue = 'x'.repeat(10000); // 10KB value
const count = 1000; // 10MB total
const entries: BatchSetEntry[] = Array.from({ length: count }, (_, i) => ({
key: `large:${i}`,
value: largeValue,
}));
const memoryBefore = process.memoryUsage().heapUsed;
await cacheStore.setMany(entries, { maxConcurrency: 50 });
const memoryAfter = process.memoryUsage().heapUsed;
const memoryIncrease = memoryAfter - memoryBefore;
console.log(`Memory increase: ${(memoryIncrease / 1024 / 1024).toFixed(2)}MB`);
// Memory increase should be reasonable (not much more than the data size)
expect(memoryIncrease).toBeLessThan(15 * 1024 * 1024); // Less than 15MB
});
});
describe('Concurrency Limits', () => {
it('should respect concurrency limits without degrading performance', async () => {
const count = 500;
const entries: BatchSetEntry[] = Array.from({ length: count }, (_, i) => ({
key: `concurrent:${i}`,
value: `value${i}`,
}));
// Test different concurrency levels
const concurrencyLevels = [1, 5, 10, 20, 50];
const results: { concurrency: number; time: number }[] = [];
for (const concurrency of concurrencyLevels) {
cacheStore.flush();
const start = Date.now();
await cacheStore.setMany(entries, { maxConcurrency: concurrency });
const time = Date.now() - start;
results.push({ concurrency, time });
console.log(`Concurrency ${concurrency}: ${time}ms`);
}
// Higher concurrency should generally be faster (up to a point)
const lowConcurrencyTime = results.find(r => r.concurrency === 1)?.time || 0;
const highConcurrencyTime = results.find(r => r.concurrency === 20)?.time || 0;
expect(highConcurrencyTime).toBeLessThan(lowConcurrencyTime);
});
});
});
Phase 8: Documentation and Examples
8.1 API Documentation
Comprehensive documentation for batch operations:
## Batch Operations
RunCache supports efficient batch operations to reduce overhead when working with multiple cache entries.
### Setting Multiple Entries
import { RunCache } from 'run-cache';
// Using setMany with array of entries
const entries = [
{ key: 'user:1', value: 'John Doe' },
{ key: 'user:2', value: 'Jane Smith' },
{ key: 'user:3', value: 'Bob Wilson' }
];
const result = await RunCache.setMany(entries);
console.log(`Set ${result.successCount}/${result.totalCount} entries`);
// Using convenience batch.set method
const userMap = {
'user:4': 'Alice Brown',
'user:5': 'Charlie Davis'
};
await RunCache.batch.set(userMap);
Getting Multiple Entries
// Get multiple specific keys
const keys = ['user:1', 'user:2', 'user:3'];
const result = await RunCache.getMany(keys);
// Convert to object for easier access
const users = await RunCache.batch.get(keys);
console.log(users['user:1']); // 'John Doe'
// Handle results with error checking
result.results.forEach(({ key, value, found, error }) => {
if (found) {
console.log(`${key}: ${value}`);
} else if (error) {
console.error(`Error getting ${key}:`, error);
} else {
console.log(`${key}: not found`);
}
});
Deleting Multiple Entries
// Delete specific keys
const keysToDelete = ['temp:1', 'temp:2', 'temp:3'];
const result = await RunCache.deleteMany(keysToDelete);
// Delete using convenience method (returns count)
const deletedCount = await RunCache.batch.delete(['old:*']);
console.log(`Deleted ${deletedCount} entries`);
Batch Options
const options = {
continueOnError: true, // Continue processing if individual operations fail
maxConcurrency: 10, // Maximum concurrent operations
validateFirst: true, // Validate all entries before processing
};
await RunCache.setMany(entries, options);
Advanced Batch Operations
// Complex batch operation with different configurations
const complexEntries = [
{
key: 'user:session:123',
value: 'active',
ttl: 30 * 60 * 1000, // 30 minutes
tags: ['session', 'user:123']
},
{
key: 'user:profile:123',
value: JSON.stringify({ name: 'John', email: 'john@example.com' }),
tags: ['profile', 'user:123'],
dependencies: ['user:session:123']
}
];
const result = await RunCache.setMany(complexEntries, {
continueOnError: false,
maxConcurrency: 5
});
Transactions
Transactions provide atomic operations across multiple cache keys.
Basic Transaction Usage
// Using the transaction helper
const result = await RunCache.transaction(async (tx) => {
tx.set('counter', '10');
tx.set('status', 'active');
// Read within transaction
const currentCounter = await tx.get('counter');
tx.set('double-counter', String(parseInt(currentCounter || '0') * 2));
return 'success';
});
console.log(result); // 'success' if all operations succeeded
Manual Transaction Control
const tx = await RunCache.beginTransaction({
timeout: 5000, // 5 second timeout
retries: 3 // Retry up to 3 times on failure
});
try {
tx.set('key1', 'value1');
tx.set('key2', 'value2');
// Batch operations within transaction
tx.setMany([
{ key: 'batch:1', value: 'val1' },
{ key: 'batch:2', value: 'val2' }
]);
const result = await tx.commit();
console.log('Transaction committed:', result.success);
} catch (error) {
await tx.rollback();
console.error('Transaction failed:', error);
}
Transaction Error Handling
try {
await RunCache.transaction(async (tx) => {
tx.set('important', 'data');
// This will cause rollback
throw new Error('Something went wrong');
});
} catch (error) {
// All changes are automatically rolled back
console.log('Transaction rolled back');
}
// Verify rollback
const value = await RunCache.get('important');
console.log(value); // undefined
Performance Best Practices
Batch Size Optimization
// For large datasets, process in chunks
const largeDataset = Array.from({ length: 10000 }, (_, i) => ({
key: `data:${i}`,
value: `value${i}`
}));
// Process in batches of 1000
const batchSize = 1000;
for (let i = 0; i < largeDataset.length; i += batchSize) {
const batch = largeDataset.slice(i, i + batchSize);
await RunCache.setMany(batch, { maxConcurrency: 20 });
}
Memory Management
// Use memory-efficient processing for large values
const largeEntries = [
/* ... large data ... */
];
// Process with memory limits
await RunCache.setMany(largeEntries, {
maxConcurrency: 5, // Lower concurrency for large data
continueOnError: true
});
#### 8.2 Migration Guide
Guide for adopting batch operations:
## Migrating to Batch Operations
### Before (Individual Operations)
// Inefficient: Multiple individual operations
const users = ['user:1', 'user:2', 'user:3'];
const userData = [];
for (const userId of users) {
const data = await RunCache.get(userId);
if (data) {
userData.push(data);
}
}
// Setting multiple values
await RunCache.set({ key: 'key1', value: 'value1' });
await RunCache.set({ key: 'key2', value: 'value2' });
await RunCache.set({ key: 'key3', value: 'value3' });
After (Batch Operations)
// Efficient: Single batch operation
const users = ['user:1', 'user:2', 'user:3'];
const result = await RunCache.getMany(users);
const userData = result.results
.filter(r => r.found)
.map(r => r.value);
// Setting multiple values
const entries = [
{ key: 'key1', value: 'value1' },
{ key: 'key2', value: 'value2' },
{ key: 'key3', value: 'value3' }
];
await RunCache.setMany(entries);
Performance Benefits
- Reduced Overhead: Single operation vs multiple operations
- Better Error Handling: Centralized error reporting
- Concurrency Control: Configurable parallelism
- Memory Efficiency: Optimized memory usage for large datasets
## File Structure
src/
├── core/
│ ├── batch-operations-manager.ts # Batch operations coordination
│ ├── transaction-manager.ts # Transaction management
│ ├── batch-memory-manager.ts # Memory optimization
│ └── cache-store.ts # Updated with batch support
├── types/
│ ├── batch-operations.ts # Batch operation types
│ ├── transactions.ts # Transaction types
│ └── events.ts # Updated with batch events
├── batch/
│ ├── batch-operations.test.ts # Core batch tests
│ ├── transactions.test.ts # Transaction tests
│ ├── performance.test.ts # Performance benchmarks
│ └── integration.test.ts # Integration tests
├── storage/
│ └── *.ts # Updated adapters with batch support
└── run-cache.ts # Updated main API
## Timeline Estimation
### Phase 1 (Week 1): Core Types and Interfaces
- Define batch operation types and interfaces
- Create transaction type definitions
- Update event system types
- Design API contracts
### Phase 2 (Week 2): Batch Operations Manager
- Implement `BatchOperationsManager` class
- Core batch logic for `setMany`, `getMany`, `deleteMany`
- Concurrency control and error handling
- Memory management integration
### Phase 3 (Week 3): Transaction System
- Implement `TransactionManager` and `CacheTransaction`
- Atomic operations with rollback capability
- Transaction isolation and state management
- Timeout and retry mechanisms
### Phase 4 (Week 4): CacheStore Integration
- Update `CacheStore` with batch and transaction support
- Integrate batch manager and transaction manager
- Update existing operations to work with batch context
- Performance optimizations
### Phase 5 (Week 5): API and Storage Updates
- Update RunCache public API
- Implement convenience methods (`RunCache.batch`)
- Update storage adapters for batch operations
- Event system integration
### Phase 6 (Week 6): Testing and Documentation
- Comprehensive test suite
- Performance benchmarks
- Integration tests with existing features
- Complete API documentation and examples
## Success Metrics
### Performance Targets
1. **Batch Operations**: 5-10x faster than individual operations for 100+ items
2. **Memory Efficiency**: <20% overhead for batch processing vs individual operations
3. **Concurrency**: Configurable parallelism with optimal defaults
4. **Error Rate**: <1% operation failures under normal conditions
### Functionality Goals
1. **Complete API**: All core operations support batch variants
2. **Transaction ACID**: Full ACID compliance for transaction operations
3. **Backward Compatibility**: Zero breaking changes to existing API
4. **Error Handling**: Comprehensive error reporting and recovery
5. **Integration**: Seamless work with all existing features
### Quality Assurance
1. **Test Coverage**: 100% coverage for batch and transaction features
2. **Performance**: Benchmarks showing clear improvements
3. **Documentation**: Complete API docs with examples
4. **Cross-Platform**: Works consistently across all supported environments
## Risk Mitigation
### Performance Risks
1. **Memory Usage**: Implement streaming for very large batches
2. **CPU Overhead**: Optimize batch processing algorithms
3. **Storage Bottlenecks**: Implement efficient storage adapter optimizations
4. **Concurrency Issues**: Careful lock management and deadlock prevention
### Implementation Risks
1. **Transaction Complexity**: Start with simple implementation, iterate
2. **API Confusion**: Clear separation between batch and individual operations
3. **Error Handling**: Comprehensive error scenarios and recovery mechanisms
4. **Integration Issues**: Thorough testing with existing middleware and features
### Migration Risks
1. **Breaking Changes**: Maintain full backward compatibility
2. **Performance Regression**: Ensure individual operations remain fast
3. **Learning Curve**: Provide clear documentation and examples
4. **Storage Compatibility**: Ensure all adapters work with new features
## Advanced Features (Future Enhancements)
### Phase 7+ (Future Iterations)
#### 1. Advanced Transaction Features
// Nested transactions
await RunCache.transaction(async (tx1) => {
tx1.set('key1', 'value1');
await tx1.transaction(async (tx2) => {
tx2.set('key2', 'value2');
// Nested transaction semantics
});
});
// Transaction savepoints
await RunCache.transaction(async (tx) => {
tx.set('key1', 'value1');
const savepoint = await tx.createSavepoint();
tx.set('key2', 'value2');
// Rollback to savepoint
await tx.rollbackToSavepoint(savepoint);
tx.set('key3', 'value3');
await tx.commit(); // Only key1 and key3 are committed
});
2. Streaming Batch Operations
// Handle very large datasets with streaming
const stream = RunCache.createBatchStream({
operation: 'set',
batchSize: 1000,
maxConcurrency: 10
});
// Process millions of entries efficiently
for (let i = 0; i < 1000000; i++) {
await stream.write({ key: `item:${i}`, value: `data${i}` });
}
await stream.end();
3. Conditional Batch Operations
// Conditional operations within batches
await RunCache.setMany([
{ key: 'counter', value: '1', condition: 'if-not-exists' },
{ key: 'timestamp', value: Date.now().toString(), condition: 'always' },
{ key: 'status', value: 'active', condition: 'if-equals:inactive' }
]);
4. Batch Analytics and Monitoring
// Get detailed batch operation statistics
const stats = await RunCache.getBatchStats();
console.log({
totalBatchOperations: stats.totalOperations,
averageBatchSize: stats.averageBatchSize,
performanceImprovement: stats.performanceVsIndividual,
errorRate: stats.errorRate,
popularOperations: stats.operationBreakdown
});
// Real-time batch operation monitoring
RunCache.onBatchOperation((event) => {
if (event.executionTime > 1000) {
console.warn(`Slow batch operation: ${event.operationType} took ${event.executionTime}ms`);
}
});
5. Cross-Namespace Batch Operations
// Batch operations across multiple namespaces
await RunCache.batch.setAcrossNamespaces([
{ namespace: 'users', key: 'user:1', value: 'John' },
{ namespace: 'sessions', key: 'session:1', value: 'active' },
{ namespace: 'analytics', key: 'login:1', value: Date.now().toString() }
]);
// Atomic operations across namespaces
await RunCache.crossNamespaceTransaction(async (ctx) => {
const userTx = await ctx.getNamespaceTransaction('users');
const sessionTx = await ctx.getNamespaceTransaction('sessions');
userTx.set('user:1', 'John');
sessionTx.set('session:1', 'active');
// Both succeed or both fail
});
6. Optimistic Concurrency Control
// Version-based optimistic locking
const result = await RunCache.setMany([
{ key: 'document:1', value: 'content', version: 5 },
{ key: 'document:2', value: 'content', version: 3 }
], {
concurrencyControl: 'optimistic'
});
// Handle version conflicts
result.results.forEach(r => {
if (r.error?.type === 'version-conflict') {
console.log(`Version conflict for ${r.key}`);
}
});
7. Batch Operations with Dependencies
// Batch operations with execution dependencies
await RunCache.batchWithDependencies([
{
id: 'op1',
operation: { type: 'set', key: 'base', value: 'data' }
},
{
id: 'op2',
operation: { type: 'set', key: 'derived', value: 'processed' },
dependsOn: ['op1']
},
{
id: 'op3',
operation: { type: 'set', key: 'final', value: 'result' },
dependsOn: ['op1', 'op2']
}
]);
Integration Examples
With Existing Features
1. Batch + Middleware
// Middleware applies to all batch operations
RunCache.use(async (value, context, next) => {
if (context.operation === 'setMany') {
// Transform all values in batch
return next(value.toUpperCase());
}
return next(value);
});
await RunCache.setMany([
{ key: 'key1', value: 'hello' },
{ key: 'key2', value: 'world' }
]);
// Results in 'HELLO' and 'WORLD'
2. Batch + Tags/Dependencies
// Batch operations with complex relationships
await RunCache.setMany([
{
key: 'user:profile:123',
value: JSON.stringify({ name: 'John' }),
tags: ['user:123', 'profile']
},
{
key: 'user:settings:123',
value: JSON.stringify({ theme: 'dark' }),
tags: ['user:123', 'settings'],
dependencies: ['user:profile:123']
},
{
key: 'user:cache:123',
value: JSON.stringify({ lastLogin: Date.now() }),
tags: ['user:123', 'cache'],
dependencies: ['user:profile:123', 'user:settings:123']
}
]);
// Invalidate all user data at once
await RunCache.invalidateByTag('user:123');
3. Batch + Compression
// Large batch operations with compression
const largeEntries = Array.from({ length: 1000 }, (_, i) => ({
key: `large:${i}`,
value: 'x'.repeat(10000) // 10KB per entry
}));
await RunCache.setMany(largeEntries, {
maxConcurrency: 10
});
// Compression automatically applied to large values
4. Batch + Events
// Monitor batch operations
RunCache.onBatchOperation((event) => {
console.log(`Batch ${event.operationType}: ${event.successCount}/${event.totalCount} in ${event.executionTime}ms`);
if (event.errorCount > 0) {
console.warn(`${event.errorCount} errors in batch operation`);
}
});
// Monitor transactions
RunCache.onTransaction((event) => {
if (event.success) {
console.log(`Transaction ${event.transactionId} committed with ${event.operationCount} operations`);
} else {
console.error(`Transaction ${event.transactionId} failed:`, event.error);
}
});
Conclusion
This implementation plan provides RunCache with powerful batch operations and transaction capabilities while maintaining its core simplicity and performance characteristics. The phased approach ensures:
- Backward Compatibility: Existing code continues to work unchanged
- Performance: Significant improvements for bulk operations
- Flexibility: Multiple APIs to suit different use cases
- Reliability: Robust error handling and transaction support
- Scalability: Efficient memory and concurrency management
The batch operations feature will make RunCache significantly more suitable for applications that need to process large amounts of cached data efficiently, while the transaction support enables reliable atomic operations across multiple cache entries.
Key benefits for users:
- 5-10x performance improvement for bulk operations
- Atomic operations with full rollback support
- Memory efficient processing of large datasets
- Flexible APIs from simple convenience methods to full transaction control
- Zero breaking changes to existing applications
This implementation positions RunCache as a comprehensive caching solution suitable for both simple use cases and complex, high-performance applications.