Caching is a double-edged sword. When implemented correctly, it transforms sluggish applications into responsive systems. When implemented incorrectly—or when correctness erodes over time—it introduces some of the most insidious bugs in software engineering: stale data served to users, inconsistent state across services, and phantom performance that evaporates under changed conditions.
The fundamental challenge is this: caching inherently adds a layer of indirection between what users request and what they receive. Without rigorous testing, you cannot guarantee that this layer behaves correctly under all circumstances—cache hits, cache misses, evictions, invalidations, concurrent access, and failure scenarios.
This page establishes a comprehensive framework for testing cached behavior at every level of your system, from unit tests that verify individual cache operations to integration tests that validate end-to-end caching semantics.
By the end of this page, you will understand how to design testable cache implementations, write unit tests that verify cache correctness, create integration tests that validate caching semantics, and apply test patterns that catch cache-related bugs before they reach production.
Testing cache behavior differs fundamentally from testing regular business logic. Traditional tests verify that given input A, you receive output B. Cache tests must verify a more complex property: regardless of cache state, the observable behavior must match the specification.
This distinction creates unique testing challenges:
The Correctness Contract:
A cache must satisfy two fundamental properties:
Transparency: From the client's perspective, a cached system must behave identically to an uncached system (except for performance). If the underlying data source returns X, the cache must eventually return X.
Consistency: The cache must not return data that contradicts the current state of the source-of-truth, within the bounds of the chosen consistency model (strong, eventual, or read-your-writes).
Every cache test should ultimately verify one or both of these properties. A test that only checks 'the cache returns something' without verifying correctness provides false confidence.
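To make these properties concrete, a transparency check can drive the same read through the cached path and directly against the source of truth, then assert that the two agree once invalidation has run. Here is a minimal sketch, assuming the ProductService, InMemoryCache, and mock collaborators introduced later on this page:

```typescript
import { describe, it, expect } from 'vitest';
import { InMemoryCache } from './InMemoryCache';
import { ProductService } from './ProductService';
import { MockProductDatabase } from './mocks/MockProductDatabase';
import { MockMetricsCollector } from './mocks/MockMetricsCollector';

describe('Transparency property', () => {
  it('returns what the source of truth returns once invalidation has run', async () => {
    const cache = new InMemoryCache<Product>();
    const database = new MockProductDatabase();
    const service = new ProductService(cache, database, new MockMetricsCollector());

    // Seed the source of truth and warm the cache
    database.setProduct('prod-1', { id: 'prod-1', name: 'Widget', price: 29.99 });
    await service.getProduct('prod-1');

    // The source of truth changes and the entry is invalidated, as a write path would do
    database.setProduct('prod-1', { id: 'prod-1', name: 'Widget v2', price: 34.99 });
    await cache.delete('product:prod-1');

    // Transparency: the cached system now behaves like an uncached one
    const viaCache = await service.getProduct('prod-1');
    const viaSource = await database.find('prod-1');
    expect(viaCache).toEqual(viaSource);
  });
});
```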
Cache bugs rarely crash applications. Instead, they silently serve stale or incorrect data. Users see outdated prices, missing inventory, old profile pictures—and often don't report it because they assume the data is correct. Rigorous testing is your only defense against these silent failures.
Before discussing test patterns, we must address a foundational truth: poorly designed cache implementations are difficult or impossible to test properly. Testability must be designed into the cache layer from the beginning.
The key principle is separation of concerns through abstraction. The cache mechanism should be decoupled from the business logic that uses it, allowing both to be tested independently and together.
```typescript
// Bad: Cache logic embedded directly in service
class ProductService {
  private redisClient: RedisClient;

  async getProduct(id: string): Promise<Product> {
    // Cache check hardcoded - impossible to test without Redis
    const cached = await this.redisClient.get(`product:${id}`);
    if (cached) {
      return JSON.parse(cached);
    }
    const product = await this.database.find(id);
    await this.redisClient.setex(`product:${id}`, 3600, JSON.stringify(product));
    return product;
  }
}

// Good: Cache abstraction enables testing
interface Cache<T> {
  get(key: string): Promise<T | null>;
  set(key: string, value: T, ttlSeconds?: number): Promise<void>;
  delete(key: string): Promise<void>;
  clear(): Promise<void>;
}

// In-memory implementation for testing
class InMemoryCache<T> implements Cache<T> {
  private store = new Map<string, { value: T; expiresAt: number | null }>();

  // Track operations for test assertions
  public operations: { type: string; key: string; timestamp: number }[] = [];

  async get(key: string): Promise<T | null> {
    this.operations.push({ type: 'get', key, timestamp: Date.now() });
    const entry = this.store.get(key);
    if (!entry) return null;
    if (entry.expiresAt && Date.now() > entry.expiresAt) {
      this.store.delete(key);
      return null;
    }
    return entry.value;
  }

  async set(key: string, value: T, ttlSeconds?: number): Promise<void> {
    this.operations.push({ type: 'set', key, timestamp: Date.now() });
    const expiresAt = ttlSeconds ? Date.now() + (ttlSeconds * 1000) : null;
    this.store.set(key, { value, expiresAt });
  }

  async delete(key: string): Promise<void> {
    this.operations.push({ type: 'delete', key, timestamp: Date.now() });
    this.store.delete(key);
  }

  async clear(): Promise<void> {
    this.operations.push({ type: 'clear', key: '*', timestamp: Date.now() });
    this.store.clear();
  }

  // Test helper: forcibly expire an entry
  expireEntry(key: string): void {
    const entry = this.store.get(key);
    if (entry) {
      entry.expiresAt = Date.now() - 1;
    }
  }

  // Test helper: check if key exists (bypassing expiry check)
  hasEntry(key: string): boolean {
    return this.store.has(key);
  }
}

// Redis implementation for production
class RedisCache<T> implements Cache<T> {
  constructor(private client: RedisClient) {}

  async get(key: string): Promise<T | null> {
    const data = await this.client.get(key);
    return data ? JSON.parse(data) : null;
  }

  async set(key: string, value: T, ttlSeconds?: number): Promise<void> {
    const serialized = JSON.stringify(value);
    if (ttlSeconds) {
      await this.client.setex(key, ttlSeconds, serialized);
    } else {
      await this.client.set(key, serialized);
    }
  }

  async delete(key: string): Promise<void> {
    await this.client.del(key);
  }

  async clear(): Promise<void> {
    await this.client.flushdb();
  }
}

// Service now accepts cache as dependency
class ProductService {
  constructor(
    private cache: Cache<Product>,
    private database: ProductDatabase,
    private metrics: MetricsCollector
  ) {}

  async getProduct(id: string): Promise<Product> {
    const cacheKey = `product:${id}`;
    const cached = await this.cache.get(cacheKey);
    if (cached) {
      this.metrics.recordCacheHit('product');
      return cached;
    }

    this.metrics.recordCacheMiss('product');
    const product = await this.database.find(id);
    await this.cache.set(cacheKey, product, 3600);
    return product;
  }
}
```

Notice how the InMemoryCache includes test helper methods like 'expireEntry()' and 'hasEntry()'. These methods expose internal state for testing purposes. In production caches, you typically can't inspect or manipulate internal state—but in test caches, these capabilities are invaluable for verifying correct behavior.
Unit tests for cache behavior should verify the fundamental operations in isolation. Each test should be independent, repeatable, and fast. The key categories of unit tests for caching are:
1. Cache Miss Behavior — Verify that when data isn't cached, the system correctly fetches from the source and populates the cache.
2. Cache Hit Behavior — Verify that when data is cached, the system returns it without accessing the source.
3. Cache Invalidation — Verify that invalidation operations correctly remove or update cached data.
4. TTL Expiration — Verify that expired entries are treated as cache misses.
```typescript
import { describe, it, expect, beforeEach, vi } from 'vitest';
import { InMemoryCache } from './InMemoryCache';
import { ProductService } from './ProductService';
import { MockProductDatabase } from './mocks/MockProductDatabase';
import { MockMetricsCollector } from './mocks/MockMetricsCollector';

describe('ProductService Cache Behavior', () => {
  let cache: InMemoryCache<Product>;
  let database: MockProductDatabase;
  let metrics: MockMetricsCollector;
  let service: ProductService;

  beforeEach(() => {
    // Fresh instances for each test - no state pollution
    cache = new InMemoryCache<Product>();
    database = new MockProductDatabase();
    metrics = new MockMetricsCollector();
    service = new ProductService(cache, database, metrics);
  });

  describe('Cache Miss Behavior', () => {
    it('should fetch from database on cache miss', async () => {
      const expectedProduct = { id: 'prod-1', name: 'Widget', price: 29.99 };
      database.setProduct('prod-1', expectedProduct);

      const result = await service.getProduct('prod-1');

      expect(result).toEqual(expectedProduct);
      expect(database.queryCount).toBe(1);
    });

    it('should populate cache after database fetch', async () => {
      const expectedProduct = { id: 'prod-1', name: 'Widget', price: 29.99 };
      database.setProduct('prod-1', expectedProduct);

      await service.getProduct('prod-1');

      // Verify cache was populated
      const cached = await cache.get('product:prod-1');
      expect(cached).toEqual(expectedProduct);
    });

    it('should record cache miss metric', async () => {
      database.setProduct('prod-1', { id: 'prod-1', name: 'Widget', price: 29.99 });

      await service.getProduct('prod-1');

      expect(metrics.getCacheMisses('product')).toBe(1);
      expect(metrics.getCacheHits('product')).toBe(0);
    });
  });

  describe('Cache Hit Behavior', () => {
    it('should return cached data without database query', async () => {
      const cachedProduct = { id: 'prod-1', name: 'Cached Widget', price: 19.99 };
      await cache.set('product:prod-1', cachedProduct, 3600);

      const result = await service.getProduct('prod-1');

      expect(result).toEqual(cachedProduct);
      expect(database.queryCount).toBe(0); // No database hit
    });

    it('should record cache hit metric', async () => {
      await cache.set('product:prod-1', { id: 'prod-1', name: 'Widget', price: 29.99 }, 3600);

      await service.getProduct('prod-1');

      expect(metrics.getCacheHits('product')).toBe(1);
      expect(metrics.getCacheMisses('product')).toBe(0);
    });

    it('should serve cache hit even if database has different data', async () => {
      // This verifies cache takes precedence
      const cachedProduct = { id: 'prod-1', name: 'Cached Version', price: 19.99 };
      const databaseProduct = { id: 'prod-1', name: 'Database Version', price: 29.99 };
      await cache.set('product:prod-1', cachedProduct, 3600);
      database.setProduct('prod-1', databaseProduct);

      const result = await service.getProduct('prod-1');

      expect(result.name).toBe('Cached Version');
    });
  });

  describe('TTL Expiration', () => {
    it('should treat expired entry as cache miss', async () => {
      const product = { id: 'prod-1', name: 'Widget', price: 29.99 };
      await cache.set('product:prod-1', product, 3600);
      database.setProduct('prod-1', product);

      // Force expiration using test helper
      cache.expireEntry('product:prod-1');

      await service.getProduct('prod-1');

      // Should have queried database due to expiration
      expect(database.queryCount).toBe(1);
      expect(metrics.getCacheMisses('product')).toBe(1);
    });

    it('should refresh cache after TTL expiration', async () => {
      const oldProduct = { id: 'prod-1', name: 'Old Widget', price: 19.99 };
      const newProduct = { id: 'prod-1', name: 'New Widget', price: 29.99 };
      await cache.set('product:prod-1', oldProduct, 3600);
      database.setProduct('prod-1', newProduct);
      cache.expireEntry('product:prod-1');

      const result = await service.getProduct('prod-1');

      expect(result.name).toBe('New Widget');
    });
  });

  describe('Cache Operation Tracking', () => {
    it('should perform get then set on cache miss', async () => {
      database.setProduct('prod-1', { id: 'prod-1', name: 'Widget', price: 29.99 });

      await service.getProduct('prod-1');

      const ops = cache.operations;
      expect(ops.length).toBe(2);
      expect(ops[0].type).toBe('get');
      expect(ops[1].type).toBe('set');
    });

    it('should perform only get on cache hit', async () => {
      await cache.set('product:prod-1', { id: 'prod-1', name: 'Widget', price: 29.99 }, 3600);
      cache.operations.length = 0; // Clear setup operations

      await service.getProduct('prod-1');

      const ops = cache.operations;
      expect(ops.length).toBe(1);
      expect(ops[0].type).toBe('get');
    });
  });
});
```

Each test creates fresh cache, database, and metrics instances in beforeEach/setUp. This ensures test isolation—no test can affect another through shared cache state. Cache pollution is a common source of flaky tests in systems that share cache instances across tests.
Cache invalidation is famously one of the two hard problems in computer science. Testing invalidation logic is equally challenging because you must verify both what gets invalidated and what remains cached.
Invalidation tests should cover four scenarios: direct invalidation of a single entry, cascading invalidation of related entries (such as listings that contain the updated item), pattern-based invalidation across a key namespace, and version-based invalidation.
```typescript
describe('Cache Invalidation', () => {
  let cache: InMemoryCache<any>;
  let productService: ProductService;
  let catalogService: CatalogService;

  beforeEach(() => {
    cache = new InMemoryCache<any>();
    // Services share the same cache instance
    productService = new ProductService(cache, mockDatabase, mockMetrics);
    catalogService = new CatalogService(cache, mockDatabase);
  });

  describe('Direct Invalidation', () => {
    it('should invalidate cache entry when product is updated', async () => {
      // Arrange: Cache a product
      const originalProduct = { id: 'prod-1', name: 'Original', price: 29.99 };
      await cache.set('product:prod-1', originalProduct, 3600);

      // Act: Update the product (which should invalidate cache)
      const updatedProduct = { id: 'prod-1', name: 'Updated', price: 39.99 };
      mockDatabase.setProduct('prod-1', updatedProduct);
      await productService.updateProduct('prod-1', updatedProduct);

      // Assert: Cache entry should be invalidated
      const cached = await cache.get('product:prod-1');
      expect(cached).toBeNull();
    });

    it('should serve fresh data after invalidation', async () => {
      // Arrange
      await cache.set('product:prod-1', { id: 'prod-1', name: 'Cached', price: 29.99 }, 3600);
      const freshProduct = { id: 'prod-1', name: 'Fresh from DB', price: 49.99 };
      mockDatabase.setProduct('prod-1', freshProduct);

      // Act: Invalidate then fetch
      await productService.invalidateProductCache('prod-1');
      const result = await productService.getProduct('prod-1');

      // Assert
      expect(result.name).toBe('Fresh from DB');
      expect(mockDatabase.queryCount).toBe(1);
    });
  });

  describe('Cascading Invalidation', () => {
    it('should invalidate product listings when product is updated', async () => {
      // Arrange: Cache both product and category listing containing it
      await cache.set('product:prod-1', { id: 'prod-1', name: 'Widget' }, 3600);
      await cache.set('category:electronics:products', [
        { id: 'prod-1', name: 'Widget' },
        { id: 'prod-2', name: 'Gadget' }
      ], 3600);

      // Act: Update product
      await productService.updateProduct('prod-1', { id: 'prod-1', name: 'Super Widget' });

      // Assert: Both caches should be invalidated
      expect(await cache.get('product:prod-1')).toBeNull();
      expect(await cache.get('category:electronics:products')).toBeNull();
    });

    it('should invalidate all product caches when category is deleted', async () => {
      // Arrange: Cache multiple products in a category
      await cache.set('product:prod-1', { id: 'prod-1', categoryId: 'cat-1' }, 3600);
      await cache.set('product:prod-2', { id: 'prod-2', categoryId: 'cat-1' }, 3600);
      await cache.set('product:prod-3', { id: 'prod-3', categoryId: 'cat-2' }, 3600);

      // Act: Delete category
      await catalogService.deleteCategory('cat-1');

      // Assert: Products in deleted category should be invalidated
      expect(await cache.get('product:prod-1')).toBeNull();
      expect(await cache.get('product:prod-2')).toBeNull();
      // Product in different category should remain cached
      expect(await cache.get('product:prod-3')).not.toBeNull();
    });
  });

  describe('Pattern-Based Invalidation', () => {
    it('should invalidate all matching keys for pattern', async () => {
      // Arrange: Cache various keys
      await cache.set('user:123:profile', { name: 'John' }, 3600);
      await cache.set('user:123:preferences', { theme: 'dark' }, 3600);
      await cache.set('user:123:sessions', ['sess-1', 'sess-2'], 3600);
      await cache.set('user:456:profile', { name: 'Jane' }, 3600);

      // Act: Invalidate all caches for user 123
      await cache.deletePattern('user:123:*');

      // Assert
      expect(await cache.get('user:123:profile')).toBeNull();
      expect(await cache.get('user:123:preferences')).toBeNull();
      expect(await cache.get('user:123:sessions')).toBeNull();
      // Other users unaffected
      expect(await cache.get('user:456:profile')).not.toBeNull();
    });
  });

  describe('Version-Based Invalidation', () => {
    it('should miss cache when version changes', async () => {
      const versionedCache = new VersionedCache<Product>(cache);

      // Arrange: Cache with version 1
      versionedCache.setVersion('products', 1);
      await versionedCache.set('product:prod-1', { id: 'prod-1', name: 'v1' }, 3600);

      // Act: Increment version (simulates schema change or bulk refresh)
      versionedCache.setVersion('products', 2);

      // Assert: Old cached data is not returned
      const result = await versionedCache.get('product:prod-1');
      expect(result).toBeNull();
    });
  });
});

// Example versioned cache wrapper
class VersionedCache<T> {
  private versions: Map<string, number> = new Map();

  constructor(private innerCache: Cache<{ version: number; data: T }>) {}

  setVersion(namespace: string, version: number): void {
    this.versions.set(namespace, version);
  }

  async get(key: string): Promise<T | null> {
    const namespace = key.split(':')[0];
    const currentVersion = this.versions.get(namespace) || 0;
    const entry = await this.innerCache.get(key);
    if (!entry || entry.version !== currentVersion) {
      return null;
    }
    return entry.data;
  }

  async set(key: string, value: T, ttlSeconds: number): Promise<void> {
    const namespace = key.split(':')[0];
    const version = this.versions.get(namespace) || 0;
    await this.innerCache.set(key, { version, data: value }, ttlSeconds);
  }
}
```

One of the most common cache bugs is incomplete invalidation—updating a product but forgetting to invalidate the category listing that contains it. Tests should explicitly verify that ALL related caches are invalidated, not just the obvious ones.
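Note that the pattern-based test above calls a deletePattern method that is not part of the Cache interface defined earlier. One way to supply it in tests, sketched here under the assumption that simple '*' globs are sufficient, is a wrapper that remembers the keys it has written (the PatternCache name and its key-tracking approach are illustrative, not taken from the code above):

```typescript
// Illustrative sketch: a wrapper around the Cache<T> interface from earlier
// that supports pattern deletion for test purposes by tracking written keys.
class PatternCache<T> {
  private knownKeys = new Set<string>();

  constructor(private inner: Cache<T>) {}

  async set(key: string, value: T, ttlSeconds?: number): Promise<void> {
    this.knownKeys.add(key);
    await this.inner.set(key, value, ttlSeconds);
  }

  async get(key: string): Promise<T | null> {
    return this.inner.get(key);
  }

  async deletePattern(pattern: string): Promise<void> {
    // Escape regex metacharacters, then expand '*' into '.*'
    const escaped = pattern.replace(/[.+?^${}()|[\]\\]/g, '\\$&');
    const regex = new RegExp('^' + escaped.replace(/\*/g, '.*') + '$');
    for (const key of [...this.knownKeys]) {
      if (regex.test(key)) {
        this.knownKeys.delete(key);
        await this.inner.delete(key);
      }
    }
  }
}
```

A production backend such as Redis implements pattern deletion very differently (scanning keys on the server), which is exactly the kind of behavioral gap the integration tests in the next section exist to catch.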
While unit tests with in-memory cache implementations verify logic correctness, integration tests with real cache systems (Redis, Memcached) verify that your code handles the actual infrastructure correctly.
Integration tests catch issues that unit tests miss: serialization and deserialization of complex objects, handling of special characters and encodings, real TTL expiration against wall-clock time, concurrent reads and writes through a real client, and behavior when the connection to the cache is interrupted.
```typescript
import { GenericContainer, StartedTestContainer } from 'testcontainers';
import { createClient, RedisClientType } from 'redis';

describe('ProductService Redis Integration', () => {
  let redisContainer: StartedTestContainer;
  let redisClient: RedisClientType;
  let cache: RedisCache<Product>;
  let service: ProductService;

  // Start Redis container before all tests
  beforeAll(async () => {
    redisContainer = await new GenericContainer('redis:7-alpine')
      .withExposedPorts(6379)
      .start();

    const redisUrl = `redis://${redisContainer.getHost()}:${redisContainer.getMappedPort(6379)}`;
    redisClient = createClient({ url: redisUrl });
    await redisClient.connect();

    cache = new RedisCache<Product>(redisClient);
    service = new ProductService(cache, mockDatabase, mockMetrics);
  }, 60000); // Extended timeout for container startup

  afterAll(async () => {
    await redisClient.quit();
    await redisContainer.stop();
  });

  beforeEach(async () => {
    // Clean slate for each test
    await redisClient.flushDb();
  });

  describe('Data Serialization', () => {
    it('should correctly serialize and deserialize complex objects', async () => {
      const complexProduct = {
        id: 'prod-1',
        name: 'Widget',
        price: 29.99,
        metadata: {
          tags: ['electronics', 'gadgets'],
          dimensions: { width: 10, height: 5, depth: 2 },
          createdAt: new Date().toISOString(),
        },
        variants: [
          { sku: 'WID-RED', color: 'red', stock: 100 },
          { sku: 'WID-BLU', color: 'blue', stock: 50 },
        ],
      };

      await cache.set('product:prod-1', complexProduct, 3600);
      const retrieved = await cache.get('product:prod-1');

      expect(retrieved).toEqual(complexProduct);
    });

    it('should handle special characters in data', async () => {
      const product = {
        id: 'prod-special',
        name: 'Widget™ — Pro Edition (Ü)',
        description: 'Features: "quotes", \\backslash, emoji 🎉',
      };

      await cache.set('product:prod-special', product, 3600);
      const retrieved = await cache.get('product:prod-special');

      expect(retrieved?.name).toBe('Widget™ — Pro Edition (Ü)');
      expect(retrieved?.description).toContain('🎉');
    });
  });

  describe('TTL Behavior with Real Time', () => {
    it('should expire entries after TTL', async () => {
      const product = { id: 'prod-1', name: 'Short-lived Widget' };

      // Set with 1 second TTL
      await cache.set('product:prod-1', product, 1);

      // Should exist immediately
      expect(await cache.get('product:prod-1')).not.toBeNull();

      // Wait for expiration
      await new Promise(resolve => setTimeout(resolve, 1500));

      // Should be expired
      expect(await cache.get('product:prod-1')).toBeNull();
    });
  });

  describe('Concurrent Access', () => {
    it('should handle concurrent reads correctly', async () => {
      const product = { id: 'prod-1', name: 'Widget' };
      await cache.set('product:prod-1', product, 3600);

      // Simulate 100 concurrent reads
      const reads = Array.from({ length: 100 }, () => cache.get('product:prod-1'));
      const results = await Promise.all(reads);

      // All reads should return the same data
      results.forEach(result => {
        expect(result).toEqual(product);
      });
    });

    it('should handle concurrent writes correctly', async () => {
      // Simulate concurrent writes with different values
      const writes = Array.from({ length: 100 }, (_, i) =>
        cache.set(`product:prod-${i}`, { id: `prod-${i}`, name: `Widget ${i}` }, 3600)
      );
      await Promise.all(writes);

      // Verify all writes succeeded
      for (let i = 0; i < 100; i++) {
        const result = await cache.get(`product:prod-${i}`);
        expect(result?.id).toBe(`prod-${i}`);
      }
    });
  });

  describe('Connection Resilience', () => {
    it('should handle connection interruption gracefully', async () => {
      // This test verifies error handling when Redis becomes unavailable
      const product = { id: 'prod-1', name: 'Widget' };
      await cache.set('product:prod-1', product, 3600);

      // Pause the container to simulate network issues
      await redisContainer.exec(['redis-cli', 'DEBUG', 'SLEEP', '2']);

      // Attempt operation during "outage" - should throw or return null gracefully
      await expect(async () => {
        await Promise.race([
          cache.get('product:prod-1'),
          new Promise((_, reject) => setTimeout(() => reject(new Error('Timeout')), 1000))
        ]);
      }).rejects.toThrow();

      // Wait for recovery
      await new Promise(resolve => setTimeout(resolve, 3000));

      // Should work again
      const result = await cache.get('product:prod-1');
      expect(result).toEqual(product);
    });
  });
});
```

Testcontainers automatically manages Docker containers for your tests. Each test run gets a fresh Redis instance, ensuring tests are isolated and reproducible. The containers are automatically cleaned up after tests complete. This eliminates 'works on my machine' issues with integration tests.
A cache stampede (also called cache thundering herd) occurs when a cache entry expires and multiple concurrent requests simultaneously attempt to regenerate it, overwhelming the backend. Testing stampede prevention mechanisms is crucial for systems under high load.
Stampede prevention techniques include request coalescing (deduplicating concurrent fetches for the same key), distributed locking (only the lock holder regenerates the entry while others wait), and probabilistic early expiration (refreshing entries slightly before they expire so that expirations do not align).
```typescript
describe('Cache Stampede Prevention', () => {
  let cache: InMemoryCache<Product>;
  let database: MockSlowDatabase;
  let service: StampedeProtectedProductService;

  beforeEach(() => {
    cache = new InMemoryCache<Product>();
    database = new MockSlowDatabase(100); // 100ms latency per query
    service = new StampedeProtectedProductService(cache, database);
  });

  describe('Request Coalescing', () => {
    it('should only hit database once for concurrent requests to same key', async () => {
      // Arrange: Empty cache
      const product = { id: 'prod-1', name: 'Widget', price: 29.99 };
      database.setProduct('prod-1', product);

      // Act: Fire 10 concurrent requests for the same product
      const requests = Array.from({ length: 10 }, () => service.getProduct('prod-1'));
      const results = await Promise.all(requests);

      // Assert: All requests get the same result
      results.forEach(result => expect(result).toEqual(product));
      // Only ONE database query should have been made
      expect(database.queryCount).toBe(1);
    });

    it('should not coalesce requests for different keys', async () => {
      // Arrange
      database.setProduct('prod-1', { id: 'prod-1', name: 'Widget 1' });
      database.setProduct('prod-2', { id: 'prod-2', name: 'Widget 2' });
      database.setProduct('prod-3', { id: 'prod-3', name: 'Widget 3' });

      // Act: Concurrent requests for different products
      const requests = [
        service.getProduct('prod-1'),
        service.getProduct('prod-2'),
        service.getProduct('prod-3'),
      ];
      await Promise.all(requests);

      // Assert: Each product triggers its own query
      expect(database.queryCount).toBe(3);
    });
  });

  describe('Distributed Locking', () => {
    it('should acquire lock before regenerating cache', async () => {
      const lockManager = new MockLockManager();
      const lockingService = new LockingCacheService(cache, database, lockManager);
      database.setProduct('prod-1', { id: 'prod-1', name: 'Widget' });

      await lockingService.getProduct('prod-1');

      // Verify lock was acquired and released
      expect(lockManager.acquisitions).toContainEqual({
        key: 'cache-lock:product:prod-1',
        acquired: true,
        released: true
      });
    });

    it('should wait for existing lock instead of querying database', async () => {
      const lockManager = new MockLockManager();
      const lockingService = new LockingCacheService(cache, database, lockManager);

      // Simulate another process holding the lock
      lockManager.holdLock('cache-lock:product:prod-1', 50); // 50ms

      // Simulate that while waiting, the cache gets populated
      setTimeout(async () => {
        await cache.set('product:prod-1', { id: 'prod-1', name: 'Widget' }, 3600);
      }, 30);

      const result = await lockingService.getProduct('prod-1');

      expect(result.name).toBe('Widget');
      // Database should NOT have been queried - got from cache after lock released
      expect(database.queryCount).toBe(0);
    });
  });

  describe('Probabilistic Early Expiration', () => {
    it('should sometimes refresh before actual expiration', async () => {
      const earlyExpirationService = new EarlyExpirationCacheService(cache, database, {
        beta: 1.0, // Higher beta = more likely to refresh early
      });
      const product = { id: 'prod-1', name: 'Widget' };
      database.setProduct('prod-1', product);

      // Populate cache with entry that will expire in 10 seconds
      // but has been there for 9 seconds already (delta = 1 second remaining)
      await cache.set('product:prod-1', {
        data: product,
        cachedAt: Date.now() - 9000,
        ttl: 10000
      }, 10);

      // Run multiple fetches and count how many trigger refresh
      let refreshCount = 0;
      for (let i = 0; i < 100; i++) {
        database.queryCount = 0;
        await earlyExpirationService.getProduct('prod-1');
        if (database.queryCount > 0) refreshCount++;
      }

      // With beta=1.0 and only 1 second remaining, should refresh frequently
      expect(refreshCount).toBeGreaterThan(20);
      expect(refreshCount).toBeLessThan(100); // But not every time
    });
  });
});

// Implementation: Request coalescing
class StampedeProtectedProductService {
  private inFlightRequests: Map<string, Promise<Product>> = new Map();

  constructor(
    private cache: Cache<Product>,
    private database: ProductDatabase
  ) {}

  async getProduct(id: string): Promise<Product> {
    const cacheKey = `product:${id}`;

    // Check cache first
    const cached = await this.cache.get(cacheKey);
    if (cached) return cached;

    // Check for in-flight request
    const inFlight = this.inFlightRequests.get(cacheKey);
    if (inFlight) {
      return inFlight; // Piggyback on existing request
    }

    // Create new request and store it
    const request = this.fetchAndCache(id, cacheKey);
    this.inFlightRequests.set(cacheKey, request);

    try {
      return await request;
    } finally {
      this.inFlightRequests.delete(cacheKey);
    }
  }

  private async fetchAndCache(id: string, cacheKey: string): Promise<Product> {
    const product = await this.database.find(id);
    await this.cache.set(cacheKey, product, 3600);
    return product;
  }
}
```

When testing stampede prevention, measure not just 'did it work' but 'how well did it work'. Track database query count, latency distribution across requests, and lock contention. A poorly implemented stampede prevention mechanism can actually increase latency for most requests while only slightly reducing database load.
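That advice can be made executable by timing each concurrent request and asserting on the distribution as well as the query count. Here is a minimal sketch, assuming the StampedeProtectedProductService and MockSlowDatabase used above; the 150 ms bound is an illustrative threshold for a backend with roughly 100 ms latency, not a recommended constant:

```typescript
it('keeps per-request latency close to a single backend fetch under coalescing', async () => {
  const cache = new InMemoryCache<Product>();
  const database = new MockSlowDatabase(100); // ~100ms per query
  const service = new StampedeProtectedProductService(cache, database);
  database.setProduct('prod-1', { id: 'prod-1', name: 'Widget', price: 29.99 });

  // Fire 50 concurrent requests against a cold cache and time each one
  const latencies = await Promise.all(
    Array.from({ length: 50 }, async () => {
      const start = Date.now();
      await service.getProduct('prod-1');
      return Date.now() - start;
    })
  );

  // Exactly one backend query was made...
  expect(database.queryCount).toBe(1);

  // ...and no request waited much longer than that single query took
  const sorted = [...latencies].sort((a, b) => a - b);
  const p95 = sorted[Math.floor(sorted.length * 0.95)];
  expect(p95).toBeLessThan(150); // illustrative bound, tune to your backend latency
});
```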
Testing cached behavior requires a systematic approach that addresses the unique challenges of stateful, time-dependent, concurrent systems. Let's consolidate the key principles:
1. Design for testability: put the cache behind an abstraction so tests can substitute an observable in-memory implementation.
2. Test both paths: verify cache hits and cache misses explicitly, including whether the source of truth was queried.
3. Treat invalidation as a completeness check: assert that every related entry is invalidated and that unrelated entries survive.
4. Use real infrastructure in integration tests: serialization quirks, real TTL expiration, concurrency, and connection failures only surface against a real cache.
5. Test stampede prevention under genuinely concurrent load, and measure how well it works, not just whether it works.
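One practical technique worth calling out for the time-dependent part: TTL tests that sleep on a real clock (as in the integration test above) are slow and can be flaky. If you control the cache implementation, injecting a clock lets tests advance time explicitly instead. The Clock, FakeClock, and ClockedCache below are an illustrative sketch, not part of the implementations shown earlier:

```typescript
// Illustrative only: a minimal injectable clock so TTL logic can be tested
// without real sleeps.
interface Clock {
  now(): number;
}

class FakeClock implements Clock {
  private current = 0;
  now(): number { return this.current; }
  advance(ms: number): void { this.current += ms; }
}

class ClockedCache<T> {
  private store = new Map<string, { value: T; expiresAt: number | null }>();

  constructor(private clock: Clock) {}

  set(key: string, value: T, ttlSeconds?: number): void {
    const expiresAt = ttlSeconds ? this.clock.now() + ttlSeconds * 1000 : null;
    this.store.set(key, { value, expiresAt });
  }

  get(key: string): T | null {
    const entry = this.store.get(key);
    if (!entry) return null;
    if (entry.expiresAt !== null && this.clock.now() > entry.expiresAt) {
      this.store.delete(key);
      return null;
    }
    return entry.value;
  }
}

// Usage: expire an entry by advancing the fake clock instead of sleeping
const clock = new FakeClock();
const cache = new ClockedCache<string>(clock);
cache.set('greeting', 'hello', 60);
clock.advance(61_000);
console.assert(cache.get('greeting') === null, 'entry should be expired');
```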
What's next:
With a solid foundation in testing cache behavior, we'll explore Cache Hit/Miss Metrics—how to instrument your cache to measure its effectiveness and identify optimization opportunities.
You now understand how to design testable cache implementations, write comprehensive unit and integration tests for cache behavior, and verify stampede prevention mechanisms. These testing practices help ensure your caches behave correctly before they reach production.