Why Cache?
Every layer of a system sits at a different point in a speed hierarchy: CPU cache is faster than RAM, RAM is faster than disk, and local reads are faster than network round trips.
Caching stores expensive results in a faster layer. A database query that takes 10ms, served instead from an in-memory store like Redis, takes roughly 0.1ms: about 100x faster.
Cache-Aside (Lazy Loading)
The most common pattern. The application manages the cache explicitly: check the cache first; on a miss, read the database, store the result in the cache, and return it.
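The steps above can be sketched as follows. This is a minimal in-process version: the dict stands in for Redis, and `db_query` is a hypothetical placeholder for the expensive database read.

```python
import time

cache = {}          # key -> (value, expires_at); stand-in for Redis
TTL_SECONDS = 60    # always set a TTL so entries expire

def db_query(user_id):
    # Hypothetical placeholder for an expensive database read.
    return {"id": user_id, "name": f"user-{user_id}"}

def get_user(user_id):
    entry = cache.get(user_id)
    if entry is not None:
        value, expires_at = entry
        if time.time() < expires_at:
            return value                 # cache hit
    value = db_query(user_id)            # cache miss: go to the database
    cache[user_id] = (value, time.time() + TTL_SECONDS)  # populate cache
    return value
```

Note that the cache is only filled on demand ("lazy"), so a key that is never read is never cached.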
Write-Through
Every write goes to both the cache and the database as part of the same operation, so the cache never lags behind the database.
Use when: Read-heavy, can't tolerate stale data
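A minimal sketch of the write path, again using dicts as stand-ins for the cache and the database:

```python
cache = {}
database = {}

def write_through(key, value):
    # Persist first, then update the cache, so the cache never
    # holds a value the database lost.
    database[key] = value
    cache[key] = value

def read(key):
    # Write-through keeps the cache warm, so reads rarely miss.
    return cache.get(key, database.get(key))
```

The cost is write latency: every write pays for both stores before returning.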
Write-Behind (Write-Back)
Writes go to the cache first; the database is updated asynchronously. This makes writes fast, but risks losing data if the cache fails before the pending writes are flushed.
Use when: Write-heavy, write latency matters
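One way to sketch this is a queue drained by a background thread; real systems typically batch and coalesce the flushes, which this toy version omits.

```python
import queue
import threading

cache = {}
database = {}
write_queue = queue.Queue()

def write_behind(key, value):
    # The write is acknowledged as soon as the cache is updated...
    cache[key] = value
    write_queue.put((key, value))   # ...and persisted asynchronously.

def flusher():
    # Background worker drains queued writes into the database.
    while True:
        key, value = write_queue.get()
        database[key] = value
        write_queue.task_done()

threading.Thread(target=flusher, daemon=True).start()
```

Anything still sitting in `write_queue` when the process dies is lost, which is the central tradeoff of this pattern.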
Cache Eviction Policies
When the cache is full, something has to go. Common policies:
- LRU (Least Recently Used): evict the entry that has gone unused the longest; the usual default
- LFU (Least Frequently Used): evict the entry with the fewest accesses
- FIFO: evict the oldest entry regardless of how often it is used
- TTL: expire entries after a fixed lifetime
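LRU is the policy most worth understanding; here is a minimal sketch built on `collections.OrderedDict`, which keeps keys in insertion order and lets us treat the front as "least recently used".

```python
from collections import OrderedDict

class LRUCache:
    """Evicts the least-recently-used key once capacity is exceeded."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)   # mark as recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # drop the oldest entry
```

Both operations are O(1), which is why LRU is cheap enough to be the default almost everywhere.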
Cache Stampede Problem
When a popular key expires, many concurrent requests all miss the cache at once and hammer the database with the same query, spiking load exactly when the cache was supposed to absorb it.
Fix: use a mutex so that only one request fetches from the database and repopulates the cache while the others wait.
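A sketch of the mutex approach with a double-checked lookup: every thread checks the cache, but only the first one through the lock actually queries the database; the rest find the entry already repopulated. `db_query` is again a hypothetical placeholder.

```python
import threading

cache = {}
lock = threading.Lock()

def db_query(key):
    # Hypothetical placeholder for the expensive database read.
    return f"value-for-{key}"

def get_with_lock(key):
    value = cache.get(key)
    if value is not None:
        return value                 # fast path: cache hit, no locking
    with lock:
        value = cache.get(key)       # double-check: another thread may
        if value is None:            # have filled it while we waited
            value = db_query(key)
            cache[key] = value
    return value
```

In a distributed setup the same idea uses a shared lock (e.g. a Redis `SET key value NX PX ttl`) instead of a thread mutex.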
Where to Cache
Caching can happen at several layers, and most systems use more than one: the browser (HTTP cache headers), the CDN (static assets at the edge), the application (in-process memory or a shared store like Redis or Memcached), and the database (its own buffer and query caches).
Key Takeaway
- Cache-aside is the default — simple, resilient, widely used
- Always set a TTL — unbounded caches grow forever
- Invalidate on write for data that must be fresh
- Watch for cache stampede on high-traffic systems
- Redis is the industry standard for distributed caching