Caching Patterns

Understanding different caching strategies and when to use each

Why Caching Matters

- 10-100x: faster than hitting the database
- <1ms: typical cache read latency
- 90%+: target cache hit rate
- $$$: reduced database costs

Caching Patterns

Cache-Aside (Lazy Loading)

Application manages the cache - loads data on cache miss

Best for: Read-heavy workloads with infrequent updates

1. Application checks cache for data
2. If cache hit: return cached data
3. If cache miss: query database
4. Store result in cache
5. Return data to caller
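The cache-aside flow above can be sketched in a few lines. This is a minimal illustration, not a real client library: plain dicts stand in for the cache (e.g. Redis) and the database, and all names are invented for the example.

```python
# Cache-aside sketch: the application owns the caching logic.
cache = {}                              # stand-in for a cache like Redis
db = {"user:123": {"name": "Ada"}}      # stand-in for the database

def get_user(key):
    # 1. Check the cache first.
    value = cache.get(key)
    if value is not None:
        return value                    # cache hit
    # 2. Cache miss: fall back to the database.
    value = db.get(key)
    # 3. Populate the cache so the next read is served from memory.
    if value is not None:
        cache[key] = value
    return value
```

Note that the cache is only populated on reads; writes to the database leave the cache untouched until the entry expires or is invalidated.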

Write-Through

Data written to cache and database simultaneously

Best for: Data that is read immediately after writing

1. Application writes data
2. Write to cache first
3. Write to database
4. Both writes must succeed
5. Return success to caller
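A minimal write-through sketch, again with dicts standing in for the cache and the database (illustrative names, not a real API). The key property is that both writes succeed or neither does:

```python
# Write-through sketch: cache and database are updated together.
cache = {}
db = {}

def write_through(key, value):
    cache[key] = value
    try:
        db[key] = value                 # stand-in for a real DB write
    except Exception:
        # If the DB write fails, roll back the cache entry so the
        # two stores never disagree.
        cache.pop(key, None)
        raise
```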

Write-Behind (Write-Back)

Write to cache immediately, async write to database later

Best for: Write-heavy workloads where some data loss is acceptable

1. Application writes data
2. Write to cache immediately
3. Return success to caller
4. Async: batch writes to database
5. Confirm persistence
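The write-behind flow can be sketched with a queue of pending writes. In this illustration `flush()` is called explicitly; a real implementation would run it from a background worker on a timer or batch-size trigger, and a crash before flushing loses the queued writes (the "acceptable data loss" trade-off):

```python
from collections import deque

cache = {}
db = {}
pending = deque()                       # writes waiting to be persisted

def write_behind(key, value):
    cache[key] = value                  # write to cache immediately
    pending.append((key, value))        # queue the DB write
    # the caller sees success here, before the DB is touched

def flush():
    # Normally run asynchronously; drains queued writes in batches.
    while pending:
        key, value = pending.popleft()
        db[key] = value
```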

Read-Through

Cache automatically fetches from database on miss

Best for: When you want caching logic abstracted from the application

1. Application requests data from cache
2. Cache checks if data exists
3. If miss: cache fetches from database
4. Cache stores and returns data
5. Subsequent reads served from cache
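The difference from cache-aside is who owns the loading logic. A sketch of that inversion (class and parameter names are invented for the example):

```python
class ReadThroughCache:
    """The cache itself knows how to load data, so the application
    only ever talks to the cache, never to the database directly."""

    def __init__(self, loader):
        self._store = {}
        self._loader = loader           # e.g. a function that queries the DB

    def get(self, key):
        if key not in self._store:
            # Miss: the cache, not the application, fetches and stores.
            self._store[key] = self._loader(key)
        return self._store[key]

db = {"user:123": "Ada"}
cache = ReadThroughCache(loader=db.get)
```

Libraries and caching proxies that work this way typically take the loader at configuration time, which is what keeps the application code free of caching logic.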

Refresh-Ahead (Pre-fetching)

Proactively refresh cache before expiration

Best for: Hot data that must always be fast and fresh

1. Cache tracks TTL of entries
2. Before TTL expires, trigger refresh
3. Fetch fresh data from database
4. Update cache in background
5. Users always get cached data
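A sketch of refresh-ahead, refreshing an entry once it passes a fraction of its TTL. The threshold and the inline (rather than background) refresh are simplifications for the example; real systems refresh asynchronously so readers never wait:

```python
import time

class RefreshAheadCache:
    """Refresh entries before their TTL expires, so a read never
    lands on an expired entry. Illustrative sketch, not a real API."""

    def __init__(self, loader, ttl=60.0, refresh_at=0.8):
        self._loader = loader
        self._ttl = ttl
        self._refresh_at = refresh_at   # refresh at 80% of TTL by default
        self._store = {}                # key -> (value, loaded_at)

    def get(self, key, now=None):
        now = time.monotonic() if now is None else now
        entry = self._store.get(key)
        if entry is None:
            self._store[key] = (self._loader(key), now)
        else:
            _, loaded_at = entry
            if now - loaded_at >= self._ttl * self._refresh_at:
                # Past the refresh threshold: reload before expiry.
                # (Done inline here; in practice, in a background task.)
                self._store[key] = (self._loader(key), now)
        return self._store[key][0]
```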

Cache-Aside in Detail

The most common caching pattern, walked through step by step.

Cache-Aside Pattern Flow

Step 1

Application Requests Data

User requests user profile for user_id=123

Technical Details

The application receives a request that needs data from the database.

Pro Tip

Common entry points: an API endpoint or a service method.

Eviction Strategies

When cache is full, which items should be removed?

LRU (Least Recently Used)

Evict the item that has not been accessed for the longest time

Best for: General purpose, most common

LFU (Least Frequently Used)

Evict the item with the fewest accesses

Best for: When access frequency matters more than recency

FIFO (First In, First Out)

Evict the oldest item regardless of access pattern

Best for: Simple, predictable behavior

TTL (Time To Live)

Evict when time expires, regardless of space

Best for: Data that becomes stale over time

Random

Evict a random item

Best for: When access patterns are unpredictable
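LRU, the most common of these strategies, can be sketched with Python's `collections.OrderedDict`, which keeps keys in insertion order and can move a key to the end on access:

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache: when full, evict the key that has not
    been accessed for the longest time."""

    def __init__(self, capacity):
        self._capacity = capacity
        self._data = OrderedDict()      # oldest access first

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)     # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self._capacity:
            self._data.popitem(last=False)  # evict the LRU entry
```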

Common Caching Problems

Cache Stampede

Many requests hit an expired cache key simultaneously, all querying the database at once.

Solutions:
- Lock/mutex on cache refresh
- Staggered TTLs
- Background refresh
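The lock/mutex solution can be sketched as a double-checked load: only one caller refreshes the key, and everyone else waits and then reads the fresh value. Dicts and function names are illustrative stand-ins:

```python
import threading

cache = {}
lock = threading.Lock()
db_queries = []                         # tracks how often the DB is hit

def load_from_db(key):
    db_queries.append(key)              # stand-in for an expensive query
    return f"value-for-{key}"

def get(key):
    value = cache.get(key)
    if value is not None:
        return value
    with lock:
        # Re-check after acquiring the lock: another thread may
        # have refreshed the key while we were waiting.
        value = cache.get(key)
        if value is None:
            value = load_from_db(key)
            cache[key] = value
    return value
```

Even with many concurrent callers missing at once, the database is queried exactly once per key.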

Stale Data

Cache contains outdated data that does not reflect current database state.

Solutions:
- Appropriate TTL values
- Cache invalidation on writes
- Write-through pattern
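Invalidation on writes is often simpler and safer than updating the cached value in place, because the next read repopulates from the database and concurrent writers cannot leave a stale value behind. A sketch with illustrative names:

```python
cache = {"user:1": "Ada"}
db = {"user:1": "Ada"}

def update_user(key, value):
    db[key] = value
    # Invalidate rather than update: the next read goes through the
    # normal cache-miss path and loads the fresh value from the DB.
    cache.pop(key, None)
```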

Cache Penetration

Requests for non-existent data bypass cache and always hit the database.

Solutions:
- Cache negative results
- Bloom filter for existence check
- Input validation
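Caching negative results can be sketched with a sentinel meaning "we looked, and it does not exist", so repeated requests for a bogus key stop reaching the database. In a real system the negative entry would get a short TTL; names here are illustrative:

```python
MISSING = object()                      # sentinel for a confirmed miss

cache = {}
db = {"user:1": "Ada"}
db_queries = []                         # tracks how often the DB is hit

def get(key):
    if key in cache:
        value = cache[key]
        return None if value is MISSING else value
    db_queries.append(key)
    value = db.get(key)
    # Cache the miss too: without this, every lookup of a
    # non-existent key would hit the database.
    cache[key] = MISSING if value is None else value
    return value
```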

Redis vs Memcached

Feature           | Redis                                     | Memcached
------------------|-------------------------------------------|----------------------------------------
Data Structures   | Strings, Lists, Sets, Hashes, Sorted Sets | Strings only
Persistence       | Yes (RDB, AOF)                            | No
Replication       | Built-in                                  | No
Pub/Sub           | Yes                                       | No
Lua Scripting     | Yes                                       | No
Memory Efficiency | Good                                      | Better (simpler)
Best For          | Feature-rich caching, sessions, queues    | Simple, high-throughput key-value cache