Updated on 07 Dec, 2025

What Is Cache Eviction?

Cache eviction is the process of removing items from the cache when it becomes full.

Since cache memory is finite:

  • New data must replace old data
  • An eviction policy decides what to remove

The wrong eviction policy leads to:

  • low hit rate
  • high DB load
  • latency spikes
  • cache pollution

Why Is Eviction Mandatory?

  • Cache is always bounded
  • Working set may grow unbounded
  • Without eviction:
    • Memory leaks
    • OOM crashes
    • System instability

Core Cache Eviction Policies

Below are the main strategies for deciding what to keep and what to remove as data comes in. Each policy determines which items stay in the cache and which get replaced when it fills up.

1 Least Recently Used (LRU)

Evict the items that have gone unused the longest: the entry accessed longest ago is removed first.

It is often implemented with a doubly linked list plus a hash map (or a priority queue) that tracks access order.

“Recent past predicts near future”

Used In:

  • Redis
  • In-process caches
  • CDN edge caches

Strengths:

  • Works great for:
    • Web traffic
    • API responses
    • Hot user data

Weakness:

  • Fails on:
    • One-time sequential scans
    • Looping over large datasets
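A minimal LRU sketch in Python, using `collections.OrderedDict` so that insertion order doubles as recency order (the class name and capacity parameter here are illustrative, not from any particular library):

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache: evicts the least recently accessed key when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()  # ordered oldest -> most recently used

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)  # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict least recently used
```

Note that `get` also reorders: reading a key protects it from eviction, which is exactly why LRU fails on one-time sequential scans — every scanned item looks "recent" and pushes out genuinely hot data.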

2 Least Frequently Used (LFU)

Evict items that are used least often, even if they were accessed recently: the items with the lowest access counts are removed first.

“Popularity over time matters more than recency”

Used In:

  • ML feature caches
  • Recommendation systems
  • Search query caches

Strengths:

  • Stable for:
    • Long-term hot objects

Weakness:

  • Slow to adapt to sudden traffic changes
  • Requires counters (more memory)
  • Can keep stale “formerly popular” items forever
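A minimal LFU sketch, again with illustrative names. It keeps a per-key counter and scans for the lowest count on eviction (an O(n) scan keeps the example short; production implementations use frequency buckets or min-heaps):

```python
from collections import defaultdict

class LFUCache:
    """Minimal LFU cache: evicts the key with the lowest access count.
    Ties are broken by earliest insertion (dicts preserve insertion order)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = {}
        self.counts = defaultdict(int)  # the extra memory LFU needs

    def get(self, key):
        if key not in self.data:
            return None
        self.counts[key] += 1
        return self.data[key]

    def put(self, key, value):
        if key not in self.data and len(self.data) >= self.capacity:
            victim = min(self.data, key=lambda k: self.counts[k])
            del self.data[victim]
            del self.counts[victim]
        self.data[key] = value
        self.counts[key] += 1
```

The counters also make the "formerly popular" weakness visible: a key with a huge historical count survives eviction even after traffic moves on, which is why real systems often decay or cap the counters.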

3 First in First Out (FIFO)

Evict the oldest item first. Simple, but rarely the right choice.

Used In:

  • Very simple systems
  • Streaming buffers

Strengths:

  • Easiest to implement
  • Constant time operations

Weakness:

  • Ignores usage completely
  • Very poor hit rate in practice
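A minimal FIFO sketch, using a deque of keys in insertion order. The key point is that `get` deliberately does nothing to the eviction order:

```python
from collections import deque

class FIFOCache:
    """Minimal FIFO cache: evicts whichever key was inserted first,
    regardless of how often it has been accessed since."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = {}
        self.order = deque()  # keys in insertion order

    def get(self, key):
        return self.data.get(key)  # access does NOT affect eviction order

    def put(self, key, value):
        if key not in self.data:
            if len(self.data) >= self.capacity:
                oldest = self.order.popleft()
                del self.data[oldest]  # evict the oldest insert
            self.order.append(key)
        self.data[key] = value
```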

4 Time-To-Live (TTL)

Each item expires after a set time (e.g., 5 minutes). Great for data that can go stale, like API responses.

Used Everywhere:

  • Redis
  • CDNs
  • Browser caching
  • API caching

Strengths:

  • Prevents stale data
  • Guarantees eventual cleanup

Weakness:

  • Alone, it does not handle memory pressure
  • Can cause a cache avalanche if many keys expire at the same moment
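A minimal TTL sketch: each entry stores an expiry timestamp, and expired entries are removed lazily on read (Redis and friends also sweep in the background, which this sketch omits):

```python
import time

class TTLCache:
    """Minimal TTL cache: each entry expires ttl_seconds after being written.
    Expired entries are removed lazily when read."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.data = {}  # key -> (value, expiry_timestamp)

    def get(self, key):
        entry = self.data.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self.data[key]  # lazy expiry on read
            return None
        return value

    def put(self, key, value):
        self.data[key] = (value, time.monotonic() + self.ttl)
```

Notice the weakness called out above: nothing in this code bounds the total number of entries, so TTL alone does not protect against memory pressure.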

Best Practice:

Use TTL and LRU together: TTL bounds how stale data can get, while LRU handles memory pressure when the cache fills up.
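The combination can be sketched by merging the two ideas: an `OrderedDict` for recency plus an expiry timestamp per entry (illustrative names, not a specific library's API):

```python
import time
from collections import OrderedDict

class TTLLRUCache:
    """Sketch combining TTL and LRU: entries expire after ttl_seconds,
    and when the cache is full the least recently used entry is evicted."""

    def __init__(self, capacity, ttl_seconds):
        self.capacity = capacity
        self.ttl = ttl_seconds
        self.data = OrderedDict()  # key -> (value, expiry_timestamp)

    def get(self, key):
        entry = self.data.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self.data[key]          # TTL: drop the stale entry
            return None
        self.data.move_to_end(key)      # LRU: mark as recently used
        return value

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = (value, time.monotonic() + self.ttl)
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # LRU: evict under memory pressure
```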
