Caching in distributed systems: A friendly introduction
Description
Caching is an amazingly effective technique for reducing latency and helps build scalable distributed systems.
We first discuss what a cache is and why we use one. We then talk about the key features of a cache in a distributed system.
Cache management matters because it directly affects cache hit ratios and performance. We walk through various scenarios in a distributed environment.
Benefits of a cache (a cache-aside sketch follows this list):
1. Saves network calls
2. Avoids repeated computations
3. Reduces DB load
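The read path behind these benefits is usually cache-aside: check the cache first and only fall back to the database on a miss. A minimal sketch, assuming a plain in-memory dict as the cache and a hypothetical fetch_user_from_db stand-in for the real database call:

```python
cache = {}

def fetch_user_from_db(user_id):
    # Hypothetical placeholder for an expensive network/database round trip.
    return {"id": user_id, "name": f"user-{user_id}"}

def get_user(user_id):
    if user_id in cache:                     # cache hit: no network call, no DB load
        return cache[user_id]
    user = fetch_user_from_db(user_id)       # cache miss: pay the full cost once
    cache[user_id] = user                    # store it for subsequent reads
    return user
```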
Drawbacks of a cache:
1. Can be expensive to host
2. Potential thrashing
3. Eventual consistency
Cache Write Policies (sketched after this list):
1. Write-through
2. Write-back
3. Write-around
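A rough sketch of how write-through and write-back differ, assuming an in-memory dict as the cache and a hypothetical db_write helper for the durable store (write-around simply skips the cache and writes to the database directly):

```python
cache = {}
db = {}
dirty = set()

def db_write(key, value):
    # Hypothetical placeholder for a durable (and slower) database write.
    db[key] = value

def write_through(key, value):
    # Update cache and database together: reads stay fresh, writes are slower.
    cache[key] = value
    db_write(key, value)

def write_back(key, value):
    # Update only the cache and mark the entry dirty; the database is updated
    # later (e.g. on eviction or a periodic flush). Fast writes, risk of loss.
    cache[key] = value
    dirty.add(key)

def flush():
    # Persist all dirty entries to the database.
    for key in dirty:
        db_write(key, cache[key])
    dirty.clear()
```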
Cache Replacement Policies (an LRU sketch follows this list):
1. LRU
2. LFU
3. Segmented LRU
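A minimal LRU sketch using Python's collections.OrderedDict; the capacity is an arbitrary assumption, and real caches such as Redis or Memcached implement their own eviction:

```python
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity=3):
        self.capacity = capacity
        self.items = OrderedDict()

    def get(self, key):
        if key not in self.items:
            return None                      # miss
        self.items.move_to_end(key)          # mark as most recently used
        return self.items[key]

    def put(self, key, value):
        if key in self.items:
            self.items.move_to_end(key)
        self.items[key] = value
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)   # evict the least recently used entry
```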
00:00 What is a cache?
00:20 Caching use cases
03:42 Caching limitations
06:33 Drawbacks
09:42 Cache Placement
Caching resources listed together (click on the resources tab):
https://interviewready.io/learn/system-design-course/caches-deep-dive/caching-basics
You can follow me on:
Github: https://github.com/InterviewReady/system-design-resources
Instagram: https://www.instagram.com/interviewready_/
LinkedIn: https://www.linkedin.com/company/interview-ready/
Twitter: https://twitter.com/gkcs_
#Caching #DistributedSystems #SystemDesign