Introduction to Caching — a System Design Primer: Part 1

Exploring how and when to use Caching in your systems

Yash Prakash
Jun 02, 2023

Photo by John Arano on Unsplash

System design is a crucial aspect of software engineering, but it can be challenging because of the dense terminology used in the system design resources you typically find online.

To design effective systems, it's essential to understand the available tools and the specific problems each of them solves.

By familiarizing yourself with the basic concepts and terminology of system design, you can improve your skills substantially.

This article focuses on an important topic in system design: caching. 

Caching is a technique used to enhance system performance and user experience. Here, we'll be looking into the properties of caching and common caching techniques in detail.

Let’s dive in 👇

A cache and some related definitions

A cache is a key component of any medium-to-large system: it holds a limited amount of data that can be retrieved in a fraction of the time it would take to fetch it from the database. Caches typically keep the most recently accessed data and serve it again and again, minimizing repeated trips to the database.
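As a minimal illustration of this idea, here's a sketch of the common cache-aside read path. The in-memory dict and `fetch_user_from_db` are hypothetical stand-ins for a real cache store and database call, not part of any specific library:

```python
# Minimal cache-aside sketch: check the cache first, fall back to the
# database on a miss, then populate the cache for subsequent reads.

cache: dict[str, dict] = {}  # in-memory cache keyed by user id


def fetch_user_from_db(user_id: str) -> dict:
    # Placeholder for a slow database query.
    return {"id": user_id, "name": "example"}


def get_user(user_id: str) -> dict:
    user = cache.get(user_id)           # 1. try the cache
    if user is None:                    # 2. cache miss: go to the database
        user = fetch_user_from_db(user_id)
        cache[user_id] = user           # 3. store it for the next request
    return user
```

The first call for a given id pays the database cost; every later call for the same id is served straight from memory.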

A cache is essential for improving two key performance characteristics of a system: its latency and its throughput.

Latency is the time delay between a request and a response, while throughput is the amount of data that can be transmitted over a system in a given amount of time. 

Latency measures the time taken to complete a single task, whereas throughput measures the number of tasks completed in a given time. 

Low latency is desirable in systems that require fast response times, while high throughput is desirable in systems that require high data transfer rates.

A well-designed cache helps a system achieve both low latency and high throughput.
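For a rough sense of the numbers, here's a small sketch comparing a cached read with an uncached one and deriving the throughput each allows. The millisecond figures are assumptions for illustration, not measurements from any real system:

```python
import time

# Assumed service times; real values depend entirely on your system.
DB_READ_LATENCY_S = 0.010     # ~10 ms per uncached database read
CACHE_READ_LATENCY_S = 0.001  # ~1 ms per cache hit


def simulate_read(latency_s: float) -> None:
    time.sleep(latency_s)  # stand-in for the actual I/O


for name, latency in [("database", DB_READ_LATENCY_S), ("cache", CACHE_READ_LATENCY_S)]:
    start = time.perf_counter()
    for _ in range(100):
        simulate_read(latency)
    elapsed = time.perf_counter() - start
    # Throughput = completed requests / elapsed time.
    print(f"{name}: latency ~{latency * 1000:.0f} ms, "
          f"throughput ~{100 / elapsed:.0f} requests/s")
```

With these assumed numbers, serving reads from the cache cuts latency by roughly an order of magnitude and raises the number of requests a single worker can handle per second by about the same factor.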

Some other benefits of caching may include: 

1. Avoiding Redundant Computation: Caching lets you avoid redoing expensive computations over and over, which saves both time and compute (see the sketch below).
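One common way to realise this in application code is memoization. Below is a small sketch using Python's `functools.lru_cache`; the `expensive_computation` function is just an illustrative stand-in for any costly, deterministic calculation:

```python
from functools import lru_cache


@lru_cache(maxsize=128)  # keep up to 128 recent results in memory
def expensive_computation(n: int) -> int:
    # Stand-in for any costly, deterministic calculation.
    return sum(i * i for i in range(n))


expensive_computation(1_000_000)  # computed once...
expensive_computation(1_000_000)  # ...served instantly from the cache
```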
