The Three Cs: Concatenate, Compress, Cache

CSS Wizardry

Caching them at the other end: how long should we cache files on a user’s device? This is the easy one. Caching is something I’ve been a little obsessed with lately, but for static assets, as we’re discussing today, we don’t need to know much other than: cache everything as aggressively as possible.
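
As a concrete illustration of caching as aggressively as possible, here is a minimal sketch in Go that serves fingerprinted static assets with a one-year, immutable Cache-Control header. The directory, route, and port are assumptions for illustration, not details from the article.

```go
package main

import (
	"log"
	"net/http"
)

// cacheForever marks responses as immutable for one year, roughly the most
// aggressive policy browsers honour, so the asset is never revalidated.
func cacheForever(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Cache-Control", "public, max-age=31536000, immutable")
		next.ServeHTTP(w, r)
	})
}

func main() {
	// Assumption: files under ./static carry a content hash in their names
	// (e.g. app.3f2a1c.js), so a changed file always gets a new URL.
	fs := http.FileServer(http.Dir("./static"))
	http.Handle("/static/", http.StripPrefix("/static/", cacheForever(fs)))
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```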

Designing Instagram

High Scalability

Design a photo-sharing platform similar to Instagram where users can upload their photos and share them with their followers. The article walks through the problem statement, high-level design, component design, architecture, data models, and API design, including the API for posting an image.
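
Below is a rough sketch of what a "post a photo" endpoint could look like; the route, field names, and response shape are hypothetical and only meant to make the API-design step concrete.

```go
package main

import (
	"encoding/json"
	"net/http"
)

// Hypothetical request/response shapes for "post a photo". The route and
// field names are illustrative assumptions, not the article's actual API.
type createPostRequest struct {
	UserID   string `json:"user_id"`
	Caption  string `json:"caption"`
	ImageURL string `json:"image_url"` // assume the image bytes were uploaded to blob storage first
}

type createPostResponse struct {
	PostID string `json:"post_id"`
}

func createPost(w http.ResponseWriter, r *http.Request) {
	var req createPostRequest
	if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
		http.Error(w, "bad request", http.StatusBadRequest)
		return
	}
	// In a real system: persist the post metadata, then fan the post out to
	// the followers' feeds asynchronously.
	w.Header().Set("Content-Type", "application/json")
	json.NewEncoder(w).Encode(createPostResponse{PostID: "post-123"})
}

func main() {
	http.HandleFunc("/v1/posts", createPost)
	http.ListenAndServe(":8080", nil)
}
```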

Design Of A Modern Cache—Part Deux

High Scalability

The previous article described the caching algorithms used by Caffeine, in particular the eviction and concurrency models. Its admission policy quickly discards new arrivals that are unlikely to be used again, guarding the main region from cache pollution.
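
To make the admission idea concrete, here is a toy sketch in the spirit of TinyLFU, not Caffeine's actual code: a new arrival is admitted to the main region only if it appears at least as popular as the entry it would evict.

```go
package cache

// admissionFilter is a toy, frequency-based admission policy: a candidate only
// displaces an eviction victim if it has been seen at least as often. Exact
// counts are used for brevity; Caffeine keeps a compact frequency sketch instead.
type admissionFilter struct {
	freq map[string]int
}

func newAdmissionFilter() *admissionFilter {
	return &admissionFilter{freq: make(map[string]int)}
}

// record bumps the popularity estimate on every access, hit or miss.
func (f *admissionFilter) record(key string) { f.freq[key]++ }

// admit reports whether the new arrival should replace the chosen victim.
// Rejecting rarely seen arrivals is what guards the main region from pollution.
func (f *admissionFilter) admit(candidate, victim string) bool {
	return f.freq[candidate] >= f.freq[victim]
}
```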

Architectural Insights: Designing Efficient Multi-Layered Caching With Instagram Example

DZone

Caching is a critical technique for optimizing application performance by temporarily storing frequently accessed data, allowing for faster retrieval during subsequent requests. Multi-layered caching involves using multiple levels of cache to store and retrieve data.
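
A minimal sketch of the multi-layered idea: a small in-process map (L1) in front of a slower shared cache layer (L2) in front of the database. The layer split and function signatures are assumptions for illustration.

```go
package cache

// MultiLevelCache reads through two cache layers before touching the database.
// Eviction, TTLs, and locking are omitted to keep the sketch short.
type MultiLevelCache struct {
	l1     map[string][]byte               // in-process: fastest, smallest, per instance
	l2Get  func(key string) ([]byte, bool) // shared layer, e.g. a remote cache cluster
	l2Set  func(key string, val []byte)
	loadDB func(key string) []byte         // source of truth
}

// Get returns the value for key, backfilling the faster layers on a miss.
func (c *MultiLevelCache) Get(key string) []byte {
	if v, ok := c.l1[key]; ok {
		return v // L1 hit: no network round trip at all
	}
	if v, ok := c.l2Get(key); ok {
		c.l1[key] = v // promote to L1 for subsequent requests
		return v
	}
	v := c.loadDB(key) // miss everywhere: query the database and backfill
	c.l2Set(key, v)
	c.l1[key] = v
	return v
}
```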

Consistent caching mechanism in Titus Gateway

The Netflix TechBlog

We introduce a caching mechanism in the API gateway layer, allowing us to offload processing from singleton, leader-elected controllers without giving up the strict data-consistency guarantees clients observe. The cache is kept in sync with the current leader process. How do I know that my cache is up to date?
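
One common way to answer that question is to have the leader stamp every update with a monotonically increasing version and to serve a read from the cache only once it has caught up to the required version. The sketch below illustrates that idea; it is not the Titus Gateway implementation.

```go
package cache

import "sync"

// versionedCache answers "is my cache up to date?" with a version number: the
// leader stamps every update with a monotonically increasing version, and a
// read is served from the cache only once it has caught up to the version the
// caller requires.
type versionedCache struct {
	mu      sync.Mutex
	version int64 // last update applied from the leader's event stream
	data    map[string]string
}

// apply folds one update from the leader into the local cache.
func (c *versionedCache) apply(version int64, key, val string) {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.data[key] = val
	c.version = version
}

// getAtLeast serves a read only if every update up to minVersion has been seen;
// otherwise the caller should wait for the cache to catch up or ask the leader.
func (c *versionedCache) getAtLeast(key string, minVersion int64) (string, bool) {
	c.mu.Lock()
	defer c.mu.Unlock()
	if c.version < minVersion {
		return "", false
	}
	v, ok := c.data[key]
	return v, ok
}
```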

Seeing through hardware counters: a journey to threefold performance increase

The Netflix TechBlog

We also see much higher L1 cache activity combined with a 4x higher count of MACHINE_CLEARS: a usage pattern known as false sharing, which occurs when two cores read from and write to unrelated variables that happen to share the same L1 cache line. A cache line is a concept similar to a memory page.
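
The usual remedy for false sharing is to pad independently written hot variables so each one gets its own cache line. A minimal Go sketch, assuming 64-byte cache lines (the size varies by CPU):

```go
package main

import (
	"sync"
	"sync/atomic"
)

// paddedCounter pads each counter out to its own 64-byte cache line, so writes
// from one core no longer invalidate the line the other core is using.
type paddedCounter struct {
	n uint64
	_ [56]byte // 8-byte counter + 56 bytes of padding = one 64-byte line
}

func main() {
	var counters [2]paddedCounter
	var wg sync.WaitGroup
	for i := 0; i < 2; i++ {
		wg.Add(1)
		go func(i int) {
			defer wg.Done()
			// Each goroutine hammers its own, unrelated counter.
			for j := 0; j < 1_000_000; j++ {
				atomic.AddUint64(&counters[i].n, 1)
			}
		}(i)
	}
	wg.Wait()
}
```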

Fast memcpy, A System Design

ACM Sigarch

We look here at a Gedankenexperiment: move 16 bytes per cycle, addressing not just the CPU movement but also the surrounding system design. A lesser design cannot possibly move 16 bytes per cycle. This base design can map easily onto many current chips. Cache pollution is addressed in a section below.
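
To make the unit of work concrete, here is a sketch of a copy loop that moves 16 bytes per iteration as two 8-byte words; it makes no claim about the cycles any particular CPU needs for it.

```go
package fastcopy

import "encoding/binary"

// copy16 moves data 16 bytes per loop iteration: two 8-byte loads followed by
// two 8-byte stores. It only illustrates the granularity; whether this sustains
// 16 bytes per cycle depends on the load/store ports, the cache hierarchy, and
// the compiler. Tails shorter than 16 bytes are ignored for brevity.
func copy16(dst, src []byte) {
	n := len(src)
	if len(dst) < n {
		n = len(dst)
	}
	for i := 0; i+16 <= n; i += 16 {
		a := binary.LittleEndian.Uint64(src[i:])
		b := binary.LittleEndian.Uint64(src[i+8:])
		binary.LittleEndian.PutUint64(dst[i:], a)
		binary.LittleEndian.PutUint64(dst[i+8:], b)
	}
}
```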
