
Netflix’s Distributed Counter Abstraction

The Netflix TechBlog

By: Rajiv Shringi, Oleksii Tkachuk, Kartik Sathyanarayanan. In our previous blog post, we introduced Netflix’s TimeSeries Abstraction, a distributed service designed to store and query large volumes of temporal event data with low millisecond latencies. Today, we’re excited to present the Distributed Counter Abstraction.
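For orientation, here is a hypothetical counter-client sketch in Python. The class, method names, and in-memory store are invented for illustration and are not the API from the Netflix post; they only suggest the shape of operations a distributed counter abstraction typically exposes (namespaced increments and reads).

```python
# Hypothetical sketch only: not the actual Netflix Distributed Counter API.
# A distributed counter abstraction typically exposes namespaced increment and
# read operations; a real service would aggregate increments as events and
# serve reads with eventual or best-effort consistency.
class CounterClient:
    def __init__(self):
        self._counts = {}  # stand-in for a distributed backing store

    def add_count(self, namespace: str, counter: str, delta: int = 1) -> None:
        key = (namespace, counter)
        self._counts[key] = self._counts.get(key, 0) + delta

    def get_count(self, namespace: str, counter: str) -> int:
        return self._counts.get((namespace, counter), 0)


client = CounterClient()
client.add_count("user_engagement", "video_plays")
print(client.get_count("user_engagement", "video_plays"))  # 1
```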


Code-level observability for Flutter apps drives great user experience

Dynatrace

When Davis detects deviations from this baseline (for example, a sudden dip in usage or a user action that lasts longer than expected), it generates a problem event, identifies the root cause of the problem, and sends notifications based on the configured alerting profile. User actions in Dynatrace are more than just simple events.
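As a rough illustration of baseline-deviation detection (a generic sketch, not Dynatrace’s Davis algorithm; the threshold and sample data are made up):

```python
# Generic sketch of baseline-deviation detection (not Dynatrace's Davis algorithm).
# Flag a value as anomalous when it falls more than `threshold` standard
# deviations away from the mean of the learned baseline.
from statistics import mean, stdev

def deviates_from_baseline(history, latest, threshold=3.0):
    baseline = mean(history)
    spread = stdev(history)
    if spread == 0:
        return latest != baseline
    return abs(latest - baseline) / spread > threshold

# Example: a user action that suddenly lasts much longer than its baseline (ms).
durations = [120, 130, 118, 125, 122, 128, 119, 124]
print(deviates_from_baseline(durations, 310))  # True -> would raise a problem event
```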


Introducing Configurable Metaflow

The Netflix TechBlog

A natural solution is to make flows configurable using configuration files, so variants can be defined without changing the code. Unlike parameters, configs can be used more widely in your flow code; in particular, they can be used in step- or flow-level decorators as well as to set defaults for parameters.
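A minimal sketch of that idea, assuming Metaflow’s Config object as described in the post (the config file name and its fields are invented here; consult the Metaflow documentation for exact API details):

```python
# Minimal sketch assuming Metaflow's Config object; the file "train_config.json"
# and its fields (alpha, cpu, memory) are invented for illustration.
from metaflow import FlowSpec, Config, Parameter, resources, step

class TrainFlow(FlowSpec):
    # Resolved before the run starts, so its values are usable in decorators below.
    config = Config("config", default="train_config.json")

    # Unlike a plain Parameter, a config can also supply parameter defaults.
    alpha = Parameter("alpha", default=config.alpha)

    @resources(cpu=config.cpu, memory=config.memory)  # config in a step-level decorator
    @step
    def start(self):
        print(f"training with alpha={self.alpha}")
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == "__main__":
    TrainFlow()
```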


AWS serverless services: Exploring your options

Dynatrace

Amazon compute solutions are designed to streamline resource provisioning and container management with two services: AWS Lambda: Lambda provides serverless compute infrastructure that lets you run code in response to predetermined events or conditions and automatically manages all compute resources required for these processes.
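For example, a minimal Python Lambda handler looks like the following (the event shape depends on whichever trigger you configure; the return value here assumes an API Gateway style response):

```python
# Minimal AWS Lambda handler in Python. AWS invokes this function when the
# configured event source fires (e.g., an S3 upload or an API Gateway request)
# and provisions the underlying compute automatically.
import json

def lambda_handler(event, context):
    # `event` carries the triggering payload; `context` exposes runtime metadata.
    print("Received event:", json.dumps(event))
    return {
        "statusCode": 200,
        "body": json.dumps({"message": "processed"}),
    }
```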


Sustainable IT: Optimize your hybrid-cloud carbon footprint

Dynatrace

Evaluating these on three levels—data center, host, and application architecture (plus code)—is helpful. After identifying about 100 idle host instances to be shut down, they learned that these hosts were provisioned in anticipation of upscaling to support an upcoming major sales event. Reduce inter-process communications overhead.


Performance Game Changer: Browser Back/Forward Cache

Smashing Magazine

With that caveat out of the way, let’s get to the guts of the article: What is the Back/Forward Cache and why does it matter so much? Didn’t The HTTP Cache Do All That Anyway? Barry Pollard.


Single-core memory bandwidth: Latency, Bandwidth, and Concurrency

John McCalpin

Sustainable memory bandwidth using multi-threaded code has closely followed the peak DRAM bandwidth, typically delivering best case throughput of 75%-85% of the peak DRAM bandwidth in each generation. GB/s peak DRAM bandwidth, requiring 6 concurrent 64-byte cache line accesses to be pending at all times to maintain full bandwidth.
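The concurrency requirement follows from Little’s Law: the data in flight equals bandwidth times latency. A back-of-the-envelope sketch (the 60 ns latency and 25 GB/s target below are assumed values for illustration, not figures from the article):

```python
# Little's Law for memory traffic: bytes_in_flight = bandwidth * latency.
# The latency and target-bandwidth values are assumed for illustration only.
LINE_BYTES = 64
latency_s = 60e-9  # assumed memory latency: 60 ns

# Bandwidth sustainable with 6 outstanding 64-byte cache-line misses:
outstanding = 6
bandwidth = outstanding * LINE_BYTES / latency_s
print(f"{bandwidth / 1e9:.1f} GB/s with {outstanding} lines in flight")  # ~6.4 GB/s

# Conversely, the concurrency needed to sustain a given peak bandwidth:
target_bw = 25e9  # assumed peak DRAM bandwidth: 25 GB/s
needed = target_bw * latency_s / LINE_BYTES
print(f"~{needed:.0f} lines in flight needed for {target_bw / 1e9:.0f} GB/s")  # ~23
```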
