geeogi

Running applications at the edge on Cloudflare, May 2026

New software frameworks can be easier to reason about, make hiring easier, or have a more developed ecosystem, but they rarely bring a pure technical advantage over their predecessors - many great apps still run on Java and jQuery from 10+ years ago. As an engineer you can learn a lot by studying older, bare-metal approaches to programming.

That said, web architecture has evolved massively over the past 10 years and the gains are real. Edge servers can dramatically reduce web latency compared with the traditional origin-server model (e.g. EC2, on-prem) - from 600ms+ down to reliably under 250ms for dynamic page loads.

Edge computing is often misunderstood as just a frontend design choice. It is now a full system of queues, serverless compute (Cloudflare Workers) and cache storage (Cloudflare KV) that can bring your deepest application logic and records within a 250ms round trip of your users.
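The push side of that system can be sketched as a Queues consumer that writes records into KV. This is a hedged sketch, not a definitive implementation: the `KVNamespace` and `MessageBatch` shapes below are minimal local mocks of the Workers runtime types so the example is self-contained, and the key scheme (`record:<id>`) is an assumption.

```typescript
// Sketch of a Cloudflare Queues consumer that pushes records into edge KV.
// KVNamespace and MessageBatch are minimal mocks of the Workers runtime
// types so this runs anywhere; in a real Worker they come from the runtime.

interface KVNamespace {
  put(key: string, value: string, opts?: { expirationTtl?: number }): Promise<void>;
  get(key: string): Promise<string | null>;
}

interface Message<T> { body: T; ack(): void; }
interface MessageBatch<T> { messages: Message<T>[]; }

// Illustrative message shape for a record update.
type RecordUpdate = { id: string; payload: unknown };

// The consumer: write each updated record into KV so edge reads stay local.
async function consume(batch: MessageBatch<RecordUpdate>, records: KVNamespace): Promise<void> {
  for (const msg of batch.messages) {
    await records.put(`record:${msg.body.id}`, JSON.stringify(msg.body.payload), {
      expirationTtl: 60, // stale copies age out on their own (illustrative TTL)
    });
    msg.ack(); // tell Queues this message is done
  }
}

// Map-backed mock KV for local experimentation.
function mockKV(): KVNamespace & { store: Map<string, string> } {
  const store = new Map<string, string>();
  return {
    store,
    async put(key, value) { store.set(key, value); },
    async get(key) { return store.get(key) ?? null; },
  };
}
```

In a deployed Worker the same loop would live in the `async queue(batch, env)` handler on the default export, with the KV namespace bound in `wrangler.toml` - binding names here are assumptions.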

A modern application can use the Cloudflare stack (queues, compute, DB, cache) to push records to the edge, then use an SSR application to transform cached content into HTML and JS pages for user devices. This gets you close to the theoretically optimal latency for web users - a static file served by a CDN - and is much faster than having your frontend wait on page load for critical data from a backend origin server.
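The read side - cached record in, rendered HTML out - can be sketched like this. It is a minimal sketch under assumptions: one KV key per record, a trivial string-template renderer standing in for a real SSR framework, and `KVNamespace` mocked locally so the example runs on its own.

```typescript
// Sketch of the edge read path: turn a cached KV record into an HTML page
// without any origin round trip on a cache hit. KVNamespace is a minimal
// mock of the Workers runtime type so the example is self-contained.

interface KVNamespace {
  get(key: string): Promise<string | null>;
}

// Trivial SSR stand-in: render a cached JSON record as an HTML document.
function renderPage(title: string, body: string): string {
  return `<!doctype html><html><head><title>${title}</title></head>` +
         `<body><h1>${title}</h1><p>${body}</p></body></html>`;
}

// Handler: serve straight from the edge cache; fall back to 404 on a miss
// (a real app might fall back to the origin instead).
async function handleRequest(
  records: KVNamespace,
  recordId: string,
): Promise<{ status: number; html: string }> {
  const raw = await records.get(`record:${recordId}`);
  if (raw === null) {
    return { status: 404, html: renderPage("Not found", "No cached record.") };
  }
  const record = JSON.parse(raw) as { title: string; body: string };
  return { status: 200, html: renderPage(record.title, record.body) };
}
```

In a deployed Worker this logic would sit inside the `fetch` handler, with the record id parsed from the request URL; the shape of the response object here is illustrative.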

The minimum TTL of edge storage (e.g. Cloudflare's KV storage) is typically 30-60s, which means any application data that can tolerate being a few minutes stale on page load can be served to your users within 300ms - you just have to push it into the edge.
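The staleness bound this implies can be made explicit with a small back-of-envelope helper (illustrative, not a Cloudflare API): a record written with a TTL of T seconds and republished every R seconds is at most roughly R seconds stale while the publisher is healthy, and if publishing stops, readers never see data older than T before the key expires.

```typescript
// Back-of-envelope staleness bound for edge-cached data (illustrative
// helper, not a Cloudflare API). `ttl` is the KV expirationTtl in seconds;
// `refreshInterval` is how often the publisher rewrites the key.
//  - While the publisher is healthy, a reader sees data at most
//    ~refreshInterval seconds old (capped by ttl, since older copies expire).
//  - If the publisher stops, the key expires within ttl seconds, so readers
//    never see data older than ttl.
function maxStalenessSeconds(
  ttl: number,
  refreshInterval: number,
  publisherHealthy: boolean,
): number {
  return publisherHealthy ? Math.min(refreshInterval, ttl) : ttl;
}

// Example: a 60s TTL with a 60s republish loop means pages serve data
// that is at most about a minute old.
const worstCase = maxStalenessSeconds(60, 60, true); // 60
```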

George