Back to the server

Jim Nielsen wrote a short piece the other day about the shift of compute from clients back to servers over the course of the last year or so, and specifically on the new imbalance between the time it takes to load an application vs. the time it takes to execute. See, for a while, the loading part took the longest—you'd sometimes have to wait on the order of seconds before an application could be prepared by the server, and once it arrived, your client (the browser, usually) would execute the app (that is to say, render it out). Think of classical PHP-based CMSes like WordPress—that sort of thing.

But in recent years, we've started bundling the app up and sending the whole thing over to the browser to execute there. Servers handling SPA-style React apps have to do very little in this scenario—the whole app execution is distributed to all of the application's users. And this worked great! Everyone's personal fans spun up a little bit, but those numberless racks of silicon over at AWS all got to cool off.

Nowadays, though, the balance is starting to tip in the other direction: networks are speeding up, and single-node servers are being replaced by something called edge compute (which is, effectively, technical jargon for "a server near you") and aggressive caching strategies. Meanwhile, client-side bundle sizes are ballooning to Absolute Unit-esque proportions and CPU clock speed is stagnating (M1 notwithstanding). Jim quotes Igor Minar:

With networks becoming faster, client-side code execution is becoming the new bottleneck. This is especially true for low-end and mid-range mobile devices.

The difference nowadays, as Igor points out, is that access to that edge compute has increased to the point of being effectively free for the 80% use case. Most modern hosts—especially "serverless" ones like Vercel, Netlify, or Cloudflare—will cache your stuff as close to users as possible without you even asking. Which means that the server isn't the bottleneck anymore.
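
To make that a little more concrete: on hosts like Vercel or Netlify, a serverless function can opt into that edge caching simply by setting a Cache-Control header the CDN understands. The sketch below is a generic Next.js API route, not this site's actual code; the route name and payload are made up for illustration.

```ts
// pages/api/posts.ts: a sketch of CDN-friendly caching (hypothetical route and payload)
import type { NextApiRequest, NextApiResponse } from 'next';

export default function handler(req: NextApiRequest, res: NextApiResponse) {
  // s-maxage tells the CDN/edge cache to keep this response for an hour;
  // stale-while-revalidate lets it keep serving the stale copy while it refetches,
  // so almost no requests ever reach the origin server.
  res.setHeader('Cache-Control', 's-maxage=3600, stale-while-revalidate=86400');
  res.status(200).json({ posts: [] }); // placeholder payload
}
```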

Heck, the API that powers this website is run by a single £6 DigitalOcean droplet running, simultaneously, a dozen or so other applications. But with some judicious caching from Cloudflare and Next.js's incremental static regeneration, that API only serves a few dozen requests every day—instead of the doubtless thousands of requests that this site must get every day—nay, hour—nay, minute!
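
For reference, the incremental static regeneration part boils down to a single revalidate field returned from getStaticProps. Here's a minimal sketch of the pattern, with a hypothetical API URL and data shape standing in for whatever actually sits behind this site:

```ts
// pages/posts/[slug].tsx: a generic ISR sketch (hypothetical API URL and data shape)
import type { GetStaticPaths, GetStaticProps } from 'next';

type Post = { slug: string; title: string; body: string };

export const getStaticPaths: GetStaticPaths = async () => ({
  paths: [],            // build nothing up front...
  fallback: 'blocking', // ...render each page on its first request, then cache it
});

export const getStaticProps: GetStaticProps<{ post: Post }> = async ({ params }) => {
  // Hypothetical endpoint standing in for the real API behind this site.
  const res = await fetch(`https://api.example.com/posts/${params?.slug}`);
  const post: Post = await res.json();

  return {
    props: { post },
    revalidate: 3600, // at most one origin request per page per hour
  };
};

export default function PostPage({ post }: { post: Post }) {
  return (
    <article>
      <h1>{post.title}</h1>
      <p>{post.body}</p>
    </article>
  );
}
```

With a setup along those lines, the CDN and the statically regenerated pages absorb nearly all of the traffic, which is how a tiny droplet can sit behind a site that gets orders of magnitude more requests than it ever sees.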
