Browsers do not scale. Servers do.

It was 1995 when I discovered the internet at Utrecht University. I studied quantitative analysis for sociology, which meant generating pages of statistical data to support the empirical analysis of some hypothesis in my report. There was one matrix printer that dozens of students had to share, so printing could take a while. While I waited, I surfed the web in Netscape and quickly learned that slow internet is worse than no internet.

More than 20 years later, I have a mobile phone with more computing power than all those university computers combined. It has a selection of browsers, each of which eats more RAM than the size of a then-large local hard drive. You could say devices and browsers have scaled up significantly over the past 20 years. And yet, when I’m on my way to work, more often than not I find myself struggling to get a good 4G or even 3G connection. More often than not, I find that slow internet is still worse than no internet.

Any decent web developer these days should be able to explain the necessity of progressive enhancement. At its core, it’s about designing core functionality to work on the largest possible common denominator of browsers, then layering on less vital features depending on device and browser capability.

My colleague Syb Wartna wrote an article about progressive enhancement and its relation to slow internet in this day and age.

The story doesn’t end here.

When I mentioned ‘less vital features’ above, the first thing that came to mind was JavaScript. More specifically: JavaScript running in the browser – before the Node.js community throws a fit. A decent responsive, progressively enhanced site using properly sized images and a CDN can load in under a second on a decent connection. Once the content is in the browser, however, there’s still JavaScript.

There’s more to writing JavaScript than just making it cross-browser compatible: minify it, version it, aggregate it, etc. Do anything to minimize the number and size of HTTP requests and responses, and test thoroughly to ensure an error in one browser doesn’t break your script and leave other scripts crippled. If you’re using AJAX or a frontend framework, this should be at the top of your list of concerns – I’m looking at you, Angular. And there’s more to site delivery than writing properly tested code, too.
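One defensive pattern that follows from this – a minimal sketch, not tied to any particular framework, with illustrative function names – is to run each independent script’s initialization in isolation, so that one failing third-party widget cannot cripple the rest:

```javascript
// Run each independent init function in isolation, so an error in one
// (say, a misbehaving third-party widget) cannot prevent the others
// from executing. The init functions here are hypothetical placeholders
// for whatever widgets a real page would initialize.
function runIsolated(inits, onError) {
  const results = [];
  for (const init of inits) {
    try {
      results.push({ ok: true, value: init() });
    } catch (err) {
      // Report the failure, but keep going: the remaining scripts still run.
      if (onError) onError(err);
      results.push({ ok: false, error: err });
    }
  }
  return results;
}
```

The same idea applies at load time: an unhandled exception at the top level of one bundled script can stop everything concatenated after it, which is exactly the failure mode aggregation makes worse.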

Post-delivery code as content

To this day, many content management systems operating as site generation systems offer one way or another for a content editor to add HTML and JavaScript to their website. This is a freedom that’s easy to explain and hard to contain; the subject deserves a blog post in its own right. What is important to realize, though, is that while web designers, developers and product owners may have gone to great lengths to ensure they've deployed a thoroughly tested site, they don’t have the final say in what ends up in the visitors’ browsers. That right belongs to the content editors who place custom scripts. And let's not forget scripts for tracking, A/B testing, heatmap scanning, performance monitoring, feedback inquiries, discussion forums, etc.

  • This is where many websites abandon decorum and do all that the cautious frontend developer worked hard to avoid.
  • This is where the websites are spawned that load in 2 seconds and then spend another 6 waiting for various scripts to load and execute.
  • These are the sites where flash-of-unstyled-content (FOUC) transgressions are so foul you suspect intent when you click an ad sitting where a link just was, because the content was still jumping around 3 seconds after it first appeared in your browser.

Many vendors of website tracking software start and end with JavaScript: load the site, load the scripts, track specific behaviour, fire events when triggered by the visitor, and load personalized content. The information driving this behaviour is passed between browser and server using JavaScript. And all of this is often decided after the software has marched through the last quality gate of the delivery pipeline.

The browser doesn’t scale

Everyone’s going to the cloud, be it public, private or hybrid. The premise is: your servers scale to your needs with so little hassle that, as long as you make sure your online strategy works, any cost for a bigger server park is compensated by an increase in conversion.

Now here’s the thing: your servers scale. Your visitors’ browsers do not. What I mean to say is this: Anything you want to show your visitor, you should try to generate on your server on the first load, where you have the capacity to scale.

You may be using fire-and-forget triggers to profile a visitor on your site, and that’s typically something that happens clientside; JavaScript is an obvious choice. But once you’ve got him in a segment and want to proceed to show targeted content, why resort to JavaScript to load that content if you can avoid it? In most cases it’s something you can show in the response to the next regular page request, and thus generate serverside. If you want to show a forum, generate it along with the rest of your page. That there’s software that lets you load a forum with a clientside call after the page has loaded doesn’t mean it’s a good idea to do so. Social media buttons? Do you really need more than 2, or 1, or 0? Generate them yourself serverside – no need for an extra JavaScript call. Don’t use AddThis, for this reason if not for the sake of privacy (see the case of Dutch GPs’ websites leaking profiled medical data to third parties; article in Dutch).
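Generating share buttons serverside can be as simple as emitting plain links. A sketch of the idea – the intent/sharer endpoints below are the well-known public share URLs, but verify them against the vendors’ current documentation before relying on them:

```javascript
// Build plain share links on the server instead of loading a third-party
// widget script (like AddThis) in the visitor's browser. No extra HTTP
// request, no script execution, no data leaked to a third party on page load.
function shareLinks(pageUrl, title) {
  const u = encodeURIComponent(pageUrl);
  const t = encodeURIComponent(title);
  return {
    twitter: `https://twitter.com/intent/tweet?url=${u}&text=${t}`,
    facebook: `https://www.facebook.com/sharer/sharer.php?u=${u}`,
  };
}
```

Render these as ordinary `<a>` elements in your page template and the buttons work with zero clientside JavaScript – the progressive-enhancement baseline.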

I’m also saying: reconsider your tracking scripts. Do you really need 8? Do you realize you may be missing behaviour because your tracking script hadn't finished loading before your visitor clicked on a link?
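That missed-clicks problem is why many analytics vendors ship a tiny inline stub that buffers calls until the real tracker script arrives. A minimal sketch of that queue pattern – the names are illustrative, not a real vendor API:

```javascript
// A stub that queues tracking calls made before the heavyweight tracker
// script has finished loading; once it arrives, the queue is flushed in
// order. Sketch of the pattern only; names are illustrative.
function createTrackerStub() {
  const queue = [];
  let real = null;
  return {
    track(...args) {
      if (real) real(...args);
      else queue.push(args); // buffer until the tracker arrives
    },
    onTrackerLoaded(fn) {
      real = fn;
      while (queue.length) real(...queue.shift()); // replay buffered events
    },
    pending() { return queue.length; },
  };
}
```

The stub is a few hundred bytes inlined in the page head, so events fired in the first seconds are captured even though the tracker itself loads late – or, better, it prompts the question of whether you need all eight trackers at all.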

Browsers do not scale. Servers do.

If all this means that you need a faster backend, then scale up – that’s what the cloud is for! Odds are that a potential customer will sooner leave your site for a competitor’s than buy a new device or wait until he's home on his trusted cable internet desktop PC to visit your site.