Build highly performant websites like Google/Netflix

M Adnan A
Dec 7, 2019 · 10 min read

I remember writing an article on web application performance about 10 years ago, even before the jQuery era. It mainly emphasized the importance of concatenation, compression, minification, caching, and saving round trips (RTTs). Most of that can be accomplished easily with the right framework nowadays. But a web page is not just a document anymore; it is a full-fledged application on which businesses rely. Even a fraction of a second makes a significant difference in loading and rendering, so every second counts. To be more accurate, every hundred milliseconds counts now.

Nowadays, everybody is rushing to adopt some framework, thinking that is the only thing required to build highly performant web applications. To a certain extent it is important, but it is not the only thing required. Whether you use Angular, React, Vue.js, or even VanillaJS, unless you optimize your website using best practices and guidelines, it is going to be heavy and sluggish. Until you have native-application-like performance, users are not going to stick with your application.

I don't think it's a mystery that, most of the time, the main performance-blocking elements are heavy images and JavaScript files, so let's talk about them first.

Lazy Loading — assets

Chrome 75+ came up with an easy way to enable native lazy loading for images and iframes, so you can simply do the following:

<img src="celebration.jpg" loading="lazy" alt="..." /> <iframe src="video-player.html" loading="lazy"></iframe>

The loading attribute accepts three values: lazy, eager, and auto. auto leaves it to the browser to decide whether to load eagerly (right away) or lazily.

But don't confuse lazy loading with deferred loading; it is more like loading on demand. As you keep scrolling, images keep loading, so the initial load is lighter.

As it's not adopted by every browser yet and you need this to work across all browsers, you can simply use the lazySizes library as a fallback: https://github.com/aFarkas/lazysizes

<img data-src="image.jpg" class="lazyload" />
<iframe frameborder="0" class="lazyload" allowfullscreen="" data-src="//www.youtube.com/embed/"></iframe>
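To combine the two, you can feature-detect native lazy loading and only pull in lazySizes when it is missing. A minimal sketch of that idea (the CDN URL and the .lazyload class are assumptions; adjust them to your setup):

// Feature-detect native lazy loading; fall back to lazySizes otherwise.
if ('loading' in HTMLImageElement.prototype) {
  // Native support: promote data-src to src and let the browser lazy-load.
  document.querySelectorAll('img.lazyload').forEach(img => {
    img.setAttribute('loading', 'lazy');
    img.src = img.dataset.src;
  });
} else {
  // No native support: load lazySizes, which watches for .lazyload elements.
  const script = document.createElement('script');
  script.src = 'https://cdn.jsdelivr.net/npm/lazysizes@5/lazysizes.min.js';
  document.body.appendChild(script);
}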

What else can we do to optimize images? Hmmm.

Responsive Images

Let's not confuse this with Bootstrap's responsive classes. Those are also very good, and you could say the idea came from there, but Bootstrap's classes only scale the image. What I am talking about is showing and downloading different sizes for different devices. So if a web page is opened in a mobile browser, a 240x240 version of the image is not just shown but also downloaded at that size, instead of downloading the whole image and scaling it down to 240x240. That reduces the payload and improves performance based on the browser size.

Simply use the HTML5 img srcset attribute and define different URLs of an image based on either width or density:

<!-- Density based -->
<img srcset="/wp-content/uploads/flamingo4x.jpg 4x,
             /wp-content/uploads/flamingo3x.jpg 3x,
             /wp-content/uploads/flamingo2x.jpg 2x,
             /wp-content/uploads/flamingo1x.jpg 1x"
     src="/wp-content/uploads/flamingo-fallback.jpg">

<!-- Width based -->
<img srcset="/wp-content/uploads/flamingo4x.jpg 4025w,
             /wp-content/uploads/flamingo3x.jpg 3019w,
             /wp-content/uploads/flamingo2x.jpg 2013w,
             /wp-content/uploads/flamingo1x.jpg 1006w"
     src="/wp-content/uploads/flamingo-fallback.jpg">
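One note on the width-based form: without a sizes attribute the browser assumes the image spans the full viewport width. If you know the rendered size, telling the browser helps it pick the right candidate. A small sketch (file names are made up):

<img srcset="flamingo-small.jpg 480w, flamingo-large.jpg 1080w"
     sizes="(max-width: 600px) 480px, 1080px"
     src="flamingo-fallback.jpg" alt="flamingo">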

What else? I guess one more very important thing to do is use an image CDN.

An image CDN gives you availability, security, compression, reliability, scalability, and, as mentioned earlier, multiple versions of the same image for different devices. Moving to an image CDN is a quite effective way to increase speed: downloading an image from the nearest available server obviously decreases latency and increases speed and reliability.

Image Compression with MozJPEG/OptiPNG

I know you already know about image compression; the only reason I mention it is to highlight some very effective image encoders like MozJPEG and OptiPNG. Trust me, I tried regular JPEG compression on an image and got it from 2.7 MB down to almost 1.7 MB, and with MozJPEG it was only 0.7 MB. Google released a wonderful PWA for this, which you can find online: Squoosh.app.
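If you want this in your build pipeline rather than a manual tool, one common option (my own suggestion, not from the original article) is the imagemin npm package with its MozJPEG plugin. A minimal sketch, assuming imagemin v7 (the CommonJS version) and imagemin-mozjpeg are installed:

// Pre-compress JPEGs at build time with MozJPEG via imagemin.
// Assumes: npm install imagemin@7 imagemin-mozjpeg
const imagemin = require('imagemin');
const imageminMozjpeg = require('imagemin-mozjpeg');

(async () => {
  const files = await imagemin(['images/*.jpg'], {
    destination: 'dist/images',
    plugins: [imageminMozjpeg({ quality: 75 })] // lossy, but visually close
  });
  console.log(`Compressed ${files.length} image(s)`);
})();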

All right, let's move on and talk about the other elephant in the room: SCRIPTS. Third-party libraries and scripts that we casually embed in our codebase. You would be surprised to know their size and how much they can impact the initial loading of your application.

Defer / Async script tag

Try to use defer or async for all of your external script files (note that these attributes only take effect on scripts with a src attribute; inline scripts ignore them). It is also worth knowing that if you already place the script reference at the end of the HTML document, defer makes little practical difference, since HTML parsing is essentially done by the time the script is reached. Let HTML be shown within the first couple of seconds in your user's browser: the moment some HTML appears on the page, users get busy reading the content instead of waiting and cursing the sluggishness of the page.

<script defer src="..."></script>
<script async src="..."></script>

Normal script execution — HTML parsing pauses while the script is fetched and executed.

Async script execution — the script is fetched in parallel with parsing and runs as soon as it arrives, pausing parsing only for the execution itself.

Defer script execution — the script is fetched in parallel and runs only after HTML parsing is complete, in document order.

You also have to choose your third-party libraries wisely. Sometimes we integrate a third-party library just because it is trending and we assume its ecosystem must be good. Mostly we use only one or two features of such a library, and the JavaScript for the rest of its features is a burden for your application to carry. So it may be a good idea to pick a feature-specific library rather than a generic one packed with features you will never touch, or to find a lighter alternative among its competitors. Take the currently very popular Svelte, which is tiny compared to Angular/React/Vue.js. To be fair, that is not an apples-to-apples comparison, and I am not trying to compare them; the reason I mention it is that you might only need a Svelte-level library but end up using Angular, which is considerably heavier, and pay for something you never use to its full potential. Always consider the lighter, more specific option for your needs if performance is the first priority.

Svelte stays that small because it is compiler based. Separately, you can also use a service worker to precache your React/Angular-based app and make your web pages load faster on repeat visits.

Anyway, remember the four golden rules for JS libraries:

  • Deprecate a library if it is not required anymore.
  • Replace it with a lighter alternative.
  • Defer its script tag.
  • Update to the latest stable version.

Trick the JS engine

The default behavior of the browser engine is to fetch a JS file and execute it right away, and while it executes the file it holds HTML parsing, meaning nothing gets painted on the screen, which is exactly what we want to avoid. Ideally we want the HTML to be parsed first and the JS to be fetched and executed later, with help from a service worker. How about we trick the browser engine and declare a JS file as a non-JS file, so the browser downloads it but does not execute it, and our service worker converts that file back into a regular JS file for the engine to execute? All you have to do is set the script type to anything other than a JavaScript type; once you do that, the browser treats it as a plain asset and only fetches it instead of executing it. Register your service worker to change the type back to "script" and voila, let the engine deal with it. Keep only minimal, critical JS as a normal script. Obviously you should not apply defer to everything, async to everything, or this trick to everything; as a developer you need to decide which JS files are critical during the initial load and which are not.
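The exact code is not spelled out here, and browser support for fetching scripts with unknown types varies, so here is a sketch of a closely related, widely used variant of the same idea: warm the cache with rel=preload (the browser fetches but does not execute), then inject the real script tag after first paint. The file name is hypothetical:

// Fetch early without executing: rel=preload downloads into the HTTP cache.
const hint = document.createElement('link');
hint.rel = 'preload';
hint.as = 'script';
hint.href = '/js/non-critical.js'; // hypothetical non-critical bundle
document.head.appendChild(hint);

// Execute later: by now the file is already in cache, so this is nearly free.
window.addEventListener('load', () => {
  const script = document.createElement('script');
  script.src = '/js/non-critical.js';
  document.body.appendChild(script);
});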

Web Fonts swap

There are so many cool fonts in use, but there is an issue when they do not load for some reason. Popular browsers like Chrome and Firefox wait around 3 seconds for a font to load before displaying the text, which is quite a long delay. You can set font-display: swap in CSS to let the browser display the text with a default font and swap once the font file is loaded, so the user sees text instead of empty space:

@font-face {
  ......
  font-display: swap;
}

It is also very interesting to have fonts and similar resources prefetched or preloaded. There is a simple trick you can try, like the sketch below.
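The original snippet did not survive here, so this is my reconstruction of the usual approach: preloading a font file with a link tag. The font path is made up, and note that crossorigin is required for font preloads even on the same origin:

<link rel="preload" href="/fonts/my-font.woff2" as="font" type="font/woff2" crossorigin>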

Http2 (H2) Server Push

It's magical, but it all depends on the browser, as different browsers handle it differently. It can save a number of round trips for a browser fetching assets (images/CSS/JS etc.). As the name implies, it is a server push: when the browser makes its initial request, the server can push back additional relevant files the browser will need in the near future. So what happens without server push?

The browser requests index.html, parses it, and realizes "oh, I need index.css as well," so it goes back to the server to fetch index.css. When it loads index.css, it realizes "oh, I need index.png," so it goes back to the server again. To be honest, this feels rather silly if you think of a real-world analogy: it is like going to the grocery store once for each item you need instead of getting them all together. H2 server push allows the browser to receive more than what it asked for, because the server knows what it will need. (Actually the server does not know either; the developer knows, and the developer specifies the assets the browser will need.)

Once the server pushes the relevant files to the browser, it saves the browser subsequent requests, and all those files go into the browser cache. The PROBLEM is that on every request the server may push back files that were already provided and cached. That can be SOLVED by bright-minded people like me, who are always there to provide solutions. My solution would be to use a service worker: let the SW check the cache, and if the file already exists, the SW serves that asset from the cache instead of the browser going to the server, saving a round trip.
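A minimal sketch of that idea, assuming the pushed assets have already been added to a cache (for example at service worker install time):

// sw.js: answer from the cache when possible, hit the network otherwise.
self.addEventListener('fetch', event => {
  event.respondWith(
    caches.match(event.request).then(cached => cached || fetch(event.request))
  );
});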

Preempt Loading Assets

Prefetching is a way of hinting to the browser about resources that are definitely going to be used, or might be used, in the future. Some hints apply to the current page, others to possible future pages.

As developers, we know our applications better than the browser does. We can use this information to inform the browser about core resources.

DNS-Prefetch — resolves the DNS lookup for that origin ahead of time.

Preconnect — resolves DNS and also establishes the TCP handshake (and TLS negotiation, if applicable).

Prefetch — if you are sure a certain item will be required in the future, it can be fetched at low priority and stored in the cache. Subresource — does the same for the current page with higher priority (an older, Chrome-only hint).

Prerender — fetches and renders the whole page (DOM created, CSS applied, JS executed) but keeps it hidden. Preload — fetches a resource needed by the current page at high priority, but does not execute or render it.

<link rel="dns-prefetch" href="//example.com"> <link rel="preconnect" href="https://css-tricks.com"> <link rel="prefetch" href="image.png"> <link rel="subresource" href="styles.css"> <link rel="prerender" href="https://css-tricks.com"> <link rel="preload" href="image.png">

We need to be very careful with these options; they should be used smartly, based on analytics and maybe even machine learning. You know your website better than the browser does, so if you do it correctly you can expect better performance.

Brotli vs GZip compression

Brotli is another compression method, even better than gzip: it reduces size roughly 15% more than gzip does. That does not mean we stop using gzip. Brotli is better at reducing size, but gzip is still better for dynamic content because of its processing speed. Use Brotli for static content like CSS/JS; for runtime request/response compression, gzip remains the better option, as its compression speed is far better than Brotli's.
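As a concrete illustration (my own sketch, not from the article): you can pre-compress static assets at build time with Node's built-in zlib module, which has shipped Brotli support since Node 11.7. File names are hypothetical:

// Pre-compress a static bundle with Brotli, keeping gzip as a fallback
// for clients that don't send "Accept-Encoding: br".
const fs = require('fs');
const zlib = require('zlib');

const source = fs.readFileSync('dist/app.js');
fs.writeFileSync('dist/app.js.br', zlib.brotliCompressSync(source));
fs.writeFileSync('dist/app.js.gz', zlib.gzipSync(source));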

Adaptive Loading

Not everyone has access to high-end devices and a good internet connection. Even if you get good internet in one place, you may not have good signal strength in every spot you visit, and if you do, consider yourself the luckiest person of this era. The idea is to render content based on network status (strength) and hardware (CPU, memory). For that you can use libraries like React Adaptive Hooks; the example below is copied from its GitHub readme.

import React from 'react';
import { useNetworkStatus } from 'react-adaptive-hooks/network';

const MyComponent = () => {
  const { effectiveConnectionType } = useNetworkStatus();
  let media;
  switch (effectiveConnectionType) {
    case 'slow-2g':
      media = <img src='...' alt='low resolution' />;
      break;
    case '2g':
      media = <img src='...' alt='medium resolution' />;
      break;
    case '3g':
      media = <img src='...' alt='high resolution' />;
      break;
    case '4g':
      media = <video muted controls>...</video>;
      break;
    default:
      media = <video muted controls>...</video>;
      break;
  }
  return <div>{media}</div>;
};

PRPL Pattern

PRPL stands for: Push (or preload) the most critical resources, Render the initial route as soon as possible, Pre-cache remaining assets, and Lazy-load other routes and non-critical assets. Preloading a critical resource looks like this:

<link rel="preload" as="style" href="css/style.css">

You can inline critical scripts, and use async or defer on anything that is holding up the initial paint. Another approach is server-side rendering of the initial HTML content, which can help but also increases the HTML payload and the time to interactivity. Nevertheless, there is no hard and fast rule or single way of achieving quick rendering of the HTML content on initial load.

You can use H2 server push, described above, or use service workers to retrieve assets from the cache and save round trips. This also helps set up offline availability of assets so the application keeps working through a network glitch, and saves the browser from redownloading on repeat visits.

Identify trivial assets and let them be lazy-loaded, so the browser can first load and render the important content, then fetch the non-critical pieces after the critical content has been downloaded and painted. This way the user never has to stare at a blank screen. A sketch of the lazy-load step follows.
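One common way to do the "L" in PRPL (my sketch; the module and element names are made up) is a dynamic import that loads code only when it is actually needed:

// Load the charts module only when the user asks for it.
document.querySelector('#show-charts').addEventListener('click', async () => {
  const charts = await import('./charts.js'); // fetched on first click only
  charts.render();
});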

TL;DR

Better performance means a lower bounce rate, better user retention, more business, more revenue. These are a few techniques I have found very effective. Always use performance tools as well; they help a lot. I am mentioning some below:

  • Google Chrome Lighthouse
  • GTMetrix
  • PageSpeed Insights

Never expect all your users to have the same sort of high-end device you might have, or the same internet speed; even you can't get 4G/5G speed in every room of your house, or the full bandwidth of your Wi-Fi in your bedroom. At least I never do :) Always test your site's performance as a GPRS/2G/3G user, which you can easily do with Chrome DevTools by setting network throttling in the Network panel.

Good luck guys, other than in bed, high speed is highly desired!!!

Originally published at https://www.linkedin.com.
