Working towards a Faster Web

A faster web is a more enjoyable web. I aim to help make the web a faster place.

Quite often the obstacle to performant web pages is time. If it is not time, then it is most likely a lack of understanding: either of the need for fast websites, or of how to get websites to that point. There are of course other possible obstacles, such as management or design choices.

A faster web is a better web for many reasons. First and foremost, faster websites typically translate to better user experiences. Coupled with this is a consciousness of the price and speed of data, especially in developing countries; slow, heavy websites eat up data and cause frustrating user experiences. Not entirely disconnected from this is the fact that page speed factors into Core Web Vitals, which Google uses to rank your site. Loosely speaking, the better your Core Web Vitals, the better your site ranks. Accessibility and other SEO guidelines factor in too, but speed is not to be overlooked. A site that dumps 37 MB of images and videos onto a user on the first page load is going to rank lower than sites that are smarter about what they load and when.

One of my passions is tweaking and fine-tuning websites to boost their performance. This can be done in a multitude of ways and often involves digging into the nuances of whatever framework or library you are using to build your site. Below are some of my best tips for performant websites.

Defer non-critical content

Many developers already know about the built-in lazy-loading attribute that can be used to defer off-screen images and iframes. Adding attributes like those below can already boost performance greatly.

<img loading="lazy" />
<iframe loading="lazy"></iframe>
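As a slightly fuller sketch (the src value here is just a placeholder), explicit width and height attributes alongside loading="lazy" let the browser reserve space for the image before it arrives, which helps avoid layout shift, and decoding="async" hints that the image can be decoded off the critical path:

<!-- Deferred until the image approaches the viewport; the width/height
     attributes reserve its space so the page doesn't jump when it loads. -->
<img
  src="https://example.com/photo.jpg"
  width="800"
  height="600"
  loading="lazy"
  decoding="async"
  alt="A description of the photo"
/>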

However, this is just the beginning. Videos, CSS and even some JavaScript can be deferred until after the page has loaded. I suggest using a tool like Beasties to handle critical CSS - inlining the most important CSS and lazy loading the rest - and loading non-critical analytics scripts or external CSS files manually with a pattern like this:

<script>
  function loadNonCriticalFiles() {
    // Each entry is a [type, uri] pair describing a non-critical resource.
    [
      ['script', 'https://example.com/script1.js'],
      ['script', 'https://example.com/script2.js'],
      ['stylesheet', 'https://example.com/style.css'],
    ].forEach(([type, uri]) => {
      if (type === 'script') {
        const script = document.createElement('script');
        script.src = uri;
        script.async = true;
        document.head.appendChild(script);
      } else if (type === 'stylesheet') {
        const link = document.createElement('link');
        link.rel = 'stylesheet';
        link.href = uri;
        document.head.appendChild(link);
      }
    });
  }

  // If the document is still parsing, wait for DOMContentLoaded;
  // otherwise load everything straight away.
  if (document.readyState === 'loading') {
    window.addEventListener('DOMContentLoaded', loadNonCriticalFiles);
  } else {
    loadNonCriticalFiles();
  }
</script>

Use a CDN if possible

Using a CDN like Cloudflare or Fastly can help performance immensely. These services cache your content at locations close to your users, which means a user gets a response to their request in the quickest time possible.
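How long the CDN keeps a response is controlled with ordinary Cache-Control headers. As a small sketch (written here as a SvelteKit +server.ts endpoint, like the proxy example below; the values are just placeholders), the s-maxage directive targets shared caches such as a CDN, while browsers fall back to max-age:

import type { RequestHandler } from './$types';

export const GET: RequestHandler = async () => {
  return new Response(JSON.stringify({ hello: 'world' }), {
    headers: {
      'Content-Type': 'application/json',
      // Browsers may cache this for a minute; the CDN's shared cache may
      // keep it for an hour before revalidating with the origin.
      'Cache-Control': 'public, max-age=60, s-maxage=3600',
    },
  });
};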

A bonus to this is to make use of your framework's API endpoints to proxy third-party scripts so that they can be cached by your CDN. Use this with caution, however, because some third-party scripts will complain about being proxied. An example of this, written for SvelteKit, would be:

import type { RequestHandler } from './$types';

export const GET: RequestHandler = async ({ fetch }) => {
  // Fetch the third-party stylesheet on the server.
  const css = await fetch(`https://example.com/file.css`).then((res) =>
    res.text(),
  );

  // Serve it from our own domain with long-lived cache headers so the CDN
  // (and the browser) can cache it.
  return new Response(css, {
    headers: {
      'Content-Type': 'text/css',
      'Cache-Control': 'public, max-age=31536000, immutable',
    },
  });
};
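Assuming the endpoint above lives at something like src/routes/proxy/style.css/+server.ts (a hypothetical route), the page can then pull the stylesheet from your own origin, where the CDN can cache it:

<link rel="stylesheet" href="/proxy/style.css" />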

This can also be done with images, videos and other files. Doing this can improve speed if you are using a CDN, but proceed with caution if you are not: proxying a request adds an extra hop, which will make it take longer and could slow down your site.

Use a service worker

Implementing a service worker will usually have less of an impact on your SEO score and more of an impact on your users' experience. Service workers allow us to control how web requests are made in our application and to cache the results of requests locally. This means that users can save data when navigating your site because they won't have to fetch the same file more than once. A very basic service worker setup would look like this:

const CACHE_NAME = 'media-cache-v1.0';

// Note: in TypeScript this file needs the 'webworker' lib, with `self` typed
// as the service worker's global scope (for example via
// `declare const self: ServiceWorkerGlobalScope;`).

// Activate the new worker immediately instead of waiting for old tabs to close.
self.addEventListener('install', (event: ExtendableEvent) => {
  event.waitUntil(self.skipWaiting());
});

// Take control of any open pages as soon as the worker becomes active.
self.addEventListener('activate', (event: ExtendableEvent) => {
  event.waitUntil(self.clients.claim());
});

self.addEventListener('fetch', (event: FetchEvent) => {
  const request = event.request;

  // For this worker, we only want to check get requests so that we can find
  // files requested from the server and check our own cache first. If we
  // have the file on our side, we can return that instead of making a
  // network request.
  if (request.method !== 'GET' || request.headers.get('Range')) return;

  const url = new URL(request.url);

  if (
    request.destination === 'image' ||
    request.destination === 'video' ||
    url.pathname.includes('.css')
  ) {
    event.respondWith(
      caches.match(request).then((cachedResponse) => {
        if (cachedResponse) {
          return cachedResponse;
        }
        return fetch(request, { cache: 'no-store' }).then((networkResponse) => {
          return caches.open(CACHE_NAME).then((cache) => {
            // cache.put is asynchronous, so failures (for example partial
            // responses) have to be caught on the returned promise.
            cache
              .put(request, networkResponse.clone())
              .catch(() => console.log(request.url, 'failed to cache'));

            return networkResponse;
          });
        });
      }),
    );
  }
});
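For the worker to run at all, the page needs to register it. A minimal registration (assuming the worker file is served at /service-worker.js, which is a placeholder path) looks like this:

// Register the service worker once the page has finished loading so the
// registration itself doesn't compete with more important work.
if ('serviceWorker' in navigator) {
  window.addEventListener('load', () => {
    navigator.serviceWorker
      .register('/service-worker.js')
      .catch((err) => console.log('Service worker registration failed', err));
  });
}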

Avoid libraries, third-party content and bloat

This one is more manual but can often have a massive impact on your site. Time is usually a pressure in development, so anything that saves time is a win. Complex tasks or functions can be difficult or annoying to write yourself, and it can be easier to just reach for a library. However, libraries often cater for more than we need, and that bloat is not always removed by tree-shaking. Much of the functionality you are using could be written yourself, or a smaller package could be used instead. Make use of Bundlephobia to see the impact of a given package on the size of your app.
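As a hypothetical illustration of writing a small helper yourself, a basic debounce function - the kind of thing often pulled in from a utility library - fits in a handful of lines:

// Returns a wrapped version of `fn` that only runs once calls have stopped
// for `delay` milliseconds - useful for resize, scroll or input handlers.
function debounce<T extends (...args: any[]) => void>(fn: T, delay: number) {
  let timer: ReturnType<typeof setTimeout> | undefined;

  return (...args: Parameters<T>) => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), delay);
  };
}

// Usage: only do the expensive work once the user has stopped resizing.
window.addEventListener('resize', debounce(() => {
  console.log('resized');
}, 200));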