You probably know the Lighthouse-based tests or the WebPageTest audits that focus on frontend load time and performance. One metric that's missing, though, is the memory footprint of the page: on old or cheap devices, opening multiple tabs can quickly eat up hundreds of megabytes, leaving less memory for the other applications running at the same time.

Seeing my old laptop struggle to run a music player, Slack, VSCode and a web browser at the same time made me realize we often underestimate this aspect of web development, so I decided to implement an old idea of mine: a tool that scans a URL and shows how much memory the page eats. Inspired by an old Hacker News thread by @dominictarr, I named it HowBloated. Here is how it works:

Demo video of HowBloated: enter a URL and after a few seconds get the detailed results of memory usage
Meh, you can do better, WordPress!

Step 1: type a URL. Step 2: get the memory usage of the page, with totals broken down by JS memory, CSS memory, and other objects (DOM, images, etc.). Easy.

Track memory usage over time

HowBloated also lets you track memory usage over time, for free. Each "tracked URL" is scanned every 24 hours on the free plan, enabling you to measure the impact of your changes.

Integrating HowBloated with your CI pipeline to trigger a scan after each deploy is a feature of the paid version, which is currently in private beta. Hit me up on Twitter or by email if you are interested!

First results

After a few days of coding, I announced the first iteration of HowBloated on Twitter, and more than 400 URLs were scanned within days. I also investigated some surprising results.

Remove the ads

I like to keep an eye on what's going on with my services, and a Slack webhook is a super simple way to do it. So when I saw this, I was kind of shocked:

SRSLY!?

Someone had scanned the nature.com website with a magic parameter that seems to remove (some of) the third-party ads on the page. Even with half of those third-party scripts still present, this is a reduction of roughly 35MB. Looking at the detailed consumption, the memory used for JavaScript "things" drops from 43MB to 10MB. More than 30MB of memory consumed by "ads" is huge. This is just so depressing. If only ads were optimized as well as the rest of the page.

See it here: https://howbloated.howfast.tech/result/a82df68a-e10d-46c9-98a3-853cfa4d8d7c
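
As an aside, the Slack alert mentioned above is nothing fancy: posting to an incoming webhook is a single HTTP call. Here is a minimal sketch, assuming Node 18+ (for the global fetch) and a webhook URL stored in an environment variable; HowBloated's actual notification code may well look different:

```typescript
// Hedged sketch of a Slack notification: incoming webhooks accept a plain
// JSON payload. Assumes a SLACK_WEBHOOK_URL environment variable.
async function notifySlack(text: string): Promise<void> {
  await fetch(process.env.SLACK_WEBHOOK_URL as string, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ text }), // e.g. "New scan finished: https://..."
  });
}
```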

But... Adblock Plus!??

After this traumatizing discovery, you might say "ha-ha, but I'm using an ad-blocker extension, so I'm not affected by this and I can keep opening as many tabs as I want!". Well, I've got sad news.

While investigating how accurate the measurements returned by the Remote Debugging Protocol were, I discovered that an empty page in my browser was still taking about 5MB of memory. You can use Chrome's or Firefox's developer tools to manually take a memory snapshot of the page in its current state, so I took a first snapshot and then disabled AdBlock Plus: the second snapshot showed almost 4MB less memory consumed.

Before and after disabling AdBlock Plus, with Firefox's memory snapshot tool

Reminder: this is an empty page. On a more complex page such as the Notion.so login page, repeating the same test shows that AdBlock is responsible for about 10MB of memory consumption. Multiply that by the number of open tabs, and you can quickly end up with hundreds of megabytes consumed by a single extension that is supposed to make pages lighter.

I had heard about this before, so I guess it's not breaking news, but it's still very interesting to see it first-hand.

So is everything doomed? Should we all go back to browsing the web using lynx? Maybe not, since you can still use DNS-based ad blocking, thanks to projects like Pi-Hole (obviously harder to install than a browser extension). At the very least we are now aware of the cost of using an ad-blocker.

Blogging platforms: Medium vs Ghost vs static website

Another very interesting example involved blogging platforms. Blogs are typically static pages, with very little JavaScript required (if any), so one could expect their memory footprint to be very small compared to more dynamic web applications. Here are the results:

  • An article on Medium takes 60MB of memory (results)
  • The same article on a Ghost blog takes 3MB of memory (results)
  • A static blog (Jekyll + Netlify) takes 2MB of memory (results)

The team at Ghost clearly wins, congrats! A static website is even more performant, if you don't mind writing Markdown and configuring a build system (and if you don't add a Disqus widget as I did on my own blog...).

Look at all the memory I'm eating, guys!

Let's go back to HowBloated itself now.

How it works: technical details

The frontend of HowBloated is written in VueJS (and I've also used TypeScript with it for the first time); writing actual CSS instead of CSS-in-JS felt nice again. The backend uses ExpressJS and is fairly simple. Each scan request opens a new Firefox process and uses the (sometimes obscure) Remote Debugging Protocol to measure the memory usage of the tab.
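
To make the core idea concrete, here is a minimal sketch of "load a URL, wait for it to settle, read the memory counters". It is not HowBloated's actual code: HowBloated drives Firefox over the Remote Debugging Protocol, while this sketch uses Puppeteer and Chromium's metrics API as a stand-in:

```typescript
// A minimal sketch of the core idea, not HowBloated's actual code.
// Uses Puppeteer/Chromium metrics instead of Firefox's Remote Debugging Protocol.
import puppeteer from 'puppeteer';

async function measureMemory(url: string): Promise<number> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' }); // wait for the page to settle
  const { JSHeapUsedSize = 0 } = await page.metrics(); // Chromium's JS heap counter
  await browser.close();
  return JSHeapUsedSize / 1024 / 1024; // bytes -> MB
}

measureMemory('https://example.com').then((mb) =>
  console.log(`JS heap after load: ${mb.toFixed(1)} MB`),
);
```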

In its current version, there are some important caveats:

  • the memory usage is measured right after the page has loaded, meaning we have little information about how much memory the web application will consume once you start using it or keep it open for several hours (think of Gmail)
  • since HowBloated is fairly simple, there is no way to measure pages protected by an authentication mechanism. While this isn't an issue for websites like YouTube or GitHub, where (some) content is public, other apps like Gmail don't expose anything to unauthenticated users
  • measuring the memory consumed by a specific page is hard, because some of it is used behind the scenes by the browser itself. For instance, an empty page will still report about 300KB of memory used, and this quickly grows if you add browser extensions (a sketch of one way to correct for this follows the list)
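
On that last caveat, one possible (hypothetical) mitigation is to measure an about:blank tab first and subtract it from the target page's number, so the browser's own overhead isn't billed to the page. Again using Puppeteer/Chromium as a stand-in rather than HowBloated's real Firefox-based code:

```typescript
// Hypothetical mitigation for the "browser overhead" caveat: measure an
// about:blank tab first and subtract it from the target page's measurement.
import puppeteer, { Page } from 'puppeteer';

async function heapInMB(page: Page): Promise<number> {
  const { JSHeapUsedSize = 0 } = await page.metrics();
  return JSHeapUsedSize / 1024 / 1024;
}

async function measurePageOnly(url: string): Promise<number> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  await page.goto('about:blank');
  const baseline = await heapInMB(page); // browser overhead only

  await page.goto(url, { waitUntil: 'networkidle0' });
  const total = await heapInMB(page);

  await browser.close();
  return Math.max(total - baseline, 0); // MB attributable to the page itself
}
```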

What's next

HowBloated was a weekend project that received some traction. I still need to work on scalability (if you know how to package Firefox inside an AWS Lambda layer, please let me know!) and add more features. If you feel like this would help you a lot, I'd be happy to talk about it :)

Another thing I want to do is to scan the Alexa top 1K (most popular websites) and aggregate the results for each category (CSS / JS / other) to provide a baseline and add context when presenting scan results.
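
The aggregation itself should be straightforward; here is a rough sketch of what I have in mind, with made-up field names for illustration:

```typescript
// Rough sketch of the planned aggregation (hypothetical field names):
// compute a per-category median across scan results to serve as a baseline.
interface ScanResult {
  js: number;    // MB used by JavaScript objects
  css: number;   // MB used by CSS
  other: number; // MB used by everything else (DOM, images, ...)
}

function median(values: number[]): number {
  const sorted = [...values].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 ? sorted[mid] : (sorted[mid - 1] + sorted[mid]) / 2;
}

function baseline(results: ScanResult[]) {
  return {
    js: median(results.map((r) => r.js)),
    css: median(results.map((r) => r.css)),
    other: median(results.map((r) => r.other)),
  };
}
```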

Feeling curious? Try it out now at https://howbloated.howfast.tech/


Thanks for reading! Please enjoy your day.