Posted on May 19, 2014

How I Improved My Website’s Load Time by More Than 27% in Under 10 Minutes

I recently launched a new service offering with my consulting business, and as part of that I put together a well-designed landing page to help promote it.

When I had finished creating the page I was slightly horrified to realise that it came in at a fairly heavy 2.1 MB, as you can see below. I knew right away that unless I did something about it, the load times on anything other than an excellent connection were going to end up costing me money. So I decided it was a good time to see how much I could improve my site's performance metrics with the least amount of effort.

page load metrics

Since I had tried to ensure that the page followed just about every best practice I could think of, I decided to reach out for some help using Google's excellent PageSpeed Insights tool.

I was quickly presented with a laundry list of tasks to work on in order to start moving the needle.

Page Speed Insights Task List

Looking through the list, it was clear that none of these tasks was particularly difficult from a technical perspective, but they sure were time-consuming if you wanted to do them all by hand, which, to be honest, I had little interest in doing.

That's when I remembered a little project Google had been working on called mod_pagespeed, designed to handle all of this automatically, meaning I could set it up once and never have to think about it again.

So in this blog post I am going to show you how to do exactly that. If you aren't technically savvy, Google is looking to offer a separate, hosted version of this where they handle most of the implementation details for you. However, if you would like to save yourself some cash and maybe learn something along the way, you can follow along with this guide.

So mod_pagespeed works directly with your web server software (it currently supports Apache and Nginx). If you are one of the unfortunate souls stuck with IIS, you can check out a port by the name of IISpeed that operates in much the same way.

That means we are going to have to fire up the command line by SSHing into the web server to get started. Unfortunately, if you are using Nginx, installing the module is a bit more involved, as it requires you to recompile your web server from source, which is no one's idea of a good time.

So with all that in mind, these instructions are designed for Apache 2.2 on Ubuntu 12.04 (64-bit). If you have a different setup, they should hopefully require minimal changes, if any.

Let's begin by downloading the mod_pagespeed module for Apache like so (the stable 64-bit package here; grab the 32-bit build instead if that matches your server).

cd /tmp
wget https://dl-ssl.google.com/dl/linux/direct/mod-pagespeed-stable_current_amd64.deb

Then we need to actually install it with the following commands.

sudo dpkg -i mod-pagespeed-*.deb
sudo apt-get -f install

So four commands in and we are almost done, because this module automatically enables itself when it is installed. However, we still need to restart Apache before it will begin working. That's easily done with the following command.

sudo service apache2 restart

NOW it should be working. That wasn't too painful, was it? The only problem is that out of the box, mod_pagespeed enables only a small set of filters (optimisations) by default. To really start making the kinds of improvements I want, I need to do a tiny bit more tinkering.
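Before tinkering further, it's worth confirming the module is actually active. mod_pagespeed adds an X-Mod-Pagespeed header to the responses it rewrites, so a quick header check against your own server will tell you (localhost here is just a stand-in for your site's address):

```shell
# Fetch only the response headers and look for the module's signature.
# Replace localhost with your own site's address as needed.
curl -sI http://localhost/ | grep -i x-mod-pagespeed
```

If the command prints a header line, the module is rewriting your pages; if it prints nothing, the restart didn't take.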

Thankfully that isn't at all difficult; it's simply a matter of editing a text configuration file, which we can open with:

sudo nano /etc/apache2/mods-available/pagespeed.conf

Now all I need to do is look back at the results I got earlier from the PageSpeed Insights tool to figure out which filters to enable.

I decided to go with the following:

mod_pagespeed filter settings
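For reference, enabling filters in pagespeed.conf is just a matter of adding ModPagespeedEnableFilters lines. Here is a sketch matching the filters discussed below; the exact list you enable should come from your own PageSpeed Insights results:

```apache
# /etc/apache2/mods-available/pagespeed.conf
ModPagespeed on
# Filters can be listed comma-separated on one line or spread across several.
ModPagespeedEnableFilters rewrite_javascript,rewrite_css
ModPagespeedEnableFilters collapse_whitespace,elide_attributes,remove_comments
ModPagespeedEnableFilters recompress_png,recompress_jpg
ModPagespeedEnableFilters combine_css,combine_javascript
ModPagespeedEnableFilters extend_cache,insert_dns_prefetch
```

Remember to restart Apache after any change to this file for it to take effect.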

You can find a complete list of all the filters and what they do in Google's documentation, but for the sake of brevity I will give you a very brief rundown of the particular filters I have enabled here.

rewrite_javascript & rewrite_css
These filters take any JavaScript or CSS code you have (either in external files or inline) and minify it to speed up the download, in much the same way as the next filter does for HTML specifically.

collapse_whitespace
This removes all the unnecessary whitespace from the page's HTML source. Nicely formatted code is essential while you are working with it, but once it is live on a web server all those extra spaces just translate into additional bytes the browser has to download each time. That might not sound like much, but it adds up, especially on big pages. It's a simple quick win with no risk; in short, there is no reason not to enable this one.

elide_attributes
In HTML there is sometimes more than one way of writing the same piece of markup, because the default behaviour of some elements is identical to the more explicit, clearer-to-understand code you may have written originally (for example, <input type="checkbox" checked="checked"> behaves the same as <input type="checkbox" checked>). This filter finds any instances where you have done that and rewrites the code to the bare minimum needed to get the job done, which is yet another way to reduce the overall size of your page.

recompress_png & recompress_jpg
One of the tool's original recommendations was to go through each individual image on my website and recompress it so that the quality remained the same but the file size was reduced. And any time I add a new image to the page, I would have to go through that process all over again. In my experience these are the kinds of recommendations that make people say "it's not worth it" and move on to the next thing on their list. With these two filters, however, mod_pagespeed does all of this for me automatically, no matter what images I put on the page. What would take me who knows how long by hand is done for me by one line in a configuration file. Great success!

combine_css & combine_javascript
One of the best practices in modern front-end web development is to combine all of your CSS and, where possible, JavaScript files into a single file. The reason is that each external file you reference within your page results in a brand-new HTTP request the browser has to make to the server to fetch that file. Without going into how the HTTP protocol works, you just need to know that it involves a lot of back and forth, which is additional time it takes to load the page. This filter takes all of my CSS files and merges them into one, meaning that instead of a whole bunch of HTTP requests, the browser only has to make one (combine_javascript does the same for JavaScript files).
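To make that concrete, here is the kind of rewrite the filter performs. The rewritten URL below is illustrative only; the real one includes a hash that mod_pagespeed generates for cache busting:

```html
<!-- Before: three separate HTTP requests -->
<link rel="stylesheet" href="base.css">
<link rel="stylesheet" href="layout.css">
<link rel="stylesheet" href="theme.css">

<!-- After: one combined request (the hash shown is a made-up placeholder) -->
<link rel="stylesheet" href="base.css+layout.css+theme.css.pagespeed.cc.XXXXXXXXXX.css">
```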

extend_cache
If you aren't already familiar with what a cache is and how it works, you can think of it like this. When you download a page, there are inevitably a number of supporting files loaded in the process that just don't change very often — think of the CSS files, the JavaScript files, and the images on the page, for example.

Knowing that they don't change, browsers can be instructed to save a copy of these files on the user's hard drive, so that the next time they visit the page the browser can load those files from disk rather than downloading them all over again, making things much faster.

Now, one of the things the PageSpeed Insights tool told me initially was that I should leverage this feature for all my images, CSS, and JS files. That is exactly what this line does. It tells browsers that they can keep a copy of those files on the user's hard drive for up to a year before they need to bother checking again for an updated version.
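Under the hood, this amounts to the rewritten resources being served with a far-future caching header, along these lines (an illustrative response header, not copied from my server):

```http
Cache-Control: max-age=31536000
```

mod_pagespeed can safely do this because it also embeds a hash of each file's content in the rewritten URL, so whenever a file changes, its URL changes too and browsers fetch the fresh copy.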

insert_dns_prefetch
Unless you are a geek, there is probably no reason you would know this, but every time a browser comes across a URL it has to match the domain name to the IP address of the server hosting it. This is done through the Domain Name System (DNS).

With that in mind, every time I reference a file on an external server (say, my Google Analytics JavaScript tracking code), the browser has to spend a small amount of time looking up the IP address for each unique domain name in my code while everything else waits for it to complete. This process is usually very fast, but with enough files on external servers it can certainly add up to a noticeable difference.

What this filter does is instruct the browser to begin looking up those IP addresses up front and in the background, meaning that by the time the browser needs that information it already has it, and the lookups won't hold up everything else on the page from loading.
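In practice, this means the filter injects hints into the page's <head>, one per external domain it finds, roughly like so (the domains shown are examples):

```html
<link rel="dns-prefetch" href="//www.google-analytics.com">
<link rel="dns-prefetch" href="//fonts.googleapis.com">
```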

remove_comments
I keep my HTML fairly well documented with comments when developing any page. But like the whitespace, these serve absolutely no purpose in production, so let's strip them out and further cut down on the unnecessary bytes the user needs to download to display the page.

Finally, one of the other things the tool was complaining about was my use of Google Fonts, or more specifically the way I was calling the external files associated with them. Nothing on the page could render until these had loaded, and to make matters worse, I was downloading font variants I wasn't even really using.

Thankfully, mod_pagespeed is smart enough to understand exactly what I DO and DON'T need, and will again handle all of this gracefully for me.

So without further ado, let's restart Apache one more time and check out the results for ourselves.

My final Page Speed Insights Score

There is a huge difference between where we were when we started 10 minutes earlier and where we are now. And yes, I am aware the tool still mentions "leverage browser caching", but that is only for one file that I specifically don't want cached, for various reasons.

Final page load speed

Take a look at the difference from where we started, not just in load times but in the overall size of the page and its associated files, not to mention the fewer requests that need to be made.

Overall, I think we can agree that was a rather productive 10 minutes. Everything is now set up so that I never need to touch it again; it all just works as it should.

If you have any questions make sure to leave them in the comments below otherwise hit me up on Twitter @mdhoad.

Hopefully, this guide is able to help others in the same way that this little experiment helped me.

If you are interested in bringing world-class online marketing insights and analysis to your business you may want to consider working with me as a part of a new service I have launched called Easy Insights.

I currently only have a small number of positions left, so get in touch now if you would like to be a part of it, as I am likely raising my prices in the near future. Even if this service isn't for you, check out how you can make $500 simply by referring a client to me.