Speed up your website

Speed matters - website speed optimization

Why does it matter to optimize a website?
Reward your visitors!

Website speed optimization is a series of steps taken to make a specific website load faster, shortening the time your favorite browser spends chewing up a web page. If we continue the food analogy, some pages are a favorite dish, while others are inedible.

What needs to be done to speed up a website?

Unfortunately, sometimes it’s not economically viable to fix everything we find problematic, but there are always quite a few steps we can easily take to improve page and website loading speeds. First things first: to see where we stand, we need to analyze the current state of the website. For that we use tools like GTmetrix (https://gtmetrix.com/) or Google PageSpeed Insights (https://pagespeed.web.dev/), both based on the Lighthouse engine.
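
If you prefer the command line, the same Lighthouse engine can also be run locally via its npm package. A minimal sketch, assuming Node.js is installed; example.com is a placeholder:

# install the Lighthouse CLI once, globally
npm install -g lighthouse
# audit a page and save the report as an HTML file
lighthouse https://example.com/ --output html --output-path ./report.html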

Don’t chase 100/100! Not because it’s impossible, but because it’s not necessary. Sure, it’s great to have a top score, but it won’t affect your ranking position as much as you may think. Faster websites mean happier visitors, which means more time spent on your pages and more page visits, and vice versa. There is actually a direct connection between the number of items bought on a fast e-commerce website versus a slow one.

What do we need to do to achieve this?

Understanding the importance of website speed leads us to explore how to address loading speed issues through a few common steps. To illustrate, picture your browser as a waiter equipped with a set number of hands, say six to eight. These hands can only manage a certain number of dishes at once. If they’re idle, waiting for the server to fill plates with the ordered food, the experience parallels waiting in a restaurant. In the world of websites, just like in dining establishments, prolonged waits for your content may prompt you to seek an alternative.

So to speed up this data delivery process, on one side we should group requests (more food on the same plate), and on the other we should decrease the total transmitted data (compact, high-calorie / low-fiber dishes).

Elements that are mainly responsible for poor website performance

I’ll make a list of the elements that hurt website performance, compiled from all the sites we have boosted/fixed/faced so far, ordered by importance:

  1. Images
  2. Uncontrolled number of javascript and CSS files
  3. Gzip not enabled/configured
  4. Caching – server / client
  5. CDN

Images

Below you can find a list of common problems when implementing/embedding images:

  • Embedding issues
    Web developers/programmers often implement images using the so-called img tag: they fill its src attribute, point it towards a specific image file via URL, and that’s it. The embedded image is afterwards controlled via CSS, and having embedded it, they proceed to the next burning problem without seeing what they’ve missed, because the image displays properly. But that’s bad, really bad. An image intended for desktop resolutions loads on devices whose screens are smaller than the image itself. Touch devices are now the dominant surfing devices, so forcing a large image to load on a device with a smaller resolution/screen instead of a smaller version of it is a bad thing: for example, a 1200×400 pixel image loads on a device whose screen resolution is 320×568 pixels. That increases network usage, and mobile data is spent for nothing. It’s even worse because the solution already exists in that same img tag.

    Besides the src attribute, the img HTML tag supports the width and height (old) attributes, and alongside them the srcset and sizes (new) attributes as well. Image dimensions can’t be read by the browser until the image itself is downloaded. So don’t be lazy: write the image dimensions so the browser can move on to the next page element without waiting for that information from a server busy delivering a large image (remember the waiter’s hands: you are wasting one hand for as long as an image without declared width and height attributes is loading, and the waiter can’t proceed to the next order until the image is downloaded). And once width and height are declared, you can declare and populate sizes and srcset as well.

    Those two powerful img attributes tell the browser which image to load at which screen resolution. So, you take that one large image meant for desktop resolutions and create several smaller images/crops out of it. In the sizes and srcset attributes you refer to the smaller crops of the original image, instructing the browser when to show each specific crop (see the sketch right after this list). This works extremely well; browsers also detect retina screens, and in those cases they will download a twice-larger image from the server, if that size is available and listed in the srcset attribute.

  • Wrong image format (extension)
    I’ve witnessed more than a few times that even so-called web designers don’t know when to use the proper image format: when to use JPG and when PNG, for instance. If designers don’t know the technical (byte-wise) difference, how can we expect web developers or administrators to know which format to use? Developers and administrators are mostly indifferent to what they implement, as long as it looks as it should. I won’t go into the technicalities behind all of this now (that will be covered in a future post); I’ll just describe the proper image format for different scenarios. (There are also new image formats / compression methods being rolled out, like WebP for example, but as of December 2021 they are still not supported by all browsers, so they are not on this list, for now.)

    • JPG – raster – use this format when you want to display images with a lot of colors, like a photo taken with your mobile phone
    • PNG – raster – use this format when the image has a limited number of colors; a gradient/shadow uses a lot of colors, so try JPG in that case, but if you need transparency you’re stuck with PNG and must use it no matter the cost
    • GIF – raster – use this format if you need image animation; it also supports transparency, but if the image isn’t animated, go with PNG
    • SVG – vector – an image made out of textual code (XML); it can be scaled to infinity and beyond without the blur that appears in every raster format, and it supports transparency as well
  • Image compression methods – lossy and lossless
    Images can be compressed (lowering their size in bytes) by decreasing their quality in a visible or invisible manner. Both methods try to reduce image size: lossless compression through perfect data reconstruction, which results in near-original quality, and lossy compression, which is quicker, by removing some data from the original image; that process is irreversible, but you can control the output quality. You can always use an online tool for this job, like TinyPNG, or a command-line tool (see the example after this list).
  • Grouping page/website graphics into one larger image called a sprite
    Instead of requesting graphics one by one from different paths/URLs, it is better to pack them into one larger image whose parts are then shown in the proper places. Or, even better, if it’s possible, upgrade them to proper SVG files and place them all inside an icon font; that would be the perfect solution, because afterwards you can scale them via font size and reuse them across your website. The main reason for packing several smaller files into one larger file is speed: it is faster to download one large file than ten small ones.
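
To make the srcset and sizes advice above concrete, here is a minimal sketch of a responsive img tag; the file names, crop sizes, and breakpoints are illustrative assumptions, not values from a real project:

<!-- minimal sketch: file names, crops and breakpoints are hypothetical -->
<img src="hero-1200x400.jpg"
     srcset="hero-320x107.jpg 320w,
             hero-768x256.jpg 768w,
             hero-1200x400.jpg 1200w"
     sizes="(max-width: 320px) 320px,
            (max-width: 768px) 768px,
            1200px"
     width="1200" height="400"
     alt="Hero banner">

With width and height declared, the browser reserves the layout space immediately; with srcset and sizes, it picks the smallest adequate crop for the current viewport, and on retina screens it automatically prefers a roughly twice-larger candidate if one is listed.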
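
As for compression itself, the same job TinyPNG does online can be done locally from the command line. A minimal sketch, assuming the jpegoptim and optipng packages are installed; the file names are placeholders:

# lossy: re-encode the JPG at quality 85 (irreversible, but output quality is controlled)
jpegoptim --max=85 photo.jpg
# lossless: recompress the PNG's data without changing a single pixel
optipng -o2 logo.png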

Uncontrolled number of injected JavaScript and CSS assets

If you open the page source of a website based on, for example, a ThemeForest theme, these are the scenes you can face…

[Screenshot: CSS overload – classic ThemeForest theme behaviour]

[Screenshot: JavaScript overload – classic ThemeForest theme behaviour]

20+ JavaScript files and 30+ CSS files?!

What the hell happened here? Why so many? It’s how the website looks that matters, not how fast it is, right? Well, you can have both; it’s just that either someone (the client) doesn’t know up front that site speed counts too and finds out the hard way afterwards, or someone doesn’t know the site can and should be fast as well (designers and developers, this is a shout-out to you).

Every tool, from screwdrivers and hammers on one end to WordPress / Drupal / whatever on the other, can be used properly or improperly. Ignorance or lack of knowledge, call it what you wish, is the root cause of that misuse; it’s not (always) the tool’s fault.
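
The usual cure is to merge and minify those assets so the browser fetches one or two compact bundles instead of fifty separate files. A minimal sketch using the clean-css-cli and terser npm packages (assuming Node.js is available; the file names are placeholders):

# merge three stylesheets and minify them into a single bundle
npx cleancss -o bundle.min.css reset.css theme.css custom.css
# merge three scripts, then compress (-c) and mangle (-m) the result
npx terser jquery.js plugins.js app.js -c -m -o bundle.min.js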

Gzip isn’t used

Gzip is a form of asset compression, similar to zip, applied between the server (which compresses) and the browser (which decompresses the asset), instead of the browser downloading the uncompressed file from the server directly. Sometimes this compression isn’t enabled on the server side; sometimes it simply isn’t configured. The configuration looks something like this (Apache config, lines to be added to a previously backed-up .htaccess file):

# GZIP file compression: HTML, CSS, JS, Text, XML, fonts
<IfModule mod_deflate.c>
AddOutputFilterByType DEFLATE application/javascript
AddOutputFilterByType DEFLATE application/rss+xml
AddOutputFilterByType DEFLATE application/vnd.ms-fontobject
AddOutputFilterByType DEFLATE application/x-font
AddOutputFilterByType DEFLATE application/x-font-opentype
AddOutputFilterByType DEFLATE application/x-font-otf
AddOutputFilterByType DEFLATE application/x-font-truetype
AddOutputFilterByType DEFLATE application/x-font-ttf
AddOutputFilterByType DEFLATE application/x-javascript
AddOutputFilterByType DEFLATE application/xhtml+xml
AddOutputFilterByType DEFLATE application/xml
AddOutputFilterByType DEFLATE font/opentype
AddOutputFilterByType DEFLATE font/otf
AddOutputFilterByType DEFLATE font/ttf
AddOutputFilterByType DEFLATE image/svg+xml
AddOutputFilterByType DEFLATE image/x-icon
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE text/html
AddOutputFilterByType DEFLATE text/javascript
AddOutputFilterByType DEFLATE text/plain
AddOutputFilterByType DEFLATE text/xml
</IfModule>
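
Once those lines are in place, you can verify that compression is actually active by requesting a page with an Accept-Encoding header and looking for Content-Encoding: gzip in the response; example.com is a placeholder:

# ask for the response headers only, advertising gzip support, and check what comes back
curl -sI -H "Accept-Encoding: gzip" https://example.com/ | grep -i content-encoding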

Cache – client/server

The main role of caching is to deliver previously stored copies of resources instead of fetching fresh ones, and by doing so decrease page loading times. There are two types of caching:

  • Client / browser caching
    For instance, say we have some image served from one URL. Once the image is originally downloaded from the server, that URL is remembered together with the actual image in the browser cache. The image can later be changed/updated on the server, but keeping the same URL means your browser may show you the old/stale image. Where does that stale image come from if it has been removed from the server? From the browser cache. Moreover, developers have special powers they can use to instruct browsers how long to store these assets in the cache before downloading a fresh copy. All of this is done for one purpose: to decrease page loading times by not downloading the same asset over and over again, for instance a logo or a sprite (already mentioned above in the section on image optimization and asset grouping). That is handy most of the time, but it can be tricky when we decide to change an image: you should change the image URL to be sure the page won’t show the cached asset instead (see the cache-busting sketch after this list). Now that client cache, or browser cache, is clear, below is an .htaccess example where we define expiration periods for specific assets by their file extension:
    # Once downloaded, cache these resources for a period of one week (604800 seconds)
    <IfModule mod_headers.c>
    <FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|js|css|swf|svg)$">
    Header set Cache-Control "max-age=604800, public"
    </FilesMatch>
    </IfModule>
  • Server caching
    To explain this, I first need to explain how data is loaded on the vast majority of websites nowadays. When a request for page content arrives, the programming language connects to the database, fetches the data from it, and after parsing shows it on the page in some format. That takes time. It would be much faster if all the data were already in the page, so we don’t have to fetch it from the database; but the downside of that approach is that we would then have a lot of actual, physical HTML files on the server, which makes the site difficult to administer, admins would have to be developers as well, and so on.
    Server caching fills that gap, taking the best from both worlds: the page is served as a static one, but it has already been generated in advance using data from the database. For this to work, we need a proper server architecture with accompanying cache-friendly software. Even when that is the case, caching is often either not enabled or not configured properly, so it either isn’t used or never expires, and as a result we get slow pages or stale pages. If a server is overloaded with requests, it can become unavailable, so caching is a must for any high-traffic website/page.
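
As for the client-cache pitfall above, the usual cache-busting trick is to change the asset URL whenever the file itself changes, for example with a version suffix; the paths and version values here are illustrative:

<!-- bumping ?v= makes this a new URL, so browsers re-download it instead of using the cached copy -->
<link rel="stylesheet" href="/css/main.css?v=2">
<img src="/img/logo.png?v=2" width="200" height="60" alt="Logo">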

CDN – Content Delivery Network

A content delivery network has only one purpose: to serve a file from the location nearest to the visitor. For example, imagine our website is hosted in Serbia and someone from Canada visits it. All data and assets must be fetched from Serbia, on the other side of the globe from the visitor. That geographic distance influences asset download times, because assets must travel over the web, over physical lines, from one part of the globe to another. A CDN isn’t a must-have, especially if your target audience isn’t on the other side of the globe; it’s more a question of whether you’d like to decrease loading times for distant visitors too. The website will load for distant visitors as well, of course, just somewhat slower than for the nearest ones.

This is just a standard list of checks for when website speed analysis is needed; there are even more factors that can influence loading times, from the hosting service to the chosen architecture and so on.

If all this didn’t help, no biggie – we’re here to help you out – contact us 😉.
