Speed is a particular problem with Magento sites, caused by Magento’s heavy object-oriented internals, and Magento site owners often struggle with performance, particularly as a site grows and becomes more popular. The points in this article apply to sites of any type, but especially to Magento, where load times on shared servers can easily balloon out to 20–30 seconds for a home page.
Load time (the time it takes for everything to appear) is vital for an eCommerce site, as every second over four seconds can lose you around 7% of your customers. Past a certain point customers simply give up, intending to come back later but never returning.
By the way, this article is intended for eCommerce site owners who have had some exposure to Magento and eCommerce concepts, and aims to provide an overview of the key performance concepts.
It’s possible, even on shared servers, to boost the performance of Magento many times over with a variety of tricks; however, these tricks are not well understood in the industry and are often missed by web developers. Of course, a dedicated or semi-dedicated server or a small AWS instance can work wonders, but that may not be needed if enough other areas can be addressed.
In understanding web performance there are two major performance factors your web people will have to be aware of:
– time to first byte: the processing time required to work out what the page contains;
– payload size: the amount of data downloaded to display the page.
Strangely, web performance is often poorly understood by developers and by most web hosts; you may need to talk to a specialist hosting company that understands these issues to fix the problems your site may have.
In this article we’ll give you a birds-eye view of the issues and solutions which you should find useful in at least understanding the basics when dealing with developers and hosting administrators.
Time to First Byte – TTFB – the “thinking time”
The first major performance factor, Time to First Byte or TTFB, indicates how much processing time is involved in displaying a page. This is the “thinking time” the server needs every single time the page is displayed. Obviously, if a lot of processing time is involved, your server performance will go down as the number of page visits goes up, because the server has to think harder and longer. This results in page load times getting even longer when the server is busy. It may even result in you being asked to move your site off its shared server because it is impacting other users.
The tricks for addressing this factor are all based around reducing the thinking time per page – so pre-caching the page is a common fix, either in a CDN (described later) or on your server. For instance, a typical unoptimized home page might run 300 database queries, which could be reduced to zero by calculating the page contents once and saving them – a practice I’ll describe as pre-caching in this article.
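As a rough illustration (the function names and cache location here are invented for the sketch – this is not Magento’s actual cache API), the pre-caching idea looks like this: do the expensive work once, save the result to a file, and serve the saved copy on subsequent requests.

```python
import os
import time

CACHE_DIR = "/tmp/page_cache"  # hypothetical cache location


def build_page(slug):
    """Stand-in for the expensive work: database queries, template rendering."""
    time.sleep(0.05)  # simulate slow queries
    return f"<html><body>Contents of {slug}</body></html>"


def get_page(slug, max_age=300):
    """Serve a pre-built copy if it is fresh; otherwise build and save it."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    path = os.path.join(CACHE_DIR, slug + ".html")
    if os.path.exists(path) and time.time() - os.path.getmtime(path) < max_age:
        with open(path) as f:      # cache hit: no queries at all
            return f.read()
    html = build_page(slug)        # cache miss: do the expensive work once
    with open(path, "w") as f:
        f.write(html)
    return html
```

On the first request the page is built and saved; every request after that is a simple file read, which is where the 10x-or-more speedup comes from.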
Page Payload – The download factor
The second major performance factor is what we call “page payload”: the amount of data that must be downloaded to display your page. For most home pages this should be around the 500k–600k mark, and for other pages around 200k–300k. The more your home page downloads, the slower it will display – it’s common for unoptimized home pages to download 3mb of data, which can take ages and makes the page seem really slow on a slow connection. Obviously, part of optimizing an eCommerce site is making sure the page displays by downloading as little as possible, and there are some tricks to that. In general, it’s possible to reduce a fairly average, poorly optimized payload of 2000k down to about 600k or less using techniques such as compression, caching, and image optimization.
Another closely related factor is the number of requests made by a page. An unoptimized homepage can typically make some 150 requests to the server to display the page, and even if the objects being requested are small, this makes a page very slow. This can also be reduced by 50% or more by using simple techniques such as using the Google CDN, and JS and CSS file merging.
In the remainder of this article we’re going to cherry pick some of the simplest and most useful concepts.
Strategy 1 – Pre-Caching
Caching is used to reduce the TTFB “thinking time” and thus make the page display faster. Essentially, there are three types of caching – pre-caching, ISP-based caching, and CDN caching. Though many pages could be written on each of these, we’ll just overview them quickly here.
Pre-caching does all the time-consuming work of building out the page contents and looking up details in the database once, and saves the results in a ready-to-go file. This can then be recalled very quickly (often 10x or more quickly!) when needed, thus reducing TTFB and server load. This is a critical step and there are some very good pre-caching tools for Magento, though they often require knowledgeable developers to set up.
Strategy 2 – ISP and/or CDN Caching
ISP-caching in my book is also known as using other people’s caches! The idea is that all Internet providers (ISPs) will cache internet objects when they fetch them for users. This means that when they are next requested, the objects will come from their caches rather than your server, provided you have sent the required “headers” (in this case, an Expires header). This gives you a good speed boost for free and also removes a degree of load from your server. If this doesn’t quite make sense, think of it this way – once someone on each ISP loads your site, they remember the details and serve them without touching your site.
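For example, on Apache with the mod_expires module enabled, a .htaccess fragment along these lines sends the Expires headers that let ISP caches hold on to your static objects (the lifetimes here are purely illustrative, not recommendations):

```apacheconf
# Ask downstream caches to keep static objects (requires mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType text/css   "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```

Once headers like these are in place, repeat visitors on the same ISP are largely served from the ISP’s cache rather than from your server.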
CDN caching is used for large, active sites, and involves a high-speed Content Delivery Network (CDN) which is very good, and very fast, at grabbing and remembering web content. So long as your web server sends the correct Expires headers, you can include a CDN in your web hosting setup so that most, if not all, requests are served from the CDN.
A critical problem with any form of caching is in caching webpage text (HTML) – the problem being how to handle parts of pages that change depending on what you are doing. For instance, you may have items in a shopping cart and the page needs to display that at the top. This can partially prevent caching, and the technique used to resolve it is nicknamed “hole punching”: it involves grabbing just that small area directly from the server while using the rest of the page from the cache. This is still a huge reduction in the amount of work required to produce a page, but will often require some work or professional expertise to set up.
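As a rough sketch of the idea (the names here are invented – real Magento hole-punching extensions are more involved), the cached page carries placeholders, and only the small dynamic fragments are computed per request and spliced in:

```python
# The whole page comes from cache, except for marked "holes".
CACHED_HOME = (
    "<html><body>"
    "<div id='cart'><!-- HOLE:cart --></div>"
    "<div id='content'>...large, cacheable page body...</div>"
    "</body></html>"
)


def render_cart_fragment(items):
    """The only part computed per request: the visitor's cart summary."""
    return f"{len(items)} item(s) in your cart"


def punch_holes(page, fragments):
    """Splice the per-visitor fragments into the cached page."""
    for name, html in fragments.items():
        page = page.replace(f"<!-- HOLE:{name} -->", html)
    return page


page = punch_holes(CACHED_HOME,
                   {"cart": render_cart_fragment(["sku-1", "sku-2"])})
```

The expensive page body is never rebuilt; only the tiny cart fragment costs any thinking time on each request.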
In the simplest cases of caching it’s about reducing the number of hits on your server by allowing the ISP to cache fixed objects, such as images and CSS, and this in itself makes a huge difference to site load time and also to server load.
Strategy 3 – File Compression
Web servers can also serve many file types in compressed form. This typically saves a minimum of 50% of file size (i.e. 600k becomes 300k), although it can save as much as 70% for some file types. Obviously, if a user only has to download half the data, everything will appear to be twice as fast.
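The savings are easy to demonstrate with a quick sketch using Python’s standard gzip module – HTML, being repetitive text, compresses particularly well (the markup below is made up for the demonstration):

```python
import gzip

# A repetitive chunk of markup, like a typical product listing page
html = ("<div class='product'><span class='price'>$9.99</span>"
        "<a href='/item'>Buy now</a></div>\n" * 500)

raw = html.encode("utf-8")
compressed = gzip.compress(raw)

ratio = len(compressed) / len(raw)
print(f"{len(raw)} bytes -> {len(compressed)} bytes ({ratio:.0%} of original)")
```

This is exactly what the web server does on the fly when compression is enabled: the browser advertises gzip support, the server sends the compressed form, and the browser decompresses it transparently.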
Many shared servers allow you to specify that some files should be compressed. Even better, some will allow you to pre-compress files and then serve those files directly. On Apache 2.x the module for on-the-fly compression is mod_deflate; the older mod_gzip module offered similar functionality.
The problem here comes on shared servers, many of which don’t allow file compression for the simple reason that it requires a lot of processing time and slows sites down for other users. Talk to your host and see what is possible.
Strategy 4 – Image Compression
Computer screens, in relation to images, are actually quite small, and this is a fact that is often missed when sizing images for the web. Web-optimized images are generally very small and can be as little as 20k. Images can also be optimized by ensuring they are efficiently encoded – this can save as much as 50% off images!
When this saves 50% off a 400k image, you’re talking about a big improvement in display time, which adds up quickly when done across all images. It’s common for an image-rich home page to carry 400k–500k of images which can be optimized down quite a lot.
A related technique uses progressive (interlaced) JPEG encoding to make images display while still loading. This can halve site load time visually; that is, the person visiting the site sees it display in half the time. The images continue to download and refine in quality after first displaying, and for most eCommerce sites this makes no difference to the user experience.
Strategy 5 – Replacing Apache or offloading static content
Apache, unfortunately, is particularly slow at static file serving – at serving files that don’t change. One way of working around this is to move images off to a service such as Amazon’s S3, which allows almost unlimited downloading at a reasonable price. This acts as a second server, which means your site can download items in parallel, providing a big speed boost and moving a lot of load off your server.
Another approach is to completely replace the Apache software with another webserver program. We like nginx, and benchmarks show that it can serve images and fixed content almost 10x faster than Apache (it uses the kernel’s sendfile facility for these files). There are a number of other fast alternative webservers, as it turns out that Apache is one of the slower systems (though it can be tuned well). This generally requires some sort of VPS or dedicated server. (Paradoxically, Apache with mod_php can be faster at serving dynamic pages!)
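A minimal nginx fragment for fast static serving might look like this (the domain, docroot, and cache lifetime are illustrative):

```nginx
server {
    listen 80;
    server_name shop.example.com;   # illustrative domain
    root /var/www/magento;          # illustrative docroot

    # Hand static assets straight off the disk, with long cache lifetimes
    location ~* \.(jpg|jpeg|png|gif|css|js|ico)$ {
        sendfile on;    # kernel-level file transmission
        expires 30d;    # also sends the Expires headers discussed earlier
    }
}
```

Dynamic Magento pages would still be handled separately (via PHP-FPM or a backend Apache), but the bulk of a page’s requests – the static assets – never touch PHP at all.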
We’ve merged these two quite different concepts here, but they’re worthy of some separate discussion which we don’t have room for here.
Strategy 6 – Combining JS and CSS Files
Magento pages reference many small JavaScript and CSS files – a site can download 30 or 40 of these small files during a single page load. Once loaded they are of course cached, but the first load can be very slow. Magento offers a facility called “Combine JS and CSS files” which condenses multiple small files into one large file, thus turning many small stop-and-start downloads into a single large continuous download, which is generally very much faster.
This technique is useful across all site types, not just Magento!
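Conceptually the merge step is little more than concatenation; here is a toy sketch (file names and contents invented – Magento’s real implementation also rewrites URLs and handles versioning):

```python
def combine_assets(sources, merged_name="merged.css"):
    """Concatenate many small asset files into one, cutting request count."""
    parts = []
    for name, body in sources.items():
        parts.append(f"/* from {name} */\n{body}")
    return merged_name, "\n".join(parts)


# Three small stylesheets become a single download
name, css = combine_assets({
    "reset.css":  "body { margin: 0; }",
    "layout.css": ".grid { display: flex; }",
    "theme.css":  ".btn { color: navy; }",
})
```

Instead of three stop-and-start requests the browser makes one, and on a page with 30–40 such files the saving is substantial.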
Strategy 7 – Test and Measure
There is no replacement for testing and measuring. You simply can’t be certain a webserver will work and continue to perform acceptably without actually checking! This may seem obvious, but it often gets forgotten, even for large endeavours. Don’t be left embarrassed – always check!
The best all-round test tool I’ve found so far is www.webpagetest.org. This site gives your website a performance grade in a number of areas and makes it relatively clear which areas need followup. Google PageSpeed Insights gives similar information, as does the YSlow plugin; your developers should be able to use all of these to assess your site and come up with recommendations. Remember, every site is different, and some people just don’t realize that their homepage is downloading 4mb every time it is visited – it just “seems” slow!
There are also a number of systems available for simulating load on a webserver, for example, www.blitz.io. These let you be confident that your site will perform under pressure, and load testing should be done after making improvements based on the tools above. This is essential for a large public-facing site, especially if you have large campaigns coming up. It’s typical for an unoptimized Magento site, with a 2mb payload and maybe 150 objects per page, to crash a server even on small campaigns.
Strategy 8 – Using developers who know Magento well
Several of our customers have experienced problems after using cheap overseas developers who modified Magento core files, making future upgrades more expensive and difficult. One of our customers had developers modify core files, introducing a difficult-to-isolate bug where shopping cart contents were lost under load, making purchase impossible.
We can’t overemphasize the importance of using good developers. While it may seem more expensive initially, some of the mistakes made in the early stages are impossible to fix without rebuilding a site from scratch, especially if core files have been modified.
To ensure your developers understand Magento, ask them how many sites they have developed and ask for a few samples to look at. You might even ask them how they would make a Magento site faster and see how many of the suggestions in this article they’re able to produce! References from other customers are also useful, as is meeting them in person and discussing options.
Bear in mind that even good developers may not fully understand web performance – it is a specialist area. It’s also terribly easy for a small change to break an aspect of performance, so you should use the above tools regularly – particularly one that assesses your download size.
A lot more could be said about performance than we’ve been able to fit into this short article; however, we hope it helps you understand the issues!
As you will have seen, web performance is not a simple issue – often many factors combine to get a site loading quickly, and combining those factors can produce stunning results, where a site doesn’t even seem like the same site.
Some example results we’ve been able to achieve using these principles are reducing site load times from over 12 seconds to under 3 seconds for a first load of a homepage, and for secondary pages, from 7 seconds to under 2 seconds.
The key to getting your site to perform excellently is going to be taking the time to test and to evolve your site towards better performance. There’s just no substitute for testing and tweaking on an ongoing basis.