Regular visitors to my site will know that lately I have been absolutely stretched to my limits. After all, I am human like the rest of us (unless there are others that walk amongst us?). Considering the enormous demands placed on my time, I feel good to have found just enough of it to fit this in for you.
As you know, images often place the greatest demand on our poor servers, especially when people use images for aesthetic purposes and a single site can end up pulling in dozens of them. If you didn't know, the solution to that would be to sprite those little buggers into one big single image and use CSS to position the individual images wherever they are required. However, that is not what this post is about.
Recently I have been working on a skin for Thesis 2.1 and one of the requirements for that skin was to produce a gallery that was capable of loading well defined images that are reasonably large whilst being able to load them quickly. Sounds fairly simple…
Images of this order are usually pretty darn big, which presents us with a problem.
I may be good at optimizing websites, but I am by no means a miracle worker; I simply research and experiment in order to find solutions to problems. The requirements were:
- Reasonably large images (1400px on the longest side, whether that's portrait or landscape).
- Smallest possible file size without noticeable loss of quality.
- Fastest possible load times.
Now you may be thinking that, with today's larger screens coming out, 1400px seems a little small; however, other dimensions can easily be added and swapped out depending on screen sizes. So 1400px was selected as the middle ground for the largest possible set of devices.
The file size must be small because mobile bandwidth, for one, is generally neither unlimited nor cheap. The other reason is that if you are going to display thousands of images across the site, and the site is expected to draw a fairly large audience, you want to reduce the total amount of time Apache spends processing each request.
Naturally, achieving the above should result in a faster overall load time across all devices and connections.
I must point out that when it comes to poor end-user connections there is simply very little I can do. If you can only download at, say, 200 kb/s, then ten images of around 200 KB each will load only as fast as your connection can receive them.
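To put some rough numbers on that, here is a back-of-envelope sketch of the best-case transfer time for a batch of images over a fixed-speed link. This deliberately ignores latency, TCP slow start and parallel connections, so real pages will be slower still.

```python
def transfer_seconds(image_kb: float, count: int, speed_kbps: float) -> float:
    """Lower bound on transfer time: total kilobits divided by link speed.

    Ignores latency, TCP slow start and parallel connections, so this is
    the absolute best case.
    """
    total_kilobits = image_kb * count * 8  # 1 KB = 8 kilobits
    return total_kilobits / speed_kbps

# Ten images of 200 KB each over a 200 kb/s link:
print(transfer_seconds(200, 10, 200))  # 80.0 seconds, best case
```

Eighty seconds just for the images: this is why every kilobyte shaved off matters on slow connections.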
There are a whole bunch of tricks and practices you can use to generally speed up the overall rendering of your sites. Gzip compression, caching, CDN to name a few. But what I have found when it comes to images is that there is a certain limit to which they can be crushed before they explode onto your screen as an undesirable mess.
The first thing I did was use methods I have used before, i.e. running them through a tool such as Shrink-O-Matic.
Although Shrink-O-Matic produced some good results, I was still left with some pretty big images that were still taking time to load.
Time that I felt was not acceptable!
I did try a whole load of other typical procedures and, to be honest, none of them were any better, so let's take a look at the image in question.
The original image is 1400 × 933 and weighs in at an ugly 603KB.
For the purposes of testing, I ran it through Shrink-O-Matic and kept the quality the same; in this case the original quality was 93/100.
Shrink-O-Matic produced an image weighing in at 285KB
As you can see, the size difference alone is pretty amazing already: from 603KB to 285KB (a 52.73% reduction). However, the more savvy folks out there will notice another key difference between them besides the size: the original is progressive, while the Shrink-O-Matic output is not.
Progressive JPEGs appear to load faster because they load the image in successive scans, starting out blurry and gradually building the image up, scan by scan.
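If you want to check whether a given file is progressive or baseline, you can look at which start-of-frame marker its JPEG stream uses: SOF0 (0xFFC0) means baseline, SOF2 (0xFFC2) means progressive. Here is a minimal sketch of that check; it is not a full JPEG parser, just enough to walk the marker segments before the image data.

```python
def is_progressive_jpeg(data: bytes) -> bool:
    """Walk JPEG marker segments looking for the start-of-frame marker.

    SOF2 (0xFFC2) indicates a progressive scan; SOF0/SOF1 (0xFFC0/0xFFC1)
    indicate a baseline one. Minimal sketch, not a full parser.
    """
    if data[:2] != b"\xff\xd8":               # SOI marker: not a JPEG at all
        return False
    i = 2
    while i + 4 <= len(data):
        if data[i] != 0xFF:
            return False                       # malformed marker stream
        marker = data[i + 1]
        if marker == 0xC2:                     # SOF2: progressive DCT
            return True
        if marker in (0xC0, 0xC1):             # SOF0/SOF1: baseline
            return False
        length = int.from_bytes(data[i + 2:i + 4], "big")
        i += 2 + length                        # skip this segment's payload
    return False
```

Feed it the raw bytes of a file (`is_progressive_jpeg(open("photo.jpg", "rb").read())`) and it tells you which flavour you have.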
I could have stopped here and been done with it, but I decided to continue on my journey, and so shall you.
Shrink-O-Matic at 85/100 quality produced an image weighing in at 187KB
OK, now we are getting somewhere: from 603KB to 187KB, a 68.98% reduction from the original image, and so far no noticeable loss of quality. However, we still have the problem of Shrink-O-Matic producing a non-progressive JPEG, but it is getting closer.
Again, I could have stopped here with an image 68.98% smaller than the original, which is pretty good, but I felt I could do better than this.
That's right, people, I got my GIMP out! And no, not the latex-laden sub from Urban Dictionary.
The GNU Image Manipulation Program (GIMP) is a tool that allows you to edit photos and view PSDs for FREE.
After many trials of exporting the image using various options, I stumbled across a setting I had never heard of called chroma subsampling.
A quick search on Wikipedia gave me the answers I needed.
Chroma subsampling is the practice of encoding images by implementing less resolution for chroma information than for luma information, taking advantage of the human visual system’s lower acuity for color differences than for luminance.
It is used in many video encoding schemes — both analog and digital — and also in JPEG encoding.
So in layman's terms, it takes advantage of our visual system's poor ability to notice differences in color. For example, a red pixel and an orange pixel next to each other could be combined into a single color and you wouldn't notice a thing.
By default, most images on the web use a chroma subsampling setting of 4:4:4, which means full definition: a 4 × 2 block of 8 pixels would be rendered as 8 separate pixels, each with its own unique color depending on the picture being displayed.
By changing the chroma subsampling to chroma halved (4:2:2) and setting the quality to 85, I was able to score a double whammy and produce an image that was not only smaller but, importantly, showed no noticeable loss of quality.
In reality there was, of course, a loss of quality, but as we are unable to notice these small changes it makes no difference.
This is (for a start) why the British wore red tunics: we see red poorly when it comes to extracting details, such as counting lines in order to work out how big an army is. If each soldier were a pixel, you would just see one big red line rather than 100 separate red dots. This is basically how chroma halving works: by reducing the number of independent pixels across the horizontal resolution, similar colors blur together, reducing the amount of data required to render the image.
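The data saving is easy to work out from the scheme's name. Counting raw Y, Cb and Cr samples for one 4 × 2 block shows that 4:2:2 needs a third fewer samples than 4:4:4 before the JPEG compressor even gets involved:

```python
# Samples needed for a 4x2 pixel block under each chroma subsampling
# scheme. Luma (Y) is always kept at full resolution; only the chroma
# channels (Cb, Cr) are reduced.
SCHEMES = {
    "4:4:4": (8, 8, 8),  # no subsampling: every pixel keeps its own chroma
    "4:2:2": (8, 4, 4),  # chroma halved horizontally
    "4:2:0": (8, 2, 2),  # chroma halved horizontally and vertically
}

def chroma_saving(scheme: str) -> float:
    """Fraction of raw sample data saved relative to 4:4:4."""
    full = sum(SCHEMES["4:4:4"])
    return 1 - sum(SCHEMES[scheme]) / full

print(round(chroma_saving("4:2:2"), 3))  # 0.333 - a third fewer samples
print(chroma_saving("4:2:0"))            # 0.5 - half the samples
```

The actual file-size saving after compression is smaller than this raw figure, but it is where the 603KB-to-148KB win starts.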
The other benefit of using GIMP was that I could also maintain the progressive nature of the image.
GIMP export settings used
The results of exporting the image in this manner were pretty amazing.
Exported with GIMP at 85/100 quality 4:2:2 chroma halved subsampling produced an image weighing in at 148KB
Astonishing: the image is a staggering 75.45% smaller than the original, with no noticeable loss of quality, and it is still a progressive JPEG. Result!
It’s not quite the end of the story though.
GIMP is a nice tool, but what if you have 100s or 1000s of images in a folder?
I looked across the net for a tool and found one that claimed to perform the miracle; it did not work as intended, but it did point me in the right direction.
ImageMagick is a command-line tool that allows you to manipulate vast numbers of images in a folder quickly and reliably.
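For the curious, the same settings used in GIMP above (quality 85, 4:2:2 chroma subsampling, progressive encoding) map onto ImageMagick's `-quality`, `-sampling-factor` and `-interlace Plane` options. Here is a small sketch that builds the shell command; the file names are placeholders, and the function only constructs the command rather than running it.

```python
import shlex

def magick_command(src: str, dest: str, quality: int = 85) -> str:
    """Build an ImageMagick command mirroring the GIMP export settings:
    quality 85, 4:2:2 chroma subsampling, progressive (Plane) interlacing.

    The paths are placeholders - point them at your own files.
    """
    args = [
        "convert", src,
        "-quality", str(quality),
        "-sampling-factor", "4:2:2",  # chroma halved horizontally
        "-interlace", "Plane",        # progressive JPEG
        "-strip",                     # drop EXIF and other metadata
        dest,
    ]
    return shlex.join(args)

print(magick_command("photo.jpg", "photo-optimized.jpg"))
```

For whole folders, ImageMagick's `mogrify` takes the same options and rewrites images in place, which is what makes it attractive for batches.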
But I don't expect you to go and learn a tool like that, and some servers don't even have it installed by default, which could present a problem for you. Rest assured, though, I am working on a tool of my own; it works already, but I need to make it more user friendly before I can release it to the public.
The future of images
JPEGs have been around for a while now and they are by far the most reliable image format for the web to date, at least for photos and color-intensive images.
But there is hope on the horizon. Two new formats are battling it out, and their respective browsers already support them, but there isn't much collaboration going on right now, which could mean a long road ahead before we see them adopted universally.
One of those contenders is WebP (pronounced "weppy").
WebP produced an image that was 65.7KB with no noticeable loss.
Amazing results, but currently only Chrome supports it natively.
I hope you have enjoyed this little rundown and I look forward to releasing this tool to allow you to optimize your images.