Optimizing Large Images for Maximum Quality and Performance

Regular visitors to my site will know that lately I have been absolutely stretched to my limits. After all, I am human like the rest of us (unless there are others that walk amongst us?). Considering the enormous demands on my time, I am glad to have found just enough of it to fit this in for you.

As you know, images often place the greatest demand on our poor servers, especially when they are used for aesthetic purposes, whereby a site can consist of images numbering in the tens. The usual solution is to sprite those little buggers into a single large image and use CSS to position the individual images wherever they are required. However, that is not what this post is about.

Recently I have been working on a skin for Thesis 2.1, and one of the requirements for that skin was a gallery capable of loading well-defined, reasonably large images quickly. Sounds fairly simple…

…however

Images of this order are usually pretty darn big, which presents us with a problem.

I may be good at optimizing websites, but I am by no means a miracle worker; I simply research and experiment in order to find solutions to problems.

The Mission

  • Reasonably large images (1400px max, whether that's portrait or landscape)
  • Smallest possible file size without noticeable loss of quality.
  • Fastest possible load times.

Now you may be thinking that with today's larger screens, 1400px seems a little small. However, other dimensions can easily be added and swapped out depending on screen sizes, so 1400px was selected as the middle ground for the largest possible set of devices.
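(As a side note, generating those alternative dimensions is easy to script. The sketch below uses the Pillow library in Python; the widths, filenames, and quality value are placeholders of my own, not settings from this workflow.)

```python
from PIL import Image

# Generate a few alternative widths from the 1400px master image so that
# different screen sizes can be served an appropriately sized file.
# The widths and filenames below are placeholders.
widths = [1400, 1024, 640]

with Image.open("master.jpg") as img:
    for w in widths:
        h = round(img.height * w / img.width)        # keep the aspect ratio
        resized = img.resize((w, h), Image.LANCZOS)  # high-quality downscale
        resized.save(f"photo-{w}.jpg", "JPEG", quality=85, progressive=True)
```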

The file size must be small because bandwidth on mobile, for one, is generally neither unlimited nor cheap. The other reason is that if you are going to display thousands of images across the site, and the site is expected to be viewed by a fairly large audience, then you want to reduce the total amount of time Apache spends processing each request.

Naturally, achieving the above should result in faster overall load times across all devices and connections.

I must point out that when it comes to poor end-user connections there is very little I can do. If you can only download at, say, 200KB/s, then 10 images of around 200KB each will load only as fast as your connection can receive them.

The Methods

There is a whole bunch of tricks and practices you can use to speed up the overall rendering of your site: Gzip compression, caching, and a CDN, to name a few. But what I have found when it comes to images is that there is a certain limit to which they can be crushed before they explode onto your screen as an undesirable mess.

Shrink-O-Matic

The first thing I did was use methods I have used before, i.e. running the images through a tool such as Shrink-O-Matic.

Although Shrink-O-Matic produced some good results, I was still left with some pretty big images that were still taking time to load.

Time that I felt was not acceptable!

I did try a whole load of other typical procedures and, to be honest, none of them were any better, so let's take a look at the image in question.

The original image is 1400 x 933 and weighs in at an ugly 603KB.

Figure 1: Original image, unmodified. Click to see the full-sized image.

For the purposes of testing, I ran it through Shrink-O-Matic and kept the quality the same; in this case the original quality was 93/100.

Shrink-O-Matic produced an image weighing in at 285KB

Figure 2: Optimized with Shrink-O-Matic at the same quality as the original. Click to see the full-sized image.

As you can see, the size difference alone is pretty amazing already: from 603KB to 285KB (a 52.73% reduction). However, the more savvy folks out there will notice another key difference between them besides the size: one is progressive while the other is not.

Progressive JPEGs appear to load faster because they render the image in passes, starting out blurry and gradually sharpening as each pass arrives.
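If you want to check whether a given JPEG is actually progressive, here is a quick way to do it, assuming you have the Pillow imaging library for Python installed (the filename is just a placeholder):

```python
from PIL import Image

# Pillow records a "progressive" flag in the image info when a JPEG
# uses progressive (multi-scan) encoding.
with Image.open("example.jpg") as img:
    if img.info.get("progressive"):
        print("progressive JPEG")
    else:
        print("baseline JPEG")
```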

I could have stopped here and been done with it, but I decided to continue on my journey, and so shall you.

Shrink-O-Matic at 85/100 quality produced an image weighing in at 187KB

Figure 3: Optimized with Shrink-O-Matic at 85/100 quality. Click to see the full-sized image.

OK, now we are getting somewhere: from 603KB to 187KB (a 68.98% reduction from the original image), and so far no noticeable loss of quality. However, we still have the problem of Shrink-O-Matic producing a non-progressive JPEG, but it is getting closer.

Again, I could have stopped here with an image that is 68.98% smaller than the original, which is pretty good, but I felt I could do better.

GIMP

That's right people, I got my GIMP out! And no, not the latex-laden sub from Urban Dictionary.

The GNU Image Manipulation Program (GIMP) is a tool that allows you to edit photos and view PSDs for FREE.

After many trials of exporting the image using various options, I stumbled across an option I had never heard of called chroma subsampling.

A quick search on Wikipedia gave me the answers I needed.

Chroma subsampling is the practice of encoding images by implementing less resolution for chroma information than for luma information, taking advantage of the human visual system’s lower acuity for color differences than for luminance.

It is used in many video encoding schemes — both analog and digital — and also in JPEG encoding.

So in layman's terms, it takes advantage of our visual system's poor ability to notice differences in color. For example, a red and an orange pixel next to each other could be combined into a single color and you wouldn't notice a thing.

By default, most images on the web use a chroma subsampling setting of 4:4:4, which means full definition: a block of 8 pixels (4 x 2) would be rendered as 8 separate pixels, each with its own unique color depending on the picture being displayed.

By changing the chroma subsampling to chroma halved (4:2:2) and setting the quality to 85, I was able to score a double whammy and produce an image that was not only smaller but, importantly, had no noticeable loss of quality.

In reality there was, of course, a loss of quality, but as we are unable to notice these small changes it makes no difference.

This is why the British wore red tunics: we see red poorly (for a start) when it comes to extracting details such as counting lines in order to work out how big an army is. If each soldier was a pixel, you would just see one big red line rather than 100 separate red dots. This is basically how chroma halving works: by reducing the number of independent chroma samples along the horizontal axis, similar colors blur together, reducing the amount of data required to render the image.

The other benefit of using GIMP was that I could also maintain the progressive nature of the image.

Figure: GIMP export settings used - quality 85 with 4:2:2 chroma subsampling.
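If you would rather script this export than click through GIMP, the sketch below is a rough equivalent using Pillow; the library choice and filenames are my own assumptions, not part of the original GIMP workflow:

```python
from PIL import Image

# Re-encode a JPEG with roughly the same settings as the GIMP export above:
# quality 85, 4:2:2 (chroma halved) subsampling, progressive encoding.
# The input and output filenames are placeholders.
with Image.open("original.jpg") as img:
    img.save(
        "optimized.jpg",
        "JPEG",
        quality=85,            # matches the 85/100 quality used in GIMP
        subsampling="4:2:2",   # chroma halved
        progressive=True,      # keep the progressive rendering behaviour
        optimize=True,         # let the encoder optimize its Huffman tables
    )
```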

The results of exporting the image in this manner were pretty amazing.

Exporting with GIMP at 85/100 quality and 4:2:2 (chroma halved) subsampling produced an image weighing in at 148KB.

Figure 4: Exported with GIMP using 85 quality and 4:2:2 subsampling. Click to see the full-sized image.

Astonishing: the image is a staggering 75.45% smaller than the original, without noticeable loss of quality, and is still a progressive JPEG. Result!

It’s not quite the end of the story though.

GIMP is a nice tool, but what if you have 100s or 1000s of images in a folder?

I looked across the net for a tool and found one that claimed to perform the miracle, but it did not work as intended. It did, however, point me in the right direction.

ImageMagick

ImageMagick is a command-line tool that allows you to manipulate vast numbers of images in a folder quickly and reliably.

But I don't expect you to go and learn how to use that sort of tool, and some servers don't even have it installed by default, which could present a problem for you. Rest assured, I have a tool that I am working on; it works already, but I need to make it more user friendly before I can release it to the public.
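In the meantime, if your machine does have ImageMagick available, a simple batch pass over a folder can be sketched in a few lines of Python. This is only my rough illustration, not the tool mentioned above; the folder names are placeholders and it assumes ImageMagick's convert command is on your PATH:

```python
import subprocess
from pathlib import Path

SRC = Path("images")             # placeholder: folder of original JPEGs
DST = Path("images-optimized")   # placeholder: output folder
DST.mkdir(exist_ok=True)

# Re-encode every JPEG with ImageMagick: quality 85, 4:2:2 chroma
# subsampling, and progressive ("Plane") interlacing, mirroring the
# GIMP settings used earlier. Newer installs may expose the binary
# as `magick` instead of `convert`.
for src in sorted(SRC.glob("*.jpg")):
    subprocess.run(
        [
            "convert", str(src),
            "-quality", "85",
            "-sampling-factor", "4:2:2",
            "-interlace", "Plane",
            "-strip",                     # drop metadata to save a few more KB
            str(DST / src.name),
        ],
        check=True,
    )
```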

The future of images

JPEGs have been around for a while now and they are by far the most reliable image format for the web to date, at least for photos and color-intensive images.

But there is hope on the horizon. Two new formats are battling it out, and their respective browsers already support them, but there isn't much collaboration going on right now, which could mean a long road ahead before we see them adopted universally.

One of those contenders is WebP (pronounced "weppy").

WebP produced an image that was 65.7KB without any noticeable loss.

Amazing results, but currently only Chrome supports it natively.
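If you want to experiment with WebP yourself, Pillow can export it as long as it was built with libwebp support. Again, this is just a rough sketch with placeholder filenames and a quality value of my own choosing:

```python
from PIL import Image, features

# Pillow can write WebP only if it was compiled against libwebp.
if features.check("webp"):
    with Image.open("original.jpg") as img:
        img.save("original.webp", "WEBP", quality=85)   # lossy WebP at quality 85
else:
    print("This Pillow build was compiled without WebP support")
```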

I hope you have enjoyed this little rundown and I look forward to releasing this tool to allow you to optimize your images.

Meet the Author

Matthew Horne

Matthew Horne is a web developer who specializes in optimized development. He also builds custom solutions instead of reverting to plugins. Matthew has a strong understanding of PHP, JavaScript, and jQuery.

9 comments
  • Rudd Oct 20, 2013, 4:14 pm

    Instead of using any plugin to compress the size of the image, I usually use TinyPNG (for PNG) and Jpegini (for JPEG).

    • Matthew Horne Oct 21, 2013, 3:14 am

      Yeah, those tools are also very useful, but I have been able to achieve far better results with ImageMagick and some code that I made that allows you to point it at a folder and batch-optimize all images. I just need to find a way to make it easy for people to use.

  • Matt Snyder Oct 23, 2013, 3:08 pm

    I’ve always used Photoshop with the save for web feature @ around 60% quality. I haven’t really tried using progressive though.

    • Matthew Horne Oct 25, 2013, 5:34 am

      Progressive just creates an illusion, but it is a useful one at that. 60% seems rather low in terms of quality; I would not go lower than 85%, but then again I am not sure what algorithm Photoshop uses. Plus most people aren't going to pay for Photoshop to optimize images.

      If you click on the first image, download the large version, open it with Photoshop, and then save for web at 85%, what size do you get?

  • Matt Snyder Oct 29, 2013, 11:29 am

    You are right, the average person doesn’t need Photoshop to edit/optimize images. I only use it when I build websites. Lightroom might be a better choice for the average user. I use it 95% of the time to edit my images.

    Here is the image at 60% saved for web using Photoshop [153k] http://alamode-designs.com/Original-Image.jpg.

  • Fred Jones Nov 4, 2013, 3:32 pm

    Well thought through, and something all people should think about whilst building their site. A well-optimised bank of images on your website will allow it to run much faster, whether you're hosting your site on a shared server, cloud hosting, or your own dedicated hosting platform.

    Sites that load faster are ranked higher by Google and are less likely to be abandoned by casual browsers. There is nothing worse than waiting for a site to load, and people's patience is getting lower and lower.

    We have done this for several larger clients who were wondering why they weren’t ranking and why people were not staying on their site.

    • Matthew Horne Nov 6, 2013, 4:13 am

      It's very much true. My site doesn't look like the best in the world, but I have been at around an Alexa rank of 50k for over a year, and under 100k two months after I started, because I focused on efficiency over aesthetics. That will change with my new site, which will be aesthetically good as well as optimized.

  • Bob Tolbert Nov 24, 2013, 12:06 pm

    After a lot of testing we are using the Photoshop "Save for Web" option, setting the quality to less than 50% and checking progressive on for a better loading illusion.

    This gives us the best image size without any noticeable loss in most cases.

    The only problem is that Photoshop's chroma subsampling option is tied to the JPEG quality, and it only kicks in if the quality is set to less than 50%.

    Another thing is that Photoshop has a batch option that enables us to edit multiple images at once.

    Great post

    • Matthew Horne Nov 24, 2013, 1:53 pm

      Thank you for the insight. I don’t have Photoshop but for those that do your comment will certainly be helpful.
