Site Optimization Post-YSlow
I talked about YSlow a while ago and thought it had some interesting things to say, but I wasn't fully convinced it was optimization for the masses (where the masses means pretty much everyone not running a gigantic web site). Fortunately, I recently came across this piece on other optimizations you can make to your site that are much more targeted at regular folk.
It's quite good and detailed and well worth a full read. There are a few things I'd like to highlight from the piece; the first is the one I'm most interested in: using image sprites to limit the number of downloads. The technique involves combining several images used on a page into one big image and then using CSS to show only the part of the image you want. I've used this on various occasions in a limited way, for rollover menus: each menu item is a single image where one half is the off state and the other half is the on state, and on rollover the CSS shifts the background to show the on half.
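The rollover trick looks roughly like this in CSS (the selector, file name, and sizes here are all made up for illustration; one possible layout puts the off state in the top half and the on state in the bottom half):

```css
/* Each menu item is one image, twice the visible height:
   off state in the top half, on state in the bottom half.
   On hover we slide the background up to reveal the on half. */
a.menu-home {
    display: block;
    width: 120px;
    height: 30px;
    background: url(menu-home.png) no-repeat 0 0; /* top half = off */
    text-indent: -9999px; /* hide the link text, show only the image */
}
a.menu-home:hover {
    background-position: 0 -30px; /* bottom half = on */
}
```

The nice part is that both states live in one file, so there's no flicker from the browser fetching the hover image on first mouseover.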
This technique takes that to the limit and combines many images into one big one. This has two effects. The first, and major, one is that it reduces the number of requests your browser has to make, which should really improve performance – combining many small images (if you've got that kinda site) should dramatically improve the mobile experience, where latency is high and each request takes a lot of time. The other effect is that the total download should be smaller, since one large image compresses more efficiently than many small ones. There's even a tool to help you build these sprites: it combines all the images you upload and even gives you the HTML you can use to display each portion. Nice!
The main downside is increased complexity in your CSS: you can't just place an image, you place the big master image and then add sizing and coordinate information to pull just the right piece out at each point. Changing any image requires changing the master, and god forbid you have a nice grid of images and one of them needs to grow – you'll need to reshuffle a lot of things around. Still, I think this is well worth investigating, especially if your site uses a lot of small, static images.
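To make the coordinate bookkeeping concrete, here's a minimal sketch of a multi-icon sprite (the file name, class names, and pixel offsets are illustrative – yours come from wherever each image landed in the master):

```css
/* One master image holds a row of 16x16 icons; each rule
   exposes one tile by offsetting the background. */
.icon {
    background-image: url(sprites.png);
    background-repeat: no-repeat;
    display: inline-block;
    width: 16px;
    height: 16px;
}
.icon-home { background-position: 0 0; }      /* first tile */
.icon-mail { background-position: -16px 0; }  /* second tile across */
.icon-rss  { background-position: -32px 0; }  /* third tile */
```

You can see where the grow-one-image pain comes from: every offset after the changed tile has to be recomputed.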
Another bit he looks at is hosting images and JS/CSS on different domains. It turns out that browsers limit the number of simultaneous downloads per domain, so if all your assets come from a single domain you're going to be capped by that – move them out to different domains and you sidestep the issue. Of course, it's a fine balancing act, since you also want to limit the number of DNS lookups the browser has to do – the authors of the piece recommend 2 to 4 domains, which seems eminently reasonable. They take this idea to the next level when they recommend keeping these asset domains cookie-free, so that browsers save the overhead of sending whatever cookies your main site is using along with every asset request. That's deep thinking.
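In the page itself this just means pointing asset URLs at the extra domains (the domain names here are made up for illustration):

```html
<!-- Assets split across two cookie-free domains; the page itself
     stays on www.example.com, which is where the cookies live. -->
<link rel="stylesheet" type="text/css"
      href="http://static1.example.com/css/main.css">
<script type="text/javascript"
        src="http://static1.example.com/js/site.js"></script>
<img src="http://static2.example.com/images/logo.png" alt="logo">
```

Since the static domains never set cookies, the browser has nothing to send with those requests.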
This is all fairly easy to implement, since you can just add a virtual host to your Apache (you're using Apache, right?) – it can be more or less the same setup as your main host, just make sure you prevent it from serving your HTML, since search engines do not like duplicated content. This is easy when you keep all your assets under specific directories like /images, /css and /js.
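One way to set that up is a vhost whose DocumentRoot is an empty directory, with only the asset directories aliased in (server name and paths here are illustrative; the expiry lines assume mod_expires is loaded):

```apache
# Cookie-free asset vhost: only static directories are exposed,
# so the HTML itself is never served from this domain.
<VirtualHost *:80>
    ServerName static.example.com
    DocumentRoot /var/www/empty

    Alias /images /var/www/site/images
    Alias /css    /var/www/site/css
    Alias /js     /var/www/site/js

    # Far-future expiry for static assets (requires mod_expires)
    ExpiresActive On
    ExpiresDefault "access plus 1 year"
</VirtualHost>
```

The empty DocumentRoot is what keeps search engines from finding a duplicate copy of your pages on the asset domain.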
He does recommend compressing your content on the way out. This reduces the size of your text deliveries (HTML, CSS, etc.), which is good. But I've always had reservations about this: it adds load on the server side, since it has to compress everything before serving it up. So if your server comes under load – which is when performance will suffer most – it seems like this would exacerbate the problem. I remain mildly skeptical of this technique and would like to see some benchmarks/performance numbers for it. Has anyone out there put this in and seen real changes?
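In the absence of published numbers, here's the kind of quick sketch you could run yourself to get a feel for the trade-off – bytes saved versus CPU time spent (the sample payload below is made up for illustration; real pages will vary):

```python
import gzip
import time

# A rough feel for the gzip trade-off: compress an HTML-sized
# payload and report the size saved vs. the CPU time it cost.
html = ("<html><body>"
        + "<p>Hello, world. Some repetitive markup.</p>" * 500
        + "</body></html>").encode("utf-8")

start = time.perf_counter()
compressed = gzip.compress(html, compresslevel=6)  # a middling level
elapsed = time.perf_counter() - start

print(f"original: {len(html)} bytes")
print(f"gzipped:  {len(compressed)} bytes")
print(f"time:     {elapsed * 1000:.2f} ms")
```

Markup compresses very well, so the bytes saved are usually substantial; the open question is whether the per-request CPU cost matters once the server is already busy.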
It's a really good piece with a lot of recommendations I didn't highlight – I just pulled out the ones I found most interesting. He recommends LiveHTTPHeaders (a Firefox extension) in there, but I recommend Tamper Data, which does the same thing but better, IMHO. Give it a read if you run a website; it's sure to give you some ideas.