OK, so maybe “secrets” is a bit of a stretch in this case. Nevertheless, several of the topics touched on in this post remain a mystery to many clients and professionals alike, so we thought we should lay out some keys to well-performing web sites in more detail. Before we begin, let’s address the first likely question: “who cares?” Yes, it’s true that almost everyone has broadband, and even the new iPhone will be reasonably speedy on the web, but one thing will always be true: users don’t like to wait. In fact, we can be sure that as devices become faster, a user’s patience will only decrease. So to fight attrition (users becoming so frustrated with a site’s performance that they never return), we must always keep web site optimization in the back of our minds. After all, nothing kills a killer app faster than slow performance.
There have always been great tools and resources that help web developers improve the user experience of their sites by following best practices. What’s often harder to come by, however, are specific techniques that not only satisfy those best practices but also address more situational issues. In other words, we’re going to share techniques that resolve nearly all of the most significant performance problems a web site or web application can face.
Start by understanding how your page(s) load, using Firebug for Firefox 2+ or IEInspector for Internet Explorer 5+. For those interested in Safari, check out this post from the WebKit (Safari) team. It’s straightforward to find the area within either plug-in that lets you observe the HTTP transactions and understand your page’s behavior from a transactional standpoint. We recommend Firebug because it’s free; however, using IEInspector will let you see how page rendering differs between IE and Firefox. Some relevant issues that impact performance, but that we’re not going to address in this post, are:
- Rendering performance — how does your markup and style sheet actually behave as the browser renders it and how does that impact the perceived speed of the page from a user’s perspective.
- Database latency and page parse time — dynamically generated pages and assets, such as PHP server-side includes or tables built from database entries, play a role in a site’s performance. We’ll set those issues aside for now and assume you’ve already optimized them as far as you can with server-side script caching, database caching, and so on.
With those set aside, ask yourself a few questions:
- Who is your target audience and what are the limitations of their browsing environments?
- How much data would your server end up having to deliver if it were answering requests from thousands of concurrent users?
- Aside from the actual “horsepower” of your web server and the quality/limitations of your server’s bandwidth, what are the things that you can change about your site that will realize the biggest impact? In other words, let’s apply the 80/20 rule.
The following concepts satisfy nearly any conceivable answer to the questions above:
- Reduce file sizes of assets and reuse them as much as possible
- Optimize HTTP transactions
- Further reduce the size of text-based assets
- Reduce the number of files
- Put everything in its place
- Scale to fit
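As a rough illustration of the first few points, here is a minimal Python sketch (the stylesheet contents are hypothetical stand-ins for real files on disk) showing how concatenating two text assets into one file cuts the request count, and how gzip-compressing the result shrinks the byte count — text assets like CSS and JavaScript are highly repetitive and compress very well:

```python
import gzip

# Hypothetical contents of two separate stylesheets; a real build step
# would read these from disk.
reset_css = "html, body { margin: 0; padding: 0; }\n" * 50
layout_css = ".column { float: left; width: 33%; }\n" * 50

# One concatenated file means one HTTP request instead of two.
combined = (reset_css + layout_css).encode("utf-8")

# Gzip-compressing the combined file reduces bytes on the wire.
compressed = gzip.compress(combined)

print(f"uncompressed: {len(combined)} bytes")
print(f"gzipped:      {len(compressed)} bytes")
```

In practice you would pre-compress assets at build time (or let the web server do it) rather than in application code, but the size reduction is the same either way.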
Revisiting the issue of scale, now from a different point of view: use of a Content Delivery Network (CDN) has become a much more accessible solution to this problem. Since the days when Akamai was seen as an innovator and the “only” answer to insatiable demand for a site’s content (or a way to overcome poor development practices), the CDN has been instrumental in reducing latency by providing multiple regional POPs for your assets. Nothing against Akamai, but there are now a number of more affordable options that put this powerful solution within reach of more people. When your web application simply isn’t performing as well as you would like during peak times, a CDN lets you offload the busy work of delivering static assets and focus your web server on the thinking. Obviously point #4 should not be skipped when moving to this solution; you’ll see more leverage than you can imagine when the two are combined, not to mention a tremendous savings on bandwidth charges (usually around 60%). Meanwhile, users will feel like your site or application is faster, because most of the assets a given user downloads will come from the closest possible point on the web.
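A common way to adopt a CDN incrementally is to route only static-asset URLs through the CDN hostname while leaving dynamic pages on your own server. A minimal sketch of that idea follows; the hostname and path prefixes are hypothetical, and a real deployment would use the hostname your CDN provider assigns:

```python
# Hypothetical CDN hostname; substitute the one assigned by your provider.
CDN_HOST = "https://cdn.example.com"

def asset_url(path):
    """Serve static assets from the CDN; leave dynamic pages on the origin."""
    if path.startswith(("/images/", "/css/", "/js/")):
        return CDN_HOST + path
    return path

print(asset_url("/css/site.css"))   # rewritten to the CDN hostname
print(asset_url("/checkout"))       # dynamic page, stays on the origin
```

Centralizing URL generation in one helper like this makes it easy to switch CDN providers, or to turn the CDN off entirely, without touching your templates.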
- Throw some horses at it
For more complicated situations, you can look at throwing more hardware at the problem once the previous items have all been addressed and implemented. Specifically, we’re referring to the Amazon Elastic Compute Cloud (EC2). This tip deals more with the web server component of a solution, so consider it a bonus for those of you building computationally intense applications. It’s a phenomenal offering from Amazon (and there are others from them worth considering) that lets you instantly scale up and access a tremendous amount of computing resources on the fly. Services like BrowserCam come to mind as candidates for this kind of solution.
So let’s see how techniques 1-5 combine to take shape:
It’s hard to argue with results!
A bonus tip is to use YSlow to get even more from Firebug! We’ve achieved some great performance with our home page:
But YSlow shows us where we can still improve:
Unfortunately, YSlow doesn’t pick up on the pre-compressed content we send to users; no doubt we’ll have to play with our headers more to satisfy #3 and #4 at the same time. We’ll work on these things as we see the need; regardless, the techniques we discuss (points 1-5) are demonstrated in the results shown in these screen shots. Many of you may be familiar with classic tools like Andy King’s Web Page Analyzer, which are a great starting point for identifying troublesome areas of your page, but in recent years Yahoo!’s developer network has gathered in one place the findings that we’ve uncovered (“the hard way”) over the years. As with this post, you’ll still have to develop your own solutions; nonetheless, we’d recommend heading over to developer.yahoo.com, where they’ve done a great job documenting best practices for creating optimal user experiences, including:
- Reduce HTTP requests (as stated above)
- Reduce DNS lookups
- Avoid HTTP redirects
- Make your AJAX cacheable
- Post-load components
- Pre-load components
- Reduce the number of DOM elements
- Split components across domains
- Minimize the number of inline frames
- Eliminate 404s (file not found errors)
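A simple way to spot-check a response against a couple of these rules is to audit its headers. The sketch below inspects a plain headers dict rather than making a live request, and it checks only a small subset of the rules above (compression and caching); the function name and messages are our own:

```python
def audit_headers(headers):
    """Return a list of issues found in an HTTP response's headers.

    Checks only two of the best practices: gzip compression and
    cacheability. Header names are case-insensitive per HTTP.
    """
    h = {k.lower(): v.lower() for k, v in headers.items()}
    issues = []
    if "gzip" not in h.get("content-encoding", ""):
        issues.append("text asset not served gzip-compressed")
    if "cache-control" not in h and "expires" not in h:
        issues.append("no Cache-Control or Expires header; asset cannot be cached")
    return issues

# A response that is compressed but sets no caching headers:
# only the missing-caching issue is flagged.
print(audit_headers({"Content-Encoding": "gzip"}))
```

You could feed this the headers captured by Firebug (or fetched with any HTTP client) to quickly triage which assets still need attention.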
So tell us what you think. If you’re interested, we can put together some examples for you and/or touch on server-related optimization techniques as well.