May 13, 2013
[Originally posted in the 2012 Performance Calendar. Reposting here for folks who missed it.]
There’s an elephant in the room that we’ve been ignoring for years:
window.onload is not the best metric for measuring website speed
We haven’t actually been “ignoring” this issue. We’ve acknowledged it, but we haven’t coordinated our efforts to come up with a better replacement. Let’s do that now.
What we’re after is a metric that captures the user’s perception of when the page is ready. Unfortunately, perception.ready() isn’t on any browser’s roadmap. So we need to find a metric that is a good proxy.
Ten years ago, window.onload was close enough. Plus it had other desirable attributes:

- window.onload means the same thing across all browsers. (The only exception I’m aware of is that IE 6-9 don’t wait for async scripts before firing window.onload, while most other browsers do.)
- window.onload is a page milestone that can be measured by someone other than the website owner, e.g., metrics services like Keynote Systems and tools like Boomerang. It doesn’t require website owners to add custom code to their pages.
- window.onload is a lightweight operation, so it can be performed on real user traffic without harming the user experience.
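To make the lightweight nature of this measurement concrete, here is a small sketch of the arithmetic a tool might use to derive the window.onload metric from Navigation Timing values. The function name and the sample timing object are mine, purely for illustration; in a real page the values would come from performance.timing inside a load handler.

```javascript
// Sketch: deriving the classic "onload" metric from Navigation Timing
// fields. The sample object is hypothetical; in a browser you'd read
// performance.timing inside a window 'load' event handler.
function onloadTime(timing) {
  // Milliseconds from the start of navigation to the load event.
  return timing.loadEventStart - timing.navigationStart;
}

// Hypothetical sample: navigation starts at t=0, load event at 5.2s.
const sample = { navigationStart: 0, loadEventStart: 5200 };
console.log(onloadTime(sample)); // 5200
```

Because this is just a subtraction of two numbers the browser already recorded, it costs essentially nothing on real user traffic.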
Fast forward to today and we see that window.onload doesn’t reflect the user’s perception as well as it once did.
There are some cases where a website renders quickly but window.onload fires much later. In these situations the user’s perception of the page is fast, but window.onload says the page is slow. A good example of this is Amazon product pages. Amazon has done a great job of getting above-the-fold content to render quickly, but all the below-the-fold reviews and recommendations produce a high window.onload value. Looking at these Amazon WebPagetest results we see that the above-the-fold content is almost completely rendered at 2.0 seconds, but window.onload doesn’t happen until 5.2 seconds. (The relative sizes of the scrollbar thumbs show that a lot of content was added below-the-fold.)
But the opposite is also true. Heavily dynamic websites load much of the visible page after window.onload. For these websites, window.onload reports a value that is faster than the user’s perception. A good example of this kind of dynamic web app is Gmail. Looking at the WebPagetest results for Gmail we see that window.onload is 3.3 seconds, but at that point only the progress bar is visible. The above-the-fold content snaps into place at 4.8 seconds. It’s clear that in this example window.onload is not a good approximation for the user’s perception of when the page is ready.
The examples above aren’t meant to show that Amazon is fast and Gmail is slow. Nor are they intended to say whether all the content should be loaded before window.onload vs. after. The point is that today’s websites are too dynamic to have their perceived speed reflected accurately by window.onload. The reason is that websites keep getting more dynamic, which widens the gap between window.onload and the user’s perception of website speed. In other words, this problem is just going to get worse.
The conclusion is clear: the replacement for window.onload must focus on rendering. This new performance metric should be more than “first paint”; it should capture when the above-the-fold content is (mostly) rendered.
I’m aware of two performance metrics that exist today that are focused on rendering. Both are available in WebPagetest. Above-the-fold render time (PDF) was developed at Google. It finds the point at which the page’s content reaches its final rendering, with intelligence to adapt for animated GIFs, streaming video, rotating ads, etc. The other technique, called Speed Index and developed by Pat Meenan, gives the “average time at which visible parts of the page are displayed”. Both of these techniques use a series of screenshots to do their analysis and have the computational complexity that comes with image analysis.
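To make the Speed Index idea concrete, here is a small sketch of the underlying arithmetic: given a series of (time, visual completeness) samples taken from screenshots, it integrates the “incompleteness” over time, so pages that show most of their content early score better. The function and sample data below are my own simplification, not WebPagetest’s actual code, which works on filmstrip images.

```javascript
// Sketch of the Speed Index idea: sum, over each interval between
// screenshots, the time-weighted fraction of the page NOT yet displayed.
// Completeness is 0..1; times are in milliseconds. A lower score is better.
function speedIndex(samples) {
  let si = 0;
  for (let i = 1; i < samples.length; i++) {
    const dt = samples[i].time - samples[i - 1].time;
    si += dt * (1 - samples[i - 1].completeness);
  }
  return si;
}

// Hypothetical filmstrip: 80% of the page visible at 1s, complete at 2s.
const frames = [
  { time: 0, completeness: 0 },
  { time: 1000, completeness: 0.8 },
  { time: 2000, completeness: 1 },
];
console.log(speedIndex(frames)); // 1000 + 200 = 1200
```

The arithmetic itself is trivial; the expensive part, as noted above, is producing the completeness values, which requires capturing and comparing screenshots.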
In other words, it’s not feasible to perform these rendering metrics on real user traffic in their current form. That’s important because, in addition to incorporating rendering, this new metric must maintain the attributes mentioned previously that make window.onload so appealing: standard across browsers, measurable by 3rd parties, and measurable for real users.
Another major drawback to window.onload is that it doesn’t work for single page web apps (like Gmail). These web apps only have one window.onload, but typically have several other Ajax-based “page loads” during the user session where some or most of the page content is rewritten. It’s important that this new metric works for Ajax apps.
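One building block that could help here is the User Timing API (performance.mark and performance.measure), which lets an app put its own timestamps around each Ajax-based “page load”. The sketch below is an assumption about how an app might use it; the mark names and the inbox scenario are hypothetical, and this still requires the site owner to add code, so it doesn’t by itself satisfy the third-party-measurable requirement.

```javascript
// Sketch: timing an Ajax-based "soft navigation" in a single page app with
// the User Timing API. The mark names and the inbox scenario are made up.
performance.mark('inbox-render-start');
// ... here the app would fetch data and rewrite the visible content ...
performance.mark('inbox-render-end');
performance.measure('inbox-render', 'inbox-render-start', 'inbox-render-end');

// The resulting measure entry carries the duration of this "page load".
const [entry] = performance.getEntriesByName('inbox-render');
console.log(entry.entryType); // "measure"
```

A monitoring script could then collect these measure entries and report them, giving each Ajax navigation its own timing record instead of the single window.onload the session gets today.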
I completely understand if you’re frustrated by my lack of implementation specifics. Measuring rendering is complex. The point at which the page is (mostly) rendered is so obvious when flipping through the screenshots in WebPagetest. Writing code that measures that in a consistent, non-impacting way is really hard. My officemate pointed me to this thread from the W3C Web Performance Working Group talking about measuring first paint that highlights some of the challenges.
To make matters worse, the new metric that I’m discussing is likely much more complex than measuring first paint. I believe we need to measure when the above-the-fold content is (mostly) rendered. What exactly is “above-the-fold”? What is “mostly”?
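To illustrate why “above-the-fold” is slippery, here is a deliberately naive sketch: treat the fold as the viewport height and test whether an element starts inside it. Both the helper and the definition of the fold are simplifications of my own; in a browser the inputs would come from getBoundingClientRect().top and window.innerHeight, and real traffic spans many viewport sizes, zoom levels, and scroll positions, which is exactly the problem.

```javascript
// Naive sketch: "above-the-fold" means "starts inside the initial
// viewport". rectTop would come from getBoundingClientRect().top and
// viewportHeight from window.innerHeight; both names here are mine.
function isAboveTheFold(rectTop, viewportHeight) {
  return rectTop < viewportHeight;
}

console.log(isAboveTheFold(300, 768)); // true  (visible without scrolling)
console.log(isAboveTheFold(900, 768)); // false (below the initial viewport)
```

Even this toy version shows the ambiguity: the same element at rectTop 900 is above-the-fold on a 1080-pixel-tall viewport and below it on a 768-pixel one, so any single threshold is only an approximation.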
Another challenge is moving the community away from window.onload. The primary performance metric in popular tools such as WebPagetest, Google Analytics Site Speed, Torbit Insight, SOASTA (LogNormal) mPulse, and my own HTTP Archive is window.onload. I’ve heard that some IT folks even have their bonuses based on the window.onload metrics reported by services like Keynote Systems and Gomez.
It’s going to take time to define, implement, and transition to a better performance metric. But we have to get the ball rolling. Relying on window.onload as the primary performance metric doesn’t necessarily produce a faster user experience. And yet making our websites faster for users is what we’re really after. We need a metric that more accurately tracks our progress toward this ultimate goal.