3 Trends in Web Performance
Website performance is an ongoing effort and an elusive target. In the early days, page load time was the gold standard of web performance monitoring. However, websites have evolved to be a lot more dynamic, JavaScript-heavy and filled with rich images and third-party content. We need better metrics, a better model and better tooling to measure and monitor performance. In this blog post, I explain three different trends that are shaping the industry:
- Response-Animation-Idle-Load (RAIL) model: It is both a model and a prescriptive framework that sets up performance benchmarks
- Perceived Performance / User-centric performance metrics: A set of measures that tries to more accurately capture the user’s level of engagement or frustration when using a web page.
- Tooling: A set of tools like CrUX, PageSpeed Insights, Lighthouse and RUM solutions like mPulse
So let’s dig in!
RAIL Framework
RAIL is a framework publicized by Google. According to the Google developers page, it is
a user-centric performance model that breaks down the user’s experience into key actions
The framework breaks down performance into these 4 key goals:
Source: https://developers.google.com/web/fundamentals/performance/rail
Measure | Explanation | Target |
---|---|---|
Response | Complete a transition initiated by user input within 100 ms. | Process input in < 50 ms |
Animation | Aim for visual smoothness; users notice when frame rates vary. | Produce each frame in < 10 ms |
Idle | Maximize idle time to increase the odds that the page responds to user input. | Respond to input in < 50 ms |
Load | Load pages so that they are interactive quickly, based on network and device type. | Time to Interactive < 5 s on 3G |
Basically, the RAIL model says: load your page fast so that it becomes interactive quickly. Once it is interactive, ensure that any dynamic work is fast enough that the page remains responsive to user input.
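The Response and Idle budgets both come down to keeping the main thread free. One way to see where those budgets are being blown is the Long Tasks API, which reports any main-thread block longer than 50 ms. Here is a minimal sketch; the console logging is illustrative, and a real setup would beacon these entries to your analytics:

```typescript
// Minimal sketch: report main-thread work that exceeds the 50 ms
// RAIL response budget. The Long Tasks API surfaces any main-thread
// block longer than 50 ms as a "longtask" performance entry.
const longTaskObserver = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    // A long task overlapping user input pushes the response past 100 ms.
    console.warn(
      `Long task of ${entry.duration.toFixed(0)} ms ` +
      `at ${entry.startTime.toFixed(0)} ms into the page load`
    );
  }
});
longTaskObserver.observe({ entryTypes: ['longtask'] });
```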
For more information about RAIL, please watch this video about RAIL in the real world.
Perceived Performance
How fast your site appears to load is as important as how fast it actually loads. The problem with perceived performance is that there was no good way to measure it. It’s something of a conundrum, like this:
Source: https://www.cartoonstock.com/directory/p/public_perception.asp
We have had a lot of measurements like Time to First Byte, DOMContentLoaded, onLoad and a host of others. These metrics have been immensely helpful in the evolution of the web performance industry. However, they were designed around the events that browsers could measure, and they fail to answer some basic questions like these:
Experience | Meaning | Metric |
---|---|---|
Is it happening? | Did the navigation start successfully? Has the server responded? | First Paint / First Contentful Paint (see the sketch below) |
Is it useful? | Has enough content rendered that users can engage with it? | First Meaningful Paint / Hero Element Timing |
Is it usable? | Can users interact with the page, or is it still busy loading? | Time to Interactive |
Is it delightful? | Are the interactions smooth and natural, free of lag and jank? | Long Tasks (absence of) |
Source: https://developers.google.com/web/fundamentals/performance/user-centric-performance-metrics
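For the first of those questions, browsers now expose paint timings directly. Here is a minimal sketch of reading First Paint and First Contentful Paint via a PerformanceObserver; again, console logging stands in for whatever beaconing your RUM setup uses:

```typescript
// Minimal sketch: read First Paint and First Contentful Paint from
// the browser's Paint Timing entries to answer "is it happening?".
const paintObserver = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    // entry.name is 'first-paint' or 'first-contentful-paint'.
    console.log(`${entry.name}: ${entry.startTime.toFixed(0)} ms`);
  }
});
paintObserver.observe({ entryTypes: ['paint'] });
```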
Going beyond this, we have a few more measurements like the following:
Experience | Meaning | Metric |
---|---|---|
Is it slow to interact? | How much time did it take from the first user interaction to the page responding to that action? | First Input Delay |
Is there a mismatch in expectation? | Based on content being loaded, users may try to interact, but the page fails to respond. The gap measures this difference. | Time to First Interactive - Time to First Interaction |
How frustrated is the user? | When users can’t interact, they may double-click and then click continuously in the hope of making the page work. | Rage Clicks (see the sketch below) |
All this boils down to measuring a host of metrics that are spread across different tools and platforms.
Source: PERCEPTION MATTERS: MAKING RUM MORE REAL
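As an illustration of that last row, here is a simplified rage-click detector. The 500 ms window and the three-click threshold are arbitrary values I picked for the sketch, not an industry standard; products like mPulse use their own tuned heuristics:

```typescript
// Simplified rage-click detector. WINDOW_MS and THRESHOLD are
// arbitrary illustration values, not an industry standard.
const WINDOW_MS = 500; // clicks closer together than this count as a burst
const THRESHOLD = 3;   // clicks in one burst that we treat as "rage"
let recentClicks: number[] = [];

document.addEventListener('click', (event) => {
  const now = performance.now();
  // Keep only clicks that fall inside the current burst window.
  recentClicks = recentClicks.filter((t) => now - t < WINDOW_MS);
  recentClicks.push(now);
  if (recentClicks.length >= THRESHOLD) {
    // A real RUM beacon would also record the element being clicked.
    console.warn('Rage clicks detected on', event.target);
    recentClicks = [];
  }
});
```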
If you notice, we’ve now gone from hard measurements of DNS timing or onLoad to softer measurements related to the user’s interaction and experience. All of this is harder to measure, but Google has announced that its search indexing will now take these factors into account. Specifically, Google will now:
- use mobile page speed as a ranking signal
- encourage “developers to think broadly about performance”
- and hint at using suggestions from tools like Lighthouse, CrUX and PageSpeed Insights, which we’ll discuss next.
Source: Using page speed in mobile search ranking
Tooling for Perceived Performance
WebPageTest has long been the de facto tool for visualizing and demonstrating perceived performance. Its filmstrip view, video capture and visual progress charts have been the toolset we have relied on for a long time. However, we now have more tools that can help in building a case for measuring and optimizing perceived performance.
- Lighthouse: Project Lighthouse by Google runs a wide range of audits and presents its findings, which span everything from a filmstrip view to a Chrome JavaScript timeline. Using the audits, you can dig into problems like a particular script taking a large amount of time and causing rendering issues.
- Chrome User Experience Report (CrUX): Google Chrome collects performance measurements from real browsers. If users have opted in to data collection, this information is sent back to Google, and the anonymized data is exposed as Real User Monitoring (RUM) data for Chrome under the CrUX project. This is especially helpful because it gives you a spread of the user experience: synthetic tests can only give you data from simulated users with a pre-defined setup, while CrUX provides histograms of actual performance behaviors, including perceived performance information.
- Google PageSpeed Insights: This tool has evolved to provide both suggestions to improve the page and the CrUX data for the URL being audited (see the sketch after this list). Using the PSI report, you get an idea of the current performance along with tips to analyze and fix your website.
- Akamai mPulse: Akamai’s mPulse Real User Monitoring solution provides the perceived performance data along with standard metrics like onLoad. It can also be instrumented to measure rage clicks, to identify a user’s frustration level when using a website.
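As a quick illustration of pulling CrUX field data programmatically, here is a minimal sketch against the PageSpeed Insights v5 API. The fields accessed below are a simplification of the real response; check the PSI API reference for the full schema, and add an API key and error handling for anything beyond experimentation:

```typescript
// Minimal sketch: fetch CrUX field data for a URL through the
// PageSpeed Insights v5 API. The response fields below are a
// simplification; see the PSI API reference for the full schema.
async function getFieldData(url: string): Promise<void> {
  const endpoint =
    'https://www.googleapis.com/pagespeedonline/v5/runPagespeed' +
    `?url=${encodeURIComponent(url)}`;
  const response = await fetch(endpoint);
  const body = await response.json();
  // loadingExperience carries the CrUX histogram buckets for this URL.
  const fcp = body.loadingExperience?.metrics?.FIRST_CONTENTFUL_PAINT_MS;
  console.log('CrUX FCP distribution:', fcp?.distributions);
}

getFieldData('https://www.example.com');
```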
What do you look for?
By correlating the data from synthetic tests and RUM solutions, you should be able to identify potential issues in perceived performance. Some of these issues could be:
- synchronous scripts holding up the browser’s main thread and preventing it from rendering the content
- A/B testing solutions whose anti-flicker fixes lead to delayed rendering of the page (Preventing flickering with A/B Solutions)
- the page appearing to be ready but actually being busy. This shows up in First Input Delay (FID) as well as in the difference between Time to First Interaction and Time to First Interactive (see the sketch after this list).
- the page appearing to render but not showing the content that matters. This can be caught by metrics like First Contentful Paint, First Meaningful Paint and Visually Ready.
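To make the "ready but busy" case concrete, here is a minimal sketch of the idea behind First Input Delay: compare when the user first interacted with when the main thread was actually free to run the handler. Google's production FID polyfill handles far more edge cases than this:

```typescript
// Minimal sketch of the idea behind First Input Delay: the gap
// between when the user first interacted (event.timeStamp) and when
// the main thread was free to run the handler (performance.now()).
const inputTypes = ['click', 'keydown', 'touchstart'];

function onFirstInput(event: Event): void {
  const delay = performance.now() - event.timeStamp;
  console.log(`Approximate first input delay: ${delay.toFixed(0)} ms`);
  // Only the first input matters, so stop listening afterwards.
  inputTypes.forEach((type) =>
    document.removeEventListener(type, onFirstInput, true)
  );
}

inputTypes.forEach((type) =>
  document.addEventListener(type, onFirstInput, true)
);
```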
I have highlighted a very small subset of issues that could be tracked and worked on by leveraging perceived performance metrics. If you have any use cases, I’d love to hear about them as well!