Hashtag goals
As we approach the launch of a project, we always get super excited about pushing our performance scores as high as we can reasonably get them. We love seeing that green Lighthouse ring giving us a nice pat on the back for a job well done.
And achieving that 90%+ score is a great goal!
But typically, after a site is launched and the client is happy, we're not as vigilant about re-checking these scores. And a lot can change: new code is added, clients add their own content, third-party integrations are brought in. Not everything stays as squeaky clean as when we first launched. How should we handle this? Is achieving 90% even realistic anymore? What about old legacy projects that would require a major refactor and all-new content to reach that?
Keep it in the budget!
This is where performance budgets come in.
With performance budgets, we are interested in what the minimum viable budget is, or MVB. It exists purely to fight regressions. We set it at a level that is maintainable with all current factors taken into account, and we ensure that it never drops below this level.
A great way to determine this is to measure your performance over a period of time (e.g. a few weeks) and look for the lowest point. This is your MVB. Your goal can still be higher than this, but until your MVB is met and maintained, hitting your goal on occasion is not as relevant.
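As a rough sketch, suppose you've collected a Lighthouse performance score once a day over a few weeks (the `scores` array below is made-up sample data): the MVB is simply the lowest point in that window.

```javascript
// Hypothetical Lighthouse performance scores (0–100), collected
// once a day over a couple of weeks of normal deploys and content edits.
const scores = [
  92, 89, 91, 88, 90, 87, 91,
  90, 86, 89, 92, 88, 90, 89,
];

// The minimum viable budget (MVB) is the lowest score observed:
// the level the site demonstrably sustains today.
const mvb = Math.min(...scores);

console.log(`MVB: ${mvb}`); // → MVB: 86
```

Your aspirational goal (say, 90+) sits above this; the MVB is just the floor you refuse to fall through.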
You can set up alerts in your CI/CD pipeline or with GitHub Actions so that you're aware when you dip below your minimum threshold, and tackle the items that triggered the regression first. This ongoing awareness helps ensure the health of your projects.
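The check itself can be tiny. Here's a minimal sketch of a script a CI step might run, assuming you already have the latest score from a Lighthouse run; the `MVB` and `currentScore` values are placeholders for your own numbers, and a real setup would feed them in from Lighthouse CI or similar.

```javascript
// Placeholder values: your agreed-upon floor, and the score
// produced by the latest Lighthouse run in CI.
const MVB = 85;
const currentScore = 88;

function checkBudget(score, budget) {
  // Fail only on a genuine regression below the floor, so that
  // alerts stay rare and actionable rather than white noise.
  if (score < budget) {
    console.error(`Performance ${score} fell below MVB ${budget}`);
    return 1; // non-zero exit code fails the CI job
  }
  console.log(`Performance ${score} meets MVB ${budget}`);
  return 0;
}

process.exitCode = checkBudget(currentScore, MVB);
```

Because the budget is the MVB rather than an aspirational target, this job should almost always pass — which is exactly what makes a failure worth acting on.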
Pitfalls of optimism
It's important not to set your budgets too high. If you are constantly getting alerts and/or never have the time to fix them, they become white noise and easy to ignore. We're better off setting them low enough so that when we get alerts, we act.
What to test?
There are a lot of acronyms out there. LCP, FCP, FID, CLS... And really what you measure should be what matters most to you.
A good place to start is with the Largest Contentful Paint, or LCP. This is a stable Core Web Vital that is unlikely to go away any time soon. It measures how long it takes for the largest (and thus, rightly or not, presumably the most important) content to load. This typically comes in the form of images or block-level text (h1, h2, etc.).
While imperfect, this is a consistent way to measure how long it takes your site to load.
One major drawback — it's currently only supported in Chromium-based browsers.
Like any good measurement, the key is to use the same tool consistently, to get a read on how your performance tracks over time.
But what pages do I test?
Again, we don't want to create white noise, or tasks so huge that we're overwhelmed. We want to test key pages. These vary from project to project, but are often Home, Search, Product, Cart, and the like.
Team Budget
Overall, performance budgets help give us confidence that our sites are performing as we expect. They ensure that we keep attending to performance in a way that reflects real-world conditions, and they're a healthy part of ongoing maintenance.
Happy budgeting!