Kasper Andreassen for Enterspeed

Posted on • Originally published at enterspeed.com

Making a fast website is SUPER EASY 😏

That's right, I said it. Making a fast website is super easy, barely an inconvenience.

I've built a website that gets a perfect score in Google Lighthouse and that can be deployed right to the edge.

It's built entirely in a highly performant, battle-tested language that will last for ages.

Want to see for yourself how fast it really is? You can check it out right here: https://enterspeed-hw.netlify.app/

Did you see how fast it was?

Okay, I'll admit, this was a cheap shot. Some of you might be left with a bitter taste in your mouth, feeling cheated by this almost clickbaity article. But hang on, there's a point to the madness.

My website scores 100/100 in Google Lighthouse

The point is that anyone can make a fast website. Making a website that looks good and has the functionality you want, while still being highly performant, is where the challenge lies.

We have probably all stumbled upon a post on LinkedIn where someone is bragging about their website scoring 100/100 in Google Lighthouse.

A score like that is quite an achievement and should indeed be celebrated. However, I think many of us have also seen highly performant websites which, how should I put it… look like shit. Excuse my language.

You know the type. Late-2000s look, barely any images, maybe a few sorry-looking icons, and just text as far as the eye can see.

Don’t get me wrong – for some websites, e.g., blogs, this makes perfect sense. However, for a website whose goal is to convert customers by selling goods or services, the website should also be attractive to look at.

Your website is baloney!

Whether a website looks good or not is, of course, unlike a performance score, highly subjective. As the saying goes: "Beauty is in the eye of the beholder".

The visual aspect of a site can be costly. Each image, video, custom font, animation library, etc. is an extra request for the user and additional kilobytes which must be transferred to the client.

Another side of the coin is the website's functionality. My beautiful "Hello world" example above, written in pure HTML, probably doesn't scale that well to a full-blown company site.

Then we introduce some kind of framework that increases the bundle size and the response time. How much overhead this framework adds can also vary depending on which you choose, as we found out in our previous article: We measured the SSR performance of 6 JS frameworks - here's what we found.

Okay, now we've got the foundation in place – now for the content. Marketing comes up with new ideas almost as often as a new JS framework is introduced. They don't want to bother you all the time, and just as importantly, you don't want to be bothered every time there's a tiny change to be made.

So now we go and implement a CMS. Depending on how we implement it, this can introduce additional performance overhead. If we render using SSG, it won't cause any meaningful performance change. However, if we use SSR (or if for some reason you choose CSR – please don't), then we start seeing some extra requests enter the picture.

Okay, now we have given marketing what they want – a way to edit and publish content. They're probably satisfied now, right? Oh, my sweet summer child, you have clearly never worked with a marketing department.

Remember the scene in Lord of the Rings where thousands of orcs attack Minas Tirith? Picture it: Minas Tirith is your website, and the thousands of orcs are the shit-ton of tracking scripts that marketing wants to implement. In this case, you're Gandalf, trying your best to defend Minas Tirith.

Stop, stop! He's already dead

As fun as it is to blame marketing for everything, the truth of the matter is that they, of course, play a vital role.

If a website doesn’t get any visitors, it doesn’t matter how fast it is. Moreover, if these visitors don’t convert (buy, sign up, book a call, download material, etc.) it also doesn’t matter.

So, should we then just give the marketing department free rein?

Eh…

Just like with a kid and a bowl of sugar, we should still keep a watchful eye and not allow them to turn their bowl of oatmeal with sugar into a bowl of sugar with oatmeal.

As many studies have shown, there’s a direct correlation between how fast a website is and how well it converts.

Also, in terms of search engine optimization, Core Web Vitals became an official Google ranking factor in 2021.

Therefore, we must balance the visual elements, the website's functionality, and the website's performance so they can all co-exist.

Perfectly balanced... As all things should be

I know what you're thinking: "Enough fluff talk, this isn't marketing, what can I actually do?" (Sorry, having switched from marketing to development myself, I simply can't help it).

And don’t worry, in the second part of this article we will look at some actionable tips.

Making your website fast, functional, and visually pleasing 👌

Let's try to hit that magical trifecta that makes everyone happy.

Images

Let’s start with one of the big performance sinners – images. Images take up a huge part of the website's total size.

As of August 2022, images made up on average 45% on desktop and 44% on mobile of a page’s total weight. The number of image requests made up 32% on desktop and 30% on mobile of a page's total requests (Source: HTTP Archive).

Optimizing the images can therefore result in big and easy wins. I’ve previously made an article about this exact subject: How to optimize your images for performance.

I will quickly summarize the points from the article and move on to a concrete example.

1. Choose the right image format (JPEG, PNG, GIF, SVG, WebP)
Each image format has its purpose. For instance, PNGs can be great for icons and smaller images, but the file size can quickly become pretty huge for larger images.

2. Be careful with using animated GIFs
GIFs can be fantastic for small "videos", for instance in tutorials, but they come at a cost. File sizes can be enormous, so consider using a video instead.

3. Always try to compress images
It always amazes me how many kilobytes can be saved by running images through a compression tool. I once managed to compress an SVG file by around 97% – absolutely bonkers.

One of my all-time favorite tools is TinyPNG (a.k.a. TinyJPG). The UI is beautiful and dead simple, and more importantly – they get some insane compression results. They support compression of PNGs, JPEGs, and WebP files.

Remember to always check the quality afterward. One thing many compression tools don't handle well is gradients – the result can be quite janky.

Y'all got anymore of them pixels?

Online tools I can recommend: TinyPNG, Squoosh, and SVGOMG (for SVGs).

💡 Tip: Squoosh lets you adjust the quality (compression rate) and see the results live.

4. Scale down your images
There's absolutely no reason to serve a 3000px-wide image. Moreover, there's no reason to serve a 500px-wide image if you only render it at 300px.

5. Serve responsive images
Instead of using a single image for both desktop and mobile, consider serving 3 – 5 different sizes of the image depending on the viewport.

Check out Mozilla's article on responsive images here: Responsive images on MDN.
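A minimal sketch of what that could look like with srcset and sizes (the file names and breakpoints here are made up for illustration):

```html
<!-- The browser picks the smallest candidate that still fills the layout slot -->
<img
  src="hero-800.jpg"
  srcset="hero-400.jpg 400w,
          hero-800.jpg 800w,
          hero-1200.jpg 1200w,
          hero-1920.jpg 1920w"
  sizes="100vw"
  alt="Our hero image"
/>
```

The plain src acts as a fallback for older browsers that don't understand srcset.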

6. Lazy load images
Don't load images until they're necessary. Lazy loading images helps reduce the number of initial requests and the initial page size, resulting in a much faster site.

Nowadays we’re blessed with a built-in lazy loading attribute supported by all major browsers: Browser-level image lazy-loading for the web.

⚠️ Warning: Never lazy-load your LCP element!
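In practice, lazy loading is a single attribute (the image paths are placeholders):

```html
<!-- Below-the-fold image: only fetched when it approaches the viewport -->
<img src="team-photo.jpg" loading="lazy" width="800" height="533" alt="Our team" />

<!-- Above-the-fold / LCP image: load eagerly (which is also the default) -->
<img src="hero.jpg" loading="eager" alt="Our hero" />
```

Setting an explicit width and height also lets the browser reserve space and avoid layout shifts.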

Now, let’s look at an example of how we could optimize one of the big image performance sinners – the hero image.

Optimizing the hero image

Almost all modern websites use some kind of image in their hero section (the area right under the logo and navigation – used most prominently on the homepage).

The hero section may be the most important part of your website since it is responsible for quickly telling the visitor:

  1. What you offer (your product or service)
  2. What your USPs (Unique Selling Propositions) are
  3. Communicating trust (why the visitor should trust your site)
  4. The main call to action (sign up, book a call, buy now, etc.)

There are a thousand ways to design a hero section, but for now, we will split it into two categories:

A) Full-size hero image (background)
B) Partial-size hero image (often placed in the right section of the screen and taking up somewhere between 1/3 and 1/2 of the width).

The reason why the hero image is interesting from a performance perspective is that it often will be the LCP element (Largest Contentful Paint), which in the Lighthouse performance scoring is the second largest weighted metric with a 25% weight, only outdone by TBT (Total Blocking Time) with a 30% weight.

Moreover, the hero image tends to also be one of the largest when it comes to file size (especially for full-size hero images).

Depending on their size, partial-size hero images may not be the LCP – that can very well be the heading text – especially on mobile, where designers tend to shrink the image to make room for the text and buttons.

So, let’s optimize our hero image. For this example, our hero image will be a full-size hero image.

The first thing we should do is make sure we don't lazy-load our hero image (some might be tempted to set loading="lazy" on all images), since this will negatively impact our score.

The next thing we should do is preload it. Preloading your hero image makes it discoverable by the browser's preload scanner, so it's fetched early in the page lifecycle.

Preloading LCP

Source: https://web.dev/preload-scanner/

Be aware that if you use responsive images (different images depending on viewport size), you need to specify the imagesrcset attribute on the preload link element.
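A sketch of what that could look like (file names are made up):

```html
<head>
  <!-- Preload the hero image so it's fetched early in the page lifecycle.
       For a single fixed image, href alone is enough;
       for responsive images, add imagesrcset/imagesizes. -->
  <link rel="preload" as="image" href="hero-1920.jpg"
        imagesrcset="hero-400.jpg 400w, hero-800.jpg 800w, hero-1920.jpg 1920w"
        imagesizes="100vw" />
</head>
```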

This brings us to the next tip: use responsive image sizes.

We're using a full-size background image with a width of 1920px. Loading a 1920px-wide image on, for instance, an iPhone SE, which has a 375px-wide viewport, would be absolutely insane.

By shrinking from 1920px wide to 375px wide, you can reduce the image to just 1/10 of its original size. Talk about saving data.

Responsive images

I know what you’re thinking. Making all these variations sounds like a lot of work. I feel you – I’m lazy too, but luckily there are awesome tools available.

The Responsive Image Breakpoints Generator automatically creates the number of images you wish based on your resolution range. Moreover, it even generates the HTML img tag for you.

The tool is created by Cloudinary (another fantastic service), which also has an API available to upload your images to the cloud and automatically generate breakpoints programmatically.

Cloudinary API

Long live the lazy 🙌

Now let’s look at optimizing the main image size. In this example, I will be using Photoshop, but there are plenty of free tools available for doing the same.

We want to build this fantastic hero for our website:

Hero example

We have found this beautiful image we want to use (Shout out to the photographer Matheus Cenali).

Our largest image size will be 1920x1080. Unfortunately, the image is a bit too big with a resolution of 1920x1440. Time for some cropping.

I select the crop tool in Photoshop and input the resolution I want.

Cropping in Photoshop

Sweet, no redundant pixels here.

Next, I click File > Export > Export As. Here I can choose the image format – and, for JPG, select the image quality while inspecting the result and file size.

Photoshop

If you don't want to use the predefined Quality scale (Good, Very Good, Excellent, etc.), you can use the legacy export tool, which lets you select a quality from 0–100, similar to Squoosh. You'll find it under File > Export > Save for Web (Legacy).

I usually end up selecting the "Good" quality, since it often hits the sweet spot between quality and file size.

Now, we could be satisfied with the 188.9 KB file size, which isn't that bad for a large background image, but let me show you another cool trick.

As you can see from our hero example above, we want a black overlay on the image to make the text easier to read and the CTAs stand out.

Aha! Then we just add some CSS to create a black overlay with some opacity, right? Yes, we could do that – but check this out.
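For reference, a pure-CSS overlay could look something like this (class names and file name are made up):

```html
<style>
  .hero {
    position: relative;
    background: url("hero-1920.jpg") center / cover no-repeat;
  }
  /* Black overlay at 60% opacity on top of the image */
  .hero::before {
    content: "";
    position: absolute;
    inset: 0;
    background: rgba(0, 0, 0, 0.6);
  }
  /* Keep the text and CTAs above the overlay */
  .hero > * {
    position: relative;
  }
</style>
```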

Now I take the same apple image as above, open it in Photoshop, right-click the layer, and select Blending Options. I click Color Overlay, select a black color, and choose an opacity (in this example, 60%).

Setting black overlay in Photoshop

Now let's export it again in the same "Good" quality as before, which previously resulted in a 188.9 KB file size.

Photoshop

Our file size has shrunk to 99.5 KB! That's almost a 50% reduction. Fewer image details result in a smaller file size.

The same would have been the case if we had chosen to blur our image:

Photoshop

Now we have a 78.7 KB file size. I wonder what happens if we add a black overlay to our blurred image?

Photoshop

We end up with an impressive file size of 49.7 KB. This gives you an idea of how many kilobytes can be saved by playing around with effects in Photoshop.

Okay, enough about images – we could go on forever optimizing the various assets. Let’s take a look at fonts.

Fonts

If there's one thing designers absolutely love, it's fonts. So it was no wonder the entire web design community almost went bananas when Google recently launched Color Fonts on Google Fonts, which, to be honest, is a pretty cool feature.

Color Fonts on Google Fonts

So, let's talk about fonts and how to optimize them. We're not going to cover every aspect in this article since, believe it or not, it's actually quite a big subject.

First, why are fonts/typography important for web design? Let’s take a look.

1. Readability
Remember the old days of the web, when it seemed like a universal law that your site should have an 11px body font? (What is this? A website for ants?)

Luckily, the days of squinting at the screen to browse the web are over. Most, but definitely not all, websites now use a decent font size.

The choice of font is of course also important. The Bureau of Internet Accessibility has recommended the following fonts for accessibility: Times New Roman, Verdana, Arial, Tahoma, Helvetica, and Calibri.

2. Legibility
Related to readability, legibility is the measure of how distinguishable individual characters and words are to the eye of the reader; readability is the measure of how easy it is to read the text overall (Source: Legibility & readability).

3. Color and contrast
Not everyone perceives colors the same way. Some color combinations can be very difficult or impossible for some people to read.

Usually, this is because of poor color contrast (the foreground and the background color being similar).

The WebAIM guidelines recommend an AA (minimum) contrast ratio of 4.5:1 for all text. However, large text (120–150% larger than the default body) is an exception; here, the ratio can go down to 3:1 (Source: Color and contrast accessibility).

4. Branding
That’s right, you thought we were done with marketing, but just like the Undertaker, they sneak their way in when you least expect it.

Marketing sneaking in the developer

Fonts are a crucial part of a brand. They can evoke emotions and communicate the core of the brand to consumers.

Don’t believe me? Check out these fantastic logo swaps and prepare to feel uncomfortable.

Choosing the right font combination for your website can be what makes or breaks a good design. Check out Canva's ultimate guide to font pairing.

Before moving on to how to optimize fonts for performance, we also need to talk about the two basic categories of fonts on the web: web-safe fonts and web fonts.

Web-safe fonts
These are the fonts that are pre-installed on your device – also known as system fonts. Examples of these are Arial, Times New Roman, and Courier.

Web fonts
These are fonts not pre-installed on your device, which must be downloaded by the browser before they can be displayed. They can either be self-hosted (on your own web server) or served via a third-party host like Google Fonts, Adobe Typekit, etc.

The attentive reader probably already knows which type we're going to optimize – that's right, web fonts. As nice as it would be to only use web-safe fonts in your design, it's hard to make a website look good with just these.

So, let’s see what we can do.

The first thing you should do is make sure you're not downloading unnecessary font variations.

Downloading unnecessary resources - that's a paddlin'

If you’re using a theme, boilerplate, etc., it may come with a lot of unnecessary font variations.

Take for instance the popular Roboto font – it comes in 12 different styles (6 different font weights with an italic and non-italic version of each). Each variation is extra kilobytes the user must download.

Luckily, it’s easy to find out which styles you are using and which you are not. Afterward, you can change your link / @import to only download these variations.
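For example, if your site only uses Roboto's regular (400) and bold (700) weights, the Google Fonts link can request just those two. A sketch (use the exact URL the Google Fonts configurator generates for you):

```html
<!-- Before: all 12 styles -->
<link rel="stylesheet"
      href="https://fonts.googleapis.com/css2?family=Roboto:ital,wght@0,100;0,300;0,400;0,500;0,700;0,900;1,100;1,300;1,400;1,500;1,700;1,900&display=swap">

<!-- After: only the two weights actually used -->
<link rel="stylesheet"
      href="https://fonts.googleapis.com/css2?family=Roboto:wght@400;700&display=swap">
```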

Next, let's see how these fonts may impact our performance score. Web fonts can negatively impact Core Web Vitals like Cumulative Layout Shift (CLS) and Largest Contentful Paint (LCP), as well as non-Core Web Vitals like First Contentful Paint (FCP), which is also used to calculate the Lighthouse performance score with a 10% weight.

So, how do we optimize all of these? We can’t.

Wait, what? We can’t?

No. Optimizing fonts for CLS and for LCP/FCP conflicts with each other. It all has to do with how we use the font-display property.

When loading custom fonts, each browser has its own default behavior. Edge swaps in a system font until the custom font is ready, Chrome and Firefox will wait up to 3 seconds before swapping in a system font, and Safari will simply hide the text until the font is ready.

We can modify this default behavior by using font-display.

So first let’s try to optimize for LCP/FCP.

The First Contentful Paint will often be your text, which, as we learned previously, can also be your Largest Contentful Paint. The quicker these elements load, the better.

Therefore, the default loading behavior of Chrome, Firefox, and Safari can potentially harm these metrics. The same goes if you manually set font-display to "block", which gives the font face a short block period and an infinite swap period.

We can avoid these "invisible fonts" by setting font-display to "swap", making the block period extremely small. The browser will render a system font immediately and swap in the custom font when it's ready.
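For self-hosted fonts, that's a single line in the @font-face rule (the font name and path here are hypothetical):

```html
<style>
  @font-face {
    font-family: "MyWebFont";
    src: url("/fonts/my-web-font.woff2") format("woff2");
    /* Show a fallback font immediately, then swap in the web font when it arrives */
    font-display: swap;
  }
</style>
```

If you load fonts from Google Fonts, the same behavior comes from the &display=swap parameter in the stylesheet URL.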

Pretty nifty, but do you know what can happen when you “swap content” during load? Your layout can shift, resulting in a poor Cumulative Layout Shift (CLS) score.

So, unfortunately, there’s no solution that fixes all things. If your text is your LCP, it makes sense to use the swap method, since LCP and FCP have a total weight of 35% (25% + 10%) in the Lighthouse performance score.

But there's also a way to reduce the amount of layout shift: make sure your fallback font matches your custom font as closely as possible. You can use the Font style matcher tool for this.

Now, let's optimize how fast we get the actual font. We can use preconnect to establish early connections to important third-party origins – here, our font host.

Preconnect Google Fonts
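For Google Fonts, that's the standard two-line snippet:

```html
<!-- Open connections to the two font origins early -->
<link rel="preconnect" href="https://fonts.googleapis.com">
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
```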

The next thing we can do is tell the browser to load our font early in the page's lifecycle. We do this using preload in a link element.

That way, our font is less likely to block the page's render, since it will load before the browser's main rendering engine starts.

Preload Google Fonts
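A sketch for a self-hosted WOFF2 file (the path is hypothetical; note that font preloads always need the crossorigin attribute, even on your own origin):

```html
<link rel="preload" as="font" type="font/woff2"
      href="/fonts/my-web-font.woff2" crossorigin>
```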

Lastly, if you choose to self-host your fonts, make sure you use a modern web font format – here WOFF 2.0. WOFF 2.0 offers up to 30% better compression than its predecessor WOFF.

Third-party scripts (Tracking scripts and other fun stuff)

Now for something that’s even more frustrating than the Game of Thrones ending (I’m sorry Bran, but anyone has a better story than you) – third-party scripts.

From Facebook to Google Analytics, to LinkedIn Insight Tag. The amount of third-party scripts almost seems endless.

When a certain department, who shall remain nameless, then comes and asks you to also add the Facebook Page Plugin so "our customers can easily follow our company page", you can almost feel your soul leave your body.

But don’t worry, there’s a way to overcome these obstacles. Remember what we said about balance?

Let’s start with the easy one – a third-party script rendering content, like the Facebook Page Plugin.

Scripts that are responsible for rendering some kind of content should always be lazy-loaded, meaning the request is only made when the user scrolls to the content.

The Facebook Page Plugin actually has this built in. Simply add data-lazy="true" and you're good to go.
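Roughly like this (the page URL is a placeholder, and you still need the SDK loader snippet from Facebook's plugin configurator):

```html
<!-- Facebook Page Plugin with built-in lazy loading -->
<div class="fb-page"
     data-href="https://www.facebook.com/yourcompanypage"
     data-tabs="timeline"
     data-lazy="true"></div>
```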

Now for the more annoying part – all the tracking scripts.

If your website only targets European visitors, you're in luck. Due to the GDPR and the ePrivacy Directive, you can't legally set a tracking cookie until the user has accepted it.

This means your initial load time – and with it, your Core Web Vitals – won't be affected by these third-party scripts.

Problem solved. Shut down your website for the rest of the world and move to Europe – don’t worry, we got beer 🍻

No? Okay, let me show you a trick then.

Hold back your requests

Braveheart

First things first: you should of course only load third-party scripts on the pages where they're needed. This means that if you have a cool animation that requires animate.css (not a script but a stylesheet, but bear with me) and you only use it on your company page, then it shouldn't be loaded on every other page.

You can use a tool like Google Tag Manager to manage and orchestrate all your scripts. It's also exactly what we're going to use for this nifty trick.

Third-party scripts like Facebook Pixel, LinkedIn Insight Tag, or even your chat plugin aren’t strictly necessary to request right away. They can wait until the user has interacted with the site – or some time has passed.

I don’t necessarily recommend doing this with Google Analytics, since you might risk losing valuable data.

However, all the nice-to-have stuff can wait its turn.

So how do we do this?

Inside Google Tag Manager, you set up Triggers that tell a tag/script when to fire. These triggers can be conditional, meaning you can also chain them into an OR statement.

So, for our non-critical scripts, we could set up a condition that fires if any of the following is true (a vanilla JavaScript sketch of the same idea follows the list):

  1. The user has clicked on any element (useful for navigation)
  2. The user has scrolled 5% (indicating they have started interacting with the site)
  3. 5 seconds have passed (our initial load is done).
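If you'd rather skip Google Tag Manager, a minimal sketch of the same OR logic in plain JavaScript could look like this (the script URL is a placeholder):

```html
<script>
  // Load non-critical third-party scripts on first interaction or after 5 seconds
  function loadDeferredScripts() {
    if (loadDeferredScripts.done) return; // make sure we only run once
    loadDeferredScripts.done = true;

    var script = document.createElement("script");
    script.src = "https://example.com/nice-to-have-widget.js"; // placeholder
    script.async = true;
    document.head.appendChild(script);
  }

  // OR conditions: any click, any scroll, or a 5-second timeout
  window.addEventListener("click", loadDeferredScripts, { once: true });
  window.addEventListener("scroll", loadDeferredScripts, { once: true, passive: true });
  setTimeout(loadDeferredScripts, 5000);
</script>
```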

Be sure to test these things out and see if everything works. As someone who lives under Margrethe Vestager’s watchful eyes, I don’t use this technique myself.

Loading content from your CMS

That’s right, we’ve saved the best for last – serving the actual content.

There are multiple ways to go about it, and it's not a subject we can cover completely in this article.

One way to make sure your content loads really fast is to use Static Site Generation (SSG) which renders your content at build time and serves it as (you guessed it) – static content.

This is really performant, but not always that flexible. Another alternative is to use something like Next.js’ Incremental Static Regeneration (ISR), which is a hybrid between Server Side Rendering (SSR) and Static Site Generation (SSG).

Again, cool stuff – but not everyone can utilize these rendering strategies. Many organizations have dynamic content that needs to be rendered via SSR.

These organizations are also typically not the type you can easily convince to move away from their old CMS and on to the latest and greatest headless CMS.

Fetching data from these CMSes can be slow and risks creating a bottleneck. So, what do we do?

We decouple it, baby.

If you read my previous article, Using Google Sheets as your CMS, you might be familiar with the concept.

Using a solution like our own product, Enterspeed, you can sync your data to a highly performant data store and still keep the editor experience. Moreover, you're also able to transform and combine multiple data sources into a single call – reducing the number of requests.

Decoupling your CMS also brings other benefits. You’re able to scale down your server since it’s no longer going to take the high traffic load.

If you need to do maintenance, you can even shut it off without affecting your site.

Syncing your data to Enterspeed also makes it possible to migrate to another CMS – or a newer version of your current one. Check out how we migrated from an Umbraco v7 setup to an Umbraco v10 setup in no time.

Not to toot our own horn, but that is pretty sweet 🙌

Closing thoughts

This article could go on forever (and it kind of feels like it already has), but we must stop at some point. We haven't covered every topic of performance optimization, since there's enough content to fill a whole book.

Conclusion

It's not hard to build a super-fast website. It can be hard to do so while making it look visually stunning and giving it the functionality every department wants.

Luckily, we can solve many of these obstacles by doing some optimization tweaks and learning to achieve a balance between speed, functionality, and visual elements.

Remember, Rome wasn't built in a day. These things take time, and compromises sometimes must be made. Talk to every stakeholder of the website and make sure they understand why there needs to be synergy between all the elements.

Top comments (42)

Corentin Bettiol

That's a great article, with very interesting content and examples!

Here's the score for my own personal website:

web.dev/measure score

I spent a long time optimizing everything I could, because the website is hosted on a small, old computer (yeah, my server is an old Dell OptiPlex FX160 :P) next to my ISP box (I have 200 kB/s up, which means I literally have no connection left when a simple text post goes viral on Hacker News).

Here's what I optimized (I will forget some things) :

  • fonts: I'm using a custom font only for the title & nav links, so I used Font Squirrel's tool to remove all unused characters (thus my font weighs ~3.5 kB).
  • images: I optimized all the static files on the website (the logo is 368 bytes!). I use WebP for cover photos, so a page with a lot of banners is still less than ~150 kB.
  • cache: I try to cache a lot of things for a long time. I don't use a CDN; I serve all my content from my small server (which may give worse performance, but at least it's consistent across all resources :P).
  • no JS: we don't need no JavaScript on most websites. This personal website is no exception.

I recommend 1mb.club for a great list of small & lightweight websites (and 512kb.club for even lighter websites). Some are just like the "hello world" example, but other websites include a lot of things in only a few kilobytes :P

Kasper Andreassen

Great job, @corentinbettiol!

With limited resources like that, every byte counts. That's a great exercise, and it's also important when thinking about the next billion users.

The only thing you need to optimize to get a perfect score all round in Lighthouse is the accessibility of your sticky header on mobile. Increase the contrast ratio of the page title and it's 100 all the way 🙌

Corentin Bettiol

Thanks for your kind comment :)

The issue with the contrast ratio of the header is there today, but sometimes the website passes all tests and gets a score of 100. I may update the color of the header, but I think the score depends on how the screenshot of my website is compressed and what the header looks like :P

Jigar Shah • Edited

Indeed a research-oriented article! You explained everything in detail. Amazing to read and gain insights from.

For the last few months, I have been working on one project – an enterprise website. And I came across one of the fastest JavaScript frameworks: TezJS.

Well, I have used TezJS – a Jamstack frontend framework – for my website, Radixweb, and here's the score it gives when it comes to website performance.

(Page Speed Report screenshot)

As you can see, my website scores 99 even on mobile. And it has almost 1300 webpages – a very complex website with multiple functionalities. Isn't it amazing?

According to TezJS, it ensures 98+ Core Web Vitals scores for turbo speed and blazing-fast website performance.

I recommend TezJS for performance-first complex websites. Do give it your best shot – GitHub

Kasper Andreassen

Thanks for the kind words, @jigar_online 🙌

Congrats on the amazing results – mobile can really be a b-word when it comes to hitting those upper 90s in Performance 😬

Dennis Persson

Fun thing is that if you optimize a hero image properly, you can actually decrease LCP by adding a hero image. I have explained how in this article.

Kasper Andreassen

Ha, that's pretty cool! Thanks for the resource, @perssondennis!

Strift • Edited

Very cool! Thanks for the article. Gotta love those 💯 scores on Lighthouse! :p

As far as images go, I like using something like TwicPics to generate variants on demand at runtime, instead of having to manage this manually (or at build time.)

It works great for me in combination with Nuxt SSG.

PS. +10 points to Enterspeed for the memes

Zihan Chen

(Two Lighthouse score screenshots.) The first one is mine; the second one is from web.dev's measure tool (it is also provided by Google).

Fadhil ⚡

Thanks for sharing, it really helped me to optimize my website

(Lighthouse score screenshot)

Kasper Andreassen

Glad to hear, @fadhilsaheer! Just tested your site and I actually get a 99 score in Performance 🤯 - great job!

Seems all you're missing is the text contrast in Accessibility to hit that 100 🙌

Fadhil ⚡

thanks for the info, I'll look into it

Corentin Bettiol

Dude, why is your favicon 90 kB?!

Fadhil ⚡

Coz I'm lazy to compress, thanks for pointing it out, I'll compress it ASAP, thanks

Vincent A. Cicirello

Another good tool for optimizing SVGs is svgomg. It works really well.

Kasper Andreassen

Thanks for the resource @cicirello! I've added it to the article 💪

Vincent A. Cicirello

You're welcome. Your post was a good read.

Konstantin BIFERT

Great article! 💪🏻

Optimizing images perfectly is a HUUUUGE amount of work, especially when you factor in the size, the type, the color subsampling, the encoder, etc., etc...
Cloudinary is not free, but it's still a simple solution for this. 👌🏻

Kasper Andreassen

Thank you so much 🙌

Indeed, but it also often results in huge payoffs. I'm also a big fan of Cloudinary – I love that you can control things like size, quality, etc., with a simple URL parameter.

Abhishek Keshri

I just came here for the memes!

Kasper Andreassen

Then you, sir, know what's truly important in life. Therefore I made a freshly squished meme just for you 🙌


Abhishek Keshri

hahaha XD
the meme is painful though :'(

RadekHavelka

Well, if you care about GDPR, stay away from Google. Any tools – Tag Manager, Analytics, I mean all of them. Also, consider what you want to achieve, what resources you have, and whether having a 100 score is actually that important, because it is only one metric from one tool / company. And that company does whatever they want to us, as there is hardly any competition. And that is evil. Don't subdue to evil. Make your sites nice, with resources and of course users in mind, but don't be scared if you only have 99... the world will not collapse... And if you use EvilCorp, be sure that THEIR scripts are the ones that break almost all the rules, recommendations, and best practices. They are hardly in a position to dictate how we shall do our sites :)

Kasper Andreassen

Haha, getting some Mr. Robot vibes here 😄

But yes, luckily there are many great alternatives to Google's products nowadays.

Google has always been "do as we say, not as we do" when it comes to performance 😉

RadekHavelka • Edited

Hehe, no, I am not that paranoid, but the truth is there are many lawsuits against Google tools across Europe nowadays, and there is no "we do not store any cookies" Google product :) People need to be aware of that – maybe even Analytics will soon be "illegal" from a GDPR point of view. Not to mention the Chrome browser itself...

And nicely said about the performance, exactly my point :) I got hit by a "CLS too big" issue recently – guess what caused it... AdSense :) Seeing "do not use document.write" in Lighthouse makes me smile, as the only scripts not following this rule are... AdSense :) Optimize your JavaScript, compress it, compress images... basically ANY Google JavaScript is not 100% compliant with this. We used Google Translate on our pages. Oh boy, what a nightmare :)

But the biggest issue I have is the constant changing of the rules. It's easy if you have a private blog with a few texts; not so easy if you run a website with 24 years of history, thousands of articles, images, and users that are used to using it in a certain way. And now someone somewhere starts to tell us how we shall (re)do it, just because... Use AMP. Don't use AMP. Do this. Don't do this. Every month, every year a new rule. I understand these can apply to some ecommerce sites, but the internet isn't only about ads and ecommerce. Google stole the internet from us, and that pisses me off really badly, TBH.

Kasper Andreassen

Fun update, @palalet. Google Analytics just became illegal to use "out of the box" in Denmark: computing.co.uk/news/4056735/denma...

Corentin Bettiol

(I agree, Google is bad)

So here are some other tools that can generate numbers from the random bytes you throw on the internet (a.k.a. a website):