...in the previous part we made our codebase smaller (or rather, reduced the amount of code needed at any single point in time) by implementing the basic idea behind code splitting. Now we have to somehow deliver our bundle from our Server to our Client.
Not a big deal? 😉
To be honest - loading JavaScript is not a big deal, but doing it efficiently is.
And it all comes down to just three things:
- optimizing network utilization, i.e. "transferring" scripts in an efficient way.
- optimizing script loading and execution, without blocking the main thread too much.
- shipping the "right" scripts to the client, and not shipping the "wrong" ones.
Network Utilization
Rule number 1 - never use a plain `<script src="..."></script>`. It will block everything until it's loaded and executed. It's the slowest possible way, yet the most popular one.
Rule number 2 - always use the `defer` attribute to load all scripts in parallel, and execute them in the "safe" order at a "safe" time - after your page is ready to safely execute any scripts, i.e. after "DOMReady".
`defer` mode has another nice property - it downloads scripts in any order, but executes them in the "right" one. However, only your bundler knows the right order, and only it controls it - so this is not much of a benefit if you use Webpack, for example.
Another drawback is the network priority of such scripts - they are deferred, so nobody would complain if they arrived a second later - and as a result they are loaded with the lowest network priority.
Rule number 3 - try to use `async` mode, but be careful - your scripts might be executed while the page is not yet ready. It's still fine to do some work, like loading extra dependencies, and continue after "DOMReady" - you just have to keep that in mind. Remember jQuery's `$.ready`?
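To make the rules above concrete, here is what the three loading modes look like in markup (file names are placeholders):

```html
<!-- blocking: parsing stops until the script is fetched and executed -->
<script src="blocking.js"></script>

<!-- defer: fetched in parallel, executed in document order after parsing -->
<script src="a.js" defer></script>
<script src="b.js" defer></script>

<!-- async: fetched in parallel, executed as soon as it arrives, in any order -->
<script src="analytics.js" async></script>
```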
However, `async` scripts are also treated as "not required right here, right now" - `async` is a clear signal "I don't need it immediately" - so the network priority is still low.
For example, in Chrome, CSS loaded in the typical fashion via a `<link>` element in the `<head>` will be assigned the highest priority, as it blocks rendering - HTML cannot be displayed without the matching CSS. Images in the viewport may be assigned a priority of high, whereas images outside the viewport may be assigned a priority of low. A `<script>` loaded at the end of the document may receive a priority assignment of medium or low, but this can be influenced by `defer` and `async`.
Here is a table Addy Osmani created to help you understand how priority depends on your actions:
The easiest way to handle this would be to use Priority Hints, but unfortunately, they are not supported everywhere - even in Chrome they are still behind a flag.
So there is only one way to "fix" script priority, and here it is:
Rule number 4 - try to use `<link rel="preload">` to make your scripts more important than images, thus making them load faster and solving the "uncanny valley" of SSR - when everything is "visually" ready, except the page's brains 🧠, which are still being transferred over the air.
Thus, the best advice I can give you - load all your scripts using `async`, and pair the `<script>` tags with `<link rel="preload">` tags to make them load faster.
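As a sketch (the file name is a placeholder), the pairing looks like this:

```html
<!-- the preload raises the network priority of the script... -->
<link rel="preload" as="script" href="bundle.js">
<!-- ...while async keeps it from blocking the parser -->
<script src="bundle.js" async></script>
```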
Easy - until you try to ship even less code to the browser, I mean "ship the right scripts to the client", which is also known as the double bundling, `module/nomodule`, or modern/legacy approach.
What could be wrong this time?
Shipping less code
“Double bundle” shipping is something everybody is talking about, but only a few use in production. The reason is usually the same - it's freaking hard.
JFYI: `module`/modern is a bundle created for "new" browsers, with all the cool features you can use out of the box, while `nomodule`/legacy is a bundle for old browsers, which are still around - like IE11.
Let's start with some reasons to use it. With `modern` bundles:
- you don't have to ship polyfills. That's not a deal-breaker for big fat apps, which would not gain a measurable benefit from reducing their size by 100kb, but a bigger deal for more fit ones (read: almost all normal apps).
- you don't have to transpile everything down to es5. `classes` would still be `classes` (and not the overweight templates they become in es5), `async/await` and `generators` would keep their form, not become those huge es5-compatible `switch/case` based state machines.
- `css` would need less prefixing, but CSS is usually not a big problem.
Let me cite @pastelsky's article Smart Bundling: How To Serve Legacy Code Only To Legacy Browsers.
So modern bundles can be slimmer and sometimes even faster. Not much slimmer, and not much faster - sometimes the difference is barely measurable...
Unfortunately, not everything is under your control. The problem here is npm modules, which might be quite a big part of your application, and which already ship in ES5 - you can't deliver them in a "modern" way, as no "modern" sources are provided.
For Instagram it ended up as a 5.7% size reduction. Read about their experience with shipping modern bundles.
Jason Miller has a great article about it - Enabling Modern JavaScript on npm - and maybe one day, one bright day, our expectation of the lowest common denominator (es5) will change.
...but as long as you can get modern bundles almost for free - then why not?
I can get it almost for free?
Sounds a bit strange, having in mind I've just said "it's freaking hard". It is, but not because the technique itself is hard - it's because nobody knows what to do. Or rather because everybody "knows" - every single article provides different bits of advice. And yes - every article offers some advice, and this one will not be an exception.
And there are 3 different ways to load `module/nomodule` bundles, explained here in detail.
1 - use `module/nomodule`. Obviously. `<script nomodule>` would not be executed if your browser can load `module`, and `<script type="module">` would not be loaded if your browser can't. 99% of articles and GitHub issues will tell you it's not supported by Safari 10; however, I am writing this text using Safari 13...
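The basic pattern itself (file names are placeholders):

```html
<!-- picked up by modern browsers; legacy ones ignore type=module -->
<script type="module" src="modern.mjs"></script>
<!-- picked up by legacy browsers; module-capable ones skip nomodule -->
<script nomodule src="legacy.js"></script>
```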
And well, there is a project which tracks how it actually works - https://github.com/johnstew/differential-serving - and according to it, Safari is not the issue (Edge is):
| Browser | Version | Result |
|---|---|---|
| Chrome | 73 | ✅ |
| Chrome | 61 | ✅ |
| Chrome | 60 | ✅ |
| Safari | 12 | ✅ |
| Safari | 11.1 | ✅ |
| Safari | 10.1 | ⁉️ (1) |
| Firefox | 66 | ✅ |
| Firefox | 60 | ✅ |
| Firefox | 59 | ⁉️ (2) |
| MSIE | 11 | ⁉️ (2) |
| MSEdge | 18 | ⁉️ (3) |
| MSEdge | 16 | ⁉️ (2) |
| MSEdge | 15 | ⁉️ (2) |
| iPhone XS Safari | Latest | ✅ |
| iPhone X Safari | Latest | ✅ |
| iPhone 8 Safari | Latest | ✅ |
| Pixel 2 Chrome | Latest | ✅ |
| Galaxy S9 Chrome | Latest | ✅ |
1) Safari 10.1 - downloads both bundles
2) MSIE - downloads both bundles
3) Edge downloads the legacy bundle and downloads the ESM bundle twice 💩
PS: Edge 18 is also known as Edge 44.17763
2 - serve different bundles from a "smart" server side, based on the user agent, as seen in Shubham's article.
```js
// `matchesUA` comes from a package like `browserslist-useragent`,
// `send` from `koa-send`
const { matchesUA } = require('browserslist-useragent');
const send = require('koa-send');

router.get('/', async (ctx) => {
  const useragent = ctx.get('User-Agent');
  const isModernUser = matchesUA(useragent);
  // pick the prebuilt page matching the browser's capabilities
  const index = isModernUser ? 'dist/modern/index.html' : 'dist/legacy/index.html';
  await send(ctx, index);
});
```
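`matchesUA` is doing the heavy lifting here via browserslist; just to show the idea, here is a naive, hand-rolled sketch - the version thresholds are my own illustrative assumptions, not a recommendation:

```javascript
// Naive "modern browser" detection from the User-Agent string.
// Thresholds are illustrative assumptions (roughly: first releases
// with native async/await and <script type=module> support).
const MODERN_BROWSERS = [
  [/Edge?\/(\d+)/, 79], // "Edge/18" is legacy EdgeHTML, "Edg/79+" is Chromium Edge
  [/Chrom(?:e|ium)\/(\d+)/, 61],
  [/Firefox\/(\d+)/, 60],
  [/Version\/(\d+)[\d.]* Safari/, 11],
];

function isModernUA(ua) {
  for (const [pattern, minVersion] of MODERN_BROWSERS) {
    const match = pattern.exec(ua);
    if (match) return Number(match[1]) >= minVersion;
  }
  return false; // unknown browser? play it safe - serve the legacy bundle
}
```

Note the ordering: legacy Edge's User-Agent also contains "Chrome/...", so the Edge pattern has to be checked first.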
That would work, but if you load a modern script without the `module` attribute - you lose the streaming parsing and off-main-thread compilation modern browsers can provide for module scripts (depending on the browser).
So a better, and almost identical, way is to handle this during HTML generation, which is, unfortunately, possible only if you have a non-static site - I mean, you have something like SSR.
```js
function renderPage(request, response) {
  let html = `<html><head>...`;
  const agent = request.headers['user-agent'];
  // any user agent based check, e.g. via browserslist-useragent
  const isModern = isModernUA(agent);
  if (isModern) {
    html += `
      <link rel="modulepreload" href="modern.mjs">
      <script type="module" src="modern.mjs"></script>
    `;
  } else {
    html += `
      <link rel="preload" as="script" href="legacy.js">
      <script src="legacy.js"></script>
    `;
  }
  response.end(html);
}
```
What if your site is static? The majority of SPAs have an "active" backend, but absolutely static frontend assets, served from CDNs. And for these sites, we need a slightly different approach.
3 - `module` detection at runtime.
Use runtime feature detection to discover browser capabilities and... add all scripts dynamically. So no `<script src` anymore - only JS calls.
```html
<link rel="modulepreload" href="/modern.js"> <!-- a hint for module-capable browsers -->
<script type="module">self.modern = 1</script>
<script>
  // if the browser executed the type=module block above, load the modern bundle
  $loadjs("/modern.js", "/legacy.js");
  function $loadjs(modern, legacy, script) {
    script = document.createElement("script");
    self.modern ? ((script.src = modern), (script.type = "module")) : (script.src = legacy);
    document.head.appendChild(script);
  }
</script>
```

or

```html
<script>
  // feature-detect the noModule property instead of executing a module script
  var check = document.createElement('script');
  var prefix = (!('noModule' in check)) ? "/ie11" : "/esm";
  var script = document.createElement('script');
  script.src = prefix + "/index.js";
  document.head.appendChild(script);
</script>
```
💡 dynamically injected scripts are async by nature, but don't carry the `async` attribute - which raises their network priority
But there is a problem, actually quite a big problem - how to handle code splitting in this case? Any ideas?
Modern bundles and code splitting
The key idea, and the key problem: there are two sets of scripts, and you have to load only one of them - differential loading means loading only one, the best version for the given browser.
First of all - it shall be the same set of scripts, split in the same way; only the content of the scripts would be different - optimized for each target as much as possible.
Then - these are different scripts, which shall never interfere with each other, as long as they were built for different targets.
Finally - `publicPath` might be a bit different. And it's up to you - whether you use `contentHash` to generate unique names for the scripts, or just put them into separate directories.
So, yet again - what shall you do:
- for legacy bundles - add `preload` and load the `legacy` script
- for non-Chrome - just load using `<script type="module" async>`, which does not require an additional `preload` according to the priority table
- for Chrome (80%) - add `modulepreload` to get the maximum out of it.
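Putting that list together, the emitted HTML might look like this (a sketch; file names are placeholders):

```html
<!-- Chrome: modulepreload warms up fetch, parse and compile of the module -->
<link rel="modulepreload" href="/modern.mjs">
<!-- module-capable browsers load this with high priority -->
<script type="module" src="/modern.mjs"></script>
<!-- legacy browsers fall back to this one; module browsers skip it -->
<script nomodule src="/legacy.js" defer></script>
```

Note that a plain `<link rel="preload">` for the legacy file would be fetched by every browser, so the legacy `preload` from the list only makes sense when the HTML is generated per user agent.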
Right now - to be on the safe side - I would recommend using feature detection to decide which bundle you are going to use (variant 3), accompanied by `modulepreload` to make it even better for Chrome.
Keep in mind - `module` is already a high priority async resource, and you don't have to use `preload` to get it faster.
Right now only one code-splitting solution - `loadable-components` - has something like double-bundle shipping, but it's still just random code inside a GitHub issue.
How to get two bundles?
Well, this is the real problem why differential loading is not so popular - the problem with differential bundling itself.
And there are 3 ways to get it right:
- bundle twice. Just run `yarn build:es5` one time, and `yarn build:es2015` right after it. In other words:

```
BABEL_ENV="production:modern" yarn run ...
BABEL_ENV="production:legacy" yarn run ...
```

You can build them in parallel, or one after another. It would work for any bundler - just don't forget to somehow separate the file names (use a content hash).

this is the way any bundler could do the job
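A minimal `babel.config.js` sketch for such a two-pass build, switching targets on `BABEL_ENV` (the env names mirror the commands above; everything else here is my assumption, not the article's setup):

```javascript
// babel.config.js - one config, two targets, selected via BABEL_ENV
module.exports = function (api) {
  const isModern = api.env('production:modern');
  return {
    presets: [
      ['@babel/preset-env', {
        // "esmodules" targets every browser that understands <script type=module>
        targets: isModern ? { esmodules: true } : '> 0.25%, ie 11',
        bugfixes: true,
      }],
    ],
  };
};
```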
- bundle once, but twice, which is also known as Multi-compiler mode. Both webpack and parcel (2) support it. Theoretically, it's more efficient, as it could share some cache between the builds. However - that's not a fact yet - both mentioned bundlers haven't implemented that shareable cache, but it's expected in the nearest future.

this is a smart and modern way to do differential bundling
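In webpack, multi-compiler mode simply means exporting an array of configs - a sketch (entry points and output paths are placeholders; the loader setup is left as a comment):

```javascript
// webpack.config.js - multi-compiler mode: one build run, two outputs
const common = {
  entry: './src/index.js',
  mode: 'production',
};

module.exports = [
  {
    ...common,
    name: 'modern',
    output: { path: `${__dirname}/dist/modern`, filename: '[name].[contenthash].mjs' },
    // configure babel-loader with preset-env targets: { esmodules: true } here
  },
  {
    ...common,
    name: 'legacy',
    output: { path: `${__dirname}/dist/legacy`, filename: '[name].[contenthash].js' },
    // configure babel-loader with preset-env targeting ie 11 here
  },
];
```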
- bundle and rebundle. First bundle everything targeting `esmodules` (something with async/await, "modern" in short), and then transpile that bundle down to a lower target. It would be a very simple transpilation step - all the babel plugin magic, tree shaking, module concatenation and optimizations are already applied, there is not much left to do. In other words - you don't have to use `babel` for it, and could try something a bit faster:
  - swc - 16x faster than babel
  - sucrase - 20x faster than babel; however, it can't be used for this case, as it does not handle transpilation down to `es5`
  - babel itself... which could be way faster than "full" babel, with no plugins and the majority of syntax transformers removed.

this way is "bundler independent", and also solves the problem of es2015 code in `node_modules`
Solves es2015 code in `node_modules`? Yep - your code is transpiled as a whole, so it does not matter where the modern code was - in your own sources, or in `node_modules`.
For now, the only library which handles this process is devolution:
theKashey / devolution
🦎 -> 🦖 A de-evolution gun for your bundle!
DEvolution - a de-evolution gun, as seen in Mario Bros, to help you ship modern, and de-modernized, bundles
Why?
- ship more modern, more compact and faster code to 85+% of your customers
- do not worry about transpiling node_modules - use code as modern as you can, everywhere
- don't be bound to a bundler
- well, it's just faster than multi-compiler mode, and 100% customizable
- 🚀 fast - uses swc to be blazing 🔥 fast!
- 📱 multi-threaded - uses jest-worker to consume all your CPU cores
- 🗜 compact - uses terser without mangling to re-compress the result
- 🦎 optimized - uses rollup to handle polyfills
- 🦖 supports `core-js` 2 and 3
TWO bundles to rule the world
- one for "esm" (modern) browsers, which you may load using `type=module`
- another for "old" (legacy) browsers, which you may load using `nomodule`
…The only downside of `devolution` - it does not work without proper code splitting: running it on one big script would just throw an out-of-memory error. So many smaller scripts, not a single big one, is a must.
Conclusion
So, how efficient is this technique? Would it really "optimize" JS delivery? Let's measure!
- "vendor" bundle es5: 992,420 bytes
- "vendor" bundle "esmodules": 990,441 bytes (2kb diff)
- "src" bundle es5: 291,115 bytes
- "src" bundle es2015: 271,122 bytes (20kb diff)
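In relative terms (a quick check of the numbers above):

```javascript
// relative savings of the "esmodules" build over es5, from the measurements above
const savedPercent = (es5, modern) => (((es5 - modern) / es5) * 100).toFixed(1);

console.log(savedPercent(992420, 990441)); // vendor: "0.2" percent
console.log(savedPercent(291115, 271122)); // src: "6.9" percent
```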
In short - as was mentioned above - "modern" bundles cannot do much about "non-modern" code in `node_modules`, and might not do much with your own code either, if it does not use "modern" constructions like `async`/`await` or `classes` a lot.
But if it does - it would be a game changer. However - every day we use classes less and less, making a more modern compilation target less and less profitable.
However, there is light at the end of the tunnel - it looks like we could compile npm modules... back to es2015.
Another light is babel <-> lebab. A magic library which transforms your es5 code into es6 (although not 100% safely), and could uplevel existing code.
Lebab
Lebab transpiles your ES5 code to ES6/ES7. It does exactly the opposite of what Babel does. If you want to understand what Lebab exactly does, try the live demo.
Install
Install it using npm:
$ npm install -g lebab
Usage
Convert your old-fashioned code using the `lebab` CLI tool, enabling a specific transformation:
$ lebab es5.js -o es6.js --transform let
Or transform an entire directory of files in-place:
# .js files only
$ lebab --replace src/js/ --transform arrow
# For other file extensions, use explicit globbing
$ lebab --replace 'src/js/**/*.jsx' --transform arrow
For all the possible values of the `--transform` option see the detailed docs or use `--help` from the command line.
Features and known limitations
The recommended way of using Lebab is to apply one transform at a time: read what exactly the transform does and what its limitations are, apply it to your code and…
All of this is micro-optimization, which is good to have if you can get it for free and don't have to spend much time setting it up, but the real low-hanging fruit grows on other trees:
- naturally smaller bundles are always better than minimized ones.
- always check for package duplication, which is quite a common problem for any big application.
- speaking of duplication - try to remove babel helpers duplication among node_modules, for example using runtime-compress-loader. This would not affect the gzip size, but would reduce the real size of your scripts, letting them be evaluated faster.
- don't forget that JSX in its default settings uses the quite verbose `React.createElement` syntax - use jsx-compress-loader or [react-local](https://github.com/danya/react-local) to make it more tidy.
- consider SSR to improve First Contentful Paint, and HTML state to mitigate the hydration uncanny valley problem.
For example - optimizing CSS delivery could give you more...