I have been intending to move my writing off of Medium for a while now, and while I haven't put in the work to move old posts away, I'm at least st...
I approve this message.
There is definitely a case to be made that trading bandwidth for performance is a winning exchange for desktop users with big pipes and no data caps.
I don't think people with data caps on their mobile plan will appreciate all their data disappearing, though. Which I guess doesn't matter anyway, because how do you even trigger preloading when there's no hover state on mobile?
On mobile devices, preloading starts on “touchstart”, giving 300 ms (Android) to 450 ms (iOS) for preloading the page.
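To make the mechanism concrete, here is a rough sketch of hover/touchstart preloading in TypeScript. It is illustrative only, not InstantClick's actual source; the cache and helper names are made up.

// Fetch a page's HTML as soon as intent is signalled, so the real navigation
// can reuse the already-downloaded document.
const preloaded = new Map<string, Promise<string>>();

function preload(url: string): void {
  if (!preloaded.has(url)) {
    preloaded.set(url, fetch(url).then((res) => res.text()));
  }
}

function linkFrom(e: Event): HTMLAnchorElement | null {
  return (e.target as Element).closest<HTMLAnchorElement>('a[href]');
}

// Desktop: hover is the earliest signal of intent.
document.addEventListener('mouseover', (e) => {
  const link = linkFrom(e);
  if (link) preload(link.href);
});

// Mobile: no hover, so touchstart is the earliest signal -- roughly 300 ms
// (Android) to 450 ms (iOS) before the click event fires.
document.addEventListener('touchstart', (e) => {
  const link = linkFrom(e);
  if (link) preload(link.href);
});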
Thank you for clarifying
I think what you are describing at the end is Next.js: you use React to build your app, and you can either generate the HTML at build time or at runtime by querying the DB (or an API) directly. You can then leverage HTTP and CDN caching, and it will also prefetch the code for new pages when you hover over their links.
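For readers who haven't used it, the build-time vs request-time choice described above maps to which data function a Next.js page exports. A minimal sketch (pages router; the page name and data helper are hypothetical):

// getStaticProps runs at build time; getServerSideProps runs on every request.
import type { GetStaticProps } from 'next';

type Props = { title: string };

export default function ArticlePage({ title }: Props) {
  return <h1>{title}</h1>;
}

// Build-time HTML generation (SSG):
export const getStaticProps: GetStaticProps<Props> = async () => {
  return { props: { title: await fetchTitleFromDb() } };
};

// Or render per request instead (SSR) -- a page exports one or the other:
// export const getServerSideProps: GetServerSideProps<Props> = async () => {
//   return { props: { title: await fetchTitleFromDb() } };
// };

async function fetchTitleFromDb(): Promise<string> {
  return 'Hello from the database'; // stand-in for a real DB/API query
}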
I second this. This is the problem Next.js intends to solve with SSR, which gives it an edge in SEO over standard CSR solutions.
I agree with this article because I'm tired of loading times with CSR apps; it kills me a little bit inside every time. I don't know how people can be so blind or have so much patience.
Hi Kenny, great to see you here! And hello Sergio!
My understanding of Next.js is that it handles SSR and React routing. I think this is different from what I have in mind, which is essentially static site generation (except not to final HTML output, but to templated views for MVC).
The conceptual difference is this: SSR still renders the output HTML in real time (only in server-side code instead of client-side code), so it still takes server resources to do this on demand. SSG (static site generation) is roughly what the JAMstack is about -- you pregenerate HTML from your React/Vue/Svelte frontend components, and it simply lives as static HTML files on the web server (and in the case of JAM, the pages talk to an API server afterwards). In the case of what I propose, the components are instead rendered to templated views, where an MVC framework could just call
render(pagename, variables);
at the end of a controller to render the view. Perhaps I would need to run some benchmarks to know whether there are actually significant performance gains to be had at all. Like I said, I spent no more time thinking about it than it took to write down the thoughts, so more work is warranted. :)
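To make the idea a bit more concrete, here is a minimal sketch assuming an Express-style controller with EJS-like templates; all names are hypothetical. The point is only that the view was pregenerated from frontend components at build time, so at request time the controller just fills in variables.

import express from 'express';

const app = express();
app.set('view engine', 'ejs'); // views/ holds templates pregenerated from components at build time

app.get('/articles/:slug', async (req, res) => {
  const article = await loadArticle(req.params.slug); // hypothetical data access
  // The render(pagename, variables) call at the end of the controller:
  res.render('article', { title: article.title, body: article.body });
});

// Hypothetical stand-in for a DB or API query.
async function loadArticle(slug: string) {
  return { title: slug, body: '...' };
}

app.listen(3000);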
This article might interest you Bigi. It talks about the frontend stack including InstantClick.
dev.to’s Frontend: a brain dump in one act
Nick Taylor (he/him) ・ Apr 23 '19 ・ 8 min read
I wrote it before I started working at DEV, and I believe it was my now co-worker @jacobherrington who integrated parts of that article into the DEV docs. Since starting at DEV I've also updated the docs related to the frontend, including the webpacker 4 upgrades.
Other things that we do on the frontend to make things faster are leveraging service workers as well as dynamic imports. You can see dynamic imports in action in, for example, the onboarding process or the logged-on feed.
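As an aside for readers who haven't used them, a dynamic import boils down to loading a module only when a code path actually needs it. A tiny illustration (the module name is just an example, not DEV's actual file layout):

// The onboarding bundle is fetched lazily the first time the user reaches
// that flow, keeping it out of the initial page load.
async function startOnboarding(): Promise<void> {
  const { runOnboarding } = await import('./onboarding'); // hypothetical module
  runOnboarding();
}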
In regards to reusable components, we use Preact components mainly for the logged-on experience. The only exception to this, I believe, is search.
Something that I would like to experiment with during the next cooldown period, or the following one, is server-side rendering Preact components in a Rails app. I put the question out on Twitter and the creator of Preact, @_developit, chimed in that it is possible.
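For reference, the Node side of that usually comes down to preact-render-to-string; how the resulting HTML string gets handed back to Rails (ExecJS, a Node sidecar, etc.) is the part that would need experimenting. A minimal sketch with a placeholder component:

import { h } from 'preact';
import render from 'preact-render-to-string';

// Placeholder component, just to show the shape of the call.
function Greeting({ name }: { name: string }) {
  return h('p', null, `Hello, ${name}!`);
}

const html = render(h(Greeting, { name: 'DEV' }));
console.log(html); // <p>Hello, DEV!</p>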
@ben also has some great posts about making DEV fast.
Making dev.to Incredibly fast
Ben Halpern ・ Feb 2 '17 ・ 5 min read
What it Takes to Render a Complex Web App in Milliseconds
Ben Halpern ・ Aug 18 '17 ・ 4 min read
Instant Webpages and Terabytes of Data Savings Through the Magic of Service Workers ✨
Ben Halpern ・ Dec 18 '19 ・ 5 min read
Thanks for showing DEV some love and looking forward to your next article! 👋🏻
Great links! Just finished reading the frontend brain dump, planning to read the others as well.
Thank you so much for using React/Preact in separate modules dispersed across the site, instead of as a SPA. I believe this is the right way to go for content-based web apps. Too many tech startups default to React and a SPA just because that's what everyone else is doing, and perhaps in some ways it's "easier" to go full SPA when you already want to use React (for organizational and team-scaling reasons). When React modules are small and separate on a page, they are really snappy. When it's a SPA and entire pages change, it feels way more sluggish than what you do with InstantClick and HTML output from the server. I wish what you do were the standard across the web among content sites.
As someone using exactly none of the above -- .NET Core with no JS frameworks at all -- even I see potential immediate benefits from InstantClick-like acceleration. The frontend is about as simple as these things get: one main CSS and JS file alongside a few helpers, so adding something like this should be relatively easy. Thanks for posting!
That new paradigm might already exist. I use unpoly.com/, which together with any server-side templating that lets you componentize your markup (I use my own JavaScript template-string-based wrapper, npmjs.com/package/html-string) makes a very productive stack. And it's faster than any SPA framework I've ever used.
Very nice, I had not heard of Unpoly before. Reading the "How it works" and skimming through some of what it does, it reminds me of what Intercooler.js does. It's also a lib I always meant to try on a project but never got a chance to.
Intercooler.js now has a jQuery-free sibling called htmx: htmx.org/
Another similar lib is Trimmings: postlight.github.io/trimmings/
Any tutorial on how to add InstantClick to a Rails app? I added it to my JavaScript packs, but somehow I get an
InstantClick is not defined
error in the console. Do you have to do anything to disable Turbolinks when adding InstantClick? Thanks. Issue described here: stackoverflow.com/questions/611164...
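In case it helps anyone landing here: InstantClick's documented setup is just loading the script near the end of the body and calling InstantClick.init(), with data-no-instant on those script tags. A sketch of the init call as it might look from a pack; the Turbolinks note is my assumption, since both libraries intercept link clicks and running both seems likely to conflict.

// InstantClick's documented usage is roughly:
//   <script src="instantclick.min.js" data-no-instant></script>
//   <script data-no-instant>InstantClick.init();</script>
// If it's bundled through a webpacker pack instead, "InstantClick is not
// defined" typically means the global wasn't exposed on window before this
// code runs, which is worth checking before anything Turbolinks-related.
declare const InstantClick: { init(): void };

document.addEventListener('DOMContentLoaded', () => {
  InstantClick.init();
});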
I have to admit that dev.to is so fast that it sometimes borders on annoying me. I kid you not, I keep wondering how the heck they manage to load an online page faster than I can load local HTML pages.
This is the sort of thing that might already be proposed as a feature on the GitHub for dev.to - since it is open-source after all, you can even suggest things yourself. How cool is that?
Totally agree. How is dev.to so fast? Unbelievable even in Seoul, Korea.
Korea's average internet speeds are the fastest in the world as of the last stats I checked, which makes dev.to's speed even more remarkable!
Great thoughts, loved it.
That is true.