Server Side Rendering (SSR) seems to be all the rage. Hydration strategies are the talk of the town, and honestly, it's been a refreshing change from the client-side, JS-heavy framework status quo. However, I'm always surprised at how little exploration of service workers takes place in these discussions.
Single Page Application Progressive Web App (whew) architecture has been well established by now: you build your application shell, precache your required assets, and fetch the dynamic data that makes your app do what your app does. Additionally, Single Page Applications (SPAs) are usually relatively easy to PWA-ify after they're already built.
The same can't be said for Multi Page Applications (MPAs), however; for MPAs you really have to bake any kind of offline capability into your architecture right from the start of your project. And I just can't help but feel there's currently no really good solution for PWA-ifying MPAs that has an excellent developer experience, like so many JS frameworks or SSR frameworks have. Static Site Generators don't seem to be investing in this space a whole lot, either. In fact, there are only a handful of solutions I could find for this kind of architecture at all!
Service Workers on the Server
One of those solutions is made by the brilliant Jeff Posnick. His blog jeffy.info is completely rendered by a service worker. The initial render happens on the server, in a Cloudflare Worker, which uses the same APIs as a service worker that would run in the browser. What's interesting about this approach is that it allows Jeff to reuse the same code on his server as well as on the client. This is also known as isomorphic rendering.
When the user visits the blog for the first time, the Cloudflare Worker renders the page, and on the client side the service worker starts installing. Once the service worker has installed, it can take control of network requests and serve responses itself; potentially bypassing the server entirely, and delivering instant responses.
You can read all about how Jeff built his blog on, well, his blog about it, but it mainly comes down to: streams.
Stream Stitching
Consider the following example:
```js
registerRoute(
  new URLPatternMatcher({pathname: '/(.*).html'}).matcher,
  streamingStrategy(
    [
      () => Templates.Start({site}),
      async ({event, params}) => {
        const post = params.pathname.groups[0];
        const response = await loadStatic(event, `/static/${post}.json`);
        if (response?.ok) {
          const json = await response.json();
          return Templates.Page({site, ...json});
        }
        return Templates.Error({site});
      },
      () => Templates.End({site}),
    ],
    {'content-type': 'text/html'},
  ),
);
```
Here, Jeff makes use of a pattern I'll refer to as stream stitching. This is cool, because browsers can already start rendering streamed HTML as it arrives. It also means you can stream the `<head>` of your page first, so the browser can already start downloading scripts, parsing styles, and fetching other assets while waiting for the rest of the HTML to come streaming in.
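Stripped of Workbox, the stitching itself fits in a few lines. Here's a minimal sketch under my own naming; `stitchStreams` and the part functions are illustrative, not Workbox APIs:

```javascript
// Minimal sketch of stream stitching: each "part" is a function returning a
// string (or a promise of one). The parts are encoded and enqueued in order
// into a single ReadableStream, so the browser can render HTML as it arrives.
function stitchStreams(parts) {
  const encoder = new TextEncoder();
  return new ReadableStream({
    async start(controller) {
      for (const part of parts) {
        // Slow parts simply delay later chunks; earlier chunks flush immediately.
        controller.enqueue(encoder.encode(await part()));
      }
      controller.close();
    },
  });
}

// A fetch handler could return this response right away, before the slow
// middle part has even resolved:
const body = stitchStreams([
  () => '<html><head>…</head><body>',
  async () => '<main>dynamic content</main>', // would await a fetch in practice
  () => '</body></html>',
]);
const response = new Response(body, {
  headers: { 'content-type': 'text/html' },
});
```

The `Response`, `ReadableStream`, and `TextEncoder` globals used here are available both in service workers and in Cloudflare Workers, which is what makes the isomorphic setup possible.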
While from a technical point of view this is really exciting, I can't help but feel the developer experience is somewhat... lacking. Workbox does an excellent job of providing abstractions over streaming APIs so you don't have to do things manually, and helps with things like registering and matching routes, but even then it still feels somewhat close to the metal, especially compared to the developer experience of all these flashy SSR frameworks. Why can't we have nice things with service workers?
Service Worker Side Rendering with Astro
I've recently been hacking on Astro SSR projects a bunch, and was looking into creating a Cloudflare adapter to deploy my Astro SSR application to a Cloudflare environment. It was while reading up on Cloudflare Workers that I was reminded of this chat by Jeff Posnick and Luke Edwards about his blog and the architecture laid out earlier in this blogpost, and it made me wonder: if I'm able to deploy Astro on an environment that's so similar to a service worker... why can't I run Astro in an actual service worker?
So I started hacking on some code and, well, it turns out you totally can. In this example, you can see a real Astro SSR application run by a service worker. This is exciting for several reasons:
- Your Astro app is now offline-capable
- Your app is now installable
- The function invocations of your hosting provider are reduced dramatically, because requests can be served by the service worker in-browser
- Huge performance benefits
- It's a progressive enhancement
But most of all, it may mean we're getting super close to having an excellent developer experience! Astro may very well be the first framework capable of delivering us developer experiences like the following:
`/blog/[id].astro`:

```astro
---
import Header from '../src/components/Header.astro';
import Sidemenu from '../src/components/Sidemenu.astro';
import Footer from '../src/components/Footer.astro';

const { id } = Astro.params;
---
<html>
  <Header/>
  <Sidemenu/>
  {fetch(`/blog/${id}.html`)}
  <Footer/>
</html>
```
Wouldn't this be amazing? This code could run on the server as well as in a service worker. However! As cool as this would be, we're not quite there. Currently, Astro doesn't yet support streaming responses; we'll get into that in a little bit, but for now, dream along with me for a minute.
What would happen in this code snippet is the following: on the initial visit, the server renders the page, much like in Jeff's blog example. The service worker then gets installed and can take control of requests, which means that from then on the exact same code can be rendered by the service worker in the browser instead, delivering responses immediately.
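That handoff starts with a plain registration on the client. A rough sketch, assuming the worker is served at `/sw.js` (the path and function name are my own):

```javascript
// Register the rendering service worker. Until it has installed and taken
// control, the server keeps answering requests, so this is purely a
// progressive enhancement.
function registerRenderingWorker(path = '/sw.js') {
  if (typeof navigator === 'undefined' || !('serviceWorker' in navigator)) {
    return null; // SSR pass or an older browser: the server keeps rendering
  }
  return navigator.serviceWorker.register(path);
}

registerRenderingWorker();
```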
Furthermore, in this example the `<Header/>` and `<Sidemenu/>` are static components and can be streamed immediately. The `fetch` promise returns a response, whose body is... you guessed it, a stream! This means the browser can already start rendering the header (which may kick off downloads of other assets), render the sidemenu, and then immediately start streaming the result of the `fetch` to the browser.
Isomorphic rendering
We could even expand on this pattern:
```astro
---
import Header from '../src/components/Header.astro';
import Sidemenu from '../src/components/Sidemenu.astro';
import Footer from '../src/components/Footer.astro';

const { id } = Astro.params;
---
<html>
  <Header/>
  <Sidemenu/>
  {fetch(`/blog/${id}.html`).catch(() => {
    return caches?.match?.('/404.html') || fetch('/404.html');
  })}
  <Footer/>
</html>
```
Imagine we visited a URL with an `id` that doesn't exist. If the user doesn't have a service worker installed yet, the server would:

- Try to fetch `/blog/${id}.html`, which fails
- Run the `catch` callback, and try to execute `caches?.match?.('/404.html')`, which we don't have access to on the server
- Fall back to `|| fetch('/404.html')` instead

However, if the user does have a service worker installed already, it could have precached `'/404.html'` during installation, and just load it instantly from the cache.
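That installation step could look something like the following sketch; the cache name and `precache` helper are hypothetical, with the logic factored into a function so it can run against any Cache-like object:

```javascript
// Precache the offline fallback while the service worker installs.
const PRECACHE_URLS = ['/404.html'];

async function precache(cache, urls = PRECACHE_URLS) {
  await cache.addAll(urls); // addAll fetches and stores each URL
  return urls.length;
}

// Wired up inside the service worker itself, this would be:
// self.addEventListener('install', (event) => {
//   event.waitUntil(caches.open('precache-v1').then((cache) => precache(cache)));
// });
```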
You can probably even imagine some helpers like:
```astro
<Header/>
{cacheFirst(`/blog/${id}.html`)}
{staleWhileRevalidate(`/blog/${id}.html`)}
{networkFirst(`/blog/${id}.html`)}
<Footer/>
```
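None of these helpers exist today, but a `cacheFirst` could be sketched in a few lines. In this hypothetical version the cache lookup and network call are injected, so the same logic works on the server (which has no Cache API) and in a service worker:

```javascript
// Hypothetical cacheFirst helper: serve from the cache when possible, fall
// back to the network otherwise. In a service worker, `cacheLookup` would be
// (url) => caches.match(url) and `fetchFn` the global fetch; on the server,
// cacheLookup would simply resolve to undefined.
async function cacheFirst(url, { cacheLookup, fetchFn }) {
  const cached = await cacheLookup(url);
  if (cached !== undefined) return cached; // instant response from the cache
  return fetchFn(url);                     // otherwise hit the network
}
```

`staleWhileRevalidate` and `networkFirst` would be variations on the same shape, differing only in which source wins and whether the cache gets refreshed in the background.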
The downsides
Not quite there yet
Currently, Astro's responses are not streamed. Nate, one of Astro's core maintainers, did however mention that:

> The good news about Astro is that streaming has been the end goal since day one! We don’t need any architecture changes to support it—Astro components are just async iterators. We’ve mostly been waiting for SSR APIs to stabilize before exposing streaming.
Consider the following code snippet from Astro's source code:
```ts
export async function render(htmlParts: TemplateStringsArray, ...expressions: any[]) {
  return new AstroComponent(htmlParts, expressions);
}
```
Where an `AstroComponent` looks like:
```js
class AstroComponent {
  constructor(htmlParts, expressions) {
    this.htmlParts = htmlParts;
    this.expressions = expressions;
  }

  get [Symbol.toStringTag]() {
    return "AstroComponent";
  }

  *[Symbol.iterator]() {
    const { htmlParts, expressions } = this;
    for (let i = 0; i < htmlParts.length; i++) {
      const html = htmlParts[i];
      const expression = expressions[i];
      yield markHTMLString(html);
      yield _render(expression);
    }
  }
}
```
As Nate said, just an async iterator. This means that it could potentially even allow for promises and iterables in Astro expressions, e.g.:
```astro
---
import Header from '../src/components/Header.astro';

function* renderLongList() {
  yield "item 1";
  yield "item 2";
}
---
<html>
  <Header/>
  {renderLongList()}
</html>
```
Or the example with `fetch` we saw earlier in this post:

```astro
<Header/>
<Sidemenu/>
{fetch(`/blog/${id}.html`)}
<Footer/>
```
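Rendering such expressions to a stream mostly amounts to walking the iterator and flattening whatever it yields. A rough sketch of that idea (not Astro's actual renderer):

```javascript
// Flatten an iterable of render chunks — strings, promises, or nested sync
// iterables like generators — into a flat async iterator of strings, which
// is exactly the shape a streamed response body needs.
async function* flatten(iterable) {
  for await (const chunk of iterable) { // for await unwraps yielded promises
    const value = await chunk;
    if (typeof value === 'string') {
      yield value;
    } else if (value != null && typeof value[Symbol.iterator] === 'function') {
      yield* flatten(value); // recurse into nested iterables
    } else {
      yield String(value);
    }
  }
}
```

In runtimes that support it, something like `ReadableStream.from(flatten(component))` would then turn the flattened iterator into a streamed response body.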
There's currently some discussion ongoing in this RFC on the Astro repository. If this is a future that you're excited about, please do leave a comment to signal some interest to the maintainers. There is a cost, however. There have also been other feature proposals that would make streaming responses impossible, such as post-processing HTML, or the concept of an `<astro:head>` element, where a child component can append to the head. Neither of those is compatible with streaming responses. Although, perhaps these features don't have to be mutually exclusive; maybe rendering could even be made configurable via the `astro.config.mjs`:
```js
export default defineConfig({
  ssr: {
    output: 'stream'
  }
});
```
Much to think about and consider, but either way, please do check out the RFC discussion and leave your thoughts, or simply an upvote/emoji!
Bundle size
The other downside is bundle size. Admittedly, Astro's bundle when run in a service worker is... large. I've not done much experimentation here yet, but it seems like there's a lot of room for improvement on bundle size.
Astro-service-worker
While streaming responses in Astro may be a ways off yet, I did turn my service worker experimentation into an Astro Integration that you can already use today: `astro-service-worker`. This integration takes your Astro SSR project and creates a service worker build for it.
Getting started is easy. Install the dependency:

```shell
npm i -S astro-service-worker
```
And add the integration to your `astro.config.mjs`:

```diff
 import { defineConfig } from 'astro/config';
 import netlify from '@astrojs/netlify';
+import serviceWorker from 'astro-service-worker';

 export default defineConfig({
   adapter: netlify(),
   integrations: [
+    serviceWorker()
   ]
 });
```
Demo
You can find an example of a small app that uses `astro-service-worker` in this demo, and you can find the source code for the demo here.
Server-first, server-only, service-worker-first, service-worker-only
When service-worker-izing your Astro application, you have to keep in mind that the code you write in your Astro frontmatter should now also be able to run in the browser. This means that you can't make use of any CommonJS dependencies, or Node built-ins like `fs`, for example. However, it could be the case that you need some server-only code, like for accessing a database, webhooks, redirect callbacks, or whatever. In that case, you can exclude those endpoints from the output service worker bundle.
This means that you can have an entire fullstack codebase with: Server-first, server-only, service-worker-first, and service-worker-only code in the same project. Additionally, the service worker is entirely a progressive enhancement. If your user uses a browser that doesn't support service workers, the server will still render your app just fine.
Network-only
It could be the case that you would like to make use of some server-only endpoints or pages, perhaps for creating database connections, or other things that depend on Node.js built-in modules that are not available in the browser. If that is the case, you can specify which pages you'd like to exclude from the service worker bundle:
```js
export default defineConfig({
  integrations: [
    serviceWorker({
      networkOnly: ['/networkonly-page', '/db-endpoint', 'etc']
    }),
  ]
});
```
Customize Service Worker logic
You can also extend the service worker and add your own custom logic. To do this, you can use the `swSrc` option.
```js
export default defineConfig({
  integrations: [
    serviceWorker({
      swSrc: 'my-custom-sw.js',
    }),
  ]
});
```
`my-project/my-custom-sw.js`:

```js
self.addEventListener('fetch', (e) => {
  console.log('Custom logic!');
});
```
Combine with other integrations
You can even combine this with other SSR integrations; if your components are SSR-able, they should also be SWSR-able! Do note, however, that there may be some differences between a traditional server environment and a service worker, which means there may be additional things you need to shim.
```js
import { defineConfig } from 'astro/config';
import netlify from '@astrojs/netlify';
import customElements from 'custom-elements-ssr/astro.js';
import serviceWorker from 'astro-service-worker';

export default defineConfig({
  adapter: netlify(),
  integrations: [
    customElements(),
    serviceWorker()
  ]
});
```