DEV Community

Marcus Kohlberg for Encore


Encore.ts — 17x Faster cold starts than NestJS & Fastify

A couple of months ago we released Encore.ts — an Open Source backend framework for TypeScript.

Since there are already many frameworks out there, we want to share some of the outlier design decisions we've made and how they lead to remarkable performance numbers.

We recently published performance benchmarks showing that Encore.ts achieves 9x the request throughput of Express.js, and 2x that of Fastify.

Today, we're continuing on our performance journey by diving into how Encore.ts achieves incredibly fast cold start times.

Performance benchmarks

This time we've benchmarked Encore.ts, Fastify, NestJS and Express to see how each framework performs when it comes to cold startup times.

The benchmark program registers 10 API endpoints, each with a simple schema, and sets up schema validation.
For schema validation we used Zod where possible.
In the case of Fastify we used Ajv as the officially supported schema validation library.
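As a rough illustration of the benchmark's shape (this is not the actual benchmark code), here is a hypothetical endpoint with a simple schema. A hand-rolled type check stands in for Zod/Ajv so the snippet stays dependency-free:

```typescript
import { createServer } from "node:http";

// Simple schema for one endpoint: { name: string, age: number }
type Payload = { name: string; age: number };

// Minimal stand-in for Zod/Ajv validation
function validate(body: unknown): body is Payload {
  const b = body as Record<string, unknown>;
  return typeof b === "object" && b !== null &&
    typeof b.name === "string" && typeof b.age === "number";
}

const server = createServer((req, res) => {
  let data = "";
  req.on("data", (chunk) => (data += chunk));
  req.on("end", () => {
    let body: unknown;
    try { body = JSON.parse(data); } catch { body = null; }
    if (!validate(body)) {
      res.statusCode = 400;
      res.end("invalid payload");
      return;
    }
    res.end(`hello ${body.name}`);
  });
});
// The benchmark registers ten endpoints like this; calling listen()
// marks the server as "ready".
```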

We measured the time from when JavaScript code begins executing until the server is ready to accept incoming requests.
For each benchmark we took the best result of five runs.
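The measurement itself can be sketched in plain Node.js (no framework, hypothetical entrypoint): record a timestamp on the first line of the entrypoint, then log the elapsed time once the server is listening.

```typescript
import { createServer } from "node:http";

const start = performance.now(); // first line of the entrypoint in a real app

const server = createServer((_req, res) => res.end("ok"));

server.listen(0, () => {
  const readyMs = performance.now() - start;
  console.log(`ready to accept requests after ${readyMs.toFixed(1)} ms`);
  server.close();
});
```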

Enough talk, let's dig into the numbers!

Encore.ts cold starts are 17x faster than NestJS and Fastify

Cold starts benchmark graph

(Check out the benchmark code on GitHub.)

As you can see, Encore.ts achieves remarkably fast cold start times: over 5x faster than Express and over 17x faster than NestJS.

How is this possible? From our testing we've identified two major sources of this performance advantage, both related to how Encore.ts works under the hood.

But before we get there, let's talk about what cold starts really are, and why they matter.

What is a cold start?

In the context of serverless, a cold start is when the underlying platform first needs to spin up a new instance of your server in order to serve an incoming request. (It can also refer to the first time a new instance of your server starts up to handle a request, for example after a deployment.)

Since the request is effectively on hold until the process starts up and is ready to handle the request, reducing cold startup times can have a large impact on the long-tail latency of your application.

This is especially important for distributed systems where you have multiple serverless functions, as it's much more likely you will encounter a cold start in some part of the system when handling a request.

The anatomy of a cold start

Exactly what happens during a cold start depends a bit on the platform you're deploying to (Kubernetes, Lambda, Cloud Run, etc.).
But in general, the process looks something like this:

  1. Platform downloads the code/container image for the serverless function
  2. Platform spins up a new instance of the container/serverless function
  3. The container/function initializes itself (importing JavaScript modules, running initialization code, etc.)

After these initialization steps the cold start is complete, and the serverless function begins processing the incoming request.

The first two steps are largely out of our control (other than by making sure the size of the code/container is optimized), so let's focus our attention on the third step.

In fact, let's break the third step down further, assuming we're running Node.js:

  1. The Node.js process starts up and begins initializing the V8 JavaScript engine
  2. The entrypoint file is parsed, loaded, and begins executing application code
  3. When the JavaScript code executes import and require statements, yet more files are loaded, parsed and executed. (Repeat many times for applications with lots of dependencies.)

Finally, after all dependencies have been loaded and all the initialization code has executed, the container/serverless function is ready to handle incoming requests.
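To get a feel for step 3, the parse-and-execute cost of each additional module can be approximated with `node:vm`, which lets us time parsing and executing a tiny synthetic "module" in isolation (a synthetic source string, not real dependency code):

```typescript
import { Script } from "node:vm";

// A tiny stand-in for one module's source code
const source = "const xs = []; for (let i = 0; i < 1000; i++) xs.push(i); xs.length;";

const t0 = performance.now();
const script = new Script(source);        // parse
const result = script.runInNewContext();  // execute; returns the completion value
const elapsedMs = performance.now() - t0;

console.log(`parsed + executed in ${elapsedMs.toFixed(3)} ms, result = ${result}`);
// A real application repeats this work for every file in its dependency tree.
```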

Optimizing cold starts

The breakdown above gives us clear targets for optimization, and Encore.ts heavily optimizes all the steps it has control over.

Optimization 1: Rust runtime

Encore.ts is implemented in Rust and loaded into Node.js as a native module. This has several benefits for cold starts:

Less JavaScript to parse and execute. Since JavaScript is an interpreted language, all JavaScript code needs to be read from disk, parsed, and executed. Encore.ts, as a pre-compiled native module, loads extremely quickly and doesn't need to be parsed or executed by the JavaScript engine (V8).

Zero NPM dependencies. Since Encore.ts implements all its functionality using Rust, it has no NPM dependencies whatsoever, which further reduces the amount of JavaScript that needs to be executed during a cold start.

Pre-compiled and optimized. JavaScript relies heavily on just-in-time compilation (JIT), where code that gets executed repeatedly gets optimized by the JavaScript engine. This makes a lot of sense for an interpreted language, but it also means that execution is quite a bit slower the first time a piece of code runs, which impacts cold starts considerably. Since Encore.ts is implemented in Rust, it's pre-compiled and heavily optimized for the platform it's running on, which means it's fast from the first time it's executed.
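The JIT warm-up effect is easy to observe in plain JavaScript (a generic micro-demo, not Encore code): the first invocation of a function includes interpretation and compilation overhead, while later invocations run the optimized machine code V8 has produced.

```typescript
// A small CPU-bound function for V8 to optimize
function checksum(n: number): number {
  let acc = 0;
  for (let i = 0; i < n; i++) acc = (acc + i * 31) % 1_000_003;
  return acc;
}

function timeCall(): number {
  const t = performance.now();
  checksum(100_000);
  return performance.now() - t;
}

const first = timeCall();                 // cold: includes JIT warm-up
for (let i = 0; i < 50; i++) timeCall();  // give V8 a chance to optimize
const warm = timeCall();                  // typically (not always) faster
console.log({ first, warm });
```

A pre-compiled native module skips this warm-up entirely, which is why it runs at full speed on the very first call.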

Optimization 2: Efficient Docker images

Encore.ts by default builds minimal Docker images, including only the transpiled JavaScript and the dependencies necessary to run the application. This reduces image size, which in turn reduces the time it takes to download and start the container.
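The general technique looks like a multi-stage Docker build (a hypothetical sketch of the idea; Encore.ts generates its images automatically, and the paths and commands below are illustrative assumptions):

```dockerfile
# Build stage: compile TypeScript with all dev dependencies available
FROM node:20 AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npx tsc --outDir dist

# Runtime stage: only transpiled JavaScript + production dependencies
FROM node:20-slim
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev
COPY --from=build /app/dist ./dist
CMD ["node", "dist/main.js"]
```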

Additionally, several compute platforms have added support for streaming Docker images, which means that the platform can start the container before the entire image has been downloaded. Encore.ts has built-in support for this, and automatically prioritizes the parts of the image that are needed to reduce cold starts.

Wrapping up

By combining a Rust runtime with optimized Docker images, Encore.ts is able to achieve remarkable cold start times, which can have a large impact on the long-tail latency of your application.

If performance matters to your project, it might be a good idea to try out Encore.ts.

And it's all Open Source, so you can check out the code and contribute on GitHub.

Or just give it a try and let us know what you think!

Top comments (18)

DevMirza

Babe wake up! New framework just dropped 🔥

Jesse Reese

Haha

Eugene

As always, the hundredth js framework released in a week claims to be faster than rust…

AndrewBarrell1

I don't remember reading anywhere that it claimed to be faster than rust, only that it is built in rust.

Where did it claim that?

Eugene

For ignorant, it was a metaphor

Big Will

Please stop with the metaphors. It was a misleading comment

AndrewBarrell1

It wasn't a metaphor for anything. Just a guy that didn't read an article but still had something negative to say.

Willfully ignorant and calling other people ignorant? I actually read the article

Eugene • Edited

I come across tens of similar articles about “how good this new js framework, how it outperforms everything else” each day. Shit.

Kusuma At Essa E

Bio

Lee Nattress

Am I reading this correctly, that you are dealing with multiple endpoints in a single Lambda?

Don't do this. One per Lambda. This benchmark is pointless.

Mohamed Lamine Allal • Edited

Container serverless. Like cloud run does have cold starts. And this would be useful for them. Microservices applications ...

Having containers get started only when there is traffic save compute time and can be of better value for some type of applications.

And u would have cold starts.

Nice also to know the context of encore

encore.dev/docs/introduction

Microservices, cloud, distributed systems ...

Gerson Flores

The approach of handling multiple endpoints with a single lambda is known as Lambda Proxy Integration, I don't see any problem with that approach.

Nevo David

I am just looking at Encore.
I thought it could be a nice NestJS alternative, but after going in, I realized it's more like Express.

I am unsure why you compare a robust IoC framework to express/fastify style.
Also, who runs NestJS on a lambda haha

Mohamed Lamine Allal • Edited

Cloud run ...
Container serverless.
Microservices ..

Thang Vu

I do! What's wrong with running NestJS on Lambda? :)

JK

I feel like this is quite flawed. My typical start up time for my various nestjs apps are measured in milliseconds.

programORdie

Most hosting services don’t even support loading binaries/native modules…
