If you stumbled upon this article, you’re probably wondering what Bun is. You’re in luck, as I’m about to tell you everything there is to know about Bun.
So what is Bun? Essentially, it’s a new JS runtime, similar to Node. Unlike Node, however, Bun is insanely fast. Like seriously, seriously fast. We’ll get to the numbers later; first, let’s look at the existing problems with Node.
What’s wrong with Node?
Node has been around since 2009, and the web and server ecosystem has changed vastly since then. Many of Node’s issues have been covered by its creator, Ryan Dahl (in this conference talk). A quick TL;DR is that Node can’t run TypeScript or JSX out of the box and has no built-in support for loading .env files. Moreover, its package manager, npm, is famous for the node_modules folder of doom.
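To make that concrete, here’s a minimal sketch of the kind of file Bun runs as-is (the file name and the GREETING_NAME variable are made up for illustration). Bun transpiles the TypeScript on the fly and loads .env files automatically, whereas with Node 18 you’d typically reach for a transpiler such as ts-node plus a package like dotenv first.
// ./scripts/hello.ts
// Run with: bun run ./scripts/hello.ts
interface Greeting {
  name: string;
}

const greeting: Greeting = {
  // Bun reads ./.env automatically; GREETING_NAME is a hypothetical variable
  name: process.env.GREETING_NAME ?? "world",
};

console.log(`hello ${greeting.name}`);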
How is it so fast?
Bun is built with Zig, a low-level programming language with manual memory management. It uses the JavaScriptCore engine (the engine behind WebKit/Safari), which tends to be a little more performant than Google’s V8 engine.
Bun mostly credits its speed to Zig, stating the following on its website:
Zig’s low-level control over memory and lack of hidden control flow makes it much simpler to write fast software.
Benchmarks
Jarred Sumner, the creator of Bun, has posted numerous benchmarks on Twitter comparing Bun’s speed to Node and Deno. Below, I’m going to run some tests locally to see if Bun really stands up to the other runtimes. In each test, the script will simply write a small text file to disk. I’m using mitata to measure the timings.
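If you want to reproduce these numbers, mitata is a regular npm package, so installing it is a single command (shown here with both Bun’s and Node’s package managers; the Deno script imports it straight from a CDN, so it needs no install step):
➜ bench bun add mitata
➜ bench npm install mitata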
Testing Bun
// ./scripts/bun.js
import { write } from "bun";
import { bench, run } from "mitata";

const file = "./out/bun.txt";

// Benchmark Bun's native file-writing API
bench("bun:write", async () => {
  await write(file, "hello world");
});

await run();
➜ bench bun ./scripts/bun.js
cpu: Apple M1
runtime: bun 0.1.6 (arm64-darwin)
benchmark time (avg) (min … max) p75 p99 p995
------------------------------------------------- -----------------------------
bun:write 76.86 µs/iter (64.79 µs … 2.35 ms) 75.5 µs 139.38 µs 246.17 µs
Testing Node
// ./scripts/node.mjs
import { writeFileSync } from "fs";
import { bench, run } from "mitata";

const file = "./out/node.txt";

// Benchmark Node's built-in (synchronous) file-writing API
bench("node:write", async () => {
  writeFileSync(file, "hello world");
});

await run();
➜ bench node ./scripts/node.mjs
cpu: Apple M1
runtime: node v18.7.0 (arm64-darwin)
benchmark time (avg) (min … max) p75 p99 p995
-------------------------------------------------- -----------------------------
node:write 94.55 µs/iter (65.92 µs … 29.59 ms) 78.29 µs 129.25 µs 217.13 µs
Testing Deno
// ./scripts/deno.mjs
import { bench, run } from "https://esm.run/mitata";

const file = "./out/deno.txt";

// Benchmark Deno's native (synchronous) file-writing API
bench("deno:write", async () => {
  Deno.writeTextFileSync(file, "hello world");
});

await run();
➜ bench deno run -A ./scripts/deno.mjs
Download https://cdn.jsdelivr.net/npm/fs/+esm
cpu: Apple M1
runtime: deno 1.24.2 (aarch64-apple-darwin)
benchmark time (avg) (min … max) p75 p99 p995
-------------------------------------------------- -----------------------------
deno:write 110.66 µs/iter (74.25 µs … 5.88 ms) 129.79 µs 162.33 µs 179.75 µs
On all three occasions, a file was written to storage. Below is a table containing the runtime used, the native API used, and the average time per write.
Runtime | API | Average time per write
---|---|---
Bun | Bun.write() | 76.86 µs
Node | fs.writeFileSync() | 94.55 µs
Deno | Deno.writeTextFileSync() | 110.66 µs
As you can see, Bun is clearly ahead of Node and Deno in terms of server-side operations. I say server-side operations because Bun doesn’t fare as well on client-side workloads. In an upcoming post, I’ll be comparing Bun + Next.js against Deno + Fresh.
Also, a quick reminder that Bun is still in development. What you’ve seen in this post may be out of date in a few months, so just keep that in mind.
Anyway, I hope you found this article helpful 😄
Please consider sharing + following