WebAssembly started as a “JavaScript alternative for browsers”. The idea is to run high-performance applications compiled from languages like C/C++ or Rust safely in browsers. In the browser, WebAssembly runs side by side with JavaScript.
Figure 1. WebAssembly and JavaScript in the browser.
As WebAssembly is increasingly used in the cloud, it is now a universal runtime for cloud-native applications. Compared with Docker-like application containers, WebAssembly runtimes achieve higher performance with lower resource consumption. The common use cases for WebAssembly in the cloud include the following.
- Runtime for serverless function-as-a-service (FaaS)
- Embedding user-defined functions into SaaS apps or databases
- Runtime for sidecar applications in a service mesh
- Programmable plug-ins for web proxies
- Sandbox runtimes for edge devices including software-defined vehicles and smart factories
However, in those cloud-native use cases, developers often want to use JavaScript to write business applications. That means we must now support JavaScript in WebAssembly. Furthermore, we should support calling C/C++ or Rust functions from JavaScript in a WebAssembly runtime to take advantage of WebAssembly’s computational efficiency. The WasmEdge WebAssembly runtime allows you to do exactly that.
Figure 2. WebAssembly and JavaScript in the cloud.
WasmEdge
WasmEdge is a leading cloud-native WebAssembly runtime hosted by the CNCF (Cloud Native Computing Foundation) / Linux Foundation. It is the fastest WebAssembly runtime on the market today. WasmEdge supports all standard WebAssembly extensions as well as proprietary extensions for TensorFlow inference, key-value storage, and image processing. Its compiler toolchain supports not only WebAssembly languages such as C/C++, Rust, Swift, Kotlin, and AssemblyScript but also regular JavaScript.
A WasmEdge application can be embedded into a C program, a Go program, a Rust program, a JavaScript program, or the operating system’s CLI. The runtime can be managed by Docker tools (e.g., CRI-O), orchestration tools (e.g., K8s), serverless platforms (e.g., Vercel, Netlify, AWS Lambda, Tencent SCF), and data streaming frameworks (e.g., YoMo and Zenoh).
Now, you can run JavaScript programs in WasmEdge-powered serverless functions, microservices, and AIoT applications! It not only runs plain JavaScript programs but also allows developers to use Rust and C/C++ to create new JavaScript APIs within the safe sandbox of WebAssembly.
Building a JavaScript engine in WasmEdge
First, let’s build a WebAssembly-based JavaScript interpreter program for WasmEdge. It is based on QuickJS with WasmEdge extensions, such as network sockets and Tensorflow inference, incorporated into the interpreter as JavaScript APIs. You will need to install Rust to build the interpreter.
If you just want to use the interpreter to run JavaScript programs, you can skip this section. Make sure you have installed Rust and WasmEdge.
Fork or clone the wasmedge-quickjs GitHub repository to get started.
$ git clone https://github.com/second-state/wasmedge-quickjs
Following the instructions from that repo, you will be able to build a JavaScript interpreter for WasmEdge.
$ rustup target add wasm32-wasi
$ cargo build --target wasm32-wasi --release
The WebAssembly-based JavaScript interpreter program is located in the build target directory. You can now try a simple “hello world” JavaScript program (example_js/hello.js), which prints out the command line arguments to the console.
args = args.slice(1)
print("Hello", ...args)
Run the hello.js file in WasmEdge’s QuickJS runtime as follows. Note that the --dir .:. option on the command line gives wasmedge permission to read the local directory in the file system, where the hello.js file is located.
$ wasmedge --dir .:. target/wasm32-wasi/release/quickjs-rs-wasi.wasm example_js/hello.js WasmEdge Runtime
Hello WasmEdge Runtime
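In fact, the interpreter behaves like a regular JavaScript engine, so standard language features such as classes, arrow functions, template literals, and JSON work as you would expect. Below is a small illustrative sketch (the file name and its contents are hypothetical) that you could save under example_js/ and run with the same wasmedge command.

// classes.js (hypothetical): plain JavaScript features running in the QuickJS-based interpreter
class Greeter {
  constructor(name) {
    this.name = name
  }
  greet() {
    return `Hello, ${this.name}!`
  }
}

let greetings = ['WasmEdge', 'QuickJS'].map((n) => new Greeter(n).greet())
print(JSON.stringify(greetings))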
Next, let’s try a few more advanced JavaScript programs.
A JavaScript networking example
The interpreter supports the WasmEdge networking socket extension so that your JavaScript program can make HTTP connections to the Internet. Here is an example JavaScript program.
let r = GET("http://18.235.124.214/get?a=123",{"a":"b","c":[1,2,3]})
print(r.status)
let headers = r.headers
print(JSON.stringify(headers))
let body = r.body;
let body_str = new Uint8Array(body)
print(String.fromCharCode.apply(null,body_str))
To run the JavaScript in the WasmEdge runtime, you can do this on the CLI.
$ wasmedge --dir .:. target/wasm32-wasi/release/quickjs-rs-wasi.wasm example_js/http_demo.js
You should now see the HTTP GET result printed on the console.
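Since the response body comes back as a byte buffer, you can decode it into a string and parse it as JSON with plain JavaScript. Here is a short sketch that continues the example above; it assumes the test endpoint returns a JSON document.

// Fetch the same endpoint as above and parse the JSON body.
let r = GET("http://18.235.124.214/get?a=123",{"a":"b","c":[1,2,3]})

// The body is a byte buffer; decode it into a string before parsing.
let body_str = String.fromCharCode.apply(null, new Uint8Array(r.body))
let data = JSON.parse(body_str)

// data is now a regular JavaScript object.
print(JSON.stringify(data, null, 2))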
A JavaScript Tensorflow inference example
The interpreter supports the WasmEdge TensorFlow Lite inference extension so that your JavaScript program can run an ImageNet model for image classification. Here is an example JavaScript program.
import {TensorflowLiteSession} from 'tensorflow_lite'
import {Image} from 'image'

let img = new Image('./example_js/tensorflow_lite_demo/food.jpg')
let img_rgb = img.to_rgb().resize(192,192)
let rgb_pix = img_rgb.pixels()

let session = new TensorflowLiteSession('./example_js/tensorflow_lite_demo/lite-model_aiy_vision_classifier_food_V1_1.tflite')
session.add_input('input',rgb_pix)
session.run()
let output = session.get_output('MobilenetV1/Predictions/Softmax');
let output_view = new Uint8Array(output)
let max = 0;
let max_idx = 0;
for (var i in output_view) {
  let v = output_view[i]
  if (v > max) {
    max = v;
    max_idx = i;
  }
}
print(max,max_idx)
To run the JavaScript in the WasmEdge runtime, you can do the following on the CLI to re-build the QuickJS engine with Tensorflow support and then run the JavaScript program with the Tensorflow API.
$ cargo build --target wasm32-wasi --release --features=tensorflow
... ...
$ wasmedge-tensorflow-lite --dir .:. target/wasm32-wasi/release/quickjs-rs-wasi.wasm example_js/tensorflow_lite_demo/main.js
label:
Hot dog
confidence:
0.8941176470588236
Note:
- The --features=tensorflow compiler flag builds a version of the QuickJS engine with WasmEdge Tensorflow extensions.
- The wasmedge-tensorflow-lite program is part of the WasmEdge package. It is the WasmEdge runtime with the Tensorflow extension built in.
You should now see the name of the food item recognized by the TensorFlow Lite ImageNet model.
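The loop in the example prints a raw index and a raw uint8 score. As a hedged sketch of how those can be turned into the label and confidence shown above: the index maps to a class name in the label file that ships with the model, and the uint8 score divided by 255 gives the probability. This assumes the QuickJS std module is enabled in this build and that a label file exists at the path shown; adjust both to your setup.

// Continues the inference example above: max and max_idx come from the scan loop.
import * as std from 'std'  // QuickJS standard module; assumed to be available in this build

// Hypothetical label file path; use whatever label map ships with your copy of the model.
let label_text = std.loadFile('./example_js/tensorflow_lite_demo/aiy_food_V1_labelmap.csv')
let labels = label_text.split('\n')

print('label:', labels[Number(max_idx)])
// The output tensor is quantized to uint8, so 255 corresponds to a probability of 1.0.
print('confidence:', max / 255)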
Make it faster
The above Tensorflow inference example takes 1–2 seconds to run. That is acceptable in web application scenarios but could be improved. Recall that WasmEdge is the fastest WebAssembly runtime today due to its AOT (ahead-of-time compiler) optimization. WasmEdge provides a wasmedgec utility to compile the wasm file to a native so shared library. You can then use wasmedge to run the so file instead of the wasm file to get much faster performance. The following example uses the extended versions of wasmedgec and wasmedge to support the WasmEdge Tensorflow extension.
$ wasmedgec-tensorflow target/wasm32-wasi/release/quickjs-rs-wasi.wasm quickjs-rs-wasi.so
$ wasmedge-tensorflow-lite --dir .:. quickjs-rs-wasi.so example_js/tensorflow_lite_demo/main.js
label:
Hot dog
confidence:
0.8941176470588236
You can see that the image classification task can be completed within 0.1 seconds. That is at least a 10x improvement!
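If you want to measure the difference yourself, you can time the inference from inside the JavaScript program with the standard Date API. Here is a small sketch that wraps the session calls from the Tensorflow example above (session and rgb_pix are assumed to be set up exactly as in that snippet).

// Wraps the inference calls from the Tensorflow Lite example above with a simple timer.
let start = Date.now()
session.add_input('input', rgb_pix)
session.run()
let output = session.get_output('MobilenetV1/Predictions/Softmax')
print('inference took', Date.now() - start, 'ms')
// Run it once against the .wasm file and once against the AOT-compiled .so file to compare.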
The so shared library is not portable across machines and operating systems. You should run wasmedgec and wasmedgec-tensorflow on the machine where you deploy and run the application.
A note on QuickJS
Now, the choice of QuickJS as our JavaScript engine might raise the question of performance. Isn’t QuickJS a lot slower than V8 due to its lack of JIT support? Yes, but…
First of all, QuickJS is a lot smaller than V8. In fact, it only takes 1/40 (or 2.5%) of the runtime resources V8 consumes. You can run a lot more QuickJS functions than V8 functions on a single physical machine.
Second, for most business logic applications, raw performance is not critical. The application may have computationally intensive tasks, such as AI inference on the fly. WasmEdge allows QuickJS applications to drop down to high-performance WebAssembly for these tasks, while it is not so easy to add such extension modules to V8.
Third, it is known that many JavaScript security issues arise from JIT. Maybe turning off JIT in the cloud-native environment is not such a bad idea!
What’s next?
The examples demonstrate how to use the quickjs-rs-wasi.wasm JavaScript engine in WasmEdge. Besides using the CLI, you could use Docker / Kubernetes tools to start the WebAssembly application, or embed the application into your own applications or frameworks, as we discussed earlier in this article.
In the next two articles, I will focus on using JavaScript together with Rust to make the most out of both languages.
- How to incorporate simple JavaScript snippets into a high-performance Rust app in WasmEdge.
- How to make a high-performance native function available as a JavaScript API in WasmEdge.
JavaScript in cloud-native WebAssembly is still an emerging area in the next generation of cloud and edge computing infrastructure. We are just getting started! If you are interested, join us in the WasmEdge project (or tell us what you want by raising feature request issues).