In today's world, microservices architecture has become a popular approach for building scalable and reliable applications. With microservices, an application is broken down into smaller, independent services that can be developed and deployed separately. This approach necessitates a communication protocol that allows these services to talk to each other. Traditionally, REST APIs have served this purpose, but there's a new kid on the block with a claim to the throne: gRPC, which works like normal RPC calls, allowing you to call methods implemented on other machines directly.
In this blog post, we'll look into the benefits of gRPC and how to implement server-side code using Node and TypeScript. We'll be using Postman as a client for calling and testing our gRPC method implementations. Here's a link to download it.
A Brief Introduction to gRPC
gRPC is a high-performance, open-source, universal RPC (Remote Procedure Call) framework developed by Google. It is built on top of the newer HTTP/2 protocol and Protocol Buffers, making it efficient, lightweight, and easy to use. This translates to faster, more compact communication between services compared to traditional REST APIs.
gRPC uses Protocol Buffers (which we'll also call protobufs in this blog post) - a language-agnostic data serialization format - for data serialization and deserialization. The data format is binary rather than text (as in traditional REST APIs that use JSON), so payloads occupy a smaller memory footprint, making gRPC suitable even for low-memory devices like embedded systems.
Additionally, the gRPC framework ships with the protocol buffer compiler, which generates your data classes for you (from defined protofiles, which we'll cover in this blog post), reducing the amount of boilerplate code that needs to be written.
As with all other technologies, gRPC does have its fair share of disadvantages. For starters, the learning curve, when compared to traditional JSON APIs, is rather steep and might take some time for newer developers to become fully acquainted with. gRPC also uses the more modern HTTP/2 protocol, which has limited browser support, adding set-up overhead and the need for proxy servers to interface between HTTP/2 and HTTP/1.1. Your choice to use gRPC therefore boils down to your specific use case. A good place to start is communication between internal microservices, where you won't need to build and support client libraries.
The Implementation in Node and TypeScript
We'll do a simple server set-up that allows users to log in and perform CRUD operations on their contacts.
One key difference between gRPC and REST APIs is that gRPC defines its own standard for status codes, which differ from the conventional HTTP response codes most of us are familiar with. Here's a comprehensive list of all these status codes. As such, in each of our protofile definitions, the response message types will include a field called `status` that we'll use to store the standard HTTP status codes.
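For reference, these gRPC status codes ship with `@grpc/grpc-js` (the Node package we'll be using later) as a `status` enum. A quick sketch:

```typescript
// gRPC's own status codes, as exposed by @grpc/grpc-js.
import {status} from "@grpc/grpc-js";

console.log(status.OK);              // 0
console.log(status.NOT_FOUND);       // 5
console.log(status.UNAUTHENTICATED); // 16
```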
For those who can't wait to get their hands dirty and want to dive directly into the source code, here's the codebase link: gRPC Server Example. For those of us who are mere mortals and prefer a more surface-level understanding, keep on reading.
Prerequisites
To generate data classes from your proto definitions, you'll need to have the Protocol Buffer Compiler installed. Here's a handy link on how you can go about it. If you get a 'protoc-gen-ts program not found or is not executable' error, you'll need to downgrade protobuf. Here's another handy link on how to do this.
You also need to have node installed. Here's the download link.
Project Set-Up
Our folder structure will be as follows:
```
root
└── src (houses our project source files)
    ├── protos (where we'll define our protofiles and where protoc will generate the relevant ts files)
    ├── services (where we'll define the server-side implementation of the services declared in our protofiles)
    ├── types (where we define necessary interface types for different data structures)
    ├── utils (where we'll define utility functions that are re-usable globally across the project)
    ├── values (where we define global constants, e.g. the server host)
    └── index.ts (the entry point of our application; handles init functions like server startup and binding to the defined port)
```
Feel free to create a Node.js project at a location of your choice (with TypeScript support) that mimics this folder structure.
We'll then create a file named `Makefile` (with no extension) at the root of the project you just created, then copy-paste the following:
```makefile
PROTO_DIR=src/protos

proto_types:
	grpc_tools_node_protoc \
		--js_out=import_style=commonjs,binary:${PROTO_DIR} \
		--grpc_out=grpc_js:${PROTO_DIR} \
		--plugin=protoc-gen-grpc=`which grpc_tools_node_protoc_plugin` \
		-I ${PROTO_DIR} \
		${PROTO_DIR}/*.proto

proto_web:
	protoc \
		--plugin=protoc-gen-ts=./node_modules/.bin/protoc-gen-ts \
		--ts_out=grpc_js:${PROTO_DIR} \
		-I ${PROTO_DIR} \
		${PROTO_DIR}/*.proto

proto: proto_types proto_web
```
This file houses our collection of shell commands related to the protoc compiler, which we can conveniently execute by running `make proto` in the terminal whenever we want to generate service code from our `.proto` files. (Note that make requires each recipe line to be indented with a tab, not spaces.)
Protofile Definitions
The first step is to write the protocol buffer (.proto) files, which declare the services and messages used for communication between services. In this example, we have two .proto files: one for authentication and another for contacts.
We'll use a simple implementation of the JWT flow for user authentication. If you aren't familiar with JWTs, I'd suggest you take a read before exploring the codebase, as you may get lost in authentication technicalities, which is not what this blog post is trying to explore.
Our authentication service (at `src/protos/auth.proto`) is defined as follows:
```protobuf
syntax = "proto3";

package auth;

service AuthService {
  rpc Login (LoginRequest) returns (LoginResponse);
  rpc UserMe (UserRequest) returns (UserResponse);
  rpc RefreshAccessToken (RefreshAccessTokenRequest) returns (RefreshAccessTokenResponse);
}

message UserRequest {}

message UserResponse {
  int32 status = 1;
  string error = 2;
  string username = 4;
  string id = 5;
  string password = 6;
}

message LoginRequest {
  string username = 1;
  string password = 2;
}

message LoginResponse {
  int32 status = 1;
  string error = 2;
  string jwtToken = 3;
  string refreshToken = 4;
}

message RefreshAccessTokenRequest {
  string refreshToken = 1;
}

message RefreshAccessTokenResponse {
  int32 status = 1;
  string error = 2;
  string accessToken = 3;
}
```
Looking at the code snippet above, defining a protofile is fairly straightforward. You start off by defining a `message` that represents a data structure. Within the `message` definitions, you declare your various fields and their respective data types (for those coming from a dynamically typed language: integers, floating-point numbers, strings, arrays, and dictionaries are all examples of data types).
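If it helps to see the mapping, here's a rough TypeScript analogue of the `LoginRequest` message above (an illustration only - the real generated classes use getters and setters, as we'll see later):

```typescript
// Rough TypeScript analogue of the LoginRequest message.
// Illustration only - not what protoc actually generates.
interface LoginRequest {
  username: string;
  password: string;
}
```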
In this case, we've defined an `AuthService` service that has three methods:

- `Login` - takes in a `username` and `password` as inputs and responds with an access token and refresh token.
- `UserMe` - takes no inputs (only the access token as metadata) and responds with information related to the logged-in user.
- `RefreshAccessToken` - takes a refresh token as input (generated during login) and responds with a new access token.
Our contacts service protofile (at `src/protos/contacts.proto`) follows a similar pattern. Feel free to explore it in the example repository.
Lastly, we run `make proto` from the root of the project to generate our TypeScript data classes with the necessary boilerplate code that we'll interact with when doing our server implementation.
Server Implementation
Here, we'll define the logic behind each of the service methods we declared earlier in our protofiles. We'll start off with our `AuthService`. Kindly note that I'll be using 'method' and 'function' interchangeably here; both refer to TypeScript functions.
Also, feel free to download the `package.json` file from the repository and run `npm install` to load up the packages that we'll be using in this implementation.
We'll walk through different sections of our service implementation (at `src/services/authService.ts`) as I attempt to explain the structure of this file and how we've done the implementation.
As is the norm in virtually every programming language, we start off by importing the necessary modules that will help us wire up our service.
```typescript
import {sendUnaryData, ServerUnaryCall} from "@grpc/grpc-js";
import {generateAccessToken, generateRefreshToken, getLoggedInUser, refreshToken} from "../utils/auth";
import {User} from "../types/user";
import {
    LoginRequest,
    LoginResponse,
    RefreshAccessTokenRequest,
    RefreshAccessTokenResponse,
    UserRequest,
    UserResponse
} from "../protos/auth_pb";
```
Here, we import the gRPC primitives we need from the `@grpc/grpc-js` package. In our case, we'll only handle unary calls (gRPC also supports streaming calls, which we'll cover in a future post).
We then import the TypeScript definitions generated from the protofile we declared above (our `auth.proto`) by the `make proto` command.
We then define mock user data that we'll use during login since we won't be handling any user registration logic.
```typescript
// Mock user data for authentication
const USERS = new Map<string, User>();
USERS.set("admin", {
    id: 1, username: "admin", password: "admin",
});
USERS.set("staff", {
    id: 2, username: "staff", password: "staff",
});
```
All method implementations follow a similar pattern. They all have two parameters: a `call` parameter of type `ServerUnaryCall` and a `callback` parameter of type `sendUnaryData`. Both types take generic arguments.
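Spelled out as a type, every unary handler in this post has this general shape (a sketch; the `UnaryHandler` alias name is mine, not part of the library):

```typescript
import {sendUnaryData, ServerUnaryCall} from "@grpc/grpc-js";

// TReq/TRes stand in for the request/response classes generated from
// our protofiles. The alias name is illustrative, not from @grpc/grpc-js.
type UnaryHandler<TReq, TRes> = (
    call: ServerUnaryCall<TReq, TRes>,
    callback: sendUnaryData<TRes>
) => void;
```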
Our `ServerUnaryCall` takes two generic arguments: the first for the request definition and the second for the response definition. I'll only explore the `userMe` method here to illustrate how we implement the various methods defined in our protofiles. Again, feel free to explore the whole file in the example repository.
```typescript
export const userMe = (call: ServerUnaryCall<UserRequest, UserResponse>, callback: sendUnaryData<UserResponse>) => {
    // Build the response object from the generated UserResponse class
    const response = new UserResponse();
    const user = getLoggedInUser();
    response.setId(`${user.id}`);
    response.setStatus(200);
    response.setError("");
    response.setUsername(user.username);
    response.setPassword(user.password);
    // No error occurred, so pass null as the first argument
    callback(null, response);
}
```
When doing method implementations, our method names should correspond to the method names we defined in our protofiles. This is not a hard constraint, but it's generally better since it's clear at first glance which method here corresponds to which method in the defined protofiles. In the case above, `userMe` corresponds to `UserMe` defined in the `auth.proto` protofile.
The `call` parameter in our method takes a `ServerUnaryCall` with `UserRequest` and `UserResponse` as generic arguments. The `UserRequest` and `UserResponse` types are imported from `src/protos/auth_pb.ts`, which is among the files generated by the protoc compiler when we ran `make proto`. These class definitions directly correspond to your protofile message definitions, all the way down to the data types. The protoc compiler also generates helper methods like setters and getters that let you manipulate values conveniently.
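As a quick illustration of those generated helpers, using the classes from our `auth.proto` (a sketch, not code from the repository):

```typescript
import {LoginRequest} from "../protos/auth_pb";

// The generated classes expose a setter/getter pair for every field...
const req = new LoginRequest();
req.setUsername("admin");
req.setPassword("admin");
console.log(req.getUsername()); // "admin"

// ...plus serialization helpers from google-protobuf.
const bytes = req.serializeBinary();               // Uint8Array
const copy = LoginRequest.deserializeBinary(bytes);
console.log(copy.getPassword());                   // "admin"
```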
The `callback` parameter takes a `sendUnaryData` type with only one generic argument, which represents the type of response we'll be returning - in this case `UserResponse`. The `callback` function also takes an initial parameter (which we have passed as `null`) representing an error - if any - of type `ServerErrorResponse`. You can use this parameter to return gRPC-specific errors, e.g. when a user is not authenticated.
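For instance, a handler could reject an unauthenticated call like this (a sketch; the helper name is mine, but the status codes come straight from `@grpc/grpc-js`):

```typescript
import {sendUnaryData, status} from "@grpc/grpc-js";
import {UserResponse} from "../protos/auth_pb";

// Sketch: pass an error (carrying a gRPC status code) as the first
// callback argument to fail the RPC instead of returning a response.
const rejectUnauthenticated = (callback: sendUnaryData<UserResponse>) => {
    const error = Object.assign(new Error("User is not authenticated."), {
        code: status.UNAUTHENTICATED,
    });
    callback(error, null);
};
```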
Inside the method definition, we declare a `response` variable and instantiate it with the `UserResponse` class. We then handle any necessary logic and set the status values accordingly. (Recall that we defined the `status` field in our protofiles to store standard HTTP status codes, e.g. 200 OK, since gRPC defines its own standard set of status codes.) If we want any values attached to the incoming request, we can get a reference to them via `call.request`.
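For example, in the `login` handler the incoming credentials would be read like this (a sketch of how the repository's handler might look; see the repo for the full implementation):

```typescript
import {sendUnaryData, ServerUnaryCall} from "@grpc/grpc-js";
import {LoginRequest, LoginResponse} from "../protos/auth_pb";

// Sketch: reading fields off the incoming request inside login().
export const login = (
    call: ServerUnaryCall<LoginRequest, LoginResponse>,
    callback: sendUnaryData<LoginResponse>
) => {
    const username = call.request.getUsername();
    const password = call.request.getPassword();
    // ...validate against USERS, build a LoginResponse, then:
    // callback(null, response);
};
```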
We lastly invoke the callback function with our response object and errors (if any).
That's pretty much all it takes to define and implement server logic in gRPC.
Server Set-Up
Lastly, we need to start our server and attach our service implementations to the server instance. This is done in the `index.ts` file in our `src` folder. Here, we'll also cover a concept called interceptors, which allows us to intercept requests and responses and run logic before they are passed to our services or before responses are returned to the client. In our case, we'll use an interceptor to check whether a client is authenticated with our backend, and throw an error if it isn't but is trying to invoke a method that requires authentication. We'll use the interceptor implementation from the `grpcjs-interceptors` package.
```typescript
// Import paths below assume the folder structure shown earlier;
// check the example repository for the exact generated file names.
import {Server} from "@grpc/grpc-js";
import {AuthServiceService} from "./protos/auth_grpc_pb";
import {ContactServiceService} from "./protos/contacts_grpc_pb";

const interceptors = require('grpcjs-interceptors');

// Wrap the gRPC server so we can attach global interceptors to it.
const server = interceptors.serverProxy(new Server());

server.addService(ContactServiceService, {
    addContact,
    getContacts,
    updateContact,
    deleteContact
});
server.addService(AuthServiceService, {
    login,
    userMe,
    refreshAccessToken
});
```
In this code snippet, we instantiate our server through the `grpcjs-interceptors` package so that we can attach global interceptors to it. We then add the services we just implemented to our server instance - in this case, the auth and contacts services.
```typescript
// Assumes isAuthenticated is exported from src/utils/auth (see the repo).
const openEndpoints = [
    '/auth.AuthService/Login',
    '/auth.AuthService/RefreshAccessToken'
];

const checkAuthorizationToken = async function (ctx, next, callback) {
    if (!openEndpoints.includes(ctx.service.path)) {
        // Check whether the user is authorized to access this route.
        const metadata = ctx.call.metadata;
        const authToken = metadata.get("authorization").toString();
        const userIsAuthenticated = await isAuthenticated(authToken);
        if (!userIsAuthenticated) {
            callback(new Error("Unauthorized."));
            return;
        }
    }
    await next();
    // Do something before the response goes back to the client.
};
```
We then define a variable `openEndpoints`, which stores the methods that do not require authentication. The strings defined here follow a specific syntax: a leading forward slash (`/`), then the package name defined in your protofile (here `auth`), a dot, the service name as defined in your protofile (here `AuthService`), another forward slash (`/`), and lastly the method name, again as defined in your protofile (here `Login`). This structure is defined by the gRPC standard, and you can see it by logging a request's service path, in this case: `console.log(ctx.service.path)`.
Afterwards, we define our interceptor function, which ensures that the closed endpoints are only accessible to authenticated users. If a user is not authenticated, we cancel the request and respond with an `Error`, which reaches the client with a gRPC-specific status code.
If you'd like to perform some logic after the request has been processed, handle it after the `await next()` call instead.
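For example, here's a minimal sketch of a second interceptor that runs logic on both sides of `next()` - request timing in this case (the function name is mine, not from the package):

```typescript
// Sketch: an interceptor that logs how long each call took.
// Code before next() runs before the handler; code after it runs
// once the handler has finished, before the response returns.
const logRequestDuration = async function (ctx, next) {
    const start = Date.now();
    await next(); // hand off to the service implementation
    console.log(`${ctx.service.path} took ${Date.now() - start}ms`);
};

server.use(logRequestDuration);
```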
```typescript
// Assumes globals.ts exports the bind address as `host` (see the repo).
import {ServerCredentials} from "@grpc/grpc-js";
import {host} from "./values/globals";

server.use(checkAuthorizationToken);

server.bindAsync(host, ServerCredentials.createInsecure(), (err, port) => {
    if (err) {
        console.log(err);
        return;
    }
    server.start();
    console.log(`listening on ${host}`);
});
```
Lastly, we tell our server to use the `checkAuthorizationToken` interceptor we defined. You can attach more than one interceptor to your server by calling `server.use(interceptor)` for each one. We then bind our server to the host and port defined in `src/values/globals.ts`.
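For completeness, that globals file could be as small as this (a sketch - the actual values live in the repository):

```typescript
// Sketch of src/values/globals.ts; the repo's values may differ.
// gRPC bind addresses take the form "host:port".
export const host = "0.0.0.0:50051";
```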
We can now test our server implementation using Postman. Here's the documentation on how to do that.
Conclusion
gRPC is a powerful and efficient communication framework that is well-suited to modern microservices architecture. It offers several benefits, including high performance, language independence, interoperability, code generation, and streaming. With first-class support for Node.js (and TypeScript via generator plugins like the ones we used), it is easy to adopt in modern applications. Although interceptors may seem complex at first glance, they are a powerful tool for intercepting and modifying requests and responses, and can be used to implement complex authentication and authorization logic.
Feel free to share your thoughts!