
Dmitry Kozhedubov


How to (performance) test a gRPC service

One of the common questions about gRPC services seems to be this: how do I quickly test my service? Is there a Postman for it or something? Turns out, there is. Today I'll talk about two tools that are indispensable for anyone who works with gRPC more or less regularly.

Refresher on gRPC

gRPC is a high-performance RPC framework that uses Protocol Buffers (protobuf) for serialization. Serialized protobuf messages are not human-readable, which introduces some difficulties with testing. Normally you'd generate a client stub with a plugin for the language of your choice and then implement the client logic.
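
Just to make that stub-generation step concrete, for the quick start's helloworld.proto the Go code generation looks roughly like this (a sketch assuming protoc plus the protoc-gen-go and protoc-gen-go-grpc plugins are installed; adjust the paths to your layout) ⬇️

protoc --go_out=. --go_opt=paths=source_relative \
  --go-grpc_out=. --go-grpc_opt=paths=source_relative \
  helloworld/helloworld.proto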

Can’t you just build a client?

Well, technically you can, but is this the most efficient way? You'd have to download dependencies, generate stubs, and implement the client: quite a few steps, when all you need to do in the case of a JSON API is fire up Postman or cURL and throw some JSON at it.

The latter was apparently the inspiration for the folks at FullStory who built a tool called, not surprisingly, grpcurl. This tool lets you do many of the things you'd normally use cURL for, but with gRPC, including building requests and viewing responses in your terminal as plain JSON.

Show, don't tell

Assuming you've got your Hello World server from the official Go quick start running, this is how you'd make a request to it ⬇️

grpcurl -plaintext -proto helloworld/helloworld.proto \
  -d '{ "name": "Dmitry" }' \
  localhost:50051 helloworld.Greeter/SayHello

In the first line we tell grpcurl to use a plain-text connection (as opposed to TLS) and specify a .proto file that will be used to figure out request formats and method signatures. Next comes the request payload as a JSON document that follows the structure of the proto message. Finally, we specify the host, the service, and the RPC to call.

As you've probably guessed already, you'll see JSON in return as well. Pretty effortless too, right?

{
  "message": "Hello Dmitry"
}
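
By the way, if you'd rather not point grpcurl at a .proto file, it can also discover services via gRPC server reflection, assuming your server has the reflection service registered (the quick start server may not, in which case keep using -proto) ⬇️

grpcurl -plaintext localhost:50051 list
grpcurl -plaintext localhost:50051 describe helloworld.Greeter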

A couple of things I'd personally want to be able to do with grpcurl in the future:

  • be able to use serialized protobuf messages instead of having to spell out the request as JSON
  • save requests and create collections, as you would in Postman (it looks like the FullStory team is already going in this direction with the grpcui project; see the example below)
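
To be fair, grpcui is already handy for interactive exploration. A sketch of the usual invocation (assuming, again, either server reflection or a -proto flag), which opens a web UI in your browser ⬇️

grpcui -plaintext localhost:50051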

Load testing

Being able to test a gRPC API with a cURL-like command is great, but what if you need to check how your service behaves under more serious load? Fortunately, there's a tool for that too. It's called ghz and it closely resembles other benchmarking tools you're likely familiar with: ApacheBench (ab) and its more modern alternative, hey.

This is how you'd throw a million requests at the above-mentioned Go Hello World server ⬇️

ghz -c 100 -n 1000000 --insecure \
  --proto helloworld/helloworld.proto \
  --call helloworld.Greeter.SayHello \
  -d '{"name":"Joe"}' \
  localhost:50051

The structure of the command is very similar to that of grpcurl, except that we specify the concurrency (100) and the total number of requests (1M), as we would with ApacheBench.

In response you'll get a nice summary with the classic average response time and requests per second, as well as percentiles and a histogram:

Summary:
  Count:    1000000
  Total:    20.77 s
  Slowest:  18.05 ms
  Fastest:  0.15 ms
  Average:  2.00 ms
  Requests/sec: 48143.58

Response time histogram:
  0.145 [1] |
  1.935 [561604]    |∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎
  3.725 [393260]    |∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎
  5.516 [37030] |∎∎∎
  7.306 [5976]  |
  9.096 [1302]  |
  10.886 [557]  |
  12.677 [189]  |
  14.467 [50]   |
  16.257 [26]   |
  18.047 [5]    |

Latency distribution:
  10 % in 1.13 ms
  25 % in 1.44 ms
  50 % in 1.83 ms
  75 % in 2.34 ms
  90 % in 3.04 ms
  95 % in 3.63 ms
  99 % in 5.28 ms

Status code distribution:
  [OK]   1000000 responses
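
If you'd rather load the service for a fixed amount of time instead of a fixed number of requests, ghz also has a duration flag. A sketch assuming a reasonably recent ghz version (check ghz --help for the exact flag name) ⬇️

ghz -c 100 -z 30s --insecure \
  --proto helloworld/helloworld.proto \
  --call helloworld.Greeter.SayHello \
  -d '{"name":"Joe"}' \
  localhost:50051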

Some cool ghz features include:

  • ability to use serialized binary messages
  • JSON/TOML config files as an alternative to command-line arguments (see the sketch below)
  • support for streaming RPCs
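
For example, the one-off command above could be captured in a config file and replayed with a single flag. A sketch assuming a config.json like the one below (the exact field names and the --config flag may differ slightly between ghz versions) ⬇️

{
  "proto": "helloworld/helloworld.proto",
  "call": "helloworld.Greeter.SayHello",
  "total": 1000000,
  "concurrency": 100,
  "data": { "name": "Joe" },
  "insecure": true,
  "host": "localhost:50051"
}

ghz --config ./config.json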

Conclusion

Getting started with gRPC can be frustrating, especially from a tooling standpoint when coming from a REST API background. Fortunately, the ecosystem is growing very quickly and it's certainly possible to build a very efficient and reliable workflow for developing gRPC services.

Top comments (1)

Ragavendra Nagraj

Nice. Check this one out too: PerfService. It can be integrated with Grafana or Prometheus as well to see live metrics. Tests can be for HTTP or gRPC and can be custom-written/updated in code.