Outages during high-traffic events such as ticket and merchandise releases are the stuff of nightmares for developers, and can mean huge revenue losses for online businesses. If you’ve bought event tickets online before, you’ve likely spent some time waiting in a virtual queue: queues are one of the most reliable ways to combat this problem. As the first line of defense, the queue needs to be able to handle huge and highly variable traffic, and that's where edge computing comes in…
An example of a queue page from Queue-it.com
Queues allow websites to spread out the load on their servers over a longer period of time by sending all traffic through a service that only lets a certain number of users access the site at a time.
A virtual queue needs to be globally available and bear the brunt of incoming traffic from legitimate visitors during peak traffic events as well as malicious traffic from attacks deliberately timed to coincide with busy periods. Some cloud providers offer a queue (or waiting room) feature, though usually as an opaque service with a few variables you can configure.
A while ago I wrote about Edge Side Includes, and said that at Fastly we prefer to build great primitives and then give you the tools you need to build the right solution for you. Modern edge computing platforms like Compute@Edge provide the primitives needed to implement a bespoke queue solution, so we’ve written a tutorial that helps you put the right pieces together for a fully-featured waiting room at the edge.
Because the queue is an open-source JavaScript project, you can build upon it to add functionality in the way that works best for you, such as notifying users by email or SMS when they reach the front of the queue, scheduling queueing for important events, estimating how long a user will have to wait, or even building separate queues per country using Geolocation (sketched below).
The queue page that ships with the starter kit
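For instance, a per-country queue only needs a different counter key per country. Here's a minimal sketch using the geolocation module from Fastly's JavaScript SDK; the key naming scheme is purely illustrative and not part of the starter kit:

```javascript
// A minimal sketch of per-country queues. The key naming scheme here is
// hypothetical; adapt it to however your queue stores its counters.
import { getGeolocationForIpAddress } from "fastly:geolocation";

function queueKeyForRequest(event) {
  // Look up the country of the connecting client and use it to namespace
  // the queue counters, so each country gets its own independent queue.
  const geo = getGeolocationForIpAddress(event.client.address);
  const country = geo?.country_code || "global";
  return `queue:${country.toLowerCase()}`;
}
```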
The waiting room principle we're using here is like a ticket system at a shop counter. The system needs to know two numbers: the number of the user who has just reached the front of the queue (think of this as the "now serving" digital sign), and the number of the user at the back of the queue (which is like the number of the next ticket to come out of the ticket machine). The people in the waiting room need to know what number they are, but the system doesn't need to remember that information, as long as we make sure people can't cheat.
To implement such a mechanism, you need a state store that can atomically increment a counter. One of the most popular choices for this is Redis, which is a powerful in-memory data store.
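Here's a rough sketch of that decision in JavaScript, assuming a `store` object with `incr(key)` and `get(key)` methods backed by Redis. The key names and cookie handling are illustrative rather than the starter kit's exact code:

```javascript
// Redis's INCR is atomic, so no two visitors can ever be handed the same
// ticket, even under heavy concurrent traffic.

async function getQueuePosition(store, cookies) {
  // Returning visitors keep the ticket number we already issued them
  // (carried in a cookie that should be signed so it can't be forged).
  if (cookies.ticket !== undefined) {
    return Number(cookies.ticket);
  }
  // New visitors take the next ticket from the machine: the counter at the
  // back of the queue.
  return await store.incr("queue:length");
}

async function isAllowedIn(store, ticket) {
  // The "now serving" sign: everyone whose ticket number is at or below the
  // cursor is allowed through to the origin.
  const nowServing = Number(await store.get("queue:cursor"));
  return ticket <= nowServing;
}
```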
As the Redis network protocol is not HTTP-based, to talk to Redis from a Compute@Edge service you need an HTTP interface on top of it. Upstash is a Redis-as-a-service provider that offers such an interface, so we’ve taken advantage of their generous free tier for this example.
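With that interface, every Redis command becomes a plain HTTPS request. As a rough illustration (the database URL, token, and backend name below are placeholders for your own values), incrementing the back-of-queue counter via Upstash's REST API from a Compute@Edge service might look like this:

```javascript
// A sketch of calling Upstash's Redis REST API from a Compute@Edge service.
// Replace the URL, token, and backend name with your own values.
async function incrementQueueLength() {
  const response = await fetch("https://YOUR-DB.upstash.io/incr/queue:length", {
    method: "POST",
    headers: { Authorization: "Bearer YOUR_UPSTASH_TOKEN" },
    // Compute@Edge requires outbound requests to name a configured backend.
    backend: "upstash",
  });
  // Upstash wraps Redis replies in JSON, e.g. { "result": 42 }.
  const { result } = await response.json();
  return result;
}
```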
Check out the demo to see the queue in action, or read the tutorial to learn how to build your own. If you want to get up and running even quicker, download the solution as a starter kit: a complete project that works immediately and gives you a functional foundation to build on as you explore what Fastly can do.