Riky Fahri Hasibuan

Posted on • Originally published at codenoun.com

API Rate Limiting in Node.js

APIs form the backbone of modern web communication, and it's crucial to manage how often clients access them. Implementing rate limiting ensures your server remains responsive and secure by controlling the flow of requests to your API.

This guide focuses on the key strategies for implementing API rate limiting in Node.js, a widely used platform for building scalable web services.

What is API Rate Limiting?

API rate limiting restricts the number of requests a user or client can make to an API within a given timeframe. It's a safeguard against overuse and abuse, designed to ensure fair access to resources and maintain server health.
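To make the idea concrete, here is a minimal fixed-window counter in plain Node.js. The class and method names are our own, purely for illustration; production code would normally reach for a library, as covered below.

```javascript
// Minimal fixed-window rate limiter: allow at most `max` requests
// per `windowMs` milliseconds, tracked separately per key (e.g. per IP).
class FixedWindowLimiter {
  constructor(max, windowMs) {
    this.max = max;
    this.windowMs = windowMs;
    this.counts = new Map(); // key -> { count, windowStart }
  }

  allow(key, now = Date.now()) {
    let entry = this.counts.get(key);
    // Start a fresh window if none exists or the old one has expired.
    if (!entry || now - entry.windowStart >= this.windowMs) {
      entry = { count: 0, windowStart: now };
      this.counts.set(key, entry);
    }
    entry.count += 1;
    return entry.count <= this.max;
  }
}
```

Each key gets its own counter, and the counter resets when its window elapses; everything beyond the limit inside a window is rejected.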

Why is API Rate Limiting Important?

  • DDoS Protection: Limits the impact of Distributed Denial of Service (DDoS) attacks by reducing the number of requests from a single source.
  • Improved Server Performance: Prevents server overload by distributing resources fairly among users.
  • Better User Experience: Ensures all users get timely responses by preventing misuse of the API.

Best Practices for API Rate Limiting in Node.js

1. Implement Middleware

Using middleware to manage rate limiting is both efficient and effective. The express-rate-limit package is a popular choice for this in Node.js, especially when working with the Express framework. Install it by running npm i express-rate-limit.

const express = require('express');
const rateLimit = require('express-rate-limit');

const app = express();

const limiter = rateLimit({
  windowMs: 15 * 60 * 1000, // 15 minutes
  max: 100, // Limit each IP to 100 requests per windowMs
  message: 'Too many requests from this IP, please try again after 15 minutes',
});

app.use('/api/', limiter);

In this example:

  • windowMs sets a 15-minute window.
  • max limits each IP to 100 requests in that window.
  • message provides feedback when limits are exceeded.

Using middleware like this ensures requests are filtered early in the process, saving server resources.

2. Use Redis for Distributed Systems

For APIs running on multiple servers, rate limiting needs to be consistent across the entire system. Redis is often the go-to solution for shared storage in these cases. Pair express-rate-limit with rate-limit-redis for a smooth implementation.

You'll need to install the following packages:

  • express: The web framework to create the API.
  • redis: Communicate with Redis to track and store request counts.
  • express-rate-limit: Middleware to handle rate limiting.
  • rate-limit-redis: Plugin to store rate limit data in Redis.

const rateLimit = require('express-rate-limit');
const RedisStore = require('rate-limit-redis');
const redis = require('redis');

const client = redis.createClient();
// With node-redis v4+, the client must be connected before use:
// await client.connect();

const limiter = rateLimit({
  // Older rate-limit-redis versions accept a client directly; newer
  // versions expect a sendCommand function instead — check your version.
  store: new RedisStore({
    client: client,
  }),
  windowMs: 15 * 60 * 1000,
  max: 100,
});

This setup ensures that request limits are enforced no matter which server handles the request, thanks to Redis acting as a central store. For the full explanation, you can check out our article about how to implement API rate limiting with Redis and Node.js.
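Under the hood, a Redis-backed store typically runs an INCR on a per-client key and sets a TTL when the key is first created. The sketch below simulates that pattern with an in-memory map standing in for Redis; the function and key names are ours, purely for illustration.

```javascript
// Simulates the per-request counter logic of a Redis-backed store:
// create the key with count 1 and a TTL on first hit (INCR + EXPIRE),
// then increment on subsequent hits until the TTL lapses.
function makeCounter() {
  const store = new Map(); // key -> { count, expiresAt }
  return function incr(key, ttlMs, now = Date.now()) {
    const entry = store.get(key);
    if (!entry || now >= entry.expiresAt) {
      store.set(key, { count: 1, expiresAt: now + ttlMs });
      return 1;
    }
    entry.count += 1; // incrementing an existing key keeps its TTL
    return entry.count;
  };
}
```

Because Redis performs this increment atomically and every app server talks to the same Redis instance, the count stays correct even when requests from one client land on different servers.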

3. Add Limits for Different User Types

Different users have different needs. A common approach is to allow more requests for premium users while limiting requests for those on free plans.

const rateLimit = require('express-rate-limit');

const freeLimiter = rateLimit({
  windowMs: 15 * 60 * 1000,
  max: 50, // Free-tier users get 50 requests per window
});

const premiumLimiter = rateLimit({
  windowMs: 15 * 60 * 1000,
  max: 1000, // Premium users get 1000 requests per window
});

app.use('/api/free/', freeLimiter);
app.use('/api/premium/', premiumLimiter);

This method helps balance user experience based on the service level.
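If both tiers share a single endpoint, you can instead derive the limit from the authenticated user's plan. Here is a small sketch; the plan names, default, and user object shape are assumptions for illustration. express-rate-limit also accepts a function for max, so a helper like this can feed it.

```javascript
// Map a user's plan to a per-window request limit.
// Plan names and limits here are illustrative assumptions.
const PLAN_LIMITS = { free: 50, premium: 1000 };

function limitForUser(user) {
  // Unknown or missing plans fall back to the free tier.
  return PLAN_LIMITS[user?.plan] ?? PLAN_LIMITS.free;
}
```

Defaulting unknown plans to the strictest tier is the safe choice: a misconfigured account is throttled rather than unthrottled.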

4. Dynamic Rate Limiting

Static rate limits may not always reflect user needs. Some users may require higher limits at specific times, which can be handled by dynamically adjusting limits based on usage patterns.

const windowMs = 15 * 60 * 1000;
const counts = new Map(); // per-IP request counts for the current window

setInterval(() => counts.clear(), windowMs); // reset counts each window

app.use((req, res, next) => {
  const count = (counts.get(req.ip) || 0) + 1;
  counts.set(req.ip, count);

  if (count <= 100) { // this limit could be computed per request instead
    next();
  } else {
    res.status(429).send('Rate limit exceeded, please try again later.');
  }
});

This flexibility allows your API to respond intelligently to varying usage scenarios.
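One way to make the limit itself dynamic is to derive it from recent traffic. The thresholds below are invented for illustration, and the recentRequestsPerMinute figure would come from your own metrics.

```javascript
// Hypothetical policy: tighten the per-window limit as overall load grows.
function dynamicLimit(recentRequestsPerMinute) {
  if (recentRequestsPerMinute > 5000) return 50;  // heavy load: strict limit
  if (recentRequestsPerMinute > 1000) return 100; // moderate load
  return 200;                                     // quiet server: generous limit
}
```

A function like this can replace the hard-coded limit in the middleware above, so the API automatically becomes stricter under pressure and more generous when idle.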

5. Communicate with Retry Headers

Users appreciate knowing when they can try again. By adding a Retry-After header to rate-limited responses, you can guide users on how long to wait before making another request.

res.set('Retry-After', 60); // 60 seconds
res.status(429).send('Too many requests, please try again later.');

This small step improves the overall user experience and reduces frustration for clients interacting with your API.
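Rather than hard-coding 60 seconds, the header can reflect the time actually left in the current window. A small sketch, assuming your limiter tracks when each window started:

```javascript
// Seconds remaining in the current window, suitable for a Retry-After header.
function retryAfterSeconds(windowStart, windowMs, now = Date.now()) {
  const remainingMs = windowStart + windowMs - now;
  return Math.max(0, Math.ceil(remainingMs / 1000)); // never negative
}
```

Rounding up means clients are never told to retry a moment too early, and clamping at zero handles the case where the window has already expired.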

Monitoring and Fine-Tuning

Rate limiting should be continuously monitored and adjusted based on real-world usage patterns. Tracking key metrics such as the number of rate limit violations, API response times, and user feedback will help you make informed adjustments.

Key Metrics to Track

  • Rate Limit Violations: High numbers may indicate that the limits are too strict or that users require more flexibility.
  • Server Performance: Keeping an eye on response times can reveal if rate limiting has the desired effect.
  • User Feedback: Feedback from API users can provide insights into whether rate limits are too restrictive or if changes are needed.

Monitoring tools such as Prometheus and Grafana can provide real-time insights into how your rate limiting is performing and where adjustments may be needed.
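As a starting point before wiring up a full metrics stack, violations can be counted in process. A minimal sketch; the counter and function names are our own.

```javascript
// Count 429 responses so the figure can later be exported to a
// monitoring system such as Prometheus.
let rateLimitViolations = 0;

function recordResponse(statusCode) {
  if (statusCode === 429) rateLimitViolations += 1;
  return rateLimitViolations;
}
```

Hooking a recorder like this into a response-logging middleware gives you the violation trend without any external dependencies.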

Final Thoughts

API rate limiting is necessary for managing traffic, protecting resources, and ensuring fair usage. By following these practices in Node.js, you can build a resilient system that balances security with user experience.

Whether you're implementing basic limits or building dynamic systems that adjust in real time, effective rate limiting is an essential part of API management.

For more insights and tutorials, visit CodeNoun and learn how to build scalable Node.js applications efficiently.

Top comments (1)

Kishore Selvakumar

Great!

Something that catches my mind is that you used Redis to store the rate count. Since Redis stores its data in memory, there's a chance of the data being erased when the server restarts.

Keeping the count and other relevant information in a persistent storage layer might prevent such errors from happening.