Rate Limiting in Node.js Using Redis and Token Bucket Algorithm
APIs exposed to the public or to third-party clients are vulnerable to abuse, intentional or otherwise. Proper rate limiting protects your application from spam, denial-of-service traffic, and system overload. In this guide, we’ll build an efficient, scalable rate-limiting middleware in Node.js using Redis and the token bucket algorithm.
Why Redis and Token Bucket?
The token bucket algorithm is one of the most popular and flexible rate-limiting algorithms: each client gets a bucket of tokens, every request spends one token, and tokens are added back at a steady rate, so short bursts are allowed while the long-run rate stays capped. Redis tracks those buckets centrally, so the same limits are enforced across every instance of a distributed application.
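Here is a quick numeric illustration of the refill rule the middleware below relies on (the numbers are arbitrary and purely for illustration):
// Illustrative numbers only: the refill rule the middleware below uses.
const bucketSize = 10;     // burst capacity
const refillRate = 1;      // tokens regained per second
let tokens = 2;            // tokens left after a burst of requests
const elapsedSeconds = 5;  // time since the last request
tokens = Math.min(bucketSize, tokens + elapsedSeconds * refillRate);
console.log(tokens); // 7 -> up to 7 requests can be served immediately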
Step 1: Project Setup
npm init -y
npm install express redis
Step 2: Configure Redis Client
Use the official node-redis client (v4 or later, whose promise-based API provides the hGetAll and hSet calls used below):
// redisClient.js
const redis = require('redis');

const client = redis.createClient();

client.on('error', (err) => {
  console.error('Redis error:', err);
});

// connect() returns a promise; surface a failed initial connection.
client.connect().catch((err) => {
  console.error('Redis connection failed:', err);
});

module.exports = client;
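By default this connects to redis://localhost:6379. If your Redis instance lives elsewhere, createClient accepts a connection URL; the host and password below are placeholders:
// Placeholder connection settings: replace the URL with your own Redis instance.
const client = redis.createClient({
  url: 'redis://:yourpassword@redis.example.com:6379',
});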
Step 3: Token Bucket Middleware
// rateLimiter.js
const client = require('./redisClient');

const rateLimiter = (options = {}) => {
  const {
    bucketSize = 10, // maximum number of tokens (burst capacity)
    refillRate = 1,  // tokens added back per second
  } = options;

  return async (req, res, next) => {
    const ip = req.ip;
    const key = `rate-limit:${ip}`;
    const now = Date.now();

    // Read the stored bucket state for this IP (empty object if none exists yet).
    const data = await client.hGetAll(key);

    let tokens = bucketSize;
    let lastRefill = now;

    if (data.tokens && data.lastRefill) {
      tokens = parseFloat(data.tokens);
      lastRefill = parseInt(data.lastRefill, 10);
      const elapsed = (now - lastRefill) / 1000;
      tokens = Math.min(bucketSize, tokens + elapsed * refillRate);
    }

    if (tokens < 1) {
      return res.status(429).json({ message: 'Too Many Requests' });
    }

    tokens -= 1;

    await client.hSet(key, {
      tokens: tokens.toString(),
      lastRefill: now.toString(),
    });

    next();
  };
};

module.exports = rateLimiter;
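Two production details are worth noting. First, the hGetAll/hSet pair is not atomic, so two simultaneous requests from the same IP can read the same token count; for strict limits you would typically move the read-modify-write into a Lua script or a MULTI transaction. Second, idle buckets are never cleaned up. A minimal sketch of the second fix, assuming you keep the hash layout above, is to set a TTL right after writing the hash:
// Sketch only: expire idle buckets so Redis does not accumulate a key per IP.
// The TTL just needs to comfortably outlive a full refill of the bucket.
const ttlSeconds = Math.ceil(bucketSize / refillRate) * 2;
await client.hSet(key, {
  tokens: tokens.toString(),
  lastRefill: now.toString(),
});
await client.expire(key, ttlSeconds);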
Step 4: Use the Middleware
// server.js
const express = require('express');
const rateLimiter = require('./rateLimiter');

const app = express();

app.use(rateLimiter({
  bucketSize: 20,
  refillRate: 0.5, // one token every 2 seconds
}));

app.get('/', (req, res) => {
  res.send('Welcome to a rate-limited API!');
});

app.listen(3000, () => {
  console.log('Server running on http://localhost:3000');
});
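Because the limiter is ordinary Express middleware, you can also mount it per route instead of globally and give stricter limits to expensive endpoints. The /search path and the numbers below are just an example:
// Example only: a tighter bucket for a more expensive endpoint.
app.get('/search', rateLimiter({ bucketSize: 5, refillRate: 0.2 }), (req, res) => {
  res.send('Search results');
});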
How It Works
Each IP address is associated with a Redis hash that stores token count and last refill time. When a request comes in:
- The middleware calculates how many tokens to refill based on the time elapsed since the last request.
- If at least one token is available, it allows the request and decrements the token count.
- If not, it rejects the request with a 429 Too Many Requests response.
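An optional refinement to the rejection branch (a sketch, not part of the middleware above) is to tell clients how long to wait by sending a Retry-After header along with the 429:
// Sketch: estimate seconds until one token is available and expose it to clients.
if (tokens < 1) {
  const retryAfterSeconds = Math.ceil((1 - tokens) / refillRate);
  res.set('Retry-After', String(retryAfterSeconds));
  return res.status(429).json({ message: 'Too Many Requests' });
}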
Advanced Features You Could Add
- Whitelist or bypass trusted IPs or routes
- Rate limit by API key or user ID (see the sketch after this list)
- Real-time admin dashboard for monitoring limits
- Different limits for different endpoints or user tiers
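Rate limiting by API key rather than by IP, for example, is mostly a matter of changing how the bucket key is derived. A minimal sketch, assuming clients send a hypothetical x-api-key header (falling back to the IP when it is absent):
// Sketch: key buckets by API key when present, otherwise by IP.
const identifier = req.get('x-api-key') || req.ip;
const key = `rate-limit:${identifier}`;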
Conclusion
Rate limiting is essential for securing and scaling modern APIs. Using Redis and a token bucket algorithm gives you high performance and flexibility across distributed systems. You can tune the bucket size and refill rate to match the behavior you want to enforce, and add custom logic for VIP users or special routes.
If you found this helpful, consider supporting me: buymeacoffee.com/hexshift