Merged

45 commits
7e26fd6
feat: add token bucket rate limiter implementation
andy-stark-redis Mar 9, 2026
04382fc
feat: add interactive demo server for token bucket rate limiter
andy-stark-redis Mar 9, 2026
c49efa2
docs: add rate limiter use case documentation
andy-stark-redis Mar 9, 2026
979ac21
DOC-6349 small fixes
andy-stark-redis Mar 9, 2026
c62228d
Update content/develop/use-cases/rate-limiter/_index.md
andy-stark-redis Mar 9, 2026
617a80f
DOC-6349 link fixes
andy-stark-redis Mar 9, 2026
20c7544
feat: add token bucket rate limiter implementation with demo server
andy-stark-redis Mar 31, 2026
d706c00
docs: add Node.js token bucket rate limiter guide
andy-stark-redis Mar 31, 2026
ec16d99
feat: add interactive HTML UI and main entry point to demo server
andy-stark-redis Mar 31, 2026
26ba5a4
docs: expand Node.js rate limiter guide with algorithm explanation and L
andy-stark-redis Mar 31, 2026
f1ec659
docs: fix Node.js demo server filename in rate limiter guide
andy-stark-redis Mar 31, 2026
0d884f0
docs: add Go token bucket rate limiter implementation
andy-stark-redis Mar 31, 2026
7ab6495
docs: add Go token bucket rate limiter implementation
andy-stark-redis Mar 31, 2026
d88cbca
docs: add Go token bucket rate limiter demo server
andy-stark-redis Mar 31, 2026
d709ce7
docs: update Go rate limiter examples with time.Duration and configurati
andy-stark-redis Mar 31, 2026
10c0883
chore: add Go module files for rate limiter
andy-stark-redis Mar 31, 2026
0a9b68e
docs: add Java token bucket rate limiter implementation with Jedis
andy-stark-redis Mar 31, 2026
5528ee8
docs: add Java demo server for token bucket rate limiter with interactiv
andy-stark-redis Mar 31, 2026
daaa574
docs: add Java token bucket rate limiter implementation with Jedis
andy-stark-redis Mar 31, 2026
6534656
docs: update TokenBucket constructor parameter order in DemoServer
andy-stark-redis Mar 31, 2026
9aed87c
docs: update Java Jedis rate limiter implementation to match API changes
andy-stark-redis Mar 31, 2026
a6dda57
docs: add .NET token bucket rate limiter implementation with Redis
andy-stark-redis Mar 31, 2026
f388f29
docs: add .NET token bucket rate limiter implementation with Redis
andy-stark-redis Mar 31, 2026
c9a2269
docs: add .NET token bucket rate limiter implementation with Redis
andy-stark-redis Mar 31, 2026
a2c9cd5
refactor: extract TokenBucket rate limiter to separate class file
andy-stark-redis Mar 31, 2026
0047cc6
docs: update .NET rate limiter to use synchronous API instead of async
andy-stark-redis Mar 31, 2026
6a43274
chore: add .gitignore for .NET build artifacts
andy-stark-redis Mar 31, 2026
671493c
feat: add TokenBucket rate limiter implementation with Lettuce
andy-stark-redis Mar 31, 2026
c095f2b
feat: add interactive demo server for token bucket rate limiter
andy-stark-redis Mar 31, 2026
4407c5d
docs: add Java Lettuce token bucket rate limiter guide
andy-stark-redis Mar 31, 2026
d28bd9a
docs: fix TokenBucket initialization to use sync() connection and update
andy-stark-redis Mar 31, 2026
9eaba51
docs: add PHP Predis token bucket rate limiter guide
andy-stark-redis Mar 31, 2026
8f186d0
docs: add PHP demo server for token bucket rate limiter
andy-stark-redis Mar 31, 2026
32d1b7f
docs: add PHP token bucket rate limiter guide
andy-stark-redis Mar 31, 2026
38233d6
docs: fix PHP rate limiter examples and update demo server instructions
andy-stark-redis Mar 31, 2026
a7781db
docs: add Ruby token bucket rate limiter guide
andy-stark-redis Apr 1, 2026
dea581a
docs: add Ruby token bucket rate limiter demo server
andy-stark-redis Apr 1, 2026
f18fb8e
docs: add Ruby token bucket rate limiter guide
andy-stark-redis Apr 1, 2026
8131428
docs: update Ruby rate limiter examples to use Hash return value
andy-stark-redis Apr 1, 2026
df01dd6
docs: add Rust token bucket rate limiter guide
andy-stark-redis Apr 1, 2026
75098f6
Create the Rust token bucket module
andy-stark-redis Apr 1, 2026
7c4bb56
build: add Rust rate limiter demo server and build artifacts
andy-stark-redis Apr 1, 2026
b106e9f
docs(rate-limiter): ignore Rust demo build artifacts
andy-stark-redis Apr 1, 2026
2919970
DOC-6349 Lettuce async/reactive examples
andy-stark-redis Apr 2, 2026
54e7b15
DOC-6349 hide content (for now)
andy-stark-redis Apr 2, 2026
3 changes: 3 additions & 0 deletions .gitignore
@@ -16,3 +16,6 @@ package-lock.json
.vscode/
.DS_Store
.idea
# Rust docs demos
/content/develop/use-cases/rate-limiter/rust/target/
/content/develop/use-cases/rate-limiter/rust/Cargo.lock
21 changes: 21 additions & 0 deletions content/develop/use-cases/_index.md
@@ -0,0 +1,21 @@
---
categories:
- docs
- develop
- stack
- oss
- rs
- rc
description: Learn how to implement common use cases with Redis
linkTitle: Use cases
title: Redis use cases
weight: 50
draft: true
---

This section provides practical examples and reference implementations for common Redis use cases.

## Available use cases

* [Rate limiting]({{< relref "/develop/use-cases/rate-limiter" >}}) - Implement token bucket rate limiting with Redis

215 changes: 215 additions & 0 deletions content/develop/use-cases/rate-limiter/_index.md
@@ -0,0 +1,215 @@
---
categories:
- docs
- develop
- stack
- oss
- rs
- rc
description: Implement a token bucket rate limiter using Redis and Lua scripts
linkTitle: Rate limiter
title: Token bucket rate limiter with Redis
weight: 1
---

This guide shows you how to implement a distributed token bucket rate limiter using Redis and Lua scripts for atomic operations.

## Overview

Rate limiting is a critical technique for controlling how often clients can perform an operation. Common use cases include:

* Limiting API requests per user or IP address
* Preventing abuse and protecting against denial-of-service attacks
* Ensuring fair resource allocation across multiple clients
* Throttling background jobs or batch operations

The **token bucket algorithm** is a popular rate limiting approach that allows bursts of traffic while maintaining an average rate limit over time.

## How it works

The token bucket algorithm works like a bucket that holds tokens:

1. **Initialization**: The bucket starts with a maximum capacity of tokens
2. **Refill**: Tokens are added to the bucket at a constant rate (for example, 1 token per second)
3. **Consumption**: Each request consumes one token from the bucket
4. **Decision**: If tokens are available, the request is allowed; otherwise, it's denied
5. **Capacity limit**: The bucket never exceeds its maximum capacity

This approach allows for burst traffic (using accumulated tokens) while enforcing an average rate limit over time.
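The steps above can be sketched as a small single-process Python class (hypothetical names; the Redis implementation below moves this same state into a shared hash so multiple servers can use it):

```python
import math

class LocalTokenBucket:
    """Single-process token bucket; illustrates the algorithm only."""

    def __init__(self, capacity, refill_rate, refill_interval):
        self.capacity = capacity
        self.refill_rate = refill_rate
        self.refill_interval = refill_interval
        self.tokens = capacity      # Step 1: start with a full bucket
        self.last_refill = 0.0

    def allow(self, now):
        # Step 2: add refill_rate tokens for each whole interval elapsed
        refills = math.floor((now - self.last_refill) / self.refill_interval)
        if refills > 0:
            # Step 5: never exceed the bucket's capacity
            self.tokens = min(self.capacity, self.tokens + refills * self.refill_rate)
            self.last_refill += refills * self.refill_interval
        # Steps 3-4: consume a token if one is available
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = LocalTokenBucket(capacity=3, refill_rate=1, refill_interval=1.0)
print([bucket.allow(now=0) for _ in range(4)])  # [True, True, True, False]
print(bucket.allow(now=1))                      # True: one token was refilled
```

A burst of three requests drains the bucket immediately, the fourth is denied, and one second later a single refilled token lets the next request through.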

### Why use Redis?

Redis is ideal for distributed rate limiting because:

* **Atomic operations**: Lua scripts execute atomically, preventing race conditions
* **Shared state**: Multiple application servers can share the same rate limit counters
* **High performance**: In-memory operations provide microsecond latency
* **Automatic expiration**: Keys can be set to expire automatically (though not used in this implementation)

## The Lua script

The core of this implementation is a Lua script that runs atomically on the Redis server. This ensures that checking and updating the token bucket happens in a single operation, preventing race conditions in distributed environments.

Here's how the script works:

```lua
local key = KEYS[1]
local capacity = tonumber(ARGV[1])
local refill_rate = tonumber(ARGV[2])
local refill_interval = tonumber(ARGV[3])
local now = tonumber(ARGV[4])

-- Get current state or initialize
local bucket = redis.call('HMGET', key, 'tokens', 'last_refill')
local tokens = tonumber(bucket[1])
local last_refill = tonumber(bucket[2])

-- Initialize if this is the first request
if tokens == nil then
    tokens = capacity
    last_refill = now
end

-- Calculate token refill
local time_passed = now - last_refill
local refills = math.floor(time_passed / refill_interval)

if refills > 0 then
    tokens = math.min(capacity, tokens + (refills * refill_rate))
    last_refill = last_refill + (refills * refill_interval)
end

-- Try to consume a token
local allowed = 0
if tokens >= 1 then
    tokens = tokens - 1
    allowed = 1
end

-- Update state
redis.call('HMSET', key, 'tokens', tokens, 'last_refill', last_refill)

-- Return result: allowed (1 or 0) and remaining tokens
return {allowed, tokens}
```

### Script breakdown

1. **State retrieval**: Uses [`HMGET`]({{< relref "/commands/hmget" >}}) to fetch the current token count and last refill time from a hash
2. **Initialization**: On first use, sets tokens to full capacity
3. **Token refill calculation**: Computes how many tokens should be added based on elapsed time
4. **Capacity enforcement**: Uses `math.min()` to ensure tokens never exceed capacity
5. **Token consumption**: Decrements the token count if available
6. **State update**: Uses [`HMSET`]({{< relref "/commands/hmset" >}}) to save the new state
7. **Return value**: Returns both the decision (allowed/denied) and remaining tokens
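For reference, here is a hedged sketch of how a client might assemble the script's `KEYS` and `ARGV` (the variable names are illustrative; the exact call depends on your client library):

```python
import time

# Hypothetical argument mapping for the Lua script above.
key = "ratelimit:user:123"  # KEYS[1]: one key per rate-limited entity

args = [
    10,                # ARGV[1]: capacity (maximum burst size)
    1,                 # ARGV[2]: refill_rate (tokens added per interval)
    1,                 # ARGV[3]: refill_interval (seconds)
    int(time.time()),  # ARGV[4]: current time, supplied by the caller
]

# With redis-py this would be invoked against a live server as:
#   allowed, remaining = r.eval(script, 1, key, *args)
print(key, args[:3])
```

Note that the current time is passed in by the client rather than read inside the script, which keeps the script deterministic.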

### Why atomicity matters

Without atomic execution, race conditions could occur:

* **Double spending**: Two requests could read the same token count and both succeed when only one should
* **Lost updates**: Concurrent updates could overwrite each other's changes
* **Inconsistent state**: Token count and refill time could become desynchronized

Using [`EVAL`]({{< relref "/commands/eval" >}}) or [`EVALSHA`]({{< relref "/commands/evalsha" >}}) ensures the entire operation executes atomically, making it safe for distributed systems.
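The double-spend case can be made concrete without Redis: if two requests each read the token count before either writes it back, both see a token that only one of them should get. A deliberately interleaved single-process sketch:

```python
# Simulate two requests whose read-modify-write cycles interleave.
state = {"tokens": 1}  # only one token left in the bucket

# Both requests read the count before either writes back:
a_seen = state["tokens"]
b_seen = state["tokens"]

a_allowed = a_seen >= 1  # request A sees 1 token -> allowed
b_allowed = b_seen >= 1  # request B also sees 1 token -> allowed

state["tokens"] = a_seen - 1  # A writes back 0
state["tokens"] = b_seen - 1  # B overwrites with 0 (lost update)

print(a_allowed, b_allowed, state["tokens"])  # True True 0: double spend
```

Running the whole check-and-update inside one `EVAL` closes this window, because Redis executes the script without interleaving other commands.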

## Using the Python module

The `TokenBucket` class provides a simple interface for rate limiting
([source](token_bucket.py)):

```python
import redis
from token_bucket import TokenBucket

# Create a Redis connection
r = redis.Redis(host='localhost', port=6379, decode_responses=True)

# Create a rate limiter: 1 request per second, with bursts up to 10
limiter = TokenBucket(
    redis_client=r,
    capacity=10,          # Maximum burst size
    refill_rate=1,        # Add 1 token per interval
    refill_interval=1.0   # Every 1 second
)

# Check if a request should be allowed
allowed, remaining = limiter.allow('user:123')

if allowed:
    print(f"Request allowed. {remaining} tokens remaining.")
    # Process the request
else:
    print("Request denied. Rate limit exceeded.")
    # Return 429 Too Many Requests
```

### Configuration parameters

* **capacity**: Maximum number of tokens in the bucket (controls burst size)
* **refill_rate**: Number of tokens added per refill interval
* **refill_interval**: Time in seconds between refills

For example:
* `capacity=10, refill_rate=1, refill_interval=1.0` allows 1 request per second with bursts up to 10
* `capacity=100, refill_rate=10, refill_interval=1.0` allows 10 requests per second with bursts up to 100
* `capacity=60, refill_rate=1, refill_interval=60.0` allows 1 request per minute with bursts up to 60
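In each case the sustained rate works out to `refill_rate / refill_interval` tokens per second, while `capacity` only bounds the burst size:

```python
def sustained_rate(refill_rate, refill_interval):
    # Average requests per second allowed over time;
    # capacity bounds bursts but does not change this rate.
    return refill_rate / refill_interval

print(sustained_rate(1, 1.0))   # 1.0 request/second
print(sustained_rate(10, 1.0))  # 10.0 requests/second
print(sustained_rate(1, 60.0))  # 1 request/minute (~0.0167/second)
```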

### Rate limit keys

The `key` parameter identifies what you're rate limiting. Common patterns:

* **Per user**: `user:{user_id}` - Limit each user independently
* **Per IP address**: `ip:{ip_address}` - Limit by client IP
* **Per API endpoint**: `api:{endpoint}:{user_id}` - Different limits per endpoint
* **Global**: `global:api` - Single limit shared across all requests
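A small helper (hypothetical, not part of the module) keeps key construction consistent across these patterns:

```python
def rate_limit_key(*parts):
    # Builds a namespaced key, e.g. ("user", 123) -> "ratelimit:user:123"
    return "ratelimit:" + ":".join(str(p) for p in parts)

print(rate_limit_key("user", 123))            # ratelimit:user:123
print(rate_limit_key("api", "search", 123))   # ratelimit:api:search:123
```

Prefixing every key (here with `ratelimit:`) keeps rate-limiter state easy to identify and clean up alongside other data in the same Redis instance.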

## Running the demo

A demonstration web server is included to show the rate limiter in action
([source](demo_server.py)):

```bash
# Install dependencies
pip install redis

# Run the demo server
python demo_server.py
```

The demo provides an interactive web interface where you can:

* Submit requests and see them allowed or denied in real time
* View the current token count
* Adjust rate limit parameters dynamically
* Test different rate limiting scenarios

The demo assumes Redis is running on `localhost:6379`; you can change the host
and port in the `demo_server.py` script. Visit `http://localhost:8080` in your browser to try it out.

## Response headers

It's common to include rate limit information in HTTP response headers:

```python
allowed, remaining = limiter.allow(f'user:{user_id}')

# Add standard rate limit headers
response.headers['X-RateLimit-Limit'] = str(limiter.capacity)
response.headers['X-RateLimit-Remaining'] = str(int(remaining))
response.headers['X-RateLimit-Reset'] = str(int(time.time() + limiter.refill_interval))

if not allowed:
    response.status_code = 429  # Too Many Requests
    response.headers['Retry-After'] = str(int(limiter.refill_interval))
```

## Learn more

* [EVAL command]({{< relref "/commands/eval" >}}) - Execute Lua scripts
* [EVALSHA command]({{< relref "/commands/evalsha" >}}) - Execute cached Lua scripts
* [Lua scripting]({{< relref "/develop/programmability/eval-intro" >}}) - Introduction to Redis Lua scripting
* [HMGET command]({{< relref "/commands/hmget" >}}) - Get multiple hash fields
* [HMSET command]({{< relref "/commands/hmset" >}}) - Set multiple hash fields
* [Transactions]({{< relref "/develop/using-commands/transactions" >}}) - Alternative to Lua scripts for atomicity
