APIs are the backbone of modern applications, but without protection they can be overwhelmed by excessive requests from clients. To ensure **fair usage, reliability, and performance**, we use **throttling** in Web APIs.
🔹 What is Throttling?
Throttling is a mechanism that **limits the number of API requests a client can make within a given time frame**. It prevents abuse, protects server resources, and ensures all clients get a fair share of the system’s capacity.
For example:
* A client is allowed only **100 requests per minute**.
* If they exceed the limit, the API returns **HTTP 429 (Too Many Requests)**.
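On the client side, the typical pattern is to detect the 429 status and back off before retrying. A minimal sketch using `HttpClient` (the URL is just a placeholder):
```csharp
using var client = new HttpClient();
var response = await client.GetAsync("https://api.example.com/data");

if (response.StatusCode == System.Net.HttpStatusCode.TooManyRequests)
{
    // Honor the Retry-After header if the server sends one; otherwise fall back to a default delay.
    var delay = response.Headers.RetryAfter?.Delta ?? TimeSpan.FromSeconds(30);
    await Task.Delay(delay);
    response = await client.GetAsync("https://api.example.com/data");
}
```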
🔹 Why Do We Need Throttling?
* ✅ **Prevents server overload** – Protects from heavy traffic or denial-of-service (DoS) attacks.
* ✅ **Fair usage policy** – Ensures no single user hogs all the resources.
* ✅ **Cost efficiency** – Reduces unnecessary server and bandwidth usage.
* ✅ **Improved reliability** – Keeps the API stable and consistent.
🔹 Throttling Strategies
There are multiple approaches to implementing throttling:
1. **Fixed Window**
* Restricts requests in fixed time slots.
* Example: 100 requests allowed between 12:00–12:01.
2. **Sliding Window**
* Counts requests over a rolling time frame instead of fixed slots, which is more accurate.
* Example: with a one-minute window, a request made at 12:00:30 keeps counting against the limit until 12:01:30.
3. **Token Bucket**
* A bucket holds tokens, each request consumes one. Tokens refill at a fixed rate.
* Allows short bursts of traffic until the bucket is empty (a minimal sketch follows this list).
4. **Leaky Bucket**
* Similar to Token Bucket but processes requests at a fixed outflow rate.
* Ensures smooth traffic flow without sudden spikes.
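To make the Token Bucket idea concrete, here is a minimal, single-threaded sketch (the class and member names are illustrative, not from any library):
```csharp
public class TokenBucket
{
    private readonly int _capacity;            // maximum burst size
    private readonly double _refillPerSecond;  // tokens added per second
    private double _tokens;
    private DateTime _lastRefill = DateTime.UtcNow;

    public TokenBucket(int capacity, double refillPerSecond)
    {
        _capacity = capacity;
        _refillPerSecond = refillPerSecond;
        _tokens = capacity; // start full so an initial burst is allowed
    }

    public bool TryConsume()
    {
        // Refill based on elapsed time, capped at the bucket capacity.
        var now = DateTime.UtcNow;
        _tokens = Math.Min(_capacity, _tokens + (now - _lastRefill).TotalSeconds * _refillPerSecond);
        _lastRefill = now;

        if (_tokens < 1) return false; // bucket empty: throttle this request
        _tokens -= 1;
        return true;
    }
}
```
Each incoming request calls `TryConsume()`; a `false` result maps to an HTTP 429 response.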
🔹 Implementing Throttling in .NET Web API
✅ Option 1: Custom Middleware
You can create your own middleware to limit requests per client:
```csharp
public class ThrottlingMiddleware
{
    // Per-client window start time and request count.
    // In-memory only: fine for a single server; use a distributed store (e.g. Redis) in a web farm.
    private static readonly Dictionary<string, (DateTime Timestamp, int Count)> _requests = new();
    private static readonly object _lock = new(); // Dictionary is not thread-safe

    private readonly RequestDelegate _next;
    private const int Limit = 5; // max 5 requests
    private static readonly TimeSpan TimeWindow = TimeSpan.FromMinutes(1);

    public ThrottlingMiddleware(RequestDelegate next) => _next = next;

    public async Task Invoke(HttpContext context)
    {
        var clientIp = context.Connection.RemoteIpAddress?.ToString() ?? "unknown";
        var limitExceeded = false;

        lock (_lock)
        {
            if (_requests.TryGetValue(clientIp, out var entry) &&
                (DateTime.UtcNow - entry.Timestamp) < TimeWindow)
            {
                if (entry.Count >= Limit)
                {
                    limitExceeded = true; // over the limit inside the current window
                }
                else
                {
                    _requests[clientIp] = (entry.Timestamp, entry.Count + 1);
                }
            }
            else
            {
                // First request from this client, or the previous window has expired.
                _requests[clientIp] = (DateTime.UtcNow, 1);
            }
        }

        if (limitExceeded)
        {
            context.Response.StatusCode = StatusCodes.Status429TooManyRequests;
            await context.Response.WriteAsync("Too many requests. Try again later.");
            return;
        }

        await _next(context);
    }
}
```
Register it in **Program.cs** so it runs for every request:
```csharp
app.UseMiddleware<ThrottlingMiddleware>();
```
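For context, a minimal **Program.cs** wiring it up might look like this (the `/data` endpoint is only a placeholder):
```csharp
var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// Throttle before any endpoint executes.
app.UseMiddleware<ThrottlingMiddleware>();

app.MapGet("/data", () => "Hello");

app.Run();
```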
✅ Option 2: Built-in Rate Limiting in .NET 7+
ASP.NET Core 7 introduced built-in **Rate Limiting Middleware**:
```csharp
using Microsoft.AspNetCore.RateLimiting;
using System.Threading.RateLimiting;

builder.Services.AddRateLimiter(options =>
{
    options.AddFixedWindowLimiter("Fixed", opt =>
    {
        opt.Window = TimeSpan.FromSeconds(10);   // window length
        opt.PermitLimit = 5;                     // 5 requests per 10 seconds
        opt.QueueLimit = 2;                      // up to 2 extra requests may queue
        opt.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
    });
});

var app = builder.Build();
app.UseRateLimiter();
```
Apply to a specific endpoint:
```csharp
app.MapGet("/data", () => "Hello")
   .RequireRateLimiting("Fixed");
```
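For controller-based APIs, the same named policy can be attached with the `[EnableRateLimiting]` attribute (the controller below is just an example):
```csharp
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.RateLimiting;

[ApiController]
[Route("api/[controller]")]
[EnableRateLimiting("Fixed")] // applies the "Fixed" policy to every action in this controller
public class DataController : ControllerBase
{
    [HttpGet]
    public IActionResult Get() => Ok("Hello");
}
```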
🔹 Best Practices for API Throttling
* Always return **HTTP 429 Too Many Requests** when limits are hit.
* Provide a **Retry-After header** so clients know when to retry (see the example after this list).
* Implement **per-user or per-IP throttling** for fairness.
* Use **distributed caching (Redis, SQL, etc.)** when running multiple servers.
* Log throttling events to monitor abuse patterns.
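With the built-in limiter, the 429 status and the Retry-After header can be set centrally in the rejection callback. A minimal sketch, added inside the same `AddRateLimiter` call shown earlier (the response text is arbitrary):
```csharp
using System.Threading.RateLimiting;

builder.Services.AddRateLimiter(options =>
{
    // ... limiter registration as shown earlier ...

    options.OnRejected = async (context, cancellationToken) =>
    {
        // The middleware rejects with 503 by default; 429 is the conventional throttling status.
        context.HttpContext.Response.StatusCode = StatusCodes.Status429TooManyRequests;

        // Tell the client how long to wait when the limiter exposes that information.
        if (context.Lease.TryGetMetadata(MetadataName.RetryAfter, out var retryAfter))
        {
            context.HttpContext.Response.Headers.RetryAfter =
                ((int)retryAfter.TotalSeconds).ToString();
        }

        await context.HttpContext.Response.WriteAsync(
            "Too many requests. Try again later.", cancellationToken);
    };
});
```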
🔹 Final Thoughts
Throttling is essential for any production-ready API. It helps maintain **performance, security, and fair usage**. Whether you use a **custom middleware** or the **built-in .NET rate limiter**, implementing throttling ensures your API remains **reliable and scalable**.