Rate limiting middleware in ASP.NET Core
By Arvin Kahbazi, Maarten Balliauw, and Rick Anderson
The Microsoft.AspNetCore.RateLimiting middleware provides rate limiting for ASP.NET Core apps. Apps configure rate limiting policies and then attach the policies to endpoints. Apps using rate limiting should be carefully load tested and reviewed before deployment. See Testing endpoints with rate limiting in this article for more information.
For an introduction to rate limiting, see Rate limiting middleware.
Why use rate limiting
Rate limiting can be used for managing the flow of incoming requests to an app. Key reasons to implement rate limiting:
- Preventing Abuse: Rate limiting helps protect an app from abuse by limiting the number of requests a user or client can make in a given time period. This is particularly important for public APIs.
- Ensuring Fair Usage: By setting limits, all users have fair access to resources, preventing users from monopolizing the system.
- Protecting Resources: Rate limiting helps prevent server overload by controlling the number of requests that can be processed, thus protecting the backend resources from being overwhelmed.
- Enhancing Security: It can mitigate the risk of Denial of Service (DoS) attacks by limiting the rate at which requests are processed, making it harder for attackers to flood a system.
- Improving Performance: By controlling the rate of incoming requests, optimal performance and responsiveness of an app can be maintained, ensuring a better user experience.
- Cost Management: For services that incur costs based on usage, rate limiting can help manage and predict expenses by controlling the volume of requests processed.
Implementing rate limiting in an ASP.NET Core app can help maintain stability, security, and performance, ensuring a reliable and efficient service for all users.
Preventing DDoS Attacks
While rate limiting can help mitigate the risk of Denial of Service (DoS) attacks by limiting the rate at which requests are processed, it's not a comprehensive solution for Distributed Denial of Service (DDoS) attacks. DDoS attacks involve multiple systems overwhelming an app with a flood of requests, making it difficult to handle with rate limiting alone.
For robust DDoS protection, consider using a commercial DDoS protection service. These services offer advanced features such as:
- Traffic Analysis: Continuous monitoring and analysis of incoming traffic to detect and mitigate DDoS attacks in real-time.
- Scalability: The ability to handle large-scale attacks by distributing traffic across multiple servers and data centers.
- Automated Mitigation: Automated response mechanisms to quickly block malicious traffic without manual intervention.
- Global Network: A global network of servers to absorb and mitigate attacks closer to the source.
- Constant Updates: Commercial services continuously track and update their protection mechanisms to adapt to new and evolving threats.
When using a cloud hosting service, DDoS protection is usually available as part of the hosting solution, such as Azure Web Application Firewall, AWS Shield, or Google Cloud Armor. Dedicated protections are available as Web Application Firewalls (WAF) or as part of a CDN solution such as Cloudflare or Akamai Kona Site Defender.
Implementing a commercial DDoS protection service in conjunction with rate limiting can provide a comprehensive defense strategy, ensuring the stability, security, and performance of an app.
Use Rate Limiting Middleware
The following steps show how to use the rate limiting middleware in an ASP.NET Core app:
- Install the Microsoft.AspNetCore.RateLimiting package. Add the package to the project via the NuGet Package Manager or the following command:
dotnet add package Microsoft.AspNetCore.RateLimiting
- Configure rate limiting services. In the Program.cs file, configure the rate limiting services by adding the appropriate rate limiting policies. Policies can be defined as either global or named policies. The following example permits 10 requests per minute, partitioned by user identity (falling back to the request's Host header for unauthenticated requests):
builder.Services.AddRateLimiter(options =>
{
    options.GlobalLimiter = PartitionedRateLimiter.Create<HttpContext, string>(httpContext =>
        RateLimitPartition.GetFixedWindowLimiter(
            partitionKey: httpContext.User.Identity?.Name ?? httpContext.Request.Headers.Host.ToString(),
            factory: partition => new FixedWindowRateLimiterOptions
            {
                AutoReplenishment = true,
                PermitLimit = 10,
                QueueLimit = 0,
                Window = TimeSpan.FromMinutes(1)
            }));
});
Named policies must be explicitly applied to the pages or endpoints they protect. The following example adds a fixed window limiter policy named "fixed", which is applied to an endpoint later in this article:
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddRateLimiter(options =>
{
    options.AddFixedWindowLimiter("fixed", opt =>
    {
        opt.PermitLimit = 4;
        opt.Window = TimeSpan.FromSeconds(12);
        opt.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
        opt.QueueLimit = 2;
    });
});
var app = builder.Build();
The global limiter applies to all endpoints automatically when it's configured via options.GlobalLimiter.
Enable rate limiting middleware
In the Program.cs file, enable the rate limiting middleware by calling UseRateLimiter:
app.UseRouting();
app.UseRateLimiter();
app.UseEndpoints(endpoints =>
{
    endpoints.MapControllers();
});
app.Run();
Apply rate limiting policies to endpoints or pages
Apply rate limiting to WebAPI Endpoints
Apply a named policy to the endpoint or group, for example:
app.MapGet("/api/resource", () => "This endpoint is rate limited")
.RequireRateLimiting("fixed"); // Apply specific policy to an endpoint
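A named policy can also be applied to a route group so that every endpoint in the group shares the limit. The following is a minimal sketch, assuming the "fixed" policy from earlier and a hypothetical /api prefix:
var api = app.MapGroup("/api")
    .RequireRateLimiting("fixed"); // Apply the policy to every endpoint in the group

api.MapGet("/resource", () => "This endpoint is rate limited");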
Apply rate limiting to MVC Controllers
Apply the configured rate limiting policies to specific endpoints or globally. For example, to apply the "fixed" policy to all controller endpoints:
app.UseEndpoints(endpoints =>
{
    endpoints.MapControllers().RequireRateLimiting("fixed");
});
Apply rate limiting to server-side Blazor apps
To set rate limiting for all of the app's routable Razor components, specify RequireRateLimiting with the rate limiting policy name on the MapRazorComponents call in the Program file. In the following example, the rate limiting policy named "policy" is applied:
app.MapRazorComponents<App>()
.AddInteractiveServerRenderMode()
.RequireRateLimiting("policy");
To set a policy for a single routable Razor component, or for a folder of components via an _Imports.razor file, apply the [EnableRateLimiting] attribute with the policy name. In the following example, the rate limiting policy named "override" is applied. The policy replaces any policies currently applied to the endpoint. The global limiter still runs on the endpoint with this attribute applied.
@page "/counter"
@using Microsoft.AspNetCore.RateLimiting
@attribute [EnableRateLimiting("override")]
<h1>Counter</h1>
The [EnableRateLimiting] attribute is only applied to a routable component or a folder of components via an _Imports.razor file if RequireRateLimiting isn't called on MapRazorComponents.
The [DisableRateLimiting] attribute is used to disable rate limiting for a routable component or a folder of components via an _Imports.razor file.
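For example, the following sketch disables rate limiting for a single routable component (the "/reports" route and component are hypothetical; the same @attribute line in an _Imports.razor file applies it to a folder of components):
@page "/reports"
@using Microsoft.AspNetCore.RateLimiting
@attribute [DisableRateLimiting]

<h1>Reports</h1>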
Rate limiter algorithms
The RateLimiterOptionsExtensions class provides the following extension methods for rate limiting:
- AddFixedWindowLimiter
- AddSlidingWindowLimiter
- AddTokenBucketLimiter
- AddConcurrencyLimiter
The fixed window, sliding window, and token bucket limiters all limit the maximum number of requests in a time period. The concurrency limiter limits only the number of concurrent requests and doesn't cap the number of requests in a time period. The cost of an endpoint should be considered when selecting a limiter. The cost of an endpoint includes the resources used, for example, time, data access, CPU, and I/O.
Fixed window limiter
The AddFixedWindowLimiter method uses a fixed time window to limit requests. When the time window expires, a new time window starts and the request limit is reset.
Consider the following code:
using Microsoft.AspNetCore.RateLimiting;
using System.Threading.RateLimiting;
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddRateLimiter(_ => _
    .AddFixedWindowLimiter(policyName: "fixed", options =>
    {
        options.PermitLimit = 4;
        options.Window = TimeSpan.FromSeconds(12);
        options.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
        options.QueueLimit = 2;
    }));
var app = builder.Build();
app.UseRateLimiter();
static string GetTicks() => (DateTime.Now.Ticks & 0x11111).ToString("00000");
app.MapGet("/", () => Results.Ok($"Hello {GetTicks()}"))
.RequireRateLimiting("fixed");
app.Run();
The preceding code:
- Calls AddRateLimiter to add a rate limiting service to the service collection.
- Calls AddFixedWindowLimiter to create a fixed window limiter with a policy name of "fixed" and sets:
  - PermitLimit to 4 and the time Window to 12 seconds. A maximum of 4 requests per 12-second window are allowed.
  - QueueProcessingOrder to OldestFirst.
  - QueueLimit to 2 (set this to 0 to disable the queueing mechanism).
- Calls UseRateLimiter to enable rate limiting.
Apps should use Configuration to set limiter options. The following code updates the preceding code using MyRateLimitOptions for configuration:
using System.Threading.RateLimiting;
using Microsoft.AspNetCore.RateLimiting;
using WebRateLimitAuth.Models;
var builder = WebApplication.CreateBuilder(args);
builder.Services.Configure<MyRateLimitOptions>(
builder.Configuration.GetSection(MyRateLimitOptions.MyRateLimit));
var myOptions = new MyRateLimitOptions();
builder.Configuration.GetSection(MyRateLimitOptions.MyRateLimit).Bind(myOptions);
var fixedPolicy = "fixed";
builder.Services.AddRateLimiter(_ => _
    .AddFixedWindowLimiter(policyName: fixedPolicy, options =>
    {
        options.PermitLimit = myOptions.PermitLimit;
        options.Window = TimeSpan.FromSeconds(myOptions.Window);
        options.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
        options.QueueLimit = myOptions.QueueLimit;
    }));
var app = builder.Build();
app.UseRateLimiter();
static string GetTicks() => (DateTime.Now.Ticks & 0x11111).ToString("00000");
app.MapGet("/", () => Results.Ok($"Fixed Window Limiter {GetTicks()}"))
.RequireRateLimiting(fixedPolicy);
app.Run();
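The MyRateLimitOptions class isn't shown in this article. Based on the properties and the "MyRateLimit" configuration section used in the samples, a minimal sketch might look like the following; the default values are illustrative assumptions, not values from the original sample:
namespace WebRateLimitAuth.Models;

public class MyRateLimitOptions
{
    public const string MyRateLimit = "MyRateLimit";

    public int PermitLimit { get; set; } = 100;
    public int SlidingPermitLimit { get; set; } = 100;
    public int Window { get; set; } = 10;
    public int SegmentsPerWindow { get; set; } = 2;
    public int QueueLimit { get; set; } = 2;
    public int ReplenishmentPeriod { get; set; } = 2;
    public int TokensPerPeriod { get; set; } = 4;
    public int TokenLimit { get; set; } = 30;
    public bool AutoReplenishment { get; set; } = false;
}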
UseRateLimiter must be called after UseRouting when rate limiting endpoint-specific APIs are used. For example, if the [EnableRateLimiting] attribute is used, UseRateLimiter must be called after UseRouting. When calling only global limiters, UseRateLimiter can be called before UseRouting.
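For example, when applying a named policy to endpoints, place the calls in the following order. This is a minimal sketch assuming controller endpoints and the "fixed" policy from earlier:
app.UseRouting();

app.UseRateLimiter();

app.MapControllers().RequireRateLimiting("fixed");

app.Run();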
Sliding window limiter
A sliding window algorithm:
- Is similar to the fixed window limiter but adds segments per window. The window slides one segment each segment interval. The segment interval is (window time)/(segments per window).
- Limits the requests for a window to permitLimit requests.
- Each time window is divided into n segments per window.
- Requests taken from the expired time segment one window back (n segments prior to the current segment) are added to the current segment. We refer to the most expired time segment one window back as the expired segment.
The following table shows a sliding window limiter with a 30-second window, three segments per window, and a limit of 100 requests:
- The Available column shows the requests available from the previous segment (the Carry over from the previous row). The first row shows 100 available requests because there's no previous segment.
- The remaining requests are calculated as the available requests minus the processed requests plus the recycled requests.
- From time 30 on, the requests taken from the expired time segment are added back to the request limit, as shown in the Recycled from expired column.
Time | Available | Taken | Recycled from expired | Carry over |
---|---|---|---|---|
0 | 100 | 20 | 0 | 80 |
10 | 80 | 30 | 0 | 50 |
20 | 50 | 40 | 0 | 10 |
30 | 10 | 30 | 20 | 0 |
40 | 0 | 10 | 30 | 20 |
50 | 20 | 10 | 40 | 50 |
60 | 50 | 35 | 30 | 45 |
The following code uses the sliding window rate limiter:
using Microsoft.AspNetCore.RateLimiting;
using System.Threading.RateLimiting;
using WebRateLimitAuth.Models;
var builder = WebApplication.CreateBuilder(args);
var myOptions = new MyRateLimitOptions();
builder.Configuration.GetSection(MyRateLimitOptions.MyRateLimit).Bind(myOptions);
var slidingPolicy = "sliding";
builder.Services.AddRateLimiter(_ => _
    .AddSlidingWindowLimiter(policyName: slidingPolicy, options =>
    {
        options.PermitLimit = myOptions.PermitLimit;
        options.Window = TimeSpan.FromSeconds(myOptions.Window);
        options.SegmentsPerWindow = myOptions.SegmentsPerWindow;
        options.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
        options.QueueLimit = myOptions.QueueLimit;
    }));
var app = builder.Build();
app.UseRateLimiter();
static string GetTicks() => (DateTime.Now.Ticks & 0x11111).ToString("00000");
app.MapGet("/", () => Results.Ok($"Sliding Window Limiter {GetTicks()}"))
.RequireRateLimiting(slidingPolicy);
app.Run();
Token bucket limiter
The token bucket limiter is similar to the sliding window limiter, but rather than adding back the requests taken from the expired segment, a fixed number of tokens are added each replenishment period. The tokens added each period can't increase the available tokens to a number higher than the token bucket limit. The following table shows a token bucket limiter with a limit of 100 tokens and a 10-second replenishment period.
Time | Available | Taken | Added | Carry over |
---|---|---|---|---|
0 | 100 | 20 | 0 | 80 |
10 | 80 | 10 | 20 | 90 |
20 | 90 | 5 | 15 | 100 |
30 | 100 | 30 | 20 | 90 |
40 | 90 | 6 | 16 | 100 |
50 | 100 | 40 | 20 | 80 |
60 | 80 | 50 | 20 | 50 |
The following code uses the token bucket limiter:
using Microsoft.AspNetCore.RateLimiting;
using System.Threading.RateLimiting;
using WebRateLimitAuth.Models;
var builder = WebApplication.CreateBuilder(args);
var tokenPolicy = "token";
var myOptions = new MyRateLimitOptions();
builder.Configuration.GetSection(MyRateLimitOptions.MyRateLimit).Bind(myOptions);
builder.Services.AddRateLimiter(_ => _
    .AddTokenBucketLimiter(policyName: tokenPolicy, options =>
    {
        options.TokenLimit = myOptions.TokenLimit;
        options.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
        options.QueueLimit = myOptions.QueueLimit;
        options.ReplenishmentPeriod = TimeSpan.FromSeconds(myOptions.ReplenishmentPeriod);
        options.TokensPerPeriod = myOptions.TokensPerPeriod;
        options.AutoReplenishment = myOptions.AutoReplenishment;
    }));
var app = builder.Build();
app.UseRateLimiter();
static string GetTicks() => (DateTime.Now.Ticks & 0x11111).ToString("00000");
app.MapGet("/", () => Results.Ok($"Token Limiter {GetTicks()}"))
.RequireRateLimiting(tokenPolicy);
app.Run();
When AutoReplenishment is set to true, an internal timer replenishes the tokens every ReplenishmentPeriod; when set to false, the app must call TryReplenish on the limiter.
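TryReplenish is a method on the underlying TokenBucketRateLimiter from System.Threading.RateLimiting. The following standalone sketch (not middleware configuration; the values are illustrative) shows manual replenishment:
using System.Threading.RateLimiting;

var limiter = new TokenBucketRateLimiter(new TokenBucketRateLimiterOptions
{
    TokenLimit = 10,
    TokensPerPeriod = 2,
    ReplenishmentPeriod = TimeSpan.FromSeconds(1),
    AutoReplenishment = false, // The app is responsible for replenishing tokens.
    QueueLimit = 0,
    QueueProcessingOrder = QueueProcessingOrder.OldestFirst
});

using RateLimitLease lease = limiter.AttemptAcquire(permitCount: 1);
Console.WriteLine($"Acquired: {lease.IsAcquired}");

// With AutoReplenishment disabled, tokens are only added when TryReplenish is called.
limiter.TryReplenish();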
Concurrency limiter
The concurrency limiter limits the number of concurrent requests. Each request reduces the concurrency limit by one. When a request completes, the limit is increased by one. Unlike the other request limiters, which limit the total number of requests for a specified period, the concurrency limiter limits only the number of concurrent requests and doesn't cap the number of requests in a time period.
The following code uses the concurrency limiter:
using Microsoft.AspNetCore.RateLimiting;
using System.Threading.RateLimiting;
using WebRateLimitAuth.Models;
var builder = WebApplication.CreateBuilder(args);
var concurrencyPolicy = "Concurrency";
var myOptions = new MyRateLimitOptions();
builder.Configuration.GetSection(MyRateLimitOptions.MyRateLimit).Bind(myOptions);
builder.Services.AddRateLimiter(_ => _
    .AddConcurrencyLimiter(policyName: concurrencyPolicy, options =>
    {
        options.PermitLimit = myOptions.PermitLimit;
        options.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
        options.QueueLimit = myOptions.QueueLimit;
    }));
var app = builder.Build();
app.UseRateLimiter();
static string GetTicks() => (DateTime.Now.Ticks & 0x11111).ToString("00000");
app.MapGet("/", async () =>
{
    await Task.Delay(500);
    return Results.Ok($"Concurrency Limiter {GetTicks()}");
}).RequireRateLimiting(concurrencyPolicy);
app.Run();
Rate Limiting Partitions
Rate limiting partitions divide the traffic into separate "buckets" that each get their own rate limit counters. This allows for more granular control than a single global counter. The partition "buckets" are defined by different keys (like user ID, IP address, or API key).
Benefits of Partitioning
- Fairness: One user can't consume the entire rate limit for everyone
- Granularity: Different limits for different users/resources
- Security: Better protection against targeted abuse
- Tiered Service: Support for service tiers with different limits
Partitioned rate limiting gives you fine-grained control over how you manage API traffic while ensuring fair resource allocation.
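The snippets in the following sections assign options.GlobalLimiter and are assumed to go inside the AddRateLimiter callback, for example:
builder.Services.AddRateLimiter(options =>
{
    // Place one of the options.GlobalLimiter assignments from the following sections here.
});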
By IP Address
options.GlobalLimiter = PartitionedRateLimiter.Create<HttpContext, string>(httpContext =>
    RateLimitPartition.GetFixedWindowLimiter(
        partitionKey: httpContext.Connection.RemoteIpAddress?.ToString() ?? "unknown",
        factory: _ => new FixedWindowRateLimiterOptions
        {
            PermitLimit = 50,
            Window = TimeSpan.FromMinutes(1)
        }));
By User Identity
options.GlobalLimiter = PartitionedRateLimiter.Create<HttpContext, string>(httpContext =>
    RateLimitPartition.GetFixedWindowLimiter(
        partitionKey: httpContext.User.Identity?.Name ?? "anonymous",
        factory: _ => new FixedWindowRateLimiterOptions
        {
            PermitLimit = 100,
            Window = TimeSpan.FromMinutes(1)
        }));
By API Key
options.GlobalLimiter = PartitionedRateLimiter.Create<HttpContext, string>(httpContext =>
{
    // Headers[...].ToString() returns an empty string (not null) when the header is missing,
    // so check for an empty value rather than relying on a null-coalescing fallback.
    string apiKey = httpContext.Request.Headers["X-API-Key"].ToString();
    if (string.IsNullOrEmpty(apiKey))
    {
        apiKey = "no-key";
    }

    // Different limits based on key tier
    return apiKey switch
    {
        "premium-key" => RateLimitPartition.GetFixedWindowLimiter(
            partitionKey: apiKey,
            factory: _ => new FixedWindowRateLimiterOptions
            {
                PermitLimit = 1000,
                Window = TimeSpan.FromMinutes(1)
            }),
        _ => RateLimitPartition.GetFixedWindowLimiter(
            partitionKey: apiKey,
            factory: _ => new FixedWindowRateLimiterOptions
            {
                PermitLimit = 100,
                Window = TimeSpan.FromMinutes(1)
            }),
    };
});
By Endpoint Path
options.GlobalLimiter = PartitionedRateLimiter.Create<HttpContext, string>(httpContext =>
{
    string path = httpContext.Request.Path.ToString();

    // Different limits for different paths
    if (path.StartsWith("/api/public"))
    {
        return RateLimitPartition.GetFixedWindowLimiter(
            partitionKey: $"{httpContext.Connection.RemoteIpAddress}-public",
            factory: _ => new FixedWindowRateLimiterOptions
            {
                PermitLimit = 30,
                Window = TimeSpan.FromSeconds(10)
            });
    }

    return RateLimitPartition.GetFixedWindowLimiter(
        partitionKey: httpContext.Connection.RemoteIpAddress?.ToString() ?? "unknown",
        factory: _ => new FixedWindowRateLimiterOptions
        {
            PermitLimit = 100,
            Window = TimeSpan.FromMinutes(1)
        });
});
Create chained limiters
The CreateChained API allows passing in multiple PartitionedRateLimiter instances, which are combined into one PartitionedRateLimiter. The combined limiter runs all the input limiters in sequence.
The following code uses CreateChained:
using System.Globalization;
using System.Threading.RateLimiting;
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddRateLimiter(_ =>
{
    _.OnRejected = async (context, cancellationToken) =>
    {
        if (context.Lease.TryGetMetadata(MetadataName.RetryAfter, out var retryAfter))
        {
            context.HttpContext.Response.Headers.RetryAfter =
                ((int) retryAfter.TotalSeconds).ToString(NumberFormatInfo.InvariantInfo);
        }

        context.HttpContext.Response.StatusCode = StatusCodes.Status429TooManyRequests;
        await context.HttpContext.Response.WriteAsync("Too many requests. Please try again later.", cancellationToken);
    };
    _.GlobalLimiter = PartitionedRateLimiter.CreateChained(
        PartitionedRateLimiter.Create<HttpContext, string>(httpContext =>
        {
            var userAgent = httpContext.Request.Headers.UserAgent.ToString();

            return RateLimitPartition.GetFixedWindowLimiter(userAgent, _ =>
                new FixedWindowRateLimiterOptions
                {
                    AutoReplenishment = true,
                    PermitLimit = 4,
                    Window = TimeSpan.FromSeconds(2)
                });
        }),
        PartitionedRateLimiter.Create<HttpContext, string>(httpContext =>
        {
            var userAgent = httpContext.Request.Headers.UserAgent.ToString();

            return RateLimitPartition.GetFixedWindowLimiter(userAgent, _ =>
                new FixedWindowRateLimiterOptions
                {
                    AutoReplenishment = true,
                    PermitLimit = 20,
                    Window = TimeSpan.FromSeconds(30)
                });
        }));
});
var app = builder.Build();
app.UseRateLimiter();
static string GetTicks() => (DateTime.Now.Ticks & 0x11111).ToString("00000");
app.MapGet("/", () => Results.Ok($"Hello {GetTicks()}"));
app.Run();
For more information, see the CreateChained source code.
Choosing what happens when a request is rate limited
For simple cases, set the rejection status code:
builder.Services.AddRateLimiter(options =>
{
    // Set a custom status code for rejections
    options.RejectionStatusCode = StatusCodes.Status429TooManyRequests;

    // Rate limiter configuration...
});
The most common approach is to register an OnRejected callback when configuring rate limiting:
builder.Services.AddRateLimiter(options =>
{
    // Rate limiter configuration...

    options.OnRejected = async (context, cancellationToken) =>
    {
        // Custom rejection handling logic
        context.HttpContext.Response.StatusCode = StatusCodes.Status429TooManyRequests;
        context.HttpContext.Response.Headers["Retry-After"] = "60";

        await context.HttpContext.Response.WriteAsync("Rate limit exceeded. Please try again later.", cancellationToken);

        // Optional logging: resolve a logger from the request's services
        var logger = context.HttpContext.RequestServices
            .GetService<ILoggerFactory>()
            ?.CreateLogger("RateLimiting");
        logger?.LogWarning("Rate limit exceeded for IP: {IpAddress}",
            context.HttpContext.Connection.RemoteIpAddress);
    };
});
Another option is to queue the request, as described in the following section.
Request Queuing
With queuing enabled, when a request exceeds the rate limit, it's placed in a queue where the request waits until a permit becomes available or until a timeout occurs. Requests are processed according to a configurable queue order.
builder.Services.AddRateLimiter(options =>
{
    options.AddFixedWindowLimiter("api", limiterOptions =>
    {
        limiterOptions.PermitLimit = 10;                  // Allow 10 requests
        limiterOptions.Window = TimeSpan.FromSeconds(10); // Per 10-second window
        limiterOptions.QueueLimit = 5;                    // Queue up to 5 additional requests
        limiterOptions.QueueProcessingOrder = QueueProcessingOrder.OldestFirst; // Process oldest requests first
        limiterOptions.AutoReplenishment = true;          // Default: automatically replenish permits
    });
});
EnableRateLimiting and DisableRateLimiting attributes
The [EnableRateLimiting] and [DisableRateLimiting] attributes can be applied to a Controller, action method, or Razor Page. For Razor Pages, the attribute must be applied to the Razor Page and not the page handlers. For example, [EnableRateLimiting] can't be applied to OnGet, OnPost, or any other page handler.
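For example, the following hypothetical Razor Page model applies a named policy at the page level; the policy name "fixed" is assumed from earlier examples:
using Microsoft.AspNetCore.Mvc.RazorPages;
using Microsoft.AspNetCore.RateLimiting;

[EnableRateLimiting("fixed")]
public class IndexModel : PageModel
{
    // The attribute applies to the whole page; it can't be applied to OnGet or OnPost.
    public void OnGet()
    {
    }
}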
The [DisableRateLimiting] attribute disables rate limiting for the Controller, action method, or Razor Page regardless of named rate limiters or global limiters applied. For example, consider the following code, which calls RequireRateLimiting to apply the fixedPolicy rate limiting to all controller endpoints:
using Microsoft.AspNetCore.RateLimiting;
using System.Threading.RateLimiting;
using WebRateLimitAuth.Models;
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddRazorPages();
builder.Services.AddControllersWithViews();
builder.Services.Configure<MyRateLimitOptions>(
builder.Configuration.GetSection(MyRateLimitOptions.MyRateLimit));
var myOptions = new MyRateLimitOptions();
builder.Configuration.GetSection(MyRateLimitOptions.MyRateLimit).Bind(myOptions);
var fixedPolicy = "fixed";
builder.Services.AddRateLimiter(_ => _
    .AddFixedWindowLimiter(policyName: fixedPolicy, options =>
    {
        options.PermitLimit = myOptions.PermitLimit;
        options.Window = TimeSpan.FromSeconds(myOptions.Window);
        options.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
        options.QueueLimit = myOptions.QueueLimit;
    }));

var slidingPolicy = "sliding";

builder.Services.AddRateLimiter(_ => _
    .AddSlidingWindowLimiter(policyName: slidingPolicy, options =>
    {
        options.PermitLimit = myOptions.SlidingPermitLimit;
        options.Window = TimeSpan.FromSeconds(myOptions.Window);
        options.SegmentsPerWindow = myOptions.SegmentsPerWindow;
        options.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
        options.QueueLimit = myOptions.QueueLimit;
    }));
var app = builder.Build();
app.UseRateLimiter();
if (!app.Environment.IsDevelopment())
{
    app.UseExceptionHandler("/Error");
    app.UseHsts();
}
app.UseHttpsRedirection();
app.UseStaticFiles();
app.MapRazorPages().RequireRateLimiting(slidingPolicy);
app.MapDefaultControllerRoute().RequireRateLimiting(fixedPolicy);
app.Run();
In the following code, [DisableRateLimiting] disables rate limiting and overrides [EnableRateLimiting("fixed")] applied to the Home2Controller and app.MapDefaultControllerRoute().RequireRateLimiting(fixedPolicy) called in Program.cs:
[EnableRateLimiting("fixed")]
public class Home2Controller : Controller
{
    private readonly ILogger<Home2Controller> _logger;

    public Home2Controller(ILogger<Home2Controller> logger)
    {
        _logger = logger;
    }

    public ActionResult Index()
    {
        return View();
    }

    [EnableRateLimiting("sliding")]
    public ActionResult Privacy()
    {
        return View();
    }

    [DisableRateLimiting]
    public ActionResult NoLimit()
    {
        return View();
    }

    [ResponseCache(Duration = 0, Location = ResponseCacheLocation.None, NoStore = true)]
    public IActionResult Error()
    {
        return View(new ErrorViewModel { RequestId = Activity.Current?.Id ?? HttpContext.TraceIdentifier });
    }
}
In the preceding code, the [EnableRateLimiting("sliding")] attribute is not applied to the Privacy action method because Program.cs called app.MapDefaultControllerRoute().RequireRateLimiting(fixedPolicy).
Consider the following code, which doesn't call RequireRateLimiting on MapRazorPages or MapDefaultControllerRoute:
using Microsoft.AspNetCore.RateLimiting;
using System.Threading.RateLimiting;
using WebRateLimitAuth.Models;
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddRazorPages();
builder.Services.AddControllersWithViews();
builder.Services.Configure<MyRateLimitOptions>(
builder.Configuration.GetSection(MyRateLimitOptions.MyRateLimit));
var myOptions = new MyRateLimitOptions();
builder.Configuration.GetSection(MyRateLimitOptions.MyRateLimit).Bind(myOptions);
var fixedPolicy = "fixed";
builder.Services.AddRateLimiter(_ => _
    .AddFixedWindowLimiter(policyName: fixedPolicy, options =>
    {
        options.PermitLimit = myOptions.PermitLimit;
        options.Window = TimeSpan.FromSeconds(myOptions.Window);
        options.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
        options.QueueLimit = myOptions.QueueLimit;
    }));

var slidingPolicy = "sliding";

builder.Services.AddRateLimiter(_ => _
    .AddSlidingWindowLimiter(policyName: slidingPolicy, options =>
    {
        options.PermitLimit = myOptions.SlidingPermitLimit;
        options.Window = TimeSpan.FromSeconds(myOptions.Window);
        options.SegmentsPerWindow = myOptions.SegmentsPerWindow;
        options.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
        options.QueueLimit = myOptions.QueueLimit;
    }));
var app = builder.Build();
app.UseRateLimiter();
if (!app.Environment.IsDevelopment())
{
    app.UseExceptionHandler("/Error");
    app.UseHsts();
}
app.UseHttpsRedirection();
app.UseStaticFiles();
app.MapRazorPages();
app.MapDefaultControllerRoute(); // RequireRateLimiting not called
app.Run();
Consider the following controller:
[EnableRateLimiting("fixed")]
public class Home2Controller : Controller
{
    private readonly ILogger<Home2Controller> _logger;

    public Home2Controller(ILogger<Home2Controller> logger)
    {
        _logger = logger;
    }

    public ActionResult Index()
    {
        return View();
    }

    [EnableRateLimiting("sliding")]
    public ActionResult Privacy()
    {
        return View();
    }

    [DisableRateLimiting]
    public ActionResult NoLimit()
    {
        return View();
    }

    [ResponseCache(Duration = 0, Location = ResponseCacheLocation.None, NoStore = true)]
    public IActionResult Error()
    {
        return View(new ErrorViewModel { RequestId = Activity.Current?.Id ?? HttpContext.TraceIdentifier });
    }
}
In the preceding controller:
- The "fixed" policy rate limiter is applied to all action methods that don't have EnableRateLimiting and DisableRateLimiting attributes.
- The "sliding" policy rate limiter is applied to the Privacy action.
- Rate limiting is disabled on the NoLimit action method.
Rate limiting metrics
The rate limiting middleware provides built-in metrics and monitoring capabilities to help understand how rate limits are affecting app performance and user experience. See Microsoft.AspNetCore.RateLimiting for a list of metrics.
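As a minimal sketch, assuming the middleware publishes its instruments under a meter named Microsoft.AspNetCore.RateLimiting, measurements can be observed in-process with a MeterListener:
using System.Diagnostics.Metrics;

// Listen for measurements from the rate limiting middleware's meter.
var listener = new MeterListener();

listener.InstrumentPublished = (instrument, l) =>
{
    if (instrument.Meter.Name == "Microsoft.AspNetCore.RateLimiting")
    {
        l.EnableMeasurementEvents(instrument);
    }
};

listener.SetMeasurementEventCallback<long>((instrument, measurement, tags, state) =>
{
    Console.WriteLine($"{instrument.Name}: {measurement}");
});

listener.Start();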
Testing endpoints with rate limiting
Before deploying an app using rate limiting to production, stress test the app to validate the rate limiters and options used. For example, create a JMeter script with a tool like BlazeMeter or the Apache JMeter HTTP(S) Test Script Recorder and upload the script to Azure Load Testing.
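A quick local smoke test can also confirm that the limiter rejects excess requests before running a full load test. The following sketch assumes the app is listening on http://localhost:5000 and that the root endpoint uses one of the policies from this article:
using System.Net;

// Fire a burst of requests and count how many are rejected with 429 Too Many Requests.
using var client = new HttpClient();
var rejected = 0;

for (var i = 0; i < 20; i++)
{
    var response = await client.GetAsync("http://localhost:5000/");

    if (response.StatusCode == HttpStatusCode.TooManyRequests)
    {
        rejected++;
    }
}

Console.WriteLine($"Rejected requests: {rejected} of 20");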
Creating partitions with user input makes the app vulnerable to Denial of Service (DoS) Attacks. For example, creating partitions on client IP addresses makes the app vulnerable to Denial of Service Attacks that employ IP Source Address Spoofing. For more information, see BCP 38 RFC 2827 Network Ingress Filtering: Defeating Denial of Service Attacks that employ IP Source Address Spoofing.
Additional resources
- Rate limiting middleware by Maarten Balliauw provides an excellent introduction and overview of rate limiting.
- Rate limit an HTTP handler in .NET