API Security: Lack of rate limiting
Whenever an API serves a request it has to generate a response, and generating that response consumes resources (CPU, RAM, network bandwidth, and sometimes disk space). How much is consumed depends heavily on the task at hand: the more logic a call executes or the more data it returns, the more resources that single call takes up, and this stacks up quickly if we do not rate limit our API endpoints. The problem is made worse by the fact that many APIs reside on shared hosts, where they all compete for the same pool of resources; an attacker could therefore disable a secondary, unrelated API simply by consuming everything available.
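To make the defence concrete, here is a minimal sketch of one common rate-limiting technique, a per-client token bucket. The class name, capacity, and refill rate are illustrative assumptions, not part of any specific API framework:

```python
import time

class TokenBucket:
    """Illustrative per-client token-bucket rate limiter (sketch only)."""

    def __init__(self, capacity: int, refill_rate: float):
        self.capacity = capacity          # maximum burst of requests
        self.refill_rate = refill_rate    # tokens added back per second
        self.tokens = float(capacity)
        self.last_refill = time.monotonic()

    def allow(self) -> bool:
        """Return True if the request may proceed, False if it is rate limited."""
        now = time.monotonic()
        elapsed = now - self.last_refill
        self.last_refill = now
        # Refill tokens proportionally to the time that has passed.
        self.tokens = min(self.capacity, self.tokens + elapsed * self.refill_rate)
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# Allow a burst of 5 requests, then refill one token per second.
bucket = TokenBucket(capacity=5, refill_rate=1.0)
results = [bucket.allow() for _ in range(10)]
# The first 5 rapid calls succeed; the immediate follow-ups are rejected.
```

A real deployment would key one bucket per client (API key or IP) and return HTTP 429 when `allow()` is False.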
Example:-
There are simple examples of attacks enabled by a lack of rate limiting on an endpoint, but those are straightforward. A somewhat deeper attack: a user discovers that the endpoint to create a file is rate limited, while the endpoint to copy a file is not. At first this might seem hard to abuse, but if we create a document on the system with a very large file size and then copy it over and over, we might cause the server to run out of resources.
POST /createDocument
[
  {
    "Name": "67",
    "Content": "AAAAA...AAAA",
    "fileSize": "21343243242343kb"
  }
]
The server responds with the new document's ID:
id=21
GET /cloneDocument?id=21
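The attack above can be simulated in a few lines. Everything here is a hypothetical model: the disk capacity, document sizes, and function names stand in for the real createDocument/cloneDocument endpoints:

```python
# Simulated server state: the attacker creates one large document via the
# rate-limited create endpoint, then duplicates it via the unlimited clone
# endpoint until the (pretend) disk is exhausted.

DISK_CAPACITY_KB = 1_000_000   # assume the server has ~1 GB of disk

documents = {}                 # doc_id -> size in KB
disk_used_kb = 0

def create_document(size_kb: int) -> int:
    """Rate limited in the real API, so the attacker calls it only once."""
    global disk_used_kb
    doc_id = len(documents) + 1
    documents[doc_id] = size_kb
    disk_used_kb += size_kb
    return doc_id

def clone_document(doc_id: int) -> int:
    """No rate limit: every cheap call copies the full file on disk."""
    return create_document(documents[doc_id])

big = create_document(100_000)     # one 100 MB document
clones = 0
while disk_used_kb < DISK_CAPACITY_KB:
    clone_document(big)            # each request costs the attacker almost nothing
    clones += 1
```

In this model nine clone requests are enough to fill the remaining disk, which is why the cheap endpoint, not the expensive one, is the real weak point.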
Let's add another example to make things clearer. Suppose we want to retrieve the last 100 posts of a blog with the following URL:
GET /api/v1/posts?limit=100
By executing this request with a parameter of limit=99999 instead, we might exhaust the server's resources as well; this also counts as a lack of resources and rate limiting on the endpoint.
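The simplest defence against the oversized limit parameter is to clamp it server-side. A minimal sketch, where the maximum of 100 and the parameter name are assumptions for illustration:

```python
MAX_LIMIT = 100   # assumed server-side ceiling for ?limit=

def parse_limit(raw, default: int = 20) -> int:
    """Clamp a client-supplied limit value to a safe range."""
    try:
        value = int(raw)
    except (TypeError, ValueError):
        return default          # non-numeric input falls back to the default
    return max(1, min(value, MAX_LIMIT))

print(parse_limit("100"))    # honoured as-is
print(parse_limit("99999"))  # clamped down to MAX_LIMIT
print(parse_limit("abc"))    # falls back to the default
```

Capping input size complements rate limiting: the former bounds the cost of one request, the latter bounds how many requests a client can make.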
Preventive measures:-