API Security: Lack of rate limiting

Whenever an API serves a request, it has to generate a response, and generating that response consumes resources (CPU, RAM, network bandwidth, and sometimes even disk space); how much depends on the task at hand. The more logic a call executes, or the more data it returns, the more resources that single call consumes, and this stacks up quickly if we do not rate limit our API endpoints. The problem is made worse by the fact that many APIs run on shared hosts, where they all compete for the same resources, so an attacker hammering one API may disable a second, unrelated API by consuming everything available.
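The multiplication effect is easy to see with some back-of-the-envelope arithmetic (all numbers below are hypothetical):

```python
# Illustration only: without a rate limit, per-request cost is multiplied by
# whatever request rate an attacker can sustain, and the total comes out of
# the same shared resource pool as every co-hosted API.
cost_per_call_mb = 50    # RAM one expensive call might consume (assumption)
attacker_rate = 200      # concurrent calls an attacker keeps in flight (assumption)

peak_demand_mb = cost_per_call_mb * attacker_rate
print(peak_demand_mb)    # 10000 MB of RAM demanded at once
```

A single cheap call is harmless; an unlimited stream of expensive calls is what starves the host.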

Example:-

Simple attacks that exploit a missing rate limit on an endpoint are easy enough to imagine, so let's look at a subtler one: a user discovers that the endpoint to create a file is rate limited, but the endpoint to copy a file is not. At first this might seem hard to abuse, but if we create a single document with a very large file size and then copy it over and over, we may cause the server to run out of resources.

POST /createDocument
 
[
  {
    "Name": "67", 
    "Content": "AAAAA...AAAA",
    "fileSize": "21343243242343kb"
  }
]

The response contains the new document's ID:
 
id=21

GET /cloneDocument?id=21        
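The amplification described above can be sketched with a minimal in-memory simulation (the store and function names are hypothetical, standing in for the two endpoints):

```python
# Simulate the attack: createDocument is rate limited, so the attacker only
# creates ONE large document -- but cloneDocument is not limited (assumption),
# so each unthrottled clone duplicates the stored content on the server.
store = {}
next_id = 0

def create_document(content):
    global next_id
    next_id += 1
    store[next_id] = content
    return next_id

def clone_document(doc_id):
    # No rate limit here: every call allocates another full copy.
    return create_document(store[doc_id])

big = "A" * 1_000_000              # one large document, created within the limit
doc = create_document(big)
for _ in range(100):               # unthrottled cloning multiplies storage 100x
    clone_document(doc)

total_bytes = sum(len(c) for c in store.values())
print(total_bytes)                 # 101000000 -- from a single permitted create
```

The rate limit on creation bought nothing, because the unprotected clone endpoint does the same expensive work.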

Let's add another example to make things clearer. Suppose we retrieve the last 100 posts on a blog with the following URL:

GET /api/v1/posts?limit=100

By executing this request with limit=99999 instead, we might exhaust the server's resources; this too counts as a lack of resources and rate limiting on the endpoint.
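The standard defense is to clamp the client-supplied value on the server. A minimal sketch (the function name and bounds are illustrative, not from the article):

```python
MAX_LIMIT = 100  # hypothetical upper bound for the `limit` query parameter

def parse_limit(raw, default=20):
    """Clamp a client-supplied limit so `?limit=99999` cannot force a huge query."""
    try:
        value = int(raw)
    except (TypeError, ValueError):
        return default          # missing or non-numeric input falls back safely
    if value < 1:
        return default          # reject zero and negative limits
    return min(value, MAX_LIMIT)

print(parse_limit("100"))       # 100
print(parse_limit("99999"))     # 100 -- clamped to the upper bound
print(parse_limit("-5"))        # 20
print(parse_limit("abc"))       # 20
```

However large the requested value, the database never sees more than MAX_LIMIT.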

Preventive measures:-

  • We should make sure the client can only make a certain number of requests over a certain period. If we do this, however, we have to make sure the client is notified when a rate limit is triggered or about to be triggered. This message should contain the number of requests remaining before the limit is hit and/or the time remaining until the limit resets.
  • We need to verify on both the client and server side that the request body and response are not too big. This is especially true for endpoints that return a number of records specified by the client; the client can manipulate that number and cause a severe issue by requesting too many records at a time.
  • Every parameter in the API endpoint that defines the amount of data to be returned should have an upper limit. This ensures we never expose too great a volume of data.
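The first measure can be sketched as a small per-client sliding-window limiter that also reports the remaining budget and the retry delay, which is exactly the information the client should be told (all names and thresholds below are hypothetical):

```python
import time
from collections import defaultdict

WINDOW_SECONDS = 60
MAX_REQUESTS = 5        # per-client request budget per window (assumption)

_hits = defaultdict(list)   # client id -> timestamps of requests in the window

def check_rate_limit(client_id, now=None):
    """Return (allowed, remaining, retry_after) for one incoming request.

    `remaining` and `retry_after` are what a 429 response (or rate-limit
    headers) should report back to the client.
    """
    now = time.monotonic() if now is None else now
    window = [t for t in _hits[client_id] if now - t < WINDOW_SECONDS]
    if len(window) >= MAX_REQUESTS:
        retry_after = WINDOW_SECONDS - (now - window[0])
        _hits[client_id] = window
        return False, 0, retry_after
    window.append(now)
    _hits[client_id] = window
    return True, MAX_REQUESTS - len(window), 0

ok, remaining, retry = check_rate_limit("client-1", now=0.0)
print(ok, remaining)    # True 4
```

In a real API the two returned numbers would typically be surfaced as response headers (e.g. a remaining-requests header and Retry-After on a 429), so the client can back off before hitting the limit.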
