Bitovi Blog - UX and UI design, JavaScript and Frontend development

React

Don’t Overload the API: Sequential & Batched Promises

Learn to juggle JavaScript promises through various throttling methods to avoid overloading the API.

Ryan Spencer

Developer

The other day I was uploading a lot of content, including images, to a third-party service. I put together all my objects and mapped over them to fire off requests. Almost immediately, I blew past the API's rate limiter, and my requests started failing. Oops.

While this problem isn’t particularly hard, needing to rate limit requests is somewhat uncommon. Since I hadn’t done it in a while, it took me a bit to figure it out.

Let’s work through all my iterations of potential solutions. The code examples are fake and simplified to make it easier to understand the concepts. There is a fully functional React code sandbox at the end of the post, which contains some real code you can use.

Using Map

Let’s take a look at my first thought, some code using .map.

import csvData from '../data.csv'
import { post } from '../api/post.ts' // Assume this sends data to our api

function uploadData(data: Array<any>) {
  data.map(datum => post(datum)) // datum is such a weird word
}

What happens here?

The uploadData method works through the array of data, running post for each item. For our pretend code, the post method returns a promise. However, there is nothing in this code that says a request has to wait for the previous request to finish before it runs.


What ends up happening is that all the requests fire at once, or more precisely, all of the promises start as fast as the map can run. Browsers will limit in-flight requests to a handful at a time and will start the next in the queue as soon as one finishes. If you run this code as a Node script, there is no such limit, and all the requests will hit the server in rapid succession. A co-worker and I took out our own staging server this way once; that's called a DoS attack. Hi, Bob!
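If firing everything at once is actually acceptable for your use case, the one improvement worth making is awaiting completion with Promise.all. Here's a minimal sketch, using a hypothetical stand-in for post (not the real API call):

```typescript
// Hypothetical stand-in for post(): resolves with its input after a short delay
const post = (datum: number): Promise<number> =>
  new Promise(resolve => setTimeout(() => resolve(datum), 10));

async function uploadAllAtOnce(data: number[]): Promise<number[]> {
  // Every promise is created immediately, so every request starts at once;
  // Promise.all only tells the caller when they have all finished
  return Promise.all(data.map(datum => post(datum)));
}
```

This doesn't throttle anything, but at least the caller knows when every request has settled.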

Wrapping in a Promise

What if we await each post? That will work, right?

function uploadData(data: Array<any>) {
  data.map(async (datum) => await post(datum))
}

Nope. This just wraps each promise in another promise. Map still creates all the promises as fast as it can, and the result is the same.
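A quick way to see this for yourself: here's a sketch with an instrumented stand-in for post that counts how many calls have already started by the time map returns.

```typescript
// Instrumented stand-in for post(): counts how many calls have started
let started = 0;

const post = (datum: number): Promise<void> =>
  new Promise(resolve => {
    started++; // runs synchronously, before any request has finished
    setTimeout(resolve, 10);
  });

function uploadData(data: number[]): Promise<void[]> {
  // Each async callback runs its synchronous prefix immediately, so every
  // post() has been started before map() even returns
  return Promise.all(data.map(async datum => await post(datum)));
}
```

Calling uploadData with three items leaves started at 3 immediately, before a single request has resolved.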

The Loops that Wait

Enter the ES6 for...of loop (see the MDN docs).

async function uploadData(data: Array<any>) {
  for (const datum of data) {
    await post(datum)
  }
}

With some tweaks to the code, the promises will run one at a time. A for...of loop isn't going to work in IE, but we're letting that die, right?
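To convince yourself the loop really waits, here's a self-contained sketch. The instrumented post below is a stand-in for the real API call; it tracks how many requests are in flight at any moment.

```typescript
// Instrumented stand-in for post(): tracks concurrent in-flight calls
let inFlight = 0;
let maxInFlight = 0;

const post = (datum: number): Promise<void> =>
  new Promise(resolve => {
    inFlight++;
    maxInFlight = Math.max(maxInFlight, inFlight);
    setTimeout(() => {
      inFlight--;
      resolve();
    }, 5);
  });

async function uploadData(data: number[]) {
  for (const datum of data) {
    await post(datum); // the loop pauses here until this request settles
  }
}
```

After the loop finishes, maxInFlight is 1: no request ever overlapped another.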

The while loop has been around forever and will work with IE6. Probably even earlier versions, but I’m not going to bother looking it up.

async function uploadData(data: Array<any>) {
  while (data.length) {
    const datum = data.shift() // takes the first item, mutating the original array
    await post(datum)
  }
}

Too Slow, Let’s Batch

Firing all requests at the same time can overload the server or exceed rate limits, but sending one request at a time takes too long. Let’s look at how we can group our requests into batches.

async function uploadData(data: Array<any>, batchSize = 5) {
  while (data.length) {
    await Promise.all(data.splice(0, batchSize).map((datum) => post(datum)));
  }
}

In case you were wondering, .splice takes a start index and a count of items to remove, then removes and returns those items from the original array. The original array is mutated. If the count extends beyond the length of the array, it returns everything up to and including the last item.
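That splice behavior is the whole trick, so here's a tiny sketch of it in isolation:

```typescript
// splice(start, deleteCount) mutates the array and returns the removed items
const queue = [1, 2, 3, 4, 5, 6, 7];
const firstBatch = queue.splice(0, 5);  // [1, 2, 3, 4, 5]; queue is now [6, 7]
const secondBatch = queue.splice(0, 5); // [6, 7] — the short final batch, no error
```

The loop drains the array batch by batch, and the last, possibly smaller, batch needs no special handling.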

That code is pretty simple, but there is still room for improvement. When a batch runs faster than expected, rate limits can still be exceeded.


Introducing Time

If the rate limit is known, let's say 5 requests per second, we can build that timing into the code by adding a setTimeout promise to each batch. Because we wrap each batch in Promise.all, the code won't move on to the next loop iteration until all of the promises complete, which means the time given to setTimeout is the minimum amount of time that will pass before the next batch starts.

async function uploadData(data: Array<any>, batchSize = 5, minTime = 1000) {
  // minTime = 1000: one second per batch of 5 matches 5 requests per second
  while (data.length) {
    await Promise.all([
      ...data.splice(0, batchSize).map(datum => post(datum)),
      // The timer promise must be created directly; wrapping it in a
      // function would hand Promise.all a function, which it ignores
      new Promise<void>(resolve => setTimeout(resolve, minTime))
    ])
  }
}
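To sanity-check the timing behavior, here's a runnable sketch with a mocked post (an assumption; the real call would hit your API) and a small minTime. Two batches of five can't finish faster than two timer intervals:

```typescript
// Mocked post(): resolves almost immediately, standing in for a real request
const post = (datum: number): Promise<void> =>
  new Promise(resolve => setTimeout(resolve, 5));

async function uploadData(data: number[], batchSize = 5, minTime = 100) {
  while (data.length) {
    await Promise.all([
      ...data.splice(0, batchSize).map(datum => post(datum)),
      // Created directly as a promise so the timer starts with the batch
      new Promise<void>(resolve => setTimeout(resolve, minTime)),
    ]);
  }
}
```

With ten items, a batch size of 5, and a minTime of 100ms, the whole upload takes at least 200ms, no matter how fast the individual requests resolve.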

Interactive Code Sandbox

Want to see this working in a React project? Check out this code sandbox.

Summary

That’s five different ways to fire off a collection of promises, from the basic “Fire everything!” to the orderly and precise batching with a time restriction. I hope that helps you. Depending on your use case, you probably either want the version that fires promises sequentially or the final version with batching and time-based rate limiting.

I expect to be checking back in on this article a few years from now after I’ve forgotten how all of this works.

What do you think?

We’d love to hear your thoughts on batched API requests! Join our Community Discord and drop into the #React channel to chat with our React Consulting experts 🙌

Join our Discord