Recently, I faced a peculiar challenge at work. As a Design System Engineer, one of my tasks is publishing client files to an artifactory. Due to a recent change I introduced to the architecture, the number of files I published increased significantly. This caused the backend to refuse connections midway through the publishing process, and the whole version was deemed a failure. I had to find a solution quickly.
One of the solutions was to reduce the number of files I published. But this didn't fix the problem: the number of files was still huge, and there was nothing I could do to reduce it further. Plus, this solution had scalability issues; should the number of files increase in the future, we would see the issue again.
The other solution was to publish in batches. Let's say we have n files to publish. The script was publishing them all in parallel using Promise.all. The idea was to set a limit on the number of parallel requests. This is, in a way, rate limiting of promises.
And so I wrote a script that executes n promises in batches: each batch runs in parallel, and the batches themselves run in series, recursively. In other words, it publishes a given number of files, waits for them to be uploaded, and then continues with the next batch, repeating until all the files are uploaded.
const PromisesInSeries = (promises, n) => {
  return new Promise((resolve, reject) => {
    let i = 0;
    const _data = [];
    const execute = () => {
      // Run the next batch of n promise factories in parallel.
      Promise.all(promises.slice(i, i + n).map((p) => p()))
        .then((data) => {
          console.log(data);
          _data.push(...data);
          i += n;
          // Recurse into the next batch, or resolve once every batch is done.
          if (i < promises.length) execute();
          else resolve(_data);
        })
        .catch(reject); // on failure, reject and stop scheduling further batches
    };
    execute();
  });
};
And to test it, I created 10 fake promise factories. Note that these are functions that return promises rather than promises themselves: a promise starts executing the moment it is created, so deferring creation is what makes batching possible in the first place.
const promiseGenerator = (i) => () =>
  new Promise((resolve) => {
    console.log("executing", i);
    // Simulate an async task (e.g. an upload) that takes 1 second.
    setTimeout(() => resolve(i), 1000);
  });
const call = async () => {
  const data = await PromisesInSeries(
    [
      promiseGenerator(1),
      promiseGenerator(2),
      promiseGenerator(3),
      promiseGenerator(4),
      promiseGenerator(5),
      promiseGenerator(6),
      promiseGenerator(7),
      promiseGenerator(8),
      promiseGenerator(9),
      promiseGenerator(10)
    ],
    3
  );
  console.log(data);
};
call(); // [1, 2, 3, 4, 5, 6, 7, 8, 9, 10] after ~4 seconds (4 batches of 1 second each).
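For reference, the batching is visible in the logs: with a batch size of 3, the console output looks roughly like this (truncated):

executing 1
executing 2
executing 3
[ 1, 2, 3 ]
executing 4
executing 5
executing 6
[ 4, 5, 6 ]
...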
Now, what if we wanted to add a wait stage after every batch? Simple: instead of calling the function recursively right away, we call it with a delay using setTimeout.
const PromisesInSeriesWithDelay = (promises, n, delay) => {
  return new Promise((resolve, reject) => {
    let i = 0;
    const _data = [];
    const execute = () => {
      Promise.all(promises.slice(i, i + n).map((p) => p()))
        .then((data) => {
          console.log(data);
          _data.push(...data);
          i += n;
          if (i < promises.length) {
            // Pause before kicking off the next batch.
            console.log("waiting for", delay);
            setTimeout(execute, delay);
          } else resolve(_data);
        })
        .catch(reject); // on failure, reject and stop scheduling further batches
    };
    execute();
  });
};
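Running the same ten factories in batches of 3 with, say, a 2-second pause between batches (the values here are purely illustrative) would look like this:

const callWithDelay = async () => {
  const data = await PromisesInSeriesWithDelay(
    [...Array(10)].map((_, i) => promiseGenerator(i + 1)),
    3,
    2000
  );
  // [1, 2, ..., 10] after ~10 seconds: 4 batches of 1s plus 3 pauses of 2s.
  console.log(data);
};
callWithDelay();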
Note: This script stops at the first failure; it doesn't retry failed uploads or collect partial results, so it isn't suitable for production as-is.
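If you did need some failure tolerance, one option (a minimal sketch, not the approach from the script above) is to swap Promise.all for Promise.allSettled, which waits for every promise in a batch regardless of outcome:

const PromisesInSeriesSettled = async (promises, n) => {
  const results = [];
  for (let i = 0; i < promises.length; i += n) {
    // allSettled never rejects; each result is
    // { status: "fulfilled", value } or { status: "rejected", reason }.
    const batch = await Promise.allSettled(
      promises.slice(i, i + n).map((p) => p())
    );
    results.push(...batch);
  }
  return results; // the caller can inspect which uploads failed and retry them
};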