Handling Large Numbers of Promises in Node.js: Practical Tips for Managing Multiple Promises in Node.js Applications
When developing with Node.js, you may come across situations where you need to manage a large number of promises. For instance, you might need to send requests to multiple URLs.
In this article, we’ll explore different strategies to handle these situations gracefully, without overwhelming your server.
The Challenge
Suppose you have an array of 2000+ URLs and you need to send a GET request to each one. A naive approach might be to loop over the array and send a request for each URL.
However, this creates thousands of simultaneous requests, which could overwhelm your own process or the server you’re sending requests to.
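To make the problem concrete, here is a minimal sketch of that naive approach (the `urls` contents and the use of axios are assumptions for illustration):

```js
const axios = require('axios');

// Hypothetical array of 2000+ URLs
const urls = ['url1', 'url2', 'url3' /* ... */];

// Every axios.get call starts its request immediately, so this puts
// all of the requests in flight at the same time.
const requests = urls.map((url) => axios.get(url));

Promise.all(requests)
  .then((responses) => console.log(`Got ${responses.length} responses`))
  .catch((error) => console.error('At least one request failed:', error.message));
```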
The Solutions
1. Promise.all with Batching
`Promise.all` allows you to handle multiple promises at once, but kicking off thousands of requests in one go might overwhelm your server. Instead, you can batch the work into smaller groups and handle each batch separately.
```js
// Inside an async function, with axios and a `urls` array in scope
const batchSize = 100; // Adjust as needed
for (let i = 0; i < urls.length; i += batchSize) {
  const batch = urls.slice(i, i + batchSize);
  // Start the requests for this batch only, and wait for them to finish
  // before moving on, so at most batchSize requests run at once.
  await Promise.all(batch.map((url) => axios.get(url)));
}
```
In this example, we’re still using `Promise.all` to run multiple requests at once, but we do it in batches so that the system never has too many concurrent requests in flight.
Here’s how it works:
- We set a `batchSize` for the number of requests to run concurrently.
- We loop through the `urls` array in `batchSize`-sized chunks.
- For each chunk, we use `Promise.all` to run all of the requests in that chunk concurrently.
- We wait for each chunk to finish before moving on to the next one.
This approach allows us to handle a large number of promises in a controlled manner, without creating too many concurrent promises that could overwhelm the system.
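Putting it together, here’s a minimal runnable sketch of the batching pattern, assuming axios and a hypothetical `urls` array (the `fetchInBatches` name is just for illustration):

```js
const axios = require('axios');

// Hypothetical list of URLs; in practice this would hold 2000+ entries.
const urls = ['url1', 'url2', 'url3' /* ... */];

async function fetchInBatches(urls, batchSize = 100) {
  const results = [];
  for (let i = 0; i < urls.length; i += batchSize) {
    const batch = urls.slice(i, i + batchSize);
    // Only batchSize requests are in flight at any given time.
    const responses = await Promise.all(batch.map((url) => axios.get(url)));
    results.push(...responses.map((response) => response.data));
  }
  return results;
}

fetchInBatches(urls)
  .then((data) => console.log(`Fetched ${data.length} responses`))
  .catch((error) => console.error('A request failed:', error.message));
```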
2. Promise.allSettled
This function is similar to `Promise.all`, but it doesn't reject as soon as one of the promises rejects. Instead, it waits for all promises to settle, whether they fulfill or reject.
This can be useful if you want to handle all of your promises, even if some of them fail.
```js
const axios = require('axios');

const urls = ['url1', 'url2', 'url3', /* ... */ 'url1000'];

// Each axios.get call returns a promise; we don't catch errors here,
// because Promise.allSettled records rejections for us.
const requests = urls.map((url) => axios.get(url));

Promise.allSettled(requests)
  .then((results) => {
    results.forEach((result, index) => {
      if (result.status === 'fulfilled') {
        console.log(`Response from ${urls[index]}:`, result.value.data);
      } else {
        console.error(`Error fetching ${urls[index]}:`, result.reason.message);
      }
    });
  });
```
In this example, we first create an array of promises using `Array.map`. Each promise sends a GET request to one of the URLs; it fulfills with the response if the request succeeds, or rejects with an error if the request fails. We don’t add our own `.catch`, because `Promise.allSettled` records failures for us.
Then, we pass this array of promises to `Promise.allSettled`. This function returns a new promise that fulfills when all the promises in the array have settled, i.e., they have either fulfilled or rejected.
The fulfillment value of the `Promise.allSettled` promise is an array of objects that describe the outcome of each promise: `{ status: 'fulfilled', value }` for successes and `{ status: 'rejected', reason }` for failures. We can then loop over this array to log the response data for each successful request and the error message for each failed request.
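As a quick illustration of that result shape, here is a tiny sketch using toy promises instead of HTTP requests:

```js
Promise.allSettled([
  Promise.resolve('ok'),
  Promise.reject(new Error('boom')),
]).then((results) => {
  console.log(results);
  // [
  //   { status: 'fulfilled', value: 'ok' },
  //   { status: 'rejected', reason: Error: boom }
  // ]
});
```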
3. async.eachLimit
The `async` library provides several functions that can help you handle large numbers of asynchronous operations. For example, `async.eachLimit` allows you to run a function on each item in a collection, with a limit on the number of concurrent executions.
Here’s an example of how you could use `async.eachLimit` to send GET requests to a list of URLs, with a maximum of 100 requests at a time:
```js
const async = require('async');
const axios = require('axios');

const urls = ['url1', 'url2', 'url3', /* ... */ 'url1000'];

async.eachLimit(urls, 100, (url, callback) => {
  axios.get(url)
    .then((response) => {
      console.log(response.data);
      callback(); // Done with this URL, no error
    })
    .catch((error) => {
      console.error(`Error fetching ${url}: ${error.message}`);
      callback(error); // Report the error to the final callback
    });
}, (err) => {
  if (err) {
    console.error('A URL failed to process');
  } else {
    console.log('All URLs have been processed successfully');
  }
});
```
In this example, we’re using the `async.eachLimit` function from the `async` library to send HTTP GET requests to a list of URLs. `async.eachLimit` iterates over `urls`, an array of URLs, with a maximum of 100 requests in flight at a time.
`async.eachLimit` takes four arguments. The first three are:
- `urls`: The collection to iterate over.
- `100`: The maximum number of items to process concurrently.
- An iteratee function that is applied to each item in the collection. This function takes two arguments:
  - `url`: The current item being processed.
  - `callback`: A callback function that you call when the processing of the current item is finished. If you pass an error (or any truthy value) to this callback, the final callback (the fourth argument to `async.eachLimit`) is immediately called with this error.
The iteratee sends a GET request to the current URL using `axios.get`. If the request is successful, it logs the response data and calls the callback with no arguments, indicating that it finished without errors. If the request fails, it logs an error message and calls the callback with the error.
Finally, the fourth argument to `async.eachLimit` is a callback that is called once all items have been processed, or as soon as an error occurs.
If there was an error with any URL, it logs a message indicating that a URL failed to process.
If there were no errors, it logs a message indicating that all URLs have been processed successfully.
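As a side note, recent versions of the `async` library (v3+) also accept an `async` function as the iteratee. In that case you don’t receive a per-item callback at all: returning normally marks the item as done, throwing reports an error, and omitting the final callback makes `async.eachLimit` return a promise. A rough sketch, under those assumptions:

```js
const async = require('async');
const axios = require('axios');

const urls = ['url1', 'url2', 'url3' /* ... */];

// With an async iteratee there is no per-item callback:
// a thrown error (or rejected promise) plays the role of callback(error).
async.eachLimit(urls, 100, async (url) => {
  const response = await axios.get(url);
  console.log(response.data);
})
  .then(() => console.log('All URLs have been processed successfully'))
  .catch((error) => console.error('A URL failed to process:', error.message));
```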
Thank you so much for taking the time to read my article all the way through!
If you found it helpful or interesting, why not give it a round of applause by clicking those heart buttons?