I'm running a knex seed in Node and need to batch an additional query to my database due to restrictions on my server. I'm starting to get the hang of promises.
In this line:
let images = await Promise.all(batches.map(run_batch));
You are trying to run ALL the batches in parallel, which defeats your chunking entirely.
You could use a regular for loop with await instead of the .map(), so you run a batch, wait for it to finish, then run the next batch.
let allResults = [];
// Process batches sequentially: each batch must finish before the next starts
for (let batch of batches) {
    let images = await run_batch(batch);
    allResults.push(...images);
}
console.log(allResults);
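The loop above assumes `batches` is already an array of arrays. If you still need to split your flat list of work items into batches, a minimal helper like this would do it (`toBatches` is a hypothetical name, not part of your code):

```javascript
// Hypothetical helper: split an array into chunks of at most batchSize items.
function toBatches(items, batchSize) {
  const batches = [];
  for (let i = 0; i < items.length; i += batchSize) {
    batches.push(items.slice(i, i + batchSize));
  }
  return batches;
}

// toBatches([1, 2, 3, 4, 5], 2) produces [[1, 2], [3, 4], [5]]
```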
FYI, you might benefit from one of the many functions people have written for processing a large array with no more than N requests in flight at the same time. These do not require you to manually break the data into batches. Instead, they monitor how many requests are in flight, start up your desired number of requests, and as each one finishes, they start another, collecting the results for you. Some examples:
runN(fn, limit, cnt, options): Loop through an API on multiple requests
pMap(array, fn, limit): Make several requests to an API that can only handle 20 at a time
rateLimitMap(array, requestsPerSec, maxInFlight, fn): Proper async method for max requests per second
mapConcurrent(array, maxConcurrent, fn): Promise.all() consumes all my RAM
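To show the general idea behind these helpers, here is a hedged sketch of a concurrency limiter with a `mapConcurrent`-style signature (this is an illustration of the technique, not the exact implementation linked above): it keeps at most `maxConcurrent` promises in flight and resolves with results in the original order.

```javascript
// Sketch: run fn(item, index) over array with at most maxConcurrent
// promises in flight at once; resolves with results in input order.
function mapConcurrent(array, maxConcurrent, fn) {
  return new Promise((resolve, reject) => {
    const results = new Array(array.length);
    let index = 0;      // next item to start
    let inFlight = 0;   // requests currently running
    let completed = 0;  // requests finished

    if (array.length === 0) return resolve(results);

    function next() {
      // Top up to the concurrency limit
      while (inFlight < maxConcurrent && index < array.length) {
        const i = index++;
        inFlight++;
        Promise.resolve(fn(array[i], i)).then((result) => {
          results[i] = result;
          inFlight--;
          completed++;
          if (completed === array.length) {
            resolve(results);
          } else {
            next(); // a slot freed up, start another request
          }
        }, reject);
      }
    }
    next();
  });
}
```

Usage would look like `mapConcurrent(urls, 5, fetchOne)`, replacing both the manual batching and the sequential loop. Note the difference from fixed batches: a new request starts as soon as any one finishes, rather than waiting for the whole batch.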
Similar features are also built into the Bluebird promise library and the Async-promises library.