I am new to JavaScript and I got really confused by the documentation about promises. I have the following case: I have a bunch of users, and for each user I execute an async function.
The first solution, which only runs requests until 20 users have been found, is to process one user after another:
async function someFunction(){
  const results = [];
  for(const user of users){
    const result = await asyncFunction(user);
    // Run some checks and add the user to the results array
    if(!someChecksGood) continue;
    results.push(result);
    if(results.length >= 20) break;
  }
  return results;
}
While this works, it is quite slow, as it only processes one request at a time. The opposite extreme is to start all requests at once and discard results once the array is full (the requests themselves are not actually cancelled):
async function someFunction(){
  const results = [];
  async function process(user){
    const result = await asyncFunction(user);
    if(!someChecksGood || results.length >= 20) return;
    results.push(result);
  }
  await Promise.all(users.map(process));
  return results;
}
But now there is a high number of unnecessary requests that are discarded afterwards. To improve this, one could combine both approaches above by "chunking" the requests. Running a few requests in parallel should not increase per-request time much, since databases can only handle a certain number of concurrent requests anyway, and the benefit is that we can stop as soon as the array is full; only the rest of the current chunk is processed unnecessarily. So on average it should be better than both solutions above:
async function someFunction(){
  // Chunk the users
  const chunks = [], size = 5;
  for(let i = 0; i < users.length; i += size)
    chunks.push(users.slice(i, i + size));

  // The method to create the results:
  const results = [];
  async function process(user){
    const result = await asyncFunction(user);
    if(!someChecksGood || results.length >= 20) return;
    results.push(result);
  }

  // Iterate over the chunks:
  for(const chunk of chunks){
    await Promise.all(chunk.map(process));
    if(results.length >= 20) break;
  }
  return results;
}
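To make the chunked version above testable on its own, here is a self-contained sketch where `asyncFunction` is mocked as a short delay and the check is stubbed to accept only even user ids. Both `asyncFunction` and `checksGood` are placeholder names standing in for the real request and checks from the snippets above:

```javascript
// Hypothetical stand-ins: 30 dummy users, a mocked async request,
// and a stubbed check that accepts even ids only.
const users = Array.from({ length: 30 }, (_, i) => ({ id: i }));

function asyncFunction(user) {
  // Simulate a database request with a small random delay.
  return new Promise(resolve =>
    setTimeout(() => resolve({ id: user.id }), Math.random() * 10)
  );
}

function checksGood(result) {
  return result.id % 2 === 0;
}

async function someFunction() {
  // Chunk the users into groups of 5.
  const chunks = [], size = 5;
  for (let i = 0; i < users.length; i += size)
    chunks.push(users.slice(i, i + size));

  const results = [];
  async function process(user) {
    const result = await asyncFunction(user);
    if (!checksGood(result) || results.length >= 20) return;
    results.push(result);
  }

  // Process one chunk at a time; stop once 20 results are collected.
  for (const chunk of chunks) {
    await Promise.all(chunk.map(process));
    if (results.length >= 20) break;
  }
  return results;
}

someFunction().then(results => console.log(results.length));
```

With these stubs, 15 of the 30 users pass the check, so all chunks are processed and the early break never fires; with a stricter limit or more users, later chunks would be skipped entirely.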