I am trying to write a function that:
- Takes an array of URLs
- Gets files from URLs in parallel (order's irrelevant)
- Processes each file
- Returns an object with the processed files
Furthermore, errors in steps 2 or 3 should not affect the rest of the execution of my application in any way - the app could continue even if all the requests or all the processing failed.
I know how to fire all the requests in a loop and then, once I have all the data, fire the callback to process the files, using this insertCollection pattern.
However, this is not efficient, as I shouldn't need to wait for ALL files to download before attempting to process them - I would like to process them as each download finishes.
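For reference, the wait-for-all version I already know how to write looks roughly like this (a minimal sketch; `fetchFile` is a stand-in for whatever does the actual download, faked here so the control flow can be seen in isolation):

```javascript
// Minimal sketch of the wait-for-all counter pattern.
// fetchFile stands in for a real HTTP request; here it just hands
// back a fake body synchronously so the flow is easy to follow.
function fetchFile(url, callback) {
  callback(null, 'contents of ' + url);
}

function downloadAll(urls, done) {
  const bodies = [];
  let remaining = urls.length;
  urls.forEach(function (url) {
    fetchFile(url, function (error, body) {
      if (!error) {
        bodies.push(body);
      }
      // Only when the counter hits zero is everything handed over.
      if (--remaining === 0) {
        done(bodies);
      }
    });
  });
}

let allBodies;
downloadAll(['urlA', 'urlB'], function (bodies) {
  allBodies = bodies; // processing starts only after ALL downloads finish
});
```

The drawback is exactly the one described above: nothing gets processed until the last download has come back.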
So far I have this code:
const request = require('request');

const urlArray = [urlA, urlB, urlC];
const results = { errors: [], processedFiles: [] };
let count = 0;
let processedResult;

const makeRequests = function (urls, callback) {
  for (let url of urls) {
    request(url, function (error, response, body) {
      if (error) {
        callback(error);
        return;
      }
      processedResult = callback(null, body);
      if (processedResult) {
        console.log(processedResult); // prints correctly!
        return processedResult;
      }
    });
  }
};

const processResult = function (error, file) {
  if (error) {
    console.log(error);
    results.errors.push(error);
    return;
  }
  const processedFile = file + `<!-- Hello, Dolly! ${count} -->`;
  results.processedFiles.push(processedFile);
  if (++count === urlArray.length) {
    return results;
  }
};

const finalResult = makeRequests(urlArray, processResult);
console.log(finalResult); // undefined
In the last call to processResult I manage to send a return, and makeRequests captures it, but I'm failing to "rein it in" in finalResult after that.
My questions are:
- Why is this not working? I can print a well-formed processedResult on the last iteration of makeRequests, but somehow I cannot return it back to the caller (finalResult).
- How can this be solved, ideally "by hand", without promises or the help of libraries like async?
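For what it's worth, the shape I imagine the answer takes is something like this (a sketch, not yet verified - `fetchFile` again stands in for the real `request(url, cb)` call): instead of trying to `return` across async boundaries, makeRequests takes a final `done` callback, each file is processed the moment its own download completes, and the counter decides when to fire `done`:

```javascript
// Sketch: process each file as soon as its own download finishes,
// accumulate into a results object, and deliver everything through a
// single `done` callback once every URL has been handled.
// fetchFile is a stand-in for the real request(url, cb) call.
function fetchFile(url, callback) {
  callback(null, 'contents of ' + url);
}

function makeRequests(urls, done) {
  const results = { errors: [], processedFiles: [] };
  let remaining = urls.length;
  urls.forEach(function (url) {
    fetchFile(url, function (error, body) {
      if (error) {
        results.errors.push(error); // an error never halts the other downloads
      } else {
        // process immediately, without waiting for the other files
        results.processedFiles.push(body + ' <!-- Hello, Dolly! -->');
      }
      if (--remaining === 0) {
        done(results); // results are delivered here, not via `return`
      }
    });
  });
}

let finalResult;
makeRequests(['urlA', 'urlB'], function (results) {
  finalResult = results;
});
```

Is this counter-plus-callback approach the idiomatic way to do it by hand, or is there something better?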
via alexcs