Thursday 18 May 2017

Executing CPU heavy HTTP requests asynchronously in node

I have a node app that uses node canvas to generate a dot plot graph from a file.

I'm having a problem with requests blocking each other. These requests involve CPU-heavy loops over large arrays (close to 1 million elements long).

For example, I have one request which does some analysis on the file. This involves looping through an array with 800k+ elements and calculating the mean value.

app.get('/api/get/statistics', function(req, res) {

    // Promise here which gets files and loops through and calculates mean

});

Another request loops through the same array and creates a graph and returns to the UI.

app.get('/api/canvas', function(req, res) {

    // creates the graph

});

On their own, the first request takes about 40 seconds and the second about 13 seconds. But when I make the requests at the same time, the second request will take about 53 seconds (and sometimes times out) as it's being queued.
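That queuing behaviour is the Node event loop at work: JavaScript runs on a single thread, so a synchronous CPU-bound loop blocks every other callback, including incoming HTTP requests, until it finishes. A minimal demonstration (not the question's code):

```javascript
// Node runs JavaScript on a single thread, so a synchronous CPU-bound
// loop blocks every other callback -- including new HTTP requests --
// until it finishes.
let timerFired = false;
setTimeout(() => { timerFired = true; }, 0);

let sum = 0;
for (let i = 0; i < 5e7; i++) sum += i; // stand-in for the 800k-element array work

// The timer was due immediately, but it cannot fire while the loop runs:
console.log(timerFired); // false: the callback runs only after this tick ends
```

This is exactly why the second request waits roughly the full duration of the first before its own work can start.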

Is there a way around this? I was thinking I could use child processes, perhaps with this package. But it seems like from this question it might not be possible with separate HTTP requests.

Note: I don't want to do these in the same request. Most of the time I'm firing off multiple requests to the '/api/canvas' endpoint to create different graphs. So I'm basically asking: is it possible to execute these CPU-heavy requests at the same time?



via Mark
