Monday, 22 May 2017

Wild time differences measured in a Node.js program

I've noticed some odd behavior in a program I'm working on. It's probably my fault, but I wanted to check whether anyone has faced this before.

With both console.time and performance-now, my code shows highly variable run times. I can't paste it here exactly because it's work related, but the workflow is as follows (a rough sketch comes after the list):

  1. start timer
  2. load image via jimp
  3. when the image is loaded, scale it by a factor of 2
  4. scan the image, transform each pixel into a zero or a one, and put the results into an array
  5. log time
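
To make this more concrete, here is a rough sketch of what the measured workflow looks like with jimp and performance-now. The file name and the brightness threshold are placeholders, since the real code is work related:

    const Jimp = require('jimp');
    const now = require('performance-now');

    // Placeholder path; the real input is an 8x8 PNG.
    const IMAGE_PATH = 'input-8x8.png';

    const start = now();                  // 1. start timer

    Jimp.read(IMAGE_PATH)                 // 2. load image via jimp
      .then(image => {
        image.scale(2);                   // 3. scale the image by a factor of 2

        // 4. scan the image and map each pixel to 0 or 1
        //    (thresholding on brightness is just an example rule)
        const bits = [];
        image.scan(0, 0, image.bitmap.width, image.bitmap.height, function (x, y, idx) {
          const r = this.bitmap.data[idx + 0];
          const g = this.bitmap.data[idx + 1];
          const b = this.bitmap.data[idx + 2];
          bits.push((r + g + b) / 3 > 127 ? 1 : 0);
        });

        console.log(`elapsed: ${(now() - start).toFixed(3)} ms`);  // 5. log time
      })
      .catch(err => console.error(err));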

Some discrepancy is to be expected, but the results are so different that I'm wondering what I did wrong. The average time is short, but the outliers are pretty... well, you can see in the image below.

[Screenshot: console output showing run times for the same 8x8 PNG image]
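
For reference, repeating the same measurement back to back with console.time is one way to see this kind of spread. Again, this is only a sketch with a placeholder file name, not the actual code:

    const Jimp = require('jimp');

    // Placeholder path; the real input is the same 8x8 PNG on every run.
    const IMAGE_PATH = 'input-8x8.png';

    // One load -> scale -> scan pass, as in the workflow above.
    function loadAndScan() {
      return Jimp.read(IMAGE_PATH).then(image => {
        image.scale(2);
        const bits = [];
        image.scan(0, 0, image.bitmap.width, image.bitmap.height, function (x, y, idx) {
          bits.push(this.bitmap.data[idx] > 127 ? 1 : 0);
        });
        return bits;
      });
    }

    // Time ten consecutive runs and compare the printed durations.
    (async () => {
      for (let i = 0; i < 10; i++) {
        console.time(`run ${i}`);
        await loadAndScan();
        console.timeEnd(`run ${i}`);
      }
    })();
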
via Tutch
