linux
yeti357, 2015-11-11 17:25:25

Testing Node.js event loop delays?

I'm exploring Node.js event loop delays using process.hrtime(), and I'm also calculating percentiles for the measured delays.
Here is my simple test example. I ran this test on a computer with an i7-4770.
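Since the linked example isn't reproduced here, a minimal sketch of this kind of measurement might look like the following. The timer interval, window size, and percentile reporting are my own assumptions, not necessarily what the original script does:

// Minimal sketch: measure event-loop delay with process.hrtime()
// and report the 90th percentile over a fixed window of samples.
// TICK_MS and WINDOW_SIZE are assumed values, not from the original test.

const TICK_MS = 1;          // requested timer interval
const WINDOW_SIZE = 1000;   // samples per reported percentile

let samples = [];
let last = process.hrtime();

setInterval(() => {
  // Elapsed time since the previous tick, as [seconds, nanoseconds].
  const [sec, nsec] = process.hrtime(last);
  last = process.hrtime();

  // Delay = actual elapsed time minus the requested interval, in µs.
  const elapsedUs = sec * 1e6 + nsec / 1e3;
  samples.push(Math.max(0, elapsedUs - TICK_MS * 1e3));

  if (samples.length >= WINDOW_SIZE) {
    samples.sort((a, b) => a - b);
    const p90 = samples[Math.floor(samples.length * 0.9)];
    console.log(`90th percentile event-loop delay: ${p90.toFixed(1)} µs`);
    samples = [];
  }
}, TICK_MS);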
This is what the graph looks like when the script is run on Windows 7 (same type of graph on OSX):
2eda8534de7e4c4b97afbda20428d044.png
The x-axis (horizontal) is time, and the y-axis (vertical) is the 90th-percentile delay in µs (microseconds).
And this is what the graph looks like when the script is run on CentOS 7/6:
11c3024702d042838d3e18f7a406629b.png
Again, the x-axis (horizontal) is time and the y-axis (vertical) is the 90th-percentile delay in µs (microseconds). The period of the oscillation is about 2 minutes.
As you can see, delays are generally higher on Linux. But the real problem is the periodic spikes, during which the average delay and the number of high delays (on the order of 1000 µs) per measurement interval both increase. Simply put, on Linux the execution speed periodically sags. It has nothing to do with the garbage collector.
I'm not sure where exactly the problem lies: in the OS, in V8, or in libuv. Is there a good explanation for this behaviour, for example differences in how timers are implemented on different operating systems? And how can it be fixed?
