Timestamps are in ms (milliseconds). So, under certain circumstances, the difference between `end` and `start` is always zero ms, e.g. when profiling code that does not take long to execute.
Suggested solution
I would like to suggest timestamps in ns (nanoseconds). That could be achieved by replacing `Date.now()` with `process.hrtime()` in `./lib/profile.js`. That would also require parsing the hrtime format with a function like this one:
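The original snippet was not preserved in this copy of the issue, but a minimal sketch of such a conversion helper (the name `hrtimeToNs` is illustrative, not from the source) might look like:

```javascript
// Convert a process.hrtime() tuple [seconds, nanoseconds]
// into a single nanosecond count.
function hrtimeToNs(hrtime) {
  const [seconds, nanoseconds] = hrtime;
  return seconds * 1e9 + nanoseconds;
}

// Usage: measure an elapsed interval in nanoseconds.
const start = process.hrtime();
// ... code under measurement ...
const diff = process.hrtime(start); // tuple relative to `start`
const elapsedNs = hrtimeToNs(diff);
```

Note that plain JavaScript numbers lose integer precision above 2^53, which is fine for interval deltas but not for absolute nanosecond epochs; newer Node versions also offer `process.hrtime.bigint()` to avoid the tuple entirely.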
The real problem here is that a single reading takes about 5-10 ms. While we can report the correct execution time, we cannot take enough readings if the code is too fast.
Anyway, the execution time is computed inside the `./lib/vm.js` file.
@simonepri I am not sure why each reading takes 5~10 ms. But assuming it is because of the time required to generate a tempy file and set up each fork, there may be a solution, although I have not spent much time thinking about it, nor have I validated it:
Something along the lines of recording start/end timestamps for each reading could be done. With those in memory, you could compare the current reading's start with the previous reading's end to know how much of that time was spent not executing. Likewise for execution: you could compare the current execution's start with the previous execution's end to know the time spent between executions.
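The bookkeeping described above could be sketched roughly as follows. This is a hypothetical illustration, not code from the project; the names `recordReading` and `overheadBetweenReadings` are invented, and `Date.now()` is used only because it is what the library currently uses:

```javascript
// Record a start/end timestamp pair for each reading.
const readings = [];

function recordReading(runOnce) {
  const start = Date.now();
  runOnce(); // the code under measurement
  const end = Date.now();
  readings.push({ start, end });
}

// Gaps between consecutive readings: current reading's start minus
// previous reading's end, i.e. time spent NOT executing the profiled
// code (temp-file generation, fork setup, etc.).
function overheadBetweenReadings() {
  const gaps = [];
  for (let i = 1; i < readings.length; i++) {
    gaps.push(readings[i].start - readings[i - 1].end);
  }
  return gaps;
}
```

Summing the gaps would show how much of the 5-10 ms per reading is overhead rather than execution, which is the comparison the comment proposes.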