Monotonically increasing time in Node.js - javascript

This question has already been answered for the browser here, but window.performance.now() is obviously not available in Node.js.
Some applications need a steady clock, i.e., a clock that monotonically increases through time and is not subject to system clock drift. For instance, Java has System.nanoTime() and C++ has std::chrono::steady_clock. Is such a clock available in Node.js?

Turns out the equivalent in Node.js is process.hrtime(). As per the documentation:
[The time returned from process.hrtime() is] relative to an arbitrary time in the past, and not related to the time of day and therefore not subject to clock drift.
Example
Let's say we want to periodically call some REST endpoint once a second, process its outcome and print something to a log file. Consider the endpoint may take a while to respond, e.g., from hundreds of milliseconds to more than one second. We don't want to have two concurrent requests going on, so setInterval() does not exactly meet our needs.
One good approach is to call our function a first time, do the request, process it, and then call setTimeout() to reschedule the next run. But we want to do that once a second, taking into account the time we spent making the request. Here's one way to do it using our steady clock (which guarantees we won't be fooled by system clock drift):
function time() {
  const nanos = process.hrtime.bigint();
  return Number(nanos / 1_000_000n);
}

async function run() {
  const startTime = time();
  const response = await doRequest();
  await processResponse(response);
  const endTime = time();
  // wait just the right amount of time so we run once a second;
  // if we took more than one second, run again immediately
  const nextRunInMillis = Math.max(0, 1000 - (endTime - startTime));
  setTimeout(run, nextRunInMillis);
}

run();
I wrote this helper function time(), which converts the value returned by process.hrtime.bigint() to a timestamp with millisecond resolution; that's enough resolution for this application.
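For completeness, doRequest() and processResponse() in the snippet above are placeholders; hypothetical stubs like these (the names come from the example, the bodies are mine) make it runnable end to end:

// Hypothetical stubs so the example runs standalone; replace them with the
// real REST call and processing logic.
async function doRequest() {
  // Simulate an endpoint that takes anywhere from 100 ms to 1.5 s to respond.
  const latency = 100 + Math.random() * 1400;
  return new Promise(resolve => setTimeout(() => resolve({ ok: true }), latency));
}

async function processResponse(response) {
  console.log('processed', response);
}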

Node.js 10.7.0 added process.hrtime.bigint().
You can then do this:
function monotimeRef() {
  return process.hrtime.bigint();
}

function monotimeDiff(ref) {
  return Number(process.hrtime.bigint() - ref) / 10**9;
}
Demonstrating the usage in a Node REPL:
// Measure reference time.
> let t0 = monotimeRef();
undefined
[ ... let some time pass ... ]
// Measure time passed since reference time,
// in seconds.
> monotimeDiff(t0)
12.546663115
Note:
Number() converts a BigInt to a regular Number, which allows translating from nanoseconds to seconds with the normal division operator.
monotimeDiff() returns the elapsed wall time with nanosecond resolution as a floating-point number (since the BigInt is converted to a Number before the division).
This assumes that the measured duration does not grow beyond 2^53 ns, which is about 104 days (2**53 ns / 10**9 ns/s / 86400 s/day ≈ 104.3 days).
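To see why 2^53 matters, here is a quick illustration (mine, not from the answer) of where Number() starts rounding BigInt nanosecond counts:

// Below 2**53 ns the conversion to Number is exact; above it, rounding kicks in.
const below = 2n ** 53n - 1n; // roughly 104 days' worth of nanoseconds
const above = 2n ** 53n + 1n;
console.log(Number(below) === 2 ** 53 - 1); // true: representable exactly
console.log(Number(above) === 2 ** 53);     // true: the extra nanosecond is rounded away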

Related

Precision timing in React / React Native / momentJS - length of a second differs depending on device

I am building a React Native app for Android that has a dependency on durations of time and accurately creating actions at certain points in time.
Given the below code I have found that the length of a second changes depending on the device that is running it:
interval.current = setInterval(() => {
  const now = moment().valueOf();
  console.log('now', now);
}, 1000);
If I run clocks based on the value of now which is a unix time stamp in milliseconds, the clocks deviate over time by a degree that is too much given the importance of time in the app.
I have analysed this on Google Pixel 2 XL and Samsung A10 devices. Surprisingly, despite the cheaper nature of the Samsung, it is much more accurate. For 20 seconds of data, the Pixel deviates by 168 ms whereas the Samsung deviates by just 6 ms.
I don't understand why this is happening and want to find a better solution. Is there a way to access precise system or network time? Or is there a better approach to achieving precise timing on the client side?
TIA.
JS timers like setTimeout or setInterval can't give you exact results. To measure time precisely you need to take a timestamp with Date.now() or similar and compare it to the previous timestamp. You can still use setTimeout, but do some calculations to figure out the fluctuations and compensate for them. Have a look at the code below, which runs an operation each second. You can see some fluctuations there, but they stay small and don't grow:
const start = Date.now();
const cycles = 60;
let i = 0;

const delay = (ms = 1000) => {
  i++;
  const now = Date.now();
  const diff = now - start - i * 1000;
  console.log(`Cycle ${i} fluctuation: ${diff}`);
  setTimeout(() => {
    // Compensate diff
    if (i < cycles) delay(1000 - diff);
  }, ms);
}

// Run clock
setTimeout(delay, 1000);
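A related sketch (my own, not part of the answer above): instead of compensating the previous cycle's error, schedule every tick against an absolute target time, so per-tick errors never accumulate:

// Self-correcting one-second ticker: each timeout is computed from an absolute
// target time rather than from the previous tick.
const startedAt = Date.now();
let tick = 0;

function scheduleNext() {
  tick++;
  const target = startedAt + tick * 1000; // when this tick should fire
  setTimeout(() => {
    console.log(`tick ${tick}, drift ${Date.now() - target} ms`);
    scheduleNext();
  }, Math.max(0, target - Date.now()));
}

scheduleNext();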

Measuring page load timings using JavaScript

I have created a script in JavaScript that is injected into our Ext JS application during automated browser testing. The script measures the amount of time taken to load the data in our grids.
Specifically, the script polls each grid, looks to see if there is a first row or a 'no data' message, and once all grids have satisfied this condition the script records the value between Date.now() and performance.timing.fetchStart, and treats this as the time the page took to load.
The script works more or less as expected; however, when compared with human-measured timings (Google stopwatch ftw), the time reported by the test is consistently around 300 milliseconds longer than the stopwatch measurement.
My questions are these:
Is there a hole in this logic that would lead to incorrect results?
Are there any alternative and accurate ways to achieve this measurement?
The script is as follows:
function loadPoll() {
  var i, duration,
      dataRow = '.firstRow', noDataRow = '.noData',
      grids = ['.grid1', '.grid2', '.grid3', '.grid4', '.grid5', '.grid6', '.grid7'];
  for (i = 0; i < grids.length; ++i) {
    var data = grids[i] + ' ' + dataRow,
        noData = grids[i] + ' ' + noDataRow;
    if (!(document.querySelector(data) || document.querySelector(noData))) {
      window.setTimeout(loadPoll, 100);
      return;
    }
  }
  duration = Date.now() - performance.timing.fetchStart;
  window.loadTime = duration;
}
loadPoll();
Some considerations:
Although I am aware that human response time can be slow, I am sure that the 300 millisecond inconsistency is not introduced by the human factor of using Google stopwatch.
Looking at the code it might appear that the polling of multiple elements could lead to the 300 ms inconsistency; however, when I change the number of elements being monitored from 7 to 1, there still appears to be a 300 ms surplus in the time reported by the automated test.
Our automated tests are executed in a framework controlled by Selenium and Protractor.
Thanks in advance if you are able to provide any insight to this!
If you use performance.now() the time should be accurate to 5 microseconds. According to MDN:
The performance.now() method returns a DOMHighResTimeStamp, measured in milliseconds, accurate to five thousandths of a millisecond (5 microseconds).
The returned value represents the time elapsed since the time origin (the PerformanceTiming.navigationStart property).
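As a minimal sketch (my adaptation, not code from the answer), the duration calculation in loadPoll() could record the high-resolution time directly, since performance.now() is already measured from the time origin:

// Hypothetical replacement for the duration calculation in loadPoll().
// performance.now() counts from the time origin (essentially navigation start,
// which is close to, but not identical to, fetchStart) with sub-millisecond resolution.
function recordLoadTime() {
  window.loadTime = performance.now();
}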
If I were you, I would revise the approach to how the time is actually measured. Rather than evaluating the time taken by each loadPoll() call, you can evaluate how many calls you can perform in a given period of time. In other words, count the number of function iterations over a longer period, e.g. 1000 milliseconds. Here's how this can be done:
var timeout = 1000;
var startTime = new Date().getTime();
var elapsedTime = 0;
for (var iterations = 0; elapsedTime < timeout; iterations++) {
  loadPoll();
  elapsedTime = new Date().getTime() - startTime;
}
// output the number of achieved iterations
console.log(iterations);
This approach will give you more consistent and accurate time estimates: faster systems simply achieve a greater number of iterations. Keep in mind that setInterval()/setTimeout() are not perfectly precise, and for really small interval timers these functions may give you invalid results due to garbage collection, demands from events and many other things that can run in parallel while your code is being executed.
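A quick way to see that imprecision for yourself (a sketch, not part of the answer) is to request a short timeout and log what you actually get:

// Ask for a 10 ms timeout and measure how long the browser actually took.
const requested = 10;
const t0 = performance.now();
setTimeout(() => {
  const actual = performance.now() - t0;
  console.log(`requested ${requested} ms, got ${actual.toFixed(2)} ms`);
}, requested);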

Execution time of JavaScript programs

I am dealing with the execution time of JavaScript programs. When I run them with nodejs/v8, the time command on the shell returns varying execution times.
Is it possible to sandbox the execution of such a program so that I get constant, or at least less varying, execution times across different runs?
Is there any other way to check the execution time of these programs using nodejs/v8?
Have a look at node's own process.hrtime().
This returns high-resolution real time in a [seconds, nanoseconds] tuple Array. It is relative to an arbitrary time in the past. It is not related to the time of day and therefore not subject to clock drift. The primary use is for measuring performance between intervals. You may pass in the result of a previous call to process.hrtime() to get a diff reading, useful for benchmarks and measuring intervals.
var time = process.hrtime();
// [ 1800216, 25 ]
setTimeout(function() {
  var diff = process.hrtime(time);
  // [ 1, 552 ]
  console.log('benchmark took %d nanoseconds', diff[0] * 1e9 + diff[1]);
  // benchmark took 1000000527 nanoseconds
}, 1000);
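For comparison, here is a sketch of the same measurement with the newer BigInt API (Node.js 10.7.0+), which avoids the tuple arithmetic:

const start = process.hrtime.bigint();
setTimeout(() => {
  // The difference is a BigInt number of nanoseconds.
  const elapsedNs = process.hrtime.bigint() - start;
  console.log(`benchmark took ${elapsedNs} nanoseconds`);
}, 1000);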
Have a read of this useful blog post

Javascript setTimeout timing reliability

I have recently started exploring Javascript in more detail, and how it executes within the browser. Specifically, the setTimeout function.
My understanding is that calling setTimeout(foo, x) will pass a handle to foo to be executed after x milliseconds. How reliable is this timing? Obviously if another long-running script is still executing after x milliseconds then the browser won't be able to call foo, but can I be absolutely certain that setTimeout(foo, 101) will always be executed after setTimeout(foo, 100)?
First of all, the timeout is in milliseconds, so 1 s = 1000 ms; keep that in mind.
You can always count on a delay of 1001 ms firing later than a delay of 1000 ms.
BUT you must remember that if the second callback relies on changes made by the first one, that ordering alone doesn't guarantee it will work correctly.
The first callback may take a perfectly reasonable 3 ms to run (nothing complicated), while the second one is scheduled to start only 1 ms after the first, so relying on the first having finished will fail.
I would suggest using this feature only in rare cases.
You can tag me in a comment on this answer with your specific case and I can suggest the right way to work it out.
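As a small illustration of that point (my sketch, with hypothetical timings), the 101 ms timer is queued after the 100 ms one, but if the first callback runs long, the second only starts once the thread is free again:

setTimeout(() => {
  const t0 = Date.now();
  while (Date.now() - t0 < 50) {} // simulate 50 ms of blocking work
  console.log('first done at', Date.now());
}, 100);

setTimeout(() => {
  console.log('second started at', Date.now()); // ~150 ms after scheduling, not 101 ms
}, 101);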
In nodejs we may be able to time events very precisely by setting a timeout to expire shortly before the desired moment, and then creating a tight series of event-loop ticks, and after each one checking how close to the target time we've approached:
Imagine Date.now() is currently, e.g., 11000 (unrealistic; just an example!)
Determine we want to act in EXACTLY 4000ms
Note that means EXACTLY when Date.now() === 15000
Use setTimeout to wait less than 4000ms, e.g., 3800ms
Keep awaiting microticks until Date.now() >= 15000
This won't block the event loop
(But it will keep your CPU very busy)
let preciseWaitMs = async (ms, { nowFn=Date.now, stopShortMs=200, epsilonMs=0 }={}) => {
  let target = nowFn() + ms;
  await new Promise(resolve => setTimeout(resolve, ms - stopShortMs));
  // Allow `epsilonMs` to shift our wait time by a small amount
  target += epsilonMs;
  // Await a huge series of microticks
  // Note: this will eat cpu! Don't set `stopShortMs` too high!
  while (target > nowFn()) await Promise.resolve();
};

(async () => {
  let t1 = Date.now();
  let target = t1 + 2000;
  console.log(`Trying to act in EXACTLY 2000ms; at that point the time should be ${target}`);
  await preciseWaitMs(2000);
  let t2 = Date.now();
  console.log(`Done waiting; time is ${t2}; deviation from target: ${t2 - target}ms (negative means early)`);
})();
Note that preciseWaitMs(4000) will never wait less than 4000 ms. This means that, if anything, it is biased towards waiting too long. I added an epsilonMs option to let the user shift that bias back; for example preciseWaitMs(4000, { epsilonMs: -1 }) may cancel out preciseWaitMs's tendency to always be slightly late.
Note that some js environments provide a higher-precision current-time query than Date.now - for example, nodejs has require('perf_hooks').performance.now. You can supply a function like this using the nowFn option:
{ epsilonMs: -0.01, nowFn: require('perf_hooks').performance.now };
Many browsers support window.performance.now; try supplying:
{ epsilonMs: -0.01, nowFn: window.performance.now };
These settings achieve sub-millisecond "precise" timing; off by only about 0.01ms on average
Note the -0.01 value for epsilonMs is what seemed to work best for my conditions. Note that supplying fractional epsilonMs values is only meaningful if nowFn provides hi-res timestamps (which isn't, for example, the case with Date.now).
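For example, a usage sketch in Node.js (wrapping the call in an arrow function so performance.now() keeps its receiver):

// Usage sketch: high-resolution clock plus a small negative epsilon, per the notes above.
const { performance } = require('perf_hooks');

(async () => {
  const before = performance.now();
  await preciseWaitMs(500, { epsilonMs: -0.01, nowFn: () => performance.now() });
  console.log(`waited ${(performance.now() - before).toFixed(3)} ms`);
})();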
Most browsers use a single thread for the UI and JavaScript, and that thread is blocked by synchronous calls, so JavaScript execution blocks rendering.
Events are processed asynchronously, with the exception of DOM events.
Still, the setTimeout(fn, 1000) trick is very useful. It allows you to:
Let the browser render current changes.
Evade the "script is running too long" warning.
Change the execution flow.
(Opera is special in many places when it comes to timeouts and threading.)
If another function is already executing when the timer fires, the callback is queued and handled as soon as the thread is free; nothing runs in parallel.
One more thing about setTimeout(fn, 1000): the time is in milliseconds, not seconds.
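As a concrete illustration of the trick (a sketch of mine, not from the answer), long synchronous work can be split into chunks with setTimeout so the browser can render between them:

// Process items in chunks of 100, yielding to the browser between chunks
// so rendering stays responsive.
function processChunked(items, work, index = 0) {
  const CHUNK = 100;
  const end = Math.min(index + CHUNK, items.length);
  for (let i = index; i < end; i++) {
    work(items[i]);
  }
  if (end < items.length) {
    setTimeout(() => processChunked(items, work, end), 0);
  }
}

// Example: square 10,000 numbers without freezing the UI.
processChunked(Array.from({ length: 10000 }, (_, i) => i), n => n * n);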

Monotonically increasing time in JavaScript?

What’s the best way to get monotonically increasing time in JavaScript? I’m hoping for something like Java’s System.nanoTime().
Date() obviously won’t work, as it’s affected by system time changes.
In other words, what I would like is for a <= b, always:
a = myIncreasingTime.getMilliseconds();
...
// some time later, maybe seconds, maybe days
b = myIncreasingTime.getMilliseconds();
At best, even when using the UTC functions in Date(), it will return what it believes is the correct time, but if someone sets the time backward, the next call to Date() can return a lesser value. System.nanoTime() does not suffer from this limitation (at least not until the system is rebooted).
Modification: [2012-02-26: not intended to affect the original question, which has a bounty]
I am not interested knowing the “wall time”, I’m interested in knowing elapsed time with some accuracy, which Date() cannot possibly provide.
You could use window.performance.now() (available since Firefox 15) and window.performance.webkitNow() (Chrome 20):
var a = window.performance.now();
//...
var delay = window.performance.now() - a;
You could wrap Date() or Date.now() so as to force it to be monotonic (but inaccurate). Sketch, untested:
var offset = 0;
var seen = 0;

function time() {
  var t = Date.now();
  if (t < seen) {
    offset += (seen - t);
  }
  seen = t;
  return t + offset;
}
If the system clock is set back at a given moment, then it will appear that no time has passed (and an elapsed time containing that interval will be incorrect), but you will at least not have negative deltas. If there are no set-backs then this returns the same value as Date.now().
This might be a suitable solution if you're writing a game simulation loop, for example, where time() is called extremely frequently — the maximum error is the number of set-backs times the interval between calls. If your application doesn't naturally do that, you could explicitly call it on a setInterval, say (assuming that isn't hosed by the system clock), to keep your accuracy at the cost of some CPU time.
It is also possible that the clock will be set forward, which does not break monotonicity but might have equally undesirable effects (e.g. a game spending too long trying to catch up its simulation at once). However, this is not especially distinguishable from the machine having been asleep for some time. If such protection is desired, it just means adding a second condition alongside the existing one, with a constant threshold for acceptable progress:
if (t > seen + leapForwardMaximum) {
  offset += (seen - t) + leapForwardMaximum;
}
I would suggest that leapForwardMaximum should be set to more than 1000 ms because, for example, Chrome (if I recall correctly) throttles timers in background tabs to fire not more than once per second.
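Putting the two pieces together, a combined sketch of the wrapper (same names as above, and still untested in the spirit of the original) looks like this:

var offset = 0;
var seen = 0;
var leapForwardMaximum = 2000; // ms; above 1000 to tolerate background-tab throttling

function time() {
  var t = Date.now();
  if (t < seen) {
    offset += (seen - t);                      // clock was set back
  } else if (t > seen + leapForwardMaximum) {
    offset += (seen - t) + leapForwardMaximum; // clock leapt forward too far
  }
  seen = t;
  return t + offset;
}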
JavaScript itself does not have any functionality to access nanosecond time. You might load a Java applet to acquire that information, as benchmark.js has done. Maybe @mathias can shed some light on what they did there…
Firefox passes an extra argument to the setTimeout callback: the number of milliseconds by which the timeout actually fired late. That makes the following one way to implement a monotonically increasing time counter:
var time = 0;
setTimeout(function x(actualLateness) {
  setTimeout(x, 0);
  time += actualLateness;
}, 0);
