UPD: The question "What is the reason JavaScript setTimeout is so inaccurate?" asks why JavaScript timers are inaccurate in general, and every mention of inaccuracy there is about invocations slightly after the specified delay. Here I'm asking why Node.js also tolerates invocations before the delay. Isn't that an error-prone timer design?
Just found an unexpected (to me only?) behaviour of Node.js setTimeout(). Sometimes it triggers earlier than the specified delay.
function main() {
  let count = 100;
  while (count--) {
    const start = process.hrtime();
    const delay = Math.floor(Math.random() * 1000);
    setTimeout(() => {
      const end = process.hrtime(start);
      const elapsed = (end[0] * 1000 + end[1] / 1e6);
      const dt = elapsed - delay;
      if (dt < 0) {
        console.log('triggered before delay', delay, dt);
      }
    }, delay);
  }
}
main();
On my laptop the output is:
$ node --version
v8.7.0
$ node test.js
triggered before delay 73 -0.156439000000006
triggered before delay 364 -0.028260999999986325
triggered before delay 408 -0.1185689999999795
triggered before delay 598 -0.19596799999999348
triggered before delay 750 -0.351709000000028
Is it a "feature" of the event loop? I always thought that it must be triggered at least after delay ms.
From the Node.js docs:
The callback will likely not be invoked in precisely delay milliseconds. Node.js makes no guarantees about the exact timing of when callbacks will fire, nor of their ordering. The callback will be called as close as possible to the time specified.
As you increase the number of timers (you have 100), accuracy decreases: with 1000 it's even worse, while with ten it's much better. The more timers Node.js has to track, the less accurate each one becomes.
We can posit the algorithm has a "reasonable delta" that determines final accuracy, and it does not include checking to make sure it's after the specified interval. That said, it's easy enough to find out with some digging in the source.
See also How is setTimeout implemented in node.js, which includes more details; preliminary source investigation seems to confirm both this and the above.
From the Node.js timers doc, about setTimeout:
The only guarantee is that the timeout will not execute sooner than the declared timeout interval
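Given that documented guarantee, and the early firings observed above, one defensive option is a wrapper that re-arms the timer whenever the callback lands early. A minimal sketch (setTimeoutAtLeast is a hypothetical helper name, not a Node.js API):

// Never invoke `fn` before `delay` ms have elapsed; if the underlying
// timer fires early, re-arm it for the remaining time.
function setTimeoutAtLeast(fn, delay) {
  const start = process.hrtime();
  const check = () => {
    const [s, ns] = process.hrtime(start);
    const elapsed = s * 1000 + ns / 1e6;
    if (elapsed < delay) {
      setTimeout(check, delay - elapsed); // fired early: wait out the remainder
    } else {
      fn();
    }
  };
  setTimeout(check, delay);
}

setTimeoutAtLeast(() => console.log('at least 500 ms have passed'), 500);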
Related
Consider this piece of code
// Initialise the variables to check when the code started
let total_time = performance.now();
let recursions = 0;

function test() {
  recursions++;
  if (recursions === 60) {
    // Check how long it took to do 60 recursions
    console.log(`${performance.now() - total_time}ms`);
  }
  // Timeout to call it 17ms later
  setTimeout(() => test(), 17);
}
test();
This is meant to log when 60 recursions have happened, to ensure that certain parts of my code fire off correctly (using setTimeout so that nothing blocks). With a 17ms timeout I'd expect roughly 60 × 17 ≈ 1020ms, but I get between 1200 and 1600ms for no apparent reason. I also used a modified version that measured how long a single recursion took, and it reported 0ms (again for no apparent reason). Anyone know why? And will the same happen in node.js?
Code used to measure how long it took to do one recursion
let start_time = performance.now();
let total_time = performance.now();
let measure = 0;

function test() {
  start_time = performance.now();
  measure++;
  if (measure === 60) {
    console.log(`${performance.now() - total_time}ms`);
  }
  console.log(`Taken ${performance.now() - start_time}`);
}
test();
It's not supposed to be very accurate: the timeout can fire later than requested whenever the page is busy with other tasks.
setTimeout is implemented so that it executes after a minimum given delay, and only once the browser's thread is free to execute it. So, for example, if you specify a delay of 0 and expect it to execute immediately, it won't; more accurately, it will run in the next event cycle (part of the event loop, the concurrency model responsible for executing the code).
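To make that concrete, here's a tiny demo (my own illustration, not from the docs):

// Even with a 0 ms delay, the callback only runs after the currently
// executing synchronous code has finished.
setTimeout(() => console.log('timeout 0 fired'), 0);

const start = Date.now();
while (Date.now() - start < 100) { /* busy-wait ~100 ms */ }
console.log('synchronous code done');

// Logs "synchronous code done" first, then "timeout 0 fired"
// (roughly 100 ms late, despite the 0 ms delay).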
In conclusion, you can't use setTimeout if you expect consistent, reliable, millisecond-scale timing.
Please read the reasons for delays longer than specified: https://developer.mozilla.org/en-US/docs/Web/API/setTimeout#reasons_for_delays_longer_than_specified
More about the JS event loop - https://developer.mozilla.org/en-US/docs/Web/JavaScript/EventLoop
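Applied to the 60-recursion example above: every nested setTimeout(fn, 17) pays clamping and scheduling overhead, and those errors accumulate across 60 calls, which is where the extra 200-600ms comes from. A common workaround (a sketch of mine, not from the original post) is to schedule each tick against an absolute target time so the error can't accumulate:

// Schedule each tick against an absolute target time so that per-call
// clamping and overhead do not accumulate across recursions.
const startTime = performance.now();
let tick = 0;

function test() {
  tick++;
  if (tick === 60) {
    console.log(`${performance.now() - startTime}ms`); // close to 60 * 17 = 1020ms
    return;
  }
  const target = startTime + tick * 17; // absolute schedule
  setTimeout(test, Math.max(0, target - performance.now()));
}
test();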
This code recursively calls the same function with a setTimeout of 1 millisecond, which in theory should call the function 1000 times per second. However, it's only called about 200 times per second:
This behavior happens on different machines and in different browsers. I checked whether it has something to do with the maximum call stack, but that limit is actually way higher than 200 in any browser.
const info = document.querySelector("#info");
let start = performance.now();
let iterations = 0;

function run() {
  if (performance.now() - start > 1000) {
    info.innerText = `${iterations} function calls per second`;
    start = performance.now();
    iterations = 0;
  }
  iterations++;
  setTimeout(run, 1);
}
run();
<div id="info"></div>
There's a limitation on how often nested timers can run. The HTML standard says (step 11 of the timer initialisation algorithm): "If nesting level is greater than 5, and timeout is less than 4, then set timeout to 4."
This is true only for client-side engines (browsers); in Node.js this limitation does not exist.
HTML Standard Timers Section
Very similar example from javascript.info
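To watch the clamping kick in, you can log the gap between nested 1 ms timeouts; a small sketch of my own (exact numbers vary by environment):

// Log the gap between nested setTimeout(fn, 1) calls. In browsers the
// gap clamps to >= 4 ms once the nesting level exceeds 5; in Node.js
// no such nesting clamp applies.
let prev = performance.now();
let nesting = 0;

function step() {
  const now = performance.now();
  console.log(`nesting ${nesting}: ${(now - prev).toFixed(2)} ms since last call`);
  prev = now;
  if (++nesting < 10) setTimeout(step, 1);
}
setTimeout(step, 1);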
The delay argument passed to setTimeout and setInterval is not a guaranteed amount of time. It's the minimum amount of time you could expect to wait before the callback function is executed. It doesn't matter how much of a delay you've asked for, if the JavaScript call stack is busy, then anything in the event queue will have to wait.
Also, there is an absolute minimum amount of time you can reasonably expect to pass before a callback is called, and it depends on the internals of the client.
From the HTML5 Spec:
This API does not guarantee that timers will run exactly on schedule.
Delays due to CPU load, other tasks, etc, are to be expected.
I once read somewhere that it was around 16ms, so setting a delay of anything less than that shouldn't really change the timings at all.
I got this code over here:
var date = new Date();
setTimeout(function(e) {
  var currentDate = new Date();
  if (currentDate - date >= 1000) {
    console.log(currentDate, date);
    console.log(currentDate - date);
  } else {
    console.log("It was less than a second!");
    console.log(currentDate - date);
  }
}, 1000);
On my computer it always executes correctly, with 1000 in the console output. Interestingly, on another computer the same code's timeout callback starts in less than a second, and the difference currentDate - date is between 980 and 998.
I know the existence of libraries that solve this inaccuracy (for example, Tock).
Basically, my question is: what are the reasons why setTimeout does not fire at the given delay? Could it be that the computer is too slow and the browser automatically tries to adapt to the slowness and fires the event earlier?
PS: Here is a screenshot of the code and the results executed in the Chrome JavaScript console:
It's not supposed to be particularly accurate. There are a number of factors limiting how soon the browser can execute the code; quoting from MDN:
In addition to "clamping", the timeout can also fire later when the page (or the OS/browser itself) is busy with other tasks.
In other words, the way that setTimeout is usually implemented, it is just meant to execute after a given delay, and once the browser's thread is free to execute it.
However, different browsers may implement it in different ways. Here are some tests I did:
var date = new Date();
setTimeout(function(e) {
  var currentDate = new Date();
  console.log(currentDate - date);
}, 1000);

// Browser   Test1  Test2  Test3  Test4
// Chrome     998   1014    998    998
// Firefox   1000   1001   1047   1000
// IE 11     1006   1013   1007   1005
Perhaps the < 1000 times from Chrome could be attributed to inaccuracy in the Date type, or perhaps Chrome uses a different strategy for deciding when to execute the code; maybe it's trying to fit it into the nearest time slot, even if the timeout delay hasn't completed yet.
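One way to tell those explanations apart (a sketch of my own, not part of the tests above) is to time the same timeout with both the wall clock and a monotonic hi-res clock; if Date reports < 1000 while performance.now() doesn't, the discrepancy lies in the clock's granularity rather than in early scheduling:

// Time the same timeout with both Date (wall clock) and
// performance.now (monotonic, sub-millisecond).
var dateStart = Date.now();
var perfStart = performance.now();

setTimeout(function() {
  console.log('Date.now delta:        ' + (Date.now() - dateStart) + ' ms');
  console.log('performance.now delta: ' + (performance.now() - perfStart).toFixed(3) + ' ms');
}, 1000);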
In short, you shouldn't use setTimeout if you expect reliable, consistent, millisecond-scale timing.
In general, computer programs are highly unreliable when trying to execute things with higher precision than 50 ms. The reason for this is that even on an octacore hyperthreaded processor the OS is usually juggling several hundreds of processes and threads, sometimes thousands or more. The OS makes all that multitasking work by scheduling all of them to get a slice of CPU time one after another, meaning they get 'a few milliseconds of time at most to do their thing'.
Implicitly this means that if you set a timeout for 1000 ms, the chances are far from small that the current browser process won't even be running at that point in time, so it's perfectly normal for the browser not to notice until 1005, 1010 or even 1050 milliseconds that it should be executing the given callback.
Usually this is not a problem; it happens, and it's rarely of utmost importance. If it is, all operating systems supply kernel-level timers that are far more precise than 1 ms and allow a developer to execute code at precisely the correct point in time. JavaScript, however, as a heavily sandboxed environment, doesn't have access to kernel objects like that, and browsers refrain from using them since it could theoretically allow someone to attack the OS's stability from inside a web page, by carefully constructing code that starves other threads by swamping the system with a lot of dangerous timers.
As for why the test yields 980 I'm not sure - that would depend on exactly which browser you're using and which JavaScript engine. I can however fully understand if the browser just manually corrects a bit downwards for system load and/or speed, ensuring that "on average the delay is still about the correct time" - it would make a lot of sense from the sandboxing principle to just approximate the amount of time required without potentially burdening the rest of the system.
Someone please correct me if I am misinterpreting this information:
According to a post from John Resig regarding the inaccuracy of performance tests across platforms (emphasis mine)
With the system times constantly being rounded down to the last queried time (each about 15 ms apart) the quality of performance results is seriously compromised.
So there is up to a 15 ms fudge on either end when comparing to the system time.
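You can observe that rounding yourself by sampling the clock in a tight loop and recording how far apart the distinct values land; a small sketch of my own:

// Sample the clock in a tight loop and record the gaps between distinct
// values. On systems with a coarse timer this prints jumps of ~10-16 ms;
// on modern systems Date.now() usually updates every 1 ms.
var gaps = [];
var last = Date.now();
while (gaps.length < 10) {
  var now = Date.now();
  if (now !== last) {
    gaps.push(now - last);
    last = now;
  }
}
console.log('gaps between clock updates (ms):', gaps);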
I had a similar experience.
I was using something like this:
var iMillSecondsTillNextWholeSecond = (1000 - (new Date().getTime() % 1000));

setTimeout(function () {
  CountDownClock(ElementID, RelativeTime);
}, iMillSecondsTillNextWholeSecond); // Wait until the next whole second to start.
I noticed it would skip a second every couple of seconds, and sometimes it would go longer without skipping. However, I'd still catch it skipping after 10 or 20 seconds, and it just looked rickety.
I thought, "Maybe the timeout is too slow, or waiting for something else?"
Then I realized, "Maybe it's too fast, and the timers the browser is managing are off by a few milliseconds?"
After adding +1 millisecond to my variable, I only saw it skip once. I ended up adding +50ms, just to be on the safe side.
var iMillSecondsTillNextWholeSecond = (1000 - (new Date().getTime() % 1000) + 50);
I know, it's a bit hacky, but my timer is running smoothly now. :)
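An alternative to the fixed +50ms fudge (just a sketch, not the original poster's code) is to re-align every tick against the clock, so scheduling error never accumulates:

// Recompute the delay to the next whole second on every tick, so the
// error of any single timer can't carry over to the following ticks.
function tickEachSecond(fn) {
  var msToNextSecond = 1000 - (new Date().getTime() % 1000);
  setTimeout(function () {
    fn(new Date());
    tickEachSecond(fn); // re-align against the clock every tick
  }, msToNextSecond + 5); // small margin to land just past the boundary
}

tickEachSecond(function (d) { console.log(d.getSeconds()); });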
JavaScript has a way of dealing with exact time frames. Here's one approach: save Date.now() when you start waiting, create an interval with a short update period, and calculate the difference between the dates on each tick.
Example:
const startDate = Date.now()
const interval = setInterval(() => {
  const currentDate = Date.now()
  if (currentDate - startDate >= 1000) {
    // at least a second has passed
    clearInterval(interval)
    return
  }
  // a second has not passed yet
}, 50)
I have recently started exploring Javascript in more detail, and how it executes within the browser. Specifically, the setTimeout function.
My understanding is that calling setTimeout(foo,x)
will pass a handle to foo to be executed after x milliseconds. How reliable is this timing? Obviously if another long-running script is still executing after x milliseconds then the browser won't be able to call foo, but can I be absolutely certain that setTimeout(foo,101) will always be executed after setTimeout(foo,100)?
First of all, the timeout is in milliseconds, therefore 1 s = 1000 ms; keep that in mind.
You can always be sure that a delay of 1001 will fire later than one of 1000.
But you must remember that if the second callback relies on changes made by the first, that ordering alone doesn't guarantee it will work correctly.
The first callback can reasonably take 3 ms to run (even an uncomplicated one), while the second is scheduled only 1 ms after the first starts, so relying on the first having finished will fail.
I would suggest not relying on this feature except in some rare cases.
You can tag me in a comment on this answer with your specific case and I can suggest the right way to work it out.
In Node.js we may be able to time events very precisely by setting a timeout to expire shortly before the desired moment, then creating a tight series of event-loop ticks and, after each one, checking how close we've come to the target time:
- Imagine Date.now() is currently, e.g., 11000 (unrealistic; just an example!)
- Determine we want to act in EXACTLY 4000ms
- Note that means EXACTLY when Date.now() === 15000
- Use setTimeout to wait less than 4000ms, e.g., 3800ms
- Keep awaiting microticks until Date.now() >= 15000
  - This won't block the event loop
  - (But it will keep your CPU very busy)
let preciseWaitMs = async (ms, { nowFn = Date.now, stopShortMs = 200, epsilonMs = 0 } = {}) => {
  let target = nowFn() + ms;
  await new Promise(resolve => setTimeout(resolve, ms - stopShortMs));
  // Allow `epsilonMs` to shift our wait time by a small amount
  target += epsilonMs;
  // Await a huge series of microticks
  // Note: this will eat cpu! Don't set `stopShortMs` too high!
  while (target > nowFn()) await Promise.resolve();
};

(async () => {
  let t1 = Date.now();
  let target = t1 + 2000;
  console.log(`Trying to act in EXACTLY 2000ms; at that point the time should be ${target}`);
  await preciseWaitMs(2000);
  let t2 = Date.now();
  console.log(`Done waiting; time is ${t2}; deviation from target: ${t2 - target}ms (negative means early)`);
})();
Note that preciseWaitMs(4000) will never wait less than 4000ms. This means that, if anything, it is biased towards waiting too long. I added an epsilonMs option to let the user shift that bias back; for example, preciseWaitMs(4000, { epsilonMs: -1 }) may cancel out preciseWaitMs's tendency to always be late.
Note that some js environments provide a higher-precision current-time query than Date.now - for example, nodejs has require('perf_hooks').performance.now. You can supply a function like this using the nowFn option:
{ epsilonMs: -0.01, nowFn: require('perf_hooks').performance.now };
Many browsers support window.performance.now; try supplying:
{ epsilonMs: -0.01, nowFn: window.performance.now };
These settings achieve sub-millisecond "precise" timing; off by only about 0.01ms on average
Note the -0.01 value for epsilonMs is what seemed to work best for my conditions. Note that supplying fractional epsilonMs values is only meaningful if nowFn provides hi-res timestamps (which isn't, for example, the case with Date.now).
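For example, in Node.js a hi-res call might look like this (assuming the preciseWaitMs sketch above is in scope):

// Node.js usage of the preciseWaitMs sketch above with a hi-res clock.
const { performance } = require('perf_hooks');

(async () => {
  const start = performance.now();
  await preciseWaitMs(500, { nowFn: () => performance.now(), epsilonMs: -0.01 });
  console.log(`waited ${(performance.now() - start).toFixed(3)} ms`);
})();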
Most browsers use a single thread for both UI and JavaScript, and that thread is blocked by synchronous calls, so JavaScript execution blocks rendering. Events are processed asynchronously, with the exception of DOM events.
Still, the setTimeout(fn, 1000) trick is very useful. It allows you to:
- let the browser render the current changes (see the sketch below);
- evade the "script is running too long" warning;
- change the execution flow.
Opera is special in many places when it comes to timeouts and threading. Note that because there is only one thread, a timer callback does not run in parallel with other code; if another function is executing, the callback is queued until the thread is free.
One more thing about setTimeout(fn, 1000): the delay is in milliseconds, not seconds.
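The "let the browser render" use usually takes the shape of chunking a long loop with zero-delay timeouts; a minimal sketch (my own illustration):

// Process a large array in chunks, yielding with setTimeout between
// chunks so the browser can render and the page stays responsive.
function processInChunks(items, handle, chunkSize) {
  chunkSize = chunkSize || 100;
  var i = 0;
  function nextChunk() {
    var end = Math.min(i + chunkSize, items.length);
    for (; i < end; i++) handle(items[i]);
    if (i < items.length) setTimeout(nextChunk, 0); // yield to the event loop
  }
  nextChunk();
}

processInChunks([1, 2, 3, 4, 5], function (x) { console.log(x * 2); });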
I'm trying to make a countdown that is counting down in milliseconds; however, the countdown actually takes much longer than 7 seconds. Any idea as to why?
function countDown(time) {
  var i = 0;
  var interval = setInterval(function () {
    i++;
    if (i > time) {
      clearInterval(interval);
    } else {
      // mining
      $('#mining_time').text($('#mining_time').text() - 1);
    }
  }, 1);
}
And I can confirm the variable time passed to the function is correctly set to 7000.
For a mostly-accurate countdown, use setTimeout().
setTimeout(fn, 7e3);
If you absolutely must have it as close to 7 seconds as possible, use a tight poll (requestAnimationFrame()) and look at difference between the time of start and current poll.
var startTime = Date.now();

requestAnimationFrame(function me() {
  var deltaTime = Date.now() - startTime;
  if (deltaTime >= 7e3) {
    fn();
  } else {
    requestAnimationFrame(me);
  }
});
Poly-fill as required.
The most precise way to run something after 7 seconds is to use setTimeout with a 7000 ms delay:
a. No browser guarantees a timer will run with 1 ms resolution; in the best case the resolution is 7-10 ms.
b. There is only one thread in JS, so tasks are queued. That means the next run will be scheduled only after the current run has finished.
Some useful reading: http://ejohn.org/blog/how-javascript-timers-work/
No browser will accept 1 as the setInterval delay; off the top of my head, the minimum is 4 ms.
For an accurate result, get the current time, add 7000 ms, and poll (using setInterval or setTimeout) until you reach that new time.
A quick Web search returned this article that provides an example.
[Update] the value of 4 ms is mentioned on this MDN page.
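Applied to the original countdown, that polling approach might look like this (a sketch; #mining_time and the jQuery usage come from the question):

// Derive the displayed value from elapsed wall-clock time instead of
// counting interval ticks, so timer clamping can't stretch the 7 seconds.
function countDown(timeMs) {
  var startTime = Date.now();
  var interval = setInterval(function () {
    var remaining = timeMs - (Date.now() - startTime);
    if (remaining <= 0) {
      clearInterval(interval);
      $('#mining_time').text(0);
    } else {
      $('#mining_time').text(remaining);
    }
  }, 50); // poll every 50 ms; accuracy comes from the clock, not the tick count
}

countDown(7000);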