I'm writing an application in Node.js that needs to schedule functions to run at specific times, sometimes hours or days in the future. Currently, I'm doing it with something similar to this:
const now = Date.now();
const later = getSomeFutureTimestamp();
setTimeout(function() {
    // do something
}, later - now);
I'm wondering if setTimeout is the proper tool for this job, or if there is a tool better suited to long intervals. One thing to note: I don't need great precision; if the job runs within a few minutes of the scheduled time, everything is fine.
setTimeout should be fine. It's only significantly delayed if blocking code happens to be running at the moment it's due; in practice it's typically on the order of 20 milliseconds late. Since your margin is minutes, I don't think it'll be an issue.
However, I'd suggest storing the timestamps at which things should trigger and checking them periodically. That way you only have one timer running at any given time.
You also get to keep your timestamps as absolute values rather than relative "milliseconds in the future" values, which means you can persist them across server restarts; you can't do that as easily with relative times. With your code, the only record of a job being queued is the running timer itself. If that timer disappears for any reason, you lose all record of the job having been scheduled.
Something like:
function checkForScheduledJobs() {
    var now = Date.now();
    // assuming here that the jobs array is sorted earliest to latest
    while (jobs.length && jobs[0].timestamp <= now) {
        jobs.shift().callback();
    }
    setTimeout(checkForScheduledJobs, 60000); // check each minute
}
You just need to kick it off once. This could be done in an addScheduledJob function, which would also keep the jobs array sorted after adding something to it; a sketch follows.
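A minimal sketch of what addScheduledJob could look like, assuming a module-level jobs array and a checkerStarted flag (both names are illustrative, not part of the original code):

var jobs = [];
var checkerStarted = false;

function addScheduledJob(timestamp, callback) {
    jobs.push({ timestamp: timestamp, callback: callback });
    // keep the array sorted earliest to latest, as checkForScheduledJobs assumes
    jobs.sort(function (a, b) { return a.timestamp - b.timestamp; });
    if (!checkerStarted) { // kick off the periodic check exactly once
        checkerStarted = true;
        checkForScheduledJobs();
    }
}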
Related
I got this code over here:
var date = new Date();
setTimeout(function(e) {
    var currentDate = new Date();
    if (currentDate - date >= 1000) {
        console.log(currentDate, date);
        console.log(currentDate - date);
    } else {
        console.log("It was less than a second!");
        console.log(currentDate - date);
    }
}, 1000);
On my computer, it always executes correctly, logging 1000 to the console. Interestingly, on another computer, running the same code, the timeout callback starts in less than a second, and the difference currentDate - date is between 980 and 998.
I know there are libraries that work around this inaccuracy (for example, Tock).
Basically, my question is: what are the reasons why setTimeout does not fire after exactly the given delay? Could it be that the computer is too slow, and the browser automatically tries to adapt to the slowness and fires the event early?
It's not supposed to be particularly accurate. There are a number of factors limiting how soon the browser can execute the code; quoting from MDN:
In addition to "clamping", the timeout can also fire later when the page (or the OS/browser itself) is busy with other tasks.
In other words, the way setTimeout is usually implemented, it is only meant to execute after the given delay, once the browser's thread is free to execute it.
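As an aside, the "clamping" mentioned in the MDN quote is easy to observe. A small illustrative snippet (the 4 ms floor is what the HTML spec imposes on deeply nested timeouts; the exact gaps vary by browser):

var prev = Date.now(), count = 0;
(function tick() {
    var now = Date.now();
    console.log("gap:", now - prev); // early gaps near 0, later ones clamped to 4 ms or more
    prev = now;
    if (++count < 10) setTimeout(tick, 0);
})();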
However, different browsers may implement it in different ways. Here are some tests I did:
var date = new Date();
setTimeout(function(e) {
    var currentDate = new Date();
    console.log(currentDate - date);
}, 1000);
// Browser Test1 Test2 Test3 Test4
// Chrome 998 1014 998 998
// Firefox 1000 1001 1047 1000
// IE 11 1006 1013 1007 1005
Perhaps the < 1000 times from Chrome can be attributed to inaccuracy in the Date type, or perhaps Chrome uses a different strategy for deciding when to execute the code: maybe it's trying to fit it into the nearest time slot, even if the timeout delay hasn't quite completed yet.
In short, you shouldn't use setTimeout if you expect reliable, consistent, millisecond-scale timing.
In general, computer programs are highly unreliable when trying to execute anything with better than roughly 50 ms precision. The reason is that even on an octa-core, hyper-threaded processor, the OS is usually juggling several hundred processes and threads, sometimes thousands or more. The OS makes all that multitasking work by scheduling each of them to get a slice of CPU time in turn, meaning each gets at most a few milliseconds at a time to do its thing.
Implicitly, this means that if you set a timeout for 1000 ms, the chances are far from small that the browser process won't even be running at that point in time, so it's perfectly normal for the browser not to notice until 1005, 1010 or even 1050 milliseconds have passed that it should be executing the given callback.
Usually this is not a problem; it happens, and it's rarely of the utmost importance. If it is, all operating systems supply kernel-level timers that are far more precise than 1 ms and allow a developer to execute code at precisely the correct point in time. JavaScript, however, as a heavily sandboxed environment, doesn't have access to kernel objects like that, and browsers refrain from using them, since it could theoretically allow someone to attack OS stability from inside a web page by carefully constructing code that starves other threads by swamping the system with dangerous timers.
As for why the test yields 980, I'm not sure; that would depend on exactly which browser and JavaScript engine you're using. I could, however, fully understand it if the browser deliberately corrects a bit downwards for system load and/or speed, ensuring that on average the delay is still about right. It would make a lot of sense, from the sandboxing principle, to merely approximate the amount of time required rather than potentially burden the rest of the system.
Someone please correct me if I am misinterpreting this information:
According to a post from John Resig regarding the inaccuracy of performance tests across platforms (emphasis mine):
With the system times constantly being rounded down to the last queried time (each about 15 ms apart) the quality of performance results is seriously compromised.
So there is up to a 15 ms fudge on either end when comparing to the system time.
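You can probe that rounding yourself with a tight loop that records how far apart consecutive distinct Date.now() readings are; on systems with a coarse clock the gaps cluster around the timer resolution (historically about 15 ms on Windows). A rough illustrative sketch:

var last = Date.now(), gaps = [];
while (gaps.length < 10) {
    var now = Date.now();
    if (now !== last) {
        gaps.push(now - last); // gap between consecutive distinct clock readings
        last = now;
    }
}
console.log(gaps);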
I had a similar experience.
I was using something like this:
var iMillSecondsTillNextWholeSecond = (1000 - (new Date().getTime() % 1000));

setTimeout(function () {
    CountDownClock(ElementID, RelativeTime);
}, iMillSecondsTillNextWholeSecond); // wait until the next whole second to start
I noticed it would skip a second every couple of seconds, and sometimes it would go on longer.
However, I'd still catch it skipping after 10 or 20 seconds, and it just looked rickety.
I thought, "Maybe the timeout is too slow, or waiting for something else?"
Then I realized, "Maybe it's too fast, and the timers the browser is managing are off by a few milliseconds?"
After adding +1 millisecond to my variable, I only saw it skip once.
I ended up adding +50 ms, just to be on the safe side.
var iMillSecondsTillNextWholeSecond = (1000 - (new Date().getTime() % 1000) + 50);
I know it's a bit hacky, but my timer is running smoothly now. :)
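An alternative to a fixed fudge factor, sketched here using the same names as the snippet above (CountDownClock, ElementID, RelativeTime), is to recompute the delay to the next whole second on every tick, so errors never accumulate:

function scheduleNextTick() {
    // recompute the distance to the next whole second each time
    var msToNextSecond = 1000 - (Date.now() % 1000);
    setTimeout(function () {
        CountDownClock(ElementID, RelativeTime);
        scheduleNextTick();
    }, msToNextSecond + 5); // small cushion so we land just past the second boundary
}
scheduleNextTick();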
JavaScript has a way of getting closer to exact time frames. Here's one approach:
You can save Date.now() when you start waiting, create an interval with a short update period, and calculate the difference between the two timestamps.
Example:
const startDate = Date.now()
const intervalId = setInterval(() => {
    const currentDate = Date.now()
    if (currentDate - startDate >= 1000) { // >= rather than ===, since a tick may not land exactly on 1000
        // at least a second has passed
        clearInterval(intervalId) // clearInterval needs the id returned by setInterval
        return
    }
    // it was not a second yet
}, 50)
I am developing a stopwatch application using JavaScript/jQuery. The problem is that the milliseconds value drifts out of sync with real milliseconds. I am using setInterval() with an interval of 1 millisecond, yet the problem persists.
jsFiddle: http://jsfiddle.net/FLv3s/
Please help!
Use setInterval to trigger updates, but use the system time (via new Date()) for the actual time calculations.
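A minimal sketch of that idea (the console.log stands in for whatever display update your app does):

var stopwatchStart = Date.now();
setInterval(function () {
    var elapsedMs = Date.now() - stopwatchStart; // derived from the clock, so interval drift doesn't accumulate
    var seconds = Math.floor(elapsedMs / 1000);
    var millis = elapsedMs % 1000;
    console.log(seconds + "." + ("00" + millis).slice(-3));
}, 16); // ~60 updates per second is plenty for a display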
To be honest, I tried nearly the same thing you're doing now (Creating an accurate Metronome in Javascript only). To make a long story short: being absolutely accurate in terms of milliseconds (or below) is sadly not (yet) possible with JavaScript alone.
For more insight I recommend this question: Does JavaScript provide a high resolution timer?
Or, to be more precise, this blog article: http://ejohn.org/blog/how-javascript-timers-work/
Best regards,
Dominik
Program execution in any language, not just JavaScript, is not realtime. It takes a tiny amount of time to actually run the code that increments your counter and updates the view, and that throws the timing off.
Additionally, many browsers have a minimum timeout length, which varies between 4 ms and about 16 ms (the latter being the resolution of the computer's own clock timer), and that will really mess with your code.
Instead, you should use delta timing.
var startTime = new Date().getTime();

setInterval(function() {
    var elapsed = new Date().getTime() - startTime;
    // update view according to elapsed time
}, 25);
If you're worried about it looking choppy, consider using requestAnimationFrame instead, to update the timer exactly once per frame. This has the added benefit of not updating when the user switches tabs (though it will still be timing them), because if the tab is not active there's no need to redraw anything.
You can use the new performance object to get a more accurate time. Combine this with requestAnimationFrame instead of setInterval:
var startTime = performance.now(),
    currentTime,
    isRunning = true;

loop();

function loop(timeElapsed) {
    currentTime = performance.now();
    // currentTime - startTime is the elapsed time to render
    if (isRunning) requestAnimationFrame(loop);
}
Just subtract startTime from currentTime.
I left in timeElapsed, which contains the timestamp that rAF hands to the callback; you may be able to use it for something as well (or just ignore it).
One note though: not all browsers support this yet (and some may use a prefix), so as a fallback you need to use the standard system Date object.
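If you want a single code path, a small illustrative fallback helper could look like this (the name now is my own, not part of any API):

var now = (typeof performance !== "undefined" && performance.now)
    ? function () { return performance.now(); } // high-resolution clock where available
    : function () { return Date.now(); };       // millisecond Date fallback elsewhere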
Is it possible to set a function to start at a given date and hour? How?
I thought about setTimeout, but what's the maximum time I can set?
--update
By the way, it's for a desktop application.
I agree with JCOC611: if you can make sure that your application does not close, then just get a Date object for when your alarm should go off and do something like this:
window.setTimeout(function() { soundAlarm() },
alarmDate.getTime() - new Date().getTime());
I see no reason for this not to work, but a lot of people advocate a timer-based solution, where a short-lived timer ticks until the set time. It has the advantage that the timer function can also update a clock or a countdown. I like to write this pattern like this:
(function tick(targetDate) { // named function expression instead of the deprecated arguments.callee
    if (targetDate.getTime() <= new Date().getTime()) {
        soundAlarm();
        return;
    }
    // maybe update a time display here?
    window.setTimeout(tick, 1000, targetDate); // tick every second
})(alarmDate);
This is basically a function that, when called with a target date to sound an alarm on, re-calls itself every second to check whether the time has elapsed yet.
setTimeout(functionToCall, delayToWait)
As stated in Why does setTimeout() "break" for large millisecond delay values?, browsers use a 32-bit signed int to store the delay, so the maximum value allowed is 2147483647 ms, or roughly 24.8 days.
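If you genuinely need a longer delay, one common workaround is to chain timeouts. A sketch (setLongTimeout is a hypothetical helper, not a built-in):

var MAX_DELAY = 2147483647; // 2^31 - 1 ms, the largest delay setTimeout accepts

function setLongTimeout(callback, delayMs) {
    if (delayMs > MAX_DELAY) {
        // sleep in maximum-sized chunks until the remainder fits in one timeout
        setTimeout(function () {
            setLongTimeout(callback, delayMs - MAX_DELAY);
        }, MAX_DELAY);
    } else {
        setTimeout(callback, delayMs);
    }
}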
Does setTimeout() have a maximum?
http://www.highdots.com/forums/javascript/settimeout-ecma-166425.html
It may surprise you that setTimeout is not covered by an ECMA standard, nor by a W3C standard. There are some holes in the web standards. This is one of them. I'm looking to the WHAT Working Group to fix this. See http://www.whatwg.org/specs/web-apps/current-work/
There doesn't seem to be a problem in setting the timeout value to something that is vastly greater than the MTBF of the browser. All that means is that the timeout may never fire.
http://javascript.crockford.com/
-Douglas Crockford
As others have mentioned, this isn't the way to handle the situation. Use setTimeout to check a Date object periodically and fire the event at the appropriate time. Some code to play with is linked below.
http://www.w3schools.com/js/tryit.asp?filename=tryjs_timing_clock
You should not rely on setTimeout for the actual alarm trigger, but rather for a periodic function that tracks the alarm. Use setTimeout to check the stored alarm time, say, every minute. Store that time in a DB, a file, or on a server.
Is there any server component to this at all? You could use setInterval to call something server-side on a regular basis via Ajax, pull back a date object, and once it's finally in the past, trigger your "alarm".
Is there any way to create a variable-speed timer in the browser that will give exactly the same results across all operating systems and browsers? For example, I want 140 beats per minute for every user, regardless of their computer's speed.
I've been using JavaScript's setTimeout() and setInterval(), but I think they are dependent on the speed of the computer and the amount of code in the program.
How do I incorporate the system clock into a browser? Or any other ideas?
You'll have to use setTimeout or setInterval in your solution, but it will be inaccurate for the following reasons:
Browsers have a minimum timeout, which is NOT 0ms. The cross-browser minimum is somewhere around 14ms.
Timers are inexact. They represent queuing time, not execution time. If something else is executing when your timer fires, your code gets pushed to a queue to wait, and may not actually execute until much later.
You're probably going to want to use setTimeout along with manual tracking of the current time (using Date) to step your program. For your case, try something like this:
function someAction(delta) {
    // ...
}

function beat() {
    var currentTime = +new Date;
    var delta = currentTime - pastTime;
    if (delta > 430) { // 430 ms ~ 140 bpm
        pastTime = currentTime;
        someAction(delta); // pass the measured delta along to the handler
    }
    setTimeout(beat, 107); // 4x resolution
}

var pastTime = +new Date;
beat();
This should approximate 140 beats per minute, using a higher polling resolution to avoid large delays. This is just a sample, though; you'll probably need to work at it more to get it to perform optimally for your application. One way to keep it from drifting is sketched below.
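A drift-free variant, offered as a hedged sketch (it reuses someAction from above and schedules each beat against an absolute target time, so late timers don't accumulate error):

var BEAT_MS = 60000 / 140; // ~428.6 ms per beat at 140 bpm
var nextBeat = Date.now() + BEAT_MS;

function metronomeTick() {
    var now = Date.now();
    while (now >= nextBeat) { // catch up if the timer fired late
        someAction(now - (nextBeat - BEAT_MS)); // delta ~ time since the previous beat was due
        nextBeat += BEAT_MS;
    }
    setTimeout(metronomeTick, nextBeat - Date.now());
}
metronomeTick();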
The best you can use is setInterval(), or try to derive something from Date().
Note that the timing won't be exact, I think because of JavaScript's single-threaded nature.
The setTimeout() and setInterval() functions are pretty much the best you're going to get. The timeout parameters to these functions are specified in milliseconds, and are not dependent on the overall speed of the computer running the browser.
However, these functions are certainly not hard-real-time functions, and if the browser is off busy doing something else at the time your timeout or interval expires, there might be a slight delay before your callback function is actually called.