Monotonically increasing time in JavaScript?

What’s the best way to get monotonically increasing time in JavaScript? I’m hoping for something like Java’s System.nanoTime().
Date() obviously won’t work, as it’s affected by system time changes.
In other words, what I would like is for a <= b, always:
a = myIncreasingTime.getMilliseconds();
...
// some time later, maybe seconds, maybe days
b = myIncreasingTime.getMilliseconds();
At best, even when using the UTC functions in Date(), it will return what it believes is the correct time, but if someone sets the time backward, the next call to Date() can return a lesser value. System.nanoTime() does not suffer from this limitation (at least not until the system is rebooted).
Edit [2012-02-26; not intended to affect the original question, which has a bounty]: I am not interested in knowing the “wall time”; I’m interested in knowing elapsed time with some accuracy, which Date() cannot possibly provide.

You could use window.performance.now() (available since Firefox 15) or window.performance.webkitNow() (Chrome 20):
var a = window.performance.now();
//...
var delay = window.performance.now() - a;
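If you want a single function covering both (plus older browsers), a small wrapper works. A sketch, assuming only the two vendor variants mentioned above, with a Date.now() fallback that is not monotonic:
var now = (function () {
    var perf = window.performance;
    if (perf && perf.now) {
        return function () { return perf.now(); };
    }
    if (perf && perf.webkitNow) {
        return function () { return perf.webkitNow(); };
    }
    // Fallback: NOT monotonic; see the wrapping approach in the next answer.
    return function () { return Date.now(); };
})();

var a = now();
// ...
var delay = now() - a;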

You could wrap Date() or Date.now() so as to force it to be monotonic (but inaccurate). Sketch, untested:
var offset = 0;
var seen = 0;

function time() {
    var t = Date.now();
    if (t < seen) {
        offset += (seen - t);
    }
    seen = t;
    return t + offset;
}
If the system clock is set back at a given moment, then it will appear that no time has passed (and an elapsed time containing that interval will be incorrect), but you will at least not have negative deltas. If there are no set-backs then this returns the same value as Date.now().
This might be a suitable solution if you're writing a game simulation loop, for example, where time() is called extremely frequently — the maximum error is the number of set-backs times the interval between calls. If your application doesn't naturally do that, you could explicitly call it on a setInterval, say (assuming that isn't hosed by the system clock), to keep your accuracy at the cost of some CPU time.
It is also possible that the clock will be set forward, which does not violate monotonicity but might have equally undesirable effects (e.g. a game spending too long trying to catch up its simulation all at once). However, this is not especially distinguishable from the machine having been asleep for some time. If such a protection is desired, it just means adding a second condition next to the existing one, with a constant threshold for acceptable forward progress:
if (t > seen + leapForwardMaximum) {
    offset += (seen - t) + leapForwardMaximum;
}
I would suggest that leapForwardMaximum should be set to more than 1000 ms because, for example, Chrome (if I recall correctly) throttles timers in background tabs to fire not more than once per second.
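Putting both guards together, a combined sketch (still untested; leapForwardMaximum and the keep-alive interval are illustrative values):
var offset = 0;
var seen = Date.now(); // initialize to now so the first call isn't treated as a leap
var leapForwardMaximum = 2000; // ms; see the note on background-tab throttling above

function time() {
    var t = Date.now();
    if (t < seen) {
        offset += (seen - t); // clock was set back: stand still rather than go negative
    } else if (t > seen + leapForwardMaximum) {
        offset += (seen - t) + leapForwardMaximum; // clamp forward leaps
    }
    seen = t;
    return t + offset;
}

// Optional keep-alive, so the error bound stays small even when the
// application isn't calling time() frequently on its own:
setInterval(time, 500);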

JavaScript itself does not have any functionality to access something like nanoTime. You might load a Java applet to acquire that information, like benchmark.js has done. Maybe #mathias can shed some light on what they did there…

Firefox provides a "lateness" argument to setTimeout callbacks, and this is one way to use it to implement a monotonically increasing time counter:
var time = 0;

setTimeout(function x(actualLateness) {
    setTimeout(x, 0);
    time += actualLateness;
}, 0);

Related

Making a clock that takes 2560ms to complete a revolution in JavaScript

I'm trying to make a clock for a Libet task, a cognitive task used by psychologists. By convention these clocks take 2560 ms to complete a revolution. Mine seems to be running quite a lot slower and I can't figure out why.
I have made an example of the issue here: https://jsfiddle.net/Sumoza/re4v7Lcj/8/
On each iteration of a setInterval with a 1 ms delay, I increase the angle of rotation of the hand by (Math.PI*2)/2560, i.e. 1/2560th of a circle in radians. So, incrementing one of these each ms should complete the circle in that number of ms. As you can see from the clock, this runs a fair bit slower. Can anyone shed any light as to why? Interestingly, *10 seems to look a lot closer to what I want, but that doesn't make much sense to me, so I'm wary of the fix. Relevant JS from the fiddle below. Thanks for any help.
var time = 0;
var rad_tick = (Math.PI*2)/2560; // radians per tick: last number is ms per revolution

// Draw Clock
const clock = document.getElementById('clock')
const clock_ctx = clock.getContext('2d')
clock_ctx.translate(clock.width/2, clock.height/2);
radius = (clock.height/2) * 0.7;

function drawClock() {
    clock_ctx.clearRect(-clock.width/2, -clock.height/2, clock.width, clock.height);
    clock.style.backgroundColor = 'transparent';
    clock_ctx.beginPath();
    clock_ctx.arc(0, 0, radius, 0, 2 * Math.PI);
    clock_ctx.stroke();
}
drawClock();

// Draw Hand
const hand = document.getElementById('hand')
const hand_ctx = hand.getContext('2d')
hand_ctx.translate(hand.width/2, hand.height/2);

function drawHand(time){
    hand_ctx.clearRect(-hand.width/2, -hand.height/2, hand.width, hand.height);
    hand_ctx.beginPath();
    hand_ctx.moveTo(0, 0);
    hand_ctx.rotate(time);
    hand_ctx.lineTo(0, -radius*0.9);
    hand_ctx.stroke();
    hand_ctx.rotate(-time);
}

function drawTime(){
    time += rad_tick; //*10
    drawHand(time)
}

var intervalId = setInterval(drawTime, 1);
Try this
var starttime = new Date().getTime();

function drawTime() {
    var elapsed = new Date().getTime() - starttime;
    var time = elapsed % 2560;
    time = time * rad_tick;
    console.log(elapsed + "->" + time);
    drawHand(time);
}
Base things on the current time, and don't depend on setInterval to fire at exactly the interval you set. If there are things in the queue at the moment setInterval is due to fire, the browser works through those first and only then runs the function you've wired to setInterval, causing the delays you see (and they build up more and more over time). Learn about the event loop and requestAnimationFrame.
Instead, get the time freshly before doing something, and calculate how much time has passed since the start time. Base your animations on these calculations. This way, even if there are slight delays, your last action will always sync things to the way they're supposed to look right now.
When you do it this way, you might lose some frames (skipping things the engine didn't have time to do), but the animation will likely complete much closer to the expected completion time.
The bottom line is: if you tell the browser to do more than it can in a particular amount of time, there will be delays. However, you can skip frames of an animation to make it complete much closer to the expected completion time.
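For the Libet clock specifically, that might look like the following sketch (assuming drawHand and rad_tick from the question; requestAnimationFrame drives the redraws while the angle comes purely from elapsed wall time):
var startTime = new Date().getTime();

function tick() {
    var elapsed = new Date().getTime() - startTime;
    drawHand((elapsed % 2560) * rad_tick); // angle derived from real elapsed time
    requestAnimationFrame(tick);           // redraw once per display frame
}

requestAnimationFrame(tick);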
You ask setInterval to call your callback every 1ms, but does it actually call it every 1ms?
Try this code:
let cnt = 0;
setInterval(() => cnt++, 1);
setTimeout(() => { console.log(cnt); }, 1000);
For me, it prints something between 230 and 235, so the actual (average) minimum interval here is a bit over 4 ms (1000 / 233 ≈ 4.3 ms).
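You can estimate the effective interval directly by dividing the elapsed time by the count; a small extension of the same snippet:
let cnt = 0;
const t0 = Date.now();
setInterval(() => cnt++, 1);
setTimeout(() => {
    const elapsed = Date.now() - t0;
    console.log('average interval: ' + (elapsed / cnt).toFixed(2) + ' ms');
}, 1000);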
Adding on to Stig's answer: since JavaScript isn't super fast when it comes to these things, the tick count won't be super accurate. Using something like getTime() will allow proper timing: call getTime() in your drawTime function, so that on (roughly) every millisecond it checks the actual time and uses that value.

Measuring page load timings using JavaScript

I have created a script in JavaScript that is injected into our Ext JS application during automated browser testing. The script measures the amount of time taken to load the data in our grids.
Specifically, the script polls each grid, looks to see if there is a first row or a 'no data' message, and once all grids have satisfied this condition the script records the difference between Date.now() and performance.timing.fetchStart, and treats this as the time the page took to load.
This script works more or less as expected, however when compared with human measured timings (Google stopwatch ftw), the time reported by this test is consistently around 300 milliseconds longer than when measured by stopwatch.
My questions are these:
Is there a hole in this logic that would lead to incorrect results?
Are there any alternative and accurate ways to achieve this measurement?
The script is as follows:
function loadPoll() {
    var i, duration,
        dataRow = '.firstRow', noDataRow = '.noData',
        grids = ['.grid1', '.grid2', '.grid3', '.grid4', '.grid5', '.grid6', '.grid7'];

    for (i = 0; i < grids.length; ++i) {
        var data = grids[i] + ' ' + dataRow,
            noData = grids[i] + ' ' + noDataRow;
        if (!(document.querySelector(data) || document.querySelector(noData))) {
            window.setTimeout(loadPoll, 100);
            return;
        }
    }
    duration = Date.now() - performance.timing.fetchStart;
    window.loadTime = duration;
}
loadPoll();
Some considerations:
- Although I am aware that human response time can be slow, I am sure that the 300 millisecond inconsistency is not introduced by the human factor of using Google stopwatch.
- Looking at the code it might appear that the polling of multiple elements could lead to the 300 ms inconsistency, however when I change the number of elements being monitored from 7 to 1, there still appears to be a 300 ms surplus in the time reported by the automated test.
- Our automated tests are executed in a framework controlled by Selenium and Protractor.
Thanks in advance if you are able to provide any insight to this!
If you use performance.now() the time should be accurate to 5 microseconds. According to MDN:
"The performance.now() method returns a DOMHighResTimeStamp, measured in milliseconds, accurate to five thousandths of a millisecond (5 microseconds). The returned value represents the time elapsed since the time origin (the PerformanceTiming.navigationStart property)."
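Since performance.now() is already measured from the time origin, the end of the script in the question could simply become the following sketch. Note the time origin is navigationStart rather than the fetchStart used in the question; the two are usually only a few milliseconds apart:
// ... same polling loop as in the question ...
duration = performance.now(); // elapsed since navigationStart
window.loadTime = duration;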
If I were you I would revise the approach to how the actual measuring of the time is captured. Rather than evaluating the time for each loadPoll() call, you can evaluate how many calls you can perform in a given period of time. In other words, count the number of function iterations over a longer period, e.g. 1000 milliseconds. Here's how this can be done:
var timeout = 1000;
var startTime = new Date().getTime();
var elapsedTime = 0;

for (var iterations = 0; elapsedTime < timeout; iterations++) {
    loadPoll();
    elapsedTime = new Date().getTime() - startTime;
}

// output the number of achieved iterations
console.log(iterations);
This approach will give you more consistent and accurate time estimates. Faster systems will simply achieve a greater number of iterations. Keep in mind that setInterval()/setTimeout() are not perfectly precise and for really small interval timers these functions may give you invalid results due to garbage collection, demands from events and many other things that can run in parallel while your code is being executed.

Understanding JavaScript performance variance

http://jsfiddle.net/6L2pJ/
var test = function () {
    var i,
        a,
        startTime;

    startTime = new Date().getTime();
    for (i = 0; i < 3000000000; i = i + 1) {
        a = i % 5;
    }
    console.log(a); // prevent dead code elimination
    return new Date().getTime() - startTime;
};

var results = [];
for (var i = 0; i < 5; i = i + 1) {
    results.push(test());
}

for (var i = 0; i < results.length; i = i + 1) {
    console.log('Time needed: ' + results[i] + 'ms');
}
Results in:
First execution:
Time needed: 13654ms
Time needed: 32192ms
Time needed: 33167ms
Time needed: 33587ms
Time needed: 33630ms
Second execution:
Time needed: 14004ms
Time needed: 32965ms
Time needed: 33705ms
Time needed: 33923ms
Time needed: 33727ms
Third execution:
Time needed: 13124ms
Time needed: 30706ms
Time needed: 31555ms
Time needed: 32275ms
Time needed: 32752ms
What is the reason for the jump from first to second row?
My setup:
Ubuntu 13.10
Google Chrome 36.0.1985.125 (Mozilla Firefox 30.0 giving same kind of results)
EDIT:
I modified the code, leaving it semantically the same but inlining everything. Interestingly, this not only speeds up the execution significantly, it also removes the phenomenon described above to a great extent. A slight jump is still noticeable, though.
Modified code:
http://jsfiddle.net/cay69/
Results:
First execution:
Time needed: 13786ms
Time needed: 14402ms
Time needed: 14261ms
Time needed: 14355ms
Time needed: 14444ms
Second execution:
Time needed: 13778ms
Time needed: 14293ms
Time needed: 14236ms
Time needed: 14459ms
Time needed: 14728ms
Third execution:
Time needed: 13639ms
Time needed: 14375ms
Time needed: 13824ms
Time needed: 14125ms
Time needed: 14081ms
After a bit of testing, I think I have pinpointed what may be causing the difference. It must have something to do with types, I think.
var i,
    a = 0,
    startTime;
var a = 0 gives me a uniform result with overall faster performance; on the other hand, var a = "0" gives me the same result as yours: the first run is somewhat faster.
I have no clue why this happens.
The following is only a pseudo-answer, which I hope may become updated by the community. Originally, it was going to be a comment, but it became too lengthy, too quickly and thus needed to be posted as an answer.
Interesting findings
In running a few tests, I couldn't find any correlation to console.log being used. Testing in OSX Safari, I found that the problem existed with and without printing to the console.
What I did notice was a pattern. There was an inflection point as I approached 2147483648 (2^31) from your initial starting value. This most likely depends on the user's environment, but I found an inflection point around 2147485000 (try numbers above and below; 2147430000..2147490000). Somewhere around this number is where the timings became more uniform.
I was really hoping it would be exactly 2^31, since that number is also significant in computer terms: it is where a signed 32-bit integer overflows. However, my tests resulted in a number slightly more than that (for reasons unknown at this point). Other than making sure the swap file wasn't being used, I didn't do any other memory analysis.
EDIT from asker:
On my setup it actually is exactly 2^31 where the jump occurs. I tested it by playing around with the following code:
http://jsfiddle.net/8w24v/
This information may support Derek's initialization observation.
This is just a thought and might be a stretch:
LLVM or something else may be performing some up-front optimizations. Perhaps the loop variable starts out as an int, and after a pass or two the optimizer notices that the variable becomes a long. In trying to optimize, it sets it up front as a long, only in this case that isn't an optimization that saves time, since working with a regular integer performs better than paying the int-to-long conversion cost.
I wouldn't be surprised if the answer was somewhere in the ECMAScript documentation :)
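A quick way to probe that boundary yourself, in the spirit of the tests above (a sketch, not the linked fiddle; the loop bounds are illustrative):
function timeLoop(start, count) {
    var a, t0 = Date.now();
    for (var i = start; i < start + count; i = i + 1) {
        a = i % 5;
    }
    console.log(a); // prevent dead code elimination, as above
    return Date.now() - t0;
}

var boundary = Math.pow(2, 31);
console.log('below 2^31: ' + timeLoop(boundary - 200000000, 100000000) + 'ms');
console.log('above 2^31: ' + timeLoop(boundary + 100000000, 100000000) + 'ms');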
It appears that Google Chrome is breaking up your script execution into chunks and giving processing time to other processes. It's not noticeable until a single call takes around 600 ms or more. I tested with a smaller subset of data (300000000, if I remember correctly).

Milliseconds out of sync

I am developing a stopwatch application using JavaScript/jQuery. The problem is that the milliseconds value is out of sync with REAL milliseconds. I am using setInterval() with an interval of 1 millisecond, yet the problem persists.
jsFiddle: http://jsfiddle.net/FLv3s/
Please help!
Use setInterval to trigger updates, but use the system time (via new Date()) for the actual time calculations.
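A minimal sketch of that combination (the display element and the formatting are illustrative):
var start = new Date().getTime();

setInterval(function () {
    var elapsed = new Date().getTime() - start; // real elapsed milliseconds
    document.getElementById('display').textContent = (elapsed / 1000).toFixed(3) + ' s';
}, 16);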
To be honest, I tried nearly the same thing you are doing now (creating an accurate metronome in JavaScript only). To make a long story short: being absolutely accurate in terms of milliseconds (or lower) is sadly not (yet) possible with JavaScript alone.
For more insight i recommend this question: Does JavaScript provide a high resolution timer?
or to be more precise this blog article: http://ejohn.org/blog/how-javascript-timers-work/
Best regards,
Dominik
Program execution in any language, not just JavaScript, is not realtime. It will take a tiny amount of time to actually run the code to increment your counter and update the view, and that throws the "timing" off.
Additionally, many browsers have a "minimum timeout" length, which varies between 4 ms and about 16 ms (the latter being the computer's own clock-timer interval), and that will really mess with your code.
Instead, you should use delta timing.
var startTime = new Date().getTime();

setInterval(function() {
    var elapsed = new Date().getTime() - startTime;
    // update view according to elapsed time
}, 25);
If you're worried about it looking choppy, consider using requestAnimationFrame instead, to update the timer exactly once per frame - this has the added benefit of not updating when the user switches tabs (but it will still be timing them) because if the tab is not active then there's no need to redraw stuff.
You can use the new performance object to get a more accurate time. Combine this with requestAnimationFrame instead of setInterval:
var startTime = performance.now(),
    currentTime,
    isRunning = true;

loop();

function loop(timeElapsed) {
    currentTime = performance.now();
    if (isRunning) requestAnimationFrame(loop);
}
Just subtract startTime from currentTime to get the elapsed time.
I left in the timeElapsed parameter, which contains the timestamp that rAF hands to the callback; you may be able to use it for something as well (or just ignore it).
One note, though: not all browsers support this yet (and some may use a prefix), so for mobile you need to fall back to the standard system time object.

JavaScript Duration of animation isn't exact

First of all, I want to mention two things:
One: My code isn't perfect (especially the eval parts), but I wanted to try something for myself and see if I could duplicate the jQuery animation function, so please forgive my "bad" practices, and please don't suggest that I use jQuery; I wanted to experiment.
Two: This code isn't done yet; I just wanted to figure out what makes it work badly.
So: the animation runs for about 12 seconds while the duration parameter I entered was 15 seconds. What am I doing wrong?
function animate(elem, attr, duration){
    if (attr.constructor === Object) { // check for object literal
        var i = 0;
        var cssProp = [];
        var cssValue = [];
        for (var key in attr) {
            cssProp[i] = key;
            cssValue[i] = attr[key];
            i++; // advance the index for the next property
        }
        var fps = (1000 / 60);
        var t = setInterval(function(){
            for (var j = 0; j < cssProp.length; j++) {
                if (document.getElementById(elem).style[cssProp[j]].length == 0) {
                    // assign a basic value in css if the element doesn't have one.
                    document.getElementById(elem).style[cssProp[j]] = 0;
                }
                var c = document.getElementById(elem).style[cssProp[j]];
                //console.log(str + " | " + c + "|" + cssValue[j]);
                if (c > cssValue[j]) {
                    document.getElementById(elem).style[cssProp[j]] -= 1/((duration/fps)*(c-cssValue[j]));
                } else if (c < cssValue[j]) {
                    document.getElementById(elem).style[cssProp[j]] += 1/((duration/fps)*(c-cssValue[j]));
                } else if (c == cssValue[j]) {
                    window.clearInterval(t);
                }
            }
        }, fps);
    }
}

animate('hello', {opacity:0}, 15000);
html:
<p id="hello" style="opacity:1;">Hello World</p>
Note: I guess there is a problem with the
(duration/fps)*(c-cssValue[j])
part, and/or with the interval of the setInterval (the fps variable).
Thanks in advance.
I'm not gonna try and refactor that and figure it out, cause it's pretty wonky. That said... a few things.
Don't rely on the value you are animating to let you know animation progress
In general your approach is unsound. You are better off keeping track of progress yourself. Also, as a result of your approach your math seems like it's trying too hard, and should be much simpler.
Think of it like this: your animation is complete when the time has elapsed, not when the animated value seems to indicate that it's at the final position.
Don't increment, set
Floating point math is inexact, and repeated cumulative addition like this is going to accumulate floating-point error as well. It's also far more readable to create some variables that keep track of progress for you, which you can use in calculations.
animatedValue += changeOnThisFrame // BAD!
animatedValue = valueOnThisFrame // GOOD!
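A quick illustration of the drift (0.1 has no exact binary representation, so the error compounds with every addition):
var accumulated = 0;
for (var i = 0; i < 1000; i++) {
    accumulated += 0.1;   // accumulating a per-frame change
}
console.log(accumulated); // 99.9999999999986 on a typical engine (drifted)
console.log(0.1 * 1000);  // 100 (computed directly from progress)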
Don't do the positive/negative conditional dance
It turns out that 10 + 10 and 10 - (-10) are really the same thing. That means you can always add the values, but the rate of change can be negative or positive, and the value will animate in the appropriate direction.
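In code, that just means computing a signed delta once instead of branching on direction (the names mirror the example further down):
var delta = endValue - startValue;           // may be negative; that's fine
var value = startValue + (progress * delta); // moves the right way either way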
timeouts and intervals aren't exact
Turns out setTimeout(fn, 50) actually schedules fn to be called at least 50 ms later. The next JS run loop to execute after those 50 ms will run the function, so you can't rely on it to be perfectly accurate.
That said, it's usually within a few milliseconds. But 60 fps is about 16 ms per frame, and that timer may actually fire anywhere from 16 to 22 ms later. So when you do calculations based on frame rate, they don't closely match the actual time elapsed.
Refactor complex math
Deconstructing this line here is gonna be hard.
document.getElementById(elem).style[cssProp[j]] -= 1/((duration/fps)*(c-cssValue[j]));
When the math is complex, break it up so you can easily understand what's going on. Refactoring this line alone, I might do this:
var style = document.getElementById(elem).style;
var changeThisFrame = duration/fps;
var someOddCalculatedValue = c-cssValue[j];
style[cssProp[j]] -= 1 / (changeThisFrame * someOddCalculatedValue);
Doing this makes it clearer what each expression in your math means and what it's for. And because you didn't do it here, I had a very hard time wondering why c-cssValue[j] was in there and what it represents.
Simple Example
This is less capable than what you have, but it shows the approach you should be taking. It uses the animation start time to create the perfect value, depending on how complete the animation should be, where it started, and where it's going. It doesn't use the current animated value to determine anything, and is guaranteed to run the full length of the animation.
var anim = function(elem, duration) {
    // save when we started for calculating progress
    var startedAt = Date.now();

    // set animation bounds
    var startValue = 10;
    var endValue = 200;

    // figure out how much change we have over the whole animation
    var delta = endValue - startValue;

    // Animation function, to run at 60 fps.
    var t = setInterval(function(){
        // How far are we into the animation, on a scale of 0 to 1.
        var progress = (Date.now() - startedAt) / duration;

        // If we passed 1, the animation is over so clean up.
        if (progress > 1) {
            alert('DONE! Elapsed: ' + (Date.now() - startedAt) + 'ms');
            clearInterval(t);
        }

        // Set the real value.
        elem.style.top = startValue + (progress * delta) + "px";
    }, 1000 / 60);
};

anim(document.getElementById('foo'), 5000);
JSFiddle: http://jsfiddle.net/DSRst/
You cannot use setInterval for accurate total timing. Because JS is single threaded and multiple things compete for cycles on the one thread, there is no guarantee that the next interval call will be exactly on time or that N intervals will consume the exact duration of time.
Instead, pretty much all animation routines get the current time and use the system clock to measure time for the total duration. The general algorithm is to get the start time, calculate a desired finish time (starttime + duration). Then, as you've done, calculate the expected step value and number of iterations. Then, upon each step, you recalculate the remaining time left and the remaining step value. In this way, you ensure that the animation always finishes exactly (or nearly exactly) on time and that you always get exactly to the final position. If the animation gets behind the ideal trajectory, then it will self correct and move slightly more for the remaining steps. If it gets ahead for any reason (rounding errors, etc...), it will dial back the step size and likewise arrive at the final position on time.
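In code, that general algorithm might look like this sketch, applied to the opacity animation from the question (the tween helper and its names are illustrative, not jQuery's API):
function tween(duration, onStep) {
    var start = Date.now();
    var t = setInterval(function () {
        // Recompute progress from the clock on every step, so the animation
        // self-corrects whether it has fallen behind or crept ahead.
        var progress = (Date.now() - start) / duration;
        if (progress >= 1) {
            onStep(1); // land exactly on the final position, exactly on time
            clearInterval(t);
            return;
        }
        onStep(progress);
    }, 1000 / 60);
}

tween(15000, function (progress) {
    document.getElementById('hello').style.opacity = 1 - progress;
});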
You may also need to know that browsers don't always support very small timing amounts. Each browser has some sort of minimum time that they will allow for a timer operation. Here's an article on minimum timer levels.
Here's an article on tweening (the process of continually recalculating the step to fit the duration exactly).
I'd also suggest that you look at the code for doing animation in some libraries (jQuery, YUI or any other one you find) as they can all show you how this is done in a very general purpose way, including tweening, easing functions, etc...
