Calculating FPS in JavaScript with less than 1 millisecond resolution - javascript

Is it possible to measure time gaps of less than 1 millisecond in a way that is supported in all browsers? I know of only one way, which is Chrome-only.
The Chrome method: window.performance.now()
Currently I do FPS measurements in millisecond time spans, but if less than 1 ms passes I get Infinity, because the two numbers are rounded to the nearest millisecond and so have the same value.
Does anyone know a cross-browser function to measure time gaps of less than 1 millisecond in JavaScript?

Here's how you get accurate measurements without an accurate timer, so long as what you're timing occurs often, which I'm hoping it does in your case.
Average/aggregate the imprecise measurements of the duration of your event. A snippet out of one of my projects:
var start = Date.now();
// ... (stuff to be timed)
var now = Date.now();
if (DEBUG.enabled) {
    var profile_this_iter = now - start;
    // nudge the running average toward the new sample ('profile' is declared elsewhere)
    profile += (profile_this_iter - profile) * 0.02;
}
Each new measurement nudges your reading toward it by a factor of 0.02. Obviously you'll want to tweak that a bit. This will allow you to read an average that hovers around 0.5 ms if you read a duration of 1 ms half the time and 0 ms half the time (with a 1 ms resolution timer).
This is obviously not a replacement for a proper higher-resolution timer, but I use this simple algorithm to give my JavaScript projects a non-crappy FPS reading. You get a damping factor that you can tweak depending on whether you want more accuracy or a more immediate response to changes. Considering its simplicity, you'd be hard pressed to find a more elegant algorithm to provide a good representation without any improved input data. One way to enhance it would be to adjust the approach factor (that 0.02 constant) based on the frequency of sampling itself (if that changes); this way a slower measured rate could be made to converge more quickly than it would with a fixed value.
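For illustration, here's a minimal sketch of that last idea (my own names and constants, assuming a requestAnimationFrame loop): deriving the blend factor from the elapsed time lets slower sample rates converge about as fast as faster ones.
var lastTime = Date.now();
var avgFrameMs = 16; // initial guess; converges within a second or so

function tick() {
    var now = Date.now();
    var frameMs = now - lastTime; // imprecise: whole milliseconds only
    lastTime = now;
    // time-constant form of the 0.02 nudge: tau controls accuracy vs. responsiveness
    var tau = 500; // ms
    var alpha = 1 - Math.exp(-frameMs / tau);
    avgFrameMs += (frameMs - avgFrameMs) * alpha;
    var fps = 1000 / avgFrameMs; // read this wherever you display the counter
    requestAnimationFrame(tick);
}
requestAnimationFrame(tick);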

There is actually another way to calculate the FPS, which may work around this issue: count the actual number of frames in a second, which should be quite accurate, I think.
var fpsStart = new Date().getTime();
var fpsCounting = 0;
var fps = 0;
start_the_first_frame();

// Loop
function update() {
    do_time_consuming_stuff();
    fpsCounting++;
    var thisFrame = new Date().getTime();
    if (thisFrame - fpsStart >= 1000) {
        fpsStart += 1000;
        fps = fpsCounting;
        fpsCounting = 0;
    }
    request_next_animation_frame();
}
P.S. just typed right here, not tested, may require slight changes.
I remember seeing a way like this in an LWJGL tutorial...
Also, as noted by @StevenLu, you can modify it to count the number of frames in 0.5 second and multiply the "fps" by two, or use an even shorter time (e.g. 0.25 second) so that the fps value updates more frequently.
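A sketch of that shorter-window variant (same caveat as above: typed here, not tested; do_time_consuming_stuff and request_next_animation_frame are the question's placeholders):
var fpsStart = new Date().getTime();
var fpsCounting = 0;
var fps = 0;
var windowMs = 250; // count frames over a quarter second

function update() {
    do_time_consuming_stuff();
    fpsCounting++;
    var thisFrame = new Date().getTime();
    if (thisFrame - fpsStart >= windowMs) {
        fpsStart += windowMs;
        fps = fpsCounting * (1000 / windowMs); // scale the count up to per-second
        fpsCounting = 0;
    }
    request_next_animation_frame();
}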

High resolution time is available in Chrome 20, but you should be aware that timer resolution in JS depends on the browser, device, and circumstances. It might vary between 4 ms and 1000+ ms.
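For the cross-browser part of the question, a shim along these lines was common at the time. This is only a sketch: it falls back to millisecond resolution where nothing better exists, and method availability varies by browser version.
// Prefer a high-resolution timer where one exists, otherwise fall back
// to millisecond-resolution Date.now().
var now = (function () {
    var perf = window.performance;
    if (perf && perf.now) return function () { return perf.now(); };
    if (perf && perf.webkitNow) return function () { return perf.webkitNow(); };
    return function () { return Date.now(); };
})();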

Related

Why does millisecond counter keep going in requestAnimationFrame()?

I'm making a game in JavaScript, using the great requestAnimationFrame(callback) function.
Today I learned that the callback passed to requestAnimationFrame() gets a high resolution time measured since we opened that page. Let's call it ms:
function paint(ms) {
    // draw my game
    requestAnimationFrame(paint);
}
requestAnimationFrame(paint);
It's all great, but there's one thing I don't quite get.
Function requestAnimationFrame() doesn't do anything when we go to another tab, so the rendering is paused. On the other hand, the time passed to the callback keeps advancing while we're away. Because of this, I'm not sure how I would make any use of that value. If it ran in step with the rendering engine, I could use ms to calculate the logical time of my game, because relying on requestAnimationFrame() as a rock-stable 60 FPS doesn't sound like the greatest idea.
Am I missing something? What's the purpose of the ms parameter if it continues to count when we leave our tab?
That's just a timestamp; it doesn't really count anything. It's indeed generally the same as performance.now(), which gives the amount of time elapsed since the page became active. As to why, that's just how the DOMHighResTimeStamp's origin has been defined.
We usually use it as a way to know how much time has elapsed since some prior event occurred: we store a start_time, then check the current timestamp, and we get the delta time of our animation regardless of the actual frame-rate.
By the way, no, relying on requestAnimationFrame to run at any fixed frame-rate is really not a good idea. requestAnimationFrame is not tied to any frame-rate by spec, and it is actually recommended to align it with the screen refresh-rate (though only Blink does so).
So you actually need to rely on such a timestamp in order to have a consistent speed across different setups. Doing a simple pos++ at every frame will make your animation run twice as fast on a 120Hz monitor as on a 60Hz one (at least in Chrome).
This only makes it harder to see how your idea could be implemented: would your paint-timer run twice as fast when the window is moved to a 120Hz monitor?
This timestamp may also help to catch up on long frames: just because the system had a hiccup doesn't mean you necessarily want your animation to last longer.
Similarly, not all cases want the animation logic to stop when the window is blurred. Say I have a header with a background animation powered by rAF; I don't really want it to pause when it goes off-screen, even though it won't be painted.
If you want your game logic to pause when the window is blurred, listen for the visibilitychange event and save the current timestamp (using performance.now(), since we're outside of rAF).
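For example, a minimal sketch of that suggestion (the pause bookkeeping is mine; rAF timestamps and performance.now() share the same time origin):
var pausedAt = 0;
var totalPaused = 0;

document.addEventListener("visibilitychange", function () {
    if (document.hidden) {
        pausedAt = performance.now(); // we're outside rAF, so use performance.now()
    } else {
        totalPaused += performance.now() - pausedAt; // discount the hidden span
    }
});

function paint(ms) {
    var gameTime = ms - totalPaused; // logical time that ignores hidden periods
    // ... update and draw using gameTime ...
    requestAnimationFrame(paint);
}
requestAnimationFrame(paint);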
To add to Kaiido's answer: it's also common in apps to limit the time delta. A common way to do framerate-independent calculations is to compute the time between frames:
let previousTime = 0;
function loop(currentTime) {
    const deltaTime = currentTime - previousTime;
    previousTime = currentTime;
    // use deltaTime in various calculations
    posX = posX + velX * deltaTime;
    requestAnimationFrame(loop);
}
requestAnimationFrame(loop);
But sometimes collision or other math can break if deltaTime is too large, so games will often add a limiter:
// don't let deltaTime be more than 1/10 of a second
const deltaTime = Math.min(currentTime - previousTime, 1000 / 10);
This mostly removes the need to check visibilitychange, especially if you keep your own animation/game clock.
let previousTime = 0;
let clock = 0;
let clockRate = 1;
function loop(currentTime) {
    const deltaTime = Math.min(currentTime - previousTime, 1000 / 10) * clockRate;
    previousTime = currentTime;
    clock += deltaTime; // update our own clock
    requestAnimationFrame(loop);
}
requestAnimationFrame(loop);
Now not only will our clock not jump if the player hides the tab, but we can also set clockRate to 0 if we want things to pause.
Why use your own clock? It lets you easily slow down or speed up time for all calculations that depend on that clock (see clockRate above). Also, many apps (games) have multiple clocks and compute a different deltaTime for each: for example, one clock for objects that need to pause vs. one that keeps going even while the app/game is paused. Another might be a clock for the player vs. one for the enemies, so that when the player uses their "slow time super power" the enemies' motions slow down while the player's stay at full speed.
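A sketch of that multiple-clock setup (clock names and rates are invented for illustration):
let previousTime = 0;
let playerClock = 0, playerRate = 1; // the player stays at full speed
let enemyClock = 0, enemyRate = 1;   // drops below 1 during the "slow time" power

function loop(currentTime) {
    const deltaTime = Math.min(currentTime - previousTime, 1000 / 10);
    previousTime = currentTime;
    // each clock advances at its own rate from the same raw delta
    playerClock += deltaTime * playerRate;
    enemyClock += deltaTime * enemyRate;
    requestAnimationFrame(loop);
}
requestAnimationFrame(loop);

// activating the power: enemies run at quarter speed, player is untouched
// enemyRate = 0.25;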

Measuring page load timings using JavaScript

I have created a script in JavaScript that is injected into our Ext JS application during automated browser testing. The script measures the amount of time taken to load the data in our grids.
Specifically, the script polls each grid, looks to see if there is a first row or a 'no data' message, and once all grids have satisfied this condition the script records the difference between Date.now() and performance.timing.fetchStart, treating this as the time the page took to load.
This script works more or less as expected; however, when compared with human-measured timings (Google stopwatch ftw), the time reported by this test is consistently around 300 milliseconds longer than when measured by stopwatch.
My questions are these:
• Is there a hole in this logic that would lead to incorrect results?
• Are there any alternative and accurate ways to achieve this measurement?
The script is as follows:
function loadPoll() {
    var i, duration,
        dataRow = '.firstRow', noDataRow = '.noData',
        grids = ['.grid1', '.grid2', '.grid3', '.grid4', '.grid5', '.grid6', '.grid7'];
    for (i = 0; i < grids.length; ++i) {
        var data = grids[i] + ' ' + dataRow,
            noData = grids[i] + ' ' + noDataRow;
        if (!(document.querySelector(data) || document.querySelector(noData))) {
            window.setTimeout(loadPoll, 100);
            return;
        }
    }
    duration = Date.now() - performance.timing.fetchStart;
    window.loadTime = duration;
}
loadPoll();
Some considerations:
• Although I am aware that human response time can be slow, I am sure that the 300 millisecond inconsistency is not introduced by the human factor of using Google stopwatch.
• Looking at the code it might appear that the polling of multiple elements could lead to the 300 ms inconsistency; however, when I change the number of elements being monitored from 7 to 1, there still appears to be a 300 ms surplus in the time reported by the automated test.
• Our automated tests are executed in a framework controlled by Selenium and Protractor.
Thanks in advance if you are able to provide any insight into this!
If you use performance.now() the time should be accurate to 5 microseconds. According to MDN:
The performance.now() method returns a DOMHighResTimeStamp, measured in milliseconds, accurate to five thousandths of a millisecond (5 microseconds). The returned value represents the time elapsed since the time origin (the PerformanceTiming.navigationStart property).
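Applied to the asker's script, the substitution might look like the following sketch. One caveat to verify: performance.now() counts from navigationStart, which is close to, but not identical to, fetchStart.
// In loadPoll(), replace
//     duration = Date.now() - performance.timing.fetchStart;
// with the high-resolution clock, which already counts from the time origin:
duration = performance.now(); // ms since navigationStart, sub-millisecond resolution
window.loadTime = duration;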
If I were you I would revise the approach to how the actual measurement of time is captured. Rather than evaluating the time for each loadPoll() call, you can evaluate how many calls you can perform in a given period of time. In other words, you can count the number of function iterations over a longer period, e.g. 1000 milliseconds. Here's how this can be done:
var timeout = 1000;
var startTime = new Date().getTime();
var elapsedTime = 0;
for (var iterations = 0; elapsedTime < timeout; iterations++) {
    loadPoll();
    elapsedTime = new Date().getTime() - startTime;
}
// output the number of achieved iterations
console.log(iterations);
This approach will give you more consistent and accurate time estimates: faster systems will simply achieve a greater number of iterations. Keep in mind that setInterval()/setTimeout() are not perfectly precise, and for really small timer intervals these functions may give you invalid results due to garbage collection, demands from events, and many other things that can run in parallel while your code is being executed.

get a smooth animation for a canvas game

How can I get better animation, dynamically, even when the browser is busy or idle, across devices with different hardware capacity?
I have tried many ways and still cannot find the right way to make the game display a smooth animation.
This is what I tried:
var now;
var then = Date.now();
var delta;
window.gamedraw = function() {
    now = Date.now();
    delta = now - then;
    if (delta > 18) {
        then = now - (delta % 18);
        game_update();
    }
};
window.gameloop = setInterval(window.gamedraw, 1);
18 is the interval value at which the game updates, but when the browser is busy this interval is not good and it needs to be lower. How can I get better animation dynamically, even when the browser is idle or busy?
I suppose that the interval value is the problem: if the interval is lower, the game animation is very fast; if the value is 18, the animation is good, but not when the browser is busy, and I have no idea how to change it dynamically.
To get a smooth animation, you must:
• Synchronize on the screen.
• Compute the time elapsed within your game.
• Animate only using this game time.
Synchronizing with the screen is done by using requestAnimationFrame (rAF).
When you write:
requestAnimationFrame(myCallback);
you are registering myCallback to be called once, the next time the screen is available to draw on.
(If you know about double buffering (canvases are always double-buffered), this time is the next time the GPU will swap the draw buffer with the display buffer.)
If, on the other hand, you don't use rAF but an interval/timeout to schedule the draws, the draws won't actually get displayed until the next display refresh, which might happen (on a 60Hz display) anywhere from right now to 16.6 ms later.
With a 20 ms interval and a 16 ms screen, the images actually displayed will be alternately 16 ms apart OR 2*16 ms apart, never 20, for sure. You just can't know, from the interval callback, when the actual draw will show. Since neither rAF nor intervals are 100% accurate, you can even get a three-frame delta.
So now that you are in sync with the screen, here's the bad news: requestAnimationFrame does not tick exactly regularly. For various reasons the elapsed time between two frames might change by 1 ms or 2, or even more. So if you move by a fixed amount each frame, you'll move the same distance during a different time: the speed is always changing.
(Example: +10 px on each rAF, 16.6 ms display --> rAF intervals of 14, 15, 16 or 17 ms --> the apparent speed varies from 0.58 to 0.71 px/ms.)
The answer is to measure time... and use it!
Fortunately, requestAnimationFrame provides you with the current time, so you don't even have to use Date.now(). A secondary benefit is that this time will be very accurate on browsers with an accurate timer (desktop Chrome).
The code below shows how you can know the time elapsed since the last frame and compute an application time:
var lastTime = 0;
var applicationTime = 0;

function animate(time) {
    // register to be called again on next frame.
    requestAnimationFrame(animate);
    // compute time elapsed since last frame.
    var dt = time - lastTime;
    if (dt < 10) return;
    if (dt > 100) dt = 16; // consider only 1 frame elapsed if unfocused.
    lastTime = time;
    applicationTime += dt;
    // step the game, then render
    update(dt);
    draw();
}

requestAnimationFrame(animate);
Now the last step is to always use time inside all your formulas. So instead of doing
x += xSpeed;
you'll have to do:
x += dt * xSpeed;
Now not only will the small variations between frames be taken into account, but your game will also run at the same apparent speed whatever the device's screen (20Hz, 50Hz, 60Hz, ...).
I made a small demo where you can choose both the sync method and whether to use a fixed time step, so you can judge the differences:
http://jsbin.com/wesaremune/1/
You should use requestAnimationFrame instead of setInterval.
Read this article or this one.
The latter one is in fact exactly discussing your approach (with the delta time), and then introduces the animation frame as a more reliable alternative.
The first article is really a great resource. For starters, with your interval of 18 ms you're apparently aiming for something close to 60 fps. This is in fact the default for requestAnimationFrame, so you don't need to write anything special:
function gamedraw() {
    requestAnimationFrame(gamedraw); // self-reference
    game_update(); // your update logic; probably needs to handle time intervals internally
}
gamedraw(); // this starts the animation
If you want to set the update interval explicitly, you can do so by wrapping the requestAnimationFrame call inside a setTimeout, like this:
var interval = 18;
function gamedraw() {
    setTimeout(function() {
        requestAnimationFrame(gamedraw);
        game_update(); // must handle time difference internally
    }, interval);
}
gamedraw();
Note that the game_update() function must keep track of when it was last called in order to e.g. move everything twice as far as normal in case a frame had to be skipped.
Actually, this means you could (and probably should) refactor your game_update() function to take the time that has actually passed as an argument, instead of determining that internally. (There's no functional difference; it's just better, clearer code IMO because it doesn't hide the timing magic.)
var time;
function gamedraw() {
    requestAnimationFrame(gamedraw);
    var now = new Date().getTime(),
        dt = now - (time || now);
    time = now; // reset the timer
    game_update(dt); // update with explicit and variable time step
}
gamedraw();
(Here I dropped the explicit frames again.)
Still, I urge you to read the first article because it also deals with cross-browser issues that I haven't gotten into here.
You should use requestAnimationFrame. It will queue up a callback to run the next time the browser renders a frame. To achieve constant updating, have the update function schedule itself again:
var update = function() {
    // Do stuff
    requestAnimationFrame(update);
};

JavaScript Duration of animation isn't exact

First of all I want to mention two things,
One: my code isn't perfect (especially the eval parts), but I wanted to try something for myself and see if I could duplicate the jQuery animation function, so please forgive my "bad" practices, and please don't suggest that I use jQuery; I wanted to experiment.
Two: this code isn't done yet, and I just wanted to figure out what makes it work badly.
So the animation runs for about 12 seconds while the duration parameter I entered was 15 seconds. What am I doing wrong?
function animate(elem, attr, duration) {
    if (attr.constructor === Object) { // check for object literal
        var i = 0;
        var cssProp = [];
        var cssValue = [];
        for (key in attr) {
            cssProp[i] = key;
            cssValue[i] = attr[key];
        }
        var fps = (1000 / 60);
        var t = setInterval(function() {
            for (var j = 0; j < cssProp.length; j++) {
                if (document.getElementById(elem).style[cssProp[j]].length == 0) {
                    // assign a basic value in css if the object doesn't have one.
                    document.getElementById(elem).style[cssProp[j]] = 0;
                }
                var c = document.getElementById(elem).style[cssProp[j]];
                //console.log(str + " | " + c + "|" + cssValue[j]);
                if (c > cssValue[j]) {
                    document.getElementById(elem).style[cssProp[j]] -= 1 / ((duration / fps) * (c - cssValue[j]));
                } else if (c < cssValue[j]) {
                    document.getElementById(elem).style[cssProp[j]] += 1 / ((duration / fps) * (c - cssValue[j]));
                } else if (c == cssValue[j]) {
                    window.clearInterval(t);
                }
            }
        }, fps);
    }
}
animate('hello', {opacity: 0}, 15000);
html:
<p id="hello" style="opacity:1;">Hello World</p>
Note: I guess there is a problem with the
(duration/fps)*(c-cssValue[j])
part, and/or with the setInterval delay (the fps variable).
Thanks in advance.
I'm not gonna try and refactor that and figure it out, cause it's pretty wonky. That said... a few things.
Don't rely on the value you are animating to let you know animation progress
In general your approach is unsound; you are better off keeping track of progress yourself. Also, as a result of your approach, your math seems like it's trying too hard and should be much simpler.
Think of it like this: your animation is complete when the time has elapsed, not when the animated value seems to indicate that it's at the final position.
Don't increment, set
Floating point math is inexact, and repeated cumulative addition like this will accumulate floating point errors as well. It's also far more readable to make variables that keep track of progress for you, which you can use in calculations.
animatedValue += changeOnThisFrame // BAD!
animatedValue = valueOnThisFrame // GOOD!
Don't do the positive/negative conditional dance
It turns out that 10 + 10 and 10 - (-10) are really the same thing. That means you can always add the values; the rate of change can be negative or positive, and the value will animate in the appropriate direction.
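In code, the two branches collapse into one (a tiny sketch; variable names are mine):
// the rate carries the sign, so a single formula moves in either direction
var rate = (endValue - startValue) / duration; // may be negative
var value = startValue + rate * elapsed;       // no if/else needed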
timeouts and intervals aren't exact
Turns out setTimeout(fn, 50) actually means to schedule fn to be called at least 50 ms later. The next JS run loop to execute after those 50 ms will run the function, so you can't rely on it being perfectly accurate.
That said, it's usually within a few milliseconds. But 60 fps is about 16 ms per frame, and that timer may actually fire after anywhere from 16 to 22 ms. So when you do calculations based on the frame rate, they won't closely match the actual time elapsed.
Refactor complex math
Deconstructing this line here is gonna be hard.
document.getElementById(elem).style[cssProp[j]] -= 1/((duration/fps)*(c-cssValue[j]));
When math gets more complex, break it up so you can easily understand what's going on. Refactoring this line alone, I might do this:
var style = document.getElementById(elem).style;
var changeThisFrame = duration / fps;
var someOddCalculatedValue = c - cssValue[j];
style[cssProp[j]] -= 1 / (changeThisFrame * someOddCalculatedValue);
Doing this makes it clearer what each expression in your math means and what it's for. Because you didn't do it here, I had a very hard time working out why c-cssValue[j] was in there and what it represents.
Simple Example
This is less capable than what you have, but it shows the approach you should take. It uses the animation start time to compute the exact value, depending on how complete the animation should be, where it started, and where it's going. It doesn't use the current animated value to determine anything, and it is guaranteed to run the full length of the animation.
var anim = function(elem, duration) {
    // save when we started for calculating progress
    var startedAt = Date.now();
    // set animation bounds
    var startValue = 10;
    var endValue = 200;
    // figure out how much change we have over the whole animation
    var delta = endValue - startValue;
    // Animation function, to run at 60 fps.
    var t = setInterval(function() {
        // How far are we into the animation, on a scale of 0 to 1.
        var progress = (Date.now() - startedAt) / duration;
        // If we passed 1, the animation is over so clean up.
        if (progress > 1) {
            alert('DONE! Elapsed: ' + (Date.now() - startedAt) + 'ms');
            clearInterval(t);
        }
        // Set the real value.
        elem.style.top = startValue + (progress * delta) + "px";
    }, 1000 / 60);
};
anim(document.getElementById('foo'), 5000);
JSFiddle: http://jsfiddle.net/DSRst/
You cannot use setInterval for accurate total timing, because JS is single-threaded and multiple things compete for cycles on the one thread; there is no guarantee that the next interval call will fire exactly on time or that N intervals will consume exactly the intended duration.
Instead, pretty much all animation routines get the current time and use the system clock to measure the total duration. The general algorithm is to get the start time and calculate a desired finish time (startTime + duration). Then, as you've done, calculate the expected step value and number of iterations. Upon each step, you recalculate the remaining time and the remaining step value. This way, you ensure that the animation always finishes exactly (or nearly exactly) on time and that you always arrive exactly at the final position. If the animation falls behind the ideal trajectory, it will self-correct and move slightly more on the remaining steps. If it gets ahead for any reason (rounding errors, etc...), it will dial back the step size and likewise arrive at the final position on time.
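Here's a minimal sketch of that self-correcting loop (names and numbers are illustrative, not from the question):
var current = 0, target = 100;      // animated value and its destination
var stepMs = 16;                    // nominal tick length
var finishTime = Date.now() + 5000; // desired end of the animation

function step() {
    var remainingTime = finishTime - Date.now(); // recomputed on every tick
    if (remainingTime <= 0) {
        current = target; // land exactly on the final value, on time
        return;
    }
    // spread the remaining change over the ticks that are left, so any drift
    // (late timers, rounding errors) is absorbed by the steps that follow
    var ticksLeft = Math.max(1, remainingTime / stepMs);
    current += (target - current) / ticksLeft;
    setTimeout(step, stepMs);
}
step();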
You may also need to know that browsers don't always support very small timing amounts. Each browser has some sort of minimum time that it will allow for a timer operation. Here's an article on minimum timer levels.
Here's an article on tweening (the process of continually recalculating the step to fit the duration exactly).
I'd also suggest that you look at the code for doing animation in some libraries (jQuery, YUI or any other one you find) as they can all show you how this is done in a very general purpose way, including tweening, easing functions, etc...

Monotonically increasing time in JavaScript?

What’s the best way to get monotonically increasing time in JavaScript? I’m hoping for something like Java’s System.nanoTime().
Date() obviously won’t work, as it’s affected by system time changes.
In other words, what I would like is for a <= b, always:
a = myIncreasingTime.getMilliseconds();
...
// some time later, maybe seconds, maybe days
b = myIncreasingTime.getMilliseconds();
At best, even when using the UTC functions in Date(), it will return what it believes is the correct time, but if someone sets the time backward, the next call to Date() can return a lesser value. System.nanoTime() does not suffer from this limitation (at least not until the system is rebooted).
Modification: [2012-02-26: not intended to affect the original question, which has a bounty]
I am not interested in knowing the “wall time”; I’m interested in knowing the elapsed time with some accuracy, which Date() cannot possibly provide.
You could use window.performance.now() (since Firefox 15) or window.performance.webkitNow() (since Chrome 20):
var a = window.performance.now();
//...
var delay = window.performance.now() - a;
You could wrap Date() or Date.now() so as to force it to be monotonic (but inaccurate). Sketch, untested:
var offset = 0;
var seen = 0;

function time() {
    var t = Date.now();
    if (t < seen) {
        offset += (seen - t);
    }
    seen = t;
    return t + offset;
}
If the system clock is set back at a given moment, then it will appear that no time has passed (and an elapsed time containing that interval will be incorrect), but you will at least not have negative deltas. If there are no set-backs then this returns the same value as Date.now().
This might be a suitable solution if you're writing a game simulation loop, for example, where time() is called extremely frequently — the maximum error is the number of set-backs times the interval between calls. If your application doesn't naturally do that, you could explicitly call it on a setInterval, say (assuming that isn't hosed by the system clock), to keep your accuracy at the cost of some CPU time.
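For instance, under the assumption that intervals keep firing in your environment:
// Poll the wrapper once a second so clock set-backs are noticed promptly,
// even while the application isn't calling time() on its own.
setInterval(time, 1000);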
It is also possible that the clock will be set forward, which does not prevent monotonicity but might have equally undesirable effects (e.g. a game spending too long trying to catch up its simulation at once). However, this is not especially distinguishable from the machine having been asleep for some time. If such a protection is desired, it just means adding a second condition next to the existing one, with a constant threshold for acceptable progress:
if (t > seen + leapForwardMaximum) {
    offset += (seen - t) + leapForwardMaximum;
}
I would suggest that leapForwardMaximum should be set to more than 1000 ms because, for example, Chrome (if I recall correctly) throttles timers in background tabs to fire not more than once per second.
JavaScript itself does not have any functionality to access nanosecond time. You might load a Java applet to acquire that information, like benchmark.js has done. Maybe @mathias can shed some light on what they did there…
Firefox passes a "lateness" argument to setTimeout callbacks (how late the timeout actually fired)...
This is one way to implement a monotonically increasing time counter:
var time = 0;
setTimeout(function x(actualLateness) {
    setTimeout(x, 0);
    time += actualLateness;
}, 0);
