Measuring page load timings using JavaScript

I have created a script in JavaScript that is injected into our Ext JS application during automated browser testing. The script measures the amount of time taken to load the data in our grids.
Specifically, the script polls each grid, looks to see if there is a first row or a 'no data' message, and once all grids have satisfied this condition the script records the difference between Date.now() and performance.timing.fetchStart, and treats this as the time the page took to load.
This script works more or less as expected; however, when compared with human-measured timings (Google stopwatch ftw), the time reported by this test is consistently around 300 milliseconds longer than when measured by stopwatch.
My questions are these:
Is there a hole in this logic that would lead to incorrect results?
Are there any alternative and accurate ways to achieve this measurement?
The script is as follows:
function loadPoll() {
    var i, duration,
        dataRow = '.firstRow', noDataRow = '.noData',
        grids = ['.grid1', '.grid2', '.grid3', '.grid4', '.grid5', '.grid6', '.grid7'];
    for (i = 0; i < grids.length; ++i) {
        var data = grids[i] + ' ' + dataRow,
            noData = grids[i] + ' ' + noDataRow;
        if (!(document.querySelector(data) || document.querySelector(noData))) {
            window.setTimeout(loadPoll, 100);
            return;
        }
    }
    duration = Date.now() - performance.timing.fetchStart;
    window.loadTime = duration;
}
loadPoll();
Some considerations:
Although I am aware that human response time can be slow, I am sure that the 300 millisecond discrepancy is not introduced by the human factor of using Google stopwatch.
Looking at the code it might appear that the polling of multiple elements could lead to the 300 ms discrepancy; however, when I change the number of elements being monitored from 7 to 1, there still appears to be a 300 ms surplus in the time reported by the automated test.
Our automated tests are executed in a framework controlled by Selenium and Protractor.
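For reference, the test harness reads the recorded value back roughly like this (a simplified, illustrative sketch rather than our exact code; loadPollSource and the 30 second timeout are placeholders):
// inject the polling script, then wait for window.loadTime to appear and read it back
browser.executeScript(loadPollSource); // loadPollSource: the script above, as a string (placeholder)
browser.wait(function () {
    return browser.executeScript('return window.loadTime;').then(function (t) {
        return typeof t === 'number';
    });
}, 30000);
browser.executeScript('return window.loadTime;').then(function (loadTime) {
    console.log('Measured load time: ' + loadTime + ' ms');
});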
Thanks in advance if you are able to provide any insight into this!

If you use performance.now() the time should be accurate to 5 microseconds. According to MDN:
The performance.now() method returns a DOMHighResTimeStamp, measured in milliseconds, accurate to five thousandths of a millisecond (5 microseconds).
The returned value represents the time elapsed since the time origin (the PerformanceTiming.navigationStart property).
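As a rough sketch of that suggestion, the poller from the question could record performance.now() directly instead of Date.now() - performance.timing.fetchStart (note that this measures from the time origin, navigationStart, rather than from fetchStart, so the baseline shifts slightly):
function loadPoll() {
    var i,
        dataRow = '.firstRow', noDataRow = '.noData',
        grids = ['.grid1', '.grid2', '.grid3', '.grid4', '.grid5', '.grid6', '.grid7'];
    for (i = 0; i < grids.length; ++i) {
        if (!(document.querySelector(grids[i] + ' ' + dataRow) ||
              document.querySelector(grids[i] + ' ' + noDataRow))) {
            window.setTimeout(loadPoll, 100);
            return;
        }
    }
    // elapsed time since the time origin, with sub-millisecond resolution
    window.loadTime = performance.now();
}
loadPoll();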

If I were you, I would revise the approach to how the time measurement is actually captured. Rather than evaluating the time for each loadPoll() call, you can evaluate how many calls you can perform in a given period of time. In other words, you can count the number of function iterations over a longer period of time, e.g. 1000 milliseconds. Here's how this can be done:
var timeout = 1000;
var startTime = new Date().getTime();
var elapsedTime = 0;
for (var iterations = 0; elapsedTime < timeout; iterations++) {
    loadPoll();
    elapsedTime = new Date().getTime() - startTime;
}
// output the number of achieved iterations
console.log(iterations);
This approach will give you more consistent and accurate time estimates. Faster systems will simply achieve a greater number of iterations. Keep in mind that setInterval()/setTimeout() are not perfectly precise, and for very short timer intervals these functions may give you invalid results due to garbage collection, demands from events and many other things that can run in parallel while your code is being executed.

Related

Building your own Timer in React [duplicate]


Understanding JavaScript performance variance

http://jsfiddle.net/6L2pJ/
var test = function () {
    var i,
        a,
        startTime;
    startTime = new Date().getTime();
    for (i = 0; i < 3000000000; i = i + 1) {
        a = i % 5;
    }
    console.log(a); // prevent dead code elimination
    return new Date().getTime() - startTime;
};

var results = [];
for (var i = 0; i < 5; i = i + 1) {
    results.push(test());
}
for (var i = 0; i < results.length; i = i + 1) {
    console.log('Time needed: ' + results[i] + 'ms');
}
Results in:
First execution:
Time needed: 13654ms
Time needed: 32192ms
Time needed: 33167ms
Time needed: 33587ms
Time needed: 33630ms
Second execution:
Time needed: 14004ms
Time needed: 32965ms
Time needed: 33705ms
Time needed: 33923ms
Time needed: 33727ms
Third execution:
Time needed: 13124ms
Time needed: 30706ms
Time needed: 31555ms
Time needed: 32275ms
Time needed: 32752ms
What is the reason for the jump from first to second row?
My setup:
Ubuntu 13.10
Google Chrome 36.0.1985.125 (Mozilla Firefox 30.0 giving same kind of results)
EDIT:
I modified the code, leaving it semantically the same but inlining everything. Interestingly, it not only speeds up the execution significantly but also removes, to a great extent, the phenomenon I described above. A slight jump is still noticeable, though.
Modified code:
http://jsfiddle.net/cay69/
Results:
First execution:
Time needed: 13786ms
Time needed: 14402ms
Time needed: 14261ms
Time needed: 14355ms
Time needed: 14444ms
Second execution:
Time needed: 13778ms
Time needed: 14293ms
Time needed: 14236ms
Time needed: 14459ms
Time needed: 14728ms
Third execution:
Time needed: 13639ms
Time needed: 14375ms
Time needed: 13824ms
Time needed: 14125ms
Time needed: 14081ms
After a bit of testing, I think I have pinpointed what may be causing the difference. It must have something to do with types, I think.
var i,
a = 0,
startTime;
var a = 0 gives me a uniform result with overall faster performance; on the other hand, var a = "0" gives me the same result as yours: the first run is somewhat faster.
I have no clue why this happens.
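For reference, the comparison I ran looks roughly like this (the same loop as in the question, with only the initialiser of a changing between the two series):
function test(initial) {
    var a = initial,
        startTime = new Date().getTime();
    for (var i = 0; i < 3000000000; i = i + 1) {
        a = i % 5;
    }
    console.log(a); // prevent dead code elimination
    return new Date().getTime() - startTime;
}

function runSeries(label, initial) {
    for (var run = 0; run < 5; run = run + 1) {
        console.log(label + ' run ' + (run + 1) + ': ' + test(initial) + 'ms');
    }
}

runSeries('a = 0  ', 0);
runSeries('a = "0"', '0');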
The following is only a pseudo-answer, which I hope may become updated by the community. Originally, it was going to be a comment, but it became too lengthy, too quickly and thus needed to be posted as an answer.
Interesting/Findings
In running a few tests, I couldn't find any correlation to console.log being used. Testing in OSX Safari, I found that the problem existed with and without printing to the console.
What I did notice was a pattern. There was an inflection point as I approached 2147483648 (2^31) from your initial starting value. This most likely depends on the user's environment, but I found an inflection point around 2147485000 (try numbers above and below; 2147430000..2147490000). Somewhere around this number is where the timings became more uniform.
I was really hoping it would be 2^31 exactly, since that number is also significant in computer terms; it is the upper bound of a signed 32-bit integer. However, my tests resulted in a number slightly more than that (for reasons unknown at this point). Other than making sure the swap file wasn't being used, I didn't do any other memory analysis.
EDIT from asker:
On my setup, the jump actually occurs at exactly 2^31. I tested it by playing around with the following code:
http://jsfiddle.net/8w24v/
This information may support Derek's initialization observation.
This is just a thought and might be a stretch:
LLVM, or something else, may be performing some up-front optimizations. Perhaps the loop variable starts out as an int, and after a pass or two the optimizer notices the variable becomes a long. In trying to optimize, it sets it as a long up front, only in this case that is not an optimization that saves time, since working with a regular integer performs better than paying the int-to-long conversion cost up front.
I wouldn't be surprised if the answer was somewhere in the ECMAScript documentation :)
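A rough sketch of the kind of boundary probe described above (this is not the linked fiddle; the limits are illustrative and can be moved around 2^31 to find the inflection point on a given machine):
function timeLoop(limit) {
    var a,
        start = new Date().getTime();
    for (var i = 0; i < limit; i = i + 1) {
        a = i % 5;
    }
    console.log(a); // prevent dead code elimination
    return new Date().getTime() - start;
}

console.log('just below 2^31: ' + timeLoop(Math.pow(2, 31) - 1000) + 'ms');
console.log('just above 2^31: ' + timeLoop(Math.pow(2, 31) + 1000) + 'ms');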
It appears that Google Chrome is breaking up your script execution into chunks and giving processing time to other processes. It's not noticeable until your execution hits around 600 ms per function call. I tested with a smaller subset of data (300000000, if I remember correctly).
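To make that time-slicing visible, and to keep the page responsive while measuring, the same work can be split into chunks explicitly; a rough sketch with arbitrary chunk sizes:
function chunkedTest(total, chunkSize, done) {
    var a,
        i = 0,
        start = new Date().getTime();
    function runChunk() {
        var end = Math.min(i + chunkSize, total);
        for (; i < end; i = i + 1) {
            a = i % 5;
        }
        if (i < total) {
            setTimeout(runChunk, 0); // yield to the browser between chunks
        } else {
            console.log(a); // prevent dead code elimination
            done(new Date().getTime() - start);
        }
    }
    runChunk();
}

chunkedTest(300000000, 10000000, function (ms) {
    console.log('Time needed: ' + ms + 'ms');
});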

What is the reason JavaScript setTimeout is so inaccurate?

I got this code over here:
var date = new Date();
setTimeout(function(e) {
    var currentDate = new Date();
    if (currentDate - date >= 1000) {
        console.log(currentDate, date);
        console.log(currentDate - date);
    } else {
        console.log("It was less than a second!");
        console.log(currentDate - date);
    }
}, 1000);
On my computer, it always executes correctly, with 1000 in the console output. Interestingly, on another computer, with the same code, the timeout callback starts in less than a second and the difference currentDate - date is between 980 and 998.
I know the existence of libraries that solve this inaccuracy (for example, Tock).
Basically, my question is: what are the reasons why setTimeout does not fire after the given delay? Could it be that the computer is too slow and the browser automatically tries to adapt to the slowness and fires the event early?
PS: Here is a screenshot of the code and the results executed in the Chrome JavaScript console:
It's not supposed to be particularly accurate. There are a number of factors limiting how soon the browser can execute the code; quoting from MDN:
In addition to "clamping", the timeout can also fire later when the page (or the OS/browser itself) is busy with other tasks.
In other words, the way that setTimeout is usually implemented, it is just meant to execute after a given delay, and once the browser's thread is free to execute it.
However, different browsers may implement it in different ways. Here are some tests I did:
var date = new Date();
setTimeout(function(e) {
    var currentDate = new Date();
    console.log(currentDate - date);
}, 1000);

// Browser   Test1   Test2   Test3   Test4
// Chrome      998    1014     998     998
// Firefox    1000    1001    1047    1000
// IE 11      1006    1013    1007    1005
Perhaps the < 1000 times from Chrome could be attributed to inaccuracy in the Date type, or perhaps it could be that Chrome uses a different strategy for deciding when to execute the code; maybe it's trying to fit it into the nearest time slot, even if the timeout delay hasn't completed yet.
In short, you shouldn't use setTimeout if you expect reliable, consistent, millisecond-scale timing.
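If what you actually need is a repeating tick that stays roughly aligned with the wall clock despite this jitter, one common workaround is to re-schedule each tick from the current time rather than trusting the nominal delay; a sketch, not a precise timer:
function startTicker(intervalMs, callback) {
    var expected = Date.now() + intervalMs;
    function tick() {
        var drift = Date.now() - expected; // how late this tick fired
        callback(drift);
        expected += intervalMs;
        // compensate for the lateness when scheduling the next tick
        setTimeout(tick, Math.max(0, intervalMs - drift));
    }
    setTimeout(tick, intervalMs);
}

startTicker(1000, function (drift) {
    console.log('tick, fired ' + drift + 'ms late');
});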
In general, computer programs are highly unreliable when trying to execute things with higher precision than 50 ms. The reason for this is that even on an octacore hyperthreaded processor the OS is usually juggling several hundreds of processes and threads, sometimes thousands or more. The OS makes all that multitasking work by scheduling all of them to get a slice of CPU time one after another, meaning they get 'a few milliseconds of time at most to do their thing'.
Implicitly this means that if you set a timeout for 1000 ms, chances are far from small that the current browser process won't even be running at that point in time, so it's perfectly normal for the browser not to notice until 1005, 1010 or even 1050 milliseconds that it should be executing the given callback.
Usually this is not a problem, it happens, and it's rarely of utmost importance. If it is, all operating systems supply kernel level timers that are far more precise than 1 ms, and allow a developer to execute code at precisely the correct point in time. JavaScript however, as a heavily sandboxed environment, doesn't have access to kernel objects like that, and browsers refrain from using them since it could theoretically allow someone to attack the OS stability from inside a web page, by carefully constructing code that starves other threads by swamping it with a lot of dangerous timers.
As for why the test yields 980 I'm not sure - that would depend on exactly which browser you're using and which JavaScript engine. I can however fully understand if the browser just manually corrects a bit downwards for system load and/or speed, ensuring that "on average the delay is still about the correct time" - it would make a lot of sense from the sandboxing principle to just approximate the amount of time required without potentially burdening the rest of the system.
Someone please correct me if I am misinterpreting this information:
According to a post from John Resig regarding the inaccuracy of performance tests across platforms (emphasis mine)
With the system times constantly being rounded down to the last queried time (each about 15 ms apart) the quality of performance results is seriously compromised.
So there is up to a 15 ms fudge on either end when comparing to the system time.
I had a similar experience.
I was using something like this:
var iMillSecondsTillNextWholeSecond = (1000 - (new Date().getTime() % 1000));
setTimeout(function () {
    CountDownClock(ElementID, RelativeTime);
}, iMillSecondsTillNextWholeSecond); // wait until the next whole second to start
I noticed it would skip a second every couple of seconds, and sometimes it would go longer.
However, I'd still catch it skipping after 10 or 20 seconds and it just looked rickety.
I thought, "Maybe the timeout is too slow or waiting for something else?"
Then I realized, "Maybe it's too fast, and the timers the browser is managing are off by a few milliseconds?"
After adding +1 millisecond to my variable I only saw it skip once.
I ended up adding +50 ms, just to be on the safe side.
var iMillSecondsTillNextWholeSecond = (1000 - (new Date().getTime() % 1000) + 50);
I know it's a bit hacky, but my timer is running smoothly now. :)
JavaScript has a way of dealing with exact time frames. Here's one approach:
You could save Date.now() when you start waiting, create an interval with a short update period, and calculate the difference between the dates.
Example:
const startDate = Date.now()
const intervalId = setInterval(() => {
    const currentDate = Date.now()
    if (currentDate - startDate >= 1000) {
        // at least a second has passed
        clearInterval(intervalId)
        return
    }
    // less than a second has passed
}, 50)

calculating FPS in Javascript less than 1 millisecond

Is it possible to measure time gaps of less than 1 millisecond in a way that is supported in all browsers? I know of only one way, which works only in Chrome.
The Chrome method: window.performance.now()
Currently I do FPS measurements in millisecond time spans, but if less than 1 ms passes I get Infinity, because the two numbers are rounded to the nearest millisecond and so have the same value.
Does anyone know of a cross-browser way to calculate time gaps of less than 1 millisecond in JavaScript?
Here's how you get accurate measurements without an accurate timer, so long as what you're timing occurs often, which I hope it does in your case.
Average/aggregate the imprecise measurements of the duration of your event. A snippet out of one of my projects:
var start = Date.now();
... (stuff to be timed)
var now = Date.now();
if (DEBUG.enabled) {
    var profile_this_iter = now - start;
    profile += (profile_this_iter - profile) * 0.02;
}
Each new measurement nudges your reading closer to it by a factor of 0.02. Obviously you'll want to tweak that a bit. This will allow you to read an average that hovers around 0.5 ms if you read a duration of 1 ms half the time and 0 ms half the time (with a 1 ms resolution timer).
This is obviously not a replacement for a proper higher resolution timer. But I use this simple algorithm to give my javascript projects a non-crappy FPS reading. You get a damping factor that you can tweak depending on if you want more accuracy or more immediate response to changes. Considering the simplicity of this one, you'd be hard pressed to find a more elegant algorithm to provide you a good representation without any improved input data. One of the ways to enhance it would be to adjust the approach factor (that 0.02 constant) based on the frequency of sampling itself (if that changes), this way a slower measured rate could be made to converge more quickly than with a fixed value.
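Put together as a minimal, self-contained sketch (the 0.02 damping factor and the requestAnimationFrame loop are illustrative choices, not the only option):
var smoothedFrameMs = 16; // initial guess, roughly 60 fps
var lastFrame = Date.now();

function onFrame() {
    var now = Date.now();
    var frameMs = now - lastFrame; // imprecise single measurement (about 1 ms resolution)
    lastFrame = now;
    // nudge the running average towards the latest measurement
    smoothedFrameMs += (frameMs - smoothedFrameMs) * 0.02;
    var fps = 1000 / smoothedFrameMs;
    // ... render, display fps, etc. ...
    requestAnimationFrame(onFrame);
}
requestAnimationFrame(onFrame);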
There is actually another way to calculate the FPS, which may be a way to get around this issue: count the actual number of frames in a second, which should be quite accurate, I think.
var fpsStart = new Date().getTime();
var fpsCounting = 0;
var fps = 0;
start_the_first_frame();

// Loop
function update() {
    do_time_consuming_stuff();
    fpsCounting++;
    var thisFrame = new Date().getTime();
    if (thisFrame - fpsStart >= 1000) {
        fpsStart += 1000;
        fps = fpsCounting;
        fpsCounting = 0;
    }
    request_next_animation_frame();
}
P.S. just typed right here, not tested, may require slight changes.
I remember seeing an approach like this in an LWJGL tutorial...
Also, as noted by @StevenLu, you can modify it to count the number of frames in 0.5 seconds and multiply the "fps" by two, or use an even shorter time (e.g. 0.25 seconds) so that the fps value updates more frequently.
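A sketch of that shorter-window variation (the 250 ms window is an arbitrary example; the per-frame work is left as a comment):
var windowMs = 250;
var windowStart = new Date().getTime();
var framesInWindow = 0;
var fps = 0;

function update() {
    // ... do the time-consuming per-frame work here ...
    framesInWindow++;
    var now = new Date().getTime();
    if (now - windowStart >= windowMs) {
        fps = framesInWindow * (1000 / windowMs); // scale the count up to frames per second
        framesInWindow = 0;
        windowStart += windowMs;
    }
    requestAnimationFrame(update);
}
requestAnimationFrame(update);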
High-resolution time is available in Chrome 20, but you should be aware that timer resolution in JS depends on the browser, device and circumstances. It might vary between 4 ms and 1000+ ms.
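A common workaround is to feature-detect the high-resolution clock (possibly vendor-prefixed, as with webkitNow in older Chrome builds) and fall back to Date.now(); a sketch:
var getNow = (function () {
    var perf = window.performance;
    var hires = perf && (perf.now || perf.webkitNow);
    if (hires) {
        return function () { return hires.call(perf); }; // sub-millisecond resolution
    }
    return function () { return Date.now(); }; // roughly 1 ms resolution fallback
}());

var t0 = getNow();
// ... work being timed ...
var elapsed = getNow() - t0;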

Monotonically increasing time in JavaScript?

What’s the best way to get monotonically increasing time in JavaScript? I’m hoping for something like Java’s System.nanoTime().
Date() obviously won’t work, as it’s affected by system time changes.
In other words, what I would like is for a <= b, always:
a = myIncreasingTime.getMilliseconds();
...
// some time later, maybe seconds, maybe days
b = myIncreasingTime.getMilliseconds();
At best, even when using the UTC functions in Date(), it will return what it believes is the correct time, but if someone sets the time backward, the next call to Date() can return a lesser value. System.nanoTime() does not suffer from this limitation (at least not until the system is rebooted).
Modification: [2012-02-26: not intended to affect the original question, which has a bounty]
I am not interested in knowing the “wall time”; I’m interested in knowing elapsed time with some accuracy, which Date() cannot possibly provide.
You could use window.performance.now() (since Firefox 15) and window.performance.webkitNow() (Chrome 20):
var a = window.performance.now();
//...
var delay = window.performance.now() - a;
You could wrap Date() or Date.now() so as to force it to be monotonic (but inaccurate). Sketch, untested:
var offset = 0;
var seen = 0;

function time() {
    var t = Date.now();
    if (t < seen) {
        offset += (seen - t);
    }
    seen = t;
    return t + offset;
}
If the system clock is set back at a given moment, then it will appear that no time has passed (and an elapsed time containing that interval will be incorrect), but you will at least not have negative deltas. If there are no set-backs then this returns the same value as Date.now().
This might be a suitable solution if you're writing a game simulation loop, for example, where time() is called extremely frequently — the maximum error is the number of set-backs times the interval between calls. If your application doesn't naturally do that, you could explicitly call it on a setInterval, say (assuming that isn't hosed by the system clock), to keep your accuracy at the cost of some CPU time.
It is also possible that the clock will be set forward, which does not prevent monotonicity but might have equally undesirable effects (e.g. a game spending too long trying to catch up its simulation at once). However, this is not especially distinguishable from the machine having been asleep for some time. If such a protection is desired, it just means adding a condition next to the existing one, with a constant threshold for acceptable progress:
if (t > seen + leapForwardMaximum) {
    offset += (seen - t) + leapForwardMaximum;
}
I would suggest that leapForwardMaximum should be set to more than 1000 ms because, for example, Chrome (if I recall correctly) throttles timers in background tabs to fire not more than once per second.
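Putting both guards together, the wrapper might look like this (still an untested sketch; the 2000 ms threshold is arbitrary and should stay above the roughly 1000 ms background-tab throttling mentioned above):
var leapForwardMaximum = 2000; // ms
var offset = 0;
var seen = Date.now(); // initialise to now so the first call isn't treated as a leap

function time() {
    var t = Date.now();
    if (t < seen) {
        offset += (seen - t); // clock was set backwards: stand still
    } else if (t > seen + leapForwardMaximum) {
        offset += (seen - t) + leapForwardMaximum; // clock leapt forwards: cap the jump
    }
    seen = t;
    return t + offset;
}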
JavaScript itself does not have any functionality to access nanosecond time. You might load a Java applet to acquire that information, like benchmark.js has done. Maybe @mathias can shed some light on what they did there…
Firefox provides "delay" argument for setTimeout...
this is the one of ways to implement monotonically increased time counter.
var time = 0;
setTimeout(function x(actualLateness) {
    setTimeout(x, 0);
    time += actualLateness;
}, 0);
