Detecting whether browser is fast enough for site - javascript

Feature detection is generally preferred over browser sniffing. What should I do in a case where certain browsers "support" the features I'm using but have JavaScript runtimes that are too slow?
I'm using the d3 library for some complicated visualizations. The visualization is very smooth in Chrome/Firefox, acceptable in IE9, and slow yet working in IE8. I'd like to display a banner to IE8 users telling them to upgrade, and a notice banner to IE9 users that it would be faster in Chrome or Firefox. Is it bad to do this via user agent sniffing?

Why not measure the time the browser takes to compute something complex, similar to what you actually want to do, and set a threshold time for it?
function detectBrowserSpeed() {
    var i,
        slowThreshold = 100, // milliseconds
        startMillis = +new Date(); // The + coerces the Date to its epoch-milliseconds value; if it is omitted, we get a Date instance instead.
    // Do something complex here:
    for (i = 0; i < 100000; i += 0.1) {
    }
    var elapsed = (+new Date()) - startMillis;
    if (elapsed > slowThreshold) {
        return 'slow';
    } else {
        return 'fast';
    }
}
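As a usage sketch (the showBanner helper below is hypothetical, not from the original post), you could run the probe once on page load and pick a message based on the result:

// Illustrative only: showBanner is a hypothetical helper you would implement,
// e.g. by unhiding a <div> at the top of the page.
window.onload = function () {
    if (detectBrowserSpeed() === 'slow') {
        showBanner('This visualization runs slowly in your browser. ' +
                   'It is much faster in Chrome or Firefox.');
    }
};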

Related


Date object in Firefox always returns milliseconds rounded to hundreds

I found this behavior by accident when I recently used console.time. In Firefox it always returns either 0ms or 100ms. This happens because the date is always rounded to hundreds of milliseconds. For example, +new Date() will return 1552469978800 instead of 1552469978877. Do you know when this became a thing, or how I can possibly get the exact time? It also affects setTimeout and setInterval.
This happens because the date is always rounded to hundreds of milliseconds.
I don't see that behavior in Firefox v65 on *nix, nor v48, v56, v57, or v65 on Windows.
But if it's happening in some versions or on some platforms, it may have been a response to Spectre. For the same reason, the alternative I would have pointed to (performance.now) is less useful than it would otherwise be, because:
The timestamp is not actually high-resolution. To mitigate security threats such as Spectre, browsers currently round the results to varying degrees. (Firefox started rounding to 1 millisecond in Firefox 60.) Some browsers may also slightly randomize the timestamp. The precision may improve again in future releases; browser developers are still investigating these timing attacks and how best to mitigate them.
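A quick way to check what your own browser is doing (a minimal probe of my own, not from the original answer) is to sample the clock in a tight loop and look at the gaps between distinct values:

// Collect the first few distinct Date.now() readings and print the gaps.
// With ~100 ms rounding you will see gaps of 100; normally they are 1 ms or less.
var last = Date.now();
var gaps = [];
while (gaps.length < 5) {
    var now = Date.now();
    if (now !== last) {
        gaps.push(now - last);
        last = now;
    }
}
console.log(gaps);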
Finally found the answer to this problem. The cause is the privacy.resistFingerprinting setting, enabled by default in recent versions of Firefox.
Fingerprinting protection probably causes more problems than it solves in this case. You are now completely unable to properly detect the timezone in JavaScript, so some web apps, for example Slack, will always show GMT+0 time instead of your actual time.
Another annoying side effect is that JavaScript animations (especially jQuery plugins) which use the setInterval or setTimeout functions now run at 10 frames per second.
When you disable fingerprinting protection, everything works fine again after you restart the browser.
Try this:
function convert(ms) {
    var seconds = Math.floor(ms / 1000); // drop any fractional milliseconds
    var hours = Math.floor(seconds / 3600); // Math.floor instead of parseInt: parseInt on a number stringifies it first and breaks on exponential notation
    seconds = seconds % 3600;
    var minutes = Math.floor(seconds / 60);
    seconds = seconds % 60;
    var pad = function (x) { return (x < 10) ? "0" + x : x; };
    return pad(hours) + ":" +
           pad(minutes) + ":" +
           pad(seconds);
}
var time = 100000000;
console.log(convert(time)); // "27:46:40"
That converts milliseconds to hours:minutes:seconds format.

How to measure the accuracy of performance.now

I'm trying to figure out what the accuracy of performance.now is in Chrome. Using the following code:
const results = []
let then = 0
for (let i = 0; i < 10000; i++) {
    const now = performance.now()
    if (Math.abs(now - then) > 1e-6) {
        results.push(now)
        then = now
    }
}
console.log(results.join("\n"))
I am getting the following results:
55058.699999935925
55058.79999976605
55058.89999959618
55058.99999989197
55059.09999972209
My understanding is that these values are in seconds, which means that each measurement is roughly 100ms apart. Is my testing methodology flawed or is performance.now actually limited to 100ms resolution in Chrome? I looked online and what I found stated the accuracy to be 100μs with 100μs jitter.
These results are in milliseconds (55K seconds would mean the page had been open for more than 15 hours when this script executed...).
As for the precision, this is now browser dependent and subject to change as better defenses against timing-based attacks are found, but yes, Chrome does limit the accuracy (to 0.1 ms) and adds jitter (±0.1 ms), Firefox limits it even further (1 ms by default) and also adds jitter (though there you can adjust these options), Edge does the same as Chrome according to this comment, and it seems Safari applies a 1 ms clamp only...
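If you want to measure the effective resolution yourself rather than trust the documentation, a small sketch along the lines of the question's code is to take the minimum nonzero gap over many samples (with jitter this is only an estimate of the clamp):

// Estimate the clamp by taking the smallest nonzero difference between
// successive performance.now() readings.
let minGap = Infinity
let prev = performance.now()
for (let i = 0; i < 100000; i++) {
    const now = performance.now()
    if (now > prev) {
        minGap = Math.min(minGap, now - prev)
        prev = now
    }
}
console.log(`effective resolution: ${minGap} ms`)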

What is the reason JavaScript setTimeout is so inaccurate?

I got this code over here:
var date = new Date();
setTimeout(function (e) {
    var currentDate = new Date();
    if (currentDate - date >= 1000) {
        console.log(currentDate, date);
        console.log(currentDate - date);
    } else {
        console.log("It was less than a second!");
        console.log(currentDate - date);
    }
}, 1000);
On my computer, it always executes correctly, with 1000 in the console output. Interestingly, on another computer, with the same code, the timeout callback starts in less than a second and the difference currentDate - date is between 980 and 998.
I know libraries exist that solve this inaccuracy (for example, Tock).
Basically, my question is: what are the reasons why setTimeout does not fire at the given delay? Could it be that the computer is too slow and the browser automatically tries to adapt to the slowness and fires the event early?
It's not supposed to be particularly accurate. There are a number of factors limiting how soon the browser can execute the code; quoting from MDN:
In addition to "clamping", the timeout can also fire later when the page (or the OS/browser itself) is busy with other tasks.
In other words, the way setTimeout is usually implemented, it is only meant to execute after the given delay has elapsed, and then as soon as the browser's thread is free to execute it.
However, different browsers may implement it in different ways. Here are some tests I did:
var date = new Date();
setTimeout(function (e) {
    var currentDate = new Date();
    console.log(currentDate - date);
}, 1000);

// Browser   Test1  Test2  Test3  Test4
// Chrome     998   1014    998    998
// Firefox   1000   1001   1047   1000
// IE 11     1006   1013   1007   1005
Perhaps the < 1000 times from Chrome could be attributed to inaccuracy in the Date type, or perhaps Chrome uses a different strategy for deciding when to execute the code: maybe it's trying to fit it into the nearest available time slot, even if the timeout delay hasn't quite completed yet.
In short, you shouldn't use setTimeout if you expect reliable, consistent, millisecond-scale timing.
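You can see the "busy thread" effect for yourself with a short demo (an illustrative sketch, not part of the original answer): schedule a short timeout, then block the main thread, and the callback fires only when the thread frees up.

var scheduled = +new Date();
setTimeout(function () {
    // Logs roughly 500, not 100: the callback had to wait for the thread.
    console.log('fired after ' + ((+new Date()) - scheduled) + ' ms');
}, 100);
// Busy-wait for ~500 ms so the timer cannot fire on time.
var busyUntil = (+new Date()) + 500;
while (+new Date() < busyUntil) {}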
In general, computer programs are highly unreliable when trying to execute things with higher precision than 50 ms. The reason is that even on an octa-core, hyperthreaded processor, the OS is usually juggling several hundred processes and threads, sometimes thousands or more. The OS makes all that multitasking work by scheduling them to get a slice of CPU time one after another, meaning each gets at most a few milliseconds at a time to do its thing.
Implicitly this means that if you set a timeout for 1000 ms, chances are far from small that the current browser process won't even be running at that point in time, so it's perfectly normal for the browser not to notice until 1005, 1010 or even 1050 milliseconds have passed that it should be executing the given callback.
Usually this is not a problem; it happens, and it's rarely of utmost importance. If it is, all operating systems supply kernel-level timers that are far more precise than 1 ms and allow a developer to execute code at precisely the right point in time. JavaScript, however, as a heavily sandboxed environment, doesn't have access to kernel objects like that, and browsers refrain from exposing them, since it could theoretically allow someone to attack OS stability from inside a web page, by carefully constructing code that starves other threads by swamping the system with a lot of dangerous timers.
As for why the test yields 980, I'm not sure; that would depend on exactly which browser and JavaScript engine you're using. I can, however, fully understand it if the browser manually corrects a bit downwards for system load and/or speed, ensuring that on average the delay is still about right: it would make a lot of sense from the sandboxing principle to just approximate the required amount of time without potentially burdening the rest of the system.
Someone please correct me if I am misinterpreting this information:
According to a post from John Resig regarding the inaccuracy of performance tests across platforms (emphasis mine):
With the system times constantly being rounded down to the last queried time (each about 15 ms apart) the quality of performance results is seriously compromised.
So there is up to a 15 ms fudge on either end when comparing to the system time.
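To see why this matters for benchmarks (an illustrative sketch of my own, not Resig's code): timing a fast operation against a clock with ~15 ms granularity yields mostly 0s and occasional ~15s, so individual runs are meaningless.

// On a coarse clock, each run reports 0 or ~15 ms, regardless of the true cost.
function bench(fn) {
    var start = +new Date();
    fn();
    return (+new Date()) - start;
}
for (var run = 0; run < 5; run++) {
    console.log(bench(function () {
        var s = 0;
        for (var i = 0; i < 10000; i++) { s += i; }
    }));
}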
I had a similar experience.
I was using something like this:
var iMillSecondsTillNextWholeSecond = (1000 - (new Date().getTime() % 1000));
setTimeout(function () {
    CountDownClock(ElementID, RelativeTime);
}, iMillSecondsTillNextWholeSecond); // Wait until the next whole second to start.
I noticed it would skip a second every couple of seconds, and sometimes it would go longer.
However, I'd still catch it skipping after 10 or 20 seconds, and it just looked rickety.
I thought, "Maybe the timeout is too slow, or waiting for something else?"
Then I realized, "Maybe it's too fast, and the timers the browser is managing are off by a few milliseconds?"
After adding +1 millisecond to my variable, I only saw it skip once.
I ended up adding +50 ms, just to be on the safe side.
var iMillSecondsTillNextWholeSecond = (1000 - (new Date().getTime() % 1000) + 50);
I know, it's a bit hacky, but my timer is running smoothly now. :)
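A more robust alternative to a fixed fudge (a sketch under the same assumptions; the startTicker name is mine) is to recompute the delay to the next whole second on every tick, so small timer errors never accumulate:

function startTicker(onTick) {
    function scheduleNext() {
        // Re-align to the next whole second each time, so drift cannot build up.
        var msToNextSecond = 1000 - (new Date().getTime() % 1000);
        setTimeout(function () {
            onTick(new Date());
            scheduleNext();
        }, msToNextSecond);
    }
    scheduleNext();
}
// e.g. startTicker(function (now) { console.log(now.getSeconds()); });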
JavaScript has a way of dealing with exact time frames. Here's one approach:
You could save Date.now() when you start waiting, create an interval with a short update period, and calculate the difference between the dates on every tick.
Example:
const startDate = Date.now()
const intervalId = setInterval(() => {
    const currentDate = Date.now()
    if (currentDate - startDate >= 1000) { // >= rather than ===, in case a tick overshoots the exact millisecond
        // at least a second has passed
        clearInterval(intervalId) // clearInterval needs the interval's id to actually stop it
        return
    }
    // it has not been a second yet
}, 50)

What is some JavaScript that illustrates the speed differences between IE and Firefox (or Chrome, Safari, etc.)

Recently, I've heard a number of different people lamenting the speed differences in IE versus pretty much every other browser when it comes to using JavaScript to manipulate the DOM.
I thought I'd put together a tiny little script to see what the differences really were, but I think I'm looking at the wrong problem as IE performs as well as or better with the tests I've developed.
Does anyone have some JavaScript laying around that would be good at illustrating the differences in speed of IE versus other browsers, specifically code that manipulates the DOM?
I'd like to test some optimization techniques, but I need a good test case first.
Edit: Sorry, here is my tiny little throwaway script:
var counter = 0; // Global element counter

function addCheckBoxes() {
    var container = document.getElementById('container');
    var newBox = document.getElementById('check1').cloneNode(true);
    newBox.id = '';
    container.appendChild(newBox);
}

function addLotsOfBoxes() {
    var thistime = new Date();
    for (var i = 0; i < 8000; i++) { // declare i with var so it isn't an implicit global
        addCheckBoxes();
    }
    var thattime = new Date();
    var timediff = thattime - thistime;
    alert(timediff);
}
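As one optimization technique to benchmark against the code above (a sketch of my own, not from the question), batch the new nodes into a DocumentFragment so the live document is touched only once:

function addLotsOfBoxesBatched() {
    var start = new Date();
    var container = document.getElementById('container');
    var template = document.getElementById('check1');
    var fragment = document.createDocumentFragment();
    for (var i = 0; i < 8000; i++) {
        var newBox = template.cloneNode(true);
        newBox.id = '';
        fragment.appendChild(newBox); // off-document, so no per-node reflow
    }
    container.appendChild(fragment); // single DOM insertion
    alert(new Date() - start);
}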
The Dromaeo benchmark, by Mozilla, should be a good test of DOM manipulation performance.
There are also the SunSpider benchmarks, although those do not touch the DOM at all.
I have a few demos that have been thrown around in the past:
A fluid dynamics simulator
A particle engine
Edge detection of video in canvas
But there are a huge number at Nihilogic (especially this one)
[edit(olliej): whoops, I just realised that none of these will work in IE :-( ]
