I'd like to run some calculations in a browser window, but I don't want it to slow the client computer down for user interaction, especially for single core machines. Is there some way to adjust the nice level of my executing JavaScript so that it will execute as fast as possible without detracting from the responsiveness of the machine?
I can't think of anything other than delaying the execution of your calculations. For example, divide all the work into small pieces and then run them sequentially with some delay (using setTimeout or setInterval) between each task.
Create open loops... an example:
This is a closed loop:
for (var i = 0; i < 10000; i++)
{
    doSomeMath(i);
}
This is an open loop:
var i = 0;
var iMax = 10000;
var MyWorkingThread =
    setInterval(function()
    {
        if (i < iMax)
        {
            doSomeMath(i);
            i++;
        }
        else
        {
            clearInterval(MyWorkingThread);
        }
    }, 1);
You can do more or less work inside the open loop, but this is the general idea. I have done this many times for similar issues, and it keeps the browser running very smoothly.
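For example, here is a sketch of doing more work per tick by processing a batch of iterations at a time (the batchSize value is arbitrary, and doSomeMath is the placeholder from the example above):

var i = 0;
var iMax = 10000;
var batchSize = 100; // tune this: bigger batches finish sooner but block the UI longer
var MyWorkingThread = setInterval(function()
{
    var batchEnd = Math.min(i + batchSize, iMax);
    // do one chunk of work, then return control to the browser
    while (i < batchEnd)
    {
        doSomeMath(i);
        i++;
    }
    if (i >= iMax)
    {
        clearInterval(MyWorkingThread);
    }
}, 1);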
For speed, try to use a single multidimensional array to contain your data, then at the end of your function use a single join to convert that array to a string and output it with innerHTML. That is the fastest way of collecting and serving data in JavaScript. Definitely do not build your output with DOM methods and elements, as that is roughly four times slower.
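A minimal sketch of that approach (the "output" element id and doSomeMath are hypothetical placeholders):

// accumulate results in an array, join once, and touch the DOM once
var rows = [];
for (var i = 0; i < 10000; i++)
{
    rows.push('<li>' + doSomeMath(i) + '</li>');
}
document.getElementById('output').innerHTML = '<ul>' + rows.join('') + '</ul>';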
Output your data as few times as possible. How often depends on which event you use to execute your function. I recommend not using the onload event, as that will slow the initial load time of your page. I would recommend the onclick handler of a button, because then the user is aware that they caused the execution that is slowing the page down.
I did some testing, and the browser needs quite some time between bursts of work to be reasonably responsive:
function work(cnt) {
    // do some heavy work
    for (var i = 0; i < 100000000; i++) ;
    // start next work
    if (cnt > 0) {
        window.setTimeout(function () { work(cnt - 1); }, 200);
    }
}
run the calculation on the server with an ajax request
open a new window or frame and run the code there
run the code in a loop broken into intervals
be ready to use web-worker processes (HTML5 asynchronous scripts); a minimal sketch follows
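Assuming a separate file named worker.js (the file name and message format are made up for illustration), handing the work to a Web Worker might look like this:

// main page: spawn the worker and receive the result asynchronously
var worker = new Worker('worker.js');
worker.onmessage = function (event) {
    console.log('result:', event.data);
};
worker.postMessage({ from: 0, to: 10000 });

// worker.js: runs off the main thread, so the UI stays responsive
self.onmessage = function (event) {
    var sum = 0;
    for (var i = event.data.from; i < event.data.to; i++) {
        sum += i;
    }
    self.postMessage(sum);
};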
I have a piece of text that uses WebKit's background-clip: text feature, and for purely aesthetic purposes I want to include a script to scroll the image through the text. The one way I have thought of doing this is using setInterval and running the function included below. I want to know if I will run into any problems using setInterval, since I have it looping at 50ms.
Function:
function movePos() {
    var obj = document.querySelector('h1 span');
    var pos = parseInt(obj.style.backgroundPositionY);
    obj.style.backgroundPositionY = (pos + 1) + 'px';
}
Any modern browser with even modest hardware should be able to handle executing these three lines of code 20 times per second. You should be fine.
If you want to optimize, and the obj variable won't change, you could cache that value so you're not running a selector 20 times per second. I'm not certain, but this is probably the most CPU intensive of the three lines of code.
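For illustration, a sketch of that caching (the || 0 fallback is an added guard for the case where backgroundPositionY starts out empty):

// look the element up once, outside the function that runs every 50ms
var obj = document.querySelector('h1 span');
function movePos() {
    var pos = parseInt(obj.style.backgroundPositionY, 10) || 0;
    obj.style.backgroundPositionY = (pos + 1) + 'px';
}
setInterval(movePos, 50);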
(I need a process.nextTick equivalent in the browser.)
I'm trying to get the most out of JavaScript performance, so I made a simple counter ...
For one second I make continuous calls to a function that just adds one to a variable.
The code: codepen.io/rafaelcastrocouto/pen/gDFxt
I got about 250 with setTimeout and 70 with requestAnimationFrame in Google Chrome on Win7.
I know requestAnimationFrame is tied to the screen refresh rate, so how can we make this faster?
PS: I'm aware of asm.js
Well, there's setImmediate(), which runs the code immediately, i.e. as you'd expect setTimeout(0) to behave.
The difference is that setTimeout(0) doesn't actually run immediately; setTimeout is "clamped" to a minimum wait time (4ms), which is why you're only getting a count of 250 in your test program. setImmediate() really does run immediately, so your counter test will be orders of magnitude higher using it.
However, you may want to check browser support for setImmediate; it's not yet available in all browsers. (You can use setTimeout(0) as a fallback, of course, but then you're back to the minimum wait time it imposes.)
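A quick sketch of that fallback (plain feature detection, nothing library-specific):

// prefer setImmediate() where it exists, otherwise accept setTimeout's clamp
var defer = window.setImmediate
    ? function (fn) { window.setImmediate(fn); }
    : function (fn) { window.setTimeout(fn, 0); };

defer(function () {
    // runs asynchronously, as soon as the browser allows
});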
postMessage() is also an option and can achieve much the same result, although it's a more complex API, as it's intended for doing a lot more than just a simple call loop. Plus there are other considerations to think of when using it (see the linked MDN article for more).
The MDN site also mentions a polyfill library for setImmediate which uses postMessage and other techniques to add setImmediate into browsers that don't support it yet.
With requestAnimationFrame(), you ought to get 60 for your test program, since that's the standard number of frames per second. If you're getting more than that, then your program is probably running for more than an exact second.
You'll never get a high figure in your count test using it, because it only fires 60 times a second (or fewer if the hardware refresh frame-rate is lower for some reason), but if your task involves an update to the display then that's all you need, so you can use requestAnimationFrame() to limit the number of times it's called, and thus free up resources for other tasks in your program.
This is why requestAnimationFrame() exists. If all you care about is getting your code to run as often as possible then don't use requestAnimationFrame(); use setTimeout or setImmediate instead. But that's not necessarily the best thing for performance, because it will eat up the processor power that the browser needs for other tasks.
Ultimately, performance isn't just about getting something to run the maximum number of times; it's about making the user experience as smooth as possible. And that often means imposing limits on your call loops.
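To make that concrete, here is a sketch of the same one-second counter driven by requestAnimationFrame (the "count" element id is a made-up placeholder); it tops out around 60 because it only increments once per display frame:

var count = 0;
var start = Date.now();
function tick() {
    count++; // one increment per frame, roughly 60 per second
    document.getElementById('count').textContent = count;
    if (Date.now() - start < 1000) {
        requestAnimationFrame(tick);
    }
}
requestAnimationFrame(tick);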
The shortest possible delay while still being asynchronous comes from MutationObserver, but it is so short that if you just keep calling it, the UI will never have a chance to update.
So the trick would be to use a MutationObserver to increment the value while using requestAnimationFrame once in a while to update the UI, but that is not allowed.
See http://jsfiddle.net/6TZ9J/1/
var div = document.createElement("div");
var count = 0;
var cur = true;
var now = Date.now();
var observer = new MutationObserver(function () {
    count++;
    if (Date.now() - now > 1000) {
        document.getElementById("count").textContent = count;
    } else {
        change();
    }
});
observer.observe(div, {
    attributes: true,
    childList: true,
    characterData: true
});

function change() {
    cur = !cur;
    div.setAttribute("class", cur);
}
change();
Use postMessage() as described in this blog.
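Since the blog link may not be handy, here is a minimal sketch of the postMessage-based "next tick" technique (the 'next-tick' message string is arbitrary); the message event fires asynchronously but without setTimeout's 4ms clamp:

var queue = [];
window.addEventListener('message', function (event) {
    if (event.source === window && event.data === 'next-tick') {
        event.stopPropagation();
        var fn = queue.shift();
        if (fn) fn();
    }
}, true);

function nextTick(fn) {
    queue.push(fn);
    window.postMessage('next-tick', '*');
}

nextTick(function () {
    // runs asynchronously, typically sooner than setTimeout(fn, 0)
});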
I have written this code to count up seconds (with deciseconds and centiseconds).
You've wasted time <span id="alltime">0.00</span> seconds.
<script type="text/javascript">
function zeroPad(num, count)
{
    var numZeropad = num + '';
    while (numZeropad.length < count) { numZeropad = "0" + numZeropad; }
    return numZeropad;
}
function counttwo() {
    tall = document.getElementById('alltime').innerHTML;
    if (parseFloat(tall) < 1.00) { tnew2 = tall.replace('0.0', '').replace('0.', ''); }
    else { tnew2 = tall.replace('.', ''); }
    tnum = parseInt(tnew2) + 1;
    // check if we have to add a zero
    if (tnum >= 100) { tstr1 = tnum + ''; }
    else { tstr1 = zeroPad(tnum, 3); }
    tlast = tstr1.substr(0, tstr1.length - 2) + '.' + tstr1.substr(tstr1.length - 2);
    document.getElementById("alltime").innerHTML = tlast;
}
var inttwo = setInterval("counttwo()", 10);
</script>
Put this in an HTML document and run it.
It works well, but when I run the code in Firefox 4 it seems to LAG a bit (it pauses briefly before counting up) on some numbers (seemingly at random, e.g. 12.20 or 4.43). I've tried changing "counttwo()" to counttwo, but that doesn't help.
I have told some of my friends to run it on Firefox 4 too. They said it doesn't lag at all. Is this caused by my computer, my Firefox, or something else?
Thanks in advance!
PS. Fiddle here: http://jsfiddle.net/XvkGy/5/ Mirror: http://bit.ly/hjVtXS
When you use setInterval or setTimeout, the time interval is not exact, for several reasons: it depends on other JavaScript that is running, the browser, the processor, etc. You have to allow for a tolerance of about +-15 ms, as far as I know. See also ...
That's a lot of counting, so on some computers, yes, it might lag (if it's a prehistoric one, or if the user's processor is really busy with something else). Also, if I'm right, it won't behave the same in Chrome's V8, since the script would freeze if you switched tabs and resume executing only when you return to that tab.
If you're just seeing pauses every so often, you're probably seeing garbage collection or cycle collection pauses.
You can test this by toggling your javascript.options.mem.log preference to true in about:config and then watching the error console's "Messages" tab as your script runs. If the GC/CC messages are correlated with your pauses, then they're the explanation for what you see.
As for why you see it but others don't... do you see the problem if you disable all your extensions?
The problem with setInterval is that it can eventually lead to a back-up. This happens because the JavaScript engine tries to execute the function on the interval (in your case, 10ms), but if ever that execution takes longer than 10ms, the JS engine starts trying to execute the next interval before the current one stops (which really just means it queues it up to run as soon as the previous callback finishes).
Since JavaScript executes single-threaded (with the exception of web workers in HTML 5), this can lead to pauses in your UI or DOM updates because it is continuously processing JavaScript callbacks from your setInterval. In worst case scenarios, the whole page can become permanently unresponsive because your stack of uncompleted setInterval executions gets longer and longer, never fully finishing.
With a few exceptions, it is generally considered a safer bet to use setTimeout (and invoking the setTimeout again after execution of the callback) instead of setInterval. With setTimeout, you can ensure that one and only one timeout is ever queued up. And since the timers are only approximate anyway (just because you specify 10ms doesn't mean it will happen at exactly 10ms), you don't typically gain anything from using setInterval over setTimeout.
An example using setTimeout:
var count = function () {
    // do something
    // queue up execution once again
    setTimeout(count, 10);
};
count();
One reason why you may see pauses on some browsers, and not others, is because not all JavaScript engines are created equal :). Some are faster than others, and as such, less likely to end up with a setInterval backup.
Different browsers use different JavaScript engines, so it's possible that this code just hits a spot where Firefox's JägerMonkey scripting engine has some problems. There don't seem to be any obvious inefficiencies in the counting itself.
If it's working on your friends' installs of FF4, then it's probably just an isolated problem for you, and there isn't much you'll be able to do by changing the code.
We are using Bing and/or Google javascript map controls, sometimes with large numbers of dynamically alterable overlays.
I have read http://support.microsoft.com/kb/175500/en-us and know how to set the MaxScriptStatements registry key.
The problem is that we do not want to programmatically set this or any other registry key on users' computers, but would rather achieve the same effect some other way.
Is there another way?
There's hardly anything you can do besides making your script "lighter". Try to profile it and figure out where the heaviest crunching takes place, then try to optimize those parts, break them down into smaller components, and call the next component with a timeout after the previous one has finished, and so on. Basically, give control back to the browser every once in a while; don't crunch everything in one function call.
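For example, a sketch of that hand-off using a simple task queue (the step function names are hypothetical placeholders for your own chunks of work):

// run each piece of work in turn, yielding to the browser between pieces
var tasks = [buildOverlays, placeMarkers, drawRoutes];
function runNext() {
    var task = tasks.shift();
    if (!task) return;       // all done
    task();                  // do one chunk of heavy work
    setTimeout(runNext, 0);  // let the browser repaint and handle input
}
runNext();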
Generally, a long-running script is caused by code that is looping.
If you have to loop over a large collection of data and it can be done asynchronously, akin to another thread, then move the processing to a web worker (http://www.w3schools.com/HTML/html5_webworkers.asp).
If you cannot or do not want to use a web worker, then find the main loop that is causing the long-running script, give it a maximum number of iterations, and have it yield back to the client using setTimeout.
Bad: (thingToProcess may be too large, resulting in a long running script)
function Process(thingToProcess) {
    var i;
    for (i = 0; i < thingToProcess.length; i++) {
        // process here
    }
}
Good: (only allows 100 iterations before yielding back)
function Process(thingToProcess, start) {
    var i;
    if (!start) start = 0;
    for (i = start; i < thingToProcess.length && i - start < 100; i++) {
        // process here
    }
    if (i < thingToProcess.length) // still more to process
        setTimeout(function () { Process(thingToProcess, i); }, 0);
}
Both can be called in the same way:
Process(myCollectionToProcess);
I'm calling a JavaScript function that sets the opacity of an iframe an unknown number of times in rapid succession. Basically this tweens the alpha from 0 to 100.
Here is the code:
function setAlpha(value)
{
    iframe.style.opacity = value * .01;
    iframe.style.filter = 'alpha(opacity=' + value + ')';
}
My problem is that for once it is working in IE (7) and not in Firefox (3.0.2). In Firefox I get a delay, and then the contentDocument appears with an opacity of 100. If I stick an alert in, it works, so I'm guessing it is a race condition (although I thought JavaScript was single-threaded) and that the setAlpha function is being called before the last call has finished executing.
Any help would be greatly appreciated. I've read the 'avoiding a JavaScript race condition' post, but I think this qualifies as something different (plus I can't figure out how to apply that example to this one).
The issue is that most browsers don't repaint until there is a pause in the javascript execution.
This can be solved by using setTimeout, as others have suggested. However, I recommend using something like jQuery, or any of the JavaScript libraries, to do animations. Running setTimeout 100 times is a bad idea because the length of the animation will vary with the browser and the speed of the user's computer. The correct way to do animations is to specify how long they should last and check the system time to determine how far the animation should progress.
function fadeIn(elem, animation_length) {
    var start = (new Date()).getTime();
    var step = function () {
        window.setTimeout(function () {
            var pct = ((new Date()).getTime() - start) / animation_length;
            elem.style.opacity = Math.min(pct, 1);
            if (pct < 1)
                step();
        }, 20);
    };
    step();
}
[edit:] The code above is only to illustrate how to do animations based on the system clock instead of simple intervals. Please use a library to do animations. The code above will not work in IE, because IE uses "filter: alpha(opacity=xx)" instead of "opacity". Libraries will take care of this for you and also provide nice features such as completion events and the ability to cancel the animation.
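For instance, with jQuery the whole fade can be a single call (the #myIframe selector is a made-up placeholder):

// jQuery handles the timing, the IE filter fallback, and the completion callback
$('#myIframe').fadeIn(1000, function () {
    // runs when the animation completes
});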
JavaScript doesn't run across multiple threads, so you're safe from race conditions (ignoring the upcoming Worker thread support in Safari and Firefox :D ).
Simple question: how are you calling setAlpha multiple times? Firefox, Safari, and Opera all coalesce style-sheet updates, e.g. they won't repaint or even recalculate style info while JS is running unless they have to. So they will only paint once the JS has completed.
So if you're doing
while(...) setAlpha(...)
they won't update; you'll probably need to use setTimeout to trigger multiple distinct calls to update the style.
An alternative would be to use a library such as jQuery or MooTools, which I vaguely recall provide a simplified mechanism for doing these types of animations and transitions. As an added bonus, I believe at least a few libraries will also use WebKit transition and animation CSS rules when available (e.g. Safari, and I think the latest Firefox builds).
[edit: caveat: I haven't actually used any of these libraries, I've only read about what they're supposed to do. My sites render the same in Lynx as in any other browser because I couldn't design my way out of a paper bag :D ]
Are you using setTimeout or a tight loop? If you're just using a loop to call the function, then switch to using setTimeout.
example:
function setAlpha(value)
{
    iframe.style.opacity = value * .01;
    iframe.style.filter = 'alpha(opacity=' + value + ')';
    if (value < 100) {
        setTimeout(function () { setAlpha(value + 1); }, 20);
    }
}
setAlpha(0);
Because, you see, it's not just JavaScript that's single-threaded. It's the whole damn browser. If your JavaScript goes into a tight loop, you hang the whole browser. So the browser pauses, waiting for the JavaScript to finish, and doesn't even get a chance to update the screen while your code is rapidly changing some DOM values.
Some browsers are smart enough to delay changes to the DOM until the call stack is empty.
This is generally a smart thing to do. For example, if you call a function that changes an element to yellow, and immediately call a function that changes the same element back to its original state, the browser shouldn't waste time making the change, since it would happen so quickly as to be imperceptible to the user.
The setTimeout(func, 0) trick is commonly used to force Javascript to delay execution of func until the call stack is empty.
In code:
function setAlpha(opacity) {
    some_element.style.opacity = opacity;
}

/**
 * This WON'T work, because the browsers won't bother reflecting the
 * changes to the element's opacity until the call stack is empty,
 * which can't happen until fadeOut() returns (at the earliest)
 **/
function fadeOut() {
    for (var i = 0; i < 10; i++) {
        setAlpha(0.1 * i);
    }
}

/**
 * This works, because the call stack will be empty between calls
 * to setAlpha()
 **/
function fadeOut2() {
    var opacity = 1;
    setTimeout(function setAlphaStep() {
        setAlpha(opacity);
        if (opacity > 0) {
            setTimeout(setAlphaStep, 10);
        }
        opacity -= 0.1;
    }, 0);
}
All this boils down to being a wonderful excuse to use one of many javascript libraries that handle this tricky stuff for you.
Edit: and here's a good article on the tricky Javascript call stack