I am using setInterval(foo,ms) to carry out an animation. I really don't want to post all the code for the animation here as it spans multiple files. It's basically a bunch of images falling. Every second I call ctx.drawImage(img,...) while updating the coordinates to simulate gravity.
I have divided the canvas into two sections, one animation on the left and one on the right. When one of them is activated the frame rate is stable at 30 fps. If, however, I activate both of them, the performance plummets. This has nothing to do with overloading my computer, as I can cut the complexity of each animation by a factor of 10 and the problem persists. My guess is the setIntervals are interfering with each other.
My question is whether it is safe to run more than one setInterval call at the same time. Thanks.
You can have many setInterval() timers without issue, but be aware that JS is fundamentally single-threaded (per page). Multiple "concurrent" timers are actually handled by a single thread jumping between their callbacks.
What this means is that timing won't be consistent - especially if one of the callbacks takes a considerable length of time to run.
As the others say, you can have as many as you like. Nevertheless, you should use as few as necessary for good performance. Maybe you can find a way to use only one interval for both animations, as sketched below.
There might be a problem, though, if you use global variables. This could have an influence on the animations (maybe even on the performance, depending on what you use them for).
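For illustration, a minimal sketch of driving both animations from a single interval might look like this (leftAnim, rightAnim and their step() methods are made-up names, not part of the original code):

var leftAnim = { step: function (dt) { /* update and draw the left animation */ } };
var rightAnim = { step: function (dt) { /* update and draw the right animation */ } };

var last = Date.now();
setInterval(function () {
    var now = Date.now();
    var dt = now - last;   // elapsed ms since the previous tick
    last = now;
    leftAnim.step(dt);     // both animations advance from the same timer
    rightAnim.step(dt);
}, 1000 / 30);             // aim for roughly 30 fps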
I would advise using setTimeout; it could avoid performance issues.
Take a look at this question: setTimeout or setInterval?
It does a great job of explaining the difference between the two and why you should generally use setTimeout.
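As a rough sketch, the usual pattern is a self-scheduling setTimeout loop rather than a fixed setInterval (drawFrame here stands in for your own drawing code and is only a placeholder):

function drawFrame() {
    // ... draw the images and update their coordinates ...
}

function tick() {
    drawFrame();
    // The next tick is only queued after this one has finished,
    // so slow frames delay the loop instead of piling up callbacks.
    setTimeout(tick, 1000 / 30);   // aim for roughly 30 fps
}

tick();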
My question is whether it is safe to run more than one setInterval call at the same time
Short answer: yes, absolutely.
Yes, it's safe to have multiple setIntervals running. There's no underlying performance issue with using setInterval. Profile your own code; you'll almost certainly find the problem there.
Related
We're building a web framework and have identified forced reflows as one of the main performance bottlenecks of our applications. From web research, we learned that actions that trigger forced reflows should be grouped into read- and write-batches to minimize the number of reads that actually are costly (because something changed). [Our main goal is to get rid of forced reflows altogether, but we doubt that we can remove every single one – and from what we learned it is the first one after a change that is expensive.]
In a proof of concept, we implemented this batch approach using the fastdom library for a single application start. Surprisingly, instead of the performance increase we hoped for, the startup performance decreased from (in median) 372ms to 391ms time to interactive. In the performance trace, we see that the large recalculate-style/layout cycle now comes directly from revealing the changed HTML, and that the JS-triggered forced reflows (in the animation frame after that reveal) are actually fast. However, the overall performance was better with the forced reflows happening before the reveal.
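For context, the read/write batching in our proof of concept looks roughly like the following simplified sketch (the element and the style change are purely illustrative, not our actual code):

var el = document.querySelector('.widget');   // illustrative element

fastdom.measure(function () {
    var width = el.offsetWidth;               // read phase: layout reads are batched together
    fastdom.mutate(function () {
        el.style.width = (width / 2) + 'px';  // write phase: no layout reads after this point
    });
});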
performance trace before implementing fastdom:
performance trace after implementing fastdom:
Can someone explain why we observe this behavior? Why does it seem that the community-approved approach works in the opposite direction in our case?
Thanks in advance for any hints!
I'm working on a multiple-projectile simulator for a college project and I'm wondering how best to set up the timing in JavaScript when rendering to an HTML5 canvas.
I'm using an Euler integrator for the physics, and accuracy is very important for this project. The rendering is very bare-bones.
My question is how best to set up the timing for all of this.
Right now I have:
The physics and other logic running in a function that loops using setTimeout() with a fixed time step
The rendering in another function that loops using a requestAnimationFrame() call (flexible time step)
These two loops run sort of simultaneously (I know JavaScript doesn't really support threads without Web Workers) but I don't want the rendering (currently running at a much higher FPS than needed) to be unnecessarily 'stealing' CPU cycles from the physics simulation, if you see what I mean.
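In code, the current setup looks roughly like this (stepPhysics and drawScene stand in for my actual physics and rendering functions; the stubs below are only placeholders):

var PHYSICS_DT = 1000 / 120;          // fixed physics time step in ms

function stepPhysics(dt) { /* advance all projectiles by dt seconds (Euler step) */ }
function drawScene() { /* clear the canvas and draw all projectiles */ }

function physicsLoop() {
    stepPhysics(PHYSICS_DT / 1000);   // one Euler step with a fixed dt (in seconds)
    setTimeout(physicsLoop, PHYSICS_DT);
}

function renderLoop() {
    drawScene();                      // draw whatever state the physics has produced
    requestAnimationFrame(renderLoop);
}

physicsLoop();
requestAnimationFrame(renderLoop);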
Given that physics accuracy is most important here, how would you recommend setting up the timing system? (Maybe using Web Workers would be useful here, but I haven't seen this used in other engines.)
Thanks!
I'd suggest that you don't try to 'multithread' unless you're actually doing it, and even then, I wouldn't necessarily recommend it.
The best way to keep everything in synch is to have a single thread of execution. A single setTimeout loop of about 33ms seems to work ok for my games.
Also, in my experience at least, setTimeout offers a much more aesthetic experience than setInterval or requestAnimationFrame. With setInterval, JavaScript tries too hard to 'catch up' when frames are delivered late, which makes animation frames inconsistent. With requestAnimationFrame, frames are skipped to ensure a smooth running game, which actually makes things harder, because your users aren't entirely sure their view is up to date at any given second.
One way would be to set an interval for processing physics, and once per x frames, render everything.
var physicsTime = 1000 / 60;   // e.g. ~16ms between physics updates
var renderFrequency = 2;       // e.g. render once every 2 physics updates
var frameCount = 0;

setInterval(function(){ updateStuff(); }, physicsTime);
then in updateStuff()
function updateStuff(){
    frameCount++;
    if (frameCount >= renderFrequency){
        frameCount -= renderFrequency;
        render();    // draw the current state
    }
    physics();       // advance the simulation by one fixed step
}
In my game engine, there are objects that need to be updated periodically. For example, a scene can be lowering its alpha, so I set an interval that does it. Also, the camera sometimes needs to jiggle a bit, which requires interpolation on the rotation property.
I see that there are two ways of dealing with these problems:
Have an update() method that calls all the other objects' update methods. The objects track the time since they were last updated and act accordingly.
Do a setInterval for each object's update method.
What is the best solution, and why?
setInterval does not keep to a clock, it just sequences events as they come in. Browsers tend to keep at least some minor amount of time between events. So if you have 10 events that all need to fire after 100ms, you'll likely see the last one fire well past the 200ms mark. (This is easy enough to test.)
Having only one event (and calling update on all objects) is in this sense better than having each object set its own interval. There may be other considerations, but for this reason at least, option 2 is infeasible.
Here is some more about setInterval: How do browsers determine what time setInterval should use?
The best way I have found to write a good update() function while keeping a good framerate and low load is the following.
Have a single update() method which draws your frame by looping over some sort of queue/schedule of drawable objects, each of which has added its own update() function to that queue/schedule (like an event listener).
This way you don't have to loop over objects that are not scheduled for a redraw/update (like menu buttons or crosshairs), and you don't have an overabundance of intervals running for all drawable objects.
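A rough sketch of that idea (the queue and scheduleUpdate names are made up for illustration):

var updateQueue = [];                 // objects that have asked to be updated this tick

function scheduleUpdate(obj) {
    updateQueue.push(obj);            // an object registers itself when it needs an update
}

function update() {
    var scheduled = updateQueue;
    updateQueue = [];                 // start a fresh queue for the next tick
    for (var i = 0; i < scheduled.length; i++) {
        scheduled[i].update();        // only scheduled objects are updated/redrawn
    }
}

setInterval(update, 1000 / 30);       // a single timer drives everything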
I recommend using the update() method over the setInterval.
Also, I would guess that the timing of several setIntervals running at once would be unreliable.
Another possibility: depending on what else is happening in your game, using a bunch of separate intervals could introduce race conditions in the counting and comparing of scores, etc.
The proposed algorithms are not exclusive to one timer method or the other. That is, you can use setInterval to call all the update methods, or you can have each object update itself by repeatedly calling setTimeout.
More to the point is that a single timer has less overhead than multiple timers (of either type). This really matters when you have lots of timers. On the other hand, only one timer may not suit, because some objects might need to be updated more frequently than others, or on a different schedule, so just try to minimise them.
An advantage of setTimeout is that the interval to the next call can be adjusted to meet specific scheduling requirements, e.g. if one is delayed you can skip the next one or fire it sooner. setInterval will slowly drift relative to a consistent clock, and one-off adjustments are more difficult.
On the other hand, setInterval only needs to be called once, so you don't have to keep re-scheduling the timer. You may end up with a combination.
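A rough illustration of that adjustability, assuming an update() function of your own and an arbitrary 100ms target interval:

var INTERVAL = 100;                       // desired ms between updates
var expected = Date.now() + INTERVAL;

function update() { /* per-tick work goes here */ }

function tick() {
    update();
    var drift = Date.now() - expected;    // how late this tick actually fired
    expected += INTERVAL;
    setTimeout(tick, Math.max(0, INTERVAL - drift)); // shorten the next delay to compensate
}

setTimeout(tick, INTERVAL);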
This is a very simple game I'm trying to write but I have a really bad performance problem.
I'm not using an HTML5 canvas, just plain JavaScript, so that may be the problem.
Here's the game: http://ivcdn.net/aga2/dead.html
Currently I'm using divs as game objects, and to move them I increase or decrease their position on the page (they're all positioned absolutely). But (I think) doing so is causing serious performance issues.
What can I do to increase performance? And do I have any other options than to use HTML5 and/or a better language?
One suggestion to improve performance would be to move the container of the divs instead of moving each individual div. That way you're only moving one thing instead of many.
Also limit how many setTimeout calls you're making. Ideally, make one that encompasses all your game logic and, at the end of that game logic, call setTimeout again to run itself (provided the game is not over).
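A rough sketch of both suggestions combined (the #world container id and the gameOver flag are made-up names):

var world = document.getElementById('world');  // single container holding all the game divs
var gameOver = false;
var offset = 0;

function gameLoop() {
    offset += 2;
    world.style.left = offset + 'px';          // move the one container, not every child div
    // ... collision checks and the rest of the game logic ...
    if (!gameOver) {
        setTimeout(gameLoop, 1000 / 30);       // reschedule only while the game is running
    }
}

gameLoop();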
Currently, I am rendering WebGL content using requestAnimationFrame which runs at (ideally) 60 FPS. I'm also concurrently scheduling an "update" process, which handles AI, physics, and so on using setTimeout. I use the latter because I only really need to update objects roughly 30 times per second, and it's not really part of the draw sequence; it seemed like a good idea to save the remaining CPU for actual render passes, since most of my animations are fairly hardware intensive.
My question is one of best practices. setTimeout and setInterval are not particularly kind to battery life and CPU consumption, especially when the browser is not in focus. On the other hand, using requestAnimationFrame (or tying the updates directly into the existing render phase) will potentially enforce far more updates every second than are strictly necessary, and may stop updating altogether when the browser is not in focus or at other times the browser deems unnecessary for "animation".
What is the best course of action for updating, but not rendering content?
setTimeout and setInterval are not particularly kind to battery life and CPU consumption
Let's be honest: Neither is requestAnimationFrame. The difference is that RAF automatically turns off when you leave the tab. That behavior can be emulated with setTimeout if you use the Page Visibility API, though, so in reality the power consumption problems between the two are about on par if used intelligently.
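A rough sketch of that emulation, assuming an update() function of your own running at roughly 30 updates per second:

var timerId = null;

function update() { /* AI, physics, and other non-render work */ }

function startUpdates() {
    if (timerId === null) {
        timerId = setInterval(update, 1000 / 30);  // roughly 30 updates per second
    }
}

function stopUpdates() {
    clearInterval(timerId);
    timerId = null;
}

// Pause the update loop while the tab is hidden, resume when it becomes visible again.
document.addEventListener('visibilitychange', function () {
    if (document.hidden) {
        stopUpdates();
    } else {
        startUpdates();
    }
});

startUpdates();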
Beyond that, though, setTimeout/setInterval is perfectly appropriate for use in your case. The only thing that you may want to be aware of is that you'll be hard pressed to get it perfectly in sync with the render loop. You'll have cases where you may draw one too many times before your animation update hits, which can lead to minor stuttering. If you're rendering at 60Hz and updating at 30Hz it shouldn't be a big issue, but you'll want to be aware of it.
If staying perfectly in sync with the render loop is important to you, you could simply have an if(framecount % 2) { updateLogic(); } at the top of your RAF callback, which effectively limits your updates to 30Hz (every other frame) and it's always in sync with the draw.
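Something along these lines (updateLogic and drawScene stand in for your own update and render code; the stubs are only placeholders):

var framecount = 0;

function updateLogic() { /* AI, physics, etc. */ }
function drawScene() { /* WebGL render pass */ }

function frame() {
    if (framecount % 2) {
        updateLogic();   // runs on every other frame, i.e. ~30Hz on a 60Hz display
    }
    drawScene();         // render every frame
    framecount++;
    requestAnimationFrame(frame);
}

requestAnimationFrame(frame);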