I'm working on a site that abuses JavaScript pretty heavily for some constant animation effects; namely scrolling backgrounds. When running maximized in Firefox 3.6.13 at 1920x1080 it seems to lag, and sometimes ends up a bit skippy. When running at a smaller size though (resized to 50%-75%) it's very smooth.
I was just wondering if it's possible to grab the number of dropped frames, or loosely the FPS, that a given animation is running at. The animation is actually driven by window.setInterval() with a delay of 10 (1/100 of a second).
http://forum.jquery.com/topic/why-jquery-uses-77-fps-by-default-in-animation
Credit to Google for the link :)
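If you want to measure it directly, one rough approach (just a sketch, not from the link) is to count how many interval callbacks actually fire each second and compare that to the ~100/second you requested:

```js
// When the browser falls behind, setInterval callbacks get skipped,
// so counting the ticks that actually fire gives a loose FPS figure.
var ticks = 0;

setInterval(function () {
  ticks++;
  // ... the animation step would run here ...
}, 10); // requested: ~100 ticks per second

setInterval(function () {
  document.title = 'effective FPS: ' + ticks; // crude display; use console if available
  ticks = 0;
}, 1000);
```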
The situation is as follows: there is an HTML5 game. The game is quite heavy in terms of JavaScript code, graphics, and sounds – approximately 30 MB. All the animation is created by means of createjs. In the game I use two canvases, one over the other; the control buttons sit between the canvases (the buttons are created with standard button elements).
One canvas is the main one, and most of the animation is implemented on it; the second canvas is used when it is necessary to show animation over the control buttons. Initially, the size of each canvas was 970px x 740px. All the animation worked perfectly in desktop browsers as well as in Chrome and Firefox on Android. However, in mobile Safari the animation ran very slowly.
Then I reduced the size of the canvases by 30% (along with the graphics on the sprite sheets). As a result, Safari started working more efficiently on an iPad. The animation on the lower canvas stopped stuttering, but only until the extra animation was launched on the upper canvas; then everything got slow again. I can't use WebGL because I have to use both bitmap and vector graphics. Does anyone know how to improve the situation here and keep the FPS from dropping?
There is another funny thing: if you minimize the browser and then maximize it again, the game sometimes starts running faster.
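Not a definitive fix, but a common CreateJS mitigation is to cache anything that doesn't change every frame and to redraw only when something actually changed. A sketch, assuming an existing stage and a mostly static background container (the names background, stage, and dirty are mine):

```js
// Rasterize a mostly-static container once, so each tick blits one bitmap
// instead of re-executing all of its drawing instructions.
background.cache(0, 0, 970, 740); // call background.updateCache() only when it changes

// Halve the per-second work: 30fps is often enough for UI-heavy scenes.
createjs.Ticker.framerate = 30; // older versions: createjs.Ticker.setFPS(30)

// Skip stage.update() entirely on ticks where nothing moved.
var dirty = false;
createjs.Ticker.addEventListener("tick", function () {
  if (dirty) {
    stage.update();
    dirty = false;
  }
});
```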
I have a SPA (Angular 1.4.8).
On start up, the client has to download many files while also performing authentication and bootstrapping.
Therefore I have a splash screen, which contains a simple GIF and a few text lines that are updated according to the current loading stage.
I've noticed that the GIF smoothness depends on the browser/computer you open the page in.
On a very fast computer the GIF's animation runs as smoothly as my eye can tell, while on other, slower computers the animation is chunky (low FPS).
I'm assuming this has to do with the load on the browser, which has many things to do while loading my application, causing a low FPS on the GIF's animation.
This is just an assumption, according to my tests.
What can I do in order to ensure that the browser dedicates all the resources required to play the GIF smoothly?
Thanks
Do not use a GIF. Use a CSS animated spinner like these instead. CSS animated spinners are not affected by a jam on the browser's main thread, which is what executes your JS and updates the UI.
However, there is currently a bug in Blink that does cause CSS animated spinners to pause when the main thread is jammed:
Google Chrome compositor-driven animation affected by jam in main thread
But that should be fixed soon.
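For illustration, a minimal spinner of this kind looks roughly like the following (class name and colors are arbitrary):

```css
/* Only transform is animated, so the browser can run the spin on the
   compositor thread, independent of JavaScript on the main thread. */
.spinner {
  width: 40px;
  height: 40px;
  border: 4px solid #ddd;
  border-top-color: #3498db; /* the rotating notch */
  border-radius: 50%;
  animation: spin 1s linear infinite;
}

@keyframes spin {
  to { transform: rotate(360deg); }
}
```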
In our single-page application we have two modes that the user can switch between. The most striking visual difference is that the top navigation bar is completely different, which is why we use a crossfade between the two as a visual cue for the mode change.
While the animation is playing, there's a lot else happening in the browser - the DOM is updated and redrawn, XHRs are made, objects are created and destroyed, etc. The end result is that the animation becomes quite choppy. In fact, sometimes it seems as if no smooth animation happened at all, because the new navigation bar just appears in two or three steps.
Are there any tips or tricks one could use to achieve smooth animation while the browser's JavaScript engine is under a lot of strain? Are CSS transitions perhaps of any benefit? Any suggestions or quality reading material on this are welcome.
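On the CSS-transitions idea: one sketch is to drive the crossfade with an opacity transition toggled via a class, since most engines can run opacity and transform transitions on the compositor thread, away from the busy main thread. The element ids and class names below are hypothetical:

```js
// Assumes CSS along these lines:
//   .navbar        { transition: opacity 300ms ease; }
//   .navbar.hidden { opacity: 0; }
// Toggling the class hands the fade to the browser; even while the main
// thread is busy with XHR handlers and DOM work, the fade can keep running.
var oldBar = document.getElementById('navbar-old');
var newBar = document.getElementById('navbar-new');

oldBar.classList.add('hidden');
newBar.classList.remove('hidden');
```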
I'm working on a web page with an animated background. I'm using MooTools to crossfade a series of images that are of a decent size (like 1100px x 750px).
I think that because the browser has to do so much rendering work to crossfade these images, the crossfade animation becomes choppy when you make the page fullscreen.
In looking for ways to overcome this, I'm already planning on rewriting the slideshow in the most efficient JavaScript I can muster.
Does anyone have any other ideas on how to have the animated background run smoothly at large browser window sizes?
A crossfade effect is expensive, and you'll get a very low frame rate at that resolution. Instead of a crossfade I'd try a slide (up/down/left/right) or wipe effect. I'm a big fan of the jQuery Tools Scrollable plugin.
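As a sketch of the slide idea (element ids are hypothetical), a transform transition moves already-painted layers around instead of re-blending two full-size images on every frame:

```js
// Slide the next image in from the right instead of crossfading.
var current = document.getElementById('slide-current');
var next = document.getElementById('slide-next');

next.style.transform = 'translateX(100%)'; // park off-screen, no transition yet
next.getBoundingClientRect();              // force a style flush so 100% is the start point

current.style.transition = 'transform 600ms ease';
next.style.transition = 'transform 600ms ease';
current.style.transform = 'translateX(-100%)';
next.style.transform = 'translateX(0)';
```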
As you said yourself, this is probably a rendering issue, and not really an issue with the speed of your JavaScript, especially since there is a performance difference between fullscreen and non-fullscreen.
I doubt optimizing your js will have any real effect.
Hi, I was wondering if there is a limit to the number of divs that are allowed on a web page?
For example, will Internet Explorer start to choke when it has to render a webpage with a thousand divs?
I know this is an old post, but I recently did a test that is directly related to this topic and I wanted to share my results.
I created a simple PHP script that spits out x number of 5px by 5px inline-block divs to test browser stability and page scroll-ability.
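For anyone who wants to reproduce the test without PHP, an equivalent client-side generator is roughly:

```js
// Spit out n 5px-by-5px inline-block divs. Building one string and assigning
// it in one go keeps n separate DOM insertions from skewing the measurement.
function spawnDivs(n) {
  var html = [];
  for (var i = 0; i < n; i++) {
    html.push('<div style="display:inline-block;width:5px;height:5px;background:#888"></div>');
  }
  document.body.innerHTML = html.join('');
}

spawnDivs(10000); // then try 100000, 500000, 1000000 ...
```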
At 1000 divs on the page IE9, Firefox, and Chrome have no problem whatsoever and don't even seem to hiccup when scrolling.
At 10,000 divs IE9 and Chrome are able to scroll with a barely-noticeable delay, still within the 'acceptable' range in my book, however Firefox begins to lag more noticeably, to the point where you feel the scroll bar is jumping into position a half-second later than it should.
Interestingly, the performance difference between 10,000 divs and 100,000 is not as drastic as you'd imagine. IE9 and Chrome perform with only a barely perceptible delay in scrolling (with Chrome being the slightly smoother of the two), and Firefox has a delay that is very noticeable and would probably be considered annoying, but still functions reasonably well (i.e. it doesn't crash).
Now at 500,000 divs on the page it finally started to get interesting. IE9 crashed and tried to restart itself (on the same page, of course) and crashed again, at which point I shut it down properly, restarted it, and tried one more time to make sure the same result would happen again. It did.
Chrome remained stable, but it became nearly impossible to scroll the page due to the extreme delay.
The big surprise was Firefox: the browser that was chunky at 100,000 divs is just about the same at 500,000 divs ... scrolling is not smooth, but it is way, way better than Chrome.
Amazingly, the results were pretty much the same for 1,000,000 divs on a page! Firefox handled them without crashing and remained scrollable though 'chunky'. IE9 crashed. And Chrome was able to load the page but became so slow that it was virtually unusable.
I know this isn't exactly a scientific study, but I figured it might be interesting to someone else besides myself.
Tests were performed on a Dell workstation with dual Xeon processors and 4 GB of RAM, running Windows 7.
There are two things to consider. Memory is one: DOM nodes take up a huge amount of space. The other is the CPU time needed to re-render all of these nodes when something changes. The threshold of smooth rendering depends on the engine used. In my experience, IE falls far behind, starting to choke after several hundred nodes. Firefox can take several thousand, and it's about the same (or a little better) for WebKit browsers like Chrome.