I'm working on my new portfolio and I want to use complex JavaScript (for animating, moving, and applying effects to DOM elements), and I'm going to optimize as much as possible to maximize performance. BUT I can't prepare for every situation my site will face. So I started looking for a script that can check browser performance (in a few seconds at most) so that, based on the test results, I can set the number of effects displayed and calculated on the page.
So is there any way to check browser performance and set the optimal number of effects applied to a page?
If possible, use CSS transforms/transitions instead of pure-js effects, as the former are usually hardware accelerated and thus orders of magnitude faster.
Even if you don't use CSS transforms, you can detect support for them using e.g. Modernizr, and if they are supported, you can assume that the browser is fairly modern and has pretty good performance in general. Also take a look at window.requestAnimationFrame; it automatically throttles the frame rate.
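For example, a rough sketch of that kind of check without any library (effectLevel is a made-up name; Modernizr would give you the same information more robustly):
// Rough capability check: transition support plus requestAnimationFrame
// is a decent proxy for "this browser can handle the heavier effects".
var style = document.documentElement.style;
var supportsTransitions = 'transition' in style || 'webkitTransition' in style;
var supportsRAF = typeof window.requestAnimationFrame === 'function';

// effectLevel is a made-up variable you would feed into your own effect setup.
var effectLevel = (supportsTransitions && supportsRAF) ? 'full' : 'reduced';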
Related
I'd like to add a spinner to my site and I'm wondering which method to use. My initial investigation resulted in the following (see links and Why not animated GIF instead of animated CSS sprites?):
Animated GIFs
Pros
Depending on the specific spinner(s), may provide the smallest footprint (673 bytes for the default one from AjaxLoad)
High browser compatibility
Simple to use with data binding (simply bind to the visibility of the img tag)
Cons
Gifs can't be changed once downloaded
Gif animation can't be started/stopped reliably
All instances animate at the same time in a synchronized manner
Animation may freeze in some circumstances, such as HTML manipulation
JavaScript (spin.js)
Pros
Highest browser compatibility (even falls back to VML)
Highly dynamic and customizable
Cons
Larger footprint (4.02KB for spin.min.js 2.0.1)
Possible CPU usage issues (link)
CSS
Pros
Dynamic and customizable
Purely declarative (no JS)
Cons
Larger footprint (4.32KB for one of the default spinners from CSSLoad)
Poorest browser support (requires CSS3)
Possible CPU usage issues (link)
Do you agree with the analysis above?
In simple cases, would it make sense to go with gifs as they seem to keep things simple while being fairly compatible?
A few other considerations (sorry, too big to put in a comment)
GIFS
Con: no variable transparency. I've had cases where the desired effect is to 'dim' the page behind and put a loader in place, but of course this means nasty jagged edges, as you cannot match the background colour on the anti-aliasing.
Pro: animated GIFs tend to load progressively, with the browser showing whatever image data it has, so the first frame appears very fast - which is a key consideration for a loader image.
Also worth a mention are SVG animations, which I've found to work really well for complex loader sequences on heavier single-page sites (marketing campaign sites etc.). Of course you don't get support across the full browser spectrum, but I've certainly had situations where these have been the obvious go-to.
GSAP claims to use HTML5 to perform outstanding animations for the web, but the article on Greensock.com clearly states that it does not use the canvas framework in HTML5. It is clear from the provided script that they are using JavaScript, but it is very confusing to interpret. In what other way would they use HTML5 animations without the canvas? And if they do use pure HTML5, does this mean that HTML5 animations are significantly faster than CSS, jQuery, and JavaScript?
There's no such thing as "HTML5 animations".
There are, mainly, CSS3 animations (with either CSS transitions or CSS animations) and JavaScript animations.
CSS3 animations are generally well optimised (with a few quirks) but lack support (old IEs) and flexibility (you'll have to use JavaScript to tweak them). They're best for hover effects (with transitions) or basic animations.
JavaScript animation used to be based on setInterval: a timed loop, and inside this loop the styles are changed. jQuery does this, and not very well.
More recently, window.requestAnimationFrame() was introduced to replace these setInterval animations. Support is limited (old IEs), performance is top notch (because the browser can skip frames), and it's still just style updates inside a loop.
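A minimal sketch of the difference (the #box element and the 200px distance are made up):
// Old approach: a fixed timer that fires regardless of what the browser is doing.
// setInterval(function () { /* update styles */ }, 16);

// requestAnimationFrame approach: the browser schedules (and may skip) frames.
var box = document.getElementById('box');   // assumed element on the page
var start = null;

function step(timestamp) {
  if (start === null) { start = timestamp; }
  var progress = Math.min((timestamp - start) / 1000, 1);  // 1-second animation
  box.style.transform = 'translateX(' + (progress * 200) + 'px)';
  if (progress < 1) { window.requestAnimationFrame(step); }
}
window.requestAnimationFrame(step);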
What GSAP does is use this requestAnimationFrame() while optimizing for fewer repaints and adding a lot of useful features (reverse, timelines, stagger...). On basic animations, you can achieve the same performance with CSS3 or your own JS code... if you know what you're doing.
There are also other animation techniques (canvas, SVG... even WebGL), but they are more specialized.
In my limited experience with GSAP, and during some discussion with the developer of Velocity.js, it seems that GSAP has some very convoluted code that is quite difficult to interpret; I'm not sure whether this is because they're purposefully obfuscating the code or because they do some very crazy optimizations... Maybe a bit of both. They may also have taken native code and created a JavaScript implementation. In any case it makes reading the source quite difficult.
As far as your question about HTML5 animation, which I take to mean animating objects in the DOM without JavaScript: if you're using a compliant browser*, you can achieve many animations via the CSS transform and transition properties, the former being about DOM object transformations (moving it 10px to the left) and the latter being about how the object moves as it transforms (does it move linearly over a period of time, or follow a user-specified cubic Bezier curve?). With current compliant browsers, the main difference in these properties, besides vendor prefixes like -webkit- and -moz-, is small. What this means is that you can reliably get animations on DOM objects across those browsers if you take care of the prefixes. You can use these transform/transition properties to manipulate almost any DOM object property.
In terms of speed, it depends. CSS animation is generally the faster of the two, but it lacks control because it's hard or impossible to manipulate certain properties, like keyframes. JavaScript animation is slower, but using a good library makes this difference negligible, and in some cases JavaScript animation can be faster. It really depends on what you're trying to achieve. CSS and newer JavaScript library animation is considerably faster than jQuery for some of the reasons listed here.
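A hedged sketch of that idea from script (the .card selector and the values are invented; in practice a stylesheet rule would often be cleaner):
// Illustrative only: set a prefixed transition and transform from JavaScript.
var card = document.querySelector('.card');            // assumed element
['transition', 'webkitTransition'].forEach(function (prop) {
  card.style[prop] = 'transform 0.5s ease-in-out';      // how it moves
});
['transform', 'webkitTransform'].forEach(function (prop) {
  card.style[prop] = 'translateX(-10px)';               // what changes (10px to the left)
});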
*: Of course some browsers, like IE9 and 10, while being relatively compliant, do miss some properties and have quirks about their rendering engines.
So I'm building a large site which uses CSS3 transitions for animations (I'm using jquery.transit to manipulate element transitions and their CSS styles), and I've stumbled upon 2 problems:
FF (latest update) doesn't seem to use the GPU for translate3d() layer rendering; maybe I'm wrong and Mozilla doesn't use GPU-accelerated graphics at all. I really don't understand that completely yet.
Even if I trick, for example, Chrome into using the GPU for translate3d() and translateZ() layer rendering, on computers with a bad GPU or no GPU at all the graphics are so terrible that you sometimes can't even see the middle of the transition, just the start and the end.
Questions:
What do I do to improve FPS for transitioning elements? E.g. I have a 3200x3200 div rotating, scaling and translating on the x and y axes at the same time, with approx. 5-20 elements displayed on that div's surface.
Maybe there is a way I can detect whether the browser has enough GPU support, so I know if I need to redirect to a simpler version of the site or not?
Because WebGL uses the GPU, the amazing Modernizr project lets you check for that in WebGL-capable browsers: http://modernizr.com/news/
Check Modernizr.webgl under http://modernizr.com/docs/
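A minimal sketch of that check, assuming Modernizr (with the webgl detect) is already on the page; initFullSite and the /simple URL are placeholders:
// If WebGL is reported, the browser is using the GPU and can probably cope
// with heavy 3D transforms; otherwise fall back to the lighter version.
if (window.Modernizr && Modernizr.webgl) {
  initFullSite();                      // hypothetical heavy-animation setup
} else {
  window.location.href = '/simple';    // made-up URL for the simpler version
}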
The main issue at the time was that all of the PNGs on the screen had to be recalculated and recompiled in the browser.
There were several things I had to do to maximize performance:
Always have predefined width and height attributes on images. This lets the browser know what size the picture should be, and when used together with scale() it won't recalculate and recompile those images. Those recalculations were very expensive. So basically, if nothing other than scale() modified the image size, everything was perfect and the animations were awesome.
Wherever possible avoid using the visibility property; it effectively acts like opacity: 0, keeping the element in the layout and making layout recalculation much longer. Always use display: none where possible; this completely eliminates the element from layout calculations. This was a major pitfall, because I had to re-think the UI to exclude visibility and I had to minimize the number of DOM nodes used.
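A small sketch of both points above (file names and ids are invented):
// 1) Predefined dimensions + scale(): the browser knows the image's layout size
//    up front and only has to composite the scaled bitmap.
var img = document.createElement('img');
img.src = 'texture.png';                 // made-up asset
img.width = 200;                         // predefined width attribute
img.height = 200;                        // predefined height attribute
img.style.transform = 'scale(0.5)';      // resize via transform, not width/height
document.body.appendChild(img);

// 2) Take hidden elements out of layout entirely instead of using visibility.
var panel = document.getElementById('panel');   // assumed element
panel.style.display = 'none';                   // removed from layout calculations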
Overall it was a huge adventure and a great learning experience; I hope this question/answer helps someone.
Is there a standard (accepted/easy/performant) way to determine how fast a client machine renders javascript?
When I'm running web apps (videos, etc) on my other tabs my JS animations slow to a crawl.
If I could detect slowness from my JS, I would use simpler animations to provide a better user experience.
Update:
Removing animations for everyone is not the answer. I am talking about the simplest of animations which will stutter depending on browser / computer. If I could detect the level of slowness, I would simply disable them.
This is the same as video games with dynamic graphics quality: you want to please people with old computers without penalizing those who have the extra processing power.
One tip is to disable those hidden animations. If they are on another tab that is not in focus, what's the use of keeping them animated?
Another is to keep animations to a minimum. I assume you are working with the DOM, and DOM operations are expensive, so keep them to a minimum as well.
One tip I got somewhere: if you are doing image animation/manipulation, consider using canvas instead so that you are not operating on the DOM.
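One way to do that is the Page Visibility API; a minimal sketch (pauseAnimations / resumeAnimations are placeholders for whatever your code uses):
// Stop animation work while the tab is hidden, restart it when it comes back.
document.addEventListener('visibilitychange', function () {
  if (document.hidden) {
    pauseAnimations();    // hypothetical: clear timers, cancel rAF loops
  } else {
    resumeAnimations();   // hypothetical: kick them off again
  }
});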
Also, consider progressive enhancement. Keep your features simple and work your way up to complicated things. Use the simple features as a baseline every time you add something new. That way, you can easily determine what causes the problem and fix it accordingly.
The main problem you should first address is why it is slow, not when it is slow.
I know this question is old, but I've just stumbled across it. The simplest way is to execute a long loop and measure the start and end time. This should give you some idea of the machine's Javascript performance.
Please bear in mind, this may delay page loading, so you may want to store the result in a cookie, so it's not measured on every visit to the page.
Something like:
var starttime = new Date();
for( var i=0; i<1000000; i++ ) ;   // empty busy-loop, just to burn CPU
var dt = new Date() - starttime;   // elapsed time in milliseconds
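To avoid re-measuring on every visit, a small sketch of caching the result in a cookie (the jsPerf cookie name is made up):
var match = document.cookie.match(/(?:^|; )jsPerf=(\d+)/);
var dt;
if (match) {
  dt = parseInt(match[1], 10);                 // reuse the cached measurement
} else {
  var starttime = new Date();
  for (var i = 0; i < 1000000; i++) ;          // same busy-loop as above
  dt = new Date() - starttime;
  document.cookie = 'jsPerf=' + dt + '; max-age=604800; path=/';  // keep for a week
}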
Hope this helps.
My website has a jQuery script (from Shadow animation jQuery plugin) which constantly changes the colour of box shadow of various <div>s on the home page.
The animation is not essential, but it does take up a lot of CPU time on slower machines.
Is it possible to find out if the script will run 'too slowly'? I can then disable it before it impacts performance.
Is this even a good idea? If not, is there an easy way to break up the jQuery animate?
This may indirectly solve your problem. Pick a few algorithms and performance tests from this site http://dromaeo.com/ that seem similar to your jQuery plugin. Don't run comprehensive tests as they do on the site. Instead, pick fairly small and fast algorithms, and run them for an unnoticeable period of time.
Use a tiny predefined time span to limit how long these tests are allowed to run. Let's say that span is 200 ms: if on a fast machine with browser A you can get 100 iterations, while some random user's machine is only able to complete 5 iterations, then you may want to consider disabling the animation on that user's machine. Tweak and tweak till you find the optimal numbers.
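A rough sketch of that kind of budgeted test (the workload inside the test function and the threshold of 5 are just the example numbers above):
// Count how many iterations of a small test fit inside a fixed time budget.
function benchmark(testFn, budgetMs) {
  var iterations = 0;
  var end = Date.now() + budgetMs;
  while (Date.now() < end) {
    testFn();
    iterations++;
  }
  return iterations;
}

var score = benchmark(function () {
  // arbitrary small workload standing in for a dromaeo-style test
  for (var i = 0, s = 0; i < 10000; i++) { s += Math.sqrt(i); }
}, 200);

var animationsEnabled = score > 5;   // threshold found by tweaking, as described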
As a bonus, send all test results back to your server so you have a better idea of where your users lie in the speed spectrum. If a big majority of users are using slower computers and older browsers, then it just may make sense to remove the thing altogether.
You could probably do it by timing a few times round a loop which did some intensive processing on page load, but that's going to slow the page and add even further to CPU load, so it doesn't seem like a great solution.
A compromise I've used in the past, though, was to make the decision based on browser version, for example, Internet Explorer 6 users get simpler content whereas newer browsers with better JavaScript performance get the animation. That seemed to work pretty well at a practical level. In practice, browser choice is a big factor in JavaScript performance and you might get a 90% fit with what you want very simply just by taking that into account.
You could do something like $(window).width() to get the browser width. Using this you could make the assumption that anything < 1024px wide is likely to be either a netbook, smartphone or old computer.
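Something like this (1024 is just the cut-off from the rule of thumb above):
// Crude heuristic: treat narrow viewports as likely low-powered machines.
var probablySlow = $(window).width() < 1024;
if (probablySlow) {
  // skip or simplify the animations here
}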
This wouldn't be nearly as accurate as timing a loop, but it is much more efficient.
Obviously this rule is a generalisation and there will be slow computers with screens wider than 1024px. But in general, a 1024px+ computer would typically be able to handle a fair bit of JavaScript (until the owner puts on loads of software, virus scanners and browser toolbars!)
Hope this is useful!