Comparing similar JavaScript code

As we all know, there are multiple ways of going about doing the same thing. I am looking for a way to compare the efficiency of three different JavaScript programs that do the same thing. I've placed all of the code in separate text files just to rank the file sizes, but I don't think the smallest code is necessarily the most efficient. I have Chrome Developer Tools and Firebug; will these get the job done, or is there a fancier way?

You can use the jsPerf.com site to test the performance of different pieces of JavaScript code.

Use Dynatrace to measure the efficiency of the JavaScript.
http://www.dynatrace.com

One way would be to create some test data and set up a loop to benchmark each of your JavaScript programs by running the data through some number of times, say 10,000 repetitions with each program. Use JavaScript to time each program and show the times for each.
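A minimal sketch of such a harness (runProgramA/B/C and testData are hypothetical stand-ins for your three scripts and your sample input):

    // Time each candidate over many repetitions of the same input.
    function benchmark(name, fn, data, reps) {
        var start = Date.now();
        for (var i = 0; i < reps; i++) {
            fn(data);
        }
        console.log(name + ': ' + (Date.now() - start) + ' ms for ' + reps + ' runs');
    }

    var testData = [3, 1, 4, 1, 5, 9, 2, 6];     // made-up sample input
    benchmark('version A', runProgramA, testData, 10000);
    benchmark('version B', runProgramB, testData, 10000);
    benchmark('version C', runProgramC, testData, 10000);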

Related

Is there a way/function to set a limit on how much processing power a script can use in JavaScript?

What I am trying to do is set a limit on how much processing power my script can use in JavaScript. For instance, after 2 MB it starts to lag, similar to how NES games would lag when too much action was displayed on the screen.
I have tried doing research, but nothing gives me the answer I'm looking for. Everything is about improving performance rather than limiting how much can be used.
Since it seems you are trying to limit memory usage, you might be able to track whatever it is that is using memory and simply limit that. For example, if you're creating a lot of objects, you could reuse created objects once you have created a certain number, which would limit the space used.
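A minimal sketch of that reuse idea, assuming your allocations can go through a pool (the ObjectPool type and the sprite factory below are illustrative, not a standard API):

    // Simple object pool: once maxSize objects exist, callers must
    // recycle released objects instead of allocating new ones.
    function ObjectPool(maxSize, factory) {
        this.maxSize = maxSize;
        this.factory = factory;    // creates one fresh object
        this.free = [];
        this.created = 0;
    }

    ObjectPool.prototype.acquire = function () {
        if (this.free.length > 0) {
            return this.free.pop();               // reuse a released object
        }
        if (this.created < this.maxSize) {
            this.created++;
            return this.factory();
        }
        return null;                              // hard cap reached
    };

    ObjectPool.prototype.release = function (obj) {
        this.free.push(obj);                      // available for reuse again
    };

    // Hypothetical usage: never more than 1000 sprite objects alive.
    var pool = new ObjectPool(1000, function () { return { x: 0, y: 0 }; });
    var sprite = pool.acquire();
    // ... use sprite ...
    if (sprite) pool.release(sprite);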
As an aside, I would suggest you check out service workers; they might present an alternate way of solving your issue.

How to get metrics of JavaScript/DOM loading and processing time

We maintain a web page that works with >20k users in real time.
We worked hard to optimize the back-end system, but we now need to make optimizations in the front end.
The first thing I came up with was to use some tool to monitor the load time of the JS files.
But we don't really need the load times of the JavaScript; what we really need is to know which parts of our JavaScript code take the most time to finish.
We are currently using New Relic to track our site and learn which server-side scripts need to be optimized.
But I can't see any metrics about front-end code or which files need to be optimized.
Is there any tool out there that may help us with this?
The best way to test your JavaScript speed across browsers is to write your own function.
Wrap whatever code you want to test in a function and run that function as many times as you can within one second, counting the iterations. Run that test at least five times, because you will get a different result every time, and then find the average number of iterations. I also suggest using Chrome, since it is about four times faster than any other browser out there when it comes to JavaScript performance. Test all your code and optimize it for your browser, and the improvement will benefit your users' experience as well.
I have my own speedTest object that I wrote, which you can view at my website here: http://www.deckersdesign.com/source.php (under construction). FYI, this object finds the median of the iterations, not the average.
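The idea boils down to something like this sketch (not the actual speedTest object from the site above, just an illustration of the technique):

    // Count how many times fn runs in a one-second window,
    // repeat the whole test several times, report the median.
    function speedTest(fn, runs) {
        var counts = [];
        for (var r = 0; r < (runs || 5); r++) {
            var count = 0;
            var end = Date.now() + 1000;
            while (Date.now() < end) {
                fn();
                count++;
            }
            counts.push(count);
        }
        counts.sort(function (a, b) { return a - b; });
        return counts[Math.floor(counts.length / 2)];    // median
    }

    // Hypothetical usage:
    console.log(speedTest(function () { Math.sqrt(12345); }) + ' iterations/second');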

Is it a fact that a smaller jQuery plugin (in KB) will perform better than the same plugin with the same functionality but of a bigger size?

Or does it depend on the way in which it is written?
My question is related to page rendering time. Can we determine which will give better performance?
Should we always choose the smaller-sized plugin?
If you mean that you have two plugins with comparable functionality but different sizes (not just minified, but genuinely different code), the answer is: maybe. One thing is for sure: the smaller plugin will load faster. But for a million reasons, the bigger plugin can be faster after that. Without benchmarks, you can only guess.
The only way to know is to measure it with a tool like YSlow or Google Page Speed.
For example, I could write the simplest of plugins that halts your page rendering or downloads too many things.
But usually, if both are jQuery plugins with exactly the same behavior, the shorter the better. If you are comparing two different things, e.g. one jQuery plugin against one plain-JavaScript script, then you must also consider the size of the jQuery library and similar overhead.
A normally minified JavaScript file, with whitespace removed, will run just as fast as the non-minified version. The difference is negligible, as the only benefit for the interpreter is that it doesn't have to skip as much whitespace. The benefit of minified JS files comes from reduced file size, not from increased performance.
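To illustrate, minification only changes the source text, not what the interpreter ends up executing (a made-up example):

    // Original:
    function addNumbers(firstValue, secondValue) {
        return firstValue + secondValue;
    }

    // Minified equivalent: same behavior, smaller download.
    function addNumbers(a,b){return a+b}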
But if a JavaScript file is 'packed' with Dean Edwards' Packer, that's a whole different case, as it might actually decrease performance: the code must first be unpacked with eval(). That's why in some cases it might not be advisable to pack with the 'maximum' setting, that is, with the 'Base62 encode' option.
So, if you're looking for performance optimizations, don't start by stripping whitespace or shortening variable names :)
But as an answer to your last question: yes, you should always choose the minified version. At least it will load faster due to its smaller file size.
Assuming you mean that one plugin file has been minified, I would say in the majority of cases, yes.
But it really depends on hundreds of factors...
jQuery distributes a 'minified' version of the library. The code is stripped of inessential whitespace and other elements using a tool called JSMin. It's identical in function to the non-minified version, but loads much more quickly. If you are debugging, though, you might find it easier to use the full version.

Financial calculator in JavaScript runs much slower than in Excel

I've ported an Excel retirement calculator to JavaScript. There are 35 worksheets in the original Excel file containing many recursive calculations, all of which I've converted to JavaScript. The JavaScript version runs slower (1-2 seconds, compared to Excel's instantaneous recalculation).
I am already caching the recursive calculations to speed things up and prevent stack overflows in the browser.
Is it realistic to try to make the JavaScript faster?
How does Excel manage to be so efficient?
I read somewhere that Excel only recalculates when a cell's precedents have been modified. Even so, it seems to me that Excel is pretty much instantaneous no matter how much data needs to be recalculated.
Excel is faster because it's a few layers closer to the CPU: it runs compiled code within the OS, rather than interpreted JavaScript within a browser.
I'd compare performance in Google Chrome or Firefox 3, which have a new generation of JavaScript engines, and see how things improve. See John Resig's post: http://ejohn.org/blog/javascript-performance-rundown/.
JavaScript is slower than compiled languages; that's why Excel is so much faster. I would use Firebug's profiler to figure out where your code is spending most of its time and focus on improving that.
If you've ported the Excel formulas to JavaScript while preserving the algorithms intact, the JavaScript code you've ended up with may not be ideal JavaScript. Have you considered refactoring things to take advantage of JavaScript's powerful language features?
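For instance, a generic memoizing wrapper can cache any pure calculation without hand-rolling a cache per formula. This is just a sketch; using JSON.stringify for the cache key is a simplifying assumption that only suits plain, serializable arguments:

    // Wraps a pure function so repeated calls with the same
    // arguments return a cached result instead of recomputing.
    function memoize(fn) {
        var cache = {};
        return function () {
            var key = JSON.stringify(Array.prototype.slice.call(arguments));
            if (!(key in cache)) {
                cache[key] = fn.apply(this, arguments);
            }
            return cache[key];
        };
    }

    // Hypothetical worksheet-style recursion; the recursive call goes
    // through the memoized wrapper, so intermediate years are cached.
    var futureValue = memoize(function (principal, rate, years) {
        if (years === 0) return principal;
        return futureValue(principal, rate, years - 1) * (1 + rate);
    });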
Also, are you rendering things (updating table cells, etc.) while doing the calculations? Keep in mind that some DHTML updates can put a big burden on the browser (hey, you're running this inside a browser, right?). Perhaps separating the calculation and the rendering would help: first busily do all the calculations, then do the presentation as the final step.
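A sketch of that separation (calculateAll() and the resultsRow element are hypothetical stand-ins for your own calculation code and results table):

    // Do all the math first, then touch the DOM exactly once.
    function recalcAndRender() {
        var results = calculateAll();      // pure calculation, no DOM access
        var html = [];
        for (var i = 0; i < results.length; i++) {
            html.push('<td>' + results[i].toFixed(2) + '</td>');
        }
        document.getElementById('resultsRow').innerHTML = html.join('');
    }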
Like other people have said, JavaScript is nowhere near as fast as a compiled language. Currently, there's somewhat of an arms race between Chrome, Firefox, and WebKit's JavaScript interpreters, which has really improved the speed situation for JavaScript. However, it's still pretty slow, and if you're using IE7 (or, even worse, IE6), performance can be pretty dismal.
You may want to look at some of the JavaScript libraries that are out there (personally, I prefer jQuery) to see if some of them have utility functions that you could take advantage of. Some of the more heavily used JavaScript libraries may have optimized some of the work that you're trying to do. It certainly won't make JavaScript as fast as Excel, but if you can replace a lot of your functionality with utilities that have been optimized by many different people, you could see a little bit of a speed increase.

Are there any generalities in the cost of executing an instruction in JavaScript?

I was wondering if there are any generalities (among all the JavaScript engines out there) in the cost of executing a given instruction versus another.
For instance, eval() is slower than calling a function that has already been declared.
I would like a table of several instructions/function calls versus an absolute cost, maybe a cost per engine.
Does such a document exist?
There's a page here (by one of our illustrious hosts, no less) that gives a breakdown by browser and by general class of instruction:
http://www.codinghorror.com/blog/archives/001023.html
The above page links to a more detailed breakdown here:
http://www.codinghorror.com/blog/files/sunspider-09-benchmark-results.txt
Neither of those pages breaks down performance to the level of individual function calls or arithmetic operations or what have you. Still, there is quite a bit of potentially useful information.
There is also a link to the benchmark itself:
http://www2.webkit.org/perf/sunspider-0.9/sunspider.html
By viewing the benchmark source you can get a better idea of what specific function calls are being tested.
It also seems like it might be a simple matter to create your own custom version of the benchmark that collects the more specific data you are interested in. You could then run the modified benchmark on a variety of browsers, perhaps even taking advantage of a service such as browsershots.org to test a wide spectrum of browsers. Not sure how well that would work, but it might be fun to try...
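For example, a cut-down custom benchmark along these lines could compare eval() against a pre-declared function, the case mentioned in the question (just a sketch; the iteration count is arbitrary):

    // Tiny micro-benchmark helper.
    function timeIt(label, fn, reps) {
        var start = Date.now();
        for (var i = 0; i < reps; i++) fn();
        console.log(label + ': ' + (Date.now() - start) + ' ms');
    }

    function add(a, b) { return a + b; }

    timeIt('declared function', function () { add(1, 2); }, 100000);
    timeIt('eval', function () { eval('1 + 2'); }, 100000);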
It is of course possible that the same operation executed in the same browser might take significantly different amounts of time depending on the context in which it's being used, in ways that might not be immediately obvious. For example, I could imagine a JavaScript engine spending more time optimizing code that is executed frequently, with the result that code executed in a tight loop might run faster than identical code executed infrequently. Of course, that might not matter much in practice. Still, I imagine that a good table of the sort you are looking for would also summarize any such effects if they turned out to be important.
