I am creating a website professionally for a client as my first project, and I am using a lot of libraries: for instance velocity.js, jQuery, jQuery UI, animate.css, and an image-slider plugin for jQuery. Right now I am using the minified versions, and all of the files are downloaded to my machine. When I put the site live, will this severely affect the loading time of the website, or will it be normal?
You can put it up and test it with an online page-speed tool, but the best way is to put it live and measure the ping yourself.
Yes, it will severely affect the loading of the page. Chrome comes with developer tools out of the box, and Firebug for Firefox is only a couple of clicks away. That, combined with the round-trip time (RTT) and bandwidth to the site, gives you enough information to calculate fairly precisely how slow the first-hit page load will be.
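As a rough back-of-the-envelope example (all numbers made up): six render-blocking files fetched one after another over a 100 ms RTT connection cost about 6 × 100 ms = 600 ms in round trips alone, plus roughly 300 ms to transfer 300 kB of script at 1 MB/s, before the browser even starts parsing any of it.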
Assuming that caching is configured correctly, subsequent page transitions should be faster, but even loading all this JavaScript from cache and parsing it will have a big impact on the load time. Note that in the case of DokuWiki, the CMS already handles merging the JavaScript into a single file.
Using pjax or similar will help with subsequent transitions. Using online repositories for standard libraries does not help with performance. Minifying the JavaScript does not give very big wins on its own (since you are already compressing the files on the fly, aren't you?).
There are big wins to be had from deferring JavaScript loading and from stripping, optimizing, and merging CSS files.
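For instance, a minimal sketch of deferral ("combined.min.js" is a placeholder for your merged script):

    <!-- defer: download in parallel with HTML parsing, but execute
         only after the document has been parsed, so rendering is
         never blocked by script execution -->
    <script src="combined.min.js" defer></script>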
There are a lot of other things you can do to make your page render faster - but space here is too limited.
I'm developing a Node.js application. My client-side JavaScript is about 1,000 lines of code, plus 3 libraries. I minified and concatenated my JavaScript files into a single file of 150 kB (50 kB gzipped).
I wonder at what file size it would be reasonable to think about using AMD, require.js, or similar.
Short answer:
~100-150 kB is a very big warning sign, but file size is not the only indicator. Also check whether the script loading blocks responsiveness and UI rendering, and by how much (a look at the Network panel in Chrome's devtools shows this); that matters for making a good decision.
Long answer:
The reason for using asynchronous loading is less about file size and more about the usage patterns and timing of the different parts of the code. The idea is to load the minimum required in the initial load, so the page loads as fast as it can and the user gets the most responsive experience, and then to load, in the background, resources that will be needed in the future.
A 150 kB file is a lot (though yours is 50 kB gzipped, so less scary), but if the JS file loading doesn't block the UI rendering, then file size is less of a problem. I mean, if the user doesn't notice a page-loading delay, then there isn't a file-size problem. Also, if this is the case, you might want to consider the async script attribute. This is of course moot for applications built on client-side frameworks like Angular or ExtJS, where the JS itself renders the UI.
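As a minimal sketch (the file name is a placeholder):

    <!-- async: fetch in parallel and execute as soon as it arrives,
         without blocking HTML parsing; suitable for self-contained
         scripts that don't render the UI -->
    <script src="widget.js" async></script>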
Ask yourself how much of your code is used by how many of your users, and how often (be honest with yourself). A rule of thumb is that if any UI elements appear later in the application flow, their code can be loaded later. But remember that, as with all rules of thumb, there are exceptions, so pay attention. Good analytics data is better than an arbitrary file-size threshold: you can really see which parts of the UI (and code) are used and when. Maybe some only need to load very late in the flow.
If all code is needed upfront (This is rarely the case), then AMD is probably not for you.
Say your application has a statistics-charts dialog that uses a lot of unique code, but only 20% of users ever click the button that opens it, and that button lives on the settings page. Loading all of its code along with everything else, on every other page, is a waste; it's better to load it when the user reaches the settings page, or when the button is clicked (or even when the mouse hovers over the button).
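A hedged sketch of that idea with require.js (the element id and module name are made up):

    // Load the dialog's code only when the button is actually clicked.
    document.getElementById('stats-button').onclick = function () {
        require(['stats/chartsDialog'], function (chartsDialog) {
            chartsDialog.open(); // runs once the module has loaded
        });
    };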
Understanding usage patterns and timing is the key criterion.
These two articles are very interesting and might help in making some deployment decisions:
https://alexsexton.com/blog/2013/03/deploying-javascript-applications/
http://tomdale.net/2012/01/amd-is-not-the-answer/
And if you still want to load everything in one file, require.js also has an optimizer tool that is worth a look:
http://requirejs.org/docs/optimization.html
I'm displaying a bar code on a web page and wondering if there is any disadvantage to using JavaScript to do it, rather than an image or PDF, in terms of caching?
Caching will most likely be applied to both scripts and images, so in that respect there shouldn't be much difference. However, I assume your image will change every now and then to show a different bar code? In that case a script that generates the code has the advantage: the generation code never changes and stays cached, while the differing images must be downloaded again each time.
JavaScript can be cached if it's in an external file, so go for it. Recognise, however, that some browsers have JavaScript disabled; it's probably a higher percentage of users than you would think, given the popularity of plugins such as NoScript.
The advantage of an image is that it will work in far more browsers (effectively 100%). It can also be saved, if that is something visitors might do; saving the output of a JS file is a little trickier for end users.
So I would favour an image, unless you are generating hundreds of new barcodes and have a JS solution that makes that job a lot easier.
Cliffnotes: It depends on your situation. But hopefully I've outlined the most important pros/cons.
Any output you 'produce' via JavaScript cannot itself be cached, though the script file that produces it can be. If you are using JavaScript to load an image or some other document, then that resource will be cached as usual.
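To illustrate the distinction, a minimal sketch (the drawing logic is a made-up toy, not a real barcode encoding):

    // barcode.js -- served as an external file, so the browser caches it.
    // Only the data passed in changes from page to page; the script does not.
    function drawBarcode(canvas, digits) {
        var ctx = canvas.getContext('2d');
        var x = 0;
        for (var i = 0; i < digits.length; i++) {
            var w = (digits.charCodeAt(i) % 4) + 1; // toy bar width
            ctx.fillRect(x, 0, w, canvas.height);
            x += w * 2;
        }
    }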
"The jQuery Mobile "page" structure is optimized to support either single pages, or local internal linked "pages" within a page." jQuery docs
What gives better performance for a jQuery Mobile application running on PhoneGap:
all pages in a single .html file with internal loading, or
separate pages linked externally?
Any other aspects to consider?
I use jQuery mobile, and all the sites that I've made have been one page sites. The only external pages that I create are those that have embedded Google maps, just so the iframe loading doesn't happen if the user doesn't need it.
I think it boils down to this: one page with lots of content may slow the initial load but will be snappier once loaded, whereas a tiny home page will be quick from the start, but each linked page will trigger an Ajax request. When designing for mobile, my rule of thumb is to minimize HTTP requests as much as possible. Though many users are on 3G or faster networks, there can still be a wait depending on connectivity. Also, connectivity can change in an instant, and if the user has been navigating the site successfully and all of a sudden things slow to a crawl, that creates frustration. So from a user-experience point of view, I think users are willing to wait a few extra ticks on the initial load if everything else is quick once it's loaded.
Designing all in one page is also good for development with jQM, imo, because I just create a cache-manifest that includes only one page (and the css and js files). Then my site is cached and runs even if the user has no connectivity. If you've worked with applicationCache, you quickly realize that the more files you have, the more difficult it is to maintain the cache manifest and updates are slower.
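For example, a one-page cache manifest along those lines might look like this (file names are placeholders):

    CACHE MANIFEST
    # v1 -- change this comment to force clients to re-fetch everything
    index.html
    app.min.js
    styles.min.css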
I can't say much about browser performance, but you should consider load times.
Multiple pages in one document are loaded with the document, so if there are many of them, DOMready will only fire after some time, which can give an unpleasant look. Separate pages are fetched when you need them, so for an online app, if there's no particular reason to use a multipage document, I'd recommend sticking with multiple HTML files.
Also, a multipage document doesn't work well if you want to stick with progressive enhancement, which is jQM's development philosophy.
Any other aspects to consider?
Yes... As far as I know, there still might be some problems (e.g. with dialogs) in multipage documents. jQM alpha 3 didn't want to display dialogs for me if there was more than one in a multipage document.
This depends on your app's size. Personally I've found that a one-page app is more responsive if you have only a few pages; in my case, three screens loading external data were more responsive than three separate pages.
Hope that helps
I am getting extremely slow load times, and before any action is taken I was wondering if there is an easy way to test whether the cause is the ISP or the coding of the site.
We currently have a T1 going to two mirrored servers, so I don't think the ISP is the issue; we only have a few users on at a time.
The website is: http://www.designfacilitator.com/v20/Default.aspx?login=true&ReturnUrl=%2fv20%2fDefault.aspx
Is there a definitive test to determine where the problem lies? Any advice would be great.
Thanks!
Do you notice high load times if you run the web app on an intranet?
If it's the coding, it'll be slow under local load testing as well. To know for sure, you want to turn on ASP.NET tracing and look at load times and latencies through the trace viewer (or directly in the pages). The figures will jump out at you!
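Enabling tracing is a small web.config change; something along these lines (the attribute values shown are common defaults):

    <configuration>
      <system.web>
        <!-- pageOutput="false" keeps trace info off the pages;
             browse /trace.axd to view it instead -->
        <trace enabled="true" pageOutput="false"
               requestLimit="40" localOnly="true" />
      </system.web>
    </configuration>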
The definitive test you're looking for would be to access the website from somewhere else, with a different ISP (if it's still slow, there you go), but this is a fairly obvious suggestion, so I'm probably missing some element here.
Anyway, Experience-wise, it's almost always the Coding :)
I loaded the site in the Firebug Net panel and the initial HTML loads in less than a second, so it doesn't look like a really limited server or bandwidth situation. Still there is a lot you can do to speed up the page.
First get Firefox (if you don't have it), then install Firebug (getfirebug.com), then install YSlow (from the Firefox plugin site), which will analyze your page and give you recommendations. There is also a plugin from Google called Page Speed that does some of the work for you: it will optimize your images and combine the JS into a single file.
Firebug's Net tab shows you at what point each file included in your page is loaded and how long it takes, which can help spot problems. YSlow will also give you specific recommendations.
From the quick look I had at your source, you should move your JS files to the bottom of the page, and I think you could combine them into fewer files for even more speed.
Remember, the trick is to load only the smallest amount of code required to make your page work. Then, once the page is loaded, there are a number of ways to pull in additional code as you need it.
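One simple sketch of that (the file name is a placeholder):

    // Inject a script tag after the initial page is already interactive.
    function loadScript(src, onLoad) {
        var s = document.createElement('script');
        s.src = src;
        s.onload = onLoad;
        document.getElementsByTagName('head')[0].appendChild(s);
    }

    loadScript('extra-features.js', function () {
        // extra-features.js is now downloaded, parsed, and ready to use
    });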
Keep an eye on Steve Souder's blog (http://www.stevesouders.com/), he's pretty much the guru of front-end performance.
I am trying to compare the performance of several JavaScript libraries. Although measuring transaction times helps determine the working performance of a library, it doesn't account for the time required to download and initialize each library. I'm looking for suggestions on the best method of determining load time, other than using tools such as Firebug. I would like to set up a controlled environment where a page could be loaded n times while capturing the start and end times. Should the library's contents be included inline in the page rather than as an include file, or is there a better way?
Reading this article by John Resig on JavaScript Benchmark Quality before you start anything may help you out.
After that, I would suggest requesting the JavaScript from your server and then timing how long eval(responseJS) takes. That way, you are timing only how long the library takes to load and initialize, rather than that plus the time it takes to download from the server.
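A rough sketch of that measurement (a synchronous XMLHttpRequest is used here purely to keep the timing simple; the URL is a placeholder):

    // Fetch the library source first, so download time is excluded,
    // then time only the evaluation/initialization step.
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/js/library.min.js', false); // synchronous, for simplicity
    xhr.send(null);

    var start = new Date().getTime();
    eval(xhr.responseText);
    var elapsed = new Date().getTime() - start;
    console.log('parse + init took ' + elapsed + ' ms');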
The libraries should always be in an external file, included via a script tag, either on their own or with the site's own scripting rolled in. Minified and packed files have a smaller transfer size. Delivery via a CDN is optimal as well, since the CDN will have it cached; many of the popular frameworks are available over Google's CDN.
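For example, jQuery from Google's CDN with a local fallback in case the CDN is unreachable (the version number and local path are illustrative):

    <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.12.4/jquery.min.js"></script>
    <script>
        // If the CDN copy failed to load, fall back to a local copy.
        window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');
    </script>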
You must also account not only for the library, but for the application using it. The quality of the JS in the libraries is (typically) top notch, but what about the quality of the code tapping into those libraries, or even of plugins that may not be developed by the library authors? You also have to look at which browser is being used. As much as we hate it, most of these cross-browser libraries are optimized for best performance in Internet Explorer, because it retains an 85+% market share.
The performance of any library is really a trade-off: deciding what is acceptable in order to get your application to do whatever it is you want it to do.