Optimising an AngularJS e-commerce project - javascript

I'm working on an AngularJS project whose loading time is too high (more than 15 seconds). It has too many JS and CSS files; the page loads everything at once on the first load, and I want to load the JS and CSS on demand instead. How do I optimise this?
Where should I start? Please suggest some good guidelines and workflows.

Some points to consider:
Make sure your CSS, JavaScript and HTML files are being served with gzip encoding. This will dramatically reduce the amount of data that needs to be transferred on page load (see the sketch after this list).
Ensure your JS and CSS are minified, which eliminates whitespace and reduces the size of the files.
Try to remove dependencies on large, bulky libraries. jQuery is huge, and often more lightweight libraries can be used instead.
Take a look at Google's PageSpeed module for Apache and NGINX. This can perform several magic tasks such as combining your JavaScript and CSS into single files. https://developers.google.com/speed/pagespeed/module/
If you use a compatible web server, consider enabling HTTP/2 (or SPDY on older servers).
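For the gzip and minified-asset points, a minimal sketch of how this might look on a Node/Express front end (the "compression" package and the "dist" directory are assumptions here; Apache and nginx achieve the same thing through their own config):

```js
// a minimal sketch, assuming Express and the compression package are installed
// (npm install express compression); paths are placeholders
const express = require('express');
const compression = require('compression');

const app = express();
app.use(compression());            // serves JS/CSS/HTML responses gzip-encoded
app.use(express.static('dist', {   // 'dist' holds the already-minified build output
  maxAge: '7d'                     // lets browsers cache the assets for a week
}));

app.listen(3000);
```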
Content Delivery Networks
Some people promote the use of public Content Delivery Networks (CDNs, such as maxcdn) to serve your JavaScript files. This has several potential advantages:
With popular libraries, it's likely the files will already be cached in the browsers of people who are visiting your site.
CDNs are generally globally distributed, which should make them fast and responsive worldwide.
However, I personally disagree, and I make sure that all my 3rd party JavaScript libraries are served locally. Here's why:
If a CDN is offline or having issues, your site will be broken (though this can be patched somewhat with fallback loading)
If a CDN is overloaded or saturated, your site will also be slow
It adds at least one additional DNS lookup on page load, which is another point of failure
If the CDN gets compromised, someone could place malicious JavaScript on your site (this can be avoided by using the "integrity" checksum; see the snippet after this list)
CDN-served files cannot be inlined, combined and optimised by Google PageSpeed
CDN-served files cannot be pipelined with the rest of your site
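For the fallback and integrity points above, the pattern in markup looks roughly like this (the CDN URL and the hash are placeholders, not real values):

```html
<!-- CDN copy, pinned with an integrity checksum (the hash below is a placeholder) -->
<script src="https://cdn.example.com/jquery.min.js"
        integrity="sha384-PLACEHOLDER_HASH"
        crossorigin="anonymous"></script>
<!-- if the CDN is down, blocked or tampered with, fall back to a local copy -->
<script>
  window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');
</script>
```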

Related

Page performance: To load multiple library scripts from CDN or combine and minify those libraries into one download from server?

I understand the advantages of loading big and common libraries like jQuery from CDN.
But what about the smaller plugins and library helpers like jQuery UI or Bootstrap and its helpers? My site has about 10-12 of those.
Should I pick each of them individually from cdnjs and get the CDN benefits, but potentially make many more HTTP requests? Or should I locally compress and minify them into one big plugins file and load it from my server?
I see this question on SO framed as "Multiple libraries off a public CDN or one concatenated file on our CDN", but what if we don't subscribe to a CDN?
The good thing about using a CDN is that the user gets the file from the closest CDN node. Also, it's probable that the user already has that file cached, so they don't need to download it.
Creating bundles will minimize the amount of GETs the browser needs in order to get all the scripts/styles.
My recommendation is to use a combination of both, because browsers can only download a limited number of files per host at a time. So, use CDNs for the most common libraries (the user probably already has them) and use bundles for your own files (your stylesheets or your JS files).
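In markup, that combination might look roughly like this (URLs and bundle names are placeholders):

```html
<!-- common library from a public CDN: likely already in the visitor's cache -->
<script src="https://cdn.example.com/jquery/jquery.min.js"></script>
<!-- everything site-specific concatenated and minified into one local bundle -->
<link rel="stylesheet" href="/assets/site.bundle.min.css">
<script src="/assets/site.bundle.min.js"></script>
```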
Check the Network tab in Developer Tools and measure the time it takes in each case.
I'd definitely advise using what you can from cdnjs and similar CDNs. Most of the time people will already have the files cached, so there will be no additional HTTP requests.
Concatenating and minifying everything that you cannot load from a CDN will save you some bandwidth and speed up the loading of your site.

HTML + JS + CSS converter

I have an HTML file with JS (jQuery) and CSS. I want a converter that takes all the files, minifies them, and puts everything into a single index.html, for example. Google seems to do this: they have no external files, not even images; everything is in one file, and I'm sure it is pre-compiled before release.
Also, is this a good idea?
This is not a good idea, in general.
Splitting out your CSS and JavaScript files means that they can be cached independently. You will likely be using a common CSS and JavaScript across many pages. If you don't allow those to be cached, and instead store them in each page, then the user is effectively downloading a new copy of those files for every page they visit.
Now, it is a good idea to serve minified versions of these files. Also make sure to add gzip or deflate transfer encoding so that they are compressed. Text compresses nicely, usually around a ratio of 1:8.
(I should note that there has been one occasion where I have loaded everything into a single file. I was working on a single-page web application for the Nintendo Wii, which had no caching capability at all. This is about the only instance where putting everything into a single file made sense. Even then, it is only worth the effort if you automate it server-side.)
I don't recommend concatenating CSS with JS.
Just put your CSS at the top of the page and your JS at the bottom.
To minify your CSS and JS you can use Grunt (gruntjs).
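A minimal Gruntfile sketch for that (assuming the grunt-contrib-uglify and grunt-contrib-cssmin plugins; the paths are placeholders):

```js
// Gruntfile.js: a minimal sketch; install with
// npm install grunt grunt-contrib-uglify grunt-contrib-cssmin --save-dev
module.exports = function (grunt) {
  grunt.initConfig({
    uglify: {
      build: {
        // concatenate and minify all app scripts into one file
        files: { 'dist/app.min.js': ['src/js/**/*.js'] }
      }
    },
    cssmin: {
      build: {
        files: { 'dist/app.min.css': ['src/css/**/*.css'] }
      }
    }
  });

  grunt.loadNpmTasks('grunt-contrib-uglify');
  grunt.loadNpmTasks('grunt-contrib-cssmin');

  // running `grunt` now produces one minified JS file and one minified CSS file
  grunt.registerTask('default', ['uglify', 'cssmin']);
};
```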
Also I recommend you to read this article: Front-end performance for web designers and front-end developers
If your intention is to load the pages faster:
For images: try to use image sprites, or serve images from different domains, because browsers open more parallel connections when resources come from several domains rather than just one.
For scripts as well as CSS: use minifiers that strip whitespace and reduce file size (if you are on shared hosting, your host may already be compressing the scripts for you with gzip, etc.).
For landing pages like index pages: if you only have a few styles, try putting them inside a <style></style> tag; this makes the page load very fast. Facebook mobile does it that way.
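For instance, a small landing page might carry its few rules inline like this (selectors and values are illustrative only):

```html
<head>
  <!-- small, page-specific rules inlined: no extra CSS request before first paint -->
  <style>
    body { margin: 0; font-family: sans-serif; }
    .hero { padding: 2em; text-align: center; }
  </style>
</head>
```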
If it weren't a good idea, Google wouldn't be using it!
If you put everything in a single file, you'll get fewer HTTP requests when the browser checks whether a newer version of a file is available.
You also get rid of the problem of some resources not being refreshed, which is a headache for 'normal' developers, but a disaster in AJAX applications.
I don't know of any publicly available tool that does it all; surely Google has its own. Note also that, for example in GWT, much of this embedding was done by the compiler.
What you can do is to search for:
CSS image embedder - for encoding images into CSS
CSS and JS minifier - for building single CSS/JS and minimizing it
And you need some simple tool that will embed it into HTML.
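As a rough illustration of the "CSS image embedder" idea, the core of it is just a data URI; a small Node sketch (file and selector names are placeholders):

```js
// rough sketch: turn a small image into a data URI and emit a CSS rule for it
const fs = require('fs');

const base64 = fs.readFileSync('img/logo.png').toString('base64');
const css = `.logo { background-image: url("data:image/png;base64,${base64}"); }`;

fs.writeFileSync('embedded.css', css);
```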

Organizing script files

I usually have jQuery code that is page-specific, along with a handful of functions that many pages share. One approach is to keep separate files for organization, but I'm thinking that putting all the script in one file and adding comments for readability would also work. Then when the site goes live I can minify and obfuscate if needed.
I think the question comes down to limiting http requests or limiting file size. Is one of these a bad habit?
You can have it both ways. Develop with as many individual .js files as you need. Then use a build/deployment process that assembles the files into one larger one, then pushes them through something like Google's Closure Compiler. Compression can be handled transparently by your web server if configured properly.
Of course, this implies a structured development and deployment workflow -- e.g., with files to be assembled/compiled in a specific directory, separated from files that should be served as-is.
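A rough sketch of that assemble-then-compile step as a Node build script (the directory layout and the Closure jar location are assumptions, not a prescribed setup):

```js
// build.js: a rough sketch of "assemble the files, then run Closure Compiler"
const fs = require('fs');
const path = require('path');
const { execSync } = require('child_process');

const srcDir = 'src/js';
const combined = fs.readdirSync(srcDir)
  .filter(name => name.endsWith('.js'))
  .map(name => fs.readFileSync(path.join(srcDir, name), 'utf8'))
  .join('\n;\n');                 // ';' guards against files missing semicolons

fs.mkdirSync('build', { recursive: true });
fs.writeFileSync('build/app.js', combined);

// compress the combined file with Google's Closure Compiler
execSync('java -jar closure-compiler.jar --js build/app.js --js_output_file build/app.min.js');
```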
References:
Closure Compiler
Apache Ant
Automating the Closure Compiler with Ant
If you can put all the scripts in one file which is minified then that's what you should do first.
Also, if your web server sends out gzipped content, the actual script transfer will be small, and the script will be cached on the client. Since TCP transfers start out slow and ramp up in speed, limiting the number of requests is the best way to speed up the overall loading of a page.
This is the same reason you see sites concatenating images into one larger image, and using CSS to display the correct part of it.
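The CSS side of that sprite technique looks roughly like this (image and class names are placeholders):

```css
/* one combined image; each icon just shifts the background into view */
.icon        { background: url("sprites.png") no-repeat; display: inline-block; width: 16px; height: 16px; }
.icon-search { background-position: 0 0; }
.icon-cart   { background-position: -16px 0; }
```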

Disadvantages of dynamically including CSS/JS files?

What are the specific disadvantages (if any) of dynamically including the CSS and JS files for a website?
By dynamically, I mean using the document.write() method to generate <script> and <link> tags.
I'd like to use this technique on a very large, high-traffic website, since it allows me to easily manage which files are downloaded for which site sections, and to switch on a compressed mode in which only minified files are downloaded.
Thoughts?
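For reference, the technique being asked about looks roughly like this (paths and the mode flag are illustrative):

```html
<script>
  // rough sketch of the dynamic-include idea: emit the tags at parse time,
  // optionally switching to minified files in "compressed mode"
  var compressed = true;                  // e.g. flipped per environment
  var suffix = compressed ? '.min' : '';
  document.write('<link rel="stylesheet" href="/css/section' + suffix + '.css">');
  document.write('<script src="/js/section' + suffix + '.js"><\/script>');
</script>
```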
Reliability. People may have JS disabled, etc.
Debugging. Some browsers (IE) don't give you the included file's line number on an error, but simply the document.write line in the main file.
The advantages are that you can manage and organize your code more easily and you're able to load only those scripts on the page that are absolutely necessary.
The disadvantage, one that I can think of, is that some website performance measuring tools such as PageSpeed and YSlow will warn you about the number of CSS and JavaScript files referenced by a page. Modern web development practices often encourage you to Combine CSS files and Combine JavaScript files to reduce the total number of files required to render a page and improve network performance. Generally speaking, serving one big, bloated file is better than serving 10 small lean-and-mean files because of the overhead associated with requesting a file from the server.

What Ext JS Framework files are necessary in a working site?

I've inherited a high-traffic site that loads some Ext javascript files and I'm trying to trim some bandwidth usage.
Are the Ext libraries necessary for development only, or are they required for the finished site? I've never used Ext (Ext JS - Client-side JavaScript Framework).
The site loads ext-base.js (35K), ext-all-debug.js (950K), expander.js, and exteditor.js. It appears that expander.js and exteditor.js contain some site-specific code, so they should stay?
But what about ext-base.js and ext-all-debug.js? Am I reading this correctly - are base and debugging libraries necessary for a live site?
Simply consult the documentation the previous developers have written for you. :P
To actually answer your question: you will more than likely want to keep all of the files available. You might, however, want to change ext-all-debug.js to ext-all.js, since the debug file contains non-minified JavaScript.
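Concretely, that swap is just a change to the script tags (file names as given in the question; the exact paths will differ):

```html
<!-- before: the ~950K debug build -->
<!-- <script src="ext-all-debug.js"></script> -->

<!-- after: the minified production build, still loaded after the base/adapter file -->
<script src="ext-base.js"></script>
<script src="ext-all.js"></script>
```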
The previous posters are correct that if the site is actually using ExtJS, then you will need to keep the references to ExtJS. Assuming that you actually need to keep the references, replacing ext-all-debug.js with ext-all.js will save some bandwidth. Additionally, consider using one of the CDNs available now. For instance, using Google's CDN, you will save not only your own bandwidth, but bandwidth for your client and decrease page load times.
ExtJS files are available to be hosted on the Cachefly CDN: Ext CDN – Custom Builds, Compression, and Fast Performance.
Hosting the files remotely should remove the load for at least those files.
As to which you can safely remove, you need a JavaScript developer to work on documenting what's truly necessary to your application.
As to what ExtJS is, it's a JavaScript library and framework - a la jQuery, YUI, MooTools, PrototypeJS, etc. So indeed, it can be critical to your site if your site relies on JavaScript to work.
I don't know much about Ext, but I think it's safe to assume that expander.js and exteditor.js depend on ext-base.js and ext-all-debug.js. As such, removing the latter two will break the site's functionality.
The only thing I'd change would be to switch from the debug version of ext-all to the production version (which is most probably called ext-all.js; you should be able to load it from the same place the debug file is located, or from the Ext site).
One option would be to condense all of those files into one file (it would be larger, but it would reduce the overhead of multiple HTTP requests). Also verify that the server is sending the ETag and Expires headers, so that the browser can cache as much of it as possible...
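A rough sketch (not the site's actual stack) of sending those ETag and Expires headers for one of the Ext files, using plain Node:

```js
// rough sketch: serve ext-all.js with ETag revalidation and a far-future Expires
const http = require('http');
const fs = require('fs');
const crypto = require('crypto');

const body = fs.readFileSync('ext-all.js');
const etag = crypto.createHash('md5').update(body).digest('hex');

http.createServer((req, res) => {
  if (req.headers['if-none-match'] === etag) {
    res.writeHead(304);            // the browser's cached copy is still valid
    return res.end();
  }
  res.writeHead(200, {
    'Content-Type': 'application/javascript',
    'ETag': etag,
    // roughly 30 days out; the browser can skip the request entirely until then
    'Expires': new Date(Date.now() + 30 * 24 * 3600 * 1000).toUTCString()
  });
  res.end(body);
}).listen(8080);
```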
