My Angular project is on Angular 4.3.3.
ng build -prod
Takes 77 seconds to make a build
ng build --prod --build-optimizer=true
Takes 190 seconds to make a build, produces no vendor chunk, and the output is somewhat smaller (though not by a big margin).
(Console screenshot showing the chunk differences between the two builds omitted.)
I have read about bundling and tree-shaking, but I still don't see a clear difference between the builds created by these two commands.
Why are there two different ways to build, and what is the difference in performance or in any other respect?
--build-optimizer and --vendor-chunk
From Angular CLI Docs:
When using Build Optimizer the vendor chunk will be disabled by default. You can override this with --vendor-chunk=true.
Total bundle sizes with Build Optimizer are smaller if there is no separate vendor chunk, because having vendor code in the same chunk as app code makes it possible for Uglify to remove more unused code.
First of all, why is the vendor chunk useful in the first place?
vendor.js is most useful during development, because you update your own code far more frequently than you download a new framework or update npm packages.
Compile times are therefore faster during development with the vendor chunk enabled, since the unchanged vendor code doesn't have to be re-optimized on every rebuild.
As for why --vendor-chunk is even an option at all, this is off the top of my head, but:
If your app has a lot of users on a slow connection and you update it frequently, then it may be advantageous to have a larger vendor chunk that remains unchanged for longer. When you update your app, the changed chunks will be smaller. This will not give you a fully optimized (tree-shaken) app, but in very specific circumstances it could be useful. [This assumes you're using fingerprinting, where the filename is literally a checksum/hash of the file's contents, so a file that hasn't changed can stay cached; see the sketch after this list.]
Very occasionally there can be subtle bugs in your code that only become apparent when minimizing code in a certain way. This may be due to relying on a method/class name that gets 'optimized out'. So you may have to enable vendor chunk in production if you have such a bug (while you fix it).
Enable vendor chunk deliberately to make your app slower to load - then tell your boss you're working late to optimize it - and disable it ;-)
Why not? People like to play!
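To make the fingerprinting point concrete, here is a minimal sketch of content-hashed output in a plain webpack config. The option values are illustrative, not what the Angular CLI literally generates for you; with the CLI you get this behaviour through its production defaults / output-hashing options.

    // webpack.config.js (sketch): content-based fingerprinting.
    // Each chunk's filename embeds a hash of its contents, so an unchanged
    // vendor chunk keeps the same name across deployments and stays cached.
    const path = require('path');

    module.exports = {
      entry: { app: './src/main.js' },          // illustrative entry point
      output: {
        path: path.resolve(__dirname, 'dist'),
        filename: '[name].[chunkhash].js'
      }
    };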
Related
I have a project on the Next.js framework, and the problem is that the "First Load JS shared by all" bundle is rather heavy.
I want to know what I can do to reduce it, and whether I'm doing something wrong.
My Next.js version: ^10.0.3
(Per-page size information from the build output omitted.)
I would suggest installing @next/bundle-analyzer to get a better idea of what dependencies you're importing and which ones are contributing to that file size. This can help identify any unused or unnecessary libraries that could potentially be removed.
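As a rough sketch of how it is typically wired up (double-check the package README for your Next.js version):

    // next.config.js – enable the analyzer only when ANALYZE=true is set
    const withBundleAnalyzer = require('@next/bundle-analyzer')({
      enabled: process.env.ANALYZE === 'true'
    });

    module.exports = withBundleAnalyzer({
      // your existing Next.js config goes here
    });

Then run the build with the flag set, e.g. ANALYZE=true npm run build, to generate the report.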
You can also look into using code splitting to reduce the bundle for the initial load of the application. This can be achieved by lazy loading code using dynamic import() and/or next/dynamic.
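For example, a heavy component can be split out of the shared bundle roughly like this (HeavyChart and its path are placeholders for whatever is actually weighing your pages down):

    // pages/dashboard.js – the chart chunk is only fetched on pages that render it
    import dynamic from 'next/dynamic';

    const HeavyChart = dynamic(() => import('../components/HeavyChart'), {
      ssr: false,                       // optional: skip server-side rendering
      loading: () => <p>Loading…</p>
    });

    export default function DashboardPage() {
      return <HeavyChart />;
    }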
Furthermore, Next.js also mentions in their documentation other tools you can use to understand how much a dependency can add to your bundle.
(...) you can use the following tools to understand what is included inside each JavaScript bundle:
Import Cost – Display the size of the imported package inside VSCode.
Package Phobia – Find the cost of adding a new dev dependency to your project.
Bundle Phobia - Analyze how much a dependency can increase bundle sizes.
Webpack Bundle Analyzer – Visualize size of webpack output files with an interactive, zoomable treemap.
bundlejs - An online tool to quickly bundle & minify your projects, while viewing the compressed gzip/brotli bundle size, all running locally on your browser.
— Next.js, Going to Production, Reducing JavaScript Size
This relates to an AngularJS application, originally built using yo angular-fullstack with JS scripting (not TS). It is functionally fine, but I'm hitting performance/UX issues.
My production deployment is to AWS Elastic Beanstalk nano instances, which probably isn't helping, but I get the impression from reading up that my {app|polyfill|vendor} bundles are too big.
Homework done: biblio and webpack config plugins are listed at the end.
Even with all of the above plus 'lessons learned' from wiser souls than me (see biblio) included in my project, I'm still seeing 'big' warnings from webpack and large, whole-package bundles in the webpack-bundle-analyzer output:
polyfills.66cd88606235a4586eb1.js.gz 25.8 kB [emitted]
app.66cd88606235a4586eb1.js.gz 48.8 kB [emitted]
vendor.66cd88606235a4586eb1.js.gz 319 kB [emitted] [big]
../client/index.html 4.77 kB [emitted]
I don't think my app is that complicated. 10-ish components on 4 views, one chart (angular-chart.js).
Here are my questions:
1. Various modules appear to be bundled whole (e.g. angular at 1.23 MB, lodash at 528 KB, ...). For example, I've changed all of my own imports to specific lodash functions/modules (see the sketch after this list), but the whole lodash library is still being bundled. Does that mean that one of my dependencies (or one nested further down) is pulling it in, or am I missing something?
2. I think I can change my import module from 'angular'; statements to something more specific like import ... from '@angular/core';, but all the documentation seems to imply that I'd have to upgrade the whole app to Angular 5 to do that. Am I missing something? (Said another way, I can't find beginner-friendly documentation on Angular 2+'s '@angular/core'.)
3. The angular-cli seems to be for newly ng init-ed projects, so I think its production shrinking and tree-shaking features aren't available to me unless I upgrade/restructure my project. Is that correct?
4. Are the bundles that webpack injects into index.html the gzipped ones? I'm not sure when/how those get used (if at all).
5. I'm configuring and invoking webpack from gulp. Are all of the plugins I'm using (see the webpack list below) equivalent to the -p CLI option, or do I need to do more?
6. Help! What else do I need to look at or do to reduce bundle size? Do I have to go delving in node_modules/** to find out what is sucking in whole packages? What am I missing?
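For reference (re question 1), this is the kind of import change I made in my own code for lodash; save is just a stand-in function:

    // before – pulls the whole lodash package into the bundle:
    //   import _ from 'lodash';
    //   const debouncedSave = _.debounce(save, 250);

    // after – only debounce (plus its internal dependencies) gets bundled:
    import debounce from 'lodash/debounce';

    function save() { /* ... */ }
    const debouncedSave = debounce(save, 250);

My understanding is that if any dependency still does require('lodash') internally, the whole package lands in the vendor bundle regardless, which is what question 1 is really asking about.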
TIA
Biblio
Angular2 CLI huge vendor bundle: how to improve size for prod? - from which I learned about not importing whole modules when I only need a bit of it (spin-off question above)
How to use babel loader on node_modules in webpack for ES6+? - from which I learned about setting up a raft of webpack plugins to reduce, organise and compress my bundles (spin-off question above)
https://hackernoon.com/optimising-your-application-bundle-size-with-webpack-e85b00bab579 - from which I got confirmation from Google that my app is slow because the bundles are big and some more webpack config hints
https://hackernoon.com/reduce-webpack-bundle-size-for-production-880bb6b2c72f - followed all the steps, but it didn't seem to make any difference to my bundle sizes
https://github.com/mgechev/angular-performance-checklist#introduction - picked as much as I could out of this without going to the 'upgrade to 5' areas.
webpack
My webpack config for the production build currently includes (a sketch of how these fit together follows the list):
config.devtool='source-map'
DefinePlugin (NODE_ENV=production)
NoEmitOnErrorsPlugin
UglifyJsPlugin with everything turned up to 11
OccurrenceOrderPlugin
AggressiveMergingPlugin
CommonsChunkPlugin
ModuleConcatenationPlugin
HashedModuleIdsPlugin
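Roughly how those fit together in my gulp-driven build (a simplified sketch; the option values shown are illustrative rather than my exact settings):

    // webpack 3-era production config (sketch)
    const webpack = require('webpack');

    module.exports = {
      devtool: 'source-map',
      plugins: [
        new webpack.DefinePlugin({
          'process.env.NODE_ENV': JSON.stringify('production')
        }),
        new webpack.NoEmitOnErrorsPlugin(),
        new webpack.optimize.UglifyJsPlugin({
          compress: { warnings: false },
          sourceMap: true
        }),
        new webpack.optimize.OccurrenceOrderPlugin(),
        new webpack.optimize.AggressiveMergingPlugin(),
        new webpack.optimize.CommonsChunkPlugin({
          name: 'vendor',
          // anything under node_modules goes into the vendor chunk
          minChunks: (module) => module.context && module.context.indexOf('node_modules') !== -1
        }),
        new webpack.optimize.ModuleConcatenationPlugin(),
        new webpack.HashedModuleIdsPlugin()
      ]
    };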
OK, so I did a raft more delving, and I think I'm onto some fruitful tools and techniques now. In summary...
On #1: I went back to a clean, unmodified yo angular-fullstack and it turns out that my bundles weren't much different in size from the ones that came out of the box. So I don't think that expending loads of effort trying to unpick where I've added bundle size is going to be very fruitful.
On #2 and #6: I think the answer is technically yes, I could delve into each node_modules dependency and 'fix' it so it doesn't depend on whole bundles, but that seems counter-intuitive given that better coders than I wrote those modules in the first place. So I'm not pursuing that.
On #3: yes, I was blurring AngularJS and Angular (5). I want to stay with AngularJS for now, so no further action here.
On #4: no. Manually changing the webpacked index.html to use the gzip bundles results in a fast download, but the app doesn't work. Serving zipped resources is a web server config concern, so I've amended my nginx config to serve the gzipped JS instead; that change is just going through deployment as I write.
On #5, I think so, yes.
On #6, I found Madge, which helps visualize dependency graphs. Pointing this at my client entry point (client/app/app.js in my case) and including node_modules (see CLI options) suggests:
I need to look at structuring my single module into a series of modules so that each can be loaded separately.
I need to look at lazy loading the modules which are not required for the home page (example: https://toddmotto.com/lazy-loading-angular-code-splitting-webpack); see the sketch after this list.
Further tuning needs to come from optimizing the functional design of the home page so it is light on both the number of HTTP requests and the size of responses. Nothing earth-shatteringly insightful there; it just hadn't shown up on my local dev environment, so I hadn't thought about it until now.
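The kind of change I'm looking at for the lazy-loading point is sketched below. This assumes ui-router 1.x plus ocLazyLoad (neither is mandated by webpack; any router/loader combination that can register an AngularJS module at runtime would do), and the reports module path is just a placeholder:

    // assumes an existing AngularJS module named 'app' (ui-router 1.x + ocLazyLoad)
    angular.module('app').config(($stateProvider) => {
      $stateProvider.state('reports', {
        url: '/reports',
        component: 'reports',
        // webpack turns the dynamic import() below into a separate 'reports' chunk,
        // fetched only when this state is first visited
        lazyLoad: (transition) => {
          const $ocLazyLoad = transition.injector().get('$ocLazyLoad');
          return import(/* webpackChunkName: "reports" */ './reports/reports.module')
            .then((mod) => $ocLazyLoad.load(mod.default));
        }
      });
    });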
My team is building a large React application. I want to know whether what we are trying to accomplish in regard to build and deployment is possible with webpack.
Let's say our team is building Google Admin. There are 4 modules/apps within the admin that 4 different teams are focused on. There is then a console application that is the entry point to these 4 modules/apps. We want to be able to work on each of the modules independently and to deploy them independently.
How we have it set up right now: there are 4 separate applications that act as dev harnesses to build these modules. We build them and copy the distribution .js and .js.map files to the console's ./modules folder. We then reference these modules lazily using System.import.
Is it possible, while the console app is built and in production, to “swap out” the module-one.js and module-one.js.map files that the console depends on without having to rebuild and redeploy the entire console app?
Goals:
Do not package these apps for npm. This would definitely require the console app to update and rebuild.
Build any module and deploy just that specific module to production without having to rebuild the console application.
Do not redirect to separate SPAs.
I tried my best to explain the goal. Any input would be much appreciated. I have found nothing in my search.
webpack loads the modules into memory and watches the filesystem for changes, so as long as webpack is running you shouldn't have an issue replacing any given module. However, webpack will attempt to rebuild the entire in-memory bundle with each module change (as it has no way of knowing that your module is truly independent). The only thing I can think of would be to write a shim between the console app and the modules that watches the files (like webpack does) but only replaces the in-memory version of the local file that was changed. Reading this back, I'm not even sure it makes sense to me...
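A rough sketch of one shape that shim could take for the production goal: the console loads each module bundle by URL at runtime, outside webpack's static graph, so redeploying module-one.js never touches the console build. The window.__modules registry and the mount() API are assumptions for illustration, not an existing convention:

    // console app: load an independently deployed module bundle at runtime.
    // Each module is assumed to be built as a script that registers itself on
    // window.__modules[name] when it executes (e.g. a webpack 'var' library).
    function loadRemoteModule(name, url) {
      return new Promise((resolve, reject) => {
        const script = document.createElement('script');
        script.src = url;                       // e.g. '/modules/module-one.js'
        script.onload = () => {
          const mod = window.__modules && window.__modules[name];
          mod ? resolve(mod) : reject(new Error('Module "' + name + '" did not register itself'));
        };
        script.onerror = () => reject(new Error('Failed to load ' + url));
        document.head.appendChild(script);
      });
    }

    // usage: mount module-one without the console knowing about it at build time
    loadRemoteModule('module-one', '/modules/module-one.js')
      .then((moduleOne) => moduleOne.mount(document.getElementById('module-root')));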
Since I now know what "Angular compiler" actually means, this would be my next question.
I read here about the positive and negative points of ahead-of-time compilation. To me, it boils down to this:
Use AOT for deployments
Use JIT for development
The only valid reason (imho) for JIT is that it compiles much faster, which is nice for the development process. Is there any other reason?
AOT has so many HUGE advantages over JIT that I wonder why JIT is even an option for a deployment.
Well, there are many differences, and you have pointed out some of them very well already. It also depends on what kind of environment you are in and what your requirements are.
I have been using Angular since angular-2.beta.17, when the Angular CLI did not exist and we had to rely on various build systems and runtime environments to run a project anywhere.
SystemJS building and running:
In some situations where multiple projects and frameworks run together, you cannot make an AOT build, bootstrap your code into the SPA, and run it. In a SystemJS environment you have to load the scripts one by one and let them bootstrap themselves.
You must have seen this kind of build and script loading in many online tools like CodePen, Plunker, etc. I think they all use SystemJS loading.
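Roughly what that SystemJS-era loading looked like, abbreviated from the old Angular quickstart's systemjs.config.js (the mappings shown are just a fragment):

    // systemjs.config.js (fragment): map bare module names to UMD bundles,
    // then import the app package, which bootstraps itself in the browser (JIT)
    System.config({
      paths: { 'npm:': 'node_modules/' },
      map: {
        app: 'app',
        '@angular/core': 'npm:@angular/core/bundles/core.umd.js'
        // ...one entry per @angular package and third-party library
      },
      packages: {
        app: { main: './main.js', defaultExtension: 'js' }
      }
    });
    System.import('app').catch(function (err) { console.error(err); });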
There are many others, like the CommonJS loader, the Babel build system, and the webpack build system. But now the Angular CLI is a bulletproof tool that internally uses webpack (its tooling was originally derived from ember-cli) to handle everything you want.
Ahead of Time compilation and JIT debugging
As the name suggests, AOT compiles ahead of time and, together with tree shaking, includes only what is actually used in your code, throws away unused code, and compacts the bundle so it can be loaded very fast. But by doing that you lose some debugging comfort: it doesn't give you a nice error message if, in production, you want to see what is wrong.
At the same time, JIT can point to the line number in the TypeScript file that has the error, which makes your life super easy when debugging in dev mode through the Angular CLI.
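To make that difference concrete, here is a sketch of the two bootstrap entry points from that (pre-Ivy) era, shown side by side; the file names are illustrative:

    // main-jit.js – JIT: the Angular compiler ships to the browser, templates are
    // compiled there, and errors map back to your source (easier debugging).
    import { platformBrowserDynamic } from '@angular/platform-browser-dynamic';
    import { AppModule } from './app/app.module';
    platformBrowserDynamic().bootstrapModule(AppModule);

    // main-aot.js – AOT: templates were already compiled by ngc into an NgFactory
    // at build time, so no compiler is shipped and unused code can be shaken out.
    import { platformBrowser } from '@angular/platform-browser';
    import { AppModuleNgFactory } from './app/app.module.ngfactory';
    platformBrowser().bootstrapModuleFactory(AppModuleNgFactory);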
There is a lot more to the Angular compiler, and there are many tools built around it as well.
One of my favourites is ngrev.
I could not find any explanation of how "npm run build" works.
It is simple and easy to use, and I get a "build" folder that works great.
But what exactly happens behind the scenes in create-react-app?
Is it a completely different kind of build tool?
If not, which build tools does it use?
Developers often break JavaScript and CSS out into separate files. Separate files let you focus on writing more modular chunks of code that each do one single thing. Files that do one thing decrease your cognitive load, because maintaining one huge do-everything file is quite a cumbersome task.
What exactly happens behind the scenes?
When it’s time to move your app to production, having multiple JavaScript or CSS files isn’t ideal. When a user visits your site, each of your files will require an additional HTTP request, making your site slower to load.
So to remedy this, you can create a "build" of your app, which merges all your CSS files into one file and does the same with your JavaScript. This way, you minimize the number and size of files the user gets. To create this "build", you use a "build tool". Hence the use of npm run build.
As you have rightly mentioned that running the command (npm run build) creates you a build directory. Now suppose you have a bunch of CSS and JS files in your app:
css/
mpp.css
design.css
visuals.css
...
js/
service.js
validator.js
container.js
...
After you run npm run build your build directory will be:
build/
static/
css/
main.css
js/
main.js
Now your app has very few files. The app is still the same, but it has been compacted into a small package called build.
Final Verdict:
You might wonder why a build is even worth it, if all it does is save your users a few milliseconds of load time. Well, if you’re making a site just for yourself or a few other people, you don’t have to bother with this. Generating a build of your project is only necessary for high traffic sites (or sites that you hope will be high traffic soon).
If you’re just learning development, or only making sites with very low traffic, generating a build might not be worth your time.
It's briefly explained here: https://github.com/facebookincubator/create-react-app#npm-run-build-or-yarn-build.
It correctly bundles React in production mode and optimizes the build for the best performance.
The build is minified and the filenames include the hashes.
Behind the scenes, it uses Babel to transpile your code and webpack as the build tool to bundle up your application.
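For reference, the scripts block that create-react-app generates looks roughly like this (exact flags vary by version); npm run build simply runs react-scripts build, which drives that webpack + Babel pipeline with a preset production configuration:

    {
      "scripts": {
        "start": "react-scripts start",
        "build": "react-scripts build",
        "test": "react-scripts test",
        "eject": "react-scripts eject"
      }
    }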