This relates to an AngularJS application, originally built with yo angular-fullstack using plain JavaScript (not TypeScript). It works fine functionally, but I'm hitting performance/UX issues.
My production deployment is to AWS Elastic Beanstalk nano instances, which probably isn't helping, but from reading up I get the impression that my {app|polyfill|vendor} bundles are too big.
Homework done - biblio and webpack.config.plugins at end.
Even with all of the above plus 'lessons learned' from wiser souls than me (see biblio) included in my project, I'm still seeing 'big' warnings from webpack and large, whole-package bundles in the webpack-bundle-analyzer output:
polyfills.66cd88606235a4586eb1.js.gz 25.8 kB [emitted]
app.66cd88606235a4586eb1.js.gz 48.8 kB [emitted]
vendor.66cd88606235a4586eb1.js.gz 319 kB [emitted] [big]
../client/index.html 4.77 kB [emitted]
I don't think my app is that complicated. 10-ish components on 4 views, one chart (angular-chart.js).
Here are my questions:
1. Various modules appear to be bundled whole (e.g. angular 1.23MB, lodash 528KB, ...). For example, I've changed all of my own imports to specific lodash functions/modules (see the snippet after this list), but the whole lodash library is still being bundled. Does that mean that one of my dependencies (or one nested further down) is pulling it in, or am I missing something?
2. I think I can change my import module from 'angular'; statements to something more specific like import ... from '@angular/core';, but all the documentation seems to imply that I'd have to upgrade the whole app to Angular 5 to do that - am I missing something? (Said another way, I can't find beginner-friendly documentation on Angular 2's '@angular/core'.)
3. The angular-cli seems to be for new ng-inited projects, so I think its production shrinking and tree-shaking features aren't available to me unless I upgrade/restructure my project - is that correct?
4. Are the bundles that webpack injects into index.html the gzipped ones? I'm not sure when/how they get used (if at all).
5. I'm configuring and invoking webpack from gulp - are all of the plugins I'm using (see list below) equivalent to the -p CLI option? ...or do I need to do more?
6. Help! What else do I need to look at or do to reduce bundle size? Do I have to go delving in node_modules/** to find out what is sucking in whole packages? What am I missing?
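For reference on question 1, here's roughly what the import change looks like in my own code (the function names are just examples). Note this only affects my own files; webpack-bundle-analyzer is what shows whether some other dependency still imports the whole package.

```js
// Before: pulls the entire lodash library into the bundle
// import _ from 'lodash';

// After: pulls in only the modules I actually use (plus their internal helpers)
import debounce from 'lodash/debounce';
import cloneDeep from 'lodash/cloneDeep';

const onResize = debounce(() => {
  // ... handle resize
}, 250);
```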
TIA
Biblio
Angular2 CLI huge vendor bundle: how to improve size for prod? - from which I learned about not importing whole modules when I only need a bit of it (spin-off question above)
How to use babel loader on node_modules in webpack for ES6+? - from which I learned about setting up a raft of webpack plugins to reduce, organise and compress my bundles (spin-off question above)
https://hackernoon.com/optimising-your-application-bundle-size-with-webpack-e85b00bab579 - from which I got confirmation from Google that my app is slow because the bundles are big and some more webpack config hints
https://hackernoon.com/reduce-webpack-bundle-size-for-production-880bb6b2c72f - followed all the steps, but it didn't seem to make any difference to my bundle sizes
https://github.com/mgechev/angular-performance-checklist#introduction - picked as much as I could out of this without going to the 'upgrade to 5' areas.
webpack
My webpack config for production build currently includes:
config.devtool='source-map'
DefinePlugin (NODE_ENV=production)
NoEmitOnErrorsPlugin
UglifyJsPlugin with everything turned up to 11
OccurrenceOrderPlugin
AggressiveMergingPlugin
CommonsChunkPlugin
ModuleConcatenationPlugin
HashedModuleIdsPlugin
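For reference, a minimal sketch of how plugins like these are typically wired up in a webpack 3-style production config (the option values here are illustrative, not my exact settings):

```js
// webpack.config.prod.js (sketch)
const webpack = require('webpack');

module.exports = {
  devtool: 'source-map',
  plugins: [
    // lets libraries drop dev-only code paths
    new webpack.DefinePlugin({ 'process.env.NODE_ENV': JSON.stringify('production') }),
    new webpack.NoEmitOnErrorsPlugin(),
    // minify; sourceMap kept so the 'source-map' devtool stays usable
    new webpack.optimize.UglifyJsPlugin({ sourceMap: true, comments: false }),
    new webpack.optimize.OccurrenceOrderPlugin(),
    new webpack.optimize.AggressiveMergingPlugin(),
    // pull everything from node_modules into the vendor chunk
    new webpack.optimize.CommonsChunkPlugin({
      name: 'vendor',
      minChunks: (module) => module.context && module.context.includes('node_modules'),
    }),
    // scope hoisting
    new webpack.optimize.ModuleConcatenationPlugin(),
    // stable module ids so hashes only change when content changes
    new webpack.HashedModuleIdsPlugin(),
  ],
};
```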
OK, so I did a raft more delving and I think I'm onto some fruitful tools and techniques now. In summary...
On #1, I went back to a clean, unmodified yo angular-fullstack and it turns out that my bundles weren't much different in size from the ones which came out of the box. So I don't think that expending loads of effort trying to unpick where I've added bundle size is going to be very fruitful.
On #2 and #6, I think the answer is technically yes, I could delve into each node_modules dependency and 'fix' it so it doesn't pull in whole packages, but that seems to go against how better coders than I wrote those modules in the first place. So I'm not pursuing that.
On #3, yes, I was blurring AngularJS and Angular (5). I want to stay with AngularJS for now, so no further action here.
On #4, no. Manually changing the webpacked index.html to use the gzip bundles results in a fast download, but the app doesn't work. Serving zipped resources is a web server config thing. I've amended my nginx config to serve gzipped JS instead - that's just going through deployment as I write.
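For anyone following the same path, the nginx change is roughly this (a sketch; gzip_static comes from nginx's ngx_http_gzip_static_module and assumes the .gz files emitted by webpack sit next to the uncompressed ones):

```
# inside the server/location block that serves the webpack output
location ~* \.(js|css)$ {
  gzip_static on;   # serve foo.js.gz when the client sends Accept-Encoding: gzip
}
```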
On #5, I think so, yes.
On #6, I found Madge, which helps visualize dependency graphs. Pointing this at my client entry point (client/app/app.js in my case) and including node_modules (see CLI options) suggests:
I need to look at structuring my single module into a series of modules so each can be loaded separately
I need to look at lazy loading of modules which are not required for the home page (example: https://toddmotto.com/lazy-loading-angular-code-splitting-webpack) - see the sketch below
Further tuning needs to come from optimizing the functional design of the home page so it is light on both the number of HTTP requests and the size of responses. Nothing earth-shatteringly insightful there; it just hadn't shown up on my local dev environment so I hadn't thought about it until now.
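To make the lazy-loading point concrete, here's a rough sketch of what route-level code splitting can look like in an AngularJS app (assuming ui-router and ocLazyLoad; the state, file and module names are placeholders, not my actual code):

```js
// webpack turns the dynamic import() into a separate chunk that is only
// fetched when the user navigates to this state.
$stateProvider.state('reports', {
  url: '/reports',
  component: 'reportsPage',
  resolve: {
    load: ['$ocLazyLoad', ($ocLazyLoad) =>
      import(/* webpackChunkName: "reports" */ './reports/reports.module')
        // the module file's default export is assumed to be the angular module name
        .then((mod) => $ocLazyLoad.load({ name: mod.default }))
    ]
  }
});
```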
Related
I have a project on the Next.js framework, and the problem is that the "First Load JS shared by all pages" is rather heavy.
I want to know what aspects I can take into consideration to reduce it, and also whether I'm doing something wrong.
My Next.js version: ^10.0.3
Information relating to pages while building:
I would suggest installing @next/bundle-analyzer to get a better idea of what dependencies you're importing and which ones are contributing to that file size. This can help identify any unused or unnecessary libraries that could potentially be removed.
You can also look into using code splitting to reduce the bundle for the initial load of the application. This can be achieved by lazy loading code using dynamic import() and/or next/dynamic.
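Setting it up is a small wrapper around your existing config (a sketch based on the package's documented usage; adapt to your own next.config.js):

```js
// next.config.js
const withBundleAnalyzer = require('@next/bundle-analyzer')({
  enabled: process.env.ANALYZE === 'true',
});

module.exports = withBundleAnalyzer({
  // ...your existing Next.js config
});
```

Then run your build with ANALYZE=true to generate the report.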
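A minimal sketch of the next/dynamic approach (the component path is a placeholder for whatever heavy, browser-only component you have):

```jsx
import dynamic from 'next/dynamic';

// Split into its own chunk and only fetched when rendered;
// ssr: false also keeps it out of server-side rendering if it's browser-only.
const HeavyChart = dynamic(() => import('../components/HeavyChart'), {
  ssr: false,
  loading: () => <p>Loading…</p>,
});

export default function Dashboard() {
  return <HeavyChart />;
}
```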
Furthermore, Next.js also mentions in their documentation other tools you can use to understand how much a dependency can add to your bundle.
(...) you can use the following tools to understand what is included inside each JavaScript bundle:
Import Cost – Display the size of the imported package inside VSCode.
Package Phobia – Find the cost of adding a new dev dependency to your project.
Bundle Phobia - Analyze how much a dependency can increase bundle sizes.
Webpack Bundle Analyzer – Visualize size of webpack output files with an interactive, zoomable treemap.
bundlejs - An online tool to quickly bundle & minify your projects, while viewing the compressed gzip/brotli bundle size, all running locally on your browser.
— Next.js, Going to Production, Reducing JavaScript Size
I'm using Webpack in a fairly simple, straightforward way that bundles together a few JS and TS files into one bundle, and it works well on my site.
However, I want to split the current bundle into smaller bundles, as I get a warning when I build the bundle due to its size, and Lighthouse audits in the browser warn that I should reduce the file size of my bundle.js file.
The simplest solution in my mind is to split my current bundle into 4 parts, i.e. bundle1.min.js, bundle2.min.js, etc... Then I just serve the bundles consecutively.
The problem is that splitting and serving my bundle this way breaks other JS on my page. For example, a function defined in bundle1 and called in a different JS file no longer works unless I remove all the other bundle.js files. It seems that only the most recently loaded bundle file works.
Is there a better approach to get smaller bundles, and make sure that all bundles work correctly?
Route-based code splitting is quite popular because each page/route usually has a small subset of components on it.
The guide can be found here (for React)
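For illustration, a minimal sketch of route-based splitting with React.lazy (assuming react-router v6; the route components are placeholders):

```jsx
import React, { lazy, Suspense } from 'react';
import { BrowserRouter, Routes, Route } from 'react-router-dom';

// Each route's code becomes its own chunk, loaded on first navigation.
const Home = lazy(() => import('./routes/Home'));
const Reports = lazy(() => import('./routes/Reports'));

export default function App() {
  return (
    <BrowserRouter>
      <Suspense fallback={<div>Loading…</div>}>
        <Routes>
          <Route path="/" element={<Home />} />
          <Route path="/reports" element={<Reports />} />
        </Routes>
      </Suspense>
    </BrowserRouter>
  );
}
```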
A little embarrassed, looks like this was just a scoping issue with a dependency in one bundle breaking code in another by being absent. Reorganizing my bundles so dependencies are present where needed. Ai ya.
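For anyone who hits the same problem with hand-split bundles: letting webpack extract the shared code itself avoids dependencies going missing from individual bundles. A minimal sketch, assuming webpack 4+ and multiple entry points (entry names are placeholders):

```js
// webpack.config.js (sketch)
module.exports = {
  entry: {
    home: './src/home.js',
    admin: './src/admin.js',
  },
  optimization: {
    runtimeChunk: 'single', // one shared webpack runtime instead of one per bundle
    splitChunks: {
      chunks: 'all',        // modules used by several entries go into shared chunks
    },
  },
};
```

The HTML then has to load the runtime and shared chunks before each entry chunk (html-webpack-plugin can inject them in the right order for you).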
My Angular project is on Angular 4.3.3
ng build -prod
Takes 77 seconds to make a build
ng build --prod --build-optimizer=true
Takes 190 seconds to make a build; no vendor chunk, and smaller in size (though not by a big difference)
Chunk differences on console image:
I read Bundling & Tree-Shaking but still don't get the clear difference between builds created by those commands.
Why are there these two different ways, and what is the difference in performance or in any other respect?
--build-optimizer and --vendor-chunk
From Angular CLI Docs:
When using Build Optimizer the vendor chunk will be disabled by default. You can override this with --vendor-chunk=true.
Total bundle sizes with Build Optimizer are smaller if there is no separate vendor chunk, because having vendor code in the same chunk as app code makes it possible for Uglify to remove more unused code.
First of all, why is the vendor chunk useful in the first place?
vendor.js is most useful during development because you're updating your code far more frequently than you're downloading a new framework or updating npm packages.
Therefore compile time is faster during development with vendor chunk enabled.
As for why --vendor-chunk is even an option? This is off the top of my head, but:
If your app has a lot of users on a slow connection and you frequently update it, then it may be advantageous to have a larger vendor chunk that remains unchanged for longer. When you update your app, the changed chunks will then be smaller. This will not give you a fully optimized (tree-shaken) app, but in very specific circumstances it could be useful. [This assumes you're using fingerprinting, where the filename is literally a checksum/hash of the contents of the file - so if it doesn't change, the file can be cached; see the sketch after this list.]
Very occasionally there can be subtle bugs in your code that only become apparent when minimizing code in a certain way. This may be due to relying on a method/class name that gets 'optimized out'. So you may have to enable vendor chunk in production if you have such a bug (while you fix it).
Enable vendor chunk deliberately to make your app slower to load - then tell your boss you're working late to optimize it - and disable it ;-)
Why not? People like to play!
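To make the fingerprinting point from the first bullet concrete, this is roughly what it looks like in plain webpack terms (the Angular CLI does the equivalent internally; this is an illustration, not CLI configuration):

```js
// Output file names include a hash of the chunk contents, e.g. vendor.3b0896dd.js.
// If the vendor code doesn't change between releases, the name doesn't change,
// so returning users keep it in their browser cache.
module.exports = {
  output: {
    filename: '[name].[chunkhash].js',
  },
};
```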
From the webpacker gem:
Webpacker makes it easy to use the JavaScript pre-processor and
bundler Webpack 2.x.x+ to manage application-like JavaScript in Rails.
It coexists with the asset pipeline, as the primary purpose for
Webpack is app-like JavaScript, not images, CSS, or even JavaScript
Sprinkles (that all continues to live in app/assets).
However, it is possible to use Webpacker for CSS, images and fonts
assets as well, in which case you may not even need the asset
pipeline. This is mostly relevant when exclusively using
component-based JavaScript frameworks.
Why is it more relevant for component-based frameworks to use Webpacker for assets? If I'm using React, what difference does it make to get assets from the asset pipeline vs Webpack?
In terms of strictly holding assets - I don't think there's too much difference. However, I've recently migrated one of our apps from the asset pipeline to webpack, so I will try to share some learnings below on why webpack is beneficial.
Despite Rails being a fast moving and dynamic web framework, using the newest front-end tools with the default rails assets handler is difficult. Managing JS libraries with bundler is a pain. Webpack makes maintaining 3rd party libraries considerably easier.
Page loads were faster with webpack than with the default asset pipeline, given that the asset pipeline compiled files on each refresh by default.
Rails directory structure doesn't distinguish clearly enough between the front end and back end of the application. The dawn of single-page applications has meant that treating the client side of an app as a separate entity, and not some add-on to the back end, is something we viewed as quite important. Front-end components are not just add-ons; they are their own beings.
Separating assets from views is strange - views and assets form one whole and should sit in one place; Rails views are treated more like a backpack on the controller.
Hot-reloading of our app front-end is great. This saves a lot of time in development.
However
we've found that it can be volatile with constant configuration changes and unfriendly as a result.
It doesn't run automatically on a request like sprockets does. For example, if you are using webpacker, you need to have the webpacker dev server running; it first looks for file changes, then compiles, and only then may reload your page.
The fact that webpack is primarily concerned with js and not jpegs, pngs, svgs etc. makes comparing the rails asset pipeline and webpack a little confusing...
Not sure if it did, but I hope this helps!
I could not find any explanation of what "npm run build" actually does.
It is simple and easy to use, and I get the "build" folder, which works great.
But, in create-react-app, what happens exactly behind the scene?
Is it a completely different kind of build tool?
If not, is it utilizing other build tools?
Developers often break JavaScript and CSS out into separate files. Separate files let you focus on writing more modular chunks of code that each do one single thing. Files that do one thing decrease your cognitive load, because maintaining one huge file that does everything is a cumbersome task.
What happens exactly behind the scene?
When it’s time to move your app to production, having multiple JavaScript or CSS files isn’t ideal. When a user visits your site, each of your files will require an additional HTTP request, making your site slower to load.
So to remedy this, you can create a "build" of your app, which merges all your CSS files into one file and does the same with your JavaScript. This way, you minimize the number and size of files the user gets. To create this "build", you use a "build tool" - hence the use of npm run build.
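In a create-react-app project specifically, npm run build simply runs the build script that CRA generates in package.json, which delegates the heavy lifting to react-scripts (shown here from a freshly generated app):

```json
{
  "scripts": {
    "start": "react-scripts start",
    "build": "react-scripts build",
    "test": "react-scripts test",
    "eject": "react-scripts eject"
  }
}
```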
As you have rightly mentioned, running the command (npm run build) creates a build directory for you. Now suppose you have a bunch of CSS and JS files in your app:
css/
  mpp.css
  design.css
  visuals.css
  ...
js/
  service.js
  validator.js
  container.js
  ...
After you run npm run build your build directory will be:
build/
  static/
    css/
      main.css
    js/
      main.js
Now your app has very few files. The app is still the same, but it has been compacted into a small package called build.
Final Verdict:
You might wonder why a build is even worth it, if all it does is save your users a few milliseconds of load time. Well, if you’re making a site just for yourself or a few other people, you don’t have to bother with this. Generating a build of your project is only necessary for high traffic sites (or sites that you hope will be high traffic soon).
If you’re just learning development, or only making sites with very low traffic, generating a build might not be worth your time.
It's briefly explained here: https://github.com/facebookincubator/create-react-app#npm-run-build-or-yarn-build.
It correctly bundles React in production mode and optimizes the build for the best performance.
The build is minified and the filenames include the hashes.
Behind the scenes, it uses babel to transpile your code and webpack as the build tool to bundle up your application.