Existing App:
My application uses require.js.
A benefit of my application is that other people can extend it by writing 3rd-party plugins using require.js.
For example: (a 3rd party plugin registration)
// registering a new plugin
{
  url: '#new-page-url',
  js: 'plugin-folder/new-page-url.js'
}
So when #new-page-url is hit anywhere, RequireJS loads the JS file from plugin-folder/new-page-url.js.
Please note that after I compile my application, it does not include the third-party source, since it can be fetched on the fly with RequireJS.
Question:
I have been looking into webpack, and since it compiles everything before distribution (with bundle.js as the starting file), how can a third-party plugin work on the fly like in the example above?
Please note that after I compile my application, it does not include the third-party source, since it can be fetched on the fly with RequireJS.
Yes, that's the problem.
I'm in a situation similar to yours where a large application of mine written as a collection of AMD modules can load editing modes at runtime, which are also AMD modules. The modes are not generally bundled with the application.
AFAIK it is not possible, with Webpack alone, to replicate the equivalent of RequireJS' require([a]), where a is a variable whose value cannot be known at build time but is determined at run time. (And for the benefit of readers who may not be familiar with RequireJS, I'll add that yes, I do mean the first argument to be [a] and not just a. RequireJS makes a distinction between the two.)
Webpack needs to know at build time which modules it is going to bundle together. ("Needs to know" means it needs to know the module's name and find its source.) So it does not support dynamically loading modules that only become known at runtime. If you read the documentation or tutorials, you'll run into discussions of dynamic loading with Webpack, but these do not cover the require([a]) case. Webpack can split a bundle into chunks and load chunks as needed, but for this to work, Webpack still needs to know ahead of time the whole set of modules it is ever going to need. That does not allow loading at runtime a module that was unknown at build time. There's also require.context, but while it allows you to determine at run time which specific module you want, the set from which the module comes is determined at build time. If at build time you know that you're going to be using one of A, B, C, that's fine. But if you don't know at build time the name of your module, or cannot feed its source to Webpack, you're out of luck.
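To make the distinction concrete, here is a minimal sketch (the pluginRegistry lookup and the modes folder are assumptions for illustration, not code from either application):

// RequireJS: the array form accepts a value computed at run time
var a = pluginRegistry[window.location.hash]; // e.g. 'plugin-folder/new-page-url'
require([a], function (plugin) {
  plugin.init(); // this module's source was never seen by any build step
});

// Webpack: require.context lets you pick a module at run time,
// but only from a set that is fixed at build time (everything under ./modes here)
var context = require.context('./modes', false, /\.js$/);
var mode = context('./' + modeName + '.js'); // fails if this file was not bundled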
This being said, it should be possible to build your application's core with Webpack and have this core perform direct calls to a module loader like RequireJS or SystemJS. This is the direction I'm moving in with my own application, but I've not crossed that bridge yet.
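The rough shape of it, sketched under the assumption that require.js is loaded globally through its own script tag next to the Webpack bundle:

// inside the Webpack-built core: delegate unknown plugins to the runtime loader
function loadPlugin(path) {
  return new Promise(function (resolve, reject) {
    // window.requirejs is the global exposed by the require.js script tag;
    // going through window keeps Webpack from trying to resolve it at build time
    window.requirejs([path], resolve, reject);
  });
}

// later, with a path supplied by a third-party registration
loadPlugin('plugin-folder/new-page-url.js').then(function (plugin) {
  plugin.init();
});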
Related
I've recently been thrown in to clean up a project which has around 45-50 individual .js files. I wonder what the best approach would be to decrease their total load size. Just concatenate all files into one with npm or gulp? Install some module loader? Webpack?
If you're already concatenating, minifying, and uglifying, and you don't want all the files to be loaded on all the pages due to a monolithic bundle, you might be looking for something like webpack's CommonsChunkPlugin.
This plugin walks down the tree of dependencies for each endpoint defined in your webpack.config.js file and determines which modules are required across all pages. It then splits the code into two kinds of bundles: a "commons" bundle containing the modules that every page requires, which you must load with a script tag on each page:
<script src="commons.js" charset="utf-8"></script>
And an endpoint bundle for each individual page that you reference normally in a script tag placed after the commons script tag:
<script src="specificpage.bundle.js" charset="utf-8"></script>
The result is that an individual page will not have to load modules that will only ever be used on other pages.
Again, this is a Webpack plugin. I don't know if this functionality is available as a Gulp plugin, because it must have knowledge of all endpoints in order to determine which dependencies are common to them all.
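For reference, a minimal sketch of the webpack.config.js for this setup could look like the following (the entry names and paths are assumptions; this uses the webpack 1-3 style of the plugin):

var webpack = require('webpack');
var path = require('path');

module.exports = {
  // one endpoint per page
  entry: {
    homepage: './src/pages/homepage.js',
    specificpage: './src/pages/specificpage.js'
  },
  output: {
    path: path.join(__dirname, 'dist'),
    filename: '[name].bundle.js'
  },
  plugins: [
    // modules shared by every endpoint are pulled out into commons.js
    new webpack.optimize.CommonsChunkPlugin({
      name: 'commons',
      filename: 'commons.js'
    })
  ]
};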
I'll point you to the very good https://github.com/thedaviddias/Front-End-Checklist
In particular, the following advice:
JavaScript Inline: High - You don't have any JavaScript code inline (mixed with your HTML code).
Concatenation: High - JavaScript files are concatenated.
Minification: High - JavaScript files are minified (you can add the .min suffix).
You can accomplish this with a build tool such as gulp, grunt or webpack (to name the most famous ones). You just need to choose whichever you prefer to use.
If you consider webpack, you can start with my very simple (but easy to understand) starter: https://github.com/dfa1234/snippets-starter
There's not much more to it; basically it comes down to:
Concatenation - https://www.npmjs.com/package/gulp-concat
Minification - https://www.npmjs.com/package/gulp-minify
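As a rough idea, a minimal gulpfile covering both steps could look like this (the source glob and output folder are assumptions):

var gulp = require('gulp');
var concat = require('gulp-concat');
var minify = require('gulp-minify');

gulp.task('scripts', function () {
  return gulp.src('src/**/*.js')                // adjust the glob to your project
    .pipe(concat('all.js'))                     // concatenation
    .pipe(minify({ ext: { min: '.min.js' } }))  // minification (adds all.min.js next to all.js)
    .pipe(gulp.dest('dist'));
});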
Instead of writing all those scripts yourself, you can reuse something from Yeoman, e.g. Fountain, which will save a lot of the time spent typing boilerplate for the concatenation/minification.
Also, if you can use some lazy loading (RequireJS supports it, and some frameworks such as Angular can lazy-load modules), that will improve the performance of your application.
EDIT:
If you want even more performance, you can add a compression middleware to your server, for example this one for Node.js: https://www.npmjs.com/package/compression
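With an Express server, wiring it up is essentially a one-liner (a sketch, assuming Express and a dist folder of static assets):

var compression = require('compression');
var express = require('express');

var app = express();
app.use(compression());            // gzip/deflate responses, including your bundled JS
app.use(express.static('dist'));   // serve the concatenated/minified files
app.listen(3000);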
In my personal opinion, if you have time, the best approach would be to read and understand the purpose of the project, then plan a proper refactor. You are not fixing anything by concatenating; that is just a deployment step.
You should analyze which technologies are being used and, if you want to maintain this code in the long run, make a proper refactor into a much more modern stack. Maybe you can take a seed project with ES6, webpack, Babel... and create a properly maintained repository with real modularity and dependency resolution.
Once you have that, decreasing the load is just a matter of adding the proper tools at build time (Babel, webpack, etc.).
You would also want to add some unit tests and keep working properly :)
I'm working on an old CodeIgniter website where the JavaScript codebase is messy and poorly structured, and I would like to use Webpack to manage the scripts. My goal would be to start using it to bundle the code the way it is now, and gradually refactor it to make use of modules and imports.
At the moment I'm using Gulp in development (but mostly to minify the files) and Carabiner (a CodeIgniter library) to insert the scripts in the views.
The scripts, which are all written as IIFEs, are not bundled, so in every controller function I have an array of the scripts needed on that page. For example:
public function homepage()
{
    $this->carabiner->js([
        ['libraryThatIOnlyNeedHere.min.js'],
        ['myscript1.js'],
        ['myscript2.js'],
        ['myscript3.js'],
    ]);
}
I would like to use Webpack to create a series of bundles so that I end up loading maximum two files on every page: one for the libraries and one for my scripts.
All the practical examples I've seen with Webpack, though, are for Single Page Applications where it's quite easy to bundle everything together.
What would be the best approach in my case? Considering that the code is still not ready to properly use modules and imports, shall I create many entry points in the Webpack configuration file, possibly one for every page, and list every script needed on that page?
The main idea behind webpack is to build a module graph from the provided sources and then combine it into bundles. If your code doesn't have explicit dependencies, it may be a good idea to start by creating a module from every file, using any of the available approaches (AMD, CommonJS, etc.). You will need to create module identifiers and define dependencies for every module. It may be worth reading this article for example.
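For example, a file like myscript1.js from the question could be wrapped as an AMD module roughly like this (the exported API and the jQuery dependency are assumptions about what the IIFE does):

// before: an IIFE attaching something to the global scope
// (function ($) { window.myScript1 = { init: init }; })(jQuery);

// after: the same code as an AMD module with an explicit dependency
define('myscript1', ['jquery'], function ($) {
  function init() {
    // ...existing code from the IIFE body...
  }
  return { init: init };
});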
As an intermediate step, you may want to use a loader like RequireJS to load your modularized code.
Until this step is done, there is not much use for webpack.
If you code an application using ES6 modules and classes, is there any need to use a module loader framework, or is the best practice to just use a build tool to concatenate all the code into a file (or files) and include those using a normal script tag?
Yes. Somebody, somewhere along the line has to load the module.
I think you're conflating compiling modules ahead of time vs loading them individually. Webpack is a module loader that outputs a single file for the browsers to use later, while the System API and requirejs et al load a number of individual files.
There are performance factors on both sides, particularly longer build time (when precompiling) vs longer load time (with multiple files).
Webpack, Browserify, and most other module loaders (with the notable exception of the System API) allow you to define some loaders for certain file types and automagically compile your (S)CSS or templates on the way through, as well as running other tools to uglify or obfuscate your code. The ES6 System API does not provide these features, but is a more robust runtime loader than most.
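For instance, a webpack configuration that compiles ES6 and SCSS on the way through could look roughly like this (webpack 2+ syntax; the specific loaders are assumptions):

module.exports = {
  entry: './src/main.js',
  output: { filename: 'bundle.js' },
  module: {
    rules: [
      { test: /\.js$/, exclude: /node_modules/, use: 'babel-loader' },
      { test: /\.scss$/, use: ['style-loader', 'css-loader', 'sass-loader'] }
    ]
  }
};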
This boils down to two trade-offs:
support for non-JS modules (styles, templates) vs build time
single request and longer build vs many requests and short/no build
Evaluate them for your users (high-bandwidth vs mobile), environment (if you have two dozen CI agents, who cares if the build takes an extra 3s?), and stack (if you have a lot of template files, compiling them AOT could be important).
I've read a lot of articles on AMD solutions like RequireJS or module loaders that follow CommonJS style in Javascript.
Let's say I have an app split into these parts:
App definition that relies on the framework I use
Model 1 that relies on the App definition and the framework
Model 2 that relies on the App definition, Model 1 and my framework
I may write each part as a RequireJS module or a CommonJS module and split my project into as many files as I want, but what's the advantage of writing each part as a module, versus just splitting them into many files, loading them in the right order (to avoid dependency problems), and maybe concatenating all the files into a big one to reduce HTTP requests (as done by the r.js optimizer)?
In my opinion, there are three rather important reasons:
You can create and re-use modules without polluting the global namespace. The more polluted your global namespace is, the bigger the chance of a function/variable collision. That means if you define a function called "foo" and another developer defines a function "foo", one of the functions gets overwritten.
You can structure your code into separate folders and files and requirejs will load them asynchronously when needed, so everything just works.
You can build for production. RequireJS comes with its own build tool called r.js that will concatenate and uglify your JavaScript modules into a single package (or multiple packages). This will improve your page speed, as the user will have to make fewer script requests and load less content (as your JS is uglified).
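To illustrate the first two points, one of the modules from the question's structure could look like this (the module names are illustrative):

// Model1.js - nothing leaks into the global namespace, and the
// dependencies are loaded asynchronously by RequireJS when needed
define(['jquery', 'app'], function ($, app) {
  function Model1(data) {
    this.data = data;
  }
  return Model1;
});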
You can take a look at this simple demo project: https://c9.io/peeter-tomberg/requirejs (in cloud9ide).
To build your modules into a single app, all you have to do is have the requirejs npm package installed and run the command: r.js -o build/build.properties.js
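For reference, a build/build.properties.js file for r.js is just a plain options object; a minimal one could look like this (the paths are assumptions):

({
  baseUrl: '../scripts',                 // where the modules live
  mainConfigFile: '../scripts/main.js',  // reuse the runtime paths/shim config
  name: 'main',                          // the entry module
  out: '../dist/app.min.js'              // single concatenated, uglified output
})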
If there are any questions, ask away.
Edit:
In development, having all modules in separate files is just a good way to structure and manage your code. It also helps you in debugging (e.g. error on "Module.js line 17" instead of "scripts.js line 5373").
For production, you should use the build tool to concatenate and uglify the JavaScript into a single file. This will help the page load more quickly, as you are making fewer requests. Every request you make to load something slows down your page. The slower your page, the fewer points Google gives you. The slower the page, the more frustrated your users will be. The slower your page, the fewer sales you will get.
If you wish to read more about web page performance, look at http://developer.yahoo.com/performance/rules.html
I have many JS files. Some of them depend on each other. Many of them depend on jQuery.
I need a tool that can accept one file as a parameter, fetch all of its dependencies transitively, and compile them into one file in the proper order (based on the dependencies).
Dependency information is not always available inside the files themselves, so it would be nice to have it somewhere outside (an XML file? the folder structure?).
I've heard about the Yahoo JS compiler, Closure and so on, but I am not sure they do what I need.
Look: I have a module "CustomerPage". It should include "validation.js" and "gui.js". Both require jquery.js. And "gui.js" also requires "myFunctions.js".
I want some Ant task or some script that would generate "CustomerPage.js" as the result of all those files.
The tool should check the dependency order, prevent double inclusion and so on.
My project could have around 500 JS files; how could I live without such a tool?
People say "use GWT", but I need plain JS.
You might want to look at one of the AMD-style module loaders, such as RequireJS. Some of these can do what you want for precompiling, and can run in a development mode which makes it easier to debug by including all the files directly.
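For example, the "CustomerPage" module from the question could be expressed like this (the module IDs are assumptions based on the file names given; the RequireJS paths configuration is omitted):

// CustomerPage.js
define(['jquery', 'validation', 'gui'], function ($, validation, gui) {
  // gui.js declares its own dependency on myFunctions.js the same way,
  // so the r.js optimizer can bundle everything in the right order
  // and include jQuery only once
  return {
    init: function () {
      validation.init();
      gui.init();
    }
  };
});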