Bundling and Minifying Durandal Applications - javascript

I'm using MVC5/Durandal and wondering what the recommended approach to bundling/minifying a Durandal application would be. I've seen docs on using Weyland, but I will be deploying to an Azure Website and don't see how to leverage this in my .NET-based deployment process. How can I go about configuring automated bundling/minification of my Durandal application when deploying to Azure?

I've spent a bit of time trying to optimize an AngularJS application for one of the biggest banks in Holland. Although it's no Durandal, this might still give you some ideas.
So what did we use for bundling and minification? The out-of-the-box bundling and minification from ASP.NET MVC (the System.Web.Optimization namespace).
You need to get a couple of things in order to leverage this:
Organize your files
Organize your code files in a way that they can easily be bundled. We had a large tree structure under the /app folder in the web project. Something like:
- App
  |- Modules
  |  |- Common
  |  |  |- Directives
  |  |  |- Templates
  |  |  |- Filters
  |  |- User
  |  |- ...
  |- app.js
So the application skeleton was inside app.js and all the other JS files were required by the application. Point being: all SPA code is kept separate from vendor JavaScript files and everything else.
Set up the bundling inside the bundle configuration
That's a breeze now, just do regular-old-bundling from your Global.asax.cs:
Make sure there's a line in the Application_Start() with:
BundleConfig.RegisterBundles(BundleTable.Bundles);
That calls into your BundleConfig class which only needs 1 bundle to pack up the whole /app folder:
bundles.Add(new ScriptBundle("~/bundles/app")
.Include("~/app/*.js")
.IncludeDirectory("~/app", "*.js", true));
We needed app.js to load first, so we included it explicitly at the top. Don't worry, it will not be included twice.
For bundling, only the sequence of the files matters. By including that file explicitly we could control the order, and it worked like a charm.
Minification
Now for minification we had to do some code changes. AngularJS can be used with different types of syntax - some of which can be minified, others give problems.
Example:
angular.module('myapp').controller(function($http,$scope) { ... });
cannot be minified safely. The minifier will change the name of $http to something shorter, after which the injector cannot do dependency injection anymore, since it only knows about things called $http and $scope, not the minified parameter names.
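To make that concrete, here is roughly what a minifier turns the snippet above into (a hypothetical controller name is added so the call is valid Angular); the shortened parameter names mean nothing to the injector:
angular.module('myapp').controller('MyCtrl', function (n, t) {
    n.get('/api/items'); // was $http.get(...) - the injector has no idea that 'n' means $http
});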
So for Angular you need to use a different syntax:
angular.module('myapp').controller(['$http', '$scope', function($http,$scope) { ... }]);
With this, the injector will know that the first argument of the function is '$http' because that's the first string variable in the array. OK, but that's Angular and you're looking for Durandal.
I've heard that Durandal uses AMD, right? So within a module, minification shouldn't be a problem, because the AMD tooling should be smart enough. However, if you're referencing external libraries, you want to make sure everything still works. I've read here that you'll want to use the following syntax for your AMD modules:
define("someModule", ["jquery", "ko"], function($,ko) { ... });
And that gave us a reduction of 80% of the requests and around the same number for the Javascript payload.
Added AngularJS bonus
This might not be of interest to you, but maybe for other readers. The reason we didn't get a 99% reduction of requests is that AngularJS uses something called 'directives'. These come with HTML templates, and those HTML templates still needed to be downloaded every time they were used.
They were also included in our /app folder - hence we had to add an IgnoreRoute in the routeconfig:
routes.IgnoreRoute("app/");
I Googled, but couldn't find anything similair for Durandal. So Angular will go and get all of the small HTML files, but will first check its $templatecache. In case the HTML content is not in the cache, it goes out and downloads it and places it in the cache, so it needs to be downloaded only once.
We, well I, wrote a T4 generator that outputs a JS file in which all the HTML files in the /app folder are added to the $templatecache. So the output would look like:
angular.module('myapp').run(['$templateCache', function($templateCache) {
    // For all *.html files in the /app folder and its children
    $templateCache.put('...filename...', '...content of html file...');
}]);
Because this .js file was inside the /app folder, it would immediately get bundled with the application, no extra configuration required. This got the requests for the whole application down to just one. Since the amount of HTML was quite small, it turned out to be faster to do one larger request than multiple smaller ones.
Point is: if Durandal has something similar and it goes looking for templates, find its caching mechanism (because it will have one) and try to tap into that.
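For the Durandal/RequireJS side, the analogous mechanism is the text plugin that Durandal uses to fetch its HTML views: when you run the RequireJS optimizer (which Weyland wraps), each text! dependency can be inlined into the bundle as a module that simply returns the markup. Roughly like this (the path and markup are illustrative):
define('text!views/home.html', [], function () {
    return '<div class="home"><h1 data-bind="text: title"></h1></div>';
});
Once the views are inlined like that, they ship inside the single optimized file and no separate HTML requests are made.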
Controlling bundling and minification
I'll quote this site: http://www.asp.net/mvc/overview/performance/bundling-and-minification
Bundling and minification is enabled or disabled by setting the value of the debug attribute in the compilation element in the Web.config file. In the following XML, debug is set to true so bundling and minification is disabled.
<system.web>
    <compilation debug="true" />
</system.web>
So for your release build this flag shouldn't be set, and thus bundling and minification will happen.
But of course you will want to test this locally as well - you can either remove the debug attribute from your Web.config or override it with BundleTable.EnableOptimizations = true;
Deployment to Azure
Then you mention deployment to Azure. I don't see how this would be any different from deploying to any other server. We used Web Deploy. Bundling and minification don't happen at build time, so there are no changes to the build process. Also, the optimization framework is deployed with the site, so there is nothing special to do for deployment either.
Maybe one thing though: you could consider adding the CDN for the libraries you are using:
bundles.Add(new ScriptBundle("~/bundles/jquery", "http://ajax.aspnetcdn.com/ajax/jQuery/jquery-1.7.1.min.js")
In case the CDN location of jQuery was already cached by the client's browser, you'll save another request.
Measuring the performance was easy: just open up the Network tab in Chrome and reload the page (make sure it's not caching). Check the total number of requests and the total amount of data downloaded. Like I said: we saw a huge improvement.
Well, hope it helps or points you in the right direction.

The other answers are pretty complicated. I've just gone through this with a simple(r) approach, written up here:
https://lifelibertycode.wordpress.com/2015/04/14/how-to-bundle-up-your-mvc-durandal-app/
The steps below:
Step 1: Install Node
Step 2: Install Gulp
$ npm install --global gulp
$ npm install --save-dev gulp
Step 3: Create your gulpfile.js
This should be at the root of your project, and should initially contain this:
var gulp = require('gulp');

gulp.task('default', function() {
    // place code for your default task here
});
Step 4: Install gulp-durandal
npm install gulp-durandal --save-dev
Step 5: Update your gulpfile.js
var durandal = require('gulp-durandal');

gulp.task('durandal', function() {
    durandal({
        baseDir: 'app',    // same as default, so not really required.
        main: 'main.js',   // same as default, so not really required.
        output: 'main.js', // same as default, so not really required.
        almond: true,
        minify: true
    })
    .pipe(gulp.dest('dir/to/save/the/output'));
});
Step 6: Add a post-build event to your project
if '$(Configuration)'=='Release' (
cd $(ProjectDir)
gulp durandal
)
Step 7: Add a pre-build event to your project
I needed this because occasionally gulp would hang when generating the new main-built.js on top of an existing version. So I just delete the old version before the build begins:
if '$(Configuration)'=='Release' (
cd $(ProjectDir)/app
del main-built.js
del main-built.js.map
)
Now, when you build your project, you’ll generate a new main-built.js file each time that can be served down to your clients. Sweet.
At this point, you probably have some concerns.
How do I keep my files un-bundled when I’m debugging?
@if (HttpContext.Current.IsDebuggingEnabled) {
    <script type="text/javascript" src="~/Scripts/require.js" data-main="/App/main"></script>
} else {
    @Scripts.Render("~/Scripts/main-built")
}
Where ‘main-built’ is defined in your BundleConfig:
bundles.Add(new ScriptBundle("~/Scripts/main-built").Include(
"~/app/main-built.js"));
How do I bust cache when I have new stuff to ship?
If you’re using the above approach, bundling will take care of this for you. ASP.NET will detect a change to your main-built.js file and append a unique identifier to your bundles to bust the cache.
What if my client has downloaded my SPA, and then I ship an update. Won’t the (outdated) client-side code stick around until they refresh?
Yup. Unless you leverage build versioning to tell the client when it’s out of date, and then tell the user.
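If you want to go down that route, here is a minimal sketch of the idea (the /version.json endpoint, polling interval and message are illustrative, not part of the setup above; fetch is used for brevity, any AJAX helper works): have the build write a version stamp somewhere the client can fetch, poll it, and nudge the user when it changes.
var shippedVersion = null;

setInterval(function () {
    fetch('/version.json', { cache: 'no-store' })
        .then(function (response) { return response.json(); })
        .then(function (info) {
            if (shippedVersion === null) {
                shippedVersion = info.version; // remember the version this client loaded with
            } else if (info.version !== shippedVersion) {
                // a newer build has shipped since this client loaded the SPA
                alert('A new version is available - please refresh the page.');
            }
        });
}, 5 * 60 * 1000); // check every five minutes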

I happen to have written a blog post about this:
https://javascriptkicks.com/articles/4230
Hopefully that helps you out

Related

Generate Javascript source map without adding NPM to project

My web project serves static web pages and scripts. There is no preprocessing done at all. All changes are done in the client.
It has a main page that lists some other pages. When the user clicks a link, jQuery-UI will load the associated HTML page and any linked Javascript/CSS files.
This works great, and gives us flexibility to add/remove new pages with ease. The problem is when we want to debug the loaded JS and the browser appears not to know about it.
It took me a while to find out about source maps, and then to find out that they are all geared towards framework projects like Angular and React.
We don't want that in this project. Just basic HTML & JS that we can plug in and reload. I realize we may need to run an external command to generate the source maps, but it must be a free-standing tool - no NPM or frameworks.
It's an internal web project, so security/privacy is not a concern. We want the clients to see the source code if they need to.
I know there are a lot of Questions about JS source maps, but every single one that I've found assumes using some framework tools. We have no framework and do not want one in this project.
Any suggestions on how we can generate the source maps we need, or do you know of any alternative to debug simple JS loaded via jQuery?
First and foremost, you do not need to use Angular/React for sourcemaps to work. These are just a common use case.
Secondly, NPM is exactly what it says it is; a package manager. So you don't need NPM either.
What you need is a build process. You're quite clear that you don't want to minify the js, but you do want sourcemaps. This is a common configuration used to debug js, and is typically accomplished by "building" or "Uglifying" the code with all of the optimizations disabled.
You could likely avoid NPM entirely if you were willing to use the Closure Compiler, but that is a can of worms and I'd suggest you avoid it.
Instead I suggest installing Uglify* globally* (per dev machine) with NPM. This is a "once per machine" step.
npm install uglify-js -g
*: Hopefully this sidesteps your NPM-less requirement. I did experiment with cloning the Uglify repo directly, but even then you'd need to get it running, and to do that, at a minimum, you'd want to install its dependencies with NPM. I'd love to be proven wrong about this, but I figured it was rather unrelated to this post.
Then write a build script using it. I've attempted to gather the parts for you here:
File: gen-map.sh
#!/usr/bin/env bash
uglifyjs file1.js --config-file gen-map.json \
-o file1.min.js \
--source-map "root='http://foo.com/src',url='file1.min.js.map'"
cat file1.min.js
File: gen-map.json
{
    "compress": false,
    "output": {
        "beautify": true
    },
    "sourceMap": {
        "content": "content from file1.js.map",
        "url": "file1.js.map"
    }
}
File: file1.js
var b = function() {
    console.log('b');
};
function c() {
    console.log('c');
};
console.log('a');
b();
c();
(function() {
    console.log('d');
})();
File: file1.min.js
var b = function() {
    console.log("b");
};
function c() {
    console.log("c");
}
console.log("a");
b();
c();
(function() {
    console.log("d");
})();
//# sourceMappingURL=file1.min.js.map
File: file1.min.js.map
{"version":3,"sources":["file1.js"],"names":["b","console","log","c"],"mappings":"AAAA,IAAIA,IAAI;IACNC,QAAQC,IAAI;;;AAGd,SAASC;IACPF,QAAQC,IAAI;;;AAGdD,QAAQC,IAAI;;AACZF;;AACAG;;CACA;IACEF,QAAQC,IAAI;EADd","sourceRoot":"http://foo.com/src"}
*: Uglify-es if you're using ES6 features.
After that the only thing left to do would be to update the paths, filenames, and actual script tags. Using this config you must still serve the min.js file, although it seems possible that manually tagging your JS file to point to the map might work...
With this config, you'll need to keep your built files up to date by running:
🐚 ./gen-map.sh
Doing this with npm and gulp would be simpler, but if you don't mind another package, there are two generic file-change watchers that I can suggest:
Nodemon:
🐚 nodemon gen-map.sh
entr
🐚 ls *.js | entr ./gen-map.sh

Moving from Gulp/Grunt to Webpack for an AngularJs 1.x project

I have a two-year-old AngularJs 1.x project, which is built with Gulp for development and Grunt for production (don't ask me why; I don't know either).
The build process is basically:
Compile all scss files into one css file
Merge all JS files into one JS file. We are not using any import mechanism. Each file is basically an AngularJS controller, component, service or filter. Something like this:
angular.module("myApp").controller("myCtrl", function() {//...});
Merge all html templates into one JS file. Each template is hardcoded with $templateCache.
Moving assets like images and fonts into the build folder.
Moving third-party libraries into the build folder.
Now I want to switch to webpack for this project. I want to incrementally modernize this project, but the first step would be just building it with webpack with a similar process like the above. I would like to keep the code base as much the same as possible. I don't want to add import for all the JS files yet. There are too many. I would also like to add a babel-loader.
I have some basic concepts about webpack, but never really customized the configuration myself.
Would anyone please give me some pointers? Like which loaders/plugins would I need, etc.? Thanks!
My process for such a transition was gradual; I had a similar Grunt configuration.
These are my notes & steps, in order, for transitioning to a Webpack stack.
The longest step was refactoring the code so it uses ES6 imports/exports (yeah, I know you said it's not a step you want to take yet, but it is important in order to avoid hacks).
In the end each file looks like this:
// my-component.js
class MyComponentController { ... }

export const MyComponent = {
    bindings: {...},
    controller: MyComponentController,
    template: `...`
}

// main.js
import {MyComponent} from 'my-component';

angular.module('my-module').component('myComponent', MyComponent);
To avoid going over all the files and changing them at once, we renamed all the existing js files, adding a .not_module.js suffix.
Those files were the old untouched files.
We added grunt-webpack as a step to the old build system (based on Grunt).
It processed all the new files (those without the .not_module.js suffix) via Webpack.
That step produced a single bundle containing all the files that had already been converted; we then added that bundle to the concat step.
One by one, each converted file gradually moved from being processed by the Grunt tasks to being processed by Webpack.
You can take that webpack.config as a reference.
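For what it's worth, here is a rough sketch of the kind of webpack.config.js this setup amounts to (the entry point, output path and the .not_module.js exclusion are illustrative; adjust them to your own tree):
// webpack.config.js - sketch only
const path = require('path');

module.exports = {
    entry: './src/main.js',                      // the ES-module entry point
    output: {
        path: path.resolve(__dirname, 'build'),
        filename: 'webpack-bundle.js'            // this file then goes into the Grunt concat step
    },
    module: {
        rules: [
            {
                test: /\.js$/,
                exclude: [/node_modules/, /\.not_module\.js$/], // old, untouched files stay with Grunt
                use: { loader: 'babel-loader' }
            }
        ]
    }
};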
Good luck.

Is there a way to cause the JS engine to load a .js file without explicitly importing something from it?

Maybe I'm trying to do something silly, but I've got a web application (Angular2+), and I'm trying to build it in an extensible/modular way. In particular, I've got various, well, modules for lack of a better term, that I'd like to be able to include or not, depending on what kind of deployment is desired. These modules include various functionality that is implemented via extending base classes.
To simplify things, imagine there is a GenericModuleDefinition class, and there are two modules - ModuleOne.js and ModuleTwo.js. The first defines a ModuleOneDefinitionClass, instantiates an exported instance ModuleOneDefinition, and then registers it with the ModuleRegistry. The second module does an analogous thing.
(To be clear - it registers the ModuleXXXDefinition object with the ModuleRegistry when the ModuleXXX.js file is run (e.g. because some other .js file imports one of its exports). If it is not run, then clearly nothing gets registered - and this is the problem I'm having, as I describe below.)
The ModuleRegistry has some methods that will iterate over all the Modules and call their individual methods. In this example, there might be a method called ModuleRegistry.initAllModules(), which then calls the initModule() method on each of the registered Modules.
At startup, my application (say, in index.js) calls ModuleRegistry.initAllModules(). Obviously, because index.js imports the exported ModuleRegistry symbol, this will cause the ModuleRegistry.js code to get pulled in, but since none of the exports from either of the two Module .js files is explicitly referenced, these files will not have been pulled in, and so the ModuleOneDefinition and ModuleTwoDefinition objects will not have been instantiated and registered with the ModuleRegistry - so the call to initAllModules() will be for naught.
Obviously, I could just put meaningless references to each of these ModuleDefinition objects in my index.js, which would force them to be pulled in, so that they were registered by the time I call initAllModules(). But this requires changes to the index.js file depending on whether I want to deploy it with ModuleTwo or without. I was hoping to have the mere existence of the ModuleTwo.js be enough to cause the file to get pulled in and the resulting ModuleTwoDefinition to get registered with the ModuleRegistry.
Is there a standard way to handle this kind of situation? Am I stuck having to edit some global file (either index.js or some other file it references) so that it has information about all the included Modules so that it can then go and load them? Or is there a clever way to cause JavaScript to execute all the .js files in a directory so that merely copying the files it would be enough to get them to load at startup?
A clever way to cause Node.js (not JavaScript in general) to execute all the .js files in a directory:
var fs = require('fs')     // node filesystem
var path = require('path') // node path

function hasJsExtension(item) {
    return item != 'index.js' && path.extname(item) === '.js'
}

function pathHere(item) {
    return path.resolve('.', item) // absolute path, so require() won't mistake it for a package name
}

fs.readdir('./', function(err, list) {
    if (err) return err
    list.filter(hasJsExtension).map(pathHere).forEach(require) // require them all
})
Angular is pretty different, all the more so if it is ng serve that checks whether your app needs a module and, if so, serves the corresponding js file whenever it is needed, not at first load time.
In fact your situation reminds me of C++, with header files for declarations and cpp files for implementations; maybe you just need a defineAllModules function before initAllModules.
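For example, a minimal sketch of such a defineAllModules: a file whose only job is to hold a list of side-effect-only imports, so importing it pulls every module file in and lets each one register itself (file names are illustrative):
// define-all-modules.js - sketch; importing a module purely for its side effects
// is enough to run its top-level code and register it with the ModuleRegistry.
import './modules/ModuleOne';
import './modules/ModuleTwo'; // drop this line for deployments without ModuleTwo

export function defineAllModules() {
    // intentionally empty: the imports above have already done the registering
}
index.js would then import this one file (or call defineAllModules()) before ModuleRegistry.initAllModules(); the per-deployment choice lives in this single list instead of being scattered across index.js.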
Another option could be to find out how to exclude those modules from ng serve and include them as scripts in your HTML before the others; they would then be defined (if present, and therefore served) and called by Angular if necessary. The only caveat is an error in the console if a script tag is not fetched, but your app will work anyway, if it is supposed to.
But either way, you would be declaring/defining those modules somewhere: in ng serve and also in the HTML.
In your particular case, and without meaning to undervalue ng serve: is the total JS for your app really too heavy to be served at once (minified and all)? The ready-to-go solution may be one of the many tools that build and rebuild your production all.js from your dev js folder at will, or, like you said, on a drag & drop into your folder.
Such a tool is, again, server-side, but even if you can only push/FTP your JavaScript, you could use it in your preferred dev environment and just push the new version. To see a list of such tools, google 'YourDevEnvironment bundle javascript'.
To do more with ng serve and append static js files under specific conditions, you should use webpack, so the first option I see here is to eject your webpack configuration; after that you can specify what Angular should load or not.
With that said, I will give an example:
With Angular CLI and ng serve, any external JavaScript files you want to include have to be put inside the scripts array in the angular-cli.json file. However, you cannot control which file should be included and which should not.
By using the webpack configuration you can control all of this: pass a flag from your terminal to the webpack config file and do the whole selection right there.
Example:
module.exports = (env) => {
    var plugins;

    // ScriptsWebpackPlugin is pulled in by the ejected Angular CLI webpack config
    if (env.commandLineParameter == 'production') {
        plugins = [
            new ScriptsWebpackPlugin({
                "name": "scripts",
                "sourceMap": true,
                "filename": "scripts.bundle.js",
                "scripts": [
                    "D:\\Tutorial\\Angular\\demo-project\\node_modules\\bootstrap\\dist\\bootstrap.min.js",
                    "D:\\Tutorial\\Angular\\demo-project\\node_modules\\jquery\\dist\\jquery.min.js"
                ],
                "basePath": "D:\\Tutorial\\Angular\\demo-project"
            })
        ];
    } else {
        plugins = [
            new ScriptsWebpackPlugin({
                "name": "scripts",
                "sourceMap": true,
                "filename": "scripts.bundle.js",
                "scripts": [
                    "D:\\Tutorial\\Angular\\demo-project\\node_modules\\bootstrap\\dist\\bootstrap.min.js"
                ],
                "basePath": "D:\\Tutorial\\Angular\\demo-project"
            })
        ];
    }

    return {
        plugins: plugins,
        // other webpack configuration
    };
};
The scripts.bundle.js file will be loaded before your main app bundle, so you can control what gets loaded when you run npm run start instead of ng serve.
To eject your webpack configuration, use ng eject.
Generally speaking, when you need to control part of what ng serve does, you should extract your own webpack config and customize it as you want.

Assemble every module into a single .js file

I want to minimize the number of HTTP requests from the client to load scripts in the browser. This is going to be a pretty general question but I still hope I can get some answers because module management in javascript has been a pain so far.
Current situation
Right now, in development, each module is requested individually from the main html template, like this:
<script src="/libraries/jquery.js"></script>
<script src="/controllers/controllername.js"></script>
...
The server runs on Node.js and sends the scripts as they are requested.
Obviously this is the least optimal way of doing so, since all the models, collections, etc. are also separated into their own files which translates into numerous different requests.
As far as research goes
The libraries I have come across (RequireJS using AMD and CommonJS) can request modules from within the main .js file sent to the client, but require a lot of additional work to make each module compliant with each library:
;(function(factory){
    if (typeof define === 'function' && define.amd) define([], factory);
    else factory();
}(function(){
    // Module code
    exports = moduleName;
}));
My goal
I'd like to create a single file on the server that 'concatenates' all the modules together. If I can do so without having to add more code to the already existing modules that would be perfect. Then I can simply serve that single file to the client when it is requested.
Is this possible?
Additionally, if I do manage to build a single file, should I include the open source libraries in it (jQuery, Angular.js, etc.) or request them from an external cdn on the client side?
What you are asking to do, from what I can tell, is concatenate your js files into one file, so that in your main.html you would have this:
<script src="/pathLocation/allMyJSFiles.js"></script>
If my assumption is correct, then the answer would be to use one of the following two tools: Gulp or Grunt.
I use GULP.
You can either use gulp on a case-by-case basis, which means calling gulp from the command line to execute your gulp tasks, or use a watch to do it automatically on save.
Aside from getting gulp itself working and installing the gulp modules you need, I will only show the small part of my setup that answers your question.
In my gulp file I would have something like this
var gulp = require('gulp');
var concat = require('gulp-concat');
...maybe more.
Then I list the file paths that need to be reduced into one file.
var onlyProductionJS = [
    'public/application.js',
    'public/directives/**/*.js',
    'public/controllers/**/*.js',
    'public/factories/**/*.js',
    'public/filters/**/*.js',
    'public/services/**/*.js',
    'public/routes.js'
];
and I use this info in a gulp task like the one below
gulp.task('makeOneFileToRuleThemAll', function(){
    return gulp.src(onlyProductionJS)
        .pipe(concat('weHaveTheRing.js'))
        .pipe(gulp.dest('public/'));
});
I then run the task in my command line by calling
gulp makeOneFileToRuleThemAll
This call runs the associated gulp task, which uses 'gulp-concat' to pull all the files together into one new file called 'weHaveTheRing.js' and creates that file in the destination 'public/'.
Then just include that new file into your main.html
<script src="/pathLocation/weHaveTheRing.js"></script>
As for putting everything into one file, vendor files included: just make sure that your vendor code runs first. It's probably best to keep those separate unless you have a sure-fire way of getting your vendor code to load first without any issues.
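If you do keep them separate, here is a sketch of a second task for the vendor code (paths are illustrative): list the files explicitly so the order is predictable, with jQuery before anything that depends on it.
var vendorScripts = [
    'public/libraries/jquery.js',  // jQuery first, other libraries may depend on it
    'public/libraries/angular.js'
];

gulp.task('makeVendorFile', function () {
    return gulp.src(vendorScripts)
        .pipe(concat('vendor.js'))
        .pipe(gulp.dest('public/'));
});
Then include vendor.js with its own script tag before weHaveTheRing.js in your main.html.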
UPDATE
Here is my gulp watch task.
gulp.task('startTheWatchingEye', function () {
    gulp.watch(onlyProductionJS, ['makeOneFileToRuleThemAll']);
});
Then I start up my server like this (yours may differ)
npm start
// in a different terminal window I then type
gulp startTheWatchingEye
NOTE: you can use ANY movie or show reference you wish! :)
Now just code away; every time you make a change to the specified files, gulp will run your task(s).
If you want to, say, run Karma as your test runner...
add the following to your gulp file
var karma = require('karma').server;

gulp.task('karma', function(done){
    karma.start({
        configFile: __dirname + '/karma.conf.js'
    }, done);
});
Then add this karma task to the watch I showed above, like this...
gulp.task('startTheWatchingEye', function(){
    gulp.watch(onlyProductionJS, ['makeOneFileToRuleThemAll', 'karma']);
});
ALSO
Your specific setup may require a few more gulp modules. Usually, you install Gulp globally, as well as each module, and then use them in your various projects. Just make sure that your project's package.json lists the gulp modules you need in devDependencies.
There are different articles on whether to use Gulp or Grunt. Gulp was made after Grunt with a few additions that Grunt was lacking. I don't know if Grunt lacks them anymore. I like Gulp a lot though and find it very useful with a lot of documentation.
Good luck!
I'd like to create a single file on the server that 'concatenates' all the modules together. If I can do so without having to add more code to the already existing modules that would be perfect.
Sure you can. You can use Grunt or Gulp to do that, more specifically grunt-contrib-concat or gulp-concat
Here's an example of a Gruntfile.js configuration to concat every file under a js directory:
grunt.initConfig({
    concat: {
        dist: {
            files: {
                'dist/built.js': ['js/**/**.js'],
            },
        },
    },
});
Also, you can minify everything after concatenating, using grunt-contrib-uglify.
Both libraries support source maps, so in case a bug makes it to production you can easily debug.
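As a sketch (the target and file names are illustrative, and in practice this goes into the same grunt.initConfig as the concat target above), minifying the concatenated bundle with grunt-contrib-uglify while keeping a source map looks roughly like this:
grunt.initConfig({
    uglify: {
        dist: {
            options: {
                sourceMap: true  // writes dist/built.min.js.map next to the output
            },
            files: {
                'dist/built.min.js': ['dist/built.js']
            }
        }
    }
});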
You can also minify your HTML files using grunt-contrib-htmlmin.
There's also an extremely useful library called grunt-usemin. Usemin lets you use HTML comments to "control" which files get minified (so you don't have to manually add them).
The drawback is that you have to explicitly include them in your HTML via script tags, so no async loading via JavaScript (with RequireJS for instance).
Additionally, if I do manage to build a single file, should I include the open source libraries in it (jQuery, Angular.js, etc.) or request them from an external cdn on the client side?
That's debatable. Both have pros and cons. Concatenating vendors ensures that, if for some reason the CDN isn't available, your page still works as intended. However, the file served is bigger, so you consume more bandwidth.
In my personal experience, I tend to include vendor libraries that are absolutely essential for the page to run such as AngularJS for instance.
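A common middle ground, sketched here with the question's /libraries path (adjust to your own files), is to load the library from the CDN and add one inline script right after the CDN script tag that falls back to the local copy if the CDN failed:
// placed in an inline <script> immediately after the CDN <script> tag for jQuery
window.jQuery || document.write('<script src="/libraries/jquery.js"><\/script>');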
If I understand you correctly, you could use a task runner such as Grunt to concatenate the files for you.
Have a look at the Grunt Concat plugin.
Example configuration from the docs:
// Project configuration.
grunt.initConfig({
    concat: {
        dist: {
            src: ['src/intro.js', 'src/project.js', 'src/outro.js'],
            dest: 'dist/built.js',
        }
    }
});
Otherwise, as you have stated, a 'module loader' system such as RequireJS or Browserify may be the way to go.

Yii2: Registering Asset Bundle vs registering external Js file

Hi, I wanted to know the advantage of registering an asset bundle following the process described in the docs, i.e.
Process one
in AppAsset.php
public $js = [
    'js/myjsfile.js'
];
then in the view file
adding Namespace like
namespace app\assets;
and then adding the use statement like
use app\assets\AppAsset;
AppAsset::register($this);
Instead of doing all this if I use
Process Two
$this->registerJsFile('js/myjsfile.js', ['position' => $this::POS_READY]);
it works fine.
So why should I use Process One.
Any advantage and reason for this will be greatly appreciated.
If I follow process one, do I need to add all the js files in AppAsset.php individually?
Thanks.
Asset bundles have some advantages over normal registration. Apart from what @deacs said in their answer, here are some others:
Asset bundles can publish the files to the assets directory if they are not in a web-accessible directory.
Asset bundles can handle LESS files (in the case of CSS) as well as compress the assets.
They make the code elegant, especially when it comes to resolving dependencies, and hence promote reusability.
All the features that make bundles shine can be found in the docs.
One of the main reasons for using an Asset Bundle is that your assets' paths will always be correct. Consider:
$this->registerJsFile('js/myjsfile.js', ['position'=>$this::POS_READY]);
will generate something like:
<script src="js/myjsfile.js"></script>
That works great for non-urlManager-enabled URLs, e.g. http://localhost/yiiproject/index.php?r=user/update&id=8, because your browser looks for the js file at /yiiproject/js/myjsfile.js.
But if you enable urlManager, your url will look like http://localhost/yiiproject/user/update/8, which means your browser will look for your js file at: /yiiproject/user/update/8/js/myjsfile.js.
You could overcome this problem by using:
$this->registerJsFile(Yii::$app->request->baseUrl.'/js/myjsfile.js', ['position'=>$this::POS_READY]);
But the asset bundle basically does that for you.
Using asset bundles, you can also serve the latest version from the 'vendor' folder, so if you need to update some library you don't have to do it manually, since Composer already does it.
