Dynamically generating a required module with Browserify and Gulp - javascript

Is there a way to invoke Browserify (via Gulp) so that it includes a different file when requiring a module with a given name?
Briefly, the end result I would like is for my Browserify entry point, main.js:
var myPlatformSpecificImplementation = require('./platform');
// go to town
to use the contents of ./path/to/platform-a.js when I run gulp js:platform-a and ./path/to/platform-b.js when I run gulp js:platform-b.
If I were using RequireJS, this would be as simple as modifying the paths option accordingly:
paths: {
platform: './path/to/platform-a'
}
It would be great if I could somehow generate these modules dynamically via gulp's built-in streaming mechanism. In that case, I could, say, pipe a file into gulp-template and on into Browserify.
Thanks

One solution would be to use my pathmodify plugin like so:
gulpfile.js
var
  gulp = require('gulp'),
  browserify = require('browserify'),
  pathmod = require('pathmodify'),
  paths = {a: '/path/to/platform-a.js', b: '/path/to/platform-b.js'};

function platform (version) {
  return function () {
    return browserify('./main')
      .plugin(pathmod(), {mods: [
        pathmod.mod.id('app/platform', paths[version])
      ]})
      .bundle()
      .pipe(...);
  };
}
gulp.task('js:platform-a', platform('a'));
gulp.task('js:platform-b', platform('b'));
main.js
var myPlatformSpecificImplementation = require('app/platform');
I've illustrated this with your require() string changed to app/platform because that allows the simplest implementation of pathmodify without collisions with other ./platform relative paths in other files. But this can be implemented with pathmodify without risking collision (by testing the parent module [main.js in this case] pathname). If it's important to you to keep the ./platform string I'll illustrate that.
Or you could use a transform. Take a look at makeRequireTransform() in benbria/browserify-transform-tools if you don't want to roll your own.
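For the transform route, a rough sketch using makeRequireTransform() could look like the following (platformPath is a placeholder for whatever path you resolve for the current build; none of this is from the question itself):

var transformTools = require('browserify-transform-tools');

// Replace require('app/platform') calls with a require of the
// platform-specific file for this build.
function platformify (platformPath) {
  return transformTools.makeRequireTransform('platformify',
    {evaluateArguments: true},
    function (args, opts, done) {
      if (args[0] === 'app/platform') {
        return done(null, "require('" + platformPath + "')");
      }
      return done(); // leave every other require() untouched
    });
}

You would then register it with something like browserify('./main').transform(platformify(paths[version])) inside the platform() factory above.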
It would be great if I could somehow generate these modules dynamically via gulp's built-in streaming mechanism. In that case, I could, say, pipe a file into gulp-template and on into Browserify.
That's not out of the question, but it's not really easy to do. To do it without touching disk, you'd need to do something like create / gulp.src() a vinyl file and run it through whatever gulp plugins, then convert it to a stream to feed to browserify.
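To give a sense of the plumbing, here is a rough, untested sketch of that idea. It assumes a hypothetical template file (./src/main.tpl.js) plus gulp-template, through2, into-stream and vinyl-source-stream for the Buffer-to-stream glue; browserify accepts a readable stream as an entry point if you give it a basedir:

var gulp = require('gulp');
var template = require('gulp-template');
var browserify = require('browserify');
var source = require('vinyl-source-stream');
var through = require('through2');
var intoStream = require('into-stream');

gulp.task('js:platform-a', function (done) {
  gulp.src('./src/main.tpl.js')                        // hypothetical template
    .pipe(template({platform: './path/to/platform-a'}))
    .pipe(through.obj(function (file, enc, cb) {
      // Hand the rendered contents to browserify as an in-memory entry.
      browserify(intoStream(file.contents), {basedir: './src'})
        .bundle()
        .pipe(source('main.js'))
        .pipe(gulp.dest('./dist'))
        .on('end', done);
      cb();
    }));
});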

Related

browserify and babelify very slow due to large data js files

I have a nodejs project which uses large dictionary lists (millions of entries), stored in js files, that look like this:
module.exports = ["entry1", "entry2", "entry3", "entry4", "entry5", etc.];
and then I use them from the other files like this:
var values = require('./filePath');
This works great, and it works in the browser too (using browserify); however, bundling takes ages - about 10 minutes.
I use the following command to create the bundle:
browserify "./src/myModule.js" --standalone myModule -t [ babelify --presets [ es2015 stage-2 ] --plugins ["transform-es2015-classes", {"loose": true}]
I have tried to avoid parsing of my dictionary js files using --noparse ["path1", "path2", "path3", etc.] but it did not make any difference.
Ideally I would like to just speed up the browserify/babelify process; however, if that's not possible, I would be very happy to find another way (i.e. avoid require) to store and use my lists, so that they don't slow the process down but, crucially, still work in node and in the browser too.
You can bundle the data files separately, so you'll only need to rebundle them when they change. This is possible using the --require (-r) and --external (-x) options.
To create the data bundle, do something like this:
browserify -r ./path/to/data.js -r ./path/to/other/data.js > data-bundle.js
The resulting data-bundle.js will define the require function globally which can be used to obtain any file you listed in the command above. Just make sure you include this bundle in a script tag before your main bundle.
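In the page, that ordering would look something like this (file names are whatever you chose for the two bundles):

<script src="data-bundle.js"></script>
<script src="main-bundle.js"></script>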
It would be nice to be able to --require a glob pattern, but unfortunately browserify does not support this. If you try to use the shell to expand a pattern, the -r option will only apply to the first file, which sucks. You can probably write a shell script that builds a command from an ls or something, to avoid having to list all of the data files explicitly, but that's beyond the scope of the question, I think.
To create your main bundle without rebuilding the data files, simply add an option like this to your command:
-x './path/to/data/*.js'
This tells browserify to basically ignore them and let them be pulled in through the global require function created by your other bundle. As you can see, this does support glob patterns, so it's a bit easier.
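Putting that together with the command from the question, the main bundle build might look roughly like this (output file names are arbitrary):

browserify "./src/myModule.js" --standalone myModule -x './path/to/data/*.js' -t [ babelify --presets [ es2015 stage-2 ] ] > main-bundle.js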
Update:
To make the two bundles into one, just put something like this at the end of a shell script that starts with the browserify command that builds your main bundle:
cat data-bundle.js main-bundle.js > bundle.js
rm main-bundle.js
Unfortunately this will always have to write a copy of data-bundle.js to disk, which may be the ultimate cause of the slowdown, as I mentioned in the comments below. Worth giving a shot, though.
If even that doesn't work, there are some other, much more hacky approaches you might take. I'll pass on going into those for now though, because I don't think they're worth it unless you absolutely must have it as one file and have no other way of doing it. :\
If you have data files, just load them separately and don't include them in the build process:
Format your big data files as JSON
On the server use:
let fs = require('fs');
let yourContent = JSON.parse(fs.readFileSync('path/to/file'));
On the client, use:
let request = require("client-request"); // do npm install client-request
var options = {
  uri: "http://.com/path/to/file",
  json: true
};

var req = request(options, function callback(err, response, body) {
  console.log(response.statusCode);
  if (body) {
    let yourContent = body;
  }
});
Or use any other HTTP request library you prefer.
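For example, in browsers that support it you could use the built-in fetch API and skip the extra dependency entirely (the URL is a placeholder):

fetch('/path/to/file.json')
  .then(function (response) { return response.json(); })
  .then(function (yourContent) {
    // use yourContent here
  });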

Using a node module in my Angularjs app

I have come across a few modules I would like to use in my Angular app, but I'm at a crossroads on how to make them work, as I will need to require() them in my factory file.
Here's the node module I'm interested in: https://github.com/TimNZ/node-xero
On this current project I am using Gulp Angular in Yeoman to generate my boilerplate, and I'm having a hard time figuring out how to make this work if I need to modify any of the gulp scripts.
I was thinking I could just "Browserify" the single file that will use require(), but is this the wrong approach? Should I just browserify all the project files? Is this standard practice?
Any advice is appreciated; this currently has me at a standstill.
All the modules I want to use in relation to Xero seem to be node modules.
The simplest starting point would be to use Browserify to build a standalone bundle that uses a global of your choice.
To do this, create a JS file that requires the node module(s) you want to use. For example, a file named bundle-index.js with this content:
exports.xero = require('node-xero');
You could then run this command to build a standalone module:
browserify --standalone the_global bundle-index.js > bundle.js
Where the_global is a name you find appropriate for the global object that will contain the exports. With the bundle.js file included in a script element, you would be able to use it like this:
var privateApp = new window.the_global.xero.PrivateApplication({ ... });
Doing things this way would involve the least amount of disruption to your current project. And if you are only needing to use Browserify to require third party libraries that don't change frequently, you could start with a simple manual process for building the standalone bundle.
Note that you can export other required modules by adding additional exports to bundle-index.js:
exports.xero = require('node-xero');
exports.someOtherModule = require('some-other-module');
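And for completeness: if you instead decide to Browserify your whole app (as you asked about), the factory itself could require the module directly, along these lines: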
(function () {
  var xero = require('node-xero');

  angular
    .module('app')
    .factory('myFactory', myFactory);

  function myFactory() {
    var helper = {
      myMethod: myMethod,
    };
    return helper;

    function myMethod() {
      xero.doStuff();
    }
  }
})();

Assemble every module into a single .js file

I want to minimize the number of HTTP requests from the client to load scripts in the browser. This is going to be a pretty general question but I still hope I can get some answers because module management in javascript has been a pain so far.
Current situation
Right now, in development, each module is requested individually from the main html template, like this:
<script src="/libraries/jquery.js"></script>
<script src="/controllers/controllername.js"></script>
...
The server runs on Node.js and sends the scripts as they are requested.
Obviously this is the least optimal way of doing so, since all the models, collections, etc. are also separated into their own files which translates into numerous different requests.
As far as research goes
The libraries I have come across (RequireJS using AMD and CommonJS) can request modules from within the main .js file sent to the client, but require a lot of additional work to make each module compliant with each library:
;(function (factory) {
  if (typeof define === 'function' && define.amd) define([], factory);
  else factory();
}(function () {
  // Module code
  exports = moduleName;
}));
My goal
I'd like to create a single file on the server that 'concatenates' all the modules together. If I can do so without having to add more code to the already existing modules that would be perfect. Then I can simply serve that single file to the client when it is requested.
Is this possible?
Additionally, if I do manage to build a single file, should I include the open source libraries in it (jQuery, Angular.js, etc.) or request them from an external cdn on the client side?
What you are asking to do, from what I can tell, is to concatenate your JS files into one file, so that in your main.html you would have this:
<script src="/pathLocation/allMyJSFiles.js"></script>
If my assumption is correct, then the answer would be to use one of the following two tools:
Gulp or Grunt.
I use GULP.
You can either use gulp on a case by case basis, which means calling gulp from the command line to execute gulp code, or use a watch to do it automatically on save.
Beyond getting Gulp working and installing the gulp plugins you need, I will only show the small piece of my setup that answers your question.
In my gulp file I would have something like this
var gulp = require('gulp');
var concat = require('gulp-concat');
...maybe more.
Then I have the file paths I need to be reduced into one file.
var onlyProductionJS = [
'public/application.js',
'public/directives/**/*.js',
'public/controllers/**/*.js',
'public/factories/**/*.js',
'public/filters/**/*.js',
'public/services/**/*.js',
'public/routes.js'
];
and I use this info in a gulp task like the one below
gulp.task('makeOneFileToRuleThemAll', function () {
  return gulp.src(onlyProductionJS)
    .pipe(concat('weHaveTheRing.js'))
    .pipe(gulp.dest('public/'));
});
I then run the task in my command line by calling
gulp makeOneFileToRuleThemAll
This call runs the associated gulp task, which uses 'gulp-concat' to combine all the files into one new file called 'weHaveTheRing.js' and writes it to the destination 'public/'.
Then just include that new file into your main.html
<script src="/pathLocation/weHaveTheRing.js"></script>
As for including all your files, vendor files included, in one file: just make sure that your vendor code runs first. It's probably best to keep those separate unless you have a surefire way of getting your vendor code to load first without any issues.
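If you do end up bundling vendor files into the same output, one simple approach (a sketch; the vendor paths are made up) is to list them first in the source array, since gulp-concat joins files in the order they come through:

var allProductionJS = [
  'public/vendor/jquery.js',
  'public/vendor/angular.js'
].concat(onlyProductionJS);

Then point the 'makeOneFileToRuleThemAll' task's gulp.src() at allProductionJS instead.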
UPDATE
Here is my gulp watch task.
gulp.task('startTheWatchingEye', function () {
  gulp.watch(onlyProductionJS, ['makeOneFileToRuleThemAll']);
});
Then I start up my server like this (yours may differ)
npm start
// in a different terminal window I then type
gulp startTheWatchingEye
NOTE: you can use ANY movie or show reference you wish! :)
Now just code it up, every time you make a change in the specified files GULP will run your task(s).
If you want to say run Karma for your test runner...
add the following to your gulp file
var karma = require('karma').server;
gulp.task('karma', function (done) {
  karma.start({
    configFile: __dirname + '/karma.conf.js'
  }, done);
});
Then add this task karma to your watch I stated above like this...
gulp.task('startTheWatchingEye', function () {
  gulp.watch(onlyProductionJS, ['makeOneFileToRuleThemAll', 'karma']);
});
ALSO
Your specific setup may require a few more gulp modules. Usually, you install Gulp globally, as well as each module, and then use them in your various projects. Just make sure that your project's package.json lists the gulp modules you need in devDependencies.
There are different articles on whether to use Gulp or Grunt. Gulp was made after Grunt with a few additions that Grunt was lacking. I don't know if Grunt lacks them anymore. I like Gulp a lot though and find it very useful with a lot of documentation.
Good luck!
I'd like to create a single file on the server that 'concatenates' all the modules together. If I can do so without having to add more code to the already existing modules that would be perfect.
Sure you can. You can use Grunt or Gulp to do that, more specifically grunt-contrib-concat or gulp-concat
Here's an example of a Gruntfile.js configuration to concat every file under a js directory:
grunt.initConfig({
  concat: {
    dist: {
      files: {
        'dist/built.js': ['js/**/*.js'],
      },
    },
  },
});
Also, you can minify everything after concatenating, using grunt-contrib-uglify.
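A rough sketch of what that could look like in the Gruntfile (assuming grunt-contrib-uglify; output names are arbitrary):

grunt.initConfig({
  concat: {
    dist: {files: {'dist/built.js': ['js/**/*.js']}}
  },
  uglify: {
    dist: {files: {'dist/built.min.js': ['dist/built.js']}}
  }
});

grunt.loadNpmTasks('grunt-contrib-concat');
grunt.loadNpmTasks('grunt-contrib-uglify');
grunt.registerTask('build', ['concat', 'uglify']);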
Both libraries support source maps so, in the case a bug gets to production, you can easily debug.
You can also minify your HTML files using grunt-contrib-htmlmin.
There's also an extremely useful library called grunt-usemin. Usemin lets you use HTML comments to "control" which files get minified (so you don't have to manually add them).
The drawback is that you have to explicitly include them in your HTML via script tags, so no async loading via JavaScript (with RequireJS, for instance).
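For reference, usemin-style build comments look roughly like this (file names made up); everything between the markers gets concatenated and minified into the named output:

<!-- build:js js/app.min.js -->
<script src="js/controllers.js"></script>
<script src="js/app.js"></script>
<!-- endbuild -->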
Additionally, if I do manage to build a single file, should I include the open source libraries in it (jQuery, Angular.js, etc.) or request them from an external cdn on the client side?
That's debatable. Both have pros and cons. Concatenating vendors assures that, if for some reason, the CDN isn't available, your page works as intended. However the file served is bigger so you consume more bandwidth.
In my personal experience, I tend to include vendor libraries that are absolutely essential for the page to run such as AngularJS for instance.
If I understand you correctly, you could use a task runner such as Grunt to concatenate the files for you.
Have a look at the Grunt Concat plugin.
Example configuration from the docs:
// Project configuration.
grunt.initConfig({
  concat: {
    dist: {
      src: ['src/intro.js', 'src/project.js', 'src/outro.js'],
      dest: 'dist/built.js',
    }
  }
});
Otherwise, as you have stated, a 'module loader' system such as Require JS or Browserify may be the way to go.

How do I package a node module with optional submodules?

I'm writing a javascript library that contains a core module and several optional submodules which extend the core module. My target is the browser environment (using Browserify), where I expect a user of my module will only want to use some of my optional submodules and not have to download the rest to the client, much like custom builds work in lodash.
The way I imagine this working:
// Require the core library
var Tasks = require('mymodule');
// We need yaks
require('mymodule/yaks');
// We need razors
require('mymodule/razors');
var tasks = new Tasks(); // Core mymodule functionality
var yak = tasks.find_yak(); // Provided by mymodule/yaks
tasks.shave(yak); // Provided by mymodule/razors
Now, imagine that the mymodule/* namespace has tens of these submodules. The user of the mymodule library only needs to incur the bandwidth cost of the submodules that she uses, but there's no need for an offline build process like lodash uses: a tool like Browserify solves the dependency graph for us and only includes the required code.
Is it possible to package something this way using Node/npm? Am I delusional?
Update: An answer over here seems to suggest that this is possible, but I can't figure out from the npm documentation how to actually structure the files and package.json.
Say that I have these files:
./lib/mymodule.js
./lib/yaks.js
./lib/razors.js
./lib/sharks.js
./lib/jets.js
In my package.json, I'll have:
"main": "./lib/mymodule.js"
But how will node know about the other files under ./lib/?
It's simpler than it seems: when you require a package by its name, you get the "main" file. So require('mymodule') returns "./lib/mymodule.js" (per your package.json "main" prop). To require optional submodules directly, simply require them via their file path.
So to get the yaks submodule: require('mymodule/lib/yaks'). If you wanted to do require('mymodule/yaks') you would need to either change your file structure to match that (move yaks.js to the root folder) or do something tricky where there's a yaks.js at the root and it just does something like: module.exports = require('./lib/yaks');.
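To make that concrete, the root-level proxy files would each be a one-liner (a sketch):

// mymodule/yaks.js (at the package root)
module.exports = require('./lib/yaks');

// mymodule/razors.js (at the package root)
module.exports = require('./lib/razors');

With those in place, require('mymodule/yaks') resolves to the root yaks.js, which in turn loads ./lib/yaks.js.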
Good luck with this yak lib. Sounds hairy :)

build script for big js package

I am writing a quite complex js charting library using d3.js. The code is organised in multiple files (17 to be exact) and each file has one or more classes. Moreover, the code has multiple external dependencies such as d3.js, jQuery, underscore,...
To load the charts, it is necessary to load the different files in an appropriate order to manage the dependencies of the files relative to each other.
I would like to create a build script that would manage all the dependencies (internal and external) to create a standalone library. I know requirejs, love it and would like to use it. But I did not find a way to compile the code server-side without adding a dependency on the client side.
The goal here is really to allow the library to be easily used in any project, like any other library, by loading only one file. As I plan to use the library on the server side too, I would like the solution to be compatible with node.js as well.
Here is a bogus code example that shows what my code looks like.
// in file1.js
var Foo = {};

Foo.Class1 = function (params) {
  this.params = params;
  this.bar = function () {
    return this.params.bar || "bar";
  };
};

// in file2.js
Foo.Class2 = function (params) {
  $.extend(this, new Foo.Class1(params));
  this.bar = function () {
    return this.params.bar || "BAR";
  };
};
There are a lot of projects for combining JavaScript files, for example YUI Compressor, Grunt or Brunch.
I have chosen to go with Grunt. No real reason, except that it looks well documented and very active.
So less than 15 minutes after learning of Grunt's existence, here is a grunt.js file that resolves my problem.
module.exports = function (grunt) {
  // Project configuration.
  grunt.initConfig({
    concat: {
      dist: {
        src: ['file1.js', 'file2.js'],
        dest: 'built.js'
      }
    }
  });
};
Really looking forward to using more Grunt!
cheers Andreas Köberle!
