I am writing a fairly complex JS charting library using d3.js. The code is organized into multiple files (17, to be exact), and each file contains one or more classes. Moreover, the code has several external dependencies such as d3.js, jQuery, underscore, ...
To load the charts, the different files must be loaded in an appropriate order so that their dependencies on each other are satisfied.
I would like to create a build script that manages all the dependencies (internal and external) to create a standalone library. I know RequireJS, love it, and would like to use it, but I have not found a way to make it compile the code server-side without adding a dependency on the client side.
The goal here is really to allow the library to be used in any project as easily as any other library, by loading only one file. As I plan to use the library on the server side too, I would like the solution to be compatible with node.js as well.
Here is a bogus code example that shows what my code looks like.
// in file1.js
var Foo = {};
Foo.Class1 = function (params) {
    this.params = params;
    this.bar = function () {
        return this.params.bar || "bar";
    };
};

// in file2.js
Foo.Class2 = function (params) {
    $.extend(this, new Foo.Class1(params));
    this.bar = function () {
        return this.params.bar || "BAR";
    };
};
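For illustration, here is a runnable sketch of how these two classes behave; $.extend is stubbed with Object.assign so the snippet works without jQuery:

```javascript
// Stand-in for jQuery's $.extend so this sketch runs on its own.
var $ = { extend: Object.assign };

var Foo = {};

Foo.Class1 = function (params) {
    this.params = params;
    this.bar = function () {
        return this.params.bar || "bar";
    };
};

Foo.Class2 = function (params) {
    // Class2 copies Class1's members onto itself, then overrides bar().
    $.extend(this, new Foo.Class1(params));
    this.bar = function () {
        return this.params.bar || "BAR";
    };
};

var c1 = new Foo.Class1({});
var c2 = new Foo.Class2({});
// c1.bar() returns "bar", c2.bar() returns "BAR"
```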
There are a lot of projects for combining JavaScript files, for example YUI Compressor, Grunt or Brunch.
I have chosen to go with Grunt, for no real reason other than that it looks well documented and very active.
So, less than 15 minutes after learning that Grunt exists, here is a grunt.js file that solves my problem:
module.exports = function(grunt) {
    // Project configuration.
    grunt.initConfig({
        concat: {
            dist: {
                src: ['file1.js', 'file2.js'],
                dest: 'built.js'
            }
        }
    });
};
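If a minified build is also wanted, the same file could be extended with Grunt 0.3's built-in min task (the task name is an assumption for that Grunt generation; in newer versions minification lives in grunt-contrib-uglify instead):

```javascript
module.exports = function(grunt) {
    grunt.initConfig({
        concat: {
            dist: {
                src: ['file1.js', 'file2.js'],
                dest: 'built.js'
            }
        },
        // Hypothetical addition: minify the concatenated bundle.
        min: {
            dist: {
                src: ['built.js'],
                dest: 'built.min.js'
            }
        }
    });
};
```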
Really looking forward to using Grunt more!
Cheers, Andreas Köberle!
Related
I have come across a few modules I would like to use in my Angular app, but I am at a crossroads on how to make them work in my Angular app, as I will need to "require()" them in my factory file.
Here's the node module I'm interested in: https://github.com/TimNZ/node-xero
On this current project I am using Gulp Angular in Yeoman to generate my boilerplate, and am having a hard time figuring out how I should make this work if I need to modify any of the gulp scripts.
I was thinking I can just "Browserify" the single file that will use require(), but is this the wrong approach? Should I just browserify all the project files? Is this standard practice?
Any advice is appreciated; this currently has me at a standstill.
All the modules I want to use in relation to Xero seem to be node modules.
The simplest starting point would be to use Browserify to build a standalone bundle that exposes a global of your choice.
To do this, you could create a JS file that requires the node module(s) you want to use. For example, create a file named bundle-index.js with this content:
exports.xero = require('node-xero');
You could then run this command to build a standalone module:
browserify --standalone the_global bundle-index.js > bundle.js
Where the_global is a name you find appropriate for the global object that will contain the exports. With the bundle.js file included in a script element, you would be able to use it like this:
var privateApp = new window.the_global.xero.PrivateApplication({ ... });
Doing things this way would involve the least amount of disruption to your current project. And if you only need Browserify to require third-party libraries that don't change frequently, you could start with a simple manual process for building the standalone bundle.
Note that you can export other required modules by adding additional exports to bundle-index.js:
exports.xero = require('node-xero');
exports.someOtherModule = require('some-other-module');
For reference, a factory that uses the module directly via require() (when the whole app is bundled with Browserify) might look like:
(function () {
    var xero = require('node-xero');

    angular
        .module('app')
        .factory('myFactory', myFactory);

    function myFactory() {
        var helper = {
            myMethod: myMethod
        };

        return helper;

        function myMethod() {
            xero.doStuff();
        }
    }
})();
Is there a way to invoke Browserify (via Gulp) so that it includes a different file when require()-ing a module with a given name?
Briefly, the end result I would like is for my Browserify entry point, main.js:
var myPlatformSpecificImplementation = require('./platform');
// go to town
to use the contents of ./path/to/platform-a.js when I run gulp js:platform-a, and ./path/to/platform-b.js when I run gulp js:platform-b.
If I were using RequireJS, this would be as simple as modifying the paths option accordingly:
paths: {
    platform: './path/to/platform-a'
}
It would be great if I could somehow generate these modules dynamically via gulp's built-in streaming mechanism. In that case, I could, say, pipe a file into gulp-template and on into Browserify.
Thanks
One solution would be to use my pathmodify plugin like so:
gulpfile.js
var
    pathmod = require('pathmodify'),
    paths = {a: '/path/to/platform-a.js', b: '/path/to/platform-b.js'};

function platform (version) {
    return function () {
        return browserify('./main')
            .plugin(pathmod(), {mods: [
                pathmod.mod.id('app/platform', paths[version])
            ]})
            .bundle()
            .pipe(...);
    };
}
gulp.task('js:platform-a', platform('a'));
gulp.task('js:platform-b', platform('b'));
main.js
var myPlatformSpecificImplementation = require('app/platform');
I've illustrated this with your require() string changed to app/platform because that allows the simplest implementation of pathmodify without collisions with other ./platform relative paths in other files. But this can be implemented with pathmodify without risking collision (by testing the parent module [main.js in this case] pathname). If it's important to you to keep the ./platform string I'll illustrate that.
Or you could use a transform. Take a look at makeRequireTransform() in benbria/browserify-transform-tools if you don't want to roll your own.
It would be great if I could somehow generate these modules dynamically via gulp's built-in streaming mechanism. In that case, I could, say, pipe a file into gulp-template and on into Browserify.
That's not out of the question, but it's not really easy to do. To do it without touching disk, you'd need to do something like create / gulp.src() a vinyl file and run it through whatever gulp plugins, then convert it to a stream to feed to browserify.
I'm attempting to use a node module in a typescript application and I've included it in my TypeScript code via require().
I build my typescript to JS and all works well if I run it through node which is great.
I'm looking for a way to use the same code and run it through the browser. Browserify seemed like a sensible solution.
I popped a post build action on my VS project and I'm building a 206KB file via browserify (so it's certainly doing something!). Unfortunately my tiny bit of Typescript doesn't seem to be accessible when it's been browserified.
I'm not that familiar with what browserify should be generating so not quite sure whether what it's wrapped my .js code in is correct (can post snippets if it helps).
So my question is twofold really, I'm looking for the answer to either:
Is there a recommended way to write TypeScript post 0.9 to allow it to be run after it's been browserified?
Is there a way to simply tell TypeScript to pull in the 'require' dependency on its own?
Any thoughts or info in this area would be greatly appreciated.
EDIT:
Just to clarify, I'm generating my .js from the .ts during save/build, and in a post build action, pointing browserify to the output. An abridged js output looks like this:
var TestModule;
(function (TestModule) {
    function launchDrone() {
        var exports = require('ar-drone');
        var client = exports.createClient();
    }
})(TestModule || (TestModule = {}));
When I generate the browserified file from that, I can't access TestModule or launchDrone in any context (certainly not from window). Is there some trick to accessing the browserified code?
It looks like you are potentially not exporting TestModule. Your TestModule file should look like this:
module TestModule {
    export function launchDrone() {
        var exports = require('ar-drone');
        var client = exports.createClient();
    }
}
export = TestModule;
This way you should be able to launch TestModule and TestModule.launchDrone from the window.
I want to publish a module to several component manager systems: npmjs, bower, etc... plus I want to create downloadable builds as well, for example one in AMD style for requirejs, one in commonJS style, one for the global namespace in browser, minified for each of them, etc... These are more than 10 builds.
I currently have a single AMD build, and I wrote unit tests for it using karma, jasmine and requirejs in an amd style. What do you suggest, how to generate the other builds and the tests for them?
I mean I cannot decide what I should use as the base for the transformations. There is a common part in every output package, and a package-dependent part as well.
AMD - requirejs (I am not sure about using the config options)
define(["module", "dependency"], function (module, dependency) {
    var m = {
        config: function (options) {
            //...
        },
        //...
        //do something with the dependency
    };
    m.config(module.config()); //load config options set by require.config()
    return m;
});
commonJS
var dependency = require("dependency");
module.exports = {
    config: function (options) {
        //...
    },
    //...
    //do something with the dependency
};
global
var m = (function (dependency) {
    return {
        config: function (options) {
            //...
        },
        //...
        //do something with the dependency
    };
})(dependency);
I don't know whether I should develop the common code and build it before every test, or develop one of the packages, test it, and write a transformation from that into the other builds.
I intend to use gulp to create the builds and to run the unit tests automatically for each of them before automatically publishing. Oh, and of course I need automatic version number bumping as well. By the way, do you think it is necessary to run the unit tests after the build procedure? I just want to be sure that buggy code is not published...
There are transformation libraries for gulp:
https://github.com/phated/gulp-wrap-amd (commonjs to amd)
https://github.com/gfranko/amdclean (amd to standard js)
https://github.com/phated/gulp-wrap-umd (commonjs to umd I guess)
https://github.com/adamayres/gulp-wrap (not sure of its capabilities yet, maybe universal with custom templates, maybe nothing)
So it is easy to transform one package format into the two other formats. This can be done with the tests as well as with the code. The commonjs version can be tested with node and jasmine-node, the standard js version with karma and jasmine, and the amd version with karma, requirejs and jasmine.
It would be a poor choice to create a common descriptor and convert that before every test. I don't want to create another language, like coffeescript, etc... so conversion between packages is okay.
Calling the unit tests before publishing is okay. There are no other common package types, so these 3 packages will be enough for every component manager I will use.
I am unsure about the versioning. It is not hard to set it manually, but maybe a system like Travis or Maven could help more... I'll figure it out and edit this answer.
I'm working on a mobile website (it's not a single-page app) which has a very small JS footprint (less than 10KB minified and gzipped). There are no libraries or external dependencies; all the code is vanilla JavaScript written in house. It's logically separated into several files that are concatenated before deployment in order to reduce the number of HTTP requests. There is no explicit namespacing in the files. That is, they look something like:
// crossbrowser.js
function getScrollOffset() {
    // implementation
}
function ...
This is less than ideal: there is no explicit dependency resolution, and the scope can easily get polluted from inside the functions. There is no processing done to check this (lint or compiler). As a first step, I thought implementing an explicit module system could safeguard against this and make the code better.
Reading into the CommonJS module format and loaders like RequireJS, Lab.js and others, as far as I understand, when using modules on the browser side they all expect to load them via XHR. I don't want that; I want to keep the single-script format that contains all the modules. My file would look something like:
var define = function () { /* ... */ };
var require = function () { /* ... */ };

define("crossbrowser", function (require, exports, module) {
    exports.getScrollOffset = function () {
        //
    };
    // etc.
});

define("foo", function (require, exports, module) {
    var crossbrowser = require('crossbrowser');

    exports.getNewOffset = function () {
        var offset = crossbrowser.getScrollOffset();
        // do something
        return offset;
    };
});

window.addEventListener('DOMContentLoaded', function () {
    // really dumb example, but I hope it gets the point across
    var crossbrowser = require('crossbrowser'),
        foo = require('foo');

    crossbrowser.scrollTo(foo.getNewOffset());
});
The question is whether any of the loaders work this way, or do I have to write my own implementation of define and require?
One of the benefits of loaders like requirejs is that you can use optimizers to combine all your modules into one minified script during your build process; see the RequireJS Optimizer.
This would allow you to develop in a modular structure but deploy an optimized version.
Have a look at webmake.
Webmake allows to use CommonJS modules in the browser. All js files are merged into one js file. The loader is very lightweight. Webmake also works with CoffeeScript.
If you don't need asynchronous loading, you don't need an AMD loader. For example, if you use r.js to combine modules, you still have to load the whole RequireJS library in your production code. Why not simply go with a compiler? Take a look at the slides on the CommonJS Compiler (http://www.slideshare.net/dsheiko/modular-javascript-with-commonjs-compiler) and the sources/docs at http://dsheiko.github.io/cjsc/