I would like to combine multiple files and modules into one single JavaScript file, shipped as a small library that will be used in the browser (and partly reused on Node.js).
How can I achieve this? The result should look roughly like this:
(function() {
// Modules are minified
var blake = require('blakejs'); // The "entire" module in here ~350 lines...
// Other files are obfuscated
// File1.js
var first = function() {}
first.prototype.doHash = function() {}
// File2.js
var second = function() {}
second.prototype.doSomethingElse = function() {}
//OtherFile.js
//Webworker.js
}).call(this)
I'm using JavaScript Obfuscator to obfuscate the script, but I'd like to apply different minification/obfuscation settings to the modules, as they don't need obfuscation, only minification. My current gulp task only handles a single file:
gulp.task('minify', function () {
return gulp
.src('./src/file1.js')
.pipe(rename({suffix: '.min'}))
.pipe(javascriptObfuscator())
.pipe(gulp.dest('public'))
})
Is it possible?
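In case it helps, here is one way the pieces could be wired up in gulp: obfuscate only your own files via gulp-if, then concatenate, wrap and minify the whole bundle. The glob paths, the modules/ directory test and the gulp-javascript-obfuscator package name are assumptions; note also that plain concatenation won't resolve require('blakejs') for you, so the module source has to be available as a file (or you switch to a bundler such as browserify/webpack).
var gulp = require('gulp');
var concat = require('gulp-concat');
var uglify = require('gulp-uglify');
var gulpIf = require('gulp-if');
var insert = require('gulp-insert');
var javascriptObfuscator = require('gulp-javascript-obfuscator');
gulp.task('bundle', function () {
    return gulp
        // hypothetical globs: the third-party modules plus my own files
        .src(['./src/modules/**/*.js', './src/file1.js', './src/file2.js', './src/webworker.js'])
        // obfuscate only my own files; the modules pass through untouched
        .pipe(gulpIf(function (file) {
            return !/[\\/]modules[\\/]/.test(file.path);
        }, javascriptObfuscator()))
        .pipe(concat('library.min.js'))
        // wrap everything in a single IIFE so nothing leaks into the global scope
        .pipe(insert.transform(function (contents) {
            return '(function () {' + contents + '}).call(this);';
        }))
        // minify the whole bundle, modules included
        .pipe(uglify())
        .pipe(gulp.dest('public'));
});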
Related
It is possible to concatenate several JavaScript files into one and then enclose the resulting code into a self invoking function, so all the code from these JS files turns into:
;(function () { /* code from the JS files */ })();
In this way there is no pollution of the global namespace at all, and internal modules defined in separate JS files can still be accessed across those files.
Aside from making debugging of the internal modules less convenient (as it won't be possible to browse internal objects, call their functions manually from the console, etc.), are there any potential downsides to this approach?
Or, is there a better way to allow internal modules to be defined in separate JS files that can be accessed across JS files?
Just in case, here is the gulp code that concatenates the JavaScript files into one and encloses the resulting code in a self-invoking function:
var gulp = require("gulp");
var sourcemaps = require("gulp-sourcemaps");
var uglify = require('gulp-uglify');
var concat = require("gulp-concat");
var insert = require("gulp-insert");
gulp.task("default", function () {
return gulp.src("js/app/*.js")
.pipe(sourcemaps.init())
.pipe(concat("app.js"))
.pipe(insert.transform(function(contents, file) {
return ';(function () {' + contents + '})();';
}))
.pipe(uglify())
.pipe(sourcemaps.write("."))
.pipe(gulp.dest("build/assets/js"));
});
I'm configuring Grunt with grunt-contrib-concat to concatenate about 20 JavaScript files. They have to be in a specific order, and I'm wondering if there is a neat way to do this without messing up my Gruntfile.js.
What I did, and what worked well, was declaring a variable called 'libraries' with a function that returns an array of all the files in the right order.
var libraries = new (function () {
return [
'/javascript/libs/jquery.min.js',
'/javascript/libs/jquery.address.js',
'/javascript/libs/jquery.console.js'
];
});
And then concat (simplified, just an example):
concat: {
libs: {
files: {
'libs.js' : [libraries],
},
},
main: {
files: {
'main.js' : [main]
}
}
},
So when I reference 'libraries' in my task configuration everything works fine, but I would like to declare this list in a separate file.
Unfortunately I couldn't find anything, nor do I know if this is even possible. Hope someone can help me out! Thanks in advance :-)
I found a solution! Since Grunt is built on Node.js, it's possible to use module.exports. What I did was set up an external file called libraries.js, which is in my Grunt directory.
var exports = module.exports = {};
exports.customLibrary = function () {
return [
// Path to a library
// Path to another library
// and so on...
];
};
exports.mainScripts = function () {
return [
// Path to a library
// Path to another library
// and so on...
];
};
Then I import this module by declaring a variable in Gruntfile.js
var libraries = require('../javascript/libraries.js');
To use the methods declared in libraries.js, I set two more variables which return arrays of all the necessary file paths in the desired order:
var customLibrary = libraries.customLibrary();
var mainScripts = libraries.mainScripts();
I use these variables to define the source in the concat task. Hope this is helpful!
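For completeness, a sketch of how the Gruntfile side could tie these pieces together; the require path and output file names mirror the snippets above, the rest is illustrative:
// Gruntfile.js (sketch)
module.exports = function (grunt) {
    var libraries = require('../javascript/libraries.js');
    var customLibrary = libraries.customLibrary();
    var mainScripts = libraries.mainScripts();
    grunt.initConfig({
        concat: {
            libs: {
                files: { 'libs.js': customLibrary }
            },
            main: {
                files: { 'main.js': mainScripts }
            }
        }
    });
    grunt.loadNpmTasks('grunt-contrib-concat');
    grunt.registerTask('default', ['concat']);
};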
I managed to accomplish my task using a gulp plugin called gulp-insert like this:
gulp.task('compile-js', function () {
// Minify and bundle client scripts.
var scripts = gulp.src([
srcDir + '/routes/**/*.js',
srcDir + '/shared/js/**/*.js'
])
// Sort angular files so the module definition appears
// first in the bundle.
.pipe(gulpAngularFilesort())
// Add angular dependency injection annotations before
// minifying the bundle.
.pipe(gulpNgAnnotate())
// Begin building source maps for easy debugging of the
// bundled code.
.pipe(gulpSourcemaps.init())
.pipe(gulpConcat('bundle.js'))
// Buffer the bundle.js file and replace the appConfig
// placeholder string with a stringified config object.
.pipe(gulpInsert.transform(function (contents) {
return contents.replace("'{{{appConfigObj}}}'", JSON.stringify(config));
}))
.pipe(gulpUglify())
// Finish off sourcemap tracking and write the map to the
// bottom of the bundle file.
.pipe(gulpSourcemaps.write())
.pipe(gulp.dest(buildDir + '/shared/js'));
return scripts.pipe(gulpLivereload());
});
What I'm doing is reading our app's configuration file which is managed by the config module on npm. Getting our config file from server-side code is a snap using var config = require('config');, but we're a single-page app and frequently need access to the configuration settings on the client-side. To do that I stuff the config object into an Angular service.
Here's the Angular service before gulp build.
angular.module('app')
.factory('appConfig', function () {
return '{{{appConfigObj}}}';
});
The placeholder is in a string so that it's valid JavaScript for some of the other gulp plugins that process the file first. The gulpInsert utility lets me insert the config like this.
.pipe(gulpInsert.transform(function (contents) {
return contents.replace("'{{{appConfigObj}}}'", JSON.stringify(config));
}))
This works but feels a little hacky. Not to mention that it has to buffer the whole bundled file just so I can perform the operation. Is there a more elegant way to accomplish the same thing? Preferably one that allows the stream to keep flowing smoothly without buffering the whole bundle at the end? Thanks!
Have you checked gulp-replace-task?
Something like
[...]
.pipe(gulpSourcemaps.init())
.pipe(replace({
patterns: [{
match: '{{{appConfigObj}}}',
replacement: config
}],
usePrefix: false
}))
.pipe(gulpUglify())
[...]
Admittedly, this feels a bit hacky, too, but maybe slightly better...
I'm using envify and gulp-env in a React project. You could do something like this.
gulpfile.js:
var config = require('config');
var envify = require('envify');
var env = require('gulp-env'); // needed for the env() call below
gulp.task('env', function () {
env({
vars: {
APP_CONFIG: JSON.stringify(config)
}
});
});
gulp.task('compile-js', ['env'], function () {
// ... replace `gulp-insert` with `envify`
});
factory:
angular.module('app')
.factory('appConfig', function () {
return process.env.APP_CONFIG;
});
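Since envify is a browserify transform, the replacement only happens once a bundler is in the pipeline. As a rough sketch of what compile-js could look like in that setup (browserify, vinyl-source-stream, vinyl-buffer and the entry file path are my assumptions, not part of the answer above):
var browserify = require('browserify');
var source = require('vinyl-source-stream');
var buffer = require('vinyl-buffer');
gulp.task('compile-js', ['env'], function () {
    // The env task has already put APP_CONFIG on process.env, so the
    // envify transform can inline it while the bundle is being built.
    return browserify(srcDir + '/shared/js/app.js') // hypothetical entry file
        .transform('envify')
        .bundle()
        .pipe(source('bundle.js'))
        .pipe(buffer())
        .pipe(gulpUglify())
        .pipe(gulp.dest(buildDir + '/shared/js'));
});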
I have multiple sets of js modules that I would like to concat into separate files. I don't want to have to create a separate concat task for each file. It would make more sense to be able to pass arguments into the gulp task "concat". Unfortunately gulp doesn't allow arguments to be passed into tasks (I'm sure for good reason).
Any ideas of how I can accomplish this?
Use Case
A specific scenario would be website that has a global.js file for all pages as well as page specific js files.
Creating a task for each page specific js file will quickly make the gulpfile.js hard to manage as the site grows.
My dev environment:
I have a dev/js/ directory which has multiple sub-directories. Each sub-directory contains the modules for a specific js file, so each sub-directory needs to be concatenated into its own file within lib/js/.
Perhaps requirejs?
Maybe I should just look into using a module loader like requirejs.
I needed to take the modules from my source sub-directory (src/modules/), concatenate a specific file (src/distribution) onto each one individually, then pipe the result to a sub-directory in my distribution folder (dist/js/modules/).
I wasn't sure how many modules would end up being written for this project so I wanted to do it dynamically and found this to be the best (simplest) solution:
gulp.task("modules:js", () => {
let modules = fs.readdirSync("src/modules");
let concatModule = (module) => {
return gulp.src([
'src/distribution',
// readdirSync returns bare file names, so prefix the source directory
'src/modules/' + module
])
.pipe(concat(module))
.pipe(gulp.dest("build/js/modules"));
}
for (let module of modules) {
concatModule(module);
};
});
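One possible refinement: the streams created in that loop are never returned, so gulp has no way of knowing when the task has actually finished. A sketch of the same task using the merge-stream package (an assumption, not part of the original setup):
let merge = require('merge-stream');
gulp.task('modules:js', () => {
    let modules = fs.readdirSync('src/modules');
    let concatModule = (module) =>
        gulp.src(['src/distribution', 'src/modules/' + module])
            .pipe(concat(module))
            .pipe(gulp.dest('build/js/modules'));
    // Returning the merged stream lets gulp signal completion correctly.
    return merge(...modules.map(concatModule));
});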
You could make concatJS a higher-order function:
var concatJS = function (src, filename, dest) {
return function() {
return gulp.src(src)
.pipe(concat(filename))
.pipe(gulp.dest(dest));
};
};
gulp.task('concat-1', concatJS('src/module-1', 'module-1.js', 'build/module-1'));
gulp.task('concat-2', concatJS('src/module-2', 'module-2.js', 'build/module-2'));
//etc...
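If registering each task by hand also gets tedious, the same helper can be driven from a plain list; the bundles array below is only illustrative:
var bundles = [
    { src: 'src/module-1/*.js', file: 'module-1.js', dest: 'build/module-1' },
    { src: 'src/module-2/*.js', file: 'module-2.js', dest: 'build/module-2' }
];
bundles.forEach(function (bundle) {
    var name = 'concat-' + bundle.file.replace('.js', '');
    gulp.task(name, concatJS(bundle.src, bundle.file, bundle.dest));
});
// Optionally, one task that depends on all of them (gulp 3 style):
gulp.task('concat-all', bundles.map(function (bundle) {
    return 'concat-' + bundle.file.replace('.js', '');
}));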
Note: You'd probably be better off using a bundler like browserify or webpack. Since asking this question I have switched to browserify rather than trying to roll my own solution.
Improved Solution:
var fs = require("fs");
/* other requires omitted */
/* Set JS dev directory in one place */
var jsSrc = "dev/js/";
var jsDest = "lib/js/";
/* Renamed from "concat" so it doesn't shadow the gulp-concat plugin
 * (assumed to be among the omitted requires above). */
var concatDir = function (path) {
    path = path.replace(/\\/g, "/");
    var src = path.replace(/(\/[^\/]+?\.js$)|(\/$)/, "/*.js");
    var filename = src.match(/\/([^\/]+?)(\/[^\/]+?\.js$)/)[1] + ".js";
    return gulp.src(src)
        .pipe(concat(filename))
        .pipe(gulp.dest(jsDest));
};
/* The concat task just runs the concatDir function for
 * each directory in the javascript development directory.
 * It will take a performance hit, but allows concat to be
 * run as a dependency in a pinch.
 */
gulp.task("concat", function () {
    var dirArr = fs.readdirSync(jsSrc);
    for (var d in dirArr) {
        var path = jsSrc + dirArr[d] + "/";
        concatDir(path);
    }
});
/* Run "concat" as a dependency of the default task */
gulp.task("default", ["concat"], function () {
    var JSWatcher = gulp.watch([jsSrc + "**/*.js"]);
    JSWatcher.on("change", function (event) {
        concatDir(event.path);
    });
});
Alright, I think this works. It's a little bit of a hack though, and doesn't work for all use cases.
... removed previous example to save space ...
I have a js file, say x.js, that contains some functions.
I need to reuse the functions in x.js in other files so I don't write the same function twice. I tried "dojo/request/script" but it doesn't work.
Is there any suggestion on how to import these functions?
Ideally, you should make the file into an AMD module. For example, if your my/AB.js contains this:
function f1(x) { /*..*/ }
function f2(y) { /*..*/ }
You should rewrite it to:
define([], function() {
return {
f1: function(x) { /*..*/ },
f2: function(y) { /*..*/ }
}
});
Then, in your other javascript file(s), you can use require to load it.
require(['my/AB'], function(AB) {
AB.f1('foo');
AB.f2('bar');
});
(Notice that there's no .js extension in my/AB!)
This has the benefit that your functions in AB.js are no longer in the global scope, i.e. they will not collide with other people's functions. It also allows you to use the Dojo build system to deploy optimized, combined files later on.
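As a rough illustration of that last point, a build profile that rolls my/AB into a single layer could look something like this; the property values are placeholders, so check the build documentation for your Dojo version:
// app.profile.js (sketch)
var profile = {
    basePath: "./",
    releaseDir: "release",
    action: "release",
    packages: [
        { name: "dojo", location: "dojo" },
        { name: "my", location: "my" }
    ],
    layers: {
        "my/layer": {
            include: [ "my/AB" ]
        }
    }
};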
However, this is not always possible. Perhaps other JS files are relying on f1 and f2 being in the global scope. But you can load non-AMD, regular files with Dojo's loader too, by simply adding the .js extension.
require(['my/AB.js'], function() {
window.f1('foo');
window.f2('bar');
});
Note that in all these cases, I've assumed your code is in a directory called my. Whatever your actual directory is called, you need to tell the Dojo loader about it in the dojoConfig.
<script type="text/javascript">
var dojoConfig = {
baseUrl: ".....",
// your other config parameters,
packages: [ { name: 'my', location: '../folder/relative/to/baseUrl/my' } ]
};
</script>
<!-- your inclusion of dojo.js somewhere below this -->
PS: You can actually load JS files from arbitrary locations too:
require(['http://example.com/js/my/AB.js'], function() {
window.f1('foo');
window.f2('bar');
});