Grunt Concat same file multiple times - javascript

I'd like to use grunt-contrib-concat for frontend HTML templating in my application, and this would be useful for me.
I'd like to define page partials and concatenate them into an output file that is then compiled by Handlebars.
I've got everything set up, however concat doesn't allow me to use the same file more than once.
Basically concat filters the sources so that each file occurs only once. The second partial1.hbs will not be concatenated.
pageconcat: {
  src: [
    'app/templates/partial1.hbs',
    'app/templates/partial2.hbs',
    'app/templates/partial1.hbs'
  ],
  dest: 'app/result.hbs'
}
Is there any way to do this?
Update 1
After playing around with grunt's console output function, I was able to do some rough debugging of the concat plugin. Here's what I found out: the input array is deduplicated by grunt for some reason.
Update 2
The deduplication occurs in the forEach file loop that grunt uses. I've managed to bypass it (see my answer). I don't know how reliable the solution is, but it's a workaround and it works well as long as you don't feed it bad input.

You may be able to use the file array format to set up two different source sets. Something like this:
{
  "files": [{
    "src": [
      "app/templates/partial1.hbs",
      "app/templates/partial2.hbs"
    ],
    "dest": "app/result.hbs"
  }, {
    "src": [
      "app/result.hbs",
      "app/templates/partial1.hbs"
    ],
    "dest": "app/result.hbs"
  }]
}
added "app/result.hbs" to second source set, as was pointed out in the comments.
Thanks.

Solution
After some debugging I came up with a solution. It's certainly not the best, but it works fine.
I edited the concat.js plugin file inside the node_modules folder the following way:
grunt.registerMultiTask('concat', ..., function() {
  var self = this;
  // several lines of code
  // ...
  // replace f.src.filter(...) with
  self.data.src.filter(...);
});
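A less intrusive alternative, so that node_modules stays untouched, might be a small custom task that concatenates this.data.src literally, duplicates included. This is only a rough sketch; the task name rawconcat and the newline separator are my own choices, not part of grunt-contrib-concat:
grunt.registerMultiTask('rawconcat', 'Concatenate sources without de-duplication.', function () {
  // this.data.src is the raw array from the task config, so repeated
  // entries such as partial1.hbs are kept in their original order.
  var output = this.data.src.map(function (filepath) {
    return grunt.file.read(filepath);
  }).join('\n');
  grunt.file.write(this.data.dest, output);
});
It could be configured exactly like the pageconcat target above and survives plugin updates, since nothing inside node_modules is edited.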

Related

How to get all required modules from node.js as a single text file or string?

I need to get all files from some require stack, and this includes all requires inside the required files too.
Example:
file.js
require("./b");
require("./c");
//require("./d"); // this is a comment, need to prevent that
AST
[{
  path: "absolute_dir/b.js",
  name: "7saf7fs6asf7" // hash
},
...]
output (with the AST I can get all files by name and put them in one single file, like a bundler)
require("7saf7fs6asf7");
require("sa8d78as8d7f");
I don't know how to do this in a modular way. Please help me :)
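As an illustration of the AST-based scan described above, something along these lines could collect real require() calls while ignoring commented-out ones. This is only a sketch, assuming the acorn and acorn-walk packages; the hashing and recursive bundling steps are left out:
var fs = require('fs');
var path = require('path');
var acorn = require('acorn');
var walk = require('acorn-walk');

function findRequires(file) {
  var src = fs.readFileSync(file, 'utf8');
  var ast = acorn.parse(src, { ecmaVersion: 2018 });
  var deps = [];
  // Commented-out calls like //require("./d") never appear in the AST,
  // so only real require("...") calls are collected.
  walk.simple(ast, {
    CallExpression: function (node) {
      if (node.callee.type === 'Identifier' &&
          node.callee.name === 'require' &&
          node.arguments.length === 1 &&
          node.arguments[0].type === 'Literal') {
        deps.push(path.resolve(path.dirname(file), node.arguments[0].value));
      }
    }
  });
  return deps;
}
Recursing over the returned paths (and resolving file extensions) would give the full require stack, which could then be hashed and rewritten as in the example output.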

Webpack Hashing *after* uglification

We're using the style-loader in webpack, which, when compiled, seems to place information about the current directory in the source code that injects style tags when modules are loaded/unloaded. It looks roughly like this:
if(false) {
// When the styles change, update the <style> tags
if(!content.locals) {
module.hot.accept("!!./../../../../../node_modules/css-loader/index.js!./../../../../../node_modules/sass-loader/index.js?includePaths[]=/var/deploy/referrals/web_head/releases/20151118202441/node_modules/patternity/node_modules/node-neat/node_modules/node-bourbon/node_modules/bourbon/app/assets/stylesheets&includePaths[]=/var/deploy/referrals/web_head/releases/20151118202441/node_modules/patternity/node_modules/node-neat/node_modules/bourbon-neat/app/assets/stylesheets&includePaths[]=/var/deploy/referrals/web_head/releases/20151118202441/node_modules/patternity&includePaths[]=/var/deploy/referrals/web_head/releases/20151118202441/node_modules/infl-fonts!./campaigns.scss", function() {
var newContent = require("!!./../../../../../node_modules/css-loader/index.js!./../../../../../node_modules/sass-loader/index.js?includePaths[]=/var/deploy/referrals/web_head/releases/20151118202441/node_modules/patternity/node_modules/node-neat/node_modules/node-bourbon/node_modules/bourbon/app/assets/stylesheets&includePaths[]=/var/deploy/referrals/web_head/releases/20151118202441/node_modules/patternity/node_modules/node-neat/node_modules/bourbon-neat/app/assets/stylesheets&includePaths[]=/var/deploy/referrals/web_head/releases/20151118202441/node_modules/patternity&includePaths[]=/var/deploy/referrals/web_head/releases/20151118202441/node_modules/infl-fonts!./campaigns.scss");
if(typeof newContent === 'string') newContent = [[module.id, newContent, '']];
update(newContent);
});
}
// When the module is disposed, remove the <style> tags
module.hot.dispose(function() { update(); });
}
The thing to note is the directory listed in the accept string.
Now, this code ultimately gets removed once uglify is run (because of the if (false)), which is great. Where my problem lies, however, is that this compilation happens on two machines, and the chunk hashing appears to happen before uglification, because the hash generated on two different machines (or even on the same machine but in a different folder) is different. This obviously won't work if I'm deploying to, say, a production machine and need both of these machines to serve up an asset with the same digest.
Just to clarify: when not minified, the code is different and thus generates a different hash. Minification does in fact make the files identical, but the chunk hashing appears to have happened before minification.
Does anyone know how I can get the chunkhash to be generated after the uglify plugin is run? I'm using a config like so:
...
output: {
filename: '[name]-[chunkhash].js'
...
with the command:
webpack -p
Edit
After looking this over, I'm seeing now that this has to do with us adding includePaths to our style loader; it looks like this:
var includePaths = require('node-neat').includePaths;
module: {
  loaders: [
    { test: /\.scss$/, loader: "style!css!sass?includePaths[]=" + includePaths },
  ]
}
So I think we know why we're getting these absolute URLs, but I think the original question still stands: IMO webpack should be hashing chunks AFTER minification, not before.
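One way to keep the machine-specific absolute paths out of the loader query (and therefore out of the source that gets hashed) might be to pass paths relative to the project root. This is only a sketch of that idea, not a confirmed fix; the path.relative mapping and the query joining are my own additions:
var path = require('path');
// Convert node-neat's absolute includePaths into paths relative to the
// build's working directory, so the generated loader query string is the
// same on every machine and in every release folder.
var includePaths = require('node-neat').includePaths.map(function (p) {
  return path.relative(process.cwd(), p);
});

module: {
  loaders: [
    { test: /\.scss$/, loader: "style!css!sass?includePaths[]=" + includePaths.join("&includePaths[]=") },
  ]
}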

Laravel/Elixir watch doesn't trigger copy

My gulpfile.js looks as below. If I execute gulp watch, the copy task does not get executed on change of app.js. In fact, if I have the copy task alone, gulp watch will execute copy once and then exit. What am I doing wrong?
elixir(function (mix) {
  mix.scripts([
    '../../../bower_components/angular/angular.js'
  ]).copy('resources/assets/js/app.js', 'public/js/app.js');
});
In order for Elixir to watch all the files inside the assets folder you need to instruct it where to look for additional resources. For that you simply need to use the registerWatcher method.
Unfortunately the Laravel documentation leaves a lot to be desired, so this option is often overlooked because it's not properly documented.
elixir.config.registerWatcher("default", "resources/assets/**", null);
elixir(function (mix) {
  mix.scripts([
    '../../../bower_components/angular/angular.js'
  ]).copy('resources/assets/js/app.js', 'public/js/app.js');
});

How Do You Get Around Javascript File Order Using Gulp Or A Javascript Framework?

I'm using gulp to build a single javascript file with gulp-concat and gulp-uglify.
Original Files
//File 1
var Proj = Proj || {};

//File 2
Proj.Main = (function() {
  var Method = function(){ /*Code*/ };
  return { "Method": Method };
})();

//File 3
Proj.Page = (function() {
  var Method = Proj.Main.Method;
  return { "Method": Method };
})();
Gulp returns a bad minified file because these files are being concatenated in the wrong order. I know I can specify the order in .src([]) but I don't want to maintain the array as I add javascript files.
Is there a way to create references to these "namespaces" without having to worry about the order of the files concatenated? Or, is there a way for gulp to handle concatenation with the knowledge of these namespaces auto-magically?
EDIT:
I know I can specify the file order inside the .src([]). I want to develop without having to worry about the file order, whether it be through a gulp package or a javascript framework. Thank you for responses that help but I need a definitive "No. You cannot do this." or "Yes. Here's how..." to mark the thread as answered.
Well, one option is to try gulp-order.
Also, check out this answer to "gulp concat scripts in order?".
Basically, it mentions what you already said, about having to explicitly name the files in the order you want them to come in. I know you don't want to do that, but how else would gulp know which order you want your files in?
One thing worth pointing out, though, is that if you have a group of files where the order doesn't matter, plus, say, two files where the order does matter, you can do something like this:
gulp.src([
  'utils/*.js',
  'utils/some-service.js',
  'utils/something-that-depends-on-some-service.js'
])
gulp.src won't include the same file twice, so everything that's not some-service.js or something-that-depends-on-some-service.js will get concatenated first, and then those two files will be concatenated in the proper order.
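For the gulp-order option mentioned above, usage looks roughly like this; the file names here are placeholders, not from the question:
var gulp = require('gulp');
var order = require('gulp-order');
var concat = require('gulp-concat');

gulp.task('scripts', function () {
  return gulp.src('src/js/**/*.js')
    // gulp-order re-sorts the stream against the given patterns, so the
    // namespace bootstrap file can be forced to the front without listing
    // every other file explicitly.
    .pipe(order([
      'proj.js',   // hypothetical file containing "var Proj = Proj || {};"
      'main.js',
      '**/*.js'
    ]))
    .pipe(concat('all.js'))
    .pipe(gulp.dest('dist'));
});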
Since it hasn't been mentioned, implementing webpack or browserify will absolutely solve this problem without resorting to a hacky-feeling solution.
Here is a simple example of how to use it:
var gulp = require('gulp'),
    source = require('vinyl-source-stream'), //<-- this is the key
    browserify = require('browserify');

function buildEverything() {
  return browserify({
      // do your config here
      entries: './src/js/index.js'
    })
    .bundle()
    .pipe(source('index.js')) // this converts the bundle to a vinyl stream
    // do all further processing here,
    // e.g. uglification and so on.
    .pipe(gulp.dest('./dist'));
}

gulp.task('buildTask', buildEverything);
And inside your files you use require statements to indicate which files require others.
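For example, the entry file might contain little more than require calls (file names here are placeholders):
// src/js/index.js (hypothetical entry point)
var main = require('./main'); // pulls in src/js/main.js
var page = require('./page'); // pulls in src/js/page.js, which can itself require './main'
page.Method();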

Updating file references in a json file via a grunt task

I'm a JavaScript developer and fairly new to creating a build process from scratch. I chose to use Grunt for my current project and have created a Gruntfile that does about 90% of what I need it to do, and it works great, except for this one issue. I have several JavaScript files that I reference in the manifest.json file while I'm developing a Chrome extension. For my build process I am concatenating all of these files and minifying them into one file to be included in manifest.json. Is there any way to update the file references in the manifest.json file during the build process so it points to the minified version?
Here is a snippet of the src manifest file:
{
  "content_scripts": [{
    "matches": [
      "http://*/*"
    ],
    "js": [
      "js/lib/zepto.js",
      "js/injection.js",
      "js/plugins/plugin1.js",
      "js/plugins/plugin2.js",
      "js/plugins/plugin3.js",
      "js/injection-init.js"
    ]
  }],
  "version": "2.0"
}
I have a grunt task that concatenates and minifies all the js files listed above into one file called injection.js and would like a grunt task that can modify the manifest file so it looks like this:
{
  "content_scripts": [{
    "matches": [
      "http://*/*"
    ],
    "js": [
      "js/injection.js"
    ]
  }],
  "version": "2.0"
}
What I've done for now is have two versions of the manifest file, one for dev and one for build; during the build process it copies the build version instead. This means I need to maintain two versions, which I'd rather not do. Is there any way to do this more elegantly with Grunt?
Grunt provides its own API for reading and writing files, which I feel is better than pulling in other dependencies like fs.
You can edit/update a JSON file with the command grunt updatejson:key:value after putting this task in your Gruntfile:
grunt.registerTask('updatejson', function (key, value) {
  var projectFile = "path/to/json/file";
  if (!grunt.file.exists(projectFile)) {
    grunt.log.error("file " + projectFile + " not found");
    return true; // return false to abort the execution
  }
  var project = grunt.file.readJSON(projectFile); // get file as a json object
  project[key] = value; // edit the value; you can also use project.key if you know what you are updating
  grunt.file.write(projectFile, JSON.stringify(project, null, 2)); // serialize it back to the file
});
I do something similar - you can load your manifest, update the contents then serialize it out again. Something like:
var fs = require('fs');
grunt.registerTask('fixmanifest', function() {
  var tmpPkg = require('./path/to/manifest/manifest.json');
  tmpPkg.foo = "bar";
  fs.writeFileSync('./new/path/to/manifest.json', JSON.stringify(tmpPkg, null, 2));
});
I disagree with the other answers here.
1) Why use grunt.file.write instead of fs? grunt.file.write is just a wrapper for fs.writeFileSync (see code here).
2) Why use fs.writeFileSync when grunt makes it really easy to do stuff asynchronously? There's no doubt that you don't need async in a build process, but if it's easy to do, why wouldn't you? (It is, in fact, only a couple characters longer than the writeFileSync implementation.)
I'd suggest the following:
var fs = require('fs');
grunt.registerTask('writeManifest', 'Updates the project manifest', function() {
  var manifest = require('./path/to/manifest'); // .json not necessary with require
  manifest.fileReference = '/new/file/location';

  // Calling this.async() returns an async callback and tells grunt that your
  // task is asynchronous, and that it should wait till the callback is called
  fs.writeFile('./path/to/manifest.json', JSON.stringify(manifest, null, 2), this.async());

  // Note that "require" loads files relative to __dirname, while fs
  // is relative to process.cwd(). It's easy to get burned by that.
});
