I'm following the official docs, but my gulp tasks aren't running in series.
gulp.task("mytask", ["foo", "bar", "baz"]);
gulp.task("foo", function (callback) {
    gulp
        .src("...")
        .pipe(changed("..."))
        .pipe(gulp.dest(function (file) {
            // ...stuff
            return "...";
        }))
        .on("end", function () {
            // ...stuff
            callback();
        });
});
gulp.task("bar", function (callback) {
    //...
});
gulp.task("baz", function (callback) {
    //...
});
But my output looks like this:
Starting 'mytask'...
Starting 'foo'...
Starting 'bar'... // <-- foo is not done yet!
Finished 'foo'
Finished 'bar'
Starting 'baz'...
Finished 'baz'
Finished 'mytask'
How do I get them to run in order?
If you want them to run in series you currently have to use the task dependency system, e.g.:
gulp.task("mytask", ["foo", "bar", "baz"]);
gulp.task("foo", function (callback) {
    //...
    callback(...);
});
gulp.task("bar", ['foo'], function (callback) {
    //...
    callback(...);
});
gulp.task("baz", ['bar'], function (callback) {
    //...
    callback(...);
});
It's clunky. I think it's going to be addressed in a future version.
Depending on the situation you could return a promise or event stream instead of passing in and calling a callback.
I suppose I should mention that the run-sequence module is an option as of right now. But the task dependency system illustrated above is the mechanism currently provided by gulp itself. See this comment re: run-sequence and the future of task sequencing in gulp.
This answer should be updated to reflect the Gulp 4 way of running tasks in series.
If you want Gulp tasks to run in series you should use gulp.series, and if you want them to run in parallel, gulp.parallel.
Using gulp.series, you would do something like the following:
gulp.task("mytask", gulp.series(foo, bar, baz));
Those other tasks would probably no longer be registered tasks but plain functions assigned to consts, like:
const foo = () => {
    return gulp.src("...")
        .pipe(changed("..."))
        .pipe(gulp.dest(function (file) {
            // ...stuff
            return "...";
        }));
};
hence the series lists the constants instead of strings. In moving to Gulp 4 there would probably be other problems arising, but the fixes for a simple gulpfile like this one are easy to make.
A simple tutorial on Gulp 4: https://codeburst.io/switching-to-gulp-4-0-271ae63530c0
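The composition idea behind gulp.series can be sketched in plain Node. This is an illustrative toy, not gulp's actual implementation; the stand-in tasks and names are made up for the example:

```javascript
// Toy re-implementation of the series idea, for illustration only.
// Each "task" is a function returning a promise; series chains them
// so that each task starts only after the previous one resolves.
function series() {
    var tasks = Array.prototype.slice.call(arguments);
    return function () {
        return tasks.reduce(function (chain, task) {
            return chain.then(function () { return task(); });
        }, Promise.resolve());
    };
}

// Usage sketch with stand-in tasks:
var order = [];
var foo = function () { return Promise.resolve().then(function () { order.push("foo"); }); };
var bar = function () { return Promise.resolve().then(function () { order.push("bar"); }); };

series(foo, bar)().then(function () {
    console.log(order.join(","));  // foo,bar
});
```

Real gulp.series additionally understands callback-taking and stream-returning tasks, but the chaining principle is the same.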
It appears that my custom commands implemented for Nightwatch.js are getting executed asynchronously. A console.log message inside the custom command shows up after a console.log message issued after the command is invoked. I can't find any reference in the Nightwatch documentation about how these commands are executed, but since they appear to be asynchronous, I'm not sure how I can wait to make sure one command has completed before the next one is executed.
Here is my custom command ("foo"):
exports.command = function () {
    console.log('Command executed');
};
And my test function:
module.exports['my test'] = function (browser) {
    browser.resizeWindow(400, 600);
    browser.foo();
    console.log('Test function returning');
};
When I run this, the logs show up as:
Test function returning
Command executed
Which is the opposite order of what I would expect if my custom function was getting executed synchronously.
If you want your custom command to work properly (and run synchronously), you need to call at least one Nightwatch.js command inside your custom command.
Try this:
exports.command = function () {
    console.log('Command executed');
    this.execute(function () {});
};
If you want more in-depth details, you can follow this issue:
https://github.com/nightwatchjs/nightwatch/issues/1123
Your command should take a callback as an argument:
exports.command = function (callback) {
    console.log('Command executed');
    if (callback) {
        callback();
    }
};
and execute like this:
browser.foo(function () {
    console.log('Test function returning');
});
Another solution is to use the browser.perform method:
module.exports['my test'] = function (browser) {
    browser.resizeWindow(400, 600);
    browser.foo();
    browser.perform(function () {
        console.log('Test function returning');
    });
};
This is a horrible design by the Nightwatch team. I just spent hours trying to debug this issue and the only solution that worked for me was to use a perform(). This is ridiculous. Why in the world would the Nightwatch developers have custom commands run asynchronously?
I'm trying to run several Modernizr builds at once, but it appears to break the scope of my function.
It's very hard to explain so have a look at this example, give it a go if you don't believe me:
[1, 2, 3, 4, 5].forEach(function (i) {
    require("modernizr").build({}, function (result) {
        console.log(i);
    });
});
outputs:
5
5
5
5
5
Instead of the expected 1, 2, 3, 4, 5, as would any similar function.
I have not come across this behaviour before in all my years of coding in ECMAScript-like languages, and have built my project (and previous projects) around the idea that you cannot break a function's scope like that.
It breaks any system based on promises or even just simple callbacks.
It's baffled me all day, and I can't find an appropriate fix for it.
I'm having a very hard time even conceptualizing what it is that's causing this to happen.
Please help.
EDIT:
OK, it appears you're all hung up on the forEach...
Here's another example that will make it a little clearer:
function asd(i) {
    require("modernizr").build({}, function (result) {
        console.log(i);
    });
}
asd(1);
asd(2);
asd(3);
asd(4);
outputs
4
4
4
4
What on earth is happening?
The issue specific to Modernizr had to do with a global variable being clobbered.
The build command is basically a large require.js configuration function, all powered by a large config object. Some basic things that are always true are established at the top of the function:
{
    optimize: 'none',
    generateSourceMaps: false,
    optimizeCss: 'none',
    useStrict: true,
    include: ['modernizr-init'],
    fileExclusionRegExp: /^(.git|node_modules|modulizr|media|test)$/,
    wrap: {
        start: '\n;(function(window, document, undefined){',
        end: '})(window, document);'
    }
}
Then, since Modernizr works in both the browser and in node without changes, there needs to be a way for it to know if it should be loading its dependencies via the filesystem or via HTTP. So we add some more options, like basePath, inside of an environment check:
if (inBrowser) {
    baseRequireConfig.baseUrl = '/i/js/modernizr-git/src';
} else {
    baseRequireConfig.baseUrl = __dirname + '/../src';
}
At this point, the config object gets passed into requirejs.config, which wires up require and allows us to start calling build.
Finally, after all of that has been created, we have a build function that also ends up modifying the config object yet again for build specific settings (the actual detects in your build, regex to strip out some AMD crud, etc).
So here is a super simplified pseudocode version of what ends up happening:
var config = {
    name: 'modernizr'
};

if (inBrowser) {
    config.env = 'browser';
} else {
    config.env = 'node';
}

requirejs.config(config);

module.exports = function (config, callback) {
    config.out = function (output) {
        // code to strip out AMD ceremony, add classPrefix, version, etc
        callback(output);
    };
    requirejs.optimize(config);
};
Spot the problem?
Since we are touching the .out method of the config object (whose scope is the entire module, and therefore whose state is saved between build() calls) right before we run the asynchronous require.optimize function, the callback you were passing was rewriting the .out method every time build was called.
This should be fixed in Modernizr within a couple of hours.
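A stripped-down model of the bug and one possible fix. The names here (buggyBuild, fixedBuild, pending) are illustrative stand-ins, not Modernizr's real code; pending plays the role of the asynchronous require.optimize work:

```javascript
// Simplified model of the bug: config lives at module scope, so every
// build() call overwrites the same .out slot before the async work runs.
var config = { name: 'modernizr' };
var pending = [];  // stand-in for the async require.optimize queue

function buggyBuild(callback) {
    config.out = callback;  // clobbers the previous caller's callback
    pending.push(function () { config.out("result"); });
}

function fixedBuild(callback) {
    // Fix: give each call its own copy of the config instead of sharing one.
    var local = Object.assign({}, config, { out: callback });
    pending.push(function () { local.out("result"); });
}

var got = [];
buggyBuild(function () { got.push("first"); });
buggyBuild(function () { got.push("second"); });
pending.forEach(function (run) { run(); });  // flush the "async" work
console.log(got);  // ["second", "second"] — both calls hit the last callback
```

With fixedBuild in place of buggyBuild, each caller's callback fires exactly once, in order.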
The callback is invoked asynchronously, so this behavior is expected: the build call is much slower than the walk of your forEach, so by the time the function (result) {} block runs, i is already 5.
This is much the same problem as described in Node.JS: How to pass variables to asynchronous callbacks?, and you should be able to use the same solution:
[1, 2, 3, 4, 5].forEach(function (i) {
    (function (i) {
        require("modernizr").build({}, function (result) {
            console.log(i);
        });
    })(i);
});
Untested, but something like that should work.
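For what it's worth, forEach itself does not share bindings between iterations, which is consistent with the accepted answer's diagnosis that module-level state inside Modernizr was the real culprit. The classic shared-variable pitfall only appears with a plain var loop, which is easy to verify:

```javascript
// A var-based loop shares a single binding across all closures:
var shared = [];
for (var i = 1; i <= 3; i++) {
    shared.push(function () { return i; });
}
console.log(shared.map(function (f) { return f(); }));  // [ 4, 4, 4 ]

// forEach gives each callback its own argument, so closures capture correctly:
var fresh = [];
[1, 2, 3].forEach(function (j) {
    fresh.push(function () { return j; });
});
console.log(fresh.map(function (f) { return f(); }));  // [ 1, 2, 3 ]
```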
I was having trouble getting gulp run-sequence to execute both functions that I gave it (it would only execute one of them). I had something like this:
gulp.task('wire', function () {
    gulp.src('./index.html')
        .pipe(wiredep())
        .pipe(gulp.dest('.'));
});
var filesToInject = ['./app/**/*.js', './app/**/*.css'];

gulp.task('inject', function () {
    var target = gulp.src('./index.html');
    var sources = gulp.src(filesToInject, {read: false});
    return target.pipe(inject(sources))
        .pipe(gulp.dest('.'));
});

gulp.task('wire-inject', function (callback) {
    sequence('wire', 'inject', callback);
});
That would only run the inject task. Only when I added a return before gulp.src in the wire task would it execute. I also added callback in both function params and passed it to the last pipe in both tasks per the documentation.
gulp.task('wire', function (callback) {
    return gulp.src('./index.html')
        .pipe(wiredep())
        .pipe(gulp.dest('.'), callback);
});

var filesToInject = ['./app/**/*.js', './app/**/*.css'];

gulp.task('inject', function (callback) {
    var target = gulp.src('./index.html');
    var sources = gulp.src(filesToInject, {read: false});
    return target.pipe(inject(sources))
        .pipe(gulp.dest('.'), callback);
});
That did not throw any errors; however, it doesn't change anything if I take it out. What does this return statement do that magically makes my sequence run completely? What are these callbacks in the documentation that I just pass a reference to without parentheses to execute them? Do I still need them even though they seemingly do nothing?
Gulp needs to know when an asynchronous task is done. You have 3 ways to do this:
Return an event stream, which is what you do when you have return gulp.src(...).pipe(...); in your task's function.
Return a promise.
Have the function that defines your task take a parameter. Gulp will call your function with a callback that you should call when the task is over.
If you do not do this, then Gulp will not know that the task is asynchronous. So it will consider the task done as soon as its function returns. This has all kinds of consequences. Gulp could exit before a task is done. Or it could start a task B that depends on task A before task A is complete. In some cases, you may not see a problem immediately but as your gulpfile gets more complex, you run more chances of getting bad behavior. run-sequence is not special: it needs to know when a task is complete to do its job correctly.
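To make that concrete, here is a minimal sketch of how a runner can wait on a task, whether it takes a callback or returns a promise. This is not gulp's actual code (real gulp also handles returned streams by listening for their end event); runTask and the stand-in tasks are illustrative names:

```javascript
// Hypothetical runner: resolves when the task signals completion.
function runTask(fn) {
    return new Promise(function (resolve, reject) {
        if (fn.length > 0) {
            // The task declares a parameter: hand it a callback.
            fn(function (err) { return err ? reject(err) : resolve(); });
        } else {
            // Otherwise, await whatever it returns (a promise or a sync value).
            Promise.resolve(fn()).then(resolve, reject);
        }
    });
}

// Usage sketch with one task of each style:
var callbackTask = function (done) { setTimeout(done, 10); };
var promiseTask = function () { return Promise.resolve("ok"); };

runTask(callbackTask)
    .then(function () { return runTask(promiseTask); })
    .then(function () { console.log("both tasks finished in order"); });
```

If a task neither takes a callback nor returns anything awaitable, the runner has nothing to wait on, which is exactly the failure mode described above.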
Note that callback in this code which you show in your question is useless:
return target.pipe(inject(sources))
    .pipe(gulp.dest('.'), callback);
callback won't be called. Replace it with function () { console.log("Hello!"); callback(); }. You won't see Hello! on the console because the callback is not called. However, the code does work overall because you return the stream. Gulp uses the stream you return to determine that the task is over, not the callback (which is never called anyway).
Is there a way to catch when a GruntJS task fails and act upon it?
The --force flag doesn't help, because I need to know if something broke along the way, and do something about it.
I tried some arrangement similar to a try-catch, however it doesn't work. That is because grunt.registerTask pushes tasks into a queue - the execution is not synchronous.
grunt.registerTask('foo', "My foo task.", function () {
    try {
        grunt.task.run('bar');
    } catch (ex) {
        // Handle the failure without breaking the task queue
    }
});
Creative javascript ideas are welcome as well as GruntJS know-how.
This ugly beast should work for you:
grunt.registerMultiTask('foo-task', 'my foo task', function () {
    try {
        console.log(this.data);
        throw new Error('something bad happened!');
    } catch (e) {
        console.log('Ooops:', e);
    }
    return true;
});

grunt.registerTask('foo', 'Runs foo.', function () {
    grunt.config('foo-task', {
        hello: 'world'
    });
    grunt.task.run('foo-task');
});
Run it via: grunt foo
Output:
Running "foo" task
Running "foo-task:hello" (foo-task) task
world
Ooops: Error: something bad happened!
at Object.<anonymous>
<stacktrace here>
Done, without errors.
I haven't tested it, but as in any JavaScript code you could add an event listener to the grunt object and fire events from tasks when you need to.
I don't know if this exactly fits your question, but I would give it a try.
Hope it helps!
I have an app in nodejs. In it, I define some global variables that are shared across multiple files. For example:
// common.js
async = require("async");

isAuthenticated = function () {
    //...
    return false;
};

// run.js
require("common.js");

async.series([function () {
    isAuthenticated();
}], function () {
    console.log("done");
});
I want the async and isAuthenticated variables to be minified, but minified to the same thing in all files. It would look like the following:
// common.min.js
a = require("async");

b = function () {
    //...
    return false;
};

// run.min.js
require("common.js");

a.series([function () {
    b();
}], function () {
    console.log("done");
});
How to do it in uglifyjs?
I'm currently looping through the files and using the command uglifyjs $file -m "sort,toplevel" -c > $file.min on each.
Don't use globals.
Use var async = require('async') where needed.
Use module.exports in your specific modules you require.
Use something like browserify to generate a single js.
Uglify (or use a browserify transform named uglifyify)
For example, the simplest form (without using uglifyify)
$ browserify run.js | uglifyjs -c > run.min.js
Note that if you use your own code, like common.js, you should require it using a relative path, var common = require("./common").
I suggest you use the exports syntax:
// common.js code
exports.isAuthenticated = function () {
    //...
    return false;
};
And of course use it just as you would with async.js:
// run.js
var common = require("./common");
var async = require("async");

async.series([function () {
    common.isAuthenticated();
}], function () {
    console.log("done");
});
assuming both common.js & run.js reside in the same directory.
related question: How to get minified output with browserify?
A Side Note
The way you used async.series in your question has no real advantage. You could have just:
// run.js
var common = require("./common");

common.isAuthenticated();
console.log("done");
In async.series you usually call async functions:
async.series([
    function (callback) {
        // do some stuff ...
        callback(null, 'one');
    },
    function (callback) {
        // do some more stuff ...
        callback(null, 'two');
    }
],
// optional callback
function (err, results) {
    // results is now equal to ['one', 'two']
});
so, I would expect to see something like:
// common.js code
exports.isAuthenticated = function (callback) {
    //...
    callback(null, false);
};
and then
// run.js
var common = require("./common");
var async = require("async");

async.series([common.isAuthenticated], function (err, results) {
    console.log("done with", results[0]);
});
I usually prefer a different "syntax"
// an example using an object instead of an array
async.series({
    one: function (callback) {
        setTimeout(function () {
            callback(null, 1);
        }, 200);
    },
    two: function (callback) {
        setTimeout(function () {
            callback(null, 2);
        }, 100);
    }
},
function (err, results) {
    // results is now equal to: {one: 1, two: 2}
});
But it's your call.
The async examples were taken from https://github.com/caolan/async#seriestasks-callback
You would want to concat the files before you go ahead and uglify them. Concatenation is the process of combining multiple files of code into one monolithic creature that knows everything about all parts of your code. This is often done in conjunction with uglifying for several reasons, mainly for performance benefits (your app runs a lot faster if you only send 1 file to the client).
That being said, this is typically a practice that is done when you're serving code to a client, not necessarily for back-end / server-side logic. Ideally no one but you or people with access to whatever service you're using to deploy said server code should see that side of your code. If your main concern is to prevent reverse-engineering, or to make your code unreadable, I suggest obfuscating your code.
"This is omega site. Best encrypted level he has. Looks like obfuscated code to conceal its true purpose. Security through obscurity." - Q Skyfall 2012
If your globals are confined to common.js, you may try
uglifyjs --define-from-module common.js $file...
and remove require()s.
In Node.js there is the concept of defining global variables, as posted in this thread:
global.myGlobalVar = "something visible to all modules";
I am too using uglify in my node apps, and it turned out that when using global.xyz, xyz does not get uglified.
Disclaimer: I am totally aware that exposing global info is an anti-pattern. But sometimes there is a good reason for it.
Hope that helps!