Uglify global variables - JavaScript

I have an app in nodejs. In it, I define some global variables that are shared across multiple files. For example:
//common.js
async = require("async");
isAuthenticated = function() {
    //...
    return false;
};
//run.js
require("common.js");
async.series([function () {
    isAuthenticated();
}], function () {
    console.log("done");
});
I want the async and isAuthenticated variables to be minified, but minified to the same thing in all files. It would look like the following:
//common.min.js
a = require("async");
b = function() {
    //...
    return false;
};
//run.min.js
require("common.js");
a.series([function () {
    b();
}], function () {
    console.log("done");
});
How can I do this with uglifyjs?
I'm currently looping over the files and running uglifyjs $file -m "sort,toplevel" -c > $file.min on each.

Don't use globals.
Use var async = require('async') where needed.
Use module.exports in your specific modules you require.
Use something like browserify to generate a single js.
Uglify (or use a browserify transform named uglifyify)
For example, in the simplest form (without uglifyify):
$ browserify run.js | uglifyjs -c > run.min.js
Note that if you use your own code, like common.js, you should require it using a relative path, var common = require("./common").
I suggest you use the exports syntax:
// common.js code
exports.isAuthenticated = function() {
    //...
    return false;
};
And of course use it just as you would with async.js:
//run.js
var common = require("./common");
var async = require("async");
async.series([function () {
    common.isAuthenticated();
}], function () {
    console.log("done");
});
assuming both common.js & run.js reside in the same directory.
related question: How to get minified output with browserify?
A Side Note
The way you used async.series in your question has no real advantage. You could have just:
//run.js
var common = require("./common");
common.isAuthenticated();
console.log("done");
in Async series you usually call async functions:
async.series([
    function(callback){
        // do some stuff ...
        callback(null, 'one');
    },
    function(callback){
        // do some more stuff ...
        callback(null, 'two');
    }
],
// optional callback
function(err, results){
    // results is now equal to ['one', 'two']
});
so, I would expect to see something like:
// common.js code
exports.isAuthenticated = function(callback) {
    //...
    callback(null, false);
};
and then
//run.js
var common = require("./common");
var async = require("async");
async.series([common.isAuthenticated], function (err, results) {
    console.log("done with", results[0]);
});
I usually prefer a different "syntax"
// an example using an object instead of an array
async.series({
    one: function(callback){
        setTimeout(function(){
            callback(null, 1);
        }, 200);
    },
    two: function(callback){
        setTimeout(function(){
            callback(null, 2);
        }, 100);
    }
},
function(err, results) {
    // results is now equal to: {one: 1, two: 2}
});
But it's your call.
The async examples were taken from https://github.com/caolan/async#seriestasks-callback

You would want to concatenate the files before you uglify them. Concatenation is the process of combining multiple files of code into one monolithic creature that knows everything about all parts of your code. It is often done in conjunction with uglifying, mainly for the performance benefits (your app runs a lot faster if you only send one file to the client).
That being said, this is typically a practice used when you're serving code to a client, not necessarily for back-end / server-side logic. Ideally no one but you, or people with access to whatever service you use to deploy that server code, should see that side of your code. If your main concern is to prevent reverse engineering, or to make your code unreadable, I suggest obfuscating your code.
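The concatenate-then-minify flow can be sketched as follows. The two printf lines create stand-in files (hypothetical, just so the sketch runs anywhere); substitute your real common.js and run.js. The uglifyjs step is left commented out since its flags vary by version:

```shell
# stand-ins for the real source files
printf 'a = 1;\n' > common.js
printf 'console.log(a);\n' > run.js
# concatenate into one bundle, then minify the bundle as a whole
cat common.js run.js > bundle.js
# uglifyjs bundle.js -m "sort,toplevel" -c > bundle.min.js
cat bundle.js
```

Because the mangler sees both files at once, shared names get minified consistently across the whole bundle.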
"This is omega site. Best encrypted level he has. Looks like obfuscated code to conceal its true purpose. Security through obscurity." - Q Skyfall 2012

If your globals are confined to common.js, you may try
uglifyjs --define-from-module common.js $file...
and remove require()s.

In Node.js there is the concept of defining global variables, as posted in this thread:
global.myGlobalVar = "something visible to all modules";
I too am using uglify in my node apps, and it turned out that when using global.xyz, xyz does not get uglified.
Disclaimer: I am fully aware that exposing global state is an anti-pattern. But sometimes there is a good reason for it.
Hope that helps!

Related

In Node.js, how do you store callback data from a module's function in the main app?

I made a node web app and it works. However, I built it as a monolith, and I'm attempting to break it out into modules for practice.
Long question short: in a module, how do I define a variable from callback results and store it in a way that it is available to the main app? Is it as simple as storing the results in a global variable in the module and exporting that as well?
In my monolithic version:
app.js
var resultsArray = [];
function getDatafromHTTP(page){
    callback(data){
        resultsArray.push(data); //push json elements to array
        if(page < 50){ page++; getDatafromHTTP(page); }
    }
}
getDatafromHTTP(0);
app.get(some function that displays the resultsArray)
The getDatafromHTTP function runs 50 times via the page variable.
Now that I tried to break it down:
app.js
var resultsArray = [];
var getDatafromHTTP = require("./module.js").getDatafromHTTP;
getDatafromHTTP(0);
app.get(some function that displays the resultsArray)
module.js
exports.getDatafromHTTP = function(page){
    callback(data){
        resultsArray.push(data); //push json elements to array
        if(page < 50){ page++; getDatafromHTTP(page); }
    }
}
//error: resultsArray not defined
I get why resultsArray is not defined in the module, and I understand I can make resultsArray a variable in the module itself. If it were a return value, I would simply define a variable in the main app based on the function's return. But since the function gets its data via a callback and not a return, what's the "right" way to get that data back into the main app, available to the app.get function? Either as it builds, or after the 50 runs complete?
There are a number of different ways to do this, but the most common pattern I've seen is to just add the value as a new property of app (or similar). So for example:
module.exports = function(app) {
    return function getHttpData(cb) {
        // assumes the `request` module has been required elsewhere
        request.get('http://example.com/', (err, results, body) => {
            if (err) return cb(err);
            app.httpData = body;
            return cb(null, body);
        });
    };
};
Then in your calling code you'd do something like:
var app = express();
var getHttpData = require('./get-http-data.js')(app); // <-- note I'm passing the app object in here
// sometime later
getHttpData((err, data) => {
    console.log(data);
});
Obviously, I'm leaving off a few steps like some of the require statements and such, but hopefully you get the idea.
All that said, often you want to avoid global variables like that. In particular, storing things that potentially change like that in memory will lead to bugs later if your app has to scale to more than one process (each instance of the app would fetch separately, and possibly wind up with different states).
module.js
function getDataFromHTTP(page, callback) {
    // fetch the data for this page, then hand it back:
    callback(data);
}
module.exports = getDataFromHTTP;
app.js
var getDataFromHTTP = require('./module.js');
var resultsArray = [];
for (var page = 0; page < 50; page++) {
    getDataFromHTTP(page, function (data) {
        resultsArray.push(data);
    });
}

Using the --globals option with the MochaJS testing suite

I am trying to do a variant of end-to-end testing on a sailsjs system using mocha. What I want to do is simulate the actual program flow by doing things like creating a user and then performing other actions with that user.
I'd like to be able to separate my tests into separate files that run in order, relating to different operations such as "register new user". To do this I need to be able to pass values between test files.
Mocha contains an option setting --globals <value1, value2, etc>. Here is the description from the docs:
--globals allow the given comma-delimited global [names]
However, I've been unable to get this to work. Here's what I've tried. I have a bootstrap.test.js file that does basic before and after operations, starting and stopping sails:
var Sails = require('sails'),
    sails;
before(function(done) {
    Sails.lift({
        log: {
            level: 'error'
        }
    }, function(err, server) {
        sails = server;
        if (err) return done(err);
        // here you can load fixtures, etc.
        done(err, sails);
    });
});
after(function(done) {
    // here you can clear fixtures, etc.
    sails.lower(done);
});
Then let's say I have two test files a.js and b.js that will run consecutively and for testing purposes contain very little:
a.js:
var user = 'some user';
b.js:
console.log( user );
If I then run mocha --globals, I get the error:
ReferenceError: user is not defined
What am I doing wrong here? I have been unable to find anywhere on the web a description of how this would be used.
You've misunderstood the purpose of --globals. It may be used when you use --check-leaks. The --check-leaks option checks whether the tests are leaking variables into the global space. Consider this suite:
it("one", function () {
foo = 1;
});
If you run it with mocha --check-leaks you'll get an error because the test creates a new foo global. You can prevent the error with mocha --check-leaks --globals foo. In large projects there are perhaps leaks that are deemed to be okay and so using the --globals option allows turning off errors for those cases that are okay.
Now, how can you achieve what you wanted to do? You cannot create a global variable if you use var. Your a.js would have to be:
user = 'some user';
Moreover, by default Mocha does not guarantee the order in which it loads test files. You could use --sort to guarantee that a.js is loaded first, but you are then forced to pick names that guarantee this order. I prefer to use --require a, which tells Mocha to require module a before it starts reading the test files. Your module could be called z and would still be loaded before any test file. At any rate, by removing var and using --require, your test files will see your global. (I tried it: it works.)
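A minimal sketch of the --require approach (the file name globals.js is illustrative; note the assignment to global rather than a var declaration):

```javascript
// globals.js - loaded with `mocha --require ./globals.js`
// before any test file is read
global.user = 'some user';

// b.js (or any test file loaded afterwards) can then read it directly:
console.log(user); // prints "some user"
```

Because properties of `global` are in scope everywhere, every test file loaded after globals.js sees the same `user`.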
This seemed to work for me using sailsjs (I suspect there is a better way to do this). One of the first tests I run registers a user:
describe('#Should register test user', function () {
    it('should register user', function (done) {
        request(sails.hooks.http.app)
            .post('/auth/registertest')
            .send({
                "email": "email@foo.com.au",
                "password": "foo",
                "firstname": "foo",
                "lastname": "foo",
                "location": {
                    "name": "foo",
                    "id": "54d733795ed3f5140b0a761b"
                }
            })
            .expect(200).end(function (err, res) {
                if (err) return done(err);
                global.email = "email@tpg.com.au";
                global.token = res.body.token;
                done();
            });
    });
});
Then I am able to access the global email and token variables in other test files that run after the register-user test. For example, the code below calls my API and uses the global token variable in the Authorization header.
describe('#auth access to users bunches', function () {
    it('should get 200', function (done) {
        request(sails.hooks.http.app)
            .get('/bunch/byuser')
            .set('Authorization', 'bearer ' + token)
            .expect(200).end(function (err, res) {
                if (err) return done(err);
                done();
            });
    });
});

Gulp tasks not running in series

I'm following the official docs, but my gulp tasks aren't running in series.
gulp.task("mytask", ["foo", "bar", "baz"]);
gulp.task("foo", function (callback) {
    gulp
        .src("...")
        .pipe(changed("..."))
        .pipe(gulp.dest(function (file) {
            // ...stuff
            return "...";
        }))
        .on("end", function() {
            // ...stuff
            callback();
        });
});
gulp.task("bar", function (callback) {
    //...
});
gulp.task("baz", function (callback) {
    //...
});
But my output looks like this:
Starting 'mytask'...
Starting 'foo'...
Starting 'bar'... // <-- foo is not done yet!
Finished 'foo'
Finished 'bar'
Starting 'baz'...
Finished 'baz'
Finished 'mytask'
How do I get them to run in order?
If you want them to run in series you currently have to use the task dependency system, e.g.:
gulp.task("mytask", ["foo", "bar", "baz"]);
gulp.task("foo", function (callback) {
    //...
    callback(...);
});
gulp.task("bar", ['foo'], function (callback) {
    //...
    callback(...);
});
gulp.task("baz", ['bar'], function (callback) {
    //...
    callback(...);
});
It's clunky. I think it's going to be addressed in a future version.
Depending on the situation you could return a promise or event stream instead of passing in and calling a callback.
I suppose I should mention that the run-sequence module is an option as of right now. But the task dependency system illustrated above is the mechanism currently provided by gulp itself. See this comment re: run-sequence and the future of task sequencing in gulp.
This answer should be updated to reflect the Gulp 4 way of running tasks in series.
If you want Gulp tasks to run in series you should use gulp.series, and if you want them to run in parallel, gulp.parallel.
Using gulp.series you would do something like the following:
gulp.task("mytask", gulp.series(foo, bar, baz));
Those other tasks would probably no longer be tasks but instead consts, like:
const foo = () => {
    return gulp.src("...")
        .pipe(changed("..."))
        .pipe(gulp.dest(function (file) {
            // ...stuff
            return "...";
        }));
};
hence the reason why the series lists the constants instead of strings. In moving to Gulp 4 other problems would probably arise, but the fixes for a simple gulpfile like this one are easy to make.
A simple tutorial on Gulp 4: https://codeburst.io/switching-to-gulp-4-0-271ae63530c0

Calling an exported function from within the same module

If you have a function like this in a module:
dbHandler.js
exports.connectSQL = function(sql, connStr, callback){
    // store a connection to MS SQL Server
    sql.open(connStr, function(err, sqlconn){
        if(err){
            console.error("Could not connect to sql: ", err);
            callback(false); // send back connection failure
        }
        else{
            callback(sqlconn); // send back connection object
        }
    });
};
Can you call this from inside the same module where it's defined? I want to do something like this, later on inside dbHandler.js:
connectSQL(sql, connStr, function(conn){
    //do stuff
});
Declare the function like a regular old function:
function connectSQL(sql, connStr, callback){
    // store a connection to MS SQL Server
    sql.open(connStr, function(err, sqlconn){
        // ...
and then:
exports.connectSQL = connectSQL;
Then the function will be available by the name "connectSQL".
There are any number of ways to accomplish this, with Pointy's being my preferred method in most circumstances, but several others may be appropriate depending on the situation.
One thing you will see often is something like this:
var connectSQL = exports.connectSQL = function(sql, connStr, callback) { /*...*/ };
Technically, though I've never actually seen someone do this, you could use the exports object inside your module without issue:
// later on inside your module...
exports.connectSQL('sql', 'connStr', function() {});
Beyond that, it comes down to whether it matters whether you have a named function, like in Pointy's example, or if an anonymous function is ok or preferred.

Which is a better way of writing callbacks?

Just from seeing what I've written now, I can tell that one is much smaller, so in terms of code golf Option 2 is the better bet; but as far as which is cleaner, I prefer Option 1. I would really love the community's input on this.
Option 1
something_async({
    success: function(data) {
        console.log(data);
    },
    error: function(error) {
        console.log(error);
    }
});
Option 2
something_async(function(error, data){
    if(error){
        console.log(error);
    } else {
        console.log(data);
    }
});
They are not exactly the same. Option 2 will still log the data, whereas Option 1 will only log data on success. (Edit: at least it was that way before you changed the code.)
That said, Option 1 is more readable. Programming is not / should not be a competition to see who can write the fewest lines that do the most things. The goal should always be to create maintainable, extendable (if necessary) code --- in my humble opinion.
Many people will find option #1 easier to read and to maintain: two different callback functions for two different purposes. It is commonly used by all Promise libraries, where two arguments are passed. Of course, the question of multiple arguments vs. an options object is independent from that (while the object is useful in jQuery.ajax, it doesn't make sense for promise.then).
However, option #2 is the Node.js convention (see also the NodeGuide) and is used in many libraries influenced by it, for example the famous async.js. This convention is debatable; the top Google results I found are WekeRoad: NodeJS Callback Conventions and Stack Overflow: What is the suggested callback style for Node.js libraries?
The reason for the single callback function with an error argument is that it always reminds the developer to handle errors, which is especially important in server-side applications. Many beginners writing client-side ajax code simply forget about error handling, then wonder why the success callback doesn't get invoked. Promises with then-chaining, on the other hand, are based on the optionality of error callbacks, propagating errors to the next level - of course they still need to be caught there.
In all honesty, I prefer to take them one step further, into Promises/Futures/Deferreds/etc...
Or (/and) go into a "custom event" queue, using a Moderator (or an observer / pub-sub, if there is good reason for one particular object to be the source of the data).
This isn't a 100%-of-the-time thing. Sometimes you just need a single callback. However, if you have multiple views which need to react to a change (in model data, or to visualize user interaction), then a single callback with a bunch of hard-coded results isn't appropriate.
moderator.listen("my-model:timeline_update", myView.update);
moderator.listen("ui:data_request", myModel.request);
button.onclick = function () { moderator.notify("ui:data_request", button.value); };
Things are now much less dependent upon one big callback and you can mix and match and reuse code.
If you want to hide the moderator, you can make it a part of your objects:
var A = function () {
    var sys = null,
        notify = function (msg, data) {
            if (sys && sys.notify) { sys.notify(msg, data); }
        },
        listen = function (msg, callback) {
            if (sys && sys.listen) { sys.listen(msg, callback); }
        },
        attach = function (messenger) { sys = messenger; };
    return {
        attach : attach
        /* ... */
    };
},
B = function () { /* ... */ },
shell = Moderator(),
a = A(),
b = B();
a.attach(shell);
b.attach(shell);
a.listen("do something", a.method.bind(a));
b.notify("do something", b.property);
If this looks a little familiar, it's similar in behaviour to, say, Backbone.js (except that they extend() the behaviour onto objects, and others will bind, where my example has simplified wrappers to show what's going on).
Promises would be the other big win for usability, maintainability and easy-to-read code (as long as people know what a "promise" is -- basically, it passes around an object which holds the callback subscriptions).
// using jQuery's "Deferred"
var ImageLoader = function () {
    var cache = {},
        public_function = function (url) {
            if (cache[url]) { return cache[url].promise(); }
            var img = new Image(),
                loading = $.Deferred(),
                promise = loading.promise();
            img.onload = function () { loading.resolve(img); };
            img.onerror = function () { loading.reject("error"); };
            img.src = url;
            cache[url] = loading;
            return promise;
        };
    return public_function;
};
// returns promises
var loadImage = ImageLoader(),
    myImg = loadImage("//site.com/img.jpg");
myImg.done( lightbox.showImg );
myImg.done( function (img) { console.log(img.width); } );
Or
var blog_comments = [ /* ... */ ],
    comments = BlogComments();
blog_comments.forEach(function (comment) {
    var el = makeComment(comment.author, comment.text),
        img = loadImage(comment.img);
    img.done(el.showAvatar);
    comments.add(el);
});
All of the cruft there is to show how powerful promises can be.
Look at the .forEach call there.
I'm using image loading instead of AJAX because it might seem a little more obvious in this case: I can load hundreds of blog comments; if the same user makes multiple posts, the image is cached; and if not, I don't have to wait for images to load or write nested callbacks. Images load in any order, but still appear in the right spots.
This is 100% applicable to AJAX calls, as well.
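The cached-deferred pattern in the ImageLoader above can be sketched with native Promises, no jQuery required (a minimal Node-runnable analogue; the setTimeout stands in for image/AJAX latency, and the URLs are illustrative):

```javascript
// cache of promises keyed by URL, mirroring the jQuery ImageLoader above
var cache = {};

function load(url) {
    if (cache[url]) { return cache[url]; } // repeat callers share one promise
    cache[url] = new Promise(function (resolve) {
        setTimeout(function () { resolve('data for ' + url); }, 10);
    });
    return cache[url];
}

load('//site.com/img.jpg').then(function (data) {
    console.log(data); // "data for //site.com/img.jpg"
});
```

As with the Deferred version, a second call for the same URL never triggers a second load; it just subscribes to the same promise.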
Promises have proven to be the way to go as far as async is concerned, and libraries like Bluebird embrace node-style callbacks (using the (err, value) signature). So it seems beneficial to utilize node-style callbacks.
But the examples in the question can easily be converted into either format with the functions below (untested):
function mapToNodeStyleCallback(callback) {
    return {
        success: function(data) {
            return callback(null, data);
        },
        error: function(error) {
            return callback(error);
        }
    };
}
function alterNodeStyleCallback(propertyFuncs) {
    return function () {
        var args = Array.prototype.slice.call(arguments);
        var err = args.shift();
        if (err) return propertyFuncs.error.apply(null, [err]);
        return propertyFuncs.success.apply(null, args);
    };
}
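To show the first adapter in action, here is a small standalone sketch (something_async is a hypothetical stub of an Option-1-style API, and the adapter is repeated so the snippet runs on its own):

```javascript
// adapter: turns a node-style callback into an Option-1 handlers object
function mapToNodeStyleCallback(callback) {
    return {
        success: function (data) { return callback(null, data); },
        error: function (error) { return callback(error); }
    };
}

// stub of an Option-1-style API, for illustration only
function something_async(handlers) { handlers.success(42); }

something_async(mapToNodeStyleCallback(function (err, data) {
    if (err) { console.log(err); } else { console.log(data); } // logs 42
}));
```

The caller writes a single (err, data) callback, yet the underlying API still receives the success/error object it expects.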
