Let's assume I have an AMD module that conditionally requires a second module in some environments:
define(["require"], function(require) {
var myObj = {
foo: console.error.bind(console)
};
if(browserEnv)
require(["./conditional-polyfill"],function(polyfill){
myObj.foo = console.log.bind(console,polyfill) ;
});
return myObj; //returns before conditional require is satisfied
});
The question is: How can I delay the define() call to return/callback AFTER the conditional require has been completed?
I.e. the code below fails:
require(["module-from-above"],function(logger){
logger.foo("Hello!"); //console.error gets called
});
My thoughts on solutions to this issue:
If I inlined ./conditional-polyfill, everything would work. However, that just circumvents the problem and doesn't work in every case. I want it modularized for a reason.
I could return a Deferred object instead of myObj that gets fulfilled by ./conditional-polyfill later. This would work, but it's really ugly to call loggerDeferred.then(function(logger){ ... }); all the time.
I could make an AMD loader plugin for this module and call the callback as soon as everything is ready. Again, this would work, but custom loader plugins don't work with my build tool.
All solutions I can think of are more hacks than good code. However, I think that my issue isn't too far-fetched. So, how to handle this?
Push the conditional outside of the "factory function" (the name commonly used in the AMD community for the callback function passed to require and define):
;(function() {
    function factory(require, polyfill) {
        var myObj = {
            foo: console.error.bind(console)
        };
        if (polyfill) {
            myObj.foo = console.log.bind(console, polyfill);
        }
        return myObj;
    }

    var need = ['require'];
    if (browserEnv) {
        need.push('./conditional-polyfill');
    }

    define(need, factory);
})();
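Since the dependency list is computed before define runs, the loader resolves ./conditional-polyfill (when present) before the factory is invoked, so consumers can require the module as usual. A quick sketch of the call site, reusing the question's example:
require(["module-from-above"], function(logger) {
    logger.foo("Hello!"); // logs via console.log when the polyfill was loaded
});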
I would use a Deferred, as you say.
The deferred pattern is the ideal solution to this kind of issue, because it allows you to tie complex async operations together in a consistent way.
It will make your code a bit larger, but it's a simple solution compared to modifying the loader and the build tools.
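For illustration, here is a minimal sketch of that approach using a plain Promise in place of a Deferred library (it assumes a Promise implementation is available in your target environments):
define(["require"], function(require) {
    // export a promise that settles once the conditional dependency is in
    return new Promise(function(resolve) {
        var myObj = {
            foo: console.error.bind(console)
        };
        if (browserEnv) {
            require(["./conditional-polyfill"], function(polyfill) {
                myObj.foo = console.log.bind(console, polyfill);
                resolve(myObj);
            });
        } else {
            resolve(myObj);
        }
    });
});
Every consumer then unwraps it with loggerPromise.then(function(logger) { ... }), which is exactly the verbosity the question complains about.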
Reworded:
A common pattern is to pass callback functions, such as with Mongoose's save (just for example and simplified - no error handling):
someMethod(req: Request, res: Response) {
    document.save(function(err) { res.status(200).send({message: 'all good'}); });
}
I'd like to externalize the callback. You can do it this way:
var respond = function(err: any, res: Response) {
    res.status(200).send({message: 'all good'});
};

someMethod(req: Request, res: Response) {
    document.save(function(err) { respond(err, res); });
}
...but ideally I'd like to do this by just passing a function like respond without having to create a callback function to enclose it. I wanted to know if this is possible. Since the anonymous function has access to res, I thought there might be some way to gain access to res in a function defined externally. It appears there is no way to do this, so I'll live with wrapping it.
My original question was trying to isolate the specific issue I was interested in - which is to gain access to the caller's variables implicitly. Doesn't seem like that is possible. Fair enough.
Original Question:
I'd like to externalize a bit of code I use frequently, and I'm having trouble understanding closure in the context of a TypeScript method. Take a look:
var test = function() {
    console.log("Testing external: " + JSON.stringify(this.req.body));
};

class Handler {
    static post(req: Request, res: Response) {
        (function() {
            console.log("TESTING anon: " + JSON.stringify(req.body));
        })();
        test();
    }
}
Besides the fact that this code does nothing useful, the inline anonymous function has access to the req object, but the test() function does not; this inside test is undefined. Removing this to match the inline function doesn't help.
I believe if I were to bind on this for the call I'd just end up with a reference to the Handler class when I really want to bind on the post method.
My motivation for doing this is that I want to make a function that can be passed as a callback to a bunch of different request handlers. When I write the functions inline it all works, but when I externalize them I can't get a closure over the variables in the enclosing method. I've read "You Don't Know JS: this & Object Prototypes", and in pure JavaScript I can manage to make these sorts of things work, but I'm obviously doing something wrong here (it may not be TypeScript related; maybe I'm just messing it up).
So bottomline - is there a way I can externalize the handler and get access to the method variables as if I were writing it inline? I could just create an inline anonymous function as the callback that calls the external function with all the variables I need, but I want to really understand what is happening here.
This is not an answer, but will hopefully get me enough feedback to give you one, because it's not at all clear what you're actually trying to accomplish here, and whether you actually understand what the terms mean is an open question, since you use them correctly one minute and sketchily the next.
var test = function(){
console.log("Testing external: " + JSON.stringify(this.req.body));
}
In strict mode this will throw an error; in sloppy mode it will try to access the req property of the global object, which is not likely what you want.
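For reference, a quick sketch of the difference:
'use strict';
var test = function() {
    // strict mode: `this` is undefined here, so this.req.body throws a TypeError
    // sloppy mode: `this` would be the global object instead
    console.log(this);
};
test();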
(function(){
console.log("TESTING anon: "+JSON.stringify(req.body));
}) ();
The IIFE wrapper is completely unnecessary; it literally adds nothing to the party. So why include it?
static post(req: Request, res: Response){
console.log("TESTING anon: "+JSON.stringify(req.body));
test(); // is this the spot where you are 'in-lining?'
}
What I think you want is this:
var test = function(reqBody) {
console.log("Testing external: " + JSON.stringify(reqBody));
};
class Handler {
static post(req: Request, res: Response) {
test(req.body);
}
}
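If the goal is just to avoid writing a wrapper at every call site, partial application gets close: bind the external function to the values it needs and pass the result as the callback. A minimal sketch reusing the question's respond/save example (note the parameter order is flipped so that bind can pre-fill res):
var respond = function(res, err) {
    // `res` is pre-filled via bind; `err` arrives from save
    // (ignored here, matching the question's simplified example)
    res.status(200).send({message: 'all good'});
};

someMethod(req: Request, res: Response) {
    // respond.bind(null, res) produces a function(err) suitable as the callback
    document.save(respond.bind(null, res));
}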
I'm trying to build several JSONs using Modernizr at once, but it appears to break the scope of my function.
It's very hard to explain, so have a look at this example; give it a go if you don't believe me:
[1, 2, 3, 4, 5].forEach(function(i) {
    require("modernizr").build({}, function(result) {
        console.log(i);
    });
});
outputs:
5
5
5
5
5
Instead of the expected 1, 2, 3, 4, 5, as would any similar function.
I have not come across this behaviour before in all my years of coding in ECMAScript-like languages, and I have built my project (and previous projects) around the idea that you cannot break a function's scope like that.
It breaks any system based on promises or even just simple callbacks.
It's baffled me all day, and I can't find an appropriate fix for it.
I'm having a very hard time even conceptualizing what it is that's causing this to happen.
Please help.
EDIT:
OK, it appears you're all hung up on the forEach...
Here's another example that will make it a little clearer:
function asd(i) {
    require("modernizr").build({}, function(result) {
        console.log(i);
    });
}

asd(1);
asd(2);
asd(3);
asd(4);
outputs
4
4
4
4
What on earth is happening?
The issue specific to Modernizr had to do with a global variable being clobbered.
The build command is basically a large requirejs configuration function, all powered by a large config object. There are some basic settings that are always true, established at the top of the function:
{
optimize: 'none',
generateSourceMaps: false,
optimizeCss: 'none',
useStrict: true,
include: ['modernizr-init'],
fileExclusionRegExp: /^(.git|node_modules|modulizr|media|test)$/,
wrap: {
start: '\n;(function(window, document, undefined){',
end: '})(window, document);'
}
}
Then, since Modernizr works in both the browser and in Node without changes, there needs to be a way for it to know whether it should be loading its dependencies via the filesystem or via HTTP. So we add some more options, like baseUrl, inside of an environment check:
if (inBrowser) {
baseRequireConfig.baseUrl = '/i/js/modernizr-git/src';
} else {
baseRequireConfig.baseUrl = __dirname + '/../src';
}
At this point, the config object gets passed into requirejs.config, which wires up require and allows us to start calling build.
Finally, after all of that has been created, we have a build function that also ends up modifying the config object yet again with build-specific settings (the actual detects in your build, a regex to strip out some AMD crud, etc.).
So here is a super simplified pseudocode version of what ends up happening:
var config = {
    name: 'modernizr'
};

if (inBrowser) {
    config.env = 'browser';
} else {
    config.env = 'node';
}

requirejs.config(config);

module.exports = function(options, callback) {
    // build-specific options get merged into the module-level `config` here;
    // note that `config` is shared between calls
    config.out = function(output) {
        // code to strip out AMD ceremony, add classPrefix, version, etc.
        callback(output);
    };
    requirejs.optimize(config);
};
Spot the problem?
Since we are touching the .out method of the config object (whose scope is the entire module, and whose state is therefore kept between build() calls) right before we run the asynchronous requirejs.optimize function, the callback you were passing was rewriting the .out method every time build was called.
This should be fixed in a couple of hours in Modernizr.
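One way to avoid the clobbering (a hypothetical sketch, not necessarily the actual Modernizr patch) is to give each build() call its own copy of the shared config instead of mutating it:
module.exports = function build(options, callback) {
    // shallow-copy the shared module-level config so each call
    // owns its own `out` hook
    var localConfig = {};
    Object.keys(config).forEach(function(key) {
        localConfig[key] = config[key];
    });
    localConfig.out = function(output) {
        callback(output);
    };
    requirejs.optimize(localConfig);
};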
The callback is invoked asynchronously, so this behavior is expected: the build call is much slower than the walk of your forEach, so by the time you reach the function (result) {} block, i is already five.
This is quite the same problem as described in "Node.JS: How to pass variables to asynchronous callbacks?", and you should be able to use the same solution:
[1, 2, 3, 4, 5].forEach(function(i) {
    (function(i) {
        require("modernizr").build({}, function(result) {
            console.log(i);
        });
    })(i);
});
Untested, but something like that should work.
In my main.js, I am reading a file asynchronously. Once my file is loaded, I set some objects in the GLOBAL namespace and use them in my required modules (whether to use the GLOBAL namespace or not is a different story; I am using it anyway).
My required module immediately expects that variable to exist at load time. So how do I make it wait until my file reading is complete in main.js? Do I simply require the module in the callback of readFile? Or is there a better way to do it?
example:
fs.readFile('./file', function (err, data) {
    // do something
    GLOBAL.obj = data;
});

require('./module.js');
module.js
obj.someFunction();
Your gut feeling of disliking that solution is understandable. (Your stomach is right.) The proper way of cleaning this up (and you should take the time; future-you will thank you for it):
go through every one of your ten modules
for each one, go through all the functions it exports
for each function, figure out what globals it actually depends on
add those as arguments to the function
if the functions take a lot of arguments now, consider grouping them into objects, creating useful models
if a bunch of functions all depend on the same set of variables, you can also consider creating a factory function:
create a function that takes the formerly global variables as arguments, and wrap all of the module's code in that function
make that function the single export of your module. It serves as a factory function and creates the context for all the other functions in that module. It should return whatever the module exported before.
Example
// DB used to be a global
module.exports = function(DB) {
    function getUser(user, cb) {
        DB.get('user', cb);
    }
    return {getUser: getUser};
};
You can then use it like this:
var users = require('./module')(DB);
users.getUser(myUser, function(){});
Yes, just follow rule #1 of async programming: stuff that depends on a callback having happened must be executed in that callback. Since your require depends on the variable set asynchronously, you can only use your require inside it:
fs.readFile('./file', function (err, data) {
    // do something
    GLOBAL.obj = data;

    require('./module.js');
});
I'm looking at the code of Polymer's observe-js, and I can't quite grasp how it works. I'm not talking about the dirty checking itself, but about how it is invoked - when is the check performed?
It looks like the magic is here:
var runEOM = hasObserve ? (function() {
    return function(fn) {
        return Promise.resolve().then(fn);
    };
})() : (function() {
    return function(fn) {
        eomTasks.push(fn);
    };
})();
Since Promise.resolve().then(fn) invokes fn as a microtask, after the current call stack has unwound, this line defers fn, similarly to setTimeout(fn, 0) or process.nextTick(fn).
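A tiny sketch of the ordering this produces:
console.log('sync');
Promise.resolve().then(function() { console.log('microtask'); });
setTimeout(function() { console.log('macrotask'); }, 0);
// prints: sync, microtask, macrotask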
But runEOM is never used in the file!
Can anybody shed light on this?
It seems that dirty checking is performed manually from the outside by calling the global.Platform.performMicrotaskCheckpoint interface method: Line 792
I think the concept is to notify observers manually after some piece of initial work is done, or something along those lines.
I am arguably a pretty big noob in NodeJS, so maybe this is obvious to everyone except me ;)
Reading https://github.com/mikeal/request/blob/master/request.js#L71 I don't get how the callback parameter is passed along (https://github.com/mikeal/request#requestoptions-callback). I guess it has something to do with the arguments that are processed in Request.prototype.init, but in a short test I could not reproduce the behaviour:
var test = function(a) { this.init(a); };
test.prototype.init = function(a) {
    for (var i in arguments) console.log(arguments[i]);
};
new test('bla', 'blub');
results in
bla
{}
So I really don't get how exactly the callback is set up, and I would love to find out.
The actual function that is exported when you require request is a wrapper function from the package's index.js file, which initializes the options object and then uses it to instantiate an instance of Request. This should be evident when you consider that no new keyword is required to use the function, even though Request is clearly a constructor.
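A heavily simplified sketch of that wrapper pattern (names and details assumed, not request's actual code):
function Request(options) {
    // the callback travels along inside the options object
    this.callback = options.callback;
}

// the exported function normalizes arguments and hides the `new` keyword
module.exports = function request(uri, options, callback) {
    if (typeof options === 'function') {
        callback = options; // support the request(uri, callback) form
        options = {};
    }
    options.uri = uri;
    options.callback = callback;
    return new Request(options);
};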