I currently have a few JS files in Node.js which are loaded as modules and augment the app object (using Express).
So their signatures look like:
module.exports = function(app, callback) {
    // ...
    callback();
};
So currently, as I have about 5 of them, my code looks like:
require("./setup/a")(app, function() {
require("./setup/b")(app, function(){
require("./setup/c")(app, function(){
require("./setup/d")(app, function(){
require("./setup/e")(app, function(){
startApp();
})
})
})
})
});
Now that looks unsightly, as it's the "pyramid of doom". However, I am not entirely sure how to change this pattern to use Q. I was assuming I would use Q.fcall(...a).then(...b).etc.done(), but I am unsure how to pass the app into it and whether I need to return the callback for it to be processed as a promise.
Ideally I do not want to start whacking Q all through my code; I only want it in the places where I want to remove the pyramid use cases. So in the above example, how do I use Q with promises to pass the app into each required module and then start the app at the end?
Assuming your modules don't already use promises you can do something like this:
module.exports = function(app) {
    // do some stuff with app
    return new Promise(function(resolve, reject) {
        // when ready to resolve after some actions on app
        resolve(); // you can also pass a value here as a cb param
    });
};
Promise.all(["./setup/a","./setup/b","./setup/c"].map(require.bind(null,app)))
.then(startApp);
You should, however, use promises at the lowest level possible, which means you can simply return the promise you already have in the process:
module.exports = function(app) {
    return something(app).then(function() {
        return somethingElseAsyncWithApp(app);
    });
};
So the promise constructor isn't required. Note that this answer uses native promises but will also work with libraries that use the same syntax, such as Bluebird. For Q, change new Promise to new Q.Promise and Promise.all to Q.all.
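For illustration, a rough sketch of what the Q version of the wiring above might look like, assuming each setup module has been changed to return a promise as shown:
var Q = require("q");

// Each setup module now returns a promise once it has finished augmenting `app`.
Q.all(["./setup/a", "./setup/b", "./setup/c", "./setup/d", "./setup/e"]
    .map(function (path) {
        return require(path)(app);
    }))
    .then(startApp)
    .done(); // .done() re-throws any error instead of silently swallowing it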
Alternatively, you can change every require(x) to Q.fcall(require, x) and use Q.all on that directly, but that is both slow (although Q is slow anyway) and more error prone than promisifying the modules directly. It is best to promisify the lowest-level API possible.
Promises are not a silver bullet for the callback pyramid of doom. I have seen code where, even with promises, it still looked like a pyramid of doom.
You could get rid of the pyramid and stay with the callback style by doing something like this:
// var app;
// ...etc
var paths = ['./setup/a', './setup/b', './setup/c', './setup/d', './setup/e'];

setupNext();

function setupNext() {
    var p = paths.shift(); // take the paths in order
    var next = paths.length > 0 ? setupNext : startApp;
    require(p)(app, next);
}

function startApp() {}
Related
I have this jQuery code fragment:
$.get('/api/' + currentPage).done(function(data) { ... })
.fail(...)
I want to replace $.get('/api/'+currentPage) with a promise that always succeeds and returns a specific value for data. Something like:
let myData = { ... } // value of data I want to pass to the done function
(new AlwaysSucceeds(myData)).done(function(data) { ... })
.fail(...)
I could cobble together a dummy object, or I could extract the done function, but I want to keep changes to the code to a minimum.
Is there a way to do this?
UPDATE: To help clarify what's going on, the code I am working with is (here). Normally this app is served from a Node.js server which implements the /api/... call, but I am converting it to be served from a static page server. I know what is going to be returned from the $.get call. To keep changes to the code clean, I simply want to change that line to:
let myData = {...}
// $.get('/api/' + currentPage) -- comment out the $.get call
(SOMETHINGHERE(myData)).done(function(data) {
The SOMETHINGHERE expression needs to implement .done(f), which will call the function f with myData and then return some object which implements .fail(...), which does nothing.
You can just replace $.get(...) with a function that returns a promise that is already resolved with the data you already have. And, the shortest way to get an already resolved jQuery promise, resolved with a particular value, is this:
$.when(myData).done(...)
The more textbook way to do it in jQuery is:
$.Deferred().resolve(myData).done(...)
And, if you care to switch your logic to the ES6 standard (instead of the non-standard jQuery promise behaviors), then you could use this:
Promise.resolve(myData).then(...).catch(...)
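Applied to the fragment in the question, the drop-in replacement might look something like this (a sketch; myData stands in for whatever the /api/ endpoint used to return):
let myData = { /* whatever /api/currentPage used to return */ };

// $.get('/api/' + currentPage)  -- original call, commented out
$.Deferred().resolve(myData)
    .done(function (data) {
        // existing success handling, unchanged
    })
    .fail(function () {
        // never runs, but keeps the original chain intact
    });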
You can achieve this by implementing an AlwaysSucceeds constructor function. Please see the example below.
function AlwaysSucceeds(data) {
    this.data = data;
}

AlwaysSucceeds.prototype.done = function(fn) {
    fn(this.data);
    return this;
};

AlwaysSucceeds.prototype.fail = function(fn) {
    return this;
};

var myData = {
    a: 1
};

(new AlwaysSucceeds(myData)).done(function(data) {
    console.log(data);
}).fail(function(data) {
});
Since jQuery Ajax functions just return $.Deferred objects, you can just substitute an immediately-resolved Deferred:
$.Deferred().resolve(myData).then(...)
In this particular case, if you want to make it easy to switch between synchronous and asynchronous code, and you have access to async/await, you can just use those directly:
try {
    const data = await Promise.resolve($.get('/api/' + currentPage));
    // code in done
} catch (err) {
    // code in fail
}
would become
try {
    const data = myData;
    // code in done
} catch (err) {
    // code in fail (never runs unless other code throws exceptions)
}
It's not clear what you actually want, but be careful using jQuery Deferred with native promises: the deferred has some non-standard methods that native promises don't have.
So to be safe, I always assume there is a thenable (something that has a then method), with which you can pretty much do whatever you want.
jQuery Deferreds do not behave like native promises either (depending on the version):
$.Deferred().reject("hello world")
.then(
undefined
,x=>x
)
.then(
x=>console.log("Never happens",x)
)
Promise.reject("hello world")
.then(
undefined
,x=>x
);
.then(
x=>console.log("Well behaved",x)
);
Promise.resolve().then(x=>{throw "nope"})
.then(undefined,err=>console.warn(err));
$.Deferred().resolve().then(x=>{throw "nope"})//crashes
.then(undefined,err=>err);
So it is safer to use native promises and polyfill with something that behaves like native ones.
To answer the question about a non-failing promise: if you want to make a request but return a default when it rejects, and keep returning the same result once it resolves or rejects, you can do:
const get = (p => url => {
    p = p ||
        // return a native promise or a properly polyfilled one
        Promise.resolve($.get(url))
            .then(
                undefined,
                _ => ({ defaultobject: true }) // default value when the request rejects
            );
    return p;
})();
Your get function will return a native promise, so there is no fail, done, or other non-standard methods. When combining "promises" from different libraries with native promises, it is best to only use then.
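As a small illustration of that last point, here is a sketch of adopting the jQuery thenable into a native promise up front, so only the standard then/catch behaviour is relied upon afterwards (reusing the currentPage variable from the question):
// Promise.resolve() adopts the jQuery thenable into a real native promise
const nativeGet = url => Promise.resolve($.get(url));

nativeGet('/api/' + currentPage)
    .then(data => console.log('got', data))
    .catch(err => console.warn('request failed', err));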
Would anyone know how to return the result of a Promise from a Cloud Code module? I am using the examples here, but it keeps telling me that options is undefined (or nothing happens if I check with if(options) first).
I am calling the function with module.function as a promise, but still not getting results.
Any ideas?
Edit: I can FORCE it to work by calling:
module.function({}, {
    success: function(res) {
        // do something
    },
    error: function(err) {
        // handle error
    }
});
but this isn't really great, since 1) I have to stick the empty object in there, and 2) I can't force the object to work like a promise and therefore lose my ability to chain.
Not sure if the issue is with modules or promises. Here's some code that illustrates both, creating a module with a function that returns a promise, then calling that from a cloud function.
Create a module like this:
// in 'cloud/somemodule.js'
// return a promise to find instances of SomeObject
exports.someFunction = function(someValue) {
    var query = new Parse.Query("SomeObject");
    query.equalTo("someProperty", someValue);
    return query.find();
};
Include the module by requiring it:
// in 'cloud/main.js'
var SomeModule = require('cloud/somemodule.js');

Parse.Cloud.define("useTheModule", function(request, response) {
    var value = request.params.value;
    // use the module by mentioning it
    // the promise returned by someFunction can be chained with .then()
    SomeModule.someFunction(value).then(function(result) {
        response.success(result);
    }, function(error) {
        response.error(error);
    });
});
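If a module function still expects the old { success, error } options object (as in the edit above), one way to regain chaining is to wrap it in a Parse.Promise yourself. This is only a sketch; legacyFunction is a hypothetical stand-in for such a function:
// in 'cloud/somemodule.js'
exports.someFunctionAsPromise = function(someValue) {
    var promise = new Parse.Promise();
    // legacyFunction is hypothetical: any function taking a { success, error } options object
    legacyFunction(someValue, {
        success: function(result) { promise.resolve(result); },
        error: function(err) { promise.reject(err); }
    });
    return promise;
};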
I have a controller where I can allow everything except one function (let's call it thatOneFunction()) to run asynchronously. The reason is that thatOneFunction() relies on roughly 10 promises having been resolved; otherwise it does not have the data in scope that it needs to execute.
I have seen multiple examples online of how you can grab the promise of an individual get/query/service, but what I have not seen is a way to check that all promises have been resolved at once, perhaps via an Angular state variable that I am just not aware of. Something like:
var currentPromises = $state.promises;
currentPromises.then(thatOneFunction);
Or, if there is no such built-in variable, is there a way to dynamically generate a list of all promises outstanding at the current moment, on which I can then call .then()? It would help a lot, because I will have many pages in this application with this structure.
What I have tried/am currently doing:
As of right now I have a workaround, but it requires some hardcoding that I'd rather not continue. It is based on inserting a call to a function that checks a counter of how many times it has been called, compares that to how many promises should have called it already (hardcoded after counting the promises in the controller), and, once it has gone through all the required promises, calls thatOneFunction().
Thanks for any help/pointers.
Example bits of my code (real names/variables changed):
Controller
$scope.runAfterAllPromises = function() {
    // code dependent on all promises here
};

MySvc.loadList(function (results) {
    $scope.myList = results;
});

$scope.runAfterAllPromises();
Service
app.service("MySvc", ['$resource', function ($resource) {
this.loadList = function (callBackFunction) {
var stateRes = $resource('/api/ListApi');
stateRes.query(function (ListResults) {
if (callBackFunction && typeof (callBackFunction) === 'function') {
callBackFunction(ListResults)
};
})
You can combine promises with $q.all, if that helps? So something like this:
var promises = [];
promises.push(yourPromise1());
promises.push(yourPromise2());
promises.push(yourPromise3());
$q.all(promises).then(function(){ ...});
Is this what you mean?
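Applied to the service in the question, here is a sketch of that approach, assuming the service is changed to return the promise that $resource already exposes (OtherSvc.loadOtherThing() is a hypothetical second promise-returning call, and $q and the services are assumed to be injected into the controller):
app.service("MySvc", ['$resource', function ($resource) {
    // Return the promise that $resource exposes instead of taking a callback
    this.loadList = function () {
        return $resource('/api/ListApi').query().$promise;
    };
}]);

// In the controller:
$q.all([MySvc.loadList(), OtherSvc.loadOtherThing()])
    .then(function (results) {
        $scope.myList = results[0];
        $scope.runAfterAllPromises();
    });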
Short answer: No, no such object exists.
Now, how to achieve what you want to achieve:
var allPromise = $q.all([promise1, promise2, promiseN]);
allPromise.then(success).catch(error);
Edit: After your "A man can dream comment" I thought to add this:
You could override the $q.defer function, track all deferred objects created, and get their promises.
Obviously this could have some issues with deferreds that are created but never resolved, but it may be the simplest way to achieve your desired functionality. If you want to go the whole way, you can look at Angular decorators, which provide a framework for extending Angular providers.
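As a rough sketch of that idea (an assumption, not production code), a decorator on $q could record every deferred it hands out and expose a hypothetical helper such as allTracked() that waits for all of them:
app.config(['$provide', function ($provide) {
    $provide.decorator('$q', ['$delegate', function ($delegate) {
        var trackedPromises = [];
        var originalDefer = $delegate.defer;

        // Record the promise of every deferred that gets created
        $delegate.defer = function () {
            var deferred = originalDefer.apply($delegate, arguments);
            trackedPromises.push(deferred.promise);
            return deferred;
        };

        // Hypothetical helper: resolves once everything tracked so far has resolved
        // (and rejects if any of the tracked promises reject)
        $delegate.allTracked = function () {
            return $delegate.all(trackedPromises);
        };

        return $delegate;
    }]);
}]);
Deferreds that are created but never resolved will keep allTracked() pending forever, which is the caveat mentioned above.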
I'd like to use the MongoDB native JS driver with bluebird promises. How can I use Promise.promisifyAll() on this library?
The 2.0 branch documentation contains a better promisification guide: https://github.com/petkaantonov/bluebird/blob/master/API.md#promisification
It actually has a MongoDB example, which is much simpler:
var Promise = require("bluebird");
var MongoDB = require("mongodb");
Promise.promisifyAll(MongoDB);
When using Promise.promisifyAll(), it helps to identify a target prototype if your target object must be instantiated. In the case of the MongoDB JS driver, the standard pattern is:
Get a Db object, using either a MongoClient static method or the Db constructor
Call Db#collection() to get a Collection object.
So, borrowing from https://stackoverflow.com/a/21733446/741970, you can:
var Promise = require('bluebird');
var mongodb = require('mongodb');
var MongoClient = mongodb.MongoClient;
var Collection = mongodb.Collection;
Promise.promisifyAll(Collection.prototype);
Promise.promisifyAll(MongoClient);
Now you can:
var client = MongoClient.connectAsync('mongodb://localhost:27017/test')
.then(function(db) {
return db.collection("myCollection").findOneAsync({ id: 'someId' })
})
.then(function(item) {
// Use `item`
})
.catch(function(err) {
// An error occurred
});
This gets you pretty far, except it'll also help to make sure the Cursor objects returned by Collection#find() are also promisified. In the MongoDB JS driver, the cursor returned by Collection#find() is not built from a prototype. So, you can wrap the method and promisify the cursor each time. This isn't necessary if you don't use cursors, or don't want to incur the overhead. Here's one approach:
Collection.prototype._find = Collection.prototype.find;
Collection.prototype.find = function() {
    var cursor = this._find.apply(this, arguments);
    cursor.toArrayAsync = Promise.promisify(cursor.toArray, cursor);
    cursor.countAsync = Promise.promisify(cursor.count, cursor);
    return cursor;
};
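Hypothetical usage of the wrapped find(), given the patch above (db is a connected Db instance as in the earlier snippet, and the query object is just a placeholder):
db.collection('myCollection').find({ status: 'active' })
    .toArrayAsync() // promisified by the wrapper above
    .then(function (docs) {
        console.log(docs.length + ' documents found');
    });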
I know this has been answered several times, but I wanted to add a little more information on this topic. Per Bluebird's own documentation, you should use 'using' for cleaning up connections and preventing memory leaks.
Resource Management in Bluebird
I looked all over the place for how to do this correctly, and information was scarce, so I thought I'd share what I found after much trial and error. The data I used below (restaurants) came from the MongoDB sample data; you can get it here: MongoDB Import Data
// Using dotenv for environment / connection information
require('dotenv').load();

var Promise = require('bluebird'),
    mongodb = Promise.promisifyAll(require('mongodb')),
    using = Promise.using;
function getConnectionAsync() {
    // process.env.MongoDbUrl stored in my .env file using the require above
    return mongodb.MongoClient.connectAsync(process.env.MongoDbUrl)
        // .disposer is what handles cleaning up the connection
        .disposer(function(connection) {
            connection.close();
        });
}
// The two methods below retrieve the same data and output the same data
// but the difference is the first one does as much as it can asynchronously
// while the 2nd one uses the blocking versions of each
// NOTE: using limitAsync seems to go away to never-never land and never come back!
// Everything is done asynchronously here with promises
using(
    getConnectionAsync(),
    function(connection) {
        // Because we used promisifyAll(), most (if not all) of the
        // methods in what was promisified now have an Async sibling:
        // collection : collectionAsync
        // find : findAsync
        // etc.
        return connection.collectionAsync('restaurants')
            .then(function(collection) {
                return collection.findAsync();
            })
            .then(function(data) {
                return data.limit(10).toArrayAsync();
            });
    }
    // Before this ".then" is called, the using statement will call the
    // .disposer() that was set up in the getConnectionAsync method
).then(
    function(data) {
        console.log("end data", data);
    }
);
// Here, only the connection is asynchronous - the rest are blocking processes
using(
    getConnectionAsync(),
    function(connection) {
        // Here, because I'm not using any of the Async functions, these should
        // all be blocking requests, unlike the promisified versions above
        return connection.collection('restaurants').find().limit(10).toArray();
    }
).then(
    function(data) {
        console.log("end data", data);
    }
);
I hope this helps someone else out who wanted to do things by the Bluebird book.
Version 1.4.9 of mongodb should now be easily promisifiable as such:
Promise.promisifyAll(mongo.Cursor.prototype);
See https://github.com/mongodb/node-mongodb-native/pull/1201 for more details.
We have been using the following driver in production for a while now. It's essentially a promise wrapper over the native Node.js driver, and it also adds some additional helper functions.
poseidon-mongo - https://github.com/playlyfe/poseidon-mongo
Recently I started working on a new project, and this project uses JavaScript callbacks in Node.js. We now use Koa, but the problem happens when we try to use ES6 generators and callbacks.
// Callback function
function load(callback) {
    var result = null;
    // Do something with xmla4js and ajax
    callback(result);
    return result;
}
Now in Koa I need to call load and respond with JSON to the client, so I use the code below:
router = require('koa-router');
app = koa();
app.use(router(app));

app.get('load', loadJson);

function *loadJson() {
    var that = this;
    load(function(result) {
        that.body = result;
    });
}
but I get this error:
_http_outgoing.js:331
throw new Error('Can\'t set headers after they are sent.');
^
Error: Can't set headers after they are sent.
at ServerResponse.OutgoingMessage.setHeader (_http_outgoing.js:331:11)
at Object.module.exports.set (G:\NAP\node_modules\koa\lib\response.js:396:16)
at Object.length (G:\NAP\node_modules\koa\lib\response.js:178:10)
at Object.body (G:\NAP\node_modules\koa\lib\response.js:149:19)
at Object.body (G:\NAP\node_modules\koa\node_modules\delegates\index.js:91:31)
at G:\NAP\Server\OlapServer\index.js:40:19
at G:\NAP\Server\OlapServer\OLAPSchemaProvider.js:1599:9
at _LoadCubes.xmlaRequest.success (G:\NAP\Server\OlapServer\OLAPSchemaProvider.js:1107:13)
at Object.Xmla._requestSuccess (G:\NAP\node_modules\xmla4js\src\Xmla.js:2113:50)
at Object.ajaxOptions.complete (G:\NAP\node_modules\xmla4js\src\Xmla.js:2024:34)
Just to clarify things, let's write your callback as:
// Callback function
function load(callback) {
    setTimeout(function() {
        var result = JSON.stringify({ 'my': 'json' });
        callback(/* error: */ null, result);
    }, 500);
}
In the Koa world, this is called a thunk, meaning that it is an asynchronous function that takes only one argument: a callback with the prototype (err, res). You can check https://github.com/visionmedia/node-thunkify for a better explanation.
Now you have to write your middleware with:
function *loadJson() {
    this.type = 'application/json';
    this.body = yield load;
}
This is mainly because Koa is generator based; at the top level of the middleware it does not support callbacks, so it does not wait for the function to finish. The best solution would be to convert your function into a promise; promises work great with Koa.
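A minimal sketch of that promise conversion, based on the load signature above (callback(err, result)); loadAsync is a hypothetical wrapper name:
function loadAsync() {
    return new Promise(function (resolve, reject) {
        load(function (err, result) {
            if (err) return reject(err);
            resolve(result);
        });
    });
}

function *loadJson() {
    this.type = 'application/json';
    this.body = yield loadAsync(); // Koa's generator runtime can yield a promise directly
}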
I had a very similar problem using Braintree (regular callbacks) and Koa. Based on your code, the only change I needed to make was to the load function and how it was called.
router = require('koa-router');
app = koa();
app.use(router(app));

app.get('/load', loadJson);

function *loadJson() {
    this.body = yield load;
}

// Callback function
function load(callback) {
    // Prepare some data with xmla4js and ajax
    whatever_inputs = {...};
    final_method(whatever_inputs, callback);
}
The explanation by Jerome and Evan above is absolutely correct, and thunkify looks like a suitable process for automatically doing it.
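For reference, a minimal sketch of that automatic wrapping with the thunkify package, assuming load calls its callback as callback(err, result) as in the answer above:
var thunkify = require('thunkify');

// thunkify(load) returns a function that, when called, returns a thunk
// which Koa's generator runtime can yield directly
var loadThunk = thunkify(load);

function *loadJson() {
    this.body = yield loadThunk();
}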
While thunks were a nice idea, in my view a Promise is a better long-term approach. Many libraries are already moving to promises for async work instead of the old Node standard callback(err, data), and they're dead simple to wrap around any async code to make a promise. Other devs will have experience with Promises and naturally understand your code, while most would have to look up what a "thunk" is.
For example, here I am wrapping the not-yet-promise-based jsdom in a promise, so I can yield it in my Koa generator:
const jsdom = require('node-jsdom');
const koa = require('koa');
const app = koa();

app.use(function *() {
    this.body = yield new Promise((resolve, reject) => jsdom.env({
        url: `http://example.org${this.url}`,
        done(errors, { document }) {
            if (errors) return reject(errors.message);
            resolve(`<html>${document.body.outerHTML}</html>`);
        },
    }));
});

app.listen(2112);
Semantically, promises and generators go hand-in-hand to really clarify async code. A generator can be re-entered many times and yield several values, while a promise means "I promise I'll have some data for you later". Combined, you get one of the most useful things about Koa: the ability to yield both promises and synchronous values.
Edit: here's your original example wrapped with a Promise to return:
const router = require('koa-router');
const { load } = require('some-other-lib');

const app = koa();
app.use(router(app));
app.get('load', loadJson);

function* loadJson() {
    this.body = yield new Promise(resolve => {
        load(result => resolve(result));
    });
}
To bypass Koa's built-in response handling, you may explicitly set this.respond = false;. Use this if you want to write to the raw res object instead of letting Koa handle the response for you.
The header has already been written by Koa's built-in response handling before your callback is invoked.
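As a rough sketch of that bypass (an assumption, not a recommendation; loadRaw is a hypothetical name), writing the callback result to the raw res object might look like this:
function *loadRaw() {
    this.respond = false; // tell Koa not to touch the response
    var res = this.res;   // raw Node.js ServerResponse

    load(function (result) {
        res.setHeader('Content-Type', 'application/json');
        res.end(JSON.stringify(result));
    });
}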