Assume I have an Ember object. When doing any kind of sync with the backend, there is the possibility of using a promise chain:
obj.save().then(function(res){
// Success callback
}, function(res){
// Fail callback
});
Is there a done/always callback for Ember.js promise chain with .then()?
I've tried adding a third parameter function, but it did not help.
http://emberjs.com/api/classes/Ember.PromiseProxyMixin.html#method_finally
Ember -> jQuery
.then() -> .done()
.catch() -> .fail()
.finally() -> .always()
Example (in the router):
var self = this;
var modelType = this.store.createRecord('modelType', {/* model attrs */});
modelType.save().then(function(model){
  self.transitionTo('model.show', model);
}).catch(function(reason){
  console.log('Failure to Save: ', reason);
}).finally(function(){
  self.hideSpinner();
});
Unfortunately there isn't, but you can create your own by modifying the RSVP.Promise prototype:
Ember.RSVP.Promise.prototype.always = function(func) {
return this.then(func, func);
}
So you can do the following:
// will show success
Ember.RSVP.resolve('success').always(function(msg) {
alert(msg)
})
// will show error
Ember.RSVP.reject('error').always(function(msg) {
alert(msg)
})
I hope it helps
Ember uses the RSVP.js library for promises, and RSVP does not support always because it is not part of the Promises/A+ spec.
If you need it, @wycats suggests the following approach:
Ember.RSVP.Promise.prototype.andThen = function(success, error, always) {
return this.then(function(value) {
var ret = success(value);
always(value);
return ret;
}, function(reason) {
var ret = error(reason);
always(reason);
return ret;
});
};
gorner's solution works but for Ember Data you have to add the following as well:
Ember.PromiseProxyMixin.reopen({
andThen: function() {
var promise = this.get('promise');
return promise['andThen'].apply(promise, arguments);
}
});
The reason is that the DS.Model.save() function returns a PromiseObject (see http://emberjs.com/api/data/classes/DS.PromiseObject.html), which is not an Ember.RSVP.Promise but instead uses Ember.PromiseProxyMixin. So you have to make the andThen function available in that mixin in order for it to work with promises when saving models.
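For example, with both the andThen() prototype extension and the PromiseProxyMixin.reopen() above in place, usage could look roughly like this (a rough sketch; the model name, route name and spinner helper are placeholders, not part of any answer above):
var self = this;
var post = this.store.createRecord('post', { title: 'Hello' }); // hypothetical model

post.save().andThen(
  function (model)  { self.transitionTo('posts.show', model); }, // success
  function (reason) { console.log('Save failed: ', reason); },   // error
  function ()       { self.hideSpinner(); }                      // always
);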
Related
As a simplified case, I have two async functions, foo and bar. bar needs the result of foo, i.e. bar depends on foo. I have no idea which function will be called first.
If bar is invoked first, bar will call foo and start itself right after foo is done.
If foo is invoked first and done, bar can use the result of foo.
If foo is invoked first and bar is invoked before foo is done, bar needs to wait for foo's result. (Don't invoke a new call to foo, just wait for the already-fired call to foo)
How can I achieve this?
Is it possible to register an async function dependency chain (something like the dependency declaration in require.js: define(['foo'], function() { bar(); }))?
Can I use $.Deferred() to achieve it?
How?
In circumstances like this, the standard approach is to cache the lower level promise.
Typically you will establish, in some suitable outer scope, a plain JS object as a promise cache, and always look there first before calling your async process.
var promiseCache = {};
function foo() {
if(!promiseCache.foo) {
promiseCache.foo = doSomethingAsync();
}
return promiseCache.foo;
}
function bar() {
return foo().then(doSomethingElseAsync);
}
Of course, there's nothing to prevent you also caching the higher level promise, if appropriate.
function bar() {
if(!promiseCache.bar) {
promiseCache.bar = foo().then(doSomethingElseAsync);
}
return promiseCache.bar;
}
EDIT: forceRefresh feature
You can force a function to refresh its cached promise by passing an (extra) parameter.
function foo(any, number, of, other, arguments, forceRefresh) {
if(forceRefresh || !promiseCache.foo) {
promiseCache.foo = doSomethingAsync();
}
return promiseCache.foo;
}
By making forceRefresh the last argument, leaving it out is the same as passing false, and foo will use the cached promise if available. Alternatively, pass true to guarantee that doSomethingAsync() is called and the cached value is refreshed.
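A small usage sketch, treating the leading arguments of the foo() above as arbitrary placeholders and assuming doSomethingAsync() returns a thenable:
// normal call: reuses promiseCache.foo when it already exists
foo(1, 2, 3, 4, 5).then(function (result) {
  console.log('possibly cached result:', result);
});

// same call with forceRefresh = true: always triggers a fresh doSomethingAsync()
foo(1, 2, 3, 4, 5, true).then(function (result) {
  console.log('freshly fetched result:', result);
});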
EDIT 2: setName()/getName()
With the forceRefresh mechanism in place in getName():
setName(newName).then(getName.bind(null, true)); //set new name then read it back using forceRefresh.
Alternatively, omit the forceRefresh mechanism and, assuming the cache property to be promiseCache.name:
setName(newName).then(function() {
promiseCache.name = $.when(newName);//update the cache with a simulated `getName()` promise.
});
The first method is more elegant, the second more efficient.
You can simply think of both functions as independent. That way, you don't go daisy-chaining dependencies that operate asynchronously. You can then have one other module that uses them.
Since they do async stuff, consider using promises. You can use jQuery's deferreds for compatibility. Think of deferreds as read/write while promises are read-only.
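To illustrate that read/write distinction with jQuery before the modules below (a minimal sketch, not tied to foo or bar):
var dfd = $.Deferred();       // read/write: the holder can resolve or reject it
var promise = dfd.promise();  // read-only view: consumers can only attach handlers

promise.then(function (value) {
  console.log('resolved with', value);
});

dfd.resolve(42);              // settles the deferred; the promise side has no resolve()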
// foo.js
define(function(){
return function(){
return new Promise(function(resolve, reject){
// Do async stuff. Call resolve/reject accordingly
});
};
});
// bar.js
define(function(){
return function(){
return new Promise(function(resolve, reject){
// Do async stuff. Call resolve/reject accordingly
});
};
});
// Your code (Excuse the CommonJS format. Personal preference)
define(function(require){
// Require both functions
var foo = require('foo');
var bar = require('bar');
// Use them
foo(...).then(function(response){
return bar();
}).then(function(){
// all done
});
});
Try creating an object property with the possible values undefined, "pending", and true; call deferred.resolve() when res.active is true, and deferred.reject() when res.active is "pending":
var res = {
active: void 0
};
var foo = function foo(state) {
var t;
var deferred = function(type) {
return $.Deferred(function(dfd) {
if (res.active === "pending" || state && state === "pending") {
res.active = "pending";
dfd.rejectWith(res, [res.active])
} else {
res.active = state || "pending";
t = setInterval(function() {
console.log(res.active)
}, 100);
setTimeout(function() {
clearInterval(t)
res.active = true;
dfd.resolveWith(res, [res.active])
}, 3000);
}
return dfd.promise()
})
.then(function(state) {
console.log("foo value", state);
return state
}, function(err) {
console.log("foo status", err)
return err
})
}
return deferred()
}
var bar = function bar(result) {
var deferred = function(type) {
return $.Deferred(function(dfd) {
if (result && result === true) {
setTimeout(function() {
dfd.resolveWith(result, [true])
}, 1500)
} else {
dfd.rejectWith(res, [res.active || "pending"])
};
return dfd.promise()
})
}
return deferred().then(function(data) {
console.log("bar value", data);
}, function(err) {
console.log("bar status", err);
})
}
$("button").click(function() {
$(this).is(":first")
? foo().then(bar, bar)
: bar(res.active === true ? res.active : "pending")
.then(foo, foo).then(bar, bar)
})
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js">
</script>
<button>foo</button>
<button>bar</button>
Not sure I understood the question correctly, but here is my take on it.
Put your function foo into a variable:
var foo_fn = function foo(foo_args){// Your async code goes here}
foo is async and returns something at some point. In your definition of foo, I recommend that you use promises; the concept is designed to manage composition of asynchronous functions in a clean and scalable way. jQuery's implementation of the concept is convenient in a lot of simple use cases, but it suffers from some drawbacks which make it worth switching at some point to one of the many promise libraries that follow the Promises/A specification. For more information, you can refer to:
Cf. https://thewayofcode.wordpress.com/2013/01/22/javascript-promises-and-why-jquery-implementation-is-broken/ and https://blog.domenic.me/youre-missing-the-point-of-promises
so, say foo takes args, and returns a promise which later resolves into some value.
var foo_fn = function foo(foo_args) {
  return foo_fn.promise = new RSVP.Promise(function (resolve, reject) {
    // Your async code goes here
  });
}
Here I use the RSVP promise library but any promise library following the Promises/A specification could do the job.
When bar is called, you can just do:
function bar (bar_args) {
  var foo_promise = foo_fn.promise;
  // if foo was called, whether the computation is in progress or finished,
  // the foo_fn.promise field will be non-empty, as foo returns immediately
  // with a promise anytime it is called
  if (!foo_promise) {
    // foo has not yet been called so call it
    foo_promise = foo_fn(foo_args);
  }
  foo_promise.then(function (foo_result) { /* some async code here */ });
}
NOTE: This solution is quite similar to the one proposed by Roamer-1888. One difference is that in Roamer's proposal, the foo function will always return the same value after performing its asynchronous computation once. I don't know if this is the intended behaviour. In my implementation, foo executes the async computation every time it is called; bar uses the latest computed value, which is stored in the field foo_fn.promise. Older computations are lost, and a computation still in progress is not taken into account.
If you are going to use this pattern often in your code, you can also create a function modelled on the define function in require.js.
You will need:
a registry to hold the dependency functions (foo in your example)
the dependent function (bar in your example) will need to accept the dependency functions' computed values as part of its signature. For example, a hash of the dependencies could be passed as the first parameter, so bar's signature could be: {foo: foo_result}, other_bar_args...
the dependency functions must follow the model of my previous answer, i.e. register their promise as a property on themselves when they execute.
Reminder: you need to name those dependency functions so you can reference them inside their own body, and then add them to the registry.
In the define function body, you wrap the dependent function into another one which:
gets all dependencies from the registry
gets all dependency values, executing the dependencies when necessary (similarly to my previous answer). This means you end up with a list of promises, whose results you then aggregate together (RSVP.hash, for example, with the RSVP promise library; I believe jQuery has a similar function in jQuery.when)
calls the dependent function (bar) with this hash of results as the first argument, the other arguments being the same as the wrapped function's
That wrapped function is the new bar, so when bar is called, it is the wrapped function that actually runs.
A bit lengthy, but it should work; a rough sketch follows below. In any case, if you are going to have complex async code, it could be worthwhile to use a compliant promise library. $.Deferred is best reserved for when you have nothing better at hand, as it makes it harder to track the behaviour of your functions: you need to keep track of all the places where the deferred appears in order to reason about your program.
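Here is what such a helper could look like using RSVP.hash (a rough sketch under the assumptions above; defineWithDeps and the registry are hypothetical names, not part of any library):
var registry = {};  // holds the dependency functions, e.g. registry.foo = foo_fn

// Hypothetical helper: wraps `fn` so that it only runs once every named
// dependency has produced a value, reusing the promise each dependency
// caches on itself (as in the previous answer).
function defineWithDeps(depNames, fn) {
  return function () {
    var args = Array.prototype.slice.call(arguments);
    var hashOfPromises = {};
    depNames.forEach(function (name) {
      var dep = registry[name];
      // reuse the cached promise, or start the computation now
      hashOfPromises[name] = dep.promise || (dep.promise = dep());
    });
    return RSVP.hash(hashOfPromises).then(function (depValues) {
      // depValues is { foo: <result of foo>, ... }
      return fn.apply(null, [depValues].concat(args));
    });
  };
}

// Usage sketch: bar receives {foo: <result of foo>} as its first argument.
registry.foo = foo_fn;
var bar = defineWithDeps(['foo'], function (deps, bar_args) {
  /* some async code using deps.foo and bar_args */
});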
I need to do this: the browser has to make N requests to the server; the requests must not run concurrently, and each request should start only after the previous one has finished.
I could write some function A with a for (i = 0; i < N; i++) loop that calls itself recursively to do this, but that is not beautiful at all. It also leads to callback hell. I want a more beautiful solution.
I found deferred objects. Some say they can help me escape callback hell. I want something like this (the setTimeout calls imitate async requests):
function foo1(some) {
debugger;
setTimeout(function foo1async() {
debugger;
deffered.resolve();
}, 500);
return deffered.promise;
}
function foo2(some) {
debugger;
setTimeout(function foo2async() {
debugger;
deffered.reject();
}, 500);
return deffered.promise;
}
function foo3() {
debugger;
setTimeout(function foo3async() {
debugger;
deffered.resolve();
}, 500);
return deffered.promise;
}
var deffered;
function doChain() {
debugger;
deffered = $q.defer();
var promise = deffered.promise;
promise.then(foo1);
promise.then(foo2);
promise.then(foo3);
promise["finally"](function () {
debugger;
});
deffered.resolve();
}
1. I expect foo1 to be called, then foo1async to be called, resolving the deferred object.
2. Then foo2 must be called, then foo2async.
3. Now I expect that foo3 won't start, because the deferred is rejected in foo2async. After that I expect the function in the finally section to be called.
Actually, I have this:
foo1, foo2 and foo3 are called. Then the function in the finally section is called. Then the foo1async, foo2async and foo3async functions are called.
How can I get what I am expecting?
Actually, I will have something like this:
for(var i = 0; i < N; i++) {
(function (iter) {
promise.then(function () {
foo(iter);
});
})(i);
}
You got a few things wrong here.
First, you use a deferred to convert a callback-based async function into a promise-based one - but each one needs its own deferred.promise and thus its own deferred. Actually, I prefer to use the $q constructor instead:
function fooN(input){
return $q(function(resolve, reject){
setTimeout(function(){
resolve(input + "; some more data");
}, 500);
});
}
(you could use var deferred = $q.defer() as well)
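For reference, the same function written with an explicit deferred would look roughly like this (a sketch equivalent to the $q-constructor version above):
function fooN(input) {
  var deferred = $q.defer();
  setTimeout(function () {
    deferred.resolve(input + "; some more data");
  }, 500);
  return deferred.promise;
}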
fooN now returns a promise, so you don't need to use $q.defer() anymore.
In fact, if the async function already was promise-based, like $timeout or $http, then you wouldn't have needed a deferred at all, for ex:
function fooN(input){
return $timeout(function(){
return input + "; some more data";
}, 500);
}
So, let's assume that foo1, foo2 and foo3 are implemented like fooN - all returning promises.
To make the calls sequential, you would need to chain promises - not attach multiple handlers to the same root promise.
I'll break it down for you:
function doChain(){
var foo1Promise = foo1();
var foo2AfterFoo1Promise = foo1Promise.then(foo2);
var foo3AfterFoo2Promise = foo2AfterFoo1Promise.then(foo3);
var promise = foo3AfterFoo2Promise.then(function(finalData){
return doSomeProcessing(finalData); // if needed
});
promise.catch(function(error){
// "rethrow", if can't handle
return $q.reject({msg: "Some error occurred"});
})
return promise;
}
Or, the same, more concise:
function doChain(p){
return foo1(p)
.then(foo2)
.then(foo3)
.then(function(finalData){
return doSomeProcessing(finalData);
})
.catch(function(error){
return $q.reject({msg: "Some error occurred"});
});
}
A "promised" return value of each function is an input to the next chained function.
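A small sketch of how values flow through such a chain, using the fooN from above: each then callback receives whatever the previous step resolved with.
fooN("first")
  .then(function (resultOfFoo1) {
    // resultOfFoo1 === "first; some more data"
    return fooN(resultOfFoo1);
  })
  .then(function (resultOfFoo2) {
    console.log(resultOfFoo2); // "first; some more data; some more data"
  });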
You can use the $q.all method. For instance:
var promises = [promise1, promise2, ...];
$q.all(promises).then(function () {
// do something
});
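For example, assuming foo1, foo2 and foo3 each return a promise (note that $q.all runs them concurrently):
$q.all([foo1(), foo2(), foo3()]).then(function (results) {
  // results[0], results[1] and results[2] hold the resolved values, in order
  console.log(results);
});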
What happens now is that all foo* promises depend on the single promise; when it gets resolved all are triggered. In ASCII art the dependencies are:
┎ foo1
promise ╁ foo2
┖ foo3
What you want is:
function doChain() {
foo1()
.then(foo2)
.then(foo3)
;
}
No need for the extra promise. No callback hell either!
I have a problem with testing my function that I use with typeahead.js (https://github.com/angular-ui/bootstrap/blob/master/src/typeahead/typeahead.js). I usually know how to resolve a promise in tests, but not with the following function:
$scope.getSuggestion = function ( name, length ) {
return $http.get( 'api/autocomplete/?contains=' + name )
.then( function ( response ) {
return response.data.slice( 0, length || 7 );
});
};
My test looks like this:
describe('Suggestions', function () {
it('should be possible to get suggestions', function () {
$httpBackend.expectGET('api/autocomplete?title__contains=Foo').respond([
{ name: 'Foobar' },
{ name: 'Foobala' },
{ name: 'Foolooloo' }
]);
var suggestions = $scope.getSuggestion( 'Foo' );
$rootScope.$apply();
// Here should be a test.
console.log(suggestions);
})
});
But suggestions is only the promise object Object{then: function (callback, errback) { ... }}.
Where did I mess up!?
suggestions is a promise, not an actual value; you need to call then() to get the value out of it. That is:
suggestions.then(function(data) {
// Here should be a test.
console.log(data);
});
Update:
Try this:
describe('Suggestions', function () {
it('should be possible to get suggestions', function () {
$httpBackend.expectGET('api/autocomplete?title__contains=Foo').respond([
{ name: 'Foobar' },
{ name: 'Foobala' },
{ name: 'Foolooloo' }
]);
var suggestions;
$scope.getSuggestion( 'Foo' ).then(function(data) {
suggestions = data;
});
$httpBackend.flush();
$rootScope.$apply(); // might be optional
// Here should be a test.
console.log(suggestions);
})
});
Regarding the $httpBackend service, in order to 'force' it to respond to the request of the $http service, it is sufficient to call $httpBackend.flush();. I believe the $rootScope.$apply(); is redundant here.
Regarding the return value of the $scope.getSuggestion method, note that it will not return you the data from the server; it will return you a promise object which will be resolved once the request to the server is fulfilled.
Also note that promises can be chained, therefore the result of $http(...).then(....) in your method is still a promise.
Finally, the return statement in the callback passed to then in your controller (return response.data.slice( 0, length || 7 );) will not be of much use; when the promise is resolved and this callback is invoked, you won't be able to get that return value directly.
You can of course provide code in the callback passed to then inside the controller method if there is something you need to do every time you call the getSuggestion method. If, however, the 'client' of that method needs the data the $http service returns, it will have to register its own callback to retrieve them.
So, to actually get the response data in your test (the 'client' of your controller method), you need to register the respective callbacks inside the test.
You can use the standard then method of the promise "interface" which expects you to pass two callbacks; the first will be called if the request succeeds (in your case if the $httpBackend is 'trained' to respond with a status in the 2XX range), the second one in case of an error. Alternatively you could use the AngularJS specific success and error methods since the $http service returns an HttpPromise.
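For example, the assertion part of the test could look roughly like this (a sketch; the expected values mirror the mocked response above, and expect/toBe are Jasmine's):
$scope.getSuggestion('Foo').then(function (data) {
  expect(data.length).toBe(3);
  expect(data[0].name).toBe('Foobar');
}, function (error) {
  // would only run if $httpBackend responded with a non-2xx status
});

$httpBackend.flush(); // makes the mocked backend respond so the callbacks fire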
You can check out this approach here: http://jsfiddle.net/yianisn/rgk97/
I have a handler (callback), an object to handle, and four functions which collect data into the object. I wish to call the four data retrievers asynchronously and, when all four have completed, handle the resulting object (something similar to the following):
var data = {};
function handle (jsObj) {}
// data retrieving
function getColorData () {}
function getSizeData () {}
function getWeightData () {}
function getExtraData () {}
data.color = getColorData();
data.size = getSizeData();
data.weight = getWeightData();
data.extra = getExtraData();
handle( data );
Of course, this code will not work properly. And if I chain data retrieving functions, they will be called one after another, right?
All four functions should be called asynchronously, because they take too long to be called one by one.
Updated:
Thanks to everybody for your suggestions! I preferred $.Deferred(), but I found it slightly difficult to make it work the way I need. What I need is to asynchronously build a view which requires four kinds of data (extraData, colorData, sizeData & weightData), and I have three objects: App, Utils & Tools.
Just a small description: the view is created by calling App.getStuff with App.handleStuff passed as a callback. The callback in the body of App.getStuff is called only after $.when(App.getExtraData(), App.getColorData(), App.getSizeData(), App.getWeightData()) resolves. Before that, Utils.asyncRequest is called with Tools.parseResponse passed as a callback.
So, now the question is: should I create a deferred object inside each of the four App.get*Data() methods and return deferred.promise() from each of them?
And should I call deferred.resolve() in the last function in the chain (Tools.parseResponse for App.getExtraData in my example)?
var view,
App,
Utils = {},
Tools = {};
// Utils
Utils.asyncRequest = function (path, callback) {
var data,
parseResponse = callback;
// do something with 'data'
parseResponse( data );
};
// Tools
Tools.parseResponse = function (data) {
var output = {};
// do something to make 'output' from 'data'
/* So, should the deferred.resolve() be done here? */
deferred.resolve(output);
/// OR deferred.resolve();
/// OR return output;
};
// App
App = {
// Only one method really works in my example
getExtraData : function () {
var deferred = new jQuery.Deferred();
Utils.asyncRequest("/dir/data.txt", Tools.parseResponse);
return deferred.promise();
},
// Others do nothing
getColorData : function () { /* ... */ },
getSizeData : function () { /* ... */ },
getWeightData : function () { /* ... */ }
};
App.getStuff = function (callback) {
$.when(
App.getExtraData(),
App.getColorData(),
App.getSizeData(),
App.getWeightData()
)
.then(function (extraData, colorData, sizeData, weightData) {
var context,
handleStuff = callback;
// do something to make all kinds of data become a single object
handleStuff( context );
});
};
App.handleStuff = function (stuff) { /* ... */ };
/// RUN
view = App.getStuff( App.handleStuff );
I did not expect the code in my example above to work, it is for illustrative purposes.
I've been trying to solve this for quite a long time and it still gives no result. The documentation for jQuery.Deferred() and discussions around it, unfortunately, did not help me. So, I would be very glad and grateful for any help or advice.
Conceptually, you would use a counter that gets incremented as each asynchronous call completes. The main caller should proceed after the counter has been incremented by all the asynchronous calls.
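A bare-bones sketch of that idea (the get*DataAsync names are placeholders for callback-taking versions of the retrievers; handle and data come from the question):
var data = {};
var remaining = 4;

function done() {
  remaining -= 1;
  if (remaining === 0) {
    handle(data); // every async call has reported back
  }
}

// each retriever calls done() from its own async callback
getColorDataAsync(function (color)   { data.color = color;   done(); });
getSizeDataAsync(function (size)     { data.size = size;     done(); });
getWeightDataAsync(function (weight) { data.weight = weight; done(); });
getExtraDataAsync(function (extra)   { data.extra = extra;   done(); });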
I think what you're looking for are Promises / Deferreds.
With promises you can write something like:
when(getColorData(), getSizeData(), getWeightData(), getExtraData()).then(
function (colorData, sizeData, weightData, extraData) {
handle(/*..*/);
}
)
The get*Data() functions will return a promise that they fulfill when their asynchronous call is complete.
Ex:
function getData() {
var promise = new Promise();
doAjax("getData", { "foo": "bar" }, function (result) {
promise.resolve(result);
});
return promise;
}
The when simply counts the number of arguments; when all of its promises are resolved, it will call then with the results from the promises.
jQuery has an OK implementation: http://api.jquery.com/jQuery.when/
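With jQuery specifically, that could look roughly like this, assuming each get*Data function from the question returns a promise resolved with its plain data value:
$.when(getColorData(), getSizeData(), getWeightData(), getExtraData())
  .then(function (colorData, sizeData, weightData, extraData) {
    handle({
      color: colorData,
      size: sizeData,
      weight: weightData,
      extra: extraData
    });
  });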
What I could suggest for this scenario would be something like this.
Write a function like this:
var completed = 0;
var checkHandler = function() {
if(completed == 4) {
handle(data);
}
}
where completed counts the callbacks received so far and 4 is the number of positive callbacks you must receive.
As soon as each function receives its callback, you increment the completed counter and invoke the checkHandler function, and you're done!
For example:
function getColorData() {
$.get('ajax/test.html', function(data) {
completed++;
checkHandler();
});
}
I want to say: when this function close() is finished, run this function init(). But it's not working for me.
$.when(close(toolTip)).done(init(toolTip, anchor));
I am not using the $.when for anything ajax related, just trying to make sure close() is finished before I call init(), and no I can't stick init() at the end of close(). Any ideas?
OK, here is close():
var close = function (toolTip) {
toolTip.fadeOut('fast', function (e) {
if (typeof e !== 'undefined') {
//Re-set values applied when initted
var toolTipBd = toolTip.find('.bd:first');
toolTip.css('width', '');
toolTipBd.css('max-height', '');
toolTip.css('max-height', '');
toolTipBd.css('overflowY', '');
}
});
};
Nowhere in close() can it call init().
Your close() implementation should be like this:
var close = function (toolTip) {
var d = $.Deferred();
toolTip.fadeOut('fast', function (e) {
if (typeof e !== 'undefined') {
//Re-set values applied when initted
var toolTipBd = toolTip.find('.bd:first');
toolTip.css('width', '');
toolTipBd.css('max-height', '');
toolTip.css('max-height', '');
toolTipBd.css('overflowY', '');
}
d.resolve();
});
return d.promise();
};
$.when works with Deferreds. It returns a new Deferred which will resolve when all the Deferreds you provided resolve.
As close() doesn't seem to be returning a Promise, when will resolve straight away (per the docs for when()).
However, if close() is synchronous, you don't need when() at all. If it is asynchronous, you need to return a Promise and resolve it when your animation or whatever has completed:
function close(what) {
var promise = jQuery.Deferred();
what.fadeOut('slow', function () {
promise.resolve();
});
return promise.promise();
}
... but you still don't need $.when as only 1 promise is involved. $.when is only useful when multiple promises are at play.
close(toolTip).done(function () {
init(toolTip, anchor);
});
Note also that done(init(tooltip, anchor)) will call init immediately, and pass the result of that function invocation to done(); instead, you need to pass a function to done. As init needs parameters, we've fixed this by introducing an anonymous function. If init didn't need any parameters, it'd have been as simple as:
close(toolTip).done(init);
Simply return toolTip:
return toolTip.fadeOut(...
Using the callback to resolve a deferred object can give odd results if more than one element is selected for whatever reason.
This works because jQuery objects have a .promise method that, when called, returns a promise object that resolves when all active animations are completed. $.when calls .promise on all passed-in arguments.
You'll also need to call init differently, for example,
$.when(close(toolTip)).done(function(){
init(toolTip, anchor);
});
And, as pointed out by others, you could then shorten that to
close(toolTip).promise().done(function(){
init(toolTip, anchor);
});