I have a service where I exchange data. In this service I keep a promise created by $interval, nothing fancy:
$rootScope.recursivePosition = null;
$rootScope.trackMe = function(){
    var extensionName = $state.current.extensionName;
    if ($rootScope.tracking === false) {
        $rootScope.tracking = true;
        $rootScope.recursivePosition = $interval(function(){
            someService.getAllPositions($rootScope.content[extensionName]);
        }, 2000);
    } else {
        $interval.cancel($rootScope.recursivePosition);
        console.log("recursivePosition cancel");
        console.dir($rootScope.recursivePosition);
        $rootScope.tracking = false;
    }
};
The thing is, inside that service I have another promise (from $cordovaGeolocation). When I cancel the first promise ($rootScope.recursivePosition), it still keeps working for a while, roughly 4 seconds more. Can I control this behavior?
A promise inside a promise can't be cancelled. But a promise derived from that promise can be cancelled.
function getAllPositions(x) {
    var defer = $q.defer();
    var derivedPromise = defer.promise;
    derivedPromise.cancel = function () {
        defer.reject('cancelled');
    };
    $cordovaGeolocation(x)
        .then(function onSuccess(value) {
            defer.resolve(value);
        }).catch(function onReject(error) {
            defer.reject(error);
        });
    return derivedPromise;
}
The above example returns a promise derived from the $cordovaGeolocation promise. Attached to it is a method named cancel which rejects the promise if called before $cordovaGeolocation resolves.
Asynchronous operations started by $cordovaGeolocation can't be stopped but operations chained from its promise can be cancelled.
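A hedged sketch of how this could be wired into the interval from the question (names such as someService, extensionName and $rootScope.content are taken from the question): keep a reference to the derived promise and cancel it together with the interval.

// Hypothetical wiring: remember the latest derived promise so it can be
// cancelled when tracking is switched off
var positionPromise = null;

$rootScope.recursivePosition = $interval(function () {
    positionPromise = someService.getAllPositions($rootScope.content[extensionName]);
}, 2000);

// later, in the else branch of trackMe():
$interval.cancel($rootScope.recursivePosition);
if (positionPromise) {
    positionPromise.cancel(); // rejects the derived promise with 'cancelled'
}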
I'm using Angular 1.5.8. The views in my app require different combinations of the same 3 ajax requests. Some views require data from all three, others require data from two, or even one single endpoint.
I'm working on a function that will manage the retrieval of this data, requiring the app to call each endpoint only once. I want the ajax requests to be made as needed, but only when needed. Currently I've created a function which works, but it seems like it could use improvement.
The following function is contained within the $rootScope. It uses the fetchData() function to cycle through the get requests as requested. When data is retrieved, it is stored in the global variable $rootScope.appData and then fetchData() is called again. When all data is retrieved, the deferred promise is resolved and the data is returned to the controller.
$rootScope.appData = {};

$rootScope.loadAppData = function(fetch) {
    var deferred = $q.defer();

    function getUser() {
        $http
            .get('https://example.com/api/getUser')
            .success(function(result){
                $rootScope.appData.currentUser = result;
                fetchData();
            });
    }

    function getPricing() {
        $http
            .get('https://example.com/api/getPricing')
            .success(function(result) {
                $rootScope.appData.pricing = result;
                fetchData();
            });
    }

    function getBilling() {
        $http
            .get('https://example.com/api/getBilling')
            .success(function(result) {
                $rootScope.appData.billing = result;
                fetchData();
            });
    }

    function fetchData() {
        if (fetch.user && !$rootScope.appData.currentUser) {
            getUser();
        } else if (fetch.pricing && !$rootScope.appData.pricing) {
            getPricing();
        } else if (fetch.billing && !$rootScope.appData.billing) {
            getBilling();
        } else {
            deferred.resolve($rootScope.appData);
        }
    }

    if ($rootScope.appData.currentUser && $rootScope.appData.pricing && $rootScope.appData.billing) {
        deferred.resolve($rootScope.appData);
    } else {
        fetchData();
    }

    return deferred.promise;
};
An object, fetch, is passed as an argument; it indicates which ajax requests to call. An example call to $rootScope.loadAppData() where only user and pricing data are requested would look like this:
$rootScope.loadAppData({user: true, pricing: true}).then(function(data){
    //execute view logic.
});
I'm wondering:
Should the chaining of these functions be done differently? Is the fetchData() function sufficient, or is this an odd way to execute this functionality?
Is there a way to call all needed Ajax requests simultaneously, but wait for all required calls to complete before resolving the promise?
Is it unusual to store data like this in the $rootScope?
I'm aware that this function is not currently handling errors properly. That is functionality I will add before using this snippet, but it isn't relevant to my question.
Instead of using the .success method, use the .then method and return data to its success handler:
function getUserPromise() {
    var promise = $http
        .get('https://example.com/api/getUser')
        .then(function successHandler(result) {
            //return data for chaining
            return result.data;
        });
    return promise;
}
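For completeness, a small hedged usage sketch: whatever the success handler above returns is what the next .then() in the chain receives.

// Hedged usage sketch: the user data returned above flows to this handler
getUserPromise().then(function (user) {
    console.log("current user", user);
});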
Use a service instead of $rootScope:
app.service("myService", function($q, $http) {
this.loadAppData = function(fetchOptions) {
//Create first promise
var promise = $q.when({});
//Chain from promise
var p2 = promise.then(function(appData) {
if (!fetchOptions.user) {
return appData;
} else {
var derivedPromise = getUserPromise()
.then(function(user) {
appData.user = user;
//return data for chaining
return appData;
});
return derivedPromise;
);
});
//chain from p2
var p3 = p2.then(function(appData) {
if (!fetchOptions.pricing) {
return appData;
} else {
var derivedPromise = getPricingPromise()
.then(function(pricing) {
appData.pricing = pricing;
//return data for chaining
return appData;
});
return derivedPromise;
);
});
//chain from p3
var p4 = p3.then(function(appData) {
if (!fetchOptions.billing) {
return appData;
} else {
var derivedPromise = getBillingPromise()
.then(function(user) {
appData.billing = billing;
//return data for chaining
return appData;
});
return derivedPromise;
);
});
//return final promise
return p4;
}
});
The above example creates a promise for an empty object. It then chains three operations. Each operation checks whether a fetch is necessary. If it is, the fetch is executed and the result is attached to the appData object; if not, the appData object is simply passed to the next operation in the chain.
USAGE:
myService.loadAppData({user: true, pricing: true})
    .then(function(appData){
        //execute view logic.
    }).catch(function rejectHandler(errorResponse) {
        console.log(errorResponse);
        throw errorResponse;
    });
If any of the fetch operations fail, subsequent operations in the chain will be skipped and the final reject handler will be called.
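If you would rather let an optional piece of data fail without rejecting the whole chain, a hedged variation of the pricing step above (using the same getPricingPromise) could absorb the error locally:

//chain from p2, but degrade gracefully if pricing cannot be loaded
var p3 = p2.then(function(appData) {
    if (!fetchOptions.pricing) {
        return appData;
    }
    return getPricingPromise()
        .then(function(pricing) {
            appData.pricing = pricing;
            return appData;
        })
        .catch(function() {
            //mark pricing as unavailable and keep the chain alive
            appData.pricing = null;
            return appData;
        });
});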
Because calling the .then method of a promise returns a new derived promise, it is easily possible to create a chain of promises. It is possible to create chains of any length and since a promise can be resolved with another promise (which will defer its resolution further), it is possible to pause/defer resolution of the promises at any point in the chain. This makes it possible to implement powerful APIs. -- AngularJS $q Service API Reference - Chaining Promises
Found a good way to answer question 2 in the original post. Using $q.all() allows the promises to execute simultaneously, resolving once they all complete, or failing as soon as one of them fails. I've moved this logic into a service, thanks to @georgeawg. Here's my re-write putting this code into a service and running all the calls at the same time:
services.factory('appData', function($http, $q) {
    var appData = {};
    var coreData = {};

    appData.loadAppData = function(fetch) {
        var deferred = $q.defer();
        var getUser = $q.defer();
        var getPricing = $q.defer();
        var getBilling = $q.defer();

        if (!fetch.user || coreData.currentUser) {
            getUser.resolve();
        } else {
            $http
                .get('https://example.com/api/getUser')
                .success(function(result){
                    coreData.currentUser = result;
                    getUser.resolve();
                }).error(function(reason) {
                    getUser.reject(reason);
                });
        }

        if (!fetch.billing || coreData.billing) {
            getBilling.resolve();
        } else {
            $http
                .get('https://example.com/api/getBilling')
                .success(function(result) {
                    coreData.billing = result;
                    getBilling.resolve();
                }).error(function(reason) {
                    getBilling.reject(reason);
                });
        }

        if (!fetch.pricing || coreData.pricing) {
            getPricing.resolve();
        } else {
            $http
                .get('https://example.com/api/getPricing')
                .success(function(result) {
                    coreData.pricing = result;
                    getPricing.resolve();
                }).error(function(reason) {
                    getPricing.reject(reason);
                });
        }

        $q.all([getPricing.promise, getUser.promise, getBilling.promise]).then(function(result) {
            deferred.resolve(coreData);
        }, function(reason){
            deferred.reject(reason);
        });

        return deferred.promise;
    };

    return appData;
});
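A hedged usage sketch from a controller (the controller name is made up; the fetch flags are the same as in the original example):

// Hypothetical controller registered on the same `services` module as above
services.controller('PricingCtrl', ['appData', function (appData) {
    appData.loadAppData({user: true, pricing: true}).then(function (data) {
        // execute view logic with data.currentUser and data.pricing
        console.log(data);
    }, function (reason) {
        console.log('failed to load app data', reason);
    });
}]);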
Problem 1: only one API request is allowed at a given time, so the real network requests are queued while one is still in flight. The app can call the API layer at any time, expecting a promise in return. When the API call is queued, the promise for the network request will only be created at some point in the future, so what should be returned to the app? Here is how it can be solved with a deferred "proxy" promise:
var queue = [];

function callAPI(params) {
    if (API_available) {
        API_available = false;
        return doRealNetRequest(params).then(function(data){
            API_available = true;
            continueRequests();
            return data;
        });
    } else {
        var deferred = Promise.defer();
        function makeRequest() {
            API_available = false;
            doRealNetRequest(params).then(function(data) {
                deferred.resolve(data);
                API_available = true;
                continueRequests();
            }, deferred.reject);
        }
        queue.push(makeRequest);
        return deferred.promise;
    }
}

function continueRequests() {
    if (queue.length) {
        var makeRequest = queue.shift();
        makeRequest();
    }
}
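To illustrate the intent, a hedged usage sketch (assuming API_available starts out true and doRealNetRequest is the real network call): only the first call goes out immediately, the rest drain one at a time.

callAPI({id: 1}).then(function (data) { console.log('first', data); });
callAPI({id: 2}).then(function (data) { console.log('second, after first finished', data); });
callAPI({id: 3}).then(function (data) { console.log('third, after second finished', data); });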
Problem 2: some API calls are debounced so that the data to be sent is accumulated over time and then is sent in a batch when a timeout is reached. The app calling the API is expecting a promise in return.
var queue = null;
var timeout = 0;

function callAPI2(data) {
    if (!queue) {
        queue = {data: [], deferred: Promise.defer()};
    }
    queue.data.push(data);
    clearTimeout(timeout);
    timeout = setTimeout(processData, 10);
    return queue.deferred.promise;
}

function processData() {
    callAPI(queue.data).then(queue.deferred.resolve, queue.deferred.reject);
    queue = null;
}
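Again a hedged usage sketch: calls that land within the same 10 ms window are merged into one batched callAPI request, and every caller gets a promise for that batch's result.

callAPI2({id: 1}).then(function (result) { console.log('caller 1 sees', result); });
callAPI2({id: 2}).then(function (result) { console.log('caller 2 sees', result); });
// callAPI eventually receives [{id: 1}, {id: 2}] in a single request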
Since deferred is considered an anti-pattern (see also When would someone need to create a deferred?), the question is: is it possible to achieve the same things without a deferred (or equivalent hacks like new Promise(function (resolve, reject) {outerVar = [resolve, reject]});), using the standard Promise API?
Promises for promises that are yet to be created
…are easy to build: chain a then invocation, whose callback creates the promise, onto a promise that represents the availability to create it in the future.
If you are making a promise for a promise, you should never use the deferred pattern. You should use deferreds or the Promise constructor if and only if there is something asynchronous that you want to wait for, and it does not already involve promises. In all other cases, you should compose multiple promises.
When you say
When the API call is queued, the promise for the network request would be created at some point in the future
then you should not create a deferred that you later resolve with the promise once it is created (or worse, resolve it with the promise's result once the promise settles); rather, you should get a promise for the point in the future at which the network request will be made. Basically you're going to write
return waitForEndOfQueue().then(makeNetworkRequest);
and of course we're going to need to mutate the queue accordingly.
var queue_ready = Promise.resolve(true);

function callAPI(params) {
    var result = queue_ready.then(function(API_available) {
        return doRealNetRequest(params);
    });
    queue_ready = result.then(function() {
        return true;
    });
    return result;
}
This has the additional benefit that you have to deal explicitly with errors in the queue. Here, every call returns a rejected promise once one request has failed (you'll probably want to change that); in your original code, the queue just got stuck (and you probably didn't notice).
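One hedged way to change that: let the queue-advancing promise swallow the failure while the promise handed back to the caller still rejects.

var queue_ready = Promise.resolve(true);

function callAPI(params) {
    var result = queue_ready.then(function() {
        return doRealNetRequest(params);
    });
    queue_ready = result.then(function() {
        return true;
    }, function() {
        // keep the queue moving even though this request failed
        return true;
    });
    return result;
}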
The second case is a bit more complicated, as it does involve a setTimeout call. This is an asynchronous primitive that we need to manually build a promise for - but only for the timeout, and nothing else. Again, we're going to get a promise for the timeout, and then simply chain our API call to that to get the promise that we want to return.
function TimeoutQueue(timeout) {
    var data = [], timer = 0;
    this.promise = new Promise(resolve => {
        this.renew = () => {
            clearTimeout(timer);
            timer = setTimeout(resolve, timeout);
        };
    }).then(() => {
        this.constructor(timeout); // re-initialise
        return data;
    });
    this.add = (datum) => {
        data.push(datum);
        this.renew();
        return this.promise;
    };
}

var queue = new TimeoutQueue(10);

function callAPI2(data) {
    return queue.add(data).then(callAPI);
}
You can see here a) how the debouncing logic is completely factored out of callAPI2 (which might not have been necessary but makes a nice point) and b) how the promise constructor only concerns itself with the timeout and nothing else. It doesn't even need to "leak" the resolve function like a deferred would; the only thing it makes available to the outside is the renew function, which allows extending the timer.
When the API call is queued, the promise for the network request would
be created at some point in the future - what to return to the app?
Your first problem can be solved with promise chaining. You don't want to execute a given request until all prior requests have finished and you want to execute them serially in order. This is exactly the design pattern for promise chaining. You can solve that one like this:
var callAPI = (function() {
    var p = Promise.resolve();
    return function(params) {
        // construct a promise that chains to prior callAPI promises
        var returnP = p.then(function() {
            return doRealNetRequest(params);
        });
        // make sure the promise we are chaining to does not abort chaining on errors
        p = returnP.then(null, function(err) {
            // handle rejection locally for purposes of continuing chaining
            return;
        });
        // return the new promise
        return returnP;
    };
})();
In this solution, a new promise is actually created immediately with .then(), so you can return that promise right away; there is no need to create a promise in the future. The actual call to doRealNetRequest() is chained to this returned .then() promise by returning its value in the .then() handler. This works because the callback we provide to .then() is not called until some time in the future, when the prior promises in the chain have resolved, giving us an automatic trigger to execute the next one in the chain when the prior one finishes.
This implementation assumes that you want queued API calls to continue even after one returns an error. The extra few lines of code around the handle rejection comment are there to make sure the chain continues even where a prior promise rejects. Any rejection is returned back to the caller as expected.
Here's a solution to your second one (what you call debounce).
the question is - is it possible to achieve the same things without a
deferred (or equivalent hacks like new Promise(function (resolve,
reject) {outerVar = [resolve, reject]});), using the standard Promise
API?
As far as I know, the debouncer type of problem requires a little bit of a hack to expose the ability to trigger the resolve/reject callbacks somehow from outside the promise executor. It can be done a little cleaner than you propose by exposing a single function that is within the promise executor function rather than directly exposing the resolve and reject handlers.
This solution creates a closure to store private state that can be used to manage things from one call to callAPI2() to the next.
To allow code at an indeterminate time in the future to trigger the final resolution, this creates a local function within the promise executor function (which has access to the resolve and reject functions) and then shares that to the higher (but still private) scope so it can be called from outside the promise executor function, but not from outside of callAPI2.
var callAPI2 = (function() {
    var p, timer, trigger, queue = [];
    return function(data) {
        if (!p) {
            p = new Promise(function(resolve) {
                // share completion function to a higher scope
                trigger = function() {
                    resolve(queue);
                    // reinitialize for future calls
                    p = null;
                    queue = [];
                };
            }).then(callAPI);
        }
        // save data and reset timer
        queue.push(data);
        clearTimeout(timer);
        timer = setTimeout(trigger, 10);
        return p;
    };
})();
You can create a queue that resolves promises in the order they are placed in the queue:
window.onload = function() {
    (function(window) {
        window.dfd = {};
        that = window.dfd;
        that.queue = queue;

        function queue(message, speed, callback, done) {
            if (!this.hasOwnProperty("_queue")) {
                this._queue = [];
                this.done = [];
                this.res = [];
                this.complete = false;
                this.count = -1;
            };
            q = this._queue,
            msgs = this.res;
            var arr = Array.prototype.concat.apply([], arguments);
            q.push(arr);
            msgs.push(message);
            var fn = function(m, s, cb, d) {
                var j = this;
                if (cb) {
                    j.callback = cb;
                }
                if (d) {
                    j.done.push([d, j._queue.length])
                }
                // alternatively `Promise.resolve(j)`, `j` : `dfd` object
                // `Promise` constructor not necessary here,
                // included to demonstrate asynchronous processing or
                // returned results
                return new Promise(function(resolve, reject) {
                    // do stuff
                    setTimeout(function() {
                        div.innerHTML += m + "<br>";
                        resolve(j)
                    }, s || 0)
                })
                // call `cb` here, interrupting queue
                .then(cb ? j.callback.bind(j, j._queue.length) : j)
                .then(function(el) {
                    console.log("queue.length:", q.length, "complete:", el.complete);
                    if (q.length > 1) {
                        q.splice(0, 1);
                        fn.apply(el, q[0]);
                        return el
                    } else {
                        el._queue = [];
                        console.log("queue.length:", el._queue.length
                            , "complete:", (el.complete = !el._queue.length));
                        always(promise(el), ["complete", msgs])
                    };
                    return el
                });
                return j
            }
            , promise = function(t) {
                ++t.count;
                var len = t._queue.length,
                    pending = len + " pending";
                return Promise.resolve(
                    len === 1
                        ? fn.apply(t, t._queue[0]) && pending
                        : !(t.complete = len === 0) ? pending : t
                )
            }
            , always = function(elem, args) {
                if (args[0] === "start") {
                    console.log(elem, args[0]);
                } else {
                    elem.then(function(_completeQueue) {
                        console.log(_completeQueue, args);
                        // call any `done` callbacks passed as parameter to `.queue()`
                        Promise.all(_completeQueue.done.map(function(d) {
                            return d[0].call(_completeQueue, d[1])
                        }))
                        .then(function() {
                            console.log(JSON.stringify(_completeQueue.res, null, 2))
                        })
                    })
                }
            };
            always(promise(this), ["start", message, q.length]);
            return window
        };
    }(window));

    window
        .dfd.queue("chain", 1000)
        .dfd.queue("a", 1000)
        .dfd.queue("b", 2000)
        .dfd.queue("c", 2000, function callback(n) {
            console.log("callback at queue index ", n, this);
            return this
        }, function done(n) {
            console.log("all done callback attached at queue index " + n)
        })
        .dfd.queue("do", 2000)
        .dfd.queue("other", 2000)
        .dfd.queue("stuff", 2000);

    for (var i = 0; i < 10; i++) {
        window.dfd.queue(i, 1000)
    };

    window.dfd.queue.apply(window.dfd, ["test 1", 5000]);
    window.dfd.queue(["test 2", 1000]);

    var div = document.getElementsByTagName("div")[0];
    var input = document.querySelector("input");
    var button = document.querySelector("button");

    button.onclick = function() {
        window.dfd.queue(input.value, 0);
        input.value = "";
    }
}
<input type="text" />
<button>add message</button>
<br>
<div></div>
In the code below, I want sequential execution of the method saveBulkUploadSinglePacket in the while loop, meaning the next packet should be processed only after the current packet completes. How can I achieve that?
var saveBulkUploadSinglePacket = function(){
    while (packetCount <= bulkUploadPackets.length){
        $.when(saveBulkUploadSinglePacket(modelToSave)).done(function(arguments){
            saveBulkUploadPackets.push(arguments);
            packetCount++;
        });
    }
    return saveBulkUploadPackets;
}

var saveBulkUploadSinglePacket = function(modelToSave){
    var defer = $.Deferred();
    $.when(new SaveBulkUpload().save(modelToSave)).done(function(arguments){
        defer.resolve(arguments);
    }).fail(function(errorObj){
        defer.reject(errorObj);
    });
    return defer.promise();
}
The standard way to say "perform x when promise is done" is through promise.then(). Keep track of the current promise in a var outside the loop and attach each call to the previous promise with a .then():
var saveBulkUploadSinglePacket = function(){
    var lastPromise = $.when();
    while (packetCount <= bulkUploadPackets.length){
        lastPromise = (
            lastPromise
                .then(function(){
                    // make sure saveBulkUploadSinglePacket returns a promise
                    return saveBulkUploadSinglePacket(modelToSave);
                })
                .then(function(result){
                    saveBulkUploadPackets.push(result);
                })
        );
        packetCount++;
    }
    // resolve the final promise with the desired return value
    return lastPromise.then(function(){
        return saveBulkUploadPackets;
    });
}
At the end of the function I chain one more .then() that resolves to the desired return value, and return that promise. This way you can call saveBulkUploadSinglePacket().then(...) to wait for all the saves to complete and handle the result.
To save your packets sequentially, use Array.prototype.reduce() to build a .then() chain very concisely, as follows:
var saveBulkUploadPackets = function(packets) {
    return packets.reduce(function(promise, packet) {
        return promise.then(function(results) {
            return (new SaveBulkUpload()).save(packet).then(function(res) {
                return results.concat(res);
            });
        });
    }, $.when([]));
}
And call it like this:
saveBulkUploadPackets(bulkUploadPackets).then(function(results) {
    // All done.
    // use/log the `results` array here.
});
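Note that with this chain the first failed save rejects the returned promise and later packets are not attempted; a hedged sketch of handling that case:

saveBulkUploadPackets(bulkUploadPackets)
    .then(function(results) {
        console.log("saved", results.length, "packets");
    })
    .fail(function(errorObj) {
        // the packet that failed stopped the sequence here
        console.log("upload stopped at first failure", errorObj);
    });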
I need to wait until all promises are resolved or rejected, and only then execute a callback. It seems that the current implementation of Q triggers the callback as soon as one promise is rejected. Here is the test:
var ps = [];
var d1 = $q.defer();
var d2 = $q.defer();
ps.push(d1.promise, d2.promise);

setTimeout(function () {
    d1.reject()
}, 2000)

setTimeout(function () {
    d2.resolve()
}, 5000)

$q.all(ps).then(function () {
    // is not triggered
}).catch(function () {
    // triggered after 2000 ms, I need this triggered after 5000 ms
})
How can I achieve what I want?
You can use allSettled() instead of all() if you want to know when all the promises have finished (either fulfilled or rejected).
If you then want to know which promises were rejected, you will have to iterate through the returned results and check which ones were rejected. The Q documentation has an example usage of allSettled(). Note that AngularJS's $q does not provide allSettled() out of the box, so you either need Q itself or a small helper like the sketch after the quoted example.
Copied from the Q documentation:
Q.allSettled(promises)
    .then(function (results) {
        results.forEach(function (result) {
            if (result.state === "fulfilled") {
                var value = result.value;
            } else {
                var reason = result.reason;
            }
        });
    });
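A minimal sketch of such a helper built on $q.all, assuming plain AngularJS $q; each promise is mapped to one that always fulfills with a record shaped like the results above:

// Hedged sketch: an allSettled-style helper for $q
function allSettled(promises) {
    return $q.all(promises.map(function (p) {
        return $q.when(p).then(function (value) {
            return { state: "fulfilled", value: value };
        }, function (reason) {
            return { state: "rejected", reason: reason };
        });
    }));
}

// usage with the deferreds from the question: resolves after 5000 ms
allSettled(ps).then(function (results) {
    // inspect results[i].state here
});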
I have an app using IndexedDB. Originally I made prolific use of callbacks, and decided to clean it up using AngularJS's $q promises.
FAIL.
http://jsfiddle.net/ed4becky/bumm337e/
angular.module("IDBTest", []);
angular.module("IDBTest")
.service("initSvc", ['$q', function ($q) {
var svc = this;
svc.dbVersion = 1;
svc.open = open;
svc.deleteDB = deleteDB;
var idb = window.indexedDB;
function deleteDB() {
return $q(function (resolve, reject) {
idb.webkitGetDatabaseNames().onsuccess =
function (sender, args) {
if (sender.target.result
&& sender.target.result.length > 0) {
var db = sender.target.result[0];
console.log("deleting " + db);
var request = idb.deleteDatabase(db);
request.onsuccess = function () {
console.log('database ' + db + ' deleted.');
resolve();
};
request.onerror = function () {
reject();
};
} else {
console.log("Nothing to delete");
resolve();
};
};
});
}
function open(dbName) {
return $q(function (resolve, reject) {
var request = idb.open(dbName, svc.dbVersion);
request.onupgradeneeded = function (e) {
var db = e.target.result;
console.log("creating new " + db.name);
e.target.transaction.onerror = function (e) {
console.log(e);
};
db.createObjectStore("table1", {
keyPath: "id"
});
db.createObjectStore("table2", {
keyPath: "id"
});
db.createObjectStore("table3", {
keyPath: "id"
});
};
request.onsuccess = function (e) {
console.log('database ' + dbName + ' open.');
svc.db = e.target.result;
resolve();
};
request.onerror = function () {
reject();
};
});
}
}]);
angular.module('IDBTest')
.factory('$exceptionHandler', ['$log', function ($log) {
return function (exception, cause) {
throw exception;
};
}]);
angular.module('IDBTest')
.run(['initSvc', function (initSvc) {
initSvc.deleteDB()
.then(initSvc.open('testDatabase'))
.then(function () {
console.log(initSvc.db.name + ' initialized');
});
}]);
This fiddle shows my expectation that
Any databases created are deleted.
then
A database is opened, triggering an onupgradeneeded
then
The database is referenced
Unfortunately the then statements seem to get called BEFORE the promises are resolved in the onsuccess methods of the IDB calls.
To recreate, run the jsfiddle with the console open. You may have to run it a couple of times to get the exception, but it fails most times, because the last then clause is called before the onsuccess of the database open is called.
Any ideas?
I believe the issue is with the promise chain. .then() expects callback functions, but in your .run block initSvc.open('testDatabase') is invoked immediately and its returned promise (rather than a function) is passed to .then(), so the chain does not wait for the delete to finish before opening, nor for the open to finish before the final callback. When the callback you pass to .then() returns a promise, the chain does wait and resolves with that promise's value, which is what the documentation below describes.
From Angular Promise 'Then' Method Documentation:
then(successCallback, errorCallback, notifyCallback) – regardless of
when the promise was or will be resolved or rejected, then calls one
of the success or error callbacks asynchronously as soon as the result
is available. The callbacks are called with a single argument: the
result or rejection reason. Additionally, the notify callback may be
called zero or more times to provide a progress indication, before the
promise is resolved or rejected.
This method returns a new promise which is resolved or rejected via
the return value of the successCallback, errorCallback (unless that
value is a promise, in which case it is resolved with the value which
is resolved in that promise using promise chaining). It also notifies
via the return value of the notifyCallback method. The promise cannot
be resolved or rejected from the notifyCallback method.
I'm able to get it to initialize with this:
angular.module('IDBTest')
    .run(['initSvc', function (initSvc) {
        initSvc.deleteDB()
            .then(function() {
                return initSvc.open('testDatabase');
            })
            .then(function () {
                console.log(initSvc.db.name + ' initialized');
            });
    }]);