I have an AngularJS function in which I am making 2 async calls. The 2 calls are not dependent on each other, but the function should only return when both calls have finished and the results are stored in the return variable.
I tried different solutions and ended up using the one shown below, but it is very slow because it waits for the first call to finish before starting the second.
function myAngularfunction(a, b) {
    var defer = $q.defer();
    var test = {
        a: "",
        b: ""
    };
    msGraph.getClient()
        .then((client) => {
            // First Async call
            client
                .api("https://graph.microsoft.com/v1.0/")
                .get()
                .then((groupResponse) => {
                    var result = groupResponse;
                    test.a = result.displayName;
                    msGraph.getClient()
                        .then((client) => {
                            // Second Async call
                            client
                                .api("https://graph.microsoft.com/v1.0/teams/")
                                .get()
                                .then((groupResponse) => {
                                    var result = groupResponse;
                                    test.b = result.displayName;
                                    defer.resolve(test);
                                });
                        });
                });
        });
    return defer.promise;
}
Calling the function
myAngularfunction(a, b).then(function(obj)
How can I wait for both calls in the same function without nesting them, and start the second call without waiting for the first one to finish?
Maybe you can try $.when, like this: $.when(function1, function2).done(() => {}), but you need to add a deferred in function1 and function2.
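For reference, a cleaned-up sketch of that jQuery-style suggestion (note that $.when and Deferred come from jQuery, not AngularJS, and function1/function2 are assumed to return jQuery promises):

$.when(function1(), function2()).done(function (result1, result2) {
    // both async calls have completed here
});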
I think your situation is best suited to the $q.all([promise1, promise2, promise3]).then() syntax.
You are also calling msGraph.getClient() multiple times; I think this can be avoided.
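Combining both points, a possible rewrite could look like the sketch below (assumptions: the msGraph client API and endpoint URLs are taken from the question, and client.api(...).get() returns promise-like objects):

function myAngularfunction(a, b) {
    return msGraph.getClient().then((client) => {
        // start both requests with the same client; neither waits for the other
        var first = client.api("https://graph.microsoft.com/v1.0/").get();
        var second = client.api("https://graph.microsoft.com/v1.0/teams/").get();
        // resolve only once both responses have arrived
        return $q.all([first, second]).then(function (responses) {
            return {
                a: responses[0].displayName,
                b: responses[1].displayName
            };
        });
    });
}

The caller stays the same: myAngularfunction(a, b).then(function (test) { ... }).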
I am looking at https://www.promisejs.org/patterns/ and it mentions that Promise.resolve can be used if you need a value in the form of a promise:
var value = 10;
var promiseForValue = Promise.resolve(value);
What would be the use of a value in promise form though since it would run synchronously anyway?
If I had:
var value = 10;
var promiseForValue = Promise.resolve(value);
promiseForValue.then(resp => {
    myFunction(resp)
})
wouldn't just using value without it being a Promise achieve the same thing:
var value = 10;
myFunction(10);
Say you write a function that sometimes fetches something from a server but other times returns immediately; you will probably want that function to always return a promise:
function myThingy() {
    if (someCondition) {
        return fetch('https://foo');
    } else {
        return Promise.resolve(true);
    }
}
It's also useful if you receive some value that may or may not be a promise. You can wrap it in another promise, and then you are sure it's a promise:
const myValue = someStrangeFunction();
// Guarantee that myValue is a promise
Promise.resolve(myValue).then( ... );
In your examples, yes, there's no point in calling Promise.resolve(value). The use case is when you want to wrap an already existing value in a Promise, for example to keep a function's API uniform. Say a function conditionally does something that returns a promise: the caller shouldn't have to figure out what kind of value came back; the function itself should make that uniform. For example:
const conditionallyDoAsyncWork = (something) => {
    if (something == somethingElse) {
        return Promise.resolve(false)
    }
    return fetch(`/foo/${something}`)
        .then((res) => res.json())
}
Then users of this function don't need to check if what they got back was a Promise or not:
const doSomethingWithData = () => {
    conditionallyDoAsyncWork(someValue)
        .then((result) => result && processData(result))
}
As a side note, using async/await syntax both hides that and makes it a bit easier to read, because any value you return from an async function is automatically wrapped in a Promise:
const conditionallyDoAsyncWork = async (something) => {
    if (something == somethingElse) {
        return false
    }
    const res = await fetch(`/foo/${something}`)
    return res.json()
}
const doSomethingWithData = async () => {
    const result = await conditionallyDoAsyncWork(someValue)
    if (result) processData(result)
}
Another use case: a dead simple async queue, using Promise.resolve() as the starting point.
let current = Promise.resolve();

function enqueue(fn) {
    current = current.then(fn);
}

enqueue(async () => { console.log("async task") });
Edit, in response to OP's question.
Explanation
Let me break it down for you step by step.
enqueue(task) adds the task function as a callback via current.then, and replaces the original current promise reference with the newly returned thenPromise.
current = Promise.resolve()
thenPromise = current.then(task)
current = thenPromise
As per the promise spec, if the task function in turn returns yet another promise (call it task() -> taskPromise), then thenPromise will only resolve when taskPromise resolves. thenPromise is practically equivalent to taskPromise; it's just a wrapper. Let's rewrite the above code as:
current = Promise.resolve()
taskPromise = current.then(task)
current = taskPromise
So if you go like:
enqueue(task_1)
enqueue(task_2)
enqueue(task_3)
it expands into
current = Promise.resolve()
task_1_promise = current.then(task_1)
task_2_promise = task_1_promise.then(task_2)
task_3_promise = task_2_promise.then(task_3)
current = task_3_promise
This effectively forms a linked-list-like structure of promises that executes the task callbacks in sequential order.
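A small self-contained demonstration of that ordering (the setTimeout calls are only stand-ins for real async work):

let current = Promise.resolve();
function enqueue(fn) {
    current = current.then(fn);
}

// task 2 would finish first on its own, but the queue still runs them in enqueue order
enqueue(() => new Promise(resolve => setTimeout(() => { console.log("task 1"); resolve(); }, 300)));
enqueue(() => new Promise(resolve => setTimeout(() => { console.log("task 2"); resolve(); }, 100)));
enqueue(() => console.log("task 3"));
// logs: task 1, task 2, task 3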
Usage
Let's study a concrete scenario. Imagine you need to handle websocket messages in sequential order.
Let's say you need to do some heavy computation upon receiving messages, so you decide to send it off to a worker thread pool. Then you write the processed result to another message queue (MQ).
But here's the requirement: the MQ expects the write order of messages to match the order in which they come in from the websocket stream. What do you do?
Suppose you cannot pause the websocket stream; you can only handle the messages locally as they arrive.
Take One:
websocket.on('message', (msg) => {
    sendToWorkerThreadPool(msg).then(result => {
        writeToMessageQueue(result)
    })
})
This may violate the requirement, because sendToWorkerThreadPool may not return results in the original order: it's a pool, and some threads may return faster if their workload is lighter.
Take Two:
websocket.on('message', (msg) => {
    const task = () => sendToWorkerThreadPool(msg).then(result => {
        writeToMessageQueue(result)
    })
    enqueue(task)
})
This time we enqueue (defer) the whole process, so we can ensure the task execution order stays sequential. But there's a drawback: we lose the benefit of the thread pool, because each sendToWorkerThreadPool call only fires after the previous one completes. This model is equivalent to using a single worker thread.
Take Three:
websocket.on('message', (msg) => {
    const promise = sendToWorkerThreadPool(msg)
    const task = () => promise.then(result => {
        writeToMessageQueue(result)
    })
    enqueue(task)
})
The improvement over take two is that we call sendToWorkerThreadPool immediately, without deferring, but we still enqueue/defer the writeToMessageQueue part. This way we make full use of the thread pool for computation while still ensuring the sequential write order to the MQ.
I rest my case.
I need to call three WS services, depending on whether some variables are defined, before calling a local function. But the function is getting called before the services return any response, because they can take some time. I've even tried $timeout, but it does not work:
$scope.$on('search', function (event, data) {
    self.searchDto = data;
    if (self.searchDto.userCode) {
        self.searchByUserCode(self.searchDto.userCode).then(function (data) {
            self.userCode = data.find(function (item) {
                return item.mstId === self.searchDto.userCode;
            });
        });
    }
    if (self.searchDto.companyCode) {
        self.serachByCompanyCode(self.searchDto.companyCode).then(function (data) {
            self.companyCode = data.find(function (item) {
                return item.mstId === self.searchDto.companyCode;
            });
        });
    }
    if (self.searchDto.jobCode) {
        self.searchByJobCode(self.searchDto.jobCode).then(function (data) {
            self.jobCode = data.find(function (item) {
                return item.mstId === self.searchDto.jobCode;
            });
        });
    }
    // I tried with this timeout but it didn't work
    $timeout(function () {
        self.searchPeople();
    }, 1000);
});
Does anyone have an idea how the searchPeople method can be called after the WS responses arrive?
Use promises and $q.all()
var promises = [];

promises.push(self.searchByUserCode(self.searchDto.userCode).then(function (data) {
    self.userCode = data.find(function (item) {
        return item.mstId === self.searchDto.userCode;
    });
}));
.then() returns a promise. Do that for the 3 service calls and then wait for their completion
$q.all(promises).then(function () {
    self.searchPeople();
});
I see that you might not call all of your services. $q.all() will wait for the promises you put in the array. Keep in mind that it will also call your local function even if none of your services was executed; if you need at least one to be called, you might want to add a check for promises.length > 0 before $q.all().
That way, if you only call one of your services, the promises array will have one element and, upon its completion, your local function will be called.
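Putting those two points together, a sketch of the guarded version (service and property names are taken from the question; the other two services are pushed the same way):

var promises = [];

if (self.searchDto.userCode) {
    promises.push(self.searchByUserCode(self.searchDto.userCode).then(function (data) {
        self.userCode = data.find(function (item) {
            return item.mstId === self.searchDto.userCode;
        });
    }));
}
// ...push the companyCode and jobCode promises under the same kind of check...

if (promises.length > 0) {
    $q.all(promises).then(function () {
        self.searchPeople();
    });
}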
Setting a timeout is not a correct approach here. One solution is to nest the 3 WS calls and put the function call inside the last WS callback.
It also depends on how many arguments your searchPeople needs. If it only works with all 3 values from the WS calls, another solution is to put the function call in all 3 WS callbacks and, inside searchPeople, add a condition that checks that all 3 values are present before doing the search.
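A minimal sketch of that guard, assuming searchPeople reads the three values the WS callbacks store on the controller, as in the question's code:

self.searchPeople = function () {
    // do nothing until all three WS responses have been stored
    if (!self.userCode || !self.companyCode || !self.jobCode) {
        return;
    }
    // ...perform the actual search here...
};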
How to add code that will run only when both processes are completed?
normalise1();
normalise2();

function normalise1() {
    return knex("ingredients")
        .select("name", "id")
        .map(function (ingredient) {
            var normalised_name = common.normalise(ingredient.name);
            knex('ingredients').where('id', ingredient.id).update({ name_normalised: normalised_name }).then();
        });
};

function normalise2() {
    return knex("synonyms")
        .select("synon_name as name", "id")
        .map(function (ingredient) {
            var normalised_name = common.normalise(ingredient.name);
            knex('synonyms').where('id', ingredient.id).update({ synon_name_normalised: normalised_name }).then();
        });
};
I tried something like this in different ways:
Promise.all([normalise1(), normalise2()])
    .then(() => console.log('Done'));
but it didn't work.
Basically, console.log('Done') appears before all the processing is done. I believe this is because of a missing Promise part inside the functions, but I cannot figure out exactly what.
The functions are not called when passed to Promise.all(), and no Promise is returned from the .map() callback.
Call the functions, and return the knex() query from the .map() callback; this may also require using Promise.all() within the functions themselves.
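As a sketch of what that could look like for normalise1 (normalise2 is analogous; it assumes the common.normalise helper and table layout from the question, and collects the update promises explicitly instead of relying on knex's .map helper):

function normalise1() {
    return knex("ingredients")
        .select("name", "id")
        .then(function (ingredients) {
            // resolve only after every row update has completed
            return Promise.all(ingredients.map(function (ingredient) {
                var normalised_name = common.normalise(ingredient.name);
                return knex("ingredients")
                    .where("id", ingredient.id)
                    .update({ name_normalised: normalised_name });
            }));
        });
}

// now this logs 'Done' only after both tables have been fully updated
Promise.all([normalise1(), normalise2()])
    .then(() => console.log('Done'));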
I have a couple of asynchronous requests that fetch some data from a url. The problem I am having is that I actually want to delay sending the json response until all requests have come back. The code looks something like this:
function getFirstStuff(callback) // imagine this is async
{
    console.log('gettingFirstStuff');
    callback(stuff);
}

function getFurtherStuff(callback) // imagine this is async
{
    console.log('gettingFurtherStuff');
    callback(thing);
}

function getStuff(callback)
{
    getFirstStuff(function (stuff) // async
    {
        // stuff is an array of 3 items
        stuff = stuff.map(function (item) // map is synchronous
        {
            // For each item in stuff make another async request
            getFurtherStuff(function (thing) { // this is also async
                stuff.thing = thing;
            });
            return item;
        });
        callback(stuff);
    });
}

router.get('/getstuff', function (req, res, next) {
    getStuff(function (stuff)
    {
        console.log('finished stuff');
        // RETURN RESPONSE AS JSON
        res.json(stuff);
    });
});
The output will be:
gettingFirstStuff
finished stuff
gettingFurtherStuff
gettingFurtherStuff
gettingFurtherStuff
but it should be:
gettingFirstStuff
gettingFurtherStuff
gettingFurtherStuff
gettingFurtherStuff
finished stuff
I understand that the reason is that getFurtherStuff is async and item is returned from map before the getFurtherStuff async calls come back with a result. My question is: what is the standard way to wait for these calls to finish before calling the final callback, callback(stuff)?
There are a bunch of ways to solve this problem. Libraries like async and queue would probably be the best choice, if you have no problem adding dependencies.
The easiest option without external libs is just to count the async jobs and finish when they're all done:
// assuming stuff is an array
var counter = 0;
var jobCount = stuff.length;

// wrap callback in one that checks the counter
var doneCallback = function () {
    if (counter >= jobCount) {
        // we're ready to go
        callback(stuff);
    }
};

// run jobs
stuff.map(function (item) {
    getFurtherStuff(item, function (itemThing) {
        // process async response
        stuff.thing = itemThing;
        // increment counter
        counter++;
        // call the wrapped callback, which won't fire
        // until all jobs are complete
        doneCallback();
    });
});
npm install async
You would then simply throw your functions into an async.parallel() call.
More info at https://www.npmjs.com/package/async
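As a rough sketch, the question's getStuff could look like this with the async library (assuming, as in the counting example above, that getFurtherStuff can be called as getFurtherStuff(item, callback)):

var async = require("async");

function getStuff(callback) {
    getFirstStuff(function (stuff) {
        // run one getFurtherStuff call per item and wait for all of them
        async.map(stuff, function (item, done) {
            getFurtherStuff(item, function (thing) {
                item.thing = thing;
                done(null, item);
            });
        }, function (err, results) {
            // all async jobs have finished here
            callback(results);
        });
    });
}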
I have an RxJS sequence being consumed in the normal manner...
However, in the observable's onNext handler, some of the operations complete synchronously, but others require async callbacks that need to be waited on before processing the next item in the input sequence.
I'm a little bit confused about how to do this. Any ideas? Thanks!
someObservable.subscribe(
    function onNext(item)
    {
        if (item == 'do-something-async-and-wait-for-completion')
        {
            setTimeout(
                function ()
                {
                    console.log('okay, we can continue');
                }
                , 5000
            );
        }
        else
        {
            // do something synchronously and keep on going immediately
            console.log('ready to go!!!');
        }
    },
    function onError(error)
    {
        console.log('error');
    },
    function onComplete()
    {
        console.log('complete');
    }
);
Each operation you want to perform can be modeled as an observable. Even the synchronous operation can be modeled this way. Then you can use map to convert your sequence into a sequence of sequences, then use concatAll to flatten the sequence.
someObservable
    .map(function (item) {
        if (item === "do-something-async") {
            // create an Observable that will do the async action when it is subscribed
            // return Rx.Observable.timer(5000);
            // or maybe an ajax call? Use `defer` so that the call does not
            // start until concatAll() actually subscribes.
            return Rx.Observable.defer(function () { return Rx.Observable.ajaxAsObservable(...); });
        }
        else {
            // do something synchronous but model it as an async operation (using Observable.return)
            // Use defer so that the sync operation is not carried out until
            // concatAll() reaches this item.
            return Rx.Observable.defer(function () {
                return Rx.Observable.return(someSyncAction(item));
            });
        }
    })
    .concatAll() // consume each inner observable in sequence
    .subscribe(function (result) {
    }, function (error) {
        console.log("error", error);
    }, function () {
        console.log("complete");
    });
To reply to some of your comments: at some point you need to force some expectations on the stream of functions. In most languages, when dealing with functions that are possibly async, the function signatures are async and the actual async vs sync nature of the function is hidden as an implementation detail. This is true whether you are using JavaScript promises, Rx observables, C# Tasks, C++ futures, etc. The functions end up returning a promise/observable/task/future, and if the function is actually synchronous, then the object it returns is just already completed.
Having said that, since this is JavaScript, you can cheat:
var makeObservable = function (func) {
    return Rx.Observable.defer(function () {
        // execute the function and then examine the returned value.
        // if the returned value is *not* an Rx.Observable, then
        // wrap it using Observable.return
        var result = func();
        return result instanceof Rx.Observable ? result : Rx.Observable.return(result);
    });
}
someObservable
    .map(makeObservable)
    .concatAll()
    .subscribe(function (result) {
    }, function (error) {
        console.log("error", error);
    }, function () {
        console.log("complete");
    });
First of all, move your async operations out of subscribe; it's not made for async operations.
What you can use is mergeMap (alias flatMap) or concatMap. (I am mentioning both, but concatMap is actually mergeMap with the concurrent parameter set to 1.) Setting a different concurrent parameter is useful, as sometimes you want to limit the number of concurrent queries but still run a few in parallel.
source.concatMap(item => {
    if (item == 'do-something-async-and-wait-for-completion') {
        return Rx.Observable.timer(5000)
            .mapTo(item)
            .do(e => console.log('okay, we can continue'));
    } else {
        // do something synchronously and keep on going immediately
        return Rx.Observable.of(item)
            .do(e => console.log('ready to go!!!'));
    }
}).subscribe();
I will also show how you can rate limit your calls. Word of advice: only rate limit at the point where you actually need it, such as when calling an external API that allows only a certain number of requests per second or minute. Otherwise it is better to just limit the number of concurrent operations and let the system run at maximum speed.
We start with the following snippet:
let concurrent;
let delay;

source.mergeMap(item =>
    selector(item, delay)
, concurrent)
Next, we need to pick values for concurrent, delay and implement selector. concurrent and delay are closely related. For example, if we want to run 10 items per second, we can use concurrent = 10 and delay = 1000 (millisecond), but also concurrent = 5 and delay = 500 or concurrent = 4 and delay = 400. The number of items per second will always be concurrent / (delay / 1000).
Now let's implement selector. We have a couple of options: we can set a minimal execution time for selector, we can add a constant delay to it, we can emit the results as soon as they are available, or we can emit the result only after the minimal delay has passed. It is even possible to add a timeout by using the timeout operator. Convenient.
Set minimal time, send result early:
function selector(item, delay) {
    return Rx.Observable.of(item)
        .delay(1000) // replace this with your actual call.
        .merge(Rx.Observable.timer(delay).ignoreElements())
}
Set minimal time, send result late:
function selector(item, delay) {
    return Rx.Observable.of(item)
        .delay(1000) // replace this with your actual call.
        .zip(Rx.Observable.timer(delay), (item, _) => item)
}
Add time, send result early:
function selector(item, delay) {
    return Rx.Observable.of(item)
        .delay(1000) // replace this with your actual call.
        .concat(Rx.Observable.timer(delay).ignoreElements())
}
Add time, send result late:
function selector(item, delay) {
    return Rx.Observable.of(item)
        .delay(1000) // replace this with your actual call.
        .delay(delay)
}
Here is another simple example of doing manual async operations.
Be aware that it is not good reactive practice! If you only want to wait 1000 ms, use Rx.Observable.timer or the delay operator.
someObservable.flatMap(response => {
    return Rx.Observable.create(observer => {
        setTimeout(() => {
            observer.next('the returned value')
            observer.complete()
        }, 1000)
    })
}).subscribe()
Now, replace setTimeout with your async function, like Image.onload or fileReader.onload ...
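For example, a sketch of wrapping Image.onload the same way (browser-only, and it assumes the source observable emits image URLs):

someObservable.flatMap(imageUrl => {
    return Rx.Observable.create(observer => {
        const img = new Image();
        img.onload = () => {
            observer.next(img);   // emit the loaded image
            observer.complete();
        };
        img.onerror = err => observer.error(err);
        img.src = imageUrl;
    });
}).subscribe();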