How do I check that every element passes the test if the test is async and can throw an error?
const exists = async () => { /* can throw an error */ }
const allExists = [12, 5, 8, 130, 44].every(exists);
You can't use synchronous methods like every with functions that do asynchronous work, because the synchronous method won't wait for the asynchronous result. It's possible to write an async every, though:
async function every(array, callback) {
  for (const element of array) {
    const result = await callback(element);
    if (!result) {
      return false;
    }
  }
  return true;
}
Note that because it relies on asynchronous information, it too delivers its result asynchronously. You have to await it (or use .then/.catch, etc.).
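For example, usage might look like this (a sketch; `isSmallEnough` is a made-up async predicate that can also reject):

```javascript
// Async version of Array#every, as defined above
async function every(array, callback) {
  for (const element of array) {
    const result = await callback(element);
    if (!result) {
      return false;
    }
  }
  return true;
}

// A hypothetical async predicate that can also throw/reject
const isSmallEnough = async (n) => {
  if (typeof n !== "number") throw new TypeError("not a number");
  return n < 200;
};

(async () => {
  console.log(await every([12, 5, 8, 130, 44], isSmallEnough)); // true
  console.log(await every([12, 5, 800], isSmallEnough));        // false
})();
```

If the callback rejects for any element, the promise returned by `every` rejects too, so a `try`/`catch` around the `await` handles thrown errors.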
That version works in series, only calling the callback for the second entry when the callback for the first has finished its work. That lets you short-circuit (not call the callback for later entries when you already know the answer), but may make it take longer in overall time. Another approach is to do all of the calls in parallel, then check their results:
Promise.all(array.map(callback))
  .then(flags => flags.every(Boolean))
  .then(result => {
    // ...`result` will be `true` or `false`...
  });
I just can't get my head around this stuff :(
compatibleApps: async () => {
  common.header('Install Compatible Apps')
  const compatibleApps = JSON.parse(fs.readFileSync('./data/compatibleApps.json', 'utf8'));
  const value = await inquirer.compatibleApps();
  for (let element of value.removeAppsList) {
    for (let element2 of compatibleApps) {
      if (element === element2.name) {
        await files.downloadFile(element2)
      }
    }
  }
  await adb.installApk()
},
await adb.installApk() is being executed before all the calls to await files.downloadFile(element2) have completed.
Below is the contents of downloadFile, I guess I need to wrap it in a promise?
downloadFile: async (element) => {
  const option = {
    dir: './data/apps',
    onDone: (info) => {
      console.log('Latest ' + element.name + ' Downloaded')
    },
    onError: (err) => {
      console.log('error', err);
    },
    onProgress: (curr, total) => {
    },
  }
  var dd = await dl(element.url, option);
}
Keep in mind that await only does anything useful if you are awaiting a promise that is linked to your actual asynchronous operation. "Linked" in that sentence means that the promise resolves when the asynchronous operation is done, or rejects if it has an error.
If the function you are awaiting either returns nothing or returns just a plain value, yet contains asynchronous operations, then the await doesn't actually await anything. It calls the function, initiates those asynchronous operations, the function returns a non-promise value or an already resolved promise, and the await doesn't have anything to wait for, so it just continues executing more lines of code without the expected pause for the asynchronous operations to complete.
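A minimal illustration of such an "unlinked" promise, with `setTimeout` standing in for any callback-style asynchronous work:

```javascript
const order = [];

// The returned promise is NOT linked to the timer: the function
// starts the timer and returns (an already resolved promise) immediately.
async function notLinked() {
  setTimeout(() => order.push("work done"), 20);
}

(async () => {
  await notLinked();          // resolves right away
  order.push("after await");  // runs before "work done"
  setTimeout(() => console.log(order), 50); // ["after await", "work done"]
})();
```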
So, in your code, the only way that:
await adb.installApk()
executes before any of the calls to:
await files.downloadFile(element2)
is if files.downloadFile() does not actually return a promise that is linked to the asynchronous operations it contains or perhaps if you never execute files.downloadFile(element2) because of the conditionals.
For more specific help, show us the code for files.downloadFile() and confirm that you are getting through your conditionals and executing it at least once.
I'm working on a function that uses Array.reduce, and I need to add an asynchronous API call inside the reduce function. This requires me to use an async function for the callback I pass into reduce, since I'm using await inside the function to wait for the asynchronous API call.
I'm having some trouble writing the reduce correctly. Here's how it currently is (working):
const result = records.reduce((array, currValue) => {
  // do stuff
  return array
}, [])
Here's what I tried to change it to:
const result = records.reduce(async (array, currentValue) => {
  // do stuff
  someValue = await asyncCall(currentValue)
  array.add(someValue)
  return array
}, [])
The error I'm getting is 'No overload matches this call'.
This seems to make sense to me, since reduce takes a callback that returns an array, and async functions return a promise, not an array. But when I read other examples of how to pass async functions into .reduce, they all seem to just pass an async function into reduce with no problem.
Here are a few links I looked at:
https://advancedweb.hu/how-to-use-async-functions-with-array-reduce-in-javascript/
JavaScript array .reduce with async/await
https://gyandeeps.com/array-reduce-async-await/
The moment I declare the reducer function as async, I get the "no matching overloads" error, which makes sense to me. I'm not sure how this seems to work for other people.
First: reduce probably isn't the best tool to use for this. It looks like you're just adding entries to an array. reduce is overcomplicated for that task, particularly if you're doing something asynchronous. Instead, a looping construct that you can use in an async function is much, much simpler.
I'll start with reduce, then go to the looping construct.
reduce works synchronously. If you pass an async function in as its callback, the promise that function returns will be the accumulator value seen by the next callback. So if one of the steps in the reduce operation needs to be asynchronous and return a promise, every step after it has to be asynchronous returning a promise (for simplicity, it's best to just make every step return a promise); and the result of the reduce will be a promise for the eventual final value, not the final value itself. You can't make an asynchronous call synchronous, and you can't make a synchronous operation (reduce) wait for an asynchronous result.
So, all of your callbacks will be dealing with promises. It'll look a bit like this:
const result = await records.reduce(async (arrayPromise, currentValue) => {
//             ^^^^^                ^^^^^  ^^^^^^^^^^^^
  const array = await arrayPromise // <=====
  // do stuff
  someValue = await asyncCall(currentValue)
  array.push(someValue) // <==== `push` rather than `add`, presumably
  return array
}, Promise.resolve([]))
// ^^^^^^^^^^^^^^^^^^^
Of course, since that uses await, it has to be in an async function. Otherwise:
records.reduce(async (arrayPromise, currentValue) => {
  const array = await arrayPromise // <=====
  // do stuff
  someValue = await asyncCall(currentValue)
  array.push(someValue)
  return array
}, Promise.resolve([]))
.then(result => {
  // ...use `result` here
})
.catch(error => {
  // ...handle/report error here...
})
You're better off with a looping construct that natively supports being part of an async function:
const result = []
for (const currentValue of records) {
  const someValue = await asyncCall(currentValue)
  result.push(someValue)
}
// ...use `result` here...
or even
const result = []
for (const currentValue of records) {
  result.push(await asyncCall(currentValue))
}
// ...use `result` here...
If you need to do this in a function that isn't an async function, you'll be dealing explicitly with a promise, which would look like:
(async () => {
  const result = []
  for (const currentValue of records) {
    result.push(await asyncCall(currentValue))
  }
  return result
})()
.then(result => {
  // ...use `result` here
})
.catch(error => {
  // ...handle/report error here...
})
I think the simplest thing would be the following:
const promises = records.reduce((array, currentValue) => {
  // do stuff
  someValue = asyncCall(currentValue)
  array.push(someValue)
  return array
}, [])
const results = await Promise.all(promises);
If the use-case for your reduce function is more complicated, please post more code or create a sandbox
jq.run('.', '/path/to/file.json').then(console.log) is asynchronous, so when I try to use it I get this: Promise { <pending> }, and only later do I get the result, but by then it's too late. How can I fix this?
I tried to wait with await, but I don't know where to put the keyword. Here's my code:
const jq = require('node-jq')
const filter = '[.root[].A[].AT]'
const jsonPath = './simple.json'
data = jq.run(filter, jsonPath).then((output) => {
  console.log(output)
}).catch((err) => {
  console.error(err)
})
fs.appendFile('./jqTest.txt', data + "\r\n", function (err) {
  if (err) throw err;
  console.log("complete!")
});
The whole point of asynchronous APIs is that you can't write
data = getResultsAsynchronously();
doStuffWith(data);
...
(Unless you use await, which is slightly magical.)
Instead, traditional asynchronous APIs take a function to call when the result is ready:
getResultsAsynchronously(function (data) {
  doStuffWith(data);
  ...
});
I.e. all the code that follows the original function call in the synchronous version is instead put into a callback function and passed into getResultsAsynchronously.
Promises still follow this general pattern, but let you decouple starting the asynchronous operation itself from deciding how to handle the result. That is, you can start an asynchronous operation first and register a callback that handles the results later, in a second step:
promise = getResultsAsynchronously();
// and later:
promise.then(function (data) {
  doStuffWith(data);
  ...
});
However, you don't have to separate the two steps if you don't want to:
getResultsAsynchronously().then(function (data) {
  doStuffWith(data);
  ...
});
.then also returns a promise, to which you can attach further callbacks by calling .then or .catch.
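Since each `.then` returns a new promise, handlers can be chained, with each handler receiving the previous handler's return value. A small sketch:

```javascript
Promise.resolve(2)
  .then(n => n * 3)  // receives 2, passes 6 along
  .then(n => n + 1)  // receives 6, passes 7 along
  .then(n => {
    console.log(n);  // 7
  })
  .catch(err => {
    // an error thrown anywhere above lands here
    console.error(err);
  });
```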
In your code,
data = jq.run(filter, jsonPath).then(...).catch(...)
data is just another promise, but one without any useful return value inside (because your then and catch callbacks don't return any value).
To fix your logic, it should look like this:
jq.run(filter, jsonPath).then((data) => {
  fs.appendFile('./jqTest.txt', data + "\r\n", (err) => {
    if (err) throw err;
    console.log("complete!")
  });
}).catch((err) => {
  console.error(err)
});
To recap: Asynchronous results are only available inside callback functions. You can't use the return value like with a synchronous operation.
That said, async / await let you convert asynchronous code into synchronous code (or at least something that looks synchronous). However, this trick only works "on the inside": The external interface is still asynchronous, you can just write more normal looking code internally.
For example:
// await is only available inside async functions, so let's define one:
// await is only available inside async functions, so let's define one:
(async function () {
  // magic happens here:
  let data = await jq.run(filter, jsonPath);
  fs.appendFile('./jqTest.txt', data + "\r\n", (err) => {
    if (err) throw err;
    console.log("complete!")
  });
})(); // ... and invoke it immediately
Internally, JavaScript rewrites
x = await f();
doStuffWith(x);
...
into something that looks like
return f().then((x) => {
  doStuffWith(x);
  ...
});
i.e. await lets you pull the contents of a callback function out into straight line code. Ultimately the whole async function still returns a promise, however.
I have a function with multiple forEach loops:
async insertKpbDocument(jsonFile) {
  jsonFile.doc.annotations.forEach((annotation) => {
    annotation.entities.forEach(async (entity) => {
      await this.addVertex(entity);
    });
    annotation.relations.forEach(async (relation) => {
      await this.addRelation(relation);
    });
  });
  return jsonFile;
}
I need to make sure that the async code in the forEach loop calling the this.addVertex function is really done before executing the next one.
But when I log variables, it seems that the this.addRelation function is called before the first loop is really over.
So I tried adding await before every loop, like so:
await jsonFile.doc.annotations.forEach(async (annotation) => {
  await annotation.entities.forEach(async (entity) => {
    await this.addVertex(entity);
  });
  await annotation.relations.forEach(async (relation) => {
    await this.addRelation(relation);
  });
});
But same behavior.
Maybe the log function has some latency? Any ideas?
As we've discussed, await does not pause a .forEach() loop and does not make the 2nd item of the iteration wait for the first item to be processed. So, if you're really trying to do asynchronous sequencing of items, you can't really accomplish it with a .forEach() loop.
For this type of problem, async/await works really well with a plain for loop because they do pause the execution of the actual for statement to give you sequencing of asynchronous operations which it appears is what you want. Plus, it even works with nested for loops because they are all in the same function scope:
To show you how much simpler this can be using for/of and await, it could be done like this:
async insertKpbDocument(jsonFile) {
for (let annotation of jsonFile.doc.annotations) {
for (let entity of annotation.entities) {
await this.addVertex(entity);
}
for (let relation of annotation.relations) {
await this.addRelation(relation);
}
}
return jsonFile;
}
You get to write synchronous-like code that is actually sequencing asynchronous operations.
If you are really avoiding any for loop, and your real requirement is only that all calls to addVertex() come before any calls to addRelation(), then you can do this where you use .map() instead of .forEach() and you collect an array of promises that you then use Promise.all() to wait on the whole array of promises:
insertKpbDocument(jsonFile) {
  return Promise.all(jsonFile.doc.annotations.map(async annotation => {
    await Promise.all(annotation.entities.map(entity => this.addVertex(entity)));
    await Promise.all(annotation.relations.map(relation => this.addRelation(relation)));
  })).then(() => jsonFile);
}
To fully understand how this works, this runs all addVertex() calls in parallel for one annotation, waits for them all to finish, then runs all the addRelation() calls in parallel for one annotation, then waits for them all to finish. It runs all the annotations themselves in parallel. So, this isn't very much actual sequencing except within an annotation, but you accepted an answer that has this same sequencing and said it works so I show a little simpler version of this for completeness.
If you really need to sequence each individual addVertex() call so you don't call the next one until the previous one is done and you're still not going to use a for loop, then you can use the .reduce() promise pattern put into a helper function to manually sequence asynchronous access to an array:
// helper function to sequence asynchronous iteration of an array
// fn returns a promise and is passed an array item as an argument
function sequence(array, fn) {
  return array.reduce((p, item) => {
    return p.then(() => {
      return fn(item);
    });
  }, Promise.resolve());
}
insertKpbDocument(jsonFile) {
  return sequence(jsonFile.doc.annotations, async (annotation) => {
    await sequence(annotation.entities, entity => this.addVertex(entity));
    await sequence(annotation.relations, relation => this.addRelation(relation));
  }).then(() => jsonFile);
}
This will completely sequence everything. It will do this type of order:
addVertex(annotation1.entity1)
addVertex(annotation1.entity2)
....
addRelation(annotation1.relation1)
addRelation(annotation1.relation2)
....
addVertex(annotation2.entity1)
....
where it waits for each operation to finish before going onto the next one.
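To illustrate the ordering, here is a standalone run of the `sequence` helper with a stub async operation (`fakeAsyncOp`, a made-up name) in place of the real `addVertex`/`addRelation` calls:

```javascript
// helper from above: chains one promise per item off the previous one
function sequence(array, fn) {
  return array.reduce((p, item) => p.then(() => fn(item)), Promise.resolve());
}

const log = [];

// stub async operation: resolves after a short timer and records its item
const fakeAsyncOp = item =>
  new Promise(resolve => setTimeout(() => {
    log.push(item);
    resolve(item);
  }, 10));

sequence(["a", "b", "c"], fakeAsyncOp).then(() => {
  console.log(log); // ["a", "b", "c"], strictly in order
});
```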
forEach returns void, so awaiting it will not do much. You can use map to return all the promises you now create in the forEach, and use Promise.all to await them all:
async insertKpbDocument(jsonFile: { doc: { annotations: Array<{ entities: Array<{}>, relations: Array<{}> }> } }) {
  await Promise.all(jsonFile.doc.annotations.map(async (annotation) => {
    await Promise.all(annotation.entities.map(async (entity) => {
      await this.addVertex(entity);
    }));
    await Promise.all(annotation.relations.map(async (relation) => {
      await this.addRelation(relation);
    }));
  }));
  return jsonFile;
}
I understand you can run all the addVertex calls concurrently. Combining reduce with map, split into two different sets of promises, you can do it. My idea:
const first = jsonFile.doc.annotations.reduce((acc, annotation) => {
  // arrow wrappers preserve the `this` binding inside addVertex/addRelation
  acc = acc.concat(annotation.entities.map(entity => this.addVertex(entity)));
  return acc;
}, []);
await Promise.all(first);

const second = jsonFile.doc.annotations.reduce((acc, annotation) => {
  acc = acc.concat(annotation.relations.map(relation => this.addRelation(relation)));
  return acc;
}, []);
await Promise.all(second);
You end up with more loops, but it does what you need, I think.
forEach executes the callback against each element in the array and does not wait for anything. Using await is basically sugar for writing promise.then() and nesting everything that follows in the then() callback. But forEach doesn't return a promise, so await arr.forEach() is meaningless. The only reason it isn't a compile error is because the async/await spec says you can await anything, and if it isn't a promise you just get its value... forEach just gives you void.
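You can see this directly: awaiting a non-promise just hands the value back, and awaiting `forEach`'s return value hands back `undefined`. A quick sketch:

```javascript
(async () => {
  const a = await 42;                        // not a promise: the value passes through
  const b = await [1, 2, 3].forEach(x => x); // forEach returns undefined
  console.log(a, b); // 42 undefined
})();
```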
If you want something to happen in sequence you can await in a for loop:
for (let i = 0; i < jsonFile.doc.annotations.length; i++) {
  const annotation = jsonFile.doc.annotations[i];
  for (let j = 0; j < annotation.entities.length; j++) {
    const entity = annotation.entities[j];
    await this.addVertex(entity);
  }
}
// code here executes after all vertices have been added in order
Edit: While typing this a couple other answers and comments happened... you don't want to use a for loop, you can use Promise.all but there's still maybe some confusion, so I'll leave the above explanation in case it helps.
async/await does not work within forEach.
A simple solution: Replace .forEach() with for(.. of ..) instead.
Details in this similar question.
If the no-iterator linting rule is enabled, you will get a linting warning/error for using for(.. of ..). There is plenty of discussion and opinion on this topic.
IMHO, this is a scenario where we can suppress the warning with eslint-disable-next-line, or disable the rule for the method/class.
Example:
const insertKpbDocument = async (jsonFile) => {
  // eslint-disable-next-line no-iterator
  for (let annotation of jsonFile.doc.annotations) {
    for (let entity of annotation.entities) {
      await this.addVertex(entity)
    }
    for (let relation of annotation.relations) {
      await this.addRelation(relation)
    }
  }
  return jsonFile
}
The code is very readable and works as expected. To get similar functionality with .forEach(), we need some promise/observable acrobatics that I think are a waste of effort.
I have an RxJS sequence being consumed in the normal manner...
However, in the observable 'onNext' handler, some of the operations will complete synchronously, but others require async callbacks, that need to be waited on before processing the next item in the input sequence.
...I'm a little bit confused about how to do this. Any ideas? Thanks!
someObservable.subscribe(
  function onNext(item) {
    if (item == 'do-something-async-and-wait-for-completion') {
      setTimeout(function () {
        console.log('okay, we can continue');
      }, 5000);
    } else {
      // do something synchronously and keep on going immediately
      console.log('ready to go!!!');
    }
  },
  function onError(error) {
    console.log('error');
  },
  function onComplete() {
    console.log('complete');
  }
);
Each operation you want to perform can be modeled as an observable. Even the synchronous operation can be modeled this way. Then you can use map to convert your sequence into a sequence of sequences, then use concatAll to flatten the sequence.
someObservable
  .map(function (item) {
    if (item === "do-something-async") {
      // create an Observable that will do the async action when it is subscribed
      // return Rx.Observable.timer(5000);
      // or maybe an ajax call? Use `defer` so that the call does not
      // start until concatAll() actually subscribes.
      return Rx.Observable.defer(function () { return Rx.Observable.ajaxAsObservable(...); });
    } else {
      // do something synchronous but model it as an async operation (using Observable.return)
      // Use defer so that the sync operation is not carried out until
      // concatAll() reaches this item.
      return Rx.Observable.defer(function () {
        return Rx.Observable.return(someSyncAction(item));
      });
    }
  })
  .concatAll() // consume each inner observable in sequence
  .subscribe(function (result) {
  }, function (error) {
    console.log("error", error);
  }, function () {
    console.log("complete");
  });
To reply to some of your comments...at some point you need to force some expectations on the stream of functions. In most languages, when dealing with functions that are possibly async, the function signatures are async and the actual async vs sync nature of the function is hidden as an implementation detail of the function. This is true whether you are using javaScript promises, Rx observables, c# Tasks, c++ Futures, etc. The functions end up returning a promise/observable/task/future/etc and if the function is actually synchronous, then the object it returns is just already completed.
Having said that, since this is JavaScript, you can cheat:
var makeObservable = function (func) {
  return Rx.Observable.defer(function () {
    // execute the function and then examine the returned value.
    // if the returned value is *not* an Rx.Observable, then
    // wrap it using Observable.return
    var result = func();
    return result instanceof Rx.Observable ? result : Rx.Observable.return(result);
  });
};

someObservable
  .map(makeObservable)
  .concatAll()
  .subscribe(function (result) {
  }, function (error) {
    console.log("error", error);
  }, function () {
    console.log("complete");
  });
First of all, move your async operations out of subscribe; it's not made for async operations.
What you can use is mergeMap (alias flatMap) or concatMap. (I am mentioning both of them, but concatMap is actually mergeMap with the concurrent parameter set to 1.) Setting a different concurrent parameter is useful, as sometimes you want to limit the number of concurrent queries, but still run a couple concurrently.
source.concatMap(item => {
  if (item == 'do-something-async-and-wait-for-completion') {
    return Rx.Observable.timer(5000)
      .mapTo(item)
      .do(e => console.log('okay, we can continue'));
  } else {
    // do something synchronously and keep on going immediately
    return Rx.Observable.of(item)
      .do(e => console.log('ready to go!!!'));
  }
}).subscribe();
I will also show how you can rate limit your calls. Word of advice: Only rate limit at the point where you actually need it, like when calling an external API that allows only a certain number of requests per second or minutes. Otherwise it is better to just limit the number of concurrent operations and let the system move at maximal velocity.
We start with the following snippet:
const concurrent = 5; // example value: at most 5 operations in flight
const delay = 500;    // example value, in milliseconds

source.mergeMap(item =>
  selector(item, delay)
, concurrent)
Next, we need to pick values for concurrent and delay, and implement selector. concurrent and delay are closely related. For example, if we want to run 10 items per second, we can use concurrent = 10 and delay = 1000 (milliseconds), but also concurrent = 5 and delay = 500, or concurrent = 4 and delay = 400. The number of items per second will always be concurrent / (delay / 1000).
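That relationship can be sanity-checked with a one-liner (a sketch; `ratePerSecond` is just an illustrative name):

```javascript
// items per second = concurrent / (delay / 1000)
const ratePerSecond = (concurrent, delayMs) => concurrent / (delayMs / 1000);

console.log(ratePerSecond(10, 1000)); // 10
console.log(ratePerSecond(5, 500));   // 10
console.log(ratePerSecond(4, 400));   // 10
```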
Now let's implement selector. We have a couple of options. We can set a minimal execution time for selector, we can add a constant delay to it, we can emit the results as soon as they are available, we can emit the result only after the minimal delay has passed, etc. It is even possible to add a timeout by using the timeout operators. Convenient.
Set minimal time, send result early:
function selector(item, delay) {
  return Rx.Observable.of(item)
    .delay(1000) // replace this with your actual call.
    .merge(Rx.Observable.timer(delay).ignoreElements())
}
Set minimal time, send result late:
function selector(item, delay) {
  return Rx.Observable.of(item)
    .delay(1000) // replace this with your actual call.
    .zip(Rx.Observable.timer(delay), (item, _) => item)
}
Add time, send result early:
function selector(item, delay) {
  return Rx.Observable.of(item)
    .delay(1000) // replace this with your actual call.
    .concat(Rx.Observable.timer(delay).ignoreElements())
}
Add time, send result late:
function selector(item, delay) {
  return Rx.Observable.of(item)
    .delay(1000) // replace this with your actual call.
    .delay(delay)
}
Another simple example of doing manual async operations.
Be aware that it is not good reactive practice! If you only want to wait 1000 ms, use Rx.Observable.timer or the delay operator.
someObservable.flatMap(response => {
  return Rx.Observable.create(observer => {
    setTimeout(() => {
      observer.next('the returned value')
      observer.complete()
    }, 1000)
  })
}).subscribe()
Now, replace the setTimeout with your async function, like Image.onload or fileReader.onload ...
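If you are not inside an Rx pipeline, the same wrapping works with a plain Promise (a sketch; `setTimeout` again stands in for the real callback-style API, and `asyncOperationAsPromise` is just an illustrative name):

```javascript
function asyncOperationAsPromise() {
  return new Promise((resolve, reject) => {
    // replace setTimeout with your real callback-style API,
    // calling resolve(value) on success and reject(err) on failure
    setTimeout(() => resolve('the returned value'), 1000)
  })
}

asyncOperationAsPromise().then(value => {
  console.log(value) // "the returned value"
})
```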