I am watching videos to learn the MongoDB, Express.js, Vue.js, Node.js (MEVN) stack.
I want to create a seed directory and also use promise functions.
// const delay = require('delay')
const Promise = require('bluebird')
const songs = require('./songs.json')
const users = require('./users.json')
const bookmarks = require('./bookmarks.json')
const historys = require('./history.json')
sequelize.sync({ force: true })
  .then(async function () {
    await Promise.all(
      users.map(user => {
        User.create(user)
      })
    )
    await Promise.all(
      songs.map(song => {
        Song.create(song)
      })
    )
    // I have to add this line
    // ---> await delay(1000)
    await Promise.all(
      bookmarks.map(bookmark => {
        Bookmark.create(bookmark)
      })
    )
    await Promise.all(
      historys.map(history => {
        History.create(history)
      })
    )
  })
I have four tables to seed, and the data for the last two tables must be created after the first two, because it references them through foreign keys.
But every time I run this file, the data for the last two tables gets created first.
The only way I have found to prevent this is to add delay(1000) between them.
I am wondering if there exists any more efficient way to solve this issue~
Thank you.
Race conditions like this one are almost always caused by promises that weren't properly chained.
A promise should be returned from the map callback:
await Promise.all(
users.map( user => User.create(user))
);
etc.
Not returning a value from map is virtually always a mistake. It can be prevented with the array-callback-return ESLint rule.
If User.create(user), etc. were Bluebird promises with default configuration, not chaining them would also result in this warning.
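Here is a minimal, runnable sketch of the difference, using a stub fakeCreate in place of the Sequelize models (the 50 ms timer just stands in for a real insert):

```javascript
// Stub async "create" that takes 50 ms, like a DB insert would.
const delay = ms => new Promise(res => setTimeout(res, ms));

let done = 0;
const fakeCreate = async () => { await delay(50); done++; };

async function main() {
  // Broken: the callback body has braces but no return, so map
  // produces [undefined, undefined, undefined] and Promise.all
  // resolves immediately, before any create has finished.
  await Promise.all([1, 2, 3].map(x => { fakeCreate(x); }));
  const afterBroken = done; // 0 - nothing has finished yet

  await delay(100); // let the stray promises settle
  done = 0;

  // Fixed: the promise is returned, so Promise.all actually waits.
  await Promise.all([1, 2, 3].map(x => fakeCreate(x)));
  const afterFixed = done; // 3 - every create completed first

  return { afterBroken, afterFixed };
}
```

With the broken version, code that runs after the first Promise.all (such as creating the dependent rows) starts while the parent rows are still being inserted, which is exactly the race described in the question.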
My assumption as to why your code might fail:
You're not returning the promises that I guess /(User|Song|Bookmark|History).create/g return to the Promise.all() function, since your map callback is not returning anything.
If you're using arrow functions with curly brackets, then you need to specify the return value explicitly (using the familiar return keyword).
Otherwise you can just omit the curly brackets.
My suggestion is that you refactor your code to use Promise .then() chaining.
For your example, I would suggest something like this:
const Promise = require('bluebird')
const songs = require('./songs.json')
const users = require('./users.json')
const bookmarks = require('./bookmarks.json')
const histories = require('./history.json')
sequelize.sync({
  force: true
}).then(() =>
  Promise.all(
    users.map(user => User.create(user))
  ).then(() =>
    Promise.all(
      songs.map(song => Song.create(song))
    )
  ).then(() =>
    Promise.all(
      bookmarks.map(bookmark => Bookmark.create(bookmark))
    )
  ).then(() =>
    Promise.all(
      histories.map(history => History.create(history))
    )
  )
);
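If you prefer async/await over an explicit .then() chain, the same ordering can be expressed as below. This is a sketch with stub models standing in for the Sequelize ones from the question; the key point is that each map callback returns its promise, so each Promise.all really waits for its group:

```javascript
// Stub models that record the order in which rows are created.
const calls = [];
const makeModel = name => ({
  create: async row => { calls.push(name); return row; }
});
const User = makeModel('User');
const Song = makeModel('Song');
const Bookmark = makeModel('Bookmark');
const History = makeModel('History');

async function seed({ users, songs, bookmarks, historys }) {
  // Parents first; each Promise.all only resolves once every
  // create in its group is done, because the promise is returned.
  await Promise.all(users.map(u => User.create(u)));
  await Promise.all(songs.map(s => Song.create(s)));
  // Parent rows exist now, so children with foreign keys can follow.
  await Promise.all(bookmarks.map(b => Bookmark.create(b)));
  await Promise.all(historys.map(h => History.create(h)));
  return calls;
}
```

In the real seed file you would keep your sequelize.sync({ force: true }).then(...) wrapper and the JSON requires; only the bodies of the map callbacks need to change.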
I want to call a number of API endpoints at once (asynchronously). I get an Observable from a single API call, and I want to "await" them all together and handle the results from each call as a collection, preferably with an option to handle the exceptions (if any) from each one. Much like asyncio.gather(), but for Angular. I don't want to convert the observables to promises, and I don't want to use deprecated methods such as forkJoin() or the async pipe. Is there any other solution?
forkJoin is actually not deprecated; only some overloads of it are.
An array of Observables or a dictionary of Observables are both valid inputs to forkJoin, and neither should trigger a deprecation warning.
const sources$: Observable<any>[] = [obs1$, obs2$, obs3$]
forkJoin(sources$).subscribe()
So instead of 'converting' the observables to promises (I assume you're referring to toPromise()), you just wrap the subscriptions in promises. Then you use Promise.all() to await them concurrently.
constructor(private http: HttpClient) {}

async waitForResponses() {
  const obs1 = this.http.get('api1');
  const obs2 = this.http.get('api2');
  const obs3 = this.http.get('api3');
  const promise1 = new Promise((resolve) =>
    obs1.subscribe((result) => resolve(result))
  );
  const promise2 = new Promise((resolve) =>
    obs2.subscribe((result) => resolve(result))
  );
  const promise3 = new Promise((resolve) =>
    obs3.subscribe((result) => resolve(result))
  );
  const [result1, result2, result3] = await Promise.all<any>([promise1, promise2, promise3]);
  console.log(result1);
  console.log(result2);
  console.log(result3);
}
This awaits all api calls concurrently, but you could await them in any order you like.
Promise.all() returns an array of the results, at the same index as their respective promise. I'm pretty sure that's exactly what you're looking for.
Error handling:
const promise1 = new Promise((resolve, reject) =>
  obs1.subscribe({
    next: (result) => resolve(result),
    error: (err) => reject(err),
  })
).catch((err) => console.log(err));
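The wrapping pattern above can be factored into a small helper. Here is a self-contained sketch, with a tiny subscribe-style stub standing in for the real HttpClient observables (the timer just simulates network latency):

```javascript
// Tiny stand-in for an Observable: something with subscribe({ next, error }).
const fakeObservable = value => ({
  subscribe: ({ next }) => setTimeout(() => next(value), 10)
});

// Wrap any subscribe-style source in a Promise.
const toPromise = obs =>
  new Promise((resolve, reject) =>
    obs.subscribe({ next: resolve, error: reject })
  );

async function waitForAll(sources) {
  // Promise.all keeps results in input order,
  // regardless of which source emits first.
  return Promise.all(sources.map(toPromise));
}
```

With real observables you would pass this.http.get(...) calls as the sources; nothing about the helper changes.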
I have been studying promises, await, and async functions. While I was still in the stage of learning promises, I realized that when I sent out two requests, there was no guarantee they would resolve in the order they were written in the code; that is just how network routing and packets work. When I ran the code below, the requests resolved in no particular order.
const getCountry = async country => {
  await fetch(`https://restcountries.com/v2/name/${country}`)
    .then(res => res.json())
    .then(data => {
      console.log(data[0]);
    })
    .catch(err => err.message);
};

getCountry('portugal');
getCountry('ecuador');
At this point, I hadn't learned about async and await yet. The following code works exactly the way I want: each request waits until the previous one is done.
Is this the most simple way to do it? Are there any redundancies that I could remove? I don't need a ton of alternate examples; unless I am doing something wrong.
const getCountry = async country => {
  await fetch(`https://restcountries.com/v2/name/${country}`)
    .then(res => res.json())
    .then(data => {
      console.log(data[0]);
    })
    .catch(err => err.message);
};

const getCountryData = async function () {
  await getCountry('portugal');
  await getCountry('ecuador');
};

getCountryData();
Thanks in advance,
Yes, that's the correct way to do it. Do realize, though, that by blocking each request you run them one at a time, which is inefficient. As I mentioned, the beauty of JavaScript is its asynchronicity, so take advantage of it: run all the requests concurrently and they will complete drastically faster. Take this example:
// get results...
const getCountry = async country => {
  const res = await fetch(`https://restcountries.com/v2/name/${country}`);
  const json = await res.json();
  return json;
};

const getCountryData = async countries => {
  const proms = countries.map(getCountry); // create an array of promises
  const res = await Promise.all(proms); // wait for all promises to complete
  // get the first value from each returned array
  return res.map(r => r[0]);
};

// demo:
getCountryData(['portugal', 'ecuador']).then(console.log);
// results come back in the order you listed the countries
getCountryData(['ecuador', 'portugal']).then(console.log);
// get lots of countries with speed
getCountryData(['mexico', 'china', 'france', 'germany', 'ecuador']).then(console.log);
Edit: I just realized that Promise.all preserves the input order for you, so there is no need to add an extra sort function. Here's a sort fn anyway, for reference, in case you take a different approach:
myArr.sort((a, b) =>
  countries.indexOf(a.name.toLowerCase()) -
  countries.indexOf(b.name.toLowerCase())
);
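The order preservation mentioned in the edit is easy to check with a tiny sketch: the slower promise still lands first in the result array, because Promise.all orders results by input position, not by completion time.

```javascript
const delay = (ms, value) =>
  new Promise(res => setTimeout(() => res(value), ms));

async function demo() {
  // 'slow' settles last in time but stays first in the results,
  // because it was first in the input array.
  return Promise.all([delay(50, 'slow'), delay(10, 'fast')]);
}
```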
I tried it the way #deceze recommended and it works fine: I removed all of the .then calls and replaced them with await. A lot cleaner this way, and now I can use normal try/catch blocks.
// GET COUNTRIES IN ORDER
const getCountry = async country => {
  try {
    const status = await fetch(`https://restcountries.com/v2/name/${country}`);
    const data = await status.json();
    renderCountry(data[0]); // Data is here. Now render HTML
  } catch (err) {
    console.log(err.name, err.message);
  }
};

const getCountryData = async function () {
  await getCountry('portugal');
  await getCountry('Ecuador');
};

btn.addEventListener('click', function () {
  getCountryData();
});
Thank you all.
I've poked about SO and found many similar questions/answers, but I may be missing something in my basic understanding of how to work with this stack.
I'm working on a React Native project along with RxJS/observables. At some point I'm doing file downloads; this part is not a problem. A combo of the pre-existing axios-rxjs and react-native-file-system gets me where I want. The issue is I'm not sure how to handle it cleanly without async/await, which I understand is an anti-pattern here.
I want to transform this working code into a clean observable-style flow.
I've implemented an Epic that does the operations I want as such:
const myDownloadEpic = (
  action$,
  state$
) =>
  action$.pipe(
    ofType(myDownloadActionType), // catches the relevant action
    map(action => action.payload),
    mergeMap(payload =>
      downloadManager // a class with the relevant utils to get files
        .download(payload) // this axios call returns my file as Observable<Blob>
        .pipe(
          mergeMap(async response => {
            // a promise is returned by RNFS to read the contents of a folder
            const files = await RNFS.readDir(RELEVANT_TARGET_PATH)
            ...
            // a promise returned from a custom function that converts my blob to base64
            const data = await convertToBase64(response)
            // another promise returned by RNFS to write to the file system
            await RNFS.writeFile(FULL_PATH, data, 'base64');
            ...
          })
        )
    )
  )
I've tried splitting this into several pipes; for example, I tried splitting the READ operation into a previous pipe, but it ends up looking very verbose. Is there not a clean, simple way to "hold" until the promises are done so I can make decisions based on their results?
What would be considered cleaner in this situation?
You can try something like this. It should be roughly equivalent to what you've written above.
const myDownloadEpic = (
  action$,
  state$
) => action$.pipe(
  ofType(myDownloadActionType),
  map(action => action.payload),
  mergeMap(payload => downloadManager.download(payload)),
  mergeMap(response => from(RNFS.readDir(RELEVANT_TARGET_PATH)).pipe(
    map(files => ({ response, files }))
  )),
  mergeMap(values => from(convertToBase64(values.response)).pipe(
    map(data => ({ ...values, data }))
  )),
  mergeMap(({ response, files, data }) => RNFS.writeFile(FULL_PATH, data, 'base64'))
);
The from() operator can convert a promise into an observable that emits the resolved value and then completes.
If you need to wait until all promises are resolved, I recommend forkJoin() as it won't emit a value until all observables complete.
Lastly, to make the code a little cleaner, I would also recommend declaring separate variables/functions to define your observables for each promise.
const files$ = from(RNFS.readDir(RELEVANT_TARGET_PATH));

const getData = (response: unknown) => from(convertToBase64(response)).pipe(
  mergeMap(data =>
    from(RNFS.writeFile(FULL_PATH, data, 'base64')).pipe(
      mapTo(data)
    )
  )
);
const myDownloadEpic = (action$, state$) =>
  action$.pipe(
    ofType(myDownloadActionType),
    map(({ payload }) => payload),
    mergeMap(payload => downloadManager.download(payload)),
    mergeMap(response =>
      forkJoin({
        files: files$,
        data: getData(response)
      })
    ),
    map(({ files, data }) => {
      // check something with `files` and `data`
    })
  );
I'm assuming RNFS.writeFile() resolves to void, so I treat it as a side effect inside getData(). The mapTo() operator ignores the value emitted by the source observable and emits whatever value you pass as its parameter instead.
I'm trying to make multiple fetch calls at once. So far I've used this page to reach a point where I can make multiple calls correctly. The problem now is what happens when one of those calls returns an error. If one URL is good and one is bad, I'd like the good URL to still return its JSON object while the bad URL returns the error. But the catch statement notices one error and stops both calls.
My current code:
let resList = await Promise.all([
  fetch(goodURL),
  fetch(badURL)
]).then(responses => {
  return Promise.all(responses.map(response => {
    return response.json();
  }))
}).catch(error => {
  console.log(error);
});
console.log(resList);
Currently, logging resList returns undefined when I want it to return the JSON object from the good URL
As #jonrsharpe mentioned, .allSettled() is probably your best bet.
You can also basically make your own version with something like this:
const results = await Promise.all(
  [a, b, c].map(p =>
    p.catch(err => {
      /* do something with it if you want */
    })
  )
);
Basically, instead of having one catch on Promise.all(), you attach a catch to each Promise BEFORE giving it to Promise.all(). If one of them throws an error, it'll trigger its individual catch, which you can do whatever with. As long as you don't return something else, you'll end up with it returning undefined, which you can filter out.
For example, if a and c worked and b errored, your results would look like this:
['a', undefined, 'c']
And it'd be in the same order as you fed the promises in, so you could tell which was the failure.
const a = new Promise((resolve) => setTimeout(() => resolve('a'), 100));
const b = new Promise((_, reject) => setTimeout(() => reject('b'), 200));
const c = new Promise((resolve) => setTimeout(() => resolve('c'), 50));

(async () => {
  const results = await Promise.all(
    [a, b, c].map(p =>
      p.catch(err => console.error('err', err))
    )
  );
  console.log(results);
})();
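For completeness, here is what the built-in Promise.allSettled() version (the option mentioned at the top) looks like: every promise settles to a { status, value | reason } record, so one rejection no longer hides the other results.

```javascript
async function settleAll() {
  const good = Promise.resolve('ok');
  const bad = Promise.reject(new Error('boom'));

  const results = await Promise.allSettled([good, bad]);
  // Map each record to its value, or a tagged failure message.
  return results.map(r =>
    r.status === 'fulfilled' ? r.value : `failed: ${r.reason.message}`
  );
}
```

Unlike the hand-rolled catch approach, you keep the rejection reason alongside the successes instead of collapsing failures to undefined.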
I have this code:
exports.cleanDB = function() {
  return mongoose.connection.db.dropDatabase();
};
Since that is considered bad practice, I want to iterate through all collections and call
mongoose.connection.db.DateTime.remove();
on each of them.
Can somebody help me write that code, together with the return statement?
In another part of the app there is similar code that I don't know how to rewrite:
exports.cleanDB = function*(req) {
  yield mongoose.connection.db.dropDatabase();
I really don't see what's wrong with dropping the database. But if you must, then you can just loop over the registered models and call .remove() on each.
For instance:
// Just simulating an async wrapper
(async function() {
  try {
    const conn = await mongoose.connect(uri, options);
    // Loop all registered models
    await Promise.all(
      Object.entries(conn.models).map(([k, m]) => m.remove())
    )
  } catch (e) {
    console.error(e)
  }
})()
Or plain promises:
mongoose.connect(uri, options).then(conn => {
  Promise.all(
    Object.entries(conn.models).map(([k, m]) => m.remove())
  ).then(() => { /* something */ })
})
You can even use Object.keys() if you don't have support for Object.entries():
mongoose.connect(uri, options).then(conn => {
  Promise.all(
    Object.keys(conn.models).map(k => conn.models[k].remove())
  ).then(() => { /* something */ })
})
Or, if you really must, dig down to the database level and wipe all the collections using the .collections() method from Db:
(async function() {
  try {
    const conn = await mongoose.connect(uri, options);
    // Get every collection in an array
    await Promise.all(
      (await conn.db.collections()).map(c => c.remove())
    );
  } catch (e) {
    console.error(e)
  }
})()
Or plain promises:
mongoose.connect(uri, options).then(conn => {
  conn.db.collections()
    .then(collections => Promise.all(
      collections.map(c => c.remove())
    ))
    .then(() => { /* something */ })
})
And that approach does not care whether the model was registered or not.
So it really depends on which approach you would rather take. If you already have code that loads and registers each model, then using the registered models should be sufficient. Otherwise, using the direct driver method to grab references to all collections currently in the database ensures that even if a model has not yet been registered, its collection still has all of its content removed.
Note that Db.collections() is basically a wrapped version of the output of Db.listCollections(), which returns Collection objects instead of just the names.
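One caveat worth noting: in newer Mongoose versions Model.remove() is itself deprecated, in favor of deleteMany(). The registered-models loop then looks like the sketch below; the stub conn object only stands in for a real connection so the shape is runnable, and in real code it would come from mongoose.connect():

```javascript
// Stub connection with two registered "models"; only deleteMany's
// shape matters here, a real conn comes from mongoose.connect().
const wiped = [];
const conn = {
  models: {
    User: { deleteMany: async () => { wiped.push('User'); } },
    Song: { deleteMany: async () => { wiped.push('Song'); } }
  }
};

async function cleanDB() {
  // An empty filter {} removes every document in each collection.
  await Promise.all(
    Object.values(conn.models).map(m => m.deleteMany({}))
  );
  return wiped.sort();
}
```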