Async function inside for loop - javascript

I am building an app and I have a problem inside a for loop.
Inside my function I receive two arrays as arguments (payload.data.start and payload.data.end), and I am trying to push them into MongoDB.
My code looks like this:
async function emplaceAval (state, payload, blockInfo, context) {
  // start and end have the same length
  for (var i = 0; i < payload.data.start.length; i++) {
    const user = await User.findOne({ 'account': payload.data.account });
    user.availability.push({
      start: new Date(payload.data.start[i] + 'Z'),
      end: new Date(payload.data.end[i] + 'Z')
    });
    await user.save();
  }
}
The problem is that I often lose data. By losing data I mean that i changes before user.save() takes place.
I considered using forEach, but I have two arrays that need to be saved together, so I can't.
The second solution I thought of is to create an index array. For example, if the length of my arrays is 5, I would create indexTable = [0, 1, 2, 3, 4] and apply asyncForEach to that array, as sketched below. But I don't think that solution is preferable.
Any ideas? Thanks in advance.
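For reference, a minimal sketch of that index-array idea, assuming a hand-rolled sequential asyncForEach helper (it is not a built-in) and the same payload shape as above:
async function asyncForEach(array, callback) {
  // Sequential: await each callback before moving on to the next index
  for (let index = 0; index < array.length; index++) {
    await callback(array[index], index);
  }
}

const indexTable = payload.data.start.map((_, i) => i); // e.g. [0, 1, 2, 3, 4]
await asyncForEach(indexTable, async (i) => {
  const user = await User.findOne({ 'account': payload.data.account });
  user.availability.push({
    start: new Date(payload.data.start[i] + 'Z'),
    end: new Date(payload.data.end[i] + 'Z')
  });
  await user.save();
});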

From what I can see here the looping is completely unnecessary. MongoDB has a $push operator which allows updating an array without retrieving the document first. It also has an $each option that allows a list of elements to be "pushed" in the single update.
In short, this is just one request and response to the server to await:
// Transpose to array of objects for update
let availability = payload.data.start.map((e, i) => ({
  start: new Date(e + 'Z'),
  end: new Date(payload.data.end[i] + 'Z')
}));

try {
  // Perform the **one** update request
  let response = await User.updateOne(
    { 'account': payload.data.account },
    { '$push': { 'availability': { '$each': availability } } }
  );
  // maybe check the response
} catch (e) {
  // do something with any error
}
That's all you need to do. No need to "loop", and there's much less overhead than going back and forth to the server retrieving a document, making changes, and then putting the document back.
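Put back into the shape of the original function, a minimal sketch (keeping the emplaceAval signature from the question) could look like this:
async function emplaceAval (state, payload, blockInfo, context) {
  // Transpose the two parallel arrays into one array of { start, end } objects
  const availability = payload.data.start.map((e, i) => ({
    start: new Date(e + 'Z'),
    end: new Date(payload.data.end[i] + 'Z')
  }));

  // One round trip: $push with $each appends every element to availability
  await User.updateOne(
    { 'account': payload.data.account },
    { '$push': { 'availability': { '$each': availability } } }
  );
}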

Related

Missing updated record when using map function

This is my code for updating many records with MongoDB.
const nfts = await waxModel.find();
console.log(nfts.length); // -> 121

// Option 1: using map()
nfts.map(async nft => {
  nft.trait_attribute = null;
  await nft.save();
});

// Option 2: using loop
for (let i = 0; i < nfts.length; i++) {
  nfts[i].trait_attribute = null;
  await nfts[i].save();
}
This is my DB; no record has trait_attribute: null:
This is the first result when I use map(); only 2 records are updated:
And this is the second result when I use the for loop:
I don't know what the problem is. From my previous question I know that the map method doesn't handle asynchronous functions, but back then it still updated all the records instead of missing many of them as it does now.
Thanks for your attention.
It looks like your code doesn't handle race conditions well.
The first option saves every nft simultaneously whilst the second one saves them one at a time. A more readable way to do that would be using a for-of loop:
for (const nft of nfts) {
  nft.trait_attribute = null;
  await nft.save();
}

Discord.js Role Assign and Looping Issues

I am not very efficient with my code, which may be the reason why this keeps failing. I am trying to remove and assign roles for "verified" users. The basic gist of the code is to loop through all "verified" users and assign them appropriate roles according to the data received from the API.
const fetch = require("node-fetch");

var i = 0;

function mainLoop(
  guild,
  redisClient,
  users,
  main_list,
  Pilot,
  Astronaut,
  Cadet,
  main_guild,
  cadet_guild,
  guest
) {
  setTimeout(function () {
    redisClient.GET(users[i], async function (err, reply) {
      if (reply != null) {
        var json = await JSON.parse(reply);
        var uuid = Object.keys(json).shift();
        if (Object.keys(main_list).includes(uuid)) {
          var tag = users.shift();
          var rank = main_list[uuid];
          console.log(`${tag}: ${rank}`);
          var role = guild.roles.cache.find(
            (role) => role.name === `| ✧ | ${rank} | ✧ |`
          );
          await guild.members.cache.get(tag).roles.remove(guest);
          await guild.members.cache.get(tag).roles.remove(Astronaut);
          await guild.members.cache.get(tag).roles.remove(Cadet);
          await guild.members.cache.get(tag).roles.remove(Pilot);
          await guild.members.cache.get(tag).roles.remove(cadet_guild);
          await guild.members.cache.get(tag).roles.add(main_guild);
          await guild.members.cache.get(tag).roles.add(role);
        } else {
          var tag = users.shift();
          console.log(`${tag}: Guest`);
          await guild.members.cache.get(tag).roles.remove(Astronaut);
          await guild.members.cache.get(tag).roles.remove(Cadet);
          await guild.members.cache.get(tag).roles.remove(Pilot);
          await guild.members.cache.get(tag).roles.remove(main_guild);
          await guild.members.cache.get(tag).roles.remove(cadet_guild);
          await guild.members.cache.get(tag).roles.add(guest);
        }
      }
      i++;
      if (i < users.length) {
        mainLoop(
          guild,
          redisClient,
          users,
          main_list,
          Pilot,
          Astronaut,
          Cadet,
          main_guild,
          cadet_guild,
          guest
        );
      }
    });
  }, 5000);
}
The code will fetch API data and map the "verified" users and API data into an array. Then, when it starts looping through the users array, it only logs 3 times and does not assign any roles. Any help would be appreciated.
I can provide extra explanation/code if needed.
One possible issue I see here is that you are both incrementing the index i and calling .shift() on the users array. This may be the cause of the problem you are experiencing, as this will entirely skip some of the users in the array. Array.shift() doesn't just return the first element of the array; it removes it from the array.
Consider, for example, that your users array looks like this:
var users = ["Ted", "Chris", "Ava", "Madison", "Jay"];
And your index starts at 0 like so:
var i = 0;
This is what is happening in your code:
Assign roles for users[i]; the index is currently 0, so get users[0] (Ted).
Get Ted's tag via users.shift(). users is now: ["Chris", "Ava", "Madison", "Jay"]
Increment the index with i++. i is now: 1.
Assign roles for users[i]; the index is currently 1, so get users[1] (now Ava, skips Chris entirely).
Get Ava's tag via users.shift() (actually gets Chris' tag). users is now: ["Ava", "Madison", "Jay"]
Increment the index with i++. i is now: 2.
Assign roles for users[i]; the index is currently 2, so get users[2] (now Jay, skips Madison entirely).
And so on, for the rest of the array; about half of the users in the users array will be skipped.
I don't know how many users are supposed to be in your users array, but this could be the reason why so few logs are occurring. Note, however, that this is just one cause of the problem you are experiencing; it is possible that there are more reasons why you are having that issue, such as rate limits.
My recommendation on how to fix this is to not use users.shift() to get the user's tag. Simply use users[i], which will return the proper tag value without messing with the length of the array. Another way to fix this would be to remove the index incrementation, and always use 0 as your index. Use one or the other, but not both.
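A quick way to see the effect is to run the same pattern on the example array from above; the index and shift() disagree about which user is current, and the loop ends after only three passes:
var users = ["Ted", "Chris", "Ava", "Madison", "Jay"];
var i = 0;
while (i < users.length) {
  console.log("users[i] is:", users[i]);   // Ted, Ava, Jay
  var tag = users.shift();                 // returns Ted, then Chris, then Ava
  console.log("shift() returned:", tag);
  i++;
}
// Stops after three iterations because the array keeps shrinking,
// and Chris and Madison are never the user the index points at.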

Multiple Firebase listeners in useEffect and pushing new event into state

I want to retrieve a list of products based on the user's position; for this I use Geofirestore and update my FlatList.
When I have my first 10 closest collections, I loop over them to get each of their sub-collections.
I manage to update my state, but every time my collection is modified somewhere else, instead of updating my list it duplicates the modified object: the updated version is added at the end of my list while the old object stays in the list too.
For example:
const listListeningEvents = {
  A: { Albert, Ducon },
  B: { Mickael }
}
Another user modifies 'A' and deletes 'Ducon'; I will get:
const listListeningEvents = {
  A: { Albert, Ducon },
  B: { Mickael },
  A: { Albert }
}
And not:
const listListeningEvents = {
  A: { Albert },
  B: { Mickael }
}
That's my useEffect:
useEffect(() => {
  let geoSubscriber;
  let productsSubscriber;

  // 1. getting user's location
  getUserLocation()
    // 2. then calling geoSubscriber to get the 10 nearest collections
    .then((location) => geoSubscriber(location.coords))
    .catch((e) => {
      throw new Error(e.message);
    });

  // Here
  geoSubscriber = async (coords) => {
    let nearbyGeocollections = await geocollection
      .limit(10)
      .near({
        center: new firestore.GeoPoint(coords.latitude, coords.longitude),
        radius: 50,
      })
      .get();

    // Empty array for loop
    let nearbyUsers = [];

    // 3. Getting Subcollections by looping onto the 10 collections queried by Geofirestore
    productsSubscriber = await nearbyGeocollections.forEach((geo) => {
      if (geo.id !== user.uid) {
        firestore()
          .collection("PRODUCTS")
          .doc(geo.id)
          .collection("USER_PRODUCTS")
          .orderBy("createdDate", "desc")
          .onSnapshot((product) => {
            // 4. Pushing each result (and I guess the issue is here!)
            nearbyUsers.push({
              id: product.docs[0].id.toString(),
              products: product.docs,
            });
          });
      }
    });

    setLoading(false);

    // 5. Setting my state which will be used within my Flatlist
    setListOfProducts(nearbyUsers);
  };

  return () => {
    if (geoSubscriber && productsSubscriber) {
      geoSubscriber.remove();
      productsSubscriber.remove();
    }
  };
}, []);
I've been struggling for ages to make this work properly and I'm going crazy.
So I'm dreaming about two things:
Be able to update my state without duplicating modified objects.
(Bonus) Find a way to get the next 10 nearest points when I scroll down in my FlatList.
In my opinion the problem is with the type of nearbyUsers. It is initialized as an Array ([]), and when you push another object onto it, a new item is simply added at the end (Array reference).
In this situation an Array is not very convenient: to achieve the goal you would need to check every existing item in the Array and, if you find one with the proper id, update it.
I think the most convenient structure here is a Map (Map reference). A Map is indexed by key, so it is possible to get a particular value without searching for it.
I will try to adjust it to the presented code (not all lines, just the changes):
Change the object used to a Map, where the key is the id and the value is products:
let nearbyUsersMap = new Map();
Use the set method instead of push to update the products for a particular key:
nearbyUsersMap.set(product.docs[0].id.toString(), product.docs);
Finally, convert the Map to an Array to get the same shape of object for the further code (taken from here):
let nearbyUsers = Array.from(nearbyUsersMap, ([id, products]) => ({ id, products }));
setListOfProducts(nearbyUsers);
This should work, but I do not have a playground to test it. If you get any errors, just try to resolve them. I am not very familiar with geofirestore, so I cannot help you more. There are certainly tons of other ways to achieve the goal; however, this should work with the presented code and requires just a few changes.
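Putting those three changes together inside the existing subscriber, a rough sketch (untested, and with the state update moved into the snapshot callback so that later snapshots are reflected) might look like:
// Keyed by document id, so a later snapshot for the same id replaces the
// earlier entry instead of appending a duplicate
const nearbyUsersMap = new Map();

nearbyGeocollections.forEach((geo) => {
  if (geo.id !== user.uid) {
    firestore()
      .collection("PRODUCTS")
      .doc(geo.id)
      .collection("USER_PRODUCTS")
      .orderBy("createdDate", "desc")
      .onSnapshot((product) => {
        nearbyUsersMap.set(product.docs[0].id.toString(), product.docs);
        // Convert the Map back to the array shape the FlatList expects
        const nearbyUsers = Array.from(nearbyUsersMap, ([id, products]) => ({ id, products }));
        setListOfProducts(nearbyUsers);
      });
  }
});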

Javascript - Push to a new array dynamically?

Hello, I am new to JavaScript, so I'm sorry in advance if my explanation is not the best.
I am getting my data back from an array called myData. I have a conditional statement which checks the page URL, and depending on the URL I push a specific index of each array into a new array called stateArray.
At the moment I am using the push method like this:
stateArray.push(myData[1][5], myData[2][5], myData[3][5], myData[4][5], myData[5][5], myData[6][5], myData[7][5], myData[8][5], myData[9][5], myData[10][5], myData[11][5], myData[12][5], myData[13][5], myData[14][5], myData[15][5], myData[16][5], myData[17][5], myData][5])
The stateArray gives back the data I am expecting, but I am going to have ten different conditions and would like to know if there is a better way than writing the push 17 times for every condition.
Every element is the same within a condition. For example:
if (url.includes('/states/') {
stateArray.push(myData[1][5], myData[2][5], myData[3][5], myData[4][5], myData[5][5], myData[6][5], myData[7][5], myData[8][5], myData[9][5], myData[10][5], myData[11][5], myData[12][5], myData[13][5], myData[14][5], myData[15][5], myData[16][5], myData[17][5], myData][5])
} else if (url.includes('/homes/) {
stateArray.push(myData[1][6], myData[2][6], myData[3][6], myData[4][6], myData[5][6], myData[6][6], myData[7][6], myData[8][6], myData[9][6], myData[10][6], myData[11][6], myData[12][6], myData[13][6], myData[14][6], myData[15][6], myData[16][6], myData[17][6], myData][6])
} else if (url.incldues('/retail/) {
stateArray.push(myData[1][7], myData[2][7], myData[3][7], myData[4][7], myData[5][7], myData[6][7], myData[7][7], myData[8][7], myData[9][7], myData[10][7], myData[11][7], myData[12][7], myData[13][7], myData[14][7], myData[15][7], myData[16][7], myData[17][5], myData][7])
}
Like I mentioned earlier, I currently have 10 conditions and it is very difficult to maintain and update. Is there a way of generating the same results dynamically? I believe this can be done with a loop, but I am not familiar with the syntax for starting the push at a specific index and ending at a specific index.
My expected outcome is a shorthand way of going through each condition and pushing into the new array.
You can use forEach:
if (url.includes('/states/')) {
  myData.forEach(e => stateArray.push(e[5]));
} else if (url.includes('/homes/')) {
  myData.forEach(e => stateArray.push(e[6]));
} else if (url.includes('/retail/')) {
  myData.forEach(e => stateArray.push(e[7]));
}
(Also note you need a second ) at the end of your if to close off both the if and the includes)
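If the list of URL/column pairs keeps growing, one way to avoid repeating the branch ten times is a small lookup table (a sketch, assuming the same url, myData and stateArray as above):
// Map each URL fragment to the column index it should read from myData
const columnByPath = {
  '/states/': 5,
  '/homes/': 6,
  '/retail/': 7
  // ...the remaining conditions go here
};

for (const [path, column] of Object.entries(columnByPath)) {
  if (url.includes(path)) {
    myData.forEach(e => stateArray.push(e[column]));
    break; // behave like the if/else chain: only the first match applies
  }
}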

Splicing array to add to an existing array

I am using rss2json to consume an RSS feed. There is no page param to enable pagination, but there is a count parameter that I can pass to the request. I am able to load the feed and get results back. I have created a service using Ionic to make a request to get the feed:
getRssFeed(rssUrl: string, count: number) {
  return new Promise<any>(resolve => {
    this.http.get(`${ environment.podcast.baseUrl }?rss_url=${ rssUrl }&api_key=${ environment.podcast.apiKey }&count=${ count }`)
      .subscribe(data => {
        resolve(data);
      }, error => {
        console.error('Something really bad happened trying to get rss feed.');
        console.error(error);
      });
  });
}
This works great. I can get the data back; all is well. I am using an infinite scroll component to handle pagination. Again, all is well. I am starting with 10 podcast episodes, and I log when I want to load more episodes:
When I scroll, the service makes the correct call, but because the rss2json service does not have a page param, it will return the entire array when I update the count.
So I need to do something like this:
episodes: Array<any>;
count = 10;
...
this.episodes.splice(this.episodes.length, this.count, data.items);
I need to find out how many episodes I already have. The first time I get to the bottom of my list, I'll have 10 (I want to increment +10 each load). So I need to:
Find out how many episodes I currently have (10, 20, 30 etc.)
Make the request to get more episodes
Service returns 20 episodes -- but it will always start at zero.
Slice the first 10, 20, ?? episodes that are returned, add the remaining 10 to the end of the list.
I am not sure how to achieve this and could use some direction.
Here is how I am requesting more episodes:
this.myPodcastService.getRssFeed(this.rssUrl, this.count)
  .then(data => {
    if (data) {
      // console.log('data', data.items);
      // data is an object
      // data.items is an array of episodes
      // this.episodes.splice(this.episodes.length, this.count, data.items);
    } else {
      ...
    }
    ...
  });
For example, the first time I get to the end of my episodes, I'll have 10 on the page. I want to go out, get 10 more episodes. So I need to increment my count variable to 20 and pass that in as the count param.
The service will return 20 items. The first 10 I want to delete (they are already on screen); I only need the last 10 episodes.
Now I'll have 20 episodes. The next time I scroll, I'll need to increment my count to 30. The service will return an array of 30 items. I will need to delete (splice) the first 20, leaving only the last 10, then add those to the episodes array.
The logging should show something like:
this.episodes[10]
this.episodes[20]
this.episodes[30]
I hope that makes sense. I know what I'm trying to achieve, I'm struggling how to actually do it. Thank you for any suggestions!
EDIT/SOLUTION
Thank you so much for the suggestion! In case someone else comes across this, here is what I came up with that is doing exactly what I need.
// load more episodes using infinite scroll.
loadMoreEpisodes(event) {
  console.log('--> loading more episodes');
  this.count = (this.count + this.count); // 10, 20, 30...
  this.myPodcastService.getRssFeed(this.rssUrl, this.count)
    .then(data => {
      if (data) {
        // append the new episodes to the existing array
        this.episodes.push(...data.items.splice(-this.episodes.length, this.count));
        event.target.complete();
        console.log('this.episodes', this.episodes);
      } else {
        this.alertCtrl.create({
          header: 'Error',
          subHeader: 'Something bad happened',
          message: 'Something internet related happened & we couldn\'t load the playlist.',
          buttons: [{ text: 'Ok', role: 'cancel' }]
        }).then(alert => {
          alert.present();
        });
      }
    });
}
Given the API does not provide a means to request only the new data, so the client has to request duplicate data, you can .splice() from the end of the array:
this.episodes.push(...data.splice(-10, 10))
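As a quick illustration of what the negative start index does, assuming 10 new items arrive with each request:
// splice(-10, 10) removes and returns the last 10 elements of the response,
// so only the episodes that are not already on screen get appended
const data = Array.from({ length: 20 }, (_, i) => `episode ${i + 1}`);
const episodes = data.slice(0, 10);       // the 10 already on screen
episodes.push(...data.splice(-10, 10));   // appends episodes 11–20
console.log(episodes.length);             // 20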
