Insert Multiple Items Into DynamoDB Using Promise.All - javascript

I'm trying to insert multiple items into a DynamoDB table using Promise.all(), reusing a function that inserts a single item and returns a promise. The problem I'm having is that the array of returned data objects is empty. The data object is returned correctly when I insert a single item.
I based my implementation on this AWS Post.
My implementation is below:
function addMessage(message) {
  const timestamp = Math.floor(new Date().getTime() / 1000);
  const params = {
    TableName: process.env.DYNAMODB_TABLE,
    Item: {
      esn: message.esn,
      utcMsgTimestamp: parseInt(message.utcMsgTimestamp),
      payload: message.payload,
      createdAt: timestamp
    }
  };
  console.log(params);
  return dynamoDb.put(params).promise();
}

function addMessages(messages) {
  // Make all of the addMessage calls immediately; this will return a rejected promise if any fail
  return Promise.all(
    messages.map(message => {
      return addMessage(message);
    })
  );
}
My Jest unit test case is here:
it.only("add multiple messages", () => {
const dynamodb = require("../dynamodb");
expect.assertions(0);
const messages = [
{
esn: "0-700001",
utcMsgTimestamp: "1034268516",
payload: "0xC0560D72DA4AB2445A"
},
{
esn: "0-99991",
utcMsgTimestamp: "1034268521",
payload: "0xA14AA1DBDB818F9759"
}
];
return dynamodb.addMessages(messages).then(dataArray => {
dataArray.forEach(element => {
console.log(element);
});
// expect(true).toBeTruthy();
});
Any help would be greatly appreciated.
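Worth noting: DocumentClient.put() resolves with an empty result object unless ReturnValues is set, and for put only NONE and ALL_OLD are supported, so empty elements in dataArray are expected. One possible workaround, sketched below rather than taken from the linked AWS post, is to resolve with the item that was just written:
function addMessage(message) {
  const timestamp = Math.floor(new Date().getTime() / 1000);
  const params = {
    TableName: process.env.DYNAMODB_TABLE,
    Item: {
      esn: message.esn,
      utcMsgTimestamp: parseInt(message.utcMsgTimestamp),
      payload: message.payload,
      createdAt: timestamp
    }
  };
  // put() resolves with {} by default, so resolve with the written item instead,
  // making the Promise.all() result array useful.
  return dynamoDb.put(params).promise().then(() => params.Item);
}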

Related

useSWRInfinite with pagination and mutate features

I'm using useSWR to fetch data from client side in nextjs.
What I am doing and trying to achieve
I am using useSWRInfinite for the pagination feature and trying to update a comment's like state with the bound mutate function and the optimisticData option, since I want the data to refresh immediately (from the client-side perspective) -> https://swr.vercel.app/docs/mutation#optimistic-updates. I then get the updated comment back from axios and replace the previous comment with it.
Expected
The data from useSWRInfinite should be updated right away while the API call is in flight, since I am using the optimisticData option. I could have set the revalidate option to true, but the async function passed to mutate already returns the updated data built from the axios response, so I didn't need it.
Actual behaviour
Even though I am passing optimisticData to mutate, it doesn't update the data immediately. It keeps waiting until the API call is done and only then gets updated.
What I've tried
I have tried using the plain useSWR hook without the pagination feature, and it worked as I expected.
const { data, error, isValidating, mutate, size, setSize } = useSWRInfinite<CommentType[]>(
  (index) => `/api/comment?postId=${postId}&currentPage=${index + 1}`,
  fetcher,
  { revalidateFirstPage: false }
);

const likeCommentHandler = async (commentId: string, dislike: boolean) => {
  const optimisticData = data?.map((comments) => {
    return comments.map((comment) => {
      if (comment.id === commentId) {
        if (dislike) {
          --comment._count.likedBy;
          comment.likedByIds = comment.likedByIds.filter(
            (likeById) => likeById !== session!.user.id
          );
        } else {
          comment.likedByIds.push(session!.user.id);
          ++comment._count.likedBy;
        }
        return { ...comment };
      } else {
        return { ...comment };
      }
    });
  });

  mutate(
    async (data) => {
      const { data: result } = await axios.post("/api/likeComment", {
        commentId: commentId,
        userId: session?.user.id,
        dislike,
      });
      const newData = data?.map((comments) => {
        return comments.map((comment) => {
          if (comment.id === result.comment.id) {
            return result.comment;
          } else {
            return comment;
          }
        });
      });
      return newData;
    },
    { optimisticData, revalidate: false, populateCache: true }
  );
};
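One detail worth flagging in the handler above (an observation, not a confirmed fix for the useSWRInfinite behaviour): the optimisticData computation mutates the cached comment objects in place (--comment._count.likedBy, push/filter on likedByIds) before spreading them. A non-mutating sketch, reusing the question's data, commentId, dislike and session variables, would be:
const optimisticData = data?.map((comments) =>
  comments.map((comment) => {
    if (comment.id !== commentId) return comment;
    // Build a fresh comment object instead of mutating the cached one.
    return dislike
      ? {
          ...comment,
          _count: { ...comment._count, likedBy: comment._count.likedBy - 1 },
          likedByIds: comment.likedByIds.filter((id) => id !== session!.user.id),
        }
      : {
          ...comment,
          _count: { ...comment._count, likedBy: comment._count.likedBy + 1 },
          likedByIds: [...comment.likedByIds, session!.user.id],
        };
  })
);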

React JS multiple API calls, data undefined or unexpected reserved word 'await' mapping through the data:

I'm creating a JS function that will make a call to an API, loop through the returned data, and perform another call to retrieve more information about the initial data (for example, where the first call returns an ID, the second call would return the name/address/number that the ID corresponds to). Positioning the async and await keywords, though, has proven to be way more challenging than I imagined:
useEffect(() => {
  const getAppointments = async () => {
    try {
      const { data } = await fetchContext.authAxios.get('/appointments/' + auth.authState.id);
      const updatedData = await data.map(value => {
        const { data } = fetchContext.authAxios.get('/customerID/' + value.customerID);
        return {
          ...value, // de-structuring
          customerID: data
        };
      });
      setAppointments(updatedData);
    } catch (err) {
      console.log(err);
    }
  };
  getAppointments();
}, [fetchContext]);
Everything gets displayed besides the customerID, which ends up undefined. I tried to position the async/await keywords in different places; nothing works. What am I missing?
map returns an array, not a promise. You need to build an array of promises and then resolve them with Promise.all() (also, if your approach had worked, it would be inefficient to wait for one request to finish before starting the next one):
const promises = data.map(async (value) => {
  const { data } = await fetchContext.authAxios.get('/customerID/' + value.customerID);
  return {
    ...value,
    customerID: data
  };
});
const updatedData = await Promise.all(promises);
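Putting that back into the original effect, a sketch that reuses the question's fetchContext, auth and setAppointments names (untested against that codebase):
useEffect(() => {
  const getAppointments = async () => {
    try {
      const { data } = await fetchContext.authAxios.get('/appointments/' + auth.authState.id);
      // Kick off all customer lookups in parallel, then wait for every one of them.
      const promises = data.map(async (value) => {
        const { data: customer } = await fetchContext.authAxios.get('/customerID/' + value.customerID);
        return { ...value, customerID: customer };
      });
      const updatedData = await Promise.all(promises);
      setAppointments(updatedData);
    } catch (err) {
      console.log(err);
    }
  };
  getAppointments();
}, [fetchContext]);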

Nested Promise Function Using Firebase Database

I have the following function, which accepts an argument (an array of "names") and then checks data from my firebase database for each user against that array.
It uses that to compile, for each user, a "settings" entry containing their email and the names that the user shares with the list passed in as an argument. The function looks like this:
fbDaemon = ({ folderNames }) => {
  const settings = [];
  ref.once("value")
    .then((snapshot) => {
      snapshot.forEach(user => {
        auth.getUser(user.key)
          .then(function(userRecord) {
            let email = userRecord.toJSON().email;
            let zips = [];
            user.forEach(setting => {
              let dep = setting.val().department;
              if (folderNames.includes(dep)) {
                zips.push(dep);
              }
            });
            settings.push({ email, zips });
          })
          .catch(function(error) {
            console.log("Error fetching user data:", error);
          });
      });
    });
};
Essentially, it's going through my entire database and compiling a list of the settings that I will pass on to the next function. The end result should look something like this:
[ { email: 'example@example1.com',
    zips: [ 'Drug Enforcement Administration', 'Executive Branch' ] },
  { email: 'example@example2.com',
    zips: [ 'FEMA', 'Congress' ] } ];
The problem that I'm having right now is I'm not able to return the "settings" array at the appropriate time.
How can I reconfigure this function so that the settings array is only returned when the entire function has run?
In other words, I'd like to return a resolved promise with the settings array. How can I do this?
Perhaps you could use Promise.all() here to resolve an array of promises (where each item in the array corresponds to a call to getUser for that item/user)?
So, something along these lines:
fbDaemon = ({ folderNames, folderPaths }) => {
  const settings = [];
  return ref.once("value")
    .then((snapshot) => {
      // Collect all users from snapshot into an array
      const users = [];
      snapshot.forEach(user => { users.push(user); });
      // Create promise for each user, and use Promise.all to
      // resolve when each "user promise" is complete
      return Promise.all(users.map(user => {
        // Be sure to add "return" here
        return auth.getUser(user.key)
          .then(function(userRecord) {
            let email = userRecord.toJSON().email;
            let zips = [];
            user.forEach(setting => {
              let dep = setting.val().department;
              if (folderNames.includes(dep)) {
                zips.push(dep);
              }
            });
            settings.push({ email, zips });
          })
          .catch(function(error) {
            console.log("Error fetching user data:", error);
          });
      }));
    }).then(function() {
      return settings;
    });
};
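Since fbDaemon now returns the whole chain, a caller can simply wait on it; a brief usage sketch (the folder names are placeholders taken from the expected output above):
fbDaemon({ folderNames: ['FEMA', 'Congress'] })
  .then((settings) => {
    // settings is the fully populated array of { email, zips } objects
    console.log(settings);
  })
  .catch((error) => console.log(error));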

Function to return non-null value not working properly when saving data to MongoDB

I get an array of objects from an API call and then I filter the values depending on two keys: story_title and title. If both values are null then the object is filtered. Then I loop through the filtered array of objects to save certain data to mongodb (using mongoose) from the filtered array. The problem is that I want to save the document with one title key, so I created a function to check if story_title or title is null and return the non-null value.
The function is not working properly, because the call assigned to title inside the for loop is returning some null values.
function pickTitle(title, story_title) {
  if (!story_title) {
    return story_title;
  } else {
    return title;
  }
}
everyHour: async () => {
  try {
    data = await axios.get(url);
    let posts = data.data.hits;
    const filteredPosts = await posts.filter((elem) => {
      return (!elem.title || !elem.story_title);
    });
    for (let filtered of filteredPosts) {
      Post.updateOne({
        _id: filtered.objectID,
        author: filtered.author,
        title: await pickTitle(filtered.title, filtered.story_title),
        created_at: filtered.created_at,
      },
      {$setOnInsert: filtered},
      {upsert: true},
      function(err, numAffected) {
        if (err) {
          //console.log("err")
        } else {
          //console.log(numAffected)
        }
      })
      .then(res => {
        //console.log(res)
      })
      .catch(err => {
        //console.log(err)
      });
    }
  } catch(error) {
    console.log(error);
  }
}
There are a few issues here. I'll step through them in comments in the code, as well as in the "solution" directly below the commented code.
You are awaiting calls that are not asynchronous... don't do that. Array.filter is not an asynchronous operation.
Your function pickTitle is being awaited, when it's not asynchronous.
You're resolving promises inside a loop, which is generally considered bad practice. Add your promises to an array and resolve them all with Promise.all().
Lastly, your "filter" logic filters on a NULL title OR a NULL story_title. That means only one of them needs to hold true, but it's possible that both are NULL, in which case your pickTitle function returns a null value. If you want at least one of them to always contain a value, you need to change the way your Array.filter works.
const pickTitle = (title, story_title) => {
  if (!story_title) {
    return story_title;
  }
  return title;
};

async () => {
  try {
    data = await axios.get(url);
    const posts = data.data.hits;
    // const filteredPosts = await posts.filter(elem => (!elem.title || !elem.story_title)); <-- you are awaiting on a filter... don't do that
    const filteredPosts = posts.filter(elem => (!elem.title || !elem.story_title));
    const filteredPostAnotherWay = posts.filter(post => { // <-- This might be more of what you want...
      let allowed = false;
      if (!post.title) {
        allowed = true;
      }
      if (!post.story_title) {
        allowed = true;
      }
      return allowed;
    });
    const postUpdates = [];
    for (const filtered of filteredPosts) {
      // Post.updateOne({ <-- You are resolving promises inside a for loop. While it may work, it's generally not advised to do this. Resolve everything with Promise.all instead....
      //   _id: filtered.objectID,
      //   author: filtered.author,
      //   title: await pickTitle(filtered.title, filtered.story_title),
      //   created_at: filtered.created_at,
      //   story_url: filtered.story_url
      // },
      // { $setOnInsert: filtered },
      // { upsert: true },
      // (err, numAffected) => {
      //   if (err) {
      //     // console.log("err")
      //   } else {
      //     // console.log(numAffected)
      //   }
      // })
      // .then(res => {
      //   // console.log(res)
      // })
      // .catch(err => {
      //   // console.log(err)
      // });
      postUpdates.push(
        Post.updateOne({
          _id: filtered.objectID,
          author: filtered.author,
          // title: await pickTitle(filtered.title, filtered.story_title), // <-- You are awaiting a non asynchronous function... why?
          title: pickTitle(filtered.title, filtered.story_title),
          created_at: filtered.created_at,
          story_url: filtered.story_url
        },
        { $setOnInsert: filtered },
        { upsert: true },
        (err, numAffected) => {
          if (err) {
            // console.log("err")
          } else {
            // console.log(numAffected)
          }
        })
      );
    }
    return Promise.all(postUpdates);
  } catch (error) {
    console.log(error);
  }
};
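To make the last point concrete, a sketch that is not part of the answer above: if the goal is to keep only posts where at least one of the two fields is present and always store the non-null one, the filter and the picker could be written like this (note the condition is flipped relative to pickTitle above):
// Keep posts that have at least one usable title.
const filteredPosts = posts.filter(post => post.title || post.story_title);

// Return whichever of the two is non-null (title wins if both are set).
const pickTitle = (title, story_title) => {
  if (!title) {
    return story_title;
  }
  return title;
};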

delayed returning of array of collections from mongodb nodejs

I need to retrieve a list of collections using express and the mongodb module.
First, I retrieve a list of collection names, which works; I then retrieve the data of those collections in a loop. My problem is in getColAsync():
getColAsync() {
  return new Promise((resolve, reject) => {
    this.connectDB().then((db) => {
      var allCols = [];
      let dbase = db.db(this.databaseName);
      dbase.listCollections().toArray((err, collectionNames) => {
        if (err) {
          console.log(err);
          reject(err);
        } else {
          for (let i = 0; i < collectionNames.length; i++) {
            dbase.collection(collectionNames[i].name.toString()).find({}).toArray((err, collectionData) => {
              console.log("current collection data: " + collectionData);
              allCols[i] = collectionData;
            });
          }
          console.log("done getting all data");
          resolve(allCols);
        }
      });
    });
  });
}
connectDB() {
  if (this.dbConnection) {
    // if connection exists
    return this.dbConnection;
  } else {
    this.dbConnection = new Promise((resolve, reject) => {
      mongoClient.connect(this.URL, (err, db) => {
        if (err) {
          console.log("DB Access: Error on mongoClient.connect.");
          console.log(err);
          reject(err);
        } else {
          console.log("DB Access: resolved.");
          resolve(db);
        }
      });
    });
    console.log("DB Access: db exists. Connected.");
    return this.dbConnection;
  }
}
In the for loop where I retrieve every collection, console.log("done getting all data") gets called and the promise is resolved before the loop's async callbacks have even run. For example:
done getting all data
current collection data: something
current collection data: something2
current collection data: something3
Please help
The Problem
The problem in your code is this part:
for (let i = 0; i < collectionNames.length; i++) {
  dbase.collection(collectionNames[i].name.toString()).find({}).toArray((err, collectionData) => {
    console.log("current collection data: " + collectionData);
    allCols[i] = collectionData;
  });
}
console.log("done getting all data");
resolve(allCols);
You should notice that resolve(allCols); is called right after the for loop ends, but each iteration of the loop doesn't wait for the toArray callback to be called.
The line dbase.collection(collectionNames[i].name.toString()).find({}).toArray(callback) is asynchronous so the loop will end, you'll call resolve(allCols);, but the .find({}).toArray code won't have completed yet.
The Solution Concept
So, basically what you did was:
Initialize an array of results allCols = []
Start a series of async operations
Return the (still empty) array of results
As the async operations complete, fill the now useless results array.
What you should be doing instead is:
Start a series of async operations
Wait for all of them to complete
Get the results from each one
Return the list of results
The key to this is the Promise.all([/* array of promises */]) function which accepts an array of promises and returns a Promise itself passing downstream an array containing all the results, so what we need to obtain is something like this:
const dataPromises = [];
for (let i = 0; i < collectionNames.length; i++) {
  dataPromises[i] = /* data fetch promise */;
}
return Promise.all(dataPromises);
As you can see, the last line is return Promise.all(dataPromises); instead of resolve(allCols) as in your code, so we can no longer execute this code inside of a new Promise(func) constructor.
Instead, we should chain Promises with .then() like this:
getColAsync() {
  return this.connectDB().then((db) => {
    let dbase = db.db(this.databaseName);
    const dataPromises = [];
    dbase.listCollections().toArray((err, collectionNames) => {
      if (err) {
        console.log(err);
        return Promise.reject(err);
      } else {
        for (let i = 0; i < collectionNames.length; i++) {
          dataPromises[i] = new Promise((resolve, reject) => {
            dbase.collection(collectionNames[i].name.toString()).find({}).toArray((err, collectionData) => {
              console.log("current collection data: " + collectionData);
              if (err) {
                console.log(err);
                reject(err);
              } else {
                resolve(collectionData);
              }
            });
          });
        }
        console.log("done getting all data");
        return Promise.all(dataPromises);
      }
    });
  });
}
Notice that now we return this.connectDB().then(...), which in turn returns Promise.all(dataPromises); returning a new Promise at each step keeps the Promise chain alive, so getColAsync() will itself return a Promise that you can then handle with .then() and .catch().
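A brief usage sketch of the promise-returning version (dbAccess here is a hypothetical instance of the class these methods live on):
dbAccess.getColAsync()
  .then((allCols) => {
    // allCols is an array with one collectionData entry per collection
    console.log("collections fetched: " + allCols.length);
  })
  .catch((err) => console.log(err));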
Cleaner Code
You can clean up your code a bit as follows:
getColAsync() {
  return this.connectDB().then((db) => {
    let dbase = db.db(this.databaseName);
    // dbase.listCollections().toArray() returns a promise itself
    return dbase.listCollections().toArray()
      .then((collectionsInfo) => {
        // collectionsInfo.map converts an array of collection info into an array of selected
        // collections
        return collectionsInfo.map((info) => {
          return dbase.collection(info.name);
        });
      });
  }).then((collections) => {
    // collections.map converts an array of selected collections into an array of Promises
    // to get each collection's data.
    return Promise.all(collections.map((collection) => {
      return collection.find({}).toArray();
    }));
  });
}
As you see the main changes are:
Using mongodb functions in their promise form
Using Array.map to easily convert an array of data into a new array
Below I also present a variant of your code using functions with callbacks and a module I'm working on.
Promise-Mix
I've recently been working on this npm module to help build cleaner and more readable compositions of Promises.
In your case I'd use the fCombine function to handle the first steps where you select the db and fetch the list of collection info:
Promise.fCombine({
  dbase: ({ dbURL }, done) => mongoClient.connect(dbURL, done),
  collInfos: ({ dbase }, done) => getCollectionsInfo(dbase, done),
}, { dbURL: this.URL })
This results in a promise passing downstream an object { dbase: /* the db instance */, collInfos: [/* the list of collections info */] }, where getCollectionsInfo(dbase, done) is a function with a callback pattern like this:
getCollectionsInfo = (db, done) => {
  let dbase = db.db(this.databaseName);
  dbase.listCollections().toArray(done);
}
Now you can chain the previous Promise and convert the list of collections info into selected db collections, like this:
Promise.fCombine({
  dbase: ({ dbURL }, done) => mongoClient.connect(dbURL, done),
  collInfos: ({ dbase }, done) => getCollectionsInfo(dbase, done),
}, { dbURL: this.URL }).then(({ dbase, collInfos }) => {
  return Promise.resolve(collInfos.map((info) => {
    return dbase.collection(info.name);
  }));
})
Now downstream we have the list of selected collections from our db; we should fetch the data from each one, then merge the results into an array of collection data.
In my module I have a _mux option which creates a PromiseMux that mimics the behaviour and composition patterns of a regular Promise, but actually works on several Promises at the same time. Each Promise gets as input one item from the downstream collections array, so you can write the code to fetch data from a generic collection and it will be executed for each collection in the array:
Promise.fCombine({
  dbase: ({ dbURL }, done) => mongoClient.connect(dbURL, done),
  collInfos: ({ dbase }, done) => getCollectionsInfo(dbase, done),
}, { dbURL: this.URL }).then(({ dbase, collInfos }) => {
  return Promise.resolve(collInfos.map((info) => {
    return dbase.collection(info.name);
  }));
})._mux((mux) => {
  return mux._fReduce([
    (collection, done) => collection.find({}).toArray(done)
  ]).deMux((allCollectionsData) => {
    return Promise.resolve(allCollectionsData);
  });
});
In the code above, _fReduce behaves like _fCombine, but it accepts an array of functions with callbacks instead of an object and it passes downstream only the result of the last function (not a structured object with all the results). Finally deMux executes a Promise.all on each simultaneous Promise of the mux, putting together their results.
Thus the whole code would look like this:
getCollectionsInfo = (db, done) => {
  let dbase = db.db(this.databaseName);
  dbase.listCollections().toArray(done);
}

getCollAsync = () => {
  return Promise.fCombine({
    /**
     * fCombine uses an object whose fields are functions with a callback pattern to
     * build consecutive Promises. Each subsequent function gets as input the results
     * from previous functions.
     * The second parameter of fCombine is the initial value, which in our case is
     * the db url.
     */
    dbase: ({ dbURL }, done) => mongoClient.connect(dbURL, done), // connect to DB, return the connected dbase
    collInfos: ({ dbase }, done) => getCollectionsInfo(dbase, done), // fetch collection info from dbase, return the info objects
  }, { dbURL: this.URL }).then(({ dbase, collInfos }) => {
    return Promise.resolve(collInfos.map((info) => {
      /**
       * we use Array.map to convert collection info into
       * a list of selected db collections
       */
      return dbase.collection(info.name);
    }));
  })._mux((mux) => {
    /**
     * _mux splits the list of collections returned before into a series of "simultaneous promises"
     * which you can manipulate as if they were a single Promise.
     */
    return mux._fReduce([ // this fReduce here gets as input a single collection from the retrieved list
      (collection, done) => collection.find({}).toArray(done)
    ]).deMux((allCollectionsData) => {
      // finally we can put back together all the results.
      return Promise.resolve(allCollectionsData);
    });
  });
}
In my module I tried to avoid the most common anti-patterns, though there is still some Ghost Promise behaviour I'll be working on.
Using the promises from mongodb this would get even cleaner:
getCollAsync = () => {
  return Promise.combine({
    dbase: ({ dbURL }) => { return mongoClient.connect(dbURL); },
    collInfos: ({ dbase }) => {
      return dbase.db(this.databaseName)
        .listCollections().toArray();
    },
  }, { dbURL: this.URL }).then(({ dbase, collInfos }) => {
    return Promise.resolve(collInfos.map((info) => {
      return dbase.collection(info.name);
    }));
  }).then((collections) => {
    return Promise.all(collections.map((collection) => {
      return collection.find({}).toArray();
    }));
  });
}
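For comparison, a plain async/await sketch of the same flow over the mongodb promise API, without Promise-Mix, assuming the same mongoClient, this.URL and this.databaseName as above:
getCollAsync = async () => {
  // Connect, list the collections, then fetch every collection's documents in parallel.
  const db = await mongoClient.connect(this.URL);
  const dbase = db.db(this.databaseName);
  const collectionsInfo = await dbase.listCollections().toArray();
  return Promise.all(
    collectionsInfo.map((info) => dbase.collection(info.name).find({}).toArray())
  );
};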
