Values coming back as 'undefined' in for loop inside ES6 promise - javascript

So I'm currently writing an inventory application, and I'm running into some issues inserting values into my database. What happens is that I get a list of item objects that contain information about that specific item. I then push those items into a list and call a socket which runs a for loop to insert those items into my database, but when I look at the table where these items should be, all of their values are set as undefined.
My item object looks something like this:
let ItemObject = {
ID: id,
Name: name,
Qty: qty
}
The promise that calls the DB sequence:
export const insertItem = async (list) => {
return new Promise(async (resolve, reject) => {
try {
console.log(list);
for(let i = 0; i < list.length; i++){
const writeSolutionData = `INSERT INTO database (ID, Name, Qty) VALUES ("${list.ID}", "${list.Name}", "${list.Qty});`;
const response = await db(writeSolutionData, "Inserting Item");
resolve(response);
}
} catch (e) {
console.log("ERROR database.insertItem: " + e);
reject(e);
}
});
};
This is the socket that gets called:
socket.on('insertItem', async (data, callback) => {
try {
const results = await insertItem(data);
callback(true);
}
catch (error) {
callback(false);
}
});
When I console log the list before the for loop, I get what I would expect the output to be, but after this promise finishes, the data that's returned in my DB is undefined. I also receive no errors from the callback either. The idea of ES6 promises is still new to me, so forgive me if what I have here is not correct. This is the first time I've tried implementing a for loop within an asynchronous promise like this, so it was no surprise to me that it didn't work the first time around.
If anyone has any suggestions as to what I'm doing wrong, I would greatly appreciate it. Thanks!

insertItem does not have to be an async function. Just return the Promise and use async on the promise callback; the await keyword is only needed in that scope.
An async function automatically returns a Promise, but you need a Promise constructor to use the resolve and reject methods. It is basically the same thing but with different syntax and options.
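As a rough side-by-side illustration (the function names here are made up, not part of your code), both of these return a promise resolving to the same value:
// an async function wraps its return value in a promise automatically...
async function viaAsync() {
return 42;
}
// ...while the Promise constructor exposes resolve/reject explicitly
function viaConstructor() {
return new Promise((resolve, reject) => {
resolve(42);
});
}
viaAsync().then(console.log); // 42
viaConstructor().then(console.log); // 42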
Also I've noticed that your loop has an error. You try to select list.ID, list.Name and list.Qty from the array, but instead should get the values from the items in the array.
I also found a missing " at the end of your query string.
export const insertItem = (list) => new Promise(async (resolve, reject) => {
try {
for (const item of list) {
const writeSolutionData = `INSERT INTO database (ID, Name, Qty) VALUES ("${item.ID}", "${item.Name}", "${item.Qty}");`;
const response = await db(writeSolutionData, "Inserting Item");
resolve(response);
}
} catch (e) {
console.log("ERROR database.insertItem: " + e);
reject(e);
}
});
Addendum
Updated the answer with usage of Promise.all. Suggested by @wizloc.
This loops over all of your items, collects the promises returned by the db() function into a new array, and Promise.all then resolves once all of the promises in the array have fulfilled, returning the value of each promise in a single array.
export const insertItem = (list) => new Promise(async (resolve, reject) => {
try {
const responses = list.map(item => {
const writeSolutionData = `INSERT INTO database (ID, Name, Qty) VALUES ("${item.ID}", "${item.Name}", "${item.Qty}");`;
return db(writeSolutionData, "Inserting Item");
});
const results = await Promise.all(responses);
resolve(results);
} catch (e) {
console.log("ERROR database.insertItem: " + e);
reject(e);
}
});

I'm unfamiliar with what
const writeSolutionData = `INSERT INTO database (ID, Name, Qty) VALUES ("${list.ID}", "${list.Name}", "${list.Qty}");`;
const response = await db(writeSolutionData, "Inserting Item");
actually does (aside from the obvious), but since db(writeSolutionData, "Inserting Item") is awaitable, it returns a promise. As I mentioned in comments, your original code is resolving in the first iteration of the loop, so if you need to access the values returned from your promises down the road, you will find you only have access to the first value. You asked why you would need the values since you can just query for them afterwards; I can't answer this since I don't know anything about your project, what you plan to do with the data after it is inserted, etc. But another benefit of promises is error handling by chaining .then() and .catch().
You could simplify your entire insertItem flow to the following
socket.on('insertItem', async (data, callback) => {
const promises = data.map(x =>
db(`INSERT INTO database (ID, Name, Qty) VALUES ("${x.ID}", "${x.Name}", "${x.Qty}");`)
)
Promise.all(promises)
.then(values => {
// Do something with values array
callback(true)
})
.catch(error => {
// Error handling
callback(false)
})
});
Doing it this way will ensure callback(true) is only called if all items were inserted successfully, and will error (callback(false)) if any of the items failed.
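For intuition, here is a tiny self-contained sketch of that all-or-nothing behaviour (no database involved, values made up):
// Promise.all resolves only if every promise resolves; a single rejection rejects the whole thing
const ok = Promise.resolve("inserted");
const bad = Promise.reject(new Error("insert failed"));
Promise.all([ok, ok]).then(values => console.log(values)); // ["inserted", "inserted"]
Promise.all([ok, bad])
.then(() => console.log("never reached"))
.catch(err => console.log("caught: " + err.message)); // caught: insert failed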

I think insertItem should not be an async function, because it doesn't itself contain an await (the awaits live inside the promise callback, which is already async)...
export const insertItem = /*async <- remove it */ (list) => {
return new Promise(async (resolve, reject) => {
try {
console.log(list);
for(let i = 0; i < list.length; i++){
const writeSolutionData = `INSERT INTO database (ID, Name, Qty) VALUES ("${list.ID}", "${list.Name}", "${list.Qty});`;
const response = await db(writeSolutionData, "Inserting Item");
resolve(response);
}
} catch (e) {
console.log("ERROR database.insertItem: " + e);
reject(e);
}
});
};

Related

How can I update mysql2 query to not return undefined for return?

I am using nodejs and mysql2. I am storing all my queries inside a class. When I console.log the results I am able to view the data correctly; however, when I use a return statement to return the data, I get undefined. I believe I need to use promises, but am uncertain how to do so correctly. Here is what I have currently, which is returning undefined:
viewManagerChoices() {
const sql = `SELECT CONCAT(first_name, ' ', last_name) AS manager, id FROM employee WHERE manager_id IS NULL`;
db.query(sql, (err, rows) => {
if (err) throw err;
const managers = rows.map(manager => ({ name: manager.manager, value: manager.id }));
managers.push({ name: 'None', value: null });
return managers;
});
};
This is my attempt at using promises, which is returning Promise { <pending> }:
viewManagers() {
return new Promise((resolve, reject) => {
const sql = `SELECT CONCAT(first_name, ' ', last_name) AS manager FROM employee WHERE manager_id IS NULL`;
db.query(sql,
(error, results) => {
if (error) {
console.log('error', error);
reject(error);
}
const managers = [];
for (let i = 0; i < results.length; i++) {
managers.push({ name: results[i].manager, value: i+1 });
}
managers.push({ name: "None", value: null });
resolve(managers);
}
)
})
}
My class is called Query and I am calling these methods by doing,
const query = new Query();
query.viewManagerChoices();
query.viewManagers();
Your implementation of viewManagers is correct; however, promises don't make calls synchronous.
You either need to use a then callback or await the result in an async context.
const query = new Query();
query.viewManagers().then((managers) => {
// do stuff here
}).catch((error) => console.error(error.message));
or
async function someFunc() {
const query = new Query();
try{
const managers = await query.viewManagers();
}catch(error){
console.error(error.message);
}
}
Once you use a promise you cannot just get the returned value without async/await or a then callback. Once something is a promise, the rest of the flow that depends on it stays asynchronous as well.
For example:
// This is a promise too because it has the async flag.
// You cannot make it synchronous if you use a promise in it
async function myFunc() {
const query = new Query();
const managers = await query.viewManagers();
return managers;
}
// It actually returns Promise<...>
// Now if you want to use myFunc in another function
// You need to do it the same way again
async function anotherFunc() {
const something = await myFunc();
return something; // Returns promise
}
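And at the top level of a plain script, where you cannot await, you still end up consuming the promise with then/catch — a minimal usage sketch of the function above:
// anotherFunc() returns a promise, so the resolved value is only available inside then()
anotherFunc()
.then(managers => {
console.log(managers); // the actual value, not a Promise
})
.catch(error => console.error(error.message));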
You can read more about promises here

Javascript function change variable value

I am currently developing a signup page and I want to check if an email address already exists in the database.
var emailnum = email_num(`select * from contactinfo where email='${email}'`);
console.log(emailnum); //the output presents Promise { <pending> } not a num.
function sqlExecute(q) {
return new Promise((resolve, reject) => {
db.pool.query(q, function(err, result) {
if (err) {
console.log("ERROR ON " + q+"\n");
reject(err)
}
console.log("SUCCESS ON " + q);
resolve(result);
})
})
.catch(err => {
console.log(err);
})
}
//run check sql
async function email_num(tempquery){
var result = await sqlExecute(tempquery);
return result.rowCount;
}
I tried multiple ways but still could not figure it out.
I would appreciate any help TT
When I console.log, the output is always Promise { <pending> }.
I tried
var emailnum = email_num(`select count(*) as count from contactinfo where email='${email}'`)
.then((val)=>{
return val
});
console.log("number"+ emailnum);
The problem here is that you're not allowing the promise to resolve before attempting to retrieve the row count. When you create a function using async the return result is always going to be a promise. Here are a few solutions that will get you your desired result:
Solution 1: Use console.log to print the result of the promise after it has been resolved.
const email = 'something@gmail.com'
const numOfEmailsQuery = `SELECT * FROM contactinfo WHERE email = '${email}'`
email_num(numOfEmailsQuery)
.then(console.log)
.catch(console.error)
Solution 2: Use await inside of an async function to resolve the result of the promise and store it in a variable. Then print the result using console.log
async function printNumberOfEmails(email) {
const numOfEmailsQuery = `SELECT * FROM contactinfo WHERE email = '${email}'`
try {
const numOfEmails = await email_num(numOfEmailsQuery)
console.log(numOfEmails)
} catch (err) {
console.error(err)
}
}
const email = 'something@gmail.com'
printNumberOfEmails(email)
Hope that helps! Good luck!
First off, it's good practice to use prepared statements instead.
Also, you should only care about the count, so instead of select *, do select count(*) as count. Then you'll need to check either the number of rows the result returned, or the first index of rows and then its count property.
It should look something like this
function fetchEmailExists(email) {
const statement = 'select count(*) as count from contactinfo where email = $1';
return new Promise((resolve, reject) => {
db.pool.query(statement, [ email ], function(error, result) {
if (!error && result.rows.length > 0) {
resolve(result.rows[0]['count'] > 0);
} else {
reject(error);
}
});
})
.catch(error => {
console.log(error);
});
}
// Here is your call to fetchEmailExists
fetchEmailExists('test123@example.com')
.then(boolean => console.log(boolean));

Iterate over array of queries and append results to object in JavaScript

I want to return results from two database queries in one object.
function route(start, end) {
return new Promise((resolve, reject) => {
const queries = routeQuery(start, end);
var empty_obj = new Array();
for (i=0; i<queries.length; i++) {
query(queries[i], (err, res) => {
if (err) {
reject('query error', err);
console.log(err);
return;
} else {
empty_obj.push(res.rows);
}});
}
console.log(empty_obj);
resolve({coords: empty_obj});
});
}
This is my code right now; the queries are working fine, but for some reason pushing each result into the empty array does not work. When I console.log that empty object, it stays empty. The goal is to resolve the promise with the generated object containing the two query results. I'm using node-postgres for the queries.
Output of res is an object:
{
command: 'SELECT',
rowCount: 18,
oid: null,
rows: [
{ ...
I suggest you turn your query function into a function that returns a Promise so that you can use Promise.all:
// turn the callback-style asynchronous function into a `Promise`
function queryAsPromise(arg) {
return new Promise((resolve, reject) => {
query(arg, (err, res) => {
if (err) {
console.error(err);
reject(err);
return;
}
resolve(res);
});
});
}
Then, you could do the following in your route function:
function route(start, end) {
const queries = routeQuery(start, end);
// use `Promise.all` to resolve with
// an array of results from queries
return Promise.all(
queries.map(query => queryAsPromise(query))
)
// use `Array.reduce` with the spread operator
// to combine results from queries into a single array
.then(results => results.reduce(
(acc, item) => [...acc, ...item.rows],
[]
))
// return an object with the `coords` property
// that contains the final array
.then(coords => {
return { coords };
});
}
route(1, 10)
.then(result => {
// { coords: [...] }
})
.catch(error => {
// handle errors appropriately
console.error(error);
});
References:
Promise.all - MDN
Array.reduce - MDN
Spread syntax - MDN
Hope this helps.
The issue you currently face is due to the fact that:
resolve({coords: empty_obj});
is not inside the callback. So the promise resolves before the query callbacks are called and the rows are pushed to empty_obj. You could move this into the query callback in the following manner:
empty_obj.push(res.rows); // already present
if (empty_obj.length == queries.length) resolve({coords: empty_obj});
This would resolve the promise when all rows are pushed, but leaves you with another issue: callbacks might not be called in order, meaning that the resulting order might not match the order of the queries.
The easiest way to solve this issue is to convert each individual callback to a promise. Then use Promise.all to wait until all promises are resolved. The resulting array will have the data in the same order.
function route(start, end) {
const toPromise = queryText => new Promise((resolve, reject) => {
query(queryText, (error, response) => error ? reject(error) : resolve(response));
});
return Promise.all(routeQuery(start, end).map(toPromise))
.then(responses => ({coords: responses.map(response => response.rows)}))
.catch(error => {
console.error(error);
throw error;
});
}

delayed returning of array of collections from mongodb nodejs

I need to retrieve a list of collections using express and mongodb module.
First, I've retrieved a list of collection names, which works; I then retrieve the data of those collections in a loop. My problem is in getColAsync():
getColAsync() {
return new Promise((resolve, reject) => {
this.connectDB().then((db) => {
var allCols = [];
let dbase = db.db(this.databaseName);
dbase.listCollections().toArray((err, collectionNames) => {
if(err) {
console.log(err);
reject(err);
}
else {
for(let i = 0; i < collectionNames.length; i++) {
dbase.collection(collectionNames[i].name.toString()).find({}).toArray((err, collectionData) => {
console.log("current collection data: " + collectionData);
allCols[i] = collectionData;
})
}
console.log("done getting all data");
resolve(allCols);
}
})
})
})
}
connectDB() {
if(this.dbConnection) {
// if connection exists
return this.dbConnection;
}
else {
this.dbConnection = new Promise((resolve, reject) => {
mongoClient.connect(this.URL, (err, db) => {
if(err) {
console.log("DB Access: Error on mongoClient.connect.");
console.log(err);
reject(err);
}
else {
console.log("DB Access: resolved.");
resolve(db);
}
});
});
console.log("DB Access: db exists. Connected.");
return this.dbConnection;
}
}
In the for loop where I retrieve every collection, the console.log("done getting all data") gets called and the promise gets resolved before the loop's callbacks have even run. For example:
done getting all data
current collection data: something
current collection data: something2
current collection data: something3
Please help
The Problem
The problem in your code is this part:
for (let i = 0; i < collectionNames.length; i++) {
dbase.collection(collectionNames[i].name.toString()).find({}).toArray((err, collectionData) => {
console.log("current collection data: " + collectionData);
allCols[i] = collectionData;
})
}
console.log("done getting all data");
resolve(allCols);
You should notice that resolve(allCols); is called right after the for loop ends, but each iteration of the loop doesn't wait for the toArray callback to be called.
The line dbase.collection(collectionNames[i].name.toString()).find({}).toArray(callback) is asynchronous, so the loop will end and you'll call resolve(allCols), but the .find({}).toArray code won't have completed yet.
The Solution Concept
So, basically what you did was:
Initialize an array of results allCols = []
Start a series of async operations
Return the (still empty) array of results
As the async operations complete, fill the now useless results array.
What you should be doing instead is:
Start a series of async operations
Wait for all of them to complete
Get the results from each one
Return the list of results
The key to this is the Promise.all([/* array of promises */]) function which accepts an array of promises and returns a Promise itself passing downstream an array containing all the results, so what we need to obtain is something like this:
const dataPromises = []
for (let i = 0; i < collectionNames.length; i++) {
dataPromises[i] = /* data fetch promise */;
}
return Promise.all(dataPromises);
As you can see, the last line is return Promise.all(dataPromises); instead of resolve(allCols) as in your code, so we can no longer execute this code inside of a new Promise(func) constructor.
Instead, we should chain Promises with .then() like this:
getColAsync() {
return this.connectDB().then((db) => {
let dbase = db.db(this.databaseName);
const dataPromises = [];
// use the promise form of toArray() here so its result can be returned down the chain
return dbase.listCollections().toArray().then((collectionNames) => {
for (let i = 0; i < collectionNames.length; i++) {
dataPromises[i] = new Promise((res, rej) => {
dbase.collection(collectionNames[i].name.toString()).find({}).toArray((err, collectionData) => {
console.log("current collection data: " + collectionData);
if (err) {
console.log(err);
rej(err);
} else {
res(collectionData);
}
});
});
}
console.log("done getting all data");
return Promise.all(dataPromises);
});
});
}
Notice now we return a return this.connectDB().then(...), which in turn returns a Promise.all(dataPromises); this returning new Promises at each step lets us keep alive the Promise chain, thus getColAsync() will itself return a Promise you can then handle with .then() and .catch().
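A caller could then consume it like this (sketch only; dbAccess stands in for whatever instance exposes getColAsync):
dbAccess.getColAsync()
.then(allCols => {
console.log("got " + allCols.length + " collections");
})
.catch(err => {
console.error(err);
});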
Cleaner Code
You can clean up your code a bit as follows:
getColAsync() {
return this.connectDB().then((db) => {
let dbase = db.db(this.databaseName);
const dataPromises = []
// dbase.listCollections().toArray() returns a promise itself
return dbase.listCollections().toArray()
.then((collectionsInfo) => {
// collectionsInfo.map converts an array of collection info into an array of selected
// collections
return collectionsInfo.map((info) => {
return dbase.collection(info.name);
});
})
}).then((collections) => {
// collections.map converts an array of selected collections into an array of Promises
// to get each collection data.
return Promise.all(collections.map((collection) => {
return collection.find({}).toArray();
}))
})
}
As you see the main changes are:
Using mongodb functions in their promise form
Using Array.map to easily convert an array of data into a new array
Below I also present a variant of your code using functions with callbacks and a module I'm working on.
Promise-Mix
I'm recently working on this npm module to help get a cleaner and more readable composition of Promises.
In your case I'd use the fCombine function to handle the first steps where you select the db and fetch the list of collection info:
Promise.fCombine({
dbase: ({ dbURL }, done) => mongoClient.connect(dbURL, done),
collInfos: ({ dbase }, done) => getCollectionsInfo(dbase, done),
}, { dbURL: this.URL })
This results in a promise, passing downstream an object {dbase: /* the db instance */, collInfos: [/* the list of collections info */]}, where getCollectionsInfo(dbase, done) is a function with a callback pattern like this:
getCollectionsInfo = (db, done) => {
let dbase = db.db(this.databaseName);
dbase.listCollections().toArray(done);
}
Now you can chain the previous Promise and convert the list of collections info into selected db collections, like this:
Promise.fCombine({
dbase: ({ dbURL }, done) => mongoClient.connect(dbURL, done),
collInfos: ({ dbase }, done) => getCollectionsInfo(dbase, done),
}, { dbURL: this.URL }).then(({ dbase, collInfos }) => {
return Promise.resolve(collInfos.map((info) => {
return dbase.collection(info.name);
}));
})
Now downstream we have the list of selected collections from our db and we should fetch data from each one, then merge the results in an array with the collection data.
In my module I have a _mux option which creates a PromiseMux which mimics the behaviour and composition patterns of a regular Promise, but is actually working on several Promises at the same time. Each Promise gets as input one item from the downstream collections array, so you can write the code to fetch data from a generic collection and it will be executed for each collection in the array:
Promise.fCombine({
dbase: ({ dbURL }, done) => mongoClient.connect(dbURL, done),
collInfos: ({ dbase }, done) => getCollectionsInfo(dbase, done),
}, { dbURL: this.URL }).then(({ dbase, collInfos }) => {
return Promise.resolve(collInfos.map((info) => {
return dbase.collection(info.name);
}));
})._mux((mux) => {
return mux._fReduce([
(collection, done) => collection.find({}).toArray(done)
]).deMux((allCollectionsData) => {
return Promise.resolve(allCollectionsData);
})
});
In the code above, _fReduce behaves like _fCombine, but it accepts an array of functions with callbacks instead of an object and it passes downstream only the result of the last function (not a structured object with all the results). Finally deMux executes a Promise.all on each simultaneous Promise of the mux, putting together their results.
Thus the whole code would look like this:
getCollectionsInfo = (db, done) => {
let dbase = db.db(this.databaseName);
dbase.listCollections().toArray(done);
}
getCollAsync = () => {
return Promise.fCombine({
/**
* fCombine uses an object whose fields are functions with callback pattern to
* build consecutive Promises. Each subsequent function gets as input the results
* from previous functions.
* The second parameter of the fCombine is the initial value, which in our case is
* the db url.
*/
dbase: ({ dbURL }, done) => mongoClient.connect(dbURL, done), // connect to DB, return the connected dbase
collInfos: ({ dbase }, done) => getCollectionsInfo(dbase, done), // fetch collection info from dbase, return the info objects
}, { dbURL: this.URL }).then(({ dbase, collInfos }) => {
return Promise.resolve(collInfos.map((info) => {
/**
* we use Array.map to convert collection info into
* a list of selected db collections
*/
return dbase.collection(info.name);
}));
})._mux((mux) => {
/**
* _mux splits the list of collections returned before into a series of "simultaneous promises"
* which you can manipulate as if they were a single Promise.
*/
return mux._fReduce([ // this fReduce here gets as input a single collection from the retrieved list
(collection, done) => collection.find({}).toArray(done)
]).deMux((allCollectionsData) => {
// finally we can put back together all the results.
return Promise.resolve(allCollectionsData);
})
});
}
In my module I tried to avoid the most common anti-patterns, though there still is some Ghost Promise which I'll be working on.
Using the promises from mongodb this would get even cleaner:
getCollAsync = () => {
return Promise.combine({
dbase: ({ dbURL }) => { return mongoClient.connect(dbURL); },
collInfos: ({ dbase }) => {
return dbase.db(this.databaseName)
.listCollections().toArray();
},
}, { dbURL: this.URL }).then(({ dbase, collInfos }) => {
return Promise.resolve(collInfos.map((info) => {
return dbase.collection(info.name);
}));
}).then((collections) => {
return Promise.all(collections.map((collection) => {
return collection.find({}).toArray();
}))
});
}

Get Knex.js transactions working with ES7 async/await

I'm trying to couple ES7's async/await with knex.js transactions.
Although I can easily play around with non-transactional code, I'm struggling to get transactions working properly using the aforementioned async/await structure.
I'm using this module to simulate async/await
Here's what I currently have:
Non-transactional version:
works fine but is not transactional
app.js
// assume `db` is a knex instance
app.post("/user", async((req, res) => {
const data = {
idUser: 1,
name: "FooBar"
}
try {
const result = await(user.insert(db, data));
res.json(result);
} catch (err) {
res.status(500).json(err);
}
}));
user.js
insert: async (function(db, data) {
// there's no need for this extra call but I'm including it
// to see example of deeper call stacks if this is answered
const idUser = await(this.insertData(db, data));
return {
idUser: idUser
}
}),
insertData: async(function(db, data) {
// if any of the following 2 fails I should be rolling back
const id = await(this.setId(db, idCustomer, data));
const idCustomer = await(this.setData(db, id, data));
return {
idCustomer: idCustomer
}
}),
// DB Functions (wrapped in Promises)
setId: function(db, data) {
return new Promise(function (resolve, reject) {
db.insert(data)
.into("ids")
.then((result) => resolve(result))
.catch((err) => reject(err));
});
},
setData: function(db, id, data) {
data.id = id;
return new Promise(function (resolve, reject) {
db.insert(data)
.into("customers")
.then((result) => resolve(result))
.catch((err) => reject(err));
});
}
Attempt to make it transactional
user.js
// Start transaction from this call
insert: async (function(db, data) {
const trx = await(knex.transaction());
const idCustomer = await(user.insertData(trx, data));
return {
idCustomer: idCustomer
}
}),
it seems that await(knex.transaction()) returns this error:
[TypeError: container is not a function]
I couldn't find a solid answer for this anywhere (with rollbacks and commits) so here's my solution.
First you need to "Promisify" the knex.transaction function. There are libraries for this, but for a quick example I did this:
const promisify = (fn) => new Promise((resolve, reject) => fn(resolve));
This example creates a blog post and a comment, and rolls back both if there's an error with either.
const trx = await promisify(db.transaction);
try {
const postId = await trx('blog_posts')
.insert({ title, body })
.returning('id'); // returns an array of ids
const commentId = await trx('comments')
.insert({ post_id: postId[0], message })
.returning('id');
await trx.commit();
} catch (e) {
await trx.rollback();
}
Here is a way to write transactions in async / await.
It is working fine for MySQL.
const trx = await db.transaction();
try {
const catIds = await trx('catalogues').insert({name: 'Old Books'});
const bookIds = await trx('books').insert({catId: catIds[0], title: 'Canterbury Tales' });
await trx.commit();
} catch (error) {
await trx.rollback(error);
}
Async/await is based around promises, so it looks like you'd just need to wrap all the knex methods to return "promise compatible" objects.
Here is a description on how you can convert arbitrary functions to work with promises, so they can work with async/await:
Trying to understand how promisification works with BlueBird
Essentially you want to do this:
var transaction = knex.transaction;
knex.transaction = function(callback){ return transaction.call(knex, callback); }
This is because "async/await requires either a function with a single callback argument, or a promise", whereas knex.transaction looks like this:
function transaction(container, config) {
return client.transaction(container, config);
}
Alternatively, you can create a new async function and use it like this:
async function transaction() {
return new Promise(function(resolve, reject){
knex.transaction(function(error, result){
if (error) {
reject(error);
} else {
resolve(result);
}
});
});
}
// Start transaction from this call
insert: async (function(db, data) {
const trx = await(transaction());
const idCustomer = await(person.insertData(trx, authUser, data));
return {
idCustomer: idCustomer
}
})
This may be useful too: Knex Transaction with Promises
(Also note, I'm not familiar with knex's API, so not sure what the params are passed to knex.transaction, the above ones are just for example).
For those who come in 2019.
After I updated Knex to version 0.16.5. sf77's answer doesn't work anymore due to the change in Knex's transaction function:
transaction(container, config) {
const trx = this.client.transaction(container, config);
trx.userParams = this.userParams;
return trx;
}
Solution
Keep sf77's promisify function:
const promisify = (fn) => new Promise((resolve, reject) => fn(resolve));
Update trx
from
const trx = await promisify(db.transaction);
to
const trx = await promisify(db.transaction.bind(db));
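The bind is needed because the transaction() implementation shown above reads this.client, so passing the unbound db.transaction loses its this. An equivalent sketch (assuming db is your knex instance) keeps the method call intact inside a small wrapper instead:
const trx = await promisify((cb) => db.transaction(cb));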
I think I have found a more elegant solution to the problem.
Borrowing from the knex Transaction docs, I will contrast their promise-style with the async/await-style that worked for me.
Promise Style
var Promise = require('bluebird');
// Using trx as a transaction object:
knex.transaction(function(trx) {
var books = [
{title: 'Canterbury Tales'},
{title: 'Moby Dick'},
{title: 'Hamlet'}
];
knex.insert({name: 'Old Books'}, 'id')
.into('catalogues')
.transacting(trx)
.then(function(ids) {
return Promise.map(books, function(book) {
book.catalogue_id = ids[0];
// Some validation could take place here.
return knex.insert(book).into('books').transacting(trx);
});
})
.then(trx.commit)
.catch(trx.rollback);
})
.then(function(inserts) {
console.log(inserts.length + ' new books saved.');
})
.catch(function(error) {
// If we get here, that means that neither the 'Old Books' catalogues insert,
// nor any of the books inserts will have taken place.
console.error(error);
});
async/await style
var Promise = require('bluebird'); // import Promise.map()
// assuming knex.transaction() is being called within an async function
const inserts = await knex.transaction(async function(trx) {
var books = [
{title: 'Canterbury Tales'},
{title: 'Moby Dick'},
{title: 'Hamlet'}
];
const ids = await knex.insert({name: 'Old Books'}, 'id')
.into('catalogues')
.transacting(trx);
const inserts = await Promise.map(books, function(book) {
book.catalogue_id = ids[0];
// Some validation could take place here.
return knex.insert(book).into('books').transacting(trx);
});
await trx.commit(inserts); // whatever gets passed to trx.commit() is what the knex.transaction() promise resolves to.
});
The docs state:
Throwing an error directly from the transaction handler function automatically rolls back the transaction, same as returning a rejected promise.
It seems that the transaction callback function is expected to return either nothing or a Promise. Declaring the callback as an async function means that it returns a Promise.
One advantage of this style is that you don't have to call the rollback manually. Returning a rejected Promise will trigger the rollback automatically.
Make sure to pass any results you want to use elsewhere to the final trx.commit() call.
I have tested this pattern in my own work and it works as expected.
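Building on the quoted docs, here is a minimal sketch of the rollback-by-throw path (table names reused from the example above, the failure condition is made up):
try {
const inserts = await knex.transaction(async function(trx) {
const ids = await knex.insert({name: 'Old Books'}, 'id')
.into('catalogues')
.transacting(trx);
if (ids.length === 0) {
// throwing from the handler rejects the transaction promise, which rolls everything back
throw new Error('catalogue insert failed');
}
await trx.commit(ids);
});
// inserts === ids here, because that is what was passed to trx.commit()
} catch (error) {
// if we land here, the 'Old Books' insert has already been rolled back
console.error(error);
}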
Adding to sf77's excellent answer, I implemented this pattern in TypeScript for adding a new user where you need to do the following in 1 transaction:
creating a user record in the USER table
creating a login record in the LOGIN table
public async addUser(user: User, hash: string): Promise<User> {
//transform knex transaction such that can be used with async-await
const promisify = (fn: any) => new Promise((resolve, reject) => fn(resolve));
const trx: knex.Transaction = <knex.Transaction> await promisify(db.transaction);
try {
let users: User [] = await trx
.insert({
name: user.name,
email: user.email,
joined: new Date()})
.into(config.DB_TABLE_USER)
.returning("*")
await trx
.insert({
email: user.email,
hash
}).into(config.DB_TABLE_LOGIN)
.returning("email")
await trx.commit();
return Promise.resolve(users[0]);
}
catch(error) {
await trx.rollback();
return Promise.reject("Error adding user: " + error)
}
}
