Timing of query with mongoose and node - javascript

I am completely stuck on this problem. I have a query array which, until now, I have kept in memory. With it I call an API, fire my requests and store the results in MongoDB...no problem. Unfortunately I have no control over the API server, and occasional ECONN/TCP/IP connection errors make my app crash from time to time.
To be able to resume my querying task, I now write all my queries to MongoDB and want to track/record the queryState of each query, i.e. whether it has been executed or not.
The problem occurs when I try to get the queries back from Mongo into the method that fires the requests. Due to some timing/async problem (I guess), my queryArray stays undefined the whole time and I can't get it solved...
server.js:
//TEST
var querymongo = require('./config/queryMongo');
var queryobject = new querymongo;
var queryArray = queryobject.results();
queryArray stays undefined all the time...
queryMongo.js:
//require mongo model + db connection
var queryDB = require('./queryDB');
//constructor
...
//mongoRequest method
this.mongoRequest = function () {
    console.log("Function mongoRequest called now!");
    return new Promise(function (resolve, reject) {
        queryDB.queries.find({'SearchIndex': 'All'}, function (err, doc) {
            if (err) return reject(err);
            else resolve(doc);
        });
    });
};

//resolve results
this.results = function () {
    var queryArray = [];
    this.mongoRequest().then(function (doc, err) {
        console.log(doc);
        queryArray = doc;
        return queryArray;
    }).catch(function (err) {
        console.log(err);
    });
};
}
module.exports = QueryMongo;
The console.log(doc) in the results method works, but it only prints the docs after all the rest of the code has executed. What is the problem here?
I would be very thankful, as this is driving me crazy!
Thanks
Hucho

So your question is why queryArray stays undefined?
Because the Mongo query is asynchronous and you're executing console.log right after it, that is, before you get the results. Whatever needs the documents has to happen inside the .then() callback:
//resolve results
this.results = function () {
    var queryArray = [];
    this.mongoRequest().then(function (doc, err) {
        //////// DO SOMETHING WITH THE RESULT HERE
    }).catch(function (err) {
        console.log(err);
    });
};
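For example, a minimal sketch of what that could look like, assuming the QueryMongo constructor from the question: results() returns the promise itself, and server.js consumes it with .then(), because the documents only exist inside that callback.
//queryMongo.js - results() now returns the promise itself
this.results = function () {
    return this.mongoRequest().then(function (doc) {
        return doc; // becomes the resolved value seen by the caller
    });
};

//server.js
var querymongo = require('./config/queryMongo');
var queryobject = new querymongo();

queryobject.results().then(function (queryArray) {
    console.log(queryArray); // the docs are only available here, not synchronously
    // fire the API requests from inside this callback
}).catch(function (err) {
    console.error(err);
});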

Related

Problem with the async mongoDB find() method

I am trying to get some data from MongoDB and store it in an array, then pass that array to an EJS file. The problem seems to be that while Mongo is still querying, the code after the DB call executes and an empty array is sent to EJS. The results arrive after the render function has already run, so no data reaches EJS.
app.get('/', (req, res) => {
    var batData = [];
    //console.log("get req");
    MongoClient.connect(url, (err, db) => {
        if (err) throw err;
        console.log("Enter DB");
        var dbo = db.db("MatchDB");
        batData = dbo.collection("Batting").find().toArray((err, res) => {
            console.log("Query Success");
        });
        console.log("Exit DB");
        db.close();
    });
    // batData remains empty when these lines of code executes.
    res.render('index', {
        batting: batData
    });
});
Output is in this order :
Enter DB
Exit DB
Query Success
Expected Order:
Enter DB
Query Success
Exit DB
Use a promise here:
//change your query to a function
function query() {
    //now return a promise from here
    return new Promise((resolve, reject) => {
        MongoClient.connect(url, (err, db) => {
            if (err) return reject(err); // reject the error
            var dbo = db.db("MatchDB");
            dbo.collection("Batting").find().toArray((err, data) => {
                if (err) return reject(err);
                console.log("Query Success");
                var batData = data; // save your data here or do anything with it
                db.close(); // close the db
                resolve(batData); // this value is handed back to the caller
            }); // toArray ends
        }); // MongoClient ends
    }); // promise ends
} // function ends
//now in your app.get route
//see here I marked this async, for using await
app.get('/', async (req, res) => {
    let batData = await query(); // this will wait until the promise resolves or rejects
    res.render('index', {
        batting: batData // now you will have data here
    });
});
Some points here for your help:
A promise either gets resolved or rejected.
The caller of the promise waits until it gets the result back from the promise function; you can handle this differently if you don't want to wait for the promise to finish.
async/await is just a neater way to handle promises, removing callback hells from the code.
Whenever you need to use await, you have to mark the function it lies in as async, as you can see I did in the app.get callback function.
await will pause that function until the promise is resolved or rejected,
and only after that will it go on to the res.render part of the code.
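One detail the answer above doesn't show: if the promise rejects (for example when MongoClient.connect fails), await throws, so the async handler usually wraps the call in try/catch. A sketch of that, not part of the original answer:
app.get('/', async (req, res) => {
    try {
        const batData = await query(); // a rejected promise is thrown here
        res.render('index', { batting: batData });
    } catch (err) {
        console.error(err);
        res.status(500).send('Database error'); // assumption: plain Express error response
    }
});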
batData is declared as an array, but later in your code you set it equal to the return value of the find query. You should .push() into it instead, or save the result to a new variable and .push() from that afterwards.
Also, declaring batData with const (instead of var) would have thrown an error on reassignment, surfacing this problem immediately. Unless you are using var to support older code, prefer const and let.
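A tiny sketch of that last point (illustrative only):
const batData = [];
batData.push({ player: 'example' }); // fine: mutating the array is allowed
batData = [1, 2, 3];                 // TypeError: Assignment to constant variable.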

How can I get the result of a promise in node & jade

I'm new to NodeJS and Jade/Pug, so I know this may be a really simple question for many of you, but I can't understand any of the answers I find on the internet because the term 'Promise' and how it works is still a little confusing to me.
I am querying a Postgres database to get several values from a table (a really simple query). If I do this without using a promise, everything works fine: the console prints the result and everyone is happy. But when I try to store this result in a variable and pass it as a parameter to a Jade template, things change.
I have read that, in order to do that, I need to use promises, because when the variable is accessed the value might not be resolved yet, and that's what promises are for. So here is my code:
hero.js:
getHeroes: function ()
{
    //Initialize array
    var elem = [];
    console.log('Querying heroes');
    return new Promise((resolve, reject) =>
    {
        pg.connect(conString, function (err, client, done)
        {
            if (err)
            {
                return console.error('error fetching client from pool', err)
            }
            //Execute SELECT query
            client.query('SELECT name from heroe;', function (err, rows, result)
            {
                //Iterate over results
                for (var i = 0; i < rows.rowCount; i++)
                {
                    //PUSH result into arrays
                    elem.push(rows.rows[i].name);
                }
                done()
                if (err)
                {
                    return console.error('error happened during query', err)
                }
                resolve(elem)
            })
        });
    })
}
And this the part of my server.js where I call that function:
app.get('/jade', (request, response) => {
    var heroList = [];
    heroList = hero.getHeroes();
    console.log(heroList);
    response.render('test_jade', {param1: 'test'});
});
That console.log prints "Promise { pending }" and I don't know how to "listen to the resolved event and retrieve the value from it once it has finished".
Would appreciate any advice/solution or even a good Node.js manual where all this mess is explained for total newbies like me.
Thanks in advance!
That's not how you use a promise.
Try this:
app.get('/jade', (request, response) => {
    var heroList = [];
    hero.getHeroes().then(data => {
        heroList = data;
        console.log(heroList);
        response.render('test_jade', {param1: 'test'});
    }).catch(e => {
        //handle error case here when your promise fails
        console.log(e);
    });
});
You should also add a catch in case your promise is rejected.
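An equivalent sketch with async/await, assuming the same hero.getHeroes() from the question (passing the list to the template under a heroes key is an assumption about what the template expects):
app.get('/jade', async (request, response) => {
    try {
        const heroList = await hero.getHeroes();
        console.log(heroList);
        response.render('test_jade', { param1: 'test', heroes: heroList }); // heroes key is assumed
    } catch (e) {
        // handle the error case here, when your promise fails
        console.log(e);
        response.status(500).send('Query failed');
    }
});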

Node JS MySQL query function not returning result

I am running a MySQL query inside a .js file running on Node.js. I have the connection set up correctly and the query works, but when I try to return the result back to the original caller it doesn't seem to work.
function sqlQuery(query) {
    console.log("Running query");
    var conn = connection();
    var result = false;
    conn.query(query, function (err, rows, fields) {
        conn.end();
        if (!err) { result = rows; } else { result = err; }
    });
    return result;
}
var loginResult = sqlQuery("SELECT * FROM `players`");
console.log(loginResult);
If I use the following code, it does write the result to the console inside the query callback, but the final "loginResult" is still empty. I am not getting any errors, so my question is: is there an error in the way I am getting the returned result?
if (!err){ result = rows; console.log(rows); } else { result = err; }
Virtually everything in Node.js is asynchronous, and SQL query functions definitely are. You're calling conn.query(query, callback), which means the query is started, and once there is a result at some point in the future, your callback function gets called with that result for you to work with. So:
conn.query(query, function runThisEventually(err, rows, fields) {
    if (err) {
        console.error("One or more errors occurred!");
        console.error(err);
        return;
    }
    processResults(rows, fields);
});
You won't get the result immediately after calling conn.query(...), so your code gets to do "other things" in the meantime, and at some point your callback will be triggered and you can pick up the result processing there.
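One common way to hand the result back to the caller is to wrap the query in a promise. A sketch, assuming the same connection() helper from the question:
function sqlQuery(query) {
    console.log("Running query");
    return new Promise(function (resolve, reject) {
        var conn = connection(); // same helper as in the question
        conn.query(query, function (err, rows, fields) {
            conn.end();
            if (err) return reject(err);
            resolve(rows);
        });
    });
}

sqlQuery("SELECT * FROM `players`")
    .then(function (loginResult) { console.log(loginResult); })
    .catch(function (err) { console.error(err); });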

node poller not exiting properly

I have a function that will poll a database every x seconds. I am using the Q library, so the function returns a promise. The function will ultimately be used in a long chain of .then()s.
The function gives me the results I expect, but the process keeps running for 30-40 seconds after the results are returned. I cannot figure out why it does not exit right after I return.
var _ = require('lodash');
var pg = require('pg');
var Q = require('q');

connString = 'postgres://somedb_info';
var query = "SELECT * FROM job where jobid='somejobid123123'";

exports.run_poller = function () {
    var deferred = Q.defer();

    function exec_query(callback) {
        pg.connect(connString, function (err, client, done) {
            if (err) {
                deferred.reject(err);
            }
            client.query(query, function (err, result) {
                done();
                if (err) {
                    return deferred.reject(err);
                }
                callback(result.rows[0]);
            });
        });
    }

    function wait_for(res) {
        if (res.status == 'COMPLETE') {
            return deferred.resolve(res);
        } else {
            setTimeout(function () {
                exec_query(wait_for);
            }, 1000);
        }
    }

    exec_query(wait_for);
    return deferred.promise;
};
Just to test this I call the function from a main.js file like so:
var poller = require('./utils/poller').run_poller;
poller().then(console.log).catch(function(err) {console.log(err,'*');});
Why doesn't main.js exit right after the data is returned? Is there a better way to achieve this?
I see that pg maintains a connection pool. I would assume that it's either taking a while to time out some shared resources or it just takes a little while to shut down.
You may be able to just call process.exit(0) in your node app to force it to exit sooner if you know you're done with all your work.
Or, you may be able to find configuration settings that affect how the connection pool works.
On this doc page, there's an example like this that might help:
//disconnect client when all queries are finished
client.on('drain', client.end.bind(client));
You should read the doc for .end() to make sure you're using it correctly, as there are some cases where it says it should not be called (though they may not apply if you're done with all activity).
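If the goal is simply to let main.js exit once the poller has finished, one option (a sketch, assuming the older pg API used in the question, where pg.end() drains the module's default pool) is to close the pool after the promise settles:
var pg = require('pg');
var poller = require('./utils/poller').run_poller;

poller()
    .then(console.log)
    .catch(function (err) { console.log(err, '*'); })
    .then(function () {
        pg.end(); // assumption: releases the pooled connections so node can exit
    });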

How to force javascript not to make callback functions parallel?

I'm building a Twitter clone using Node.js and MongoDB with Mongoose. My Tweet model has body, user and created fields, where user is the id of the user who created the tweet. Now I'm building the API. When I make a GET request to /api/tweets/ I want to receive a list of all the tweets, but instead of the user field (which contains only the id of the user) I want the whole user object, so that I can display information about the tweet owner in my front end. I ended up with the following code.
exports.all = function (req, res, next) {
    Tweet.find({}, function (err, tweets) {
        if (err) return res.json(400, err);
        var response = [];
        tweets.forEach(function (element, index, array) {
            var tweet = {};
            tweet._id = element._id;
            tweet.created = element.created;
            tweet.body = element.body;
            User.findById(element.user, function (err, user) { // <- This line
                if (err) return res.json(400, err);
                tweet.user = user;
            });
            response.push(tweet);
        });
        return res.json(response);
    });
};
It works perfectly except that it doesn't add the user info. The problem is the line I have marked. When JavaScript reaches that line, it runs the query "in parallel" and continues with the rest of the code without waiting for the callback function. So it pushes a tweet object that doesn't yet have the user info. How can I fix this?
You're going to want to use the async library. It will make your life much easier.
// inside `Tweet.find` (requires: var async = require('async');)
async.each(tweets, function (element, done) {
    // do stuff to each tweet here
    User.findById(element.user, function (err, user) {
        if (err) return done(err);
        // do stuff with user
        done();
    });
}, function (err) {
    // called when every iteration is done (or one failed)
    if (err) return res.json(400, err);
    res.json(response);
});
The issue is that res.json sends the response immediately, before any of the findById callbacks have had a chance to run. You need to call res.json only once everything is done.
You can do this in several ways, but the easiest one with your existing code is to just keep a counter:
var tweetCount = 0;
tweets.forEach(function (element) { /* snip */
    User.findById(element.user, function (err, user) {
        tweet.user = user;
        tweetCount++;
        response.push(tweet);
        if (tweetCount == tweets.length) {
            res.json(response);
        }
    });
});
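A plain-promise sketch of the same idea, using Promise.all so the response is sent once every lookup has finished (this assumes a Mongoose version where findById returns a thenable query when no callback is passed):
Tweet.find({}, function (err, tweets) {
    if (err) return res.json(400, err);

    var lookups = tweets.map(function (element) {
        return User.findById(element.user).then(function (user) {
            return {
                _id: element._id,
                created: element.created,
                body: element.body,
                user: user
            };
        });
    });

    Promise.all(lookups)
        .then(function (response) { res.json(response); })
        .catch(function (err) { res.json(400, err); });
});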
You can use the Q promise library to sequence the work. It is simple and easy to use.
The response will only be sent when the whole promise chain has completed:
var result = Q();
tweets.forEach(function (tweet, i) {
    result = result.then(function () {
        // do your stuff here, returning a promise if the step is async
    });
});
result.then(function () {
    res.json(response); // successfully completed
});
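Since the tweets are Mongoose documents and user holds the owner's id, it is also worth mentioning populate(), which does this join in one query chain. This assumes the Tweet schema declares user as a ref to the User model, which the question doesn't show:
exports.all = function (req, res, next) {
    Tweet.find({})
        .populate('user') // assumes: user: { type: Schema.Types.ObjectId, ref: 'User' }
        .exec(function (err, tweets) {
            if (err) return res.json(400, err);
            res.json(tweets); // each tweet now carries the full user document
        });
};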
