Node.js and Heroku

I am getting started with Node.js and Heroku, and I am trying to make sense of the following code in order to build something of my own:
app.get('/db', function (request, response) {
  // acquire a client from the connection pool
  pg.connect(process.env.DATABASE_URL, function (err, client, done) {
    client.query('SELECT * FROM test_table', function (err, result) {
      done(); // release the client back to the pool
      if (err) {
        console.error(err);
        response.send("Error " + err);
      } else {
        response.render('pages/db', { results: result.rows });
      }
    });
  });
});
Where can I find a tutorial, or some comments or explanations, for this code?
Even though I can do some guessing, a good deal of it is pretty mysterious.
Currently my main concerns are:

1. What happens if I change the SQL query, replacing it with 'SELECT count(*) FROM test_table'? How do I then render the result?
2. What does done(); do? Is it something I can modify or make use of?
3. The parameter request is never used. Can it be used for something at some point?

Before tackling Heroku, you should first look at tutorials about building web applications in Node.js, which will answer your last question.
You can see how express.js, a web framework, works.
Then look at the node-postgres documentation. You will find the answer to your second question there:
// this initializes a connection pool
// it will keep idle connections open for 30 seconds
// and set a limit of a maximum of 10 idle clients
var pool = new pg.Pool(config);

// to run a query we can acquire a client from the pool,
// run a query on the client, and then return the client to the pool
pool.connect(function(err, client, done) {
  if (err) {
    return console.error('error fetching client from pool', err);
  }
  client.query('SELECT $1::int AS number', ['1'], function(err, result) {
    // call `done()` to release the client back to the pool
    done();
    if (err) {
      return console.error('error running query', err);
    }
    console.log(result.rows[0].number);
    // output: 1
  });
});
And finally, why don't you just log the result after changing the SQL query and look at what you get?
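To make the first question concrete, here is a minimal sketch of what the count query could look like, assuming the same Express/pg setup as the question; the 'pages/db-count' template name is hypothetical. count(*) comes back as a single row whose column is named count:

app.get('/db/count', function (request, response) {
  pg.connect(process.env.DATABASE_URL, function (err, client, done) {
    client.query('SELECT count(*) FROM test_table', function (err, result) {
      done(); // release the client back to the pool
      if (err) {
        console.error(err);
        response.send("Error " + err);
      } else {
        // result.rows is [{ count: '42' }]; the driver returns count as a string
        response.render('pages/db-count', { count: result.rows[0].count });
      }
    });
  });
});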

Related

What is the correct way to release a connection after commit in Node.js for oracledb?

I am currently using AngularJS as my frontend framework, with ExpressJS on Node.js providing the REST API as my backend.
In my insertService function in Node.js, after I insert some value into the database, I want to commit and release the connection. However, I am getting the following error:
NJS-032: connection cannot be released because a database call is in progress
These are my commit and release functions:
function doRelease(connection) {
  console.log("before release");
  connection.release(function(err) {
    if (err) {
      console.error(err.message);
    }
  });
  console.log("after release");
}

function doCommit(connection) {
  console.log("before commit");
  connection.commit(function(err) {
    if (err) {
      console.error(err.message);
    }
  });
  console.log("after commit");
  doRelease(connection);
}
This is how I am calling them:
app.post('/addService', function(req, res) {
  console.log("addService is called");
  oracledb.getConnection(
    DBconfig,
    function(err, connection) {
      if (err) { console.error(err); return; }
      connection.execute(
        "Insert into mylist Values (MYLIST_ID_SEQUENCE.nextval ,'"+req.body.name+"','"+req.body.description+"')",
        function(err, result) {
          if (err) { console.error(err); doRelease(connection); return; }
          console.log("added mylist: " + req.body.name);
          doCommit(connection);
        }
      );
    }
  );
})
This is the printout:
addService is called
before commit
after commit
before release
after release
NJS-032: connection cannot be released because a database call is in progress
How should I handle this issue? Should I sleep for 1 second before calling release? Should I recursively call release until it is successful?
Thanks
Before I answer your question I have to point out that your code is currently open to SQL injection vulnerabilities. Values from end users (in this case from req.body) should not be concatenated into the SQL; they should be "bound" in with bind variables.
Also, your API will not scale if you're getting one-off connections. You should create a connection pool and get connections from the pool.
Finally, you can use autoCommit (in the execute options object) to save an unnecessary round trip.
Now to your question, you have to wait until the commit finishes before releasing the connection. In doCommit, move the call to doRelease so that it's in the callback to connection.commit:
function doCommit(connection) {
  console.log("before commit");
  connection.commit(function(err) {
    if (err) {
      console.error(err.message);
    }
    console.log("after commit");
    doRelease(connection);
  });
}
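Putting the earlier points together, here is a hedged sketch of what the execute call could look like with bind variables and autoCommit; the table and sequence names come from the question, and exact option handling may vary with your node-oracledb version:

connection.execute(
  "Insert into mylist Values (MYLIST_ID_SEQUENCE.nextval, :name, :description)",
  { name: req.body.name, description: req.body.description }, // bind variables, no string concatenation
  { autoCommit: true }, // commits in the same round trip, so no separate doCommit is needed
  function(err, result) {
    if (err) { console.error(err); }
    doRelease(connection); // release only after execute has finished
  }
);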
On another note, I have a series on building REST APIs you might want to check out. In part 2, on database basics, I show how you can simplify these types of simple statement executions. There are links to the code on GitHub so you should be able to pull it down and see how it works for you.

Is opening a new connection for each query with a MongoDB database good practice?

I'm creating a web server that stores a user's data in a MongoDB database. The code behind the web requests uses asynchronous functions to insert a document into the database, but because these functions are asynchronous, a new connection is made to the server for every request.
exports.create_user = function(username, password, callback) {
  mongo.connect(url, function(err, db) {
    db.collection('users').insertOne({username: username, password: password}, function(err, result) {
      callback(result)
      db.close()
    })
  })
}
I'm under the impression that doing it this way is not best practice, but I can't think of a way to do it using the module pattern that I'm using above. Any suggestions or advice would be appreciated.
I stumbled upon this during my own research into whether opening a new connection to MongoDB for each query is best practice, or whether to use connection pooling. It turns out that MongoDB suggests connection pooling for most use cases.
Citing from the docs:
A Connection Pool is a cache of database connections maintained by the driver so that connections can be re-used when new connections to the database are required. To reduce the number of connection pools created by your application, we recommend calling MongoClient.connect once and reusing the database variable returned by the callback
I am usually using the following form to establish and reuse a connection while firing queries:
// db.js
import { MongoClient } from 'mongodb';

// this will hold our cached database connection, which will itself
// hold multiple connections in a pool to be used
let connection,
    database;

export default {
  connect: (next) => {
    // already established? => return connection
    if (database) return next(undefined, database);
    // establish connection (note: the scheme must be mongodb://, not http://)
    MongoClient.connect('mongodb://localhost:27017/admin', (err, db) => {
      if (err) return next(err);
      // save connection
      connection = db;
      // connect to database
      database = db.db('myDatabase');
      // call callback
      next(undefined, database);
    });
  },
  disconnect: (next) => {
    if (!connection) return next();
    // close connection
    connection.close();
    next();
  }
};
Firing queries:
import db from './db';

db.connect((err, db) => {
  if (err) throw err;
  db.collection('myUsers').insertOne({name: 'test'}, (err) => {
    if (err) throw err;
    db.disconnect((err) => {
      if (err) throw err;
      console.log('Everything finished, database connection closed');
    });
  });
});
Note: it is possible to set the maximum number of pooled connections manually (afaik the default is 5?). Refer to the docs for how to set the number of open connections via the MongoDB URL.
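For illustration, here is a small hedged example of capping the pool through the connection string; maxPoolSize is a standard MongoDB URI option, but the default value and the set of supported options depend on your driver version:

// allow at most 10 pooled connections (maxPoolSize set in the URI)
MongoClient.connect('mongodb://localhost:27017/admin?maxPoolSize=10', (err, db) => {
  if (err) throw err;
  // ... use db as usual; the driver manages the pool internally
});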
Calling db.close() closes the connection. If you don't close your connection, the event loop will keep the connection open and your process will not exit. If you are building a web server, where your process will not be terminated, it's not necessary to close the connection.
For reference: node-mongodb-native

Node.js and the pg module: 'How can I really close the connection?'

I'm going crazy with the node pg module, getting a 'too many clients already' error.
My app.js file, for example, manages some routes in which I query some data from Postgres. app.js looks like below:
// First I create a client
var client = new pg.Client(connectionString);

// Then I use that client for every route, for example:
ContPg.prototype.someController = function(req, res) {
  client.connect(function(error) {
    if (error) return console.error('error conectando', error);
    // Need to close client if there's an error connecting??
    client.query(someQuery, function(e, r) {
      client.end();
      // Here sometimes I don't end the client if I need to query more data
      if (e) return console.error('error consultando', e);
      // Do anything with result...
    })
  });
}
As I said, I use that client for all routes in the file pg.js, but in other files with other routes I do the same to connect to Postgres (create a client and use it for all the routes that the file manages).
Questions
Is something wrong with my code? Am I ending the client connection incorrectly?
If there's nothing wrong, what could be causing 'too many clients already' error?
Thanks in advance!!
The recommended pattern is to use client pooling. From the node-postgres documentation:
Generally you will access the PostgreSQL server through a pool of clients. A client takes a non-trivial amount of time to establish a new connection. A client also consumes a non-trivial amount of resources on the PostgreSQL server - not something you want to do on every http request. Good news: node-postgres ships with built in client pooling.
var pg = require('pg');
var conString = "postgres://username:password@localhost/database";

// this initializes a connection pool
// it will keep idle connections open for a (configurable) 30 seconds
// and set a limit of 20 (also configurable)
pg.connect(conString, function(err, client, done) {
  if (err) {
    return console.error('error fetching client from pool', err);
  }
  client.query('SELECT $1::int AS number', ['1'], function(err, result) {
    // call `done()` to release the client back to the pool
    done();
    if (err) {
      return console.error('error running query', err);
    }
    console.log(result.rows[0].number);
    // output: 1
  });
});
Don't forget to call done() or you'll be in trouble!
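Applied to the question's code, here is a minimal sketch of someController rewritten to use the pool, assuming the same connectionString and someQuery variables from the question:

ContPg.prototype.someController = function(req, res) {
  pg.connect(connectionString, function(error, client, done) {
    if (error) return console.error('error fetching client from pool', error);
    client.query(someQuery, function(e, r) {
      done(); // always release the client back to the pool, success or failure
      if (e) return console.error('error running query', e);
      // do anything with r.rows, then respond
    });
  });
};

With this pattern there is no client.end() to forget; the pool keeps a bounded number of connections open, which avoids the 'too many clients already' error.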

MongoDB collection.find() query hanging

I am trying to make a query to find a user by username like so:
userRouter.get('/user/:user_username', function(req, res) {
  console.log("GET request to '/user/" + req.params.user_username + "'");
  User.find({ usernmame: req.params.user_username }, function(err, user) {
    if (err) res.send(err); return;
    res.json(user);
  });
});
The query never completes and Chrome dev tools is showing 'pending' on the request. It's definitely going to that route because it prints the console message I logged at the start. I executed the same query in the mongo cli and it works. I tried logging messages in the callback body, but it never gets to that point. I'm at a loss as to what to do at this point.
It's because if (err) res.send(err); return; gets evaluated as
if (err) {
  res.send(err);
}
return;
To fix the issue, consider adding some braces.
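For clarity, here is the handler with the braces added. As a side note, the question's code spells the queried field usernmame, which would additionally make the query match nothing; the field name username below is the assumed intent:

userRouter.get('/user/:user_username', function(req, res) {
  console.log("GET request to '/user/" + req.params.user_username + "'");
  User.find({ username: req.params.user_username }, function(err, user) {
    if (err) {
      res.send(err);
      return; // the early return now fires only on error
    }
    res.json(user);
  });
});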

Sending multiple emails in Node.js

A web app I'm building will send out invoices to clients every third month. This will be a scheduled event that runs in the middle of the night, but during development I have put this code into a route so I can test it.
In short, I want the code to do the following:

1. Query all unsent invoices from the DB.
2. Make a call to Mandrill for each invoice (in this call I'm also invoking a function that creates a Mandrill message object from the invoice).
3. For every message Mandrill sends, update the DB invoice with sent: true.
4. When all invoices are sent, make a final callback in the async.waterfall.

The code below works, but I have some concerns regarding the _.each.
invoices.post('/invoices/send/', function(req, res, next) {
  async.waterfall([
    // Query all unsent invoices
    function(callback) {
      db.invoices.find({sent: false}).toArray(callback);
    },
    // Send all unsent invoices
    function(invoices, callback) {
      if (invoices.length === 0) {
        var err = new Error('There are no unsent invoices');
        err.status = 400;
        return next(err); // Quick escape if there are no matching invoices to process
      }
      // Make a call to the Mandrill transactional email service for every invoice.
      _.each(invoices, function(invoice) {
        mandrillClient.messages.sendTemplate({template_name: "planpal-invoice", template_content: null, message: mandrillClient.createInvoiceMessage(invoice)}, function(sendResult) {
          console.log(sendResult);
          db.invoices.updateById(invoice._id, {$set: {sent: true}}, function(err, saveResult) {
            console.log(saveResult);
          });
        }, function(err) {
          return next(err);
        });
      });
      callback(null, 'done');
    }
  ],
  function(err, result) {
    if (err) {
      return next(err);
    }
    res.json(result);
  });
});
I'm thinking I should use async.eachLimit instead... but I don't know how to write it.
I have no idea what I should set the limit to, but I guess several parallel requests would be better than running all the Mandrill requests in series like above, am I wrong? EDIT: _.each runs the callbacks in parallel. The difference from async.each is that I don't get a "final callback".
Conclusion: should I use async.eachLimit above? If yes, what is a good limit value?
I think you can use the async.each function (https://github.com/caolan/async#each).
It will execute the queries in parallel too.
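Since the question explicitly asks how eachLimit would be written, here is a hedged sketch of the second waterfall step using async.eachLimit. The limit of 5 concurrent Mandrill calls is a guess at a sensible starting point, not a value from the Mandrill docs; tune it against your rate limits:

// Second waterfall step, rewritten with async.eachLimit
function(invoices, callback) {
  if (invoices.length === 0) {
    var err = new Error('There are no unsent invoices');
    err.status = 400;
    return callback(err);
  }
  // at most 5 Mandrill calls in flight at once
  async.eachLimit(invoices, 5, function(invoice, done) {
    mandrillClient.messages.sendTemplate(
      {template_name: "planpal-invoice", template_content: null, message: mandrillClient.createInvoiceMessage(invoice)},
      function(sendResult) {
        // mark the invoice as sent, then report this item as finished
        db.invoices.updateById(invoice._id, {$set: {sent: true}}, done);
      },
      done // Mandrill's error callback
    );
  }, function(err) {
    // final callback: fires once, after every invoice has been processed
    callback(err, 'done');
  });
}

Unlike the _.each version, the final callback here replaces the premature callback(null, 'done'), so the waterfall only finishes when every invoice has actually been sent and updated.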
