Handling timeouts with Node.js and mongodb - javascript

I am currently testing how some code stands up against the following scenario:
The Node.js application is started and successfully establishes a connection to mongodb.
After it has successfully set up a connection, the mongodb server dies and all subsequent requests fail.
To test this I have the following code, which makes use of the official driver (found here: https://github.com/mongodb/node-mongodb-native):
MongoClient.connect('mongodb://localhost:27017/testdb', function(err, db) {
  app.get('/test', function(req, res) {
    db.collection('users', function (err, collection) {
      console.log(err);
      if (err) {
        // ## POINT 1 ##
        // Handle the error
      }
      else {
        collection.find({ 'username': username }, { timeout: true }).toArray(function(err, items) {
          console.log(err);
          if (err) {
            // ## POINT 2 ##
            // Handle the error
          }
          else {
            if (items.length > 0) {
              // Do some stuff with the document that was found
            }
            else {
              // Handle not finding the document
            }
          }
        });
      }
    });
  });
});
As the mongodb server is no longer running when the request is handled, I had assumed that at one of the points I have labelled ## POINT 1 ## or ## POINT 2 ##, an error indicating a timeout would be returned; this, however, isn't the case.
I have tried a number of different settings (including the one shown above that explicitly allows the cursor to timeout), but I cannot seem to trigger a timeout in any way. In every configuration I've tried, Node.js simply keeps waiting for the find() operation to call back and it never does.
If I start the Node.js application before running mongodb, it catches the error in the connect callback fine, but if the connection dies after that it doesn't seem to handle it in any way.
Is there a setting I am missing or is there no way to detect connections being terminated after they've been established?
Edit: just to be clear, the username variable used in the find method is actually declared in my full code; the code in this post is a cut-down version to illustrate the structure and error checking.

UPD:
Based on this commit reference, it looks like they've deployed a fix that does the same as the workaround below. Not sure whether it's already in the npm release (15.10.13): https://github.com/mongodb/node-mongodb-native/issues/1092#ref-commit-2667d13
After some investigation I've managed to understand what is going on:
Every time you call any method that deals with the database (find, update, insert, etc.), it creates a cursor that has its own ID and registers itself with the Db's EventEmitter so it can be called back later. In the meantime, it also registers itself in the _notReplied object within the same CallBackStore.
But once the connection is closed, I couldn't locate anything that iterates through the _notReplied cursors and triggers them with errors, or any timer-based logic for that (it might still be in there somewhere). So I've written a small workaround that force-triggers those cursors with an error when the Db emits a close event:
new mongodb.Db('testdb', new mongodb.Server('localhost', 27017, { }), { safe: true }).open(function (err, db) {
  if (!err) {
    db.on('close', function() {
      if (this._callBackStore) {
        for (var key in this._callBackStore._notReplied) {
          this._callHandler(key, null, 'Connection Closed!');
        }
      }
    });
    // ...
  } else {
    console.log(err);
  }
});
I recommend using this first approach (Db/Server) instead of MongoClient. There are a few reasons: for example, when you close the connection and then call .find, it will properly trigger the error in the callback, while with MongoClient it won't.
If you are using MongoClient:
MongoClient.connect('mongodb://localhost:27017/testdb', function(err, db) {
  if (!err) {
    db.on('close', function() {
      if (this._callBackStore) {
        for (var key in this._callBackStore._notReplied) {
          this._callHandler(key, null, 'Connection Closed!');
        }
      }
    });
    // ...
  } else {
    console.log(err);
  }
});
What does this do? Once the connection is closed, it iterates through all _notReplied cursors and triggers their callbacks with the error Connection Closed!.
Test case:
items.find({ }).toArray(function(err, data) {
  if (!err) {
    console.log('Items found successfully');
  } else {
    console.log(err);
  }
});
db.close();
This force-closes the database connection and triggers the close event you handled earlier, making sure the cursor is closed.
UPD:
I've opened an issue on GitHub: https://github.com/mongodb/node-mongodb-native/issues/1092. We'll see what they say about it.

I had the same problem and found this page through Google.
The accepted answer didn't resolve the problem for me; just as in your case, this._callBackStore couldn't be used.
But I tried wrapping the Mongo connection, and it seems to work fine:
var MongoClient = require('mongodb').MongoClient;
var mongo = {};

mongo.init = function() {
  MongoClient.connect('mongodb://localhost:27017/testdb', function(err, db) {
    if (err) {
      // could not connect: leave mongo.DB empty so getdb reports an error
      mongo.DB = '';
    } else {
      mongo.DB = db;
      // keep the cached handle in sync with the connection state
      db.on('close', function() {
        mongo.DB = '';
      });
      db.on('reconnect', function() {
        mongo.DB = db;
      });
    }
  });
};

mongo.getdb = function(callback) {
  if (mongo.DB) {
    callback(null, mongo.DB);
  } else {
    callback('can not connect to db', null);
  }
};

module.exports = mongo;
First, start the server and call init(). A minimal sketch of that startup step (the file name mongo.js is an assumption, not from the original answer):
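var mongo = require('./mongo.js'); // assumed file name for the module above
mongo.init();
Then you can require the module anywhere and use it: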
mongo.getdb(function(err, db) {
  if (err) {
    console.log(err);
  } else {
    db.collection('user').find({'xxx':'xxx'}).toArray(function(err, items) {
      console.log(items);
    });
  }
});

After some further investigation, it seems that you can't specify "offline" timeouts such as in the scenario outlined above. The only timeout that can be specified is one which tells the server to time out the cursor after 10 minutes of inactivity; however, since in the scenario above the connection to the server is down, this does not work.
For reference, I found this information here: https://github.com/mongodb/node-mongodb-native/issues/987#issuecomment-18915263, from someone I believe to be one of the main contributors to the project.
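For anyone reading this later: newer driver releases expose client-level timeout options that make operations fail instead of hanging when the server goes away. A minimal sketch, assuming a 3.x-era driver (the option names come from newer driver documentation, not the version discussed above):
// Sketch for a newer (3.x-era) driver; option names assumed from newer docs
MongoClient.connect('mongodb://localhost:27017/testdb', {
  connectTimeoutMS: 5000,         // give up establishing a connection after 5s
  socketTimeoutMS: 5000,          // error out operations whose socket stays idle for 5s
  serverSelectionTimeoutMS: 5000  // fail fast when no suitable server can be selected
}, function(err, client) {
  if (err) return console.log(err);
  // operations issued after the server dies should now return an error
  // instead of waiting forever
});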

I'm making an API with Hapi and MongoDB (without mongoose). Features:
Start responding to API requests only if the mongo db is available
Stop responding if mongo dies during a cycle
Restart when mongo is available again
Keep a single connection for all requests
Combining some ideas from other answers and this post (https://productbuilder.wordpress.com/2013/09/06/using-a-single-global-db-connection-in-node-js/), my approach is this:
server.js
Utilities.initializeDb(() => {
  server.start((err) => {
    if (err) throw err;
    console.log('Server running at:', server.info.uri);
  });
}, () => {
  server.stop((err) => {
    if (err) throw err;
    console.log('Server stopped');
  });
});
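The snippet above assumes server and Utilities already exist in scope. A rough sketch of that setup, assuming Hapi's pre-v17 callback API (which matches the server.start((err) => ...) style used here):
'use strict';

// Assumed setup for server.js (Hapi pre-v17; names follow the snippets above)
import { Utilities } from './Utilities';
import Hapi from 'hapi';

const server = new Hapi.Server();
server.connection({ port: 3000 });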
Utilities.js
"use strict";
const MongoClient = require('mongodb').MongoClient;
const MongoUrl = 'mongodb://localhost:27017/db';
export const Utilities = {
initializeDb: (next, onCrash) => {
const ConnectToDatabase = (params) => {
MongoClient.connect(MongoUrl, (err, db) => {
if (err !== null) {
console.log('#t4y4542te Can not connect to mongo db service. Retry in 2 seconds. Try #' + params.retry);
console.error(err);
setTimeout(() => {
ConnectToDatabase({retry: params.retry + 1});
}, 2000);
} else {
db.on('close', () => {
onCrash();
console.log('#21df24sf db crashed!');
ConnectToDatabase({retry: 0});
});
global.db = global.db || db;
next();
}
});
};
ConnectToDatabase({retry: 0});
}
};
I'm exporting the db connection to the global space. It doesn't feel like the best solution, but I've had projects where the db connection was passed as a parameter to every module, and that was worse. Maybe there should be some modular approach where you import the db connection where you need it, but in my situation I need it almost everywhere, so I would have to write that require statement in most files. This API is pointless without a connection to the db, so I think this might be the best solution, even though I'm against having something floating magically in the global space.
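For comparison, a minimal sketch of the modular alternative mentioned above (the file name db.js and the getDb name are placeholders, not from the answer):
// db.js: minimal sketch of a modular connection holder (names are placeholders)
"use strict";

const MongoClient = require('mongodb').MongoClient;
const MongoUrl = 'mongodb://localhost:27017/db';

let db = null;

const getDb = (callback) => {
  if (db) return callback(null, db);               // reuse the existing connection
  MongoClient.connect(MongoUrl, (err, database) => {
    if (err) return callback(err);
    db = database;
    database.on('close', () => { db = null; });    // drop the cached handle on disconnect
    callback(null, db);
  });
};

module.exports = { getDb };

// usage in any module that needs the database:
// require('./db').getDb((err, db) => { /* ... */ });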

Related

Modifying content in file

I've run into a problem which I can't solve.
I'm making an app where, on the first page, I need to choose one of two machines. There are two buttons on the page, and when one of them is clicked, I make a POST to /machineChoose where I pass the id of the selected machine. Then I need to change the config.js file, where I have all the params needed for the rest of the app.
const config = {
  machineName: "Machine",
  ...
So in my code I need to change machineName. Right now I use the fs module to read and then write to the file, but the problem is that I can't change this name more than once. When I restart the app, I'm able to change the name, but when trying to choose the second machine, nothing happens.
router.post("/machineChoose", async (req, res) => {
console.log(req.body.machineChoose);
if (req.body.machineChoose == 1) {
machineX = "Machine1";
} else {
machineX = "Machine2";
}
console.log(machineX);
fs.readFile('./config.js', 'utf-8', function (err,data){
if (err){
console.log(err);
}
var result = data.replace(config.machineName,machineX);
fs.writeFileSync('./config.js', result, 'utf-8', function(err){
if (err) return console.log(err);
});
});
return res.send("")
})
Any idea how to solve it?
After writing to the file, you need to reload the config object, as it will still hold the previous state in memory; further calls to data.replace(...) will therefore not replace anything, since they will still be called with "Machine".
I would do something like this, storing the config as plain JSON in config.json instead of a JS module (although you should consider using a real database):
router.post("/machineChoose", async (req, res) => {
const chosenMachine = req.body.machineChoose == 1 ? "Machine1" : "Machine2";
const config = await readConfig();
config.machineName = chosenMachine;
await writeConfig(config);
res.status(204).end();
});
async function writeConfig(currentConfig) {
try {
await fs.promises.writeFile("./config.json", JSON.stringify(currentConfig));
} catch (e) {
console.log("Could not write config file", e)
throw e;
}
}
async function readConfig() {
try {
const rawConfig = await fs.promises.readFile("./config.json", {encoding: 'utf-8'});
return JSON.parse(rawConfig);
} catch (e) {
console.log("Could not read config file", e)
throw e;
}
}

MongoClient's findOne() never resolves on Electron, even when collection is populated

I'm working on making a web app with Electron. I successfully connected to a MongoDB Atlas database and I'm able to send information to it. However, I seem to be unable to retrieve it. The first snippet of code included here is how I connected to the database.
MongoClient.connect(URI, (err, client) => {
  if (err) {
    console.log("Something unexpected happened connecting to MongoDB Atlas...");
  }
  console.log("Connected to MongoDB Atlas...");
  currentDatabase = client.db('JukeBox-Jam-DB'); /* currentDatabase contains a Db */
});
Then, this second snippet is how I've been writing to the database, which seems to work perfectly fine.
ipc.on('addUserToDatabase', (event, currentUser) => {
  const myOptions = {
    type: 'info',
    buttons: ['Continue'],
    defaultId: 0,
    title: 'Success',
    message: 'Your account has been created.'
  };
  dialog.showMessageBox(mainWindow, myOptions);
  currentCollection = currentDatabase.collection('UsersInformation');
  currentCollection.insertOne(currentUser);
});
Lastly, this is the code I've been trying to use to retrieve information from the database. I don't see where I could be making a mistake such that it works for writing but not for retrieving. From my understanding, findOne() called without parameters should simply return a Promise that resolves to the first entry matching the query passed to it. If a query is not provided, it resolves to the item that was put in the database first, and if there's no entry matching the query, it resolves to null. Any ideas why this isn't working?
ipc.on('checkUsernameRegistration', (event) => {
  currentCollection = currentDatabase.collection('UsersInformation');
  let myDocument = currentCollection.findOne(); /* I don't understand why this isn't working! */
  console.log(myDocument); /* This prints Promise { <pending> } */
  if (myDocument !== null) {
    /* If myDocument is not null, that means there is already someone with that username in the DB. */
  }
});
Thanks to everyone who is attempting to help me! I've been stuck on this for several hours now.
Try using async/await:
ipc.on('checkUsernameRegistration', async (event) => {
  currentCollection = currentDatabase.collection('UsersInformation');
  let myDocument = await currentCollection.findOne({ _id: value });
  if (myDocument !== null) {
    console.log(myDocument);
  }
});
or, you need to pass a callback, like this:
currentCollection.findOne({ country: 'Croatia' }, function (err, doc) {
  if (err) console.error(err);
  console.log(doc);
});
This happens because the query result is not available synchronously: without await (or a callback) you just get a pending promise, not the document. I recommend studying the difference between async and sync code in Node.js, just to understand where a callback should be passed to a function and where you can simply write await Model.method({ options }).

MongoDB insert without duplicates

Right now I am running mongodb and I just realized that, when inserting into collections, I am not sure whether I am preventing duplicates. Here is how I am inserting:
function insertCompanies(companyID, companyURL, companyAppID) {
  MongoClient.connect(url, function(err, db) {
    if (err) {
      console.log(err);
    } else {
      console.log("We are connected");
    }
    var collection = db.collection('Companies');
    var company = {
      "companyProfileID": companyID,
      "url": companyURL,
      "appID": companyAppID
    };
    collection.insert(company, function(err, result) {
      if (err) {
        console.log(err);
      } else {
        console.log(result);
      }
    });
    db.close();
  });
}
I am using Node.js. When a user calls the insertCompanies method and the company already exists (by companyID or URL), the collection still allows duplicates.
Is there a different insert function that prevents duplicates? I looked around and could not find a straightforward solution tailored to my case.
Instead of collection.insert() you should use collection.update() and specify the upsert: true option.
https://docs.mongodb.com/v3.2/reference/method/db.collection.update/
The answer is not to insert; instead, use update. The update() method either modifies the fields in existing documents or replaces an existing document entirely.
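A minimal sketch based on the code in the question (keying the upsert on companyProfileID is an assumption; pick whichever field should be unique):
// Sketch: upsert keyed on companyProfileID (the choice of key is an assumption)
collection.update(
  { "companyProfileID": companyID },                        // match an existing company
  { $set: { "url": companyURL, "appID": companyAppID } },   // update its fields
  { upsert: true },                                         // insert if nothing matched
  function(err, result) {
    if (err) {
      console.log(err);
    } else {
      console.log(result);
    }
  }
);
Alternatively, a unique index on companyProfileID would make the database itself reject duplicate inserts.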

How to ensure you are closing your functions properly with JavaScript?

Can anyone explain to me why I am processing the 50 records but never see console.log("all records processed"); in the console?
It is like I am closing a function too soon or too late. What is the best approach when working with callbacks? I am sure that is why I am not getting "all records processed". I am using node v0.10.26 with the npm oracle plugin.
var oracle = require('oracle');

var connectData = {
  hostname: "127.0.0.1",
  port: 1521,
  database: "xe", // System ID (SID)
  user: "user",
  password: "password"
};

oracle.connect(connectData, function(err, connection) {
  if (err) {
    console.log("Error connecting to db:", err);
    return;
  }
  connection.setPrefetchRowCount(50);
  var reader = connection.reader("SELECT * FROM CARS", []);

  function doRead(cb) {
    reader.nextRow(function(err, row) {
      if (err) return cb(err);
      if (row) {
        // do something with row
        console.log("got " + JSON.stringify(row));
        // recurse to read next record
        return doRead(cb);
      } else {
        // we are done
        return cb();
      }
    });
  }

  doRead(function(err) {
    if (err) throw err; // or log it
    console.log("all records processed");
  });
});
Does it make a difference that you don't have a semicolon after the closing brace of connectData? Personally, I just add more and more console logs until I figure out the line that's messing everything up. That, or use breakpoints if you can.
For these step-by-step callbacks, I suggest you use the async module to manage the callback pyramid.
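For illustration, a rough sketch of the question's read loop rewritten with async.whilst (this assumes the async library's classic API where the test function is synchronous):
var async = require('async');

// Sketch: flatten the recursive doRead into async.whilst (classic async API assumed)
var lastRow = {}; // sentinel value so the loop body runs at least once
async.whilst(
  function () { return lastRow != null; },       // keep reading while rows come back
  function (next) {
    reader.nextRow(function (err, row) {
      if (err) return next(err);
      lastRow = row;
      if (row) console.log("got " + JSON.stringify(row));
      next();
    });
  },
  function (err) {
    if (err) throw err; // or log it
    console.log("all records processed");
  }
);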

Asynchronously load NPM module

I'm trying to create a module that connects to my database (CouchDB, using Cradle). In the end, the module exports the 'db' variable.
Here is the code:
var cradle = require('cradle'),
    config = require('./config.js');

var db = new(cradle.Connection)(config.couchURL, config.couchPort, {
  auth: {
    username: config.couchUsername,
    password: config.couchPassword
  },
  cache: true,
  retries: 3,
  retryTimeout: 30 * 1000
}).database('goblin'); // database name

// Check if DB exists
db.exists(function (err, exists) {
  if (err && exists) {
    console.log("There has been an error finding your CouchDB. Please make sure you have it installed and properly pointed to in '/lib/config.js'.");
    console.log(err);
    process.exit();
  } else if (!exists) {
    db.create();
    console.log("Welcome! New database created.");
  } else {
    console.log("Talking to CouchDB at " + config.couchURL + " on port " + config.couchPort);
  }
});

module.exports = db;
The problem is that the db.exists call is async, and if the database doesn't exist I think the variable is exported before the check has finished, affecting the rest of the system.
It's included in the main node script the normal way:
var db = require('./couchdb.js');
Is there a way to prevent this from happening, or are there any best practices for tackling a problem like this without a giant nested callback?
For reference, you can see the entire application here (https://github.com/maned/goblin), and the bug filed for the project here (https://github.com/maned/goblin/issues/36).
Embrace the async style. Instead of exporting db from the module, export an async function like this:
module.exports = {
  getDb: function(callback) {
    db.exists(function(err, exists) {
      if (exists && !err) { callback(db); } else { /* complain */ }
    });
  }
};
Now the application can just require('mymodule').getDb(appReady) where appReady accepts a db object that is known to be valid and usable.
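For example, assuming the module above is saved as couchdb.js (the file name and the db.save call are illustrative, following cradle's documented style):
// Usage sketch; './couchdb.js' is assumed to be the module exporting getDb above
require('./couchdb.js').getDb(function appReady(db) {
  // db is known to exist and be reachable at this point
  db.save('some-doc-id', { title: 'hello' }, function (err, res) {
    if (err) console.log(err);
  });
});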
