I am using a Node Express API to run SQL queries that populate a dashboard. I am using the mssql package (with the msnodesqlv8 driver) to do so. Sometimes it runs flawlessly; other times I get the following error:
[Error: [Microsoft][SQL Server Native Client 11.0]Query timeout expired]
I create a poolPromise that wraps a ConnectionPool to the db, then pass that object to my controllers, which run the specific queries that populate the dashboard. Starting the server runs the db.js script and connects to MSSQL with a pooled connection.
db.js:
// for connecting to SQL Server
const sql = require('mssql/msnodesqlv8');

// db config to connect via Windows auth
const dbConfig = {
  driver: 'msnodesqlv8',
  connectionString: 'Driver={SQL Server Native Client 11.0};Server={my_server};Database={my_db};Trusted_Connection={yes};',
  pool: {
    idleTimeoutMillis: 60000
  }
};
// create a ConnectionPool object to pass to controllers
// this should keep a SQL connection open indefinitely that we can query while the server is running
const poolPromise = new sql.ConnectionPool(dbConfig)
  .connect()
  .then(pool => {
    console.log('Connected to MSSQL');
    return pool;
  })
  .catch(err => console.log('Database Connection Failed! Bad Config: ', err));

module.exports = { sql, poolPromise };
An example of one of my controllers and how I use the poolPromise object is below. I currently have about seven of these controllers, each running its own query to populate a specific element on the dashboard. Each query runs in 1-10 seconds (depending on current server load; I am querying an enterprise production server/db, so this varies). As I mentioned, the queries sometimes run flawlessly, but at other times I do have issues. Is this a symptom of querying a shared production server? Would it be preferable to query a server with less load? Or am I doing something in my code that could be improved?
const { sql, poolPromise } = require('../db');

// function to get data
const getData = async (req, res) => {
  try {
    // create query parameters from the user request
    let id = req.query.id;
    // create query from the connection pool
    let pool = await poolPromise;
    let qry = `
      select * from tbl where id = @Id
    `;
    let data = await pool.request()
      .input('Id', sql.VarChar(sql.MAX), id)
      .query(qry);
    // send 200 status and return the records
    res.status(200);
    res.send(data.recordset);
  } catch (err) {
    console.log('Error:');
    console.log(err);
    res.sendStatus(500);
  }
};

module.exports = { getData };
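One avenue worth checking: the error is a request timeout, not a connection failure, and the mssql package's requestTimeout option defaults to 15000 ms, right at the top of the 1-10 second range quoted above, so extra load on the shared production server could push a query past the limit. A minimal sketch of raising it (assuming the option is honored when connecting through msnodesqlv8):

// db.js sketch: same config as above, with a longer request timeout
const dbConfig = {
  driver: 'msnodesqlv8',
  connectionString: 'Driver={SQL Server Native Client 11.0};Server={my_server};Database={my_db};Trusted_Connection={yes};',
  requestTimeout: 60000, // allow queries up to 60s (mssql default is 15000)
  pool: {
    idleTimeoutMillis: 60000
  }
};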
I have inherited a legacy system built on Node.js and Postgres. Whenever a database call errors (say an insert violates a duplicate-key constraint), the db client throws an error which I handle, but it is then unable to make subsequent queries to the db and it hangs.
I have tried adding an error listener that recreates the client on an error event, but to no avail. I have many files calling the db, so it's not ideal to recreate the client in each catch clause.
db connection class
const { Client } = require('pg');
const config = require('../../config');
const log = require('../../logger').LOG;

// let rather than const, since the error handler below reassigns it
let client = new Client({
  connectionString: config.dbUrl
});

client.on('error', err => {
  log.info('client connection Error! ' + err.stack);
  // attempt to recreate the client on error (the workaround that has not helped)
  client = new Client({
    connectionString: config.dbUrl
  });
  client.connect();
});

client.on('end', () => {
  log.info('client connection! End client sent');
});

client.on('notification', msg => {
  log.info('client connection! notification message sent ' + msg);
});

client.connect();

// Export the Postgres Client module
module.exports = client;
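As an aside, a common alternative to recreating clients by hand is pg's built-in Pool, which drops a client whose connection errors and hands out a healthy one on the next query. A minimal sketch (same config module, not the original code):

const { Pool } = require('pg');
const config = require('../../config');
const log = require('../../logger').LOG;

const pool = new Pool({
  connectionString: config.dbUrl
});

// fires only for errors on idle clients; the pool discards the broken client itself
pool.on('error', err => {
  log.info('idle client error: ' + err.stack);
});

// pool.query() checks out a client, runs the query, and returns the client automatically
module.exports = pool;

Callers can then use pool.query(...) exactly as they used client.query(...), without any reconnect logic in catch clauses.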
sample query that is encountering the error
function createGame(gameHash) {
  Model.query("INSERT INTO games (hash) values($1) RETURNING gId", [gameHash], function (err, db_res) {
    if (err) {
      log.info('Game record creation error: ' + err.stack);
      return; // without this, the line below dereferences an undefined db_res
    }
    log.info('create record db resp: ' + JSON.stringify(db_res));
    gameId = db_res.rows[0].gId;
  });
}
UPDATE:
After reviewing the logs closely, I have observed that the issue occurs when the client runs an UPDATE command instead of the INSERT in the query above. Interestingly, the query passed to it is a hard-coded string that is clearly an insert, but for some reason an UPDATE command is executed.
2023-02-03 01:34:04 : create record db resp- with hash=>:
{"command":"UPDATE","rowCount":1,"oid":null,"rows":[],"fields":[],"_types":{…},"RowCtor":null,"rowAsArray":false}
I have a MariaDB database that stores energy data such as voltage, frequency and so on. My aim is to visualize the data in a web application. Though I have managed to connect MariaDB to Node.js and serve the data on a specific port with the code below, I don't have a clue how to store this data for further mathematical operations or visualizations.
How can I store the data for further operations?
const express = require('express');
const pool = require('./db');

const app = express();
const port = 4999;

// expose an endpoint "persons"
app.get('/persons', async (req, res) => {
  let conn;
  try {
    // get a connection from the MariaDB pool
    conn = await pool.getConnection();
    // create a new query to fetch all records from the table
    var query = "select * from Herget_Netz2_WirkleistungL1";
    // run the query and set the result to a new variable
    var rows = await conn.query(query);
    console.log('Data is coming');
    // return the results
    res.send(rows);
  } catch (err) {
    throw err;
  } finally {
    if (conn) return conn.release();
  }
});

app.listen(port, () => console.log(`Listening on port ${port}`));
This question is quite broad.
It sounds like you need to set up a frontend and call fetch against your endpoint, something like:

fetch('<your-url>/persons')
  .then(r => r.json())
  .then(yourData => '<p>' + yourData + '</p>');

Your data can then be interpolated into HTML. You will need to iterate over it, as in the sketch below.
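A sketch of that iteration (assuming the endpoint above returns an array of row objects; the column name WirkleistungL1 is a placeholder for whatever your table actually contains):

fetch('http://localhost:4999/persons')
  .then(r => r.json())
  .then(rows => {
    // rows is a plain array you can keep for further calculations
    const html = rows.map(row => '<p>' + row.WirkleistungL1 + '</p>').join('');
    document.body.innerHTML = html;
  });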
The "storage" will take place in the variable you define in the second .then(yourData) of the promise for you to do further operations on.
You should search for tutorials like "set up frontend with maria db database and node backend".
I'm new to Node.js and I'm currently doing a SQL-to-MongoDB migration. I have created a script to load data into MongoDB from SQL queries. I built the script from sample code found on Google and it is working, but I am facing the issue below and need a workaround.
I have an array of SQL queries, and I don't want to run any of them if any query has a syntax issue or an error in its result. (Say the second query has a syntax issue: then the data from the first query should not be loaded into Mongo, which is what currently happens in my case.) Basically, if any query has any issue, none of the results should be loaded into the Mongo collection. Likewise, if anything fails on the Mongo side, the transaction should not be committed.
I have used Mongo transactions here to roll back the data on errors. Please find the code below; any help would be much appreciated. The SQL and Mongo credentials are mock data only.
config file code
var mongoCollection = 'collectionName';
exports.mongoCollection = mongoCollection;

var queryList = [
  'sample query one',
  'sample query two'
];
exports.queryList = queryList;
main script code
var MongoClient = require('mongodb').MongoClient;
var sql = require('mysql');
const config = require('./assets/config');

var sqlConfig = {
  user: 'username',
  password: 'password',
  server: 'servername',
  database: 'databasename',
  port: 'portname',
  multipleStatements: true
};

async function transaction() {
  const mongodbUrl = 'mongourl';
  const client = await MongoClient.connect(mongodbUrl, { useNewUrlParser: true, useUnifiedTopology: true });
  const db = client.db();
  config.queryList.forEach(query => {
    new sql.ConnectionPool(sqlConfig).connect().then(pool => {
      return pool.request().query(query);
    }).then(result => {
      (async () => {
        const session = client.startSession();
        session.startTransaction({
          readConcern: { level: 'snapshot' },
          writeConcern: { w: 'majority' }
        });
        try {
          const collection = client.db('mongodbName').collection(config.mongoCollection);
          await collection.insertMany(result.recordset, { session });
          await session.commitTransaction();
          session.endSession();
          console.log('transaction completed');
        } catch (error) {
          await session.abortTransaction();
          session.endSession();
          console.log('transaction aborted');
          throw error;
        }
      })();
      sql.close();
    }).catch(error => {
      sql.close();
      throw error;
    });
  });
}

transaction();
Depending on the volume of data, you might look at breaking the process into two parts:
Get the data from mySql
If no errors, load into Mongo
That would save you from having to roll back the Mongo writes.
You can also take advantage of the default Mongo pool size (5) and use a pool on the MySQL side too.
Currently, this code creates a new pool for every select, which isn't optimal:
config.queryList.forEach(query => {
  new sql.ConnectionPool(sqlConfig).connect().then(pool => { // <- new pool per query?
    return pool.request().query(query);
  });
});
Instead, you can set up a pool once, per the MySQL driver documentation.
It looks like that driver only has a callback API, but you can promisify the query to make it easier to work with.
So to put it all together, you could try something like this (this isn't working/tested code, just a suggestion):
var MongoClient = require('mongodb').MongoClient;
var sql = require('mysql');
const config = require('./assets/config');

var pool = sql.createPool({
  connectionLimit: 5,
  host: 'servername',
  user: 'username',
  password: 'password',
  database: 'databasename'
});

async function transaction() {
  try {
    const mongodbUrl = 'mongourl';
    const client = await MongoClient.connect(mongodbUrl, { useNewUrlParser: true, useUnifiedTopology: true });
    const collection = client.db('mongodbName').collection(config.mongoCollection);
    // Map your query list to an array of runSql promises;
    // this will complete when all queries return, and jump to the catch if any fail
    let results = await Promise.all(config.queryList.map(runSql));
    // Map the results to an array of mongo inserts (the mysql driver returns the rows directly)
    let inserts = await Promise.all(results.map(rows => collection.insertMany(rows)));
    // Close all connections
    pool.end(err => err ? console.error(err) : console.log('MySQL Closed'));
    client.close(err => err ? console.error(err) : console.log('MongoDB Closed'));
  } catch (err) {
    console.error(err);
  }
}

transaction();

function runSql(queryStr) {
  return new Promise((resolve, reject) => {
    pool.query(queryStr, function (error, results, fields) {
      error ? reject(error) : resolve(results);
    });
  });
}
If data volume is a concern, you might want to look at getting streams from your MySQL selects instead of running them outright, as sketched below.
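For instance, the mysql driver can emit rows as a stream (a sketch, untested like the code above; the table name is a placeholder):

// stream rows instead of buffering the whole result set in memory
pool.query('SELECT * FROM big_table')
  .stream({ highWaterMark: 5 })
  .on('data', row => {
    // handle each row as it arrives, e.g. batch rows for insertMany
  })
  .on('error', err => console.error(err))
  .on('end', () => console.log('stream finished'));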
To make my code more readable, I'm trying to move all database-related code into a single file and use Sequelize as the ORM. I would like this file, when included, to provide a ready-to-use database. Table schemas are also managed by Sequelize, which is why I use the sync() method to create the tables on the first run. Unfortunately, when I run the application for the first time, I get an error that the table doesn't exist when using this code:
File: test.js
const database = require('./dbInit');

(async () => {
  await database.testTable.max('id').then((maxId) => {
    console.log(maxId);
  });
})();
File: dbInit.js
const Sequelize = require('sequelize');
const sequelize = new Sequelize('mysql://root:root@localhost:3306/test');

const testTable = sequelize.import('testTable');

const database = {
  sequelize: sequelize,
  testTable: testTable,
};

sequelize
  .authenticate()
  .then(() => {
    console.log('Connection to the database has been established successfully.');
  })
  .catch(error => {
    console.error(error);
  });

sequelize.sync();

module.exports = database;
File: testTable.js
const Sequelize = require('sequelize');

module.exports = (sequelize, DataTypes) => {
  return sequelize.define('testTable', {
    id: {
      type: Sequelize.BIGINT(19).UNSIGNED,
      primaryKey: true,
      autoIncrement: false,
    }
  });
};
When I run the code as is, without tables created, I can see from the logs that the query is run before the connection to the database is available:
> node .\test.js
Executing (default): SELECT 1+1 AS result
Executing (default): SELECT max(`id`) AS `max` FROM `testTables` AS `testTable`;
Connection to the database has been established successfully.
(node:1572) UnhandledPromiseRejectionWarning: SequelizeDatabaseError: Table 'test.testtables' doesn't exist
I have found a way to make it work by adding this line, just before the call to the DB (in test.js, before the max('id') call):
await database.sequelize.sync();
Is there any other way to have the dbInit module completely independent, without having to add this sync() call in every file that requires database connectivity?
I've looked for synchronous module loading, but it doesn't seem to be an option yet.
Because of the async behavior, all of the operations you want to do:
Connect
Sync
Do DB operations
have to happen in that order, so structure it the following way:
put your model files in a db folder, e.g. db/schemas/User.js,
and make a module file for the db: db/index.js
const Sequelize = require('sequelize');
const sequelize = new Sequelize('mysql://root:root@localhost:3306/test');

const connect = async () => {
  try {
    await sequelize.authenticate();
    await sequelize.sync();
    console.log('Connection to the database has been established successfully.');
  }
  catch (error) {
    console.error(error.message);
    process.exit(-1);
  }
};

const model = name => database.models[name];

const User = sequelize.import('./schemas/User');

const database = {
  sequelize: sequelize,
  models: { User },
  connect,
  model
};

module.exports = database;
and in test.js:
const db = require('./db');

(async () => {
  await db.connect();
  const User = db.model('User');
  const id = await User.max('id');
  console.log(id);
})();
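Alternatively (a sketch in the same spirit, not part of the answer above), dbInit.js could export the initialization promise itself, so every consumer awaits the same one-time connect-and-sync without calling connect() explicitly:

// db/index.js sketch: export a promise that resolves to the database once sync is done
const ready = sequelize
  .authenticate()
  .then(() => sequelize.sync())
  .then(() => database);

module.exports = { database, ready };

// consumer:
// const { ready } = require('./db');
// ready.then(db => db.models.User.max('id')).then(console.log);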
P.S. Forget about the examples used in web apps where the developer does not care when the db connects and when the Express app starts listening on its port.
Your question is different: you want to run db queries immediately, so you have to make sure the connection and sync have completed successfully.
You can also look at my GitHub repo Sequelize-DemoApp. It's a fully working full-stack application made especially to demonstrate Sequelize.js and its integration with Node.js.
I am working on a Koa + MongoDB backend. My question is: when should I close the db connection, or does MongoDB manage that? I am not closing any connections right now and it seems fine.
// app.js
const Koa = require('koa')
const database = require('./database')
const app = new Koa()

database
  .connect()
  .then(() => { app.listen(8080) })
  .catch((err) => { console.error(err) })
// ./database.js
const MongoClient = require('mongodb').MongoClient
const Model = require('./model')

class Database {
  async connect() {
    // cache the connection on the instance so repeated calls reuse it
    if (!this.db) {
      this.db = await MongoClient.connect('mongodb://localhost:27017')
      this.item = new Model(this.db, 'item_collection')
    }
  }
}

module.exports = new Database()
// ./model.js
class Model {
  constructor(db, collectionName) {
    this.name = collectionName
    this.db = db
  }

  async findAll() {
    const result = await this.db.collection(this.name).find().toArray()
    if (!result) {
      throw new Error('error')
    }
    return result
  }
}

module.exports = Model
I also ran a stress test using vegeta, making API requests to the server at 100 requests/second, and the response times were good. So, am I worrying about premature optimization here? If not, when should I close the db?
As long as Koa keeps running (and, in your case, listening on port 8080), you should not close the db connection.
If you are running scripts that are expected to end (tasks running on cron, etc.), you should manually close the connection once you are finished with all of your db tasks.
You can take a look at this example for express.js (Koa's sister framework).
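For a script that is meant to exit, a minimal sketch (assuming the Database class above, where connect() stores the db handle on the instance) would be:

// one-off task (e.g. run from cron): close the connection so the process can exit
const database = require('./database')

;(async () => {
  await database.connect()
  const items = await database.item.findAll()
  console.log(items.length)
  // all db work is done; release the connection
  await database.db.close()
})().catch(err => console.error(err))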