Knex.js connection error when rerunning the function - javascript

I'm using Knex to connect to an Azure database and run a query that returns the status of a database (COPYING/ONLINE).
If I run this once, all is fine.
But if I use a setInterval to rerun it (I want to know when the status changes from COPYING to ONLINE), I get a connection error the second, third, and every subsequent time the function is called.
Here is my code:
const knex = require('knex')({
  client: 'mssql',
  connection: {
    host: '***',
    user: '***',
    password: '***',
    options: { requestTimeout: 350000, encrypt: true },
  },
  pool: {
    min: 0,
    max: 15,
  },
});

async function copyStatus() {
  try {
    console.log('Running query');
    const status = await knex.raw(
      "SELECT name, state_desc FROM sys.databases WHERE name = 'Tide_QA_Dev_runtime' "
    );
    return status[0].state_desc;
    // console.log(status[0].state_desc);
  } catch (error) {
    console.log(error);
  } finally {
    console.log('Closing connection with database');
    await knex.destroy();
  }
}

function intervalFunc() {
  copyStatus().then(function (result) {
    if (result === 'ONLINE') {
      console.log('Database copy is done.');
    } else if (result === 'Database is still copying') {
      console.log('bezig');
    }
  });
}

setInterval(intervalFunc, 2000);
Here is my output:
Closing connection with database
Database copy is done.
Running query
Error: Unable to acquire a connection
at Client_MSSQL.acquireConnection (/Users/davidbouckaert/Documents/Qite/TIDE_repo/node_modules/knex/lib/client.js:286:13)
at Runner.ensureConnection (/Users/davidbouckaert/Documents/Qite/TIDE_repo/node_modules/knex/lib/execution/runner.js:259:46)
at Runner.run (/Users/davidbouckaert/Documents/Qite/TIDE_repo/node_modules/knex/lib/execution/runner.js:30:30)
at Raw.Target.then (/Users/davidbouckaert/Documents/Qite/TIDE_repo/node_modules/knex/lib/builder-interface-augmenter.js:24:43)
at processTicksAndRejections (internal/process/task_queues.js:95:5)
Closing connection with database
Running query
Error: Unable to acquire a connection
at Client_MSSQL.acquireConnection (/Users/davidbouckaert/Documents/Qite/TIDE_repo/node_modules/knex/lib/client.js:286:13)
at Runner.ensureConnection (/Users/davidbouckaert/Documents/Qite/TIDE_repo/node_modules/knex/lib/execution/runner.js:259:46)
at Runner.run (/Users/davidbouckaert/Documents/Qite/TIDE_repo/node_modules/knex/lib/execution/runner.js:30:30)
at Raw.Target.then (/Users/davidbouckaert/Documents/Qite/TIDE_repo/node_modules/knex/lib/builder-interface-augmenter.js:24:43)
at processTicksAndRejections (internal/process/task_queues.js:95:5)
Closing connection with database
It looks like the connection is made (see console log: Running query).
Any idea what's going on?

You should use code like the one below; it works for me.
const knex = require('knex')({
  client: 'mssql',
  connection: {
    // no tcp:
    server: 'j***2sqlserver.database.windows.net',
    user: 'j***2',
    password: 'J****0',
    database: 'yourdbname',
    port: 1433,
    options: { requestTimeout: 350000, encrypt: true },
  },
  pool: {
    min: 0,
    max: 15,
  },
});
My Test Result.

You should not call knex.destroy() if you are going to make more queries to the DB.
Move that knex.destroy() call to somewhere just before the application is going to exit.

Related

MariaDB NodeJS backend : too many connections

I have a NodeJS express backend which uses a MariaDB database.
My file dbconnect.js creates a mariadb pool and has a function to make queries.
const mariadb = require('mariadb');

const pool = mariadb.createPool({
  host: process.env.DBHost,
  user: process.env.DBUser,
  database: process.env.DB,
  password: process.env.DBSecret
});

const dbQuery = async (query) => {
  let conn;
  let res = '';
  try {
    conn = await pool.getConnection();
    res = await conn.query(query);
  } catch (err) {
    console.log("Error sending Query: ", query, err.text);
  } finally {
    if (conn) {
      conn.end();
    }
    return res;
  }
};
Everything seems to work perfectly, but after a few months with the server running, these messages begin to appear on the console:
These messages keep appearing every 10-14 seconds, even though no queries are being performed.
Thanks for any help.
I am not sure, but there is one way: configure your connection pool to ping the server at a regular interval, so connections don't sit idle long enough for the server to close them. mariadb has pingInterval for this.
Replace your pool creation with this:
const pool = mariadb.createPool({
  host: process.env.DBHost,
  user: process.env.DBUser,
  database: process.env.DB,
  password: process.env.DBSecret,
  pingInterval: 60000
});
This will send a ping to the server every 60 seconds, which will prevent the server from closing inactive connections.
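If the installed connector version does not support pingInterval, a similar keepalive can be approximated by hand. A sketch, with the pool stubbed out so it is self-contained (`pool.query` here is a stand-in for the real mariadb pool's query method):

```javascript
// Sketch: a manual keepalive that pings the pool on a fixed interval.
// `pool` is a stand-in; only its query method is faked, recording each ping.
const pings = [];
const pool = {
  async query(sql) {
    pings.push(sql); // in real code this round-trips to the MariaDB server
    return [];
  },
};

function startKeepalive(intervalMs) {
  const timer = setInterval(() => {
    // a trivial statement is enough to keep the connection marked active
    pool.query('SELECT 1').catch((err) => console.log('keepalive failed:', err.message));
  }, intervalMs);
  if (timer.unref) timer.unref(); // don't keep the process alive just for this
  return timer;
}

const timer = startKeepalive(60000); // ping every 60 seconds
// later, at shutdown: clearInterval(timer);
```

The built-in pingInterval option is preferable when available, since it pings only connections that are actually idle.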

NodeMailer error when sending email. On production only

I'm using the following code to create an SMTP transporter that will be used to send emails. The code works perfectly fine on my computer, with Yarn version 1.22.17.
import * as nodemailer from 'nodemailer';
import * as SMTPTransport from "nodemailer/lib/smtp-transport";

const poolOptions = {
  pool: true,
  maxConnections: 1,
  maxMessages: 5
}

const smtpOptions = {
  host: 'smtp.gmail.com',
  port: 587,
  secure: false,
  auth: {
    user: SMTP_USER,
    pass: SMTP_PASSWORD
  },
  tls: {
    ciphers: 'SSLv3',
    rejectUnauthorized: false
  }
}

const nodemailerOptions: SMTPTransport.Options = {
  ...poolOptions,
  ...smtpOptions
}

const transport = nodemailer.createTransport(nodemailerOptions);
The send email function:
export function sendMail(
    to: string, subject: string, text: string, html: string) {
  const mailOptions = {
    from: 'Bobo <no-reply#bobo.bo>', to, subject, text, html
  };
  return new Promise((resolve, reject) => {
    transport.sendMail(mailOptions, (error, info) => {
      if (error) {
        console.error(
          `Failed to send email to ${to} with body [${text}]`, error);
        reject(new Error('Email sending failed'));
      } else {
        console.log(
          `Email sent to ${to} with subject [${subject}]`, info.response);
        resolve();
      }
    });
  });
}
Meanwhile, on the server, I get the following error each time I try to send an email:
{ Error: read ECONNRESET
at TCP.onStreamRead (internal/stream_base_commons.js:111:27)
errno: 'ECONNRESET',
code: 'ECONNECTION',
syscall: 'read',
command: 'CONN' }
It's the same app deployed on an Ubuntu server with the same Yarn version.
If anyone can help I would be very grateful.
NOTE: For the deployment on the server, I used Nginx to forward all requests on port 80 to port 3000 (where the app runs), and the SMTP ports (25, 587) are open.
It seems to be some kind of problem related to the TLS node library; there is already a question addressing this problem, which you can find here.
The problem was due to a restriction in the network put in place by the hosting company. I adjusted the SMTP host to one they allow traffic with.

Knex leaves open server when using Jest (recommendation)

I'm trying to do some TDD with the following stack:
Jest
Node
Koa2
SuperTest
Knex
Objection
My problem starts with the open handle of the Koa server, which I could solve by keeping the server instance and closing it with server.close().
However, I have the same problem with knex; it leaves a handle open and I have to call knex.destroy() to stop it. With that I can avoid the following error message:
Jest did not exit one second after the test run has completed.
This usually means that there are asynchronous operations that weren't stopped in your tests. Consider running Jest with --detectOpenHandles to troubleshoot this issue.
knex.config
const config = {
  development: {
    client: 'pg',
    connection: process.env.DATABASE_URL,
    migrations: {
      directory: "./migrations/"
    },
    pool: { min: 0, max: 7 }
  },
  test: {
    client: 'pg',
    connection: process.env.DATABASE_URL,
    migrations: {
      directory: "./migrations/"
    },
    pool: { min: 0, max: 7 }
  },
  // TBD
  staging: {
    client: 'postgresql',
    connection: {
      database: 'my_db',
      user: 'username',
      password: 'password'
    },
    pool: {
      min: 2,
      max: 10
    },
    migrations: {
      tableName: 'knex_migrations'
    }
  },
  // TBD
  production: {
    client: 'postgresql',
    connection: {
      database: 'my_db',
      user: 'username',
      password: 'password'
    },
    pool: {
      min: 2,
      max: 10
    },
    migrations: {
      tableName: 'knex_migrations'
    }
  }
}

module.exports = config;
user.model.js
'use strict';

const knex = require('../config/db/knex');
const { Model } = require('objection');

Model.knex(knex);

class User extends Model {
  // Table name is the only required property.
  static get tableName() {
    return 'user';
  }

  // Custom function to close knex
  static close() {
    knex.destroy();
  }
}

module.exports = User;
user.test.js
const supertest = require('supertest');
const server = require('../../server');
var request = require("supertest").agent(server);

describe("Test users routes", () => {
  let Model;

  beforeAll(async () => {
    // do something before anything else runs
    console.log('Jest starting!');
    Model = require('../../models/user.model');
  });

  // close the server after all tests
  afterAll(() => {
    server.close();
    Model.close();
    console.log('server closed!');
  });

  test("Get /", async () => {
    let res = await request.get('/users/');
    expect(res.status).toBe(200);
  });
});
I'm pretty sure there is a better approach than what I did, maybe something related to the pool or some callback in the knex.config, but I'm not sure.
Thank you

node-postgres: [error] This socket has been ended by the other party

I use node-postgres to manipulate the DB in my Node.js app.
What I have done:
const { Pool, Client } = require('pg')

var dbconnect = {
  user: 'xxxxx',
  database: 'xxxxx',
  password: 'xxxxx',
  host: 'xxxxx.yyyyy.zzzzz.eu-west-1.rds.amazonaws.com',
  port: 0000,
  max: 20, // max number of clients in the pool
  idleTimeoutMillis: 30000,
  connectionTimeoutMillis: 2000
};

const pool = new Pool(dbconnect);
pool.on('error', function (err, client) {
  console.error('idle client error', err.message, err.stack)
});

function listOfPets(req, res) {
  pool.connect(function (err, client, done) {
    if (err) {
      return console.error('error fetching client from pool', err);
    }
    var sql = "SELECT * FROM pets";
    client.query(sql, function (err, result) {
      done();
      if (err) {
        return console.error('error running query', err);
      }
      ... // Handle the result
    });
  });
}
The function is working fine; however, the server keeps sending me error reports (health going from OK to Severe). I checked the log:
idle client error This socket has been ended by the other party Error:
This socket has been ended by the other party
at Socket.writeAfterFIN [as write] (net.js:291:12)
at Connection.end (/var/app/current/node_modules/pg/lib/connection.js:313:22)
at global.Promise (/var/app/current/node_modules/pg/lib/client.js:410:23)
at Client.end (/var/app/current/node_modules/pg/lib/client.js:409:12)
at Pool._remove (/var/app/current/node_modules/pg-pool/index.js:135:12)
at Timeout.setTimeout (/var/app/current/node_modules/pg-pool/index.js:38:12)
at ontimeout (timers.js:365:14)
at tryOnTimeout (timers.js:237:5)
at Timer.listOnTimeout (timers.js:207:5)
I think the problem is that the 'done()' doesn't work or was put in the wrong place.
Any suggestion is appreciated.
Try declaring the pool object inside the callback. I had a similar error with a postgres client; I solved it by declaring the client inside the callback for a GET request.
Have a look at this issue, it's where I found my solution: Github issue
Hope this can help you =). I think you can use this link to fix it: http://mherman.org/blog/2015/02/12/postgresql-and-nodejs/#.WbpNjMgjGHs
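As a side note, putting the release in a try/finally removes the risk of a misplaced done() entirely, since the client goes back to the pool on every code path. A sketch of that discipline with a stand-in pool (fakePool is hypothetical and only models checkout bookkeeping, so the example runs without a database):

```javascript
// Sketch: release the client in finally so it returns to the pool even on
// error. fakePool stands in for a pg Pool; inUse counts checked-out clients.
let inUse = 0;
const fakePool = {
  async connect() {
    inUse += 1; // a client leaves the pool
    return {
      async query(sql) {
        return { rows: [{ name: 'Rex' }] }; // canned result instead of a real DB
      },
      release() {
        inUse -= 1; // the client goes back to the pool (the role done() plays)
      },
    };
  },
};

async function listOfPets() {
  const client = await fakePool.connect();
  try {
    const result = await client.query('SELECT * FROM pets');
    return result.rows;
  } finally {
    client.release(); // always runs, even if query() throws
  }
}

listOfPets().then((rows) => console.log(rows));
```

With the real pg library, a single short query can also be run as pool.query(...), which checks a client out and releases it internally.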

Connecting to MSSQL server with Sequelize

Using the following tedious code, I can successfully connect to an Azure SQL Server.
const Connection = require('tedious').Connection;

const connection = new Connection({
  userName: '[USER]',
  password: '[PASSWORD]',
  server: '[HOSTNAME]',
  options: { encrypt: true }
});

connection.on('connect', (err) => {
  if (err) {
    console.log('error connecting', err);
  } else {
    console.log('connection successful');
  }
});
However, using what should be the equivalent Sequelize code, I get a connection timeout error.
const Sequelize = require('sequelize');

const sequelize = new Sequelize('[DBNAME]', '[USER]', '[PASSWORD]', {
  dialect: 'mssql',
  host: '[HOSTNAME]',
  dialectOptions: {
    encrypt: true
  }
});

sequelize.authenticate().then((err) => {
  console.log('Connection successful', err);
})
.catch((err) => {
  console.log('Unable to connect to database', err);
});
Any thoughts?
Using: sequelize 3.29.0, tedious 1.14.0, SQL Server v12
I was getting the error below:
SequelizeConnectionError: Server requires encryption, set 'encrypt' config option to true.
I tried it out with an Azure SQL Database, and the way below works for me.
const sequelize = new Sequelize('DB Name', 'Username', 'Password', {
  host: 'Host',
  dialect: 'mssql',
  dialectOptions: {
    options: {
      encrypt: true,
    }
  }
});
If you're trying it out with Azure SQL Database, you might also want to specify a longer request timeout value:
[...]
dialectOptions: {
  requestTimeout: 30000 // timeout = 30 seconds
}
[...]
I tried your Sequelize code and it works fine, so you might need to add your client IP address to allow access to the Azure SQL Server. To do this, go to the Azure portal, click on All Resources, select your SQL server, and click on Firewall in the SETTINGS menu.
Your client IP address is conveniently included in the list, so you can just click on Add client IP followed by Save. When you run your code now, it should connect.
If you are using SQL Server Management Studio, then simply replace dialect: 'mysql' with dialect: 'mssql':
const sequelize = new Sequelize('DB Name', 'Username', 'Password', {
  host: 'Host',
  dialect: 'mssql',
  dialectOptions: {
    options: {
      encrypt: true,
    }
  }
});
