Hapi: Port is closed after moving to VPS - javascript

I moved my Hapi app from my local computer to a VPS and it suddenly stopped working. When I checked, I saw that even though Hapi was logging Server running at: http://localhost:8080, the port was actually closed.
I'm pretty sure it is a Hapi problem because other ports are working, I don't have any firewall rules, and the port is closed even when I port-scan the server locally.
This is my Hapi configuration:
const hapiServer = new Hapi.Server({
    cache: {
        engine: require('catbox-redis'),
        host: redisAddress,
        port: redisPort
    }
});
hapiServer.connection({
    host: '127.0.0.1',
    port: 8080
});
hapiServer.register({
    register: yar,
    options: yarOptions
}, function (err) {
    // start your hapiServer after plugin registration
    hapiServer.start(function (err) {
        console.log('info', 'Server running at: ' + hapiServer.info.uri);
    });
});
I even changed the host to my real IP address and changed the port to 3000, but it didn't work.

Did you catch any err here? Both the plugin registration and the server start can fail, and with the code above the 'Server running' line is logged regardless, so a failure on the VPS (for example, the catbox-redis cache not being able to reach Redis there) would go unnoticed:
hapiServer.register({
    register: yar,
    options: yarOptions
}, function (err) {
    if (err) {
        console.log('plugin registration failed', err); // ???
        return;
    }
    // start your hapiServer only after plugin registration succeeded
    hapiServer.start(function (err) {
        if (err) {
            console.log('server start failed', err); // ???
            return;
        }
        console.log('info', 'Server running at: ' + hapiServer.info.uri);
    });
});
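If no error is printed but the port is still closed, one way to narrow it down (just a guess on my part, since the Redis connection details are the main thing that differs between your machine and the VPS) is to start the same server once without the catbox-redis cache and see whether the port opens:
// Minimal isolation sketch: same host/port, but with hapi's default
// in-memory cache instead of catbox-redis. If this instance listens on
// 8080, the problem is most likely the Redis host/port on the VPS.
const plainServer = new Hapi.Server();
plainServer.connection({ host: '127.0.0.1', port: 8080 });
plainServer.start(function (err) {
    if (err) {
        console.log('start failed', err);
        return;
    }
    console.log('Server running at: ' + plainServer.info.uri);
});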

Related

NodeMailer error when sending email. On production only

I'm using the following code to create an SMTP transporter that will be used to send emails. The code works perfectly fine on my computer, with Yarn version '1.22.17'.
import * as nodemailer from 'nodemailer';
import * as SMTPTransport from "nodemailer/lib/smtp-transport";
const poolOptions = {
    pool: true,
    maxConnections: 1,
    maxMessages: 5
}
const smtpOptions = {
    host: 'smtp.gmail.com',
    port: 587,
    secure: false,
    auth: {
        user: SMTP_USER,
        pass: SMTP_PASSWORD
    },
    tls: {
        ciphers: 'SSLv3',
        rejectUnauthorized: false
    }
}
const nodemailerOptions: SMTPTransport.Options = {
    ...poolOptions,
    ...smtpOptions
}
const transport = nodemailer.createTransport(nodemailerOptions);
The send email function:
export function sendMail(
    to: string, subject: string, text: string, html: string) {
    const mailOptions = {
        from: 'Bobo <no-reply@bobo.bo>', to, subject, text, html
    };
    return new Promise((resolve, reject) => {
        transport.sendMail(mailOptions, (error, info) => {
            if (error) {
                console.error(
                    `Failed to send email to ${to} with body [${text}]`, error);
                reject(new Error('Email sending failed'));
            } else {
                console.log(
                    `Email sent to ${to} with subject [${subject}]`, info.response);
                resolve();
            }
        });
    });
}
Meanwhile, on the server I get the following error each time I try to send an email:
{ Error: read ECONNRESET
at TCP.onStreamRead (internal/stream_base_commons.js:111:27)
errno: 'ECONNRESET',
code: 'ECONNECTION',
syscall: 'read',
command: 'CONN' }
It's the same app deployed on an Ubuntu server with the same Yarn version.
If anyone can help I would be very grateful.
NOTE: For the deployment on the server, I used Nginx to forward all requests on port 80 to port 3000 (where the app runs), and the SMTP ports (25, 587) are open.
It seems to be some kind of problem related to the TLS node library; there is already a question addressing this problem, you can find it here.
The problem was due to network restrictions put in place by the hosting company. I adjusted the SMTP host to one they allow traffic with.
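For anyone hitting the same thing, the fix boils down to pointing the transport at an SMTP host the hosting provider allows outbound traffic to. A minimal sketch (the relay host name here is hypothetical, everything else is from the question):
// Same transport as above, but aimed at a relay the provider permits.
const transport = nodemailer.createTransport({
    pool: true,
    host: 'smtp.allowed-relay.example', // hypothetical: replace with the provider-approved host
    port: 587,
    secure: false,
    auth: {
        user: SMTP_USER,
        pass: SMTP_PASSWORD
    }
});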

Ssh tunnel Node.js to MongoDB

I have installed ssh2 in order to be able to connect from my Node.js server to MongoDB. First, when I run the code provided by the documentation I get Module not found: Error: Can't resolve 'cpu-features' in 'C:\Users\chris\Desktop\abot\node_modules\ssh2\lib\protocol' and this
Module not found: Error: Can't resolve './crypto/build/Release/sshcrypto.node' in 'C:\Users\chris\Desktop\abot\node_modules\ssh2\lib\protocol'
With a little research I understand that these "module not found" errors concern optional modules and should have no impact. Second, I am trying to connect with this code provided by the documentation:
const { Client } = require('ssh2');
const conn = new Client();
conn.on('ready', () => {
    console.log('Client :: ready');
    conn.forwardOut('my local ip', 8000, '127.0.0.1', 27017, (err, stream) => {
        if (err) throw err;
        stream.on('close', () => {
            console.log('TCP :: CLOSED');
            conn.end();
        }).on('data', (data) => {
            console.log('TCP :: DATA: ' + data);
        }).end([
            'HEAD / HTTP/1.1',
            'User-Agent: curl/7.27.0',
            'Host: 127.0.0.1',
            'Accept: */*',
            'Connection: close',
            '',
            ''
        ].join('\r\n'));
    });
}).connect({
    host: 'ip of the mongo server',
    port: ssh port,
    username: 'my user name',
    password: 'my password'
});
Let me note that the server is configured to connect without providing the public key.
When I run the code, I get these messages:
Client :: ready
TCP :: CLOSED
Note number 2: I can connect through an SSH tunnel from Python, but now I also need to connect from Node, so far without results.
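The forwardOut example above sends an HTTP HEAD request, which is only the documentation's demo; MongoDB doesn't speak HTTP, so it most likely drops the connection, which matches the immediate TCP :: CLOSED. One common pattern (a sketch on my side, not tested against this setup, and assuming the official mongodb driver) is to stand up a small local TCP server that pipes every connection through the tunnel, then point the driver at that local port:
const net = require('net');
const { Client } = require('ssh2');
const { MongoClient } = require('mongodb');

const conn = new Client();
conn.on('ready', () => {
    // Every connection to 127.0.0.1:27018 is piped through the SSH tunnel
    // to MongoDB as seen from the remote machine (127.0.0.1:27017 there).
    const proxy = net.createServer((socket) => {
        conn.forwardOut(socket.remoteAddress, socket.remotePort, '127.0.0.1', 27017, (err, stream) => {
            if (err) {
                socket.destroy();
                return;
            }
            socket.pipe(stream).pipe(socket);
        });
    });
    proxy.listen(27018, '127.0.0.1', async () => {
        // Point the driver at the local end of the tunnel.
        const client = await MongoClient.connect('mongodb://127.0.0.1:27018');
        console.log('Connected to MongoDB through the tunnel');
        // ... use client.db(...) here ...
    });
}).connect({
    host: 'ip of the mongo server',
    port: 22, // your SSH port
    username: 'my user name',
    password: 'my password'
});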

ssh2 test not shutting down when using nightwatchjs tag

I'm running a JavaScript test that checks some remote file details using the Node.js package ssh2.
My current code works OK, and the test completes in a matter of seconds:
var Client = require('ssh2').Client;
var conn = new Client();
conn.on('ready', function() {
    console.log('Client :: ready');
    conn.sftp(function(err, sftp) {
        if (err) throw err;
        sftp.readdir('parkersreviewcontent/', function(err, list) {
            if (err) throw err;
            // .... run some assertion ...
            conn.end();
        });
    });
})
.connect({
    host: '******',
    port: **,
    user: '****',
    password: '****',
});
The next step is for me to run this test as part of my nightwatchjs test suite, using a tag.
var Client = require('ssh2').Client;
var conn = new Client();
module.exports = {
    '@tags': ['bugs']
};
conn.on('ready', function() {
    console.log('Client :: ready');
    conn.sftp(function(err, sftp) {
        if (err) throw err;
        sftp.readdir('parkersreviewcontent/', function(err, list) {
            if (err) throw err;
            // .... run some assertion ...
            conn.end();
        });
    });
})
.connect({
    host: '******',
    port: **,
    user: '****',
    password: '****',
});
However, when I now run the test via the tag, it takes about 5 minutes for the command prompt to become available again (the actual test itself still takes only 1-2 seconds to perform).
Have I placed the tag command in the wrong place in the script?
Or, is there a command that I need to add that will 'close' the test once it's run when using the tag?
Any help would be greatly appreciated. Thanks.
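I can't say for certain what keeps the runner alive, but connections opened at require time sit outside Nightwatch's test lifecycle. One option (a sketch, with hypothetical test and step names) is to move the ssh2 work inside a test step via browser.perform, so the runner knows exactly when it has finished and can tear the session down:
var Client = require('ssh2').Client;

module.exports = {
    '@tags': ['bugs'],

    'check remote file details': function (browser) {
        browser.perform(function (done) {
            var conn = new Client();
            conn.on('ready', function () {
                conn.sftp(function (err, sftp) {
                    if (err) throw err;
                    sftp.readdir('parkersreviewcontent/', function (err, list) {
                        if (err) throw err;
                        // .... run some assertion on `list` ...
                        conn.end(); // close the SSH connection
                        done();     // tell Nightwatch this step has finished
                    });
                });
            }).connect({
                host: '******',
                port: 22, // your SSH port
                username: '****',
                password: '****',
            });
        });
        browser.end(); // end the session so the runner can exit cleanly
    }
};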

Nodemailer connection timeout error using Godaddy SMTP server on aws

I am trying to send email with nodemailer using the GoDaddy SMTP server (secureserver.net).
On my local machine the code works fine, but when I deploy the same code on an AWS server it gives Error: Connection timeout.
Here is my code:
const nodemailer = require('nodemailer');
let transporter = nodemailer.createTransport({
    service: 'Godaddy',
    host: 'smtpout.secureserver.net',
    secureConnection: true,
    port: 465,
    auth: {
        user: 'xxx@zzzzzz.com',
        pass: '*******'
    }
});
let mailOptions = {
    from: 'xxx@zzzzzz.com',
    to: 'aaaa@gmail.com',
    subject: 'Test sub',
    html: 'Test body'
};
transporter.sendMail(mailOptions, function (error, info) {
    if (error) {
        console.log(error);
    } else {
        console.log('Email sent: ' + info.response);
    }
});
I have added ports 465/25 to the outbound port list for the server.
Please let me know if there is any workaround for this.
(Solution - Latest) This one has worked successfully.
static transport = nodeMailer.createTransport({
    // service: 'Godaddy', <--- Don't add this anymore --->
    host: 'smtpout.secureserver.net',
    port: 465,
    auth: {
        user: config.get('application.mail.MAIL_SENDER'),
        pass: config.get('application.mail.MAIL_SENDER_PASSWORD')
    },
    tls: { rejectUnauthorized: false }
});
SendMailService.transport.sendMail(mailOptions, function (error: Error, response: SentMessageInfo) {
    if (error) {
        loggerService.logError(' Error in mail send ' + error);
    }
    loggerService.logDebug(' response in mail send ' + response);
    callback(error, response);
});
This has worked in my case. I am using the GoDaddy professional mail service.
I was also getting connection refused and connection timeout with some configurations, but the code above worked.
If there is still an issue, check whether the DNS records point correctly under "My Domain >> DNS >> DNS management". There, check the A and MX records.
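If the timeout persists even with the config above, a quick way to see whether the AWS box can reach the SMTP host at all (a side suggestion of mine, not part of the original answer) is nodemailer's verify(), which opens a connection and authenticates without sending a message:
// Runs only the connection/login handshake; a timeout here points at
// networking or DNS on the server rather than at the message itself.
transporter.verify(function (error, success) {
    if (error) {
        console.log('SMTP connection check failed:', error);
    } else {
        console.log('SMTP server is reachable and credentials were accepted');
    }
});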

node-postgres: [error] This socket has been ended by the other party

I use node-postgres to manipulate the DB in my Node.js app.
What I have done:
const { Pool, Client } = require('pg')
var dbconnect = {
    user: 'xxxxx',
    database: 'xxxxx',
    password: 'xxxxx',
    host: 'xxxxx.yyyyy.zzzzz.eu-west-1.rds.amazonaws.com',
    port: 0000,
    max: 20, // max number of clients in the pool
    idleTimeoutMillis: 30000,
    connectionTimeoutMillis: 2000
};
const pool = new Pool(dbconnect);
pool.on('error', function (err, client) {
    console.error('idle client error', err.message, err.stack)
});
function listOfPets(req, res) {
    pool.connect(function (err, client, done) {
        if (err) {
            return console.error('error fetching client from pool', err);
        }
        var sql = "SELECT * FROM pets";
        client.query(sql, function (err, result) {
            done();
            if (err) {
                return console.error('error running query', err);
            }
            // ... handle the result ...
        });
    });
}
}
The function is working fine; however, the server keeps sending me error reports, with the health status going from OK to Severe. I checked the log:
idle client error This socket has been ended by the other party Error:
This socket has been ended by the other party
at Socket.writeAfterFIN [as write] (net.js:291:12)
at Connection.end (/var/app/current/node_modules/pg/lib/connection.js:313:22)
at global.Promise (/var/app/current/node_modules/pg/lib/client.js:410:23)
at Client.end (/var/app/current/node_modules/pg/lib/client.js:409:12)
at Pool._remove (/var/app/current/node_modules/pg-pool/index.js:135:12)
at Timeout.setTimeout (/var/app/current/node_modules/pg-pool/index.js:38:12)
at ontimeout (timers.js:365:14)
at tryOnTimeout (timers.js:237:5)
at Timer.listOnTimeout (timers.js:207:5)
I think the problem is that done() doesn't work or was put in the wrong place.
Any suggestion is appreciated.
Try declaring the pool object inside the callback. I had a similar error with a postgres client; I solved it by declaring the client inside the callback for a GET request.
Have a look at this issue, it's where I found my solution: Github issue
Hope this can help you =). I think you can also use this link to fix it: http://mherman.org/blog/2015/02/12/postgresql-and-nodejs/#.WbpNjMgjGHs
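For completeness, a rough sketch of that suggestion applied to the code above (my interpretation, using pool.query so the client checkout and release are handled for you):
// Create the Pool inside the handler, run the query through pool.query,
// and close the pool when the request is done, so no idle client is left
// holding a socket the server has already ended.
const { Pool } = require('pg');

function listOfPets(req, res) {
    const pool = new Pool(dbconnect); // dbconnect as defined in the question
    pool.query('SELECT * FROM pets', function (err, result) {
        if (err) {
            console.error('error running query', err);
        } else {
            // ... handle result.rows ...
        }
        pool.end(); // shut the pool down for this request
    });
}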
