SSH tunnel Node.js to MongoDB - javascript

I have installed ssh2 in order to be able to connect from my Node.js server to MongoDB. First, when I run the code provided by the documentation I get this:
Module not found: Error: Can't resolve 'cpu-features' in 'C:\Users\chris\Desktop\abot\node_modules\ssh2\lib\protocol'
and this:
Module not found: Error: Can't resolve './crypto/build/Release/sshcrypto.node' in 'C:\Users\chris\Desktop\abot\node_modules\ssh2\lib\protocol'
With a little research I understand that these "module not found" errors refer to optional modules and have no impact. Second, I am trying to connect with this code provided by the documentation:
const { Client } = require('ssh2');
const conn = new Client();
conn.on('ready', () => {
  console.log('Client :: ready');
  conn.forwardOut('my local ip', 8000, '127.0.0.1', 27017, (err, stream) => {
    if (err) throw err;
    stream.on('close', () => {
      console.log('TCP :: CLOSED');
      conn.end();
    }).on('data', (data) => {
      console.log('TCP :: DATA: ' + data);
    }).end([
      'HEAD / HTTP/1.1',
      'User-Agent: curl/7.27.0',
      'Host: 127.0.0.1',
      'Accept: */*',
      'Connection: close',
      '',
      ''
    ].join('\r\n'));
  });
}).connect({
  host: 'ip of the mongo server',
  port: ssh port,
  username: 'my user name',
  password: 'my password'
});
Let me note that the server is configured to accept the connection without requiring a public key.
When I run the code, I get these messages:
Client :: ready
TCP :: CLOSED
Note number 2: I can connect through an SSH tunnel in Python, but now I also need to connect through Node, so far without results.
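For what it's worth, the documentation example sends a raw HTTP request over the forwarded stream, and MongoDB closes the connection after receiving it, so seeing Client :: ready followed by TCP :: CLOSED is expected. One common approach (a minimal sketch, not the ssh2 documentation example) is to bridge a local TCP port to the remote 27017 through the SSH connection and point the MongoDB driver at that local port; the local port 27018 and the connection details below are placeholders:
// A minimal sketch (assumptions: local port 27018 is free, the mongodb driver is installed).
// It creates a local TCP proxy whose connections are forwarded to 127.0.0.1:27017 on the
// SSH host, then connects the MongoDB driver to that local port.
const net = require('net');
const { Client } = require('ssh2');
const { MongoClient } = require('mongodb');

const conn = new Client();
conn.on('ready', () => {
  const proxy = net.createServer((socket) => {
    conn.forwardOut(socket.remoteAddress, socket.remotePort, '127.0.0.1', 27017, (err, stream) => {
      if (err) {
        socket.end();
        return;
      }
      socket.pipe(stream).pipe(socket); // shuttle bytes both ways through the tunnel
    });
  });
  proxy.listen(27018, '127.0.0.1', async () => {
    const client = await MongoClient.connect('mongodb://127.0.0.1:27018');
    console.log('Connected to MongoDB through the SSH tunnel');
    // ... use client.db(...) here; call client.close(), proxy.close() and conn.end() when done
  });
}).connect({
  host: 'ip of the mongo server',
  port: 22, // your SSH port
  username: 'my user name',
  password: 'my password'
});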

Related

NodeMailer error when sending email. On production only

I'm using the following code to create an SMTP transporter that will be used to send emails. The code works perfectly fine on my computer, with Yarn version '1.22.17'.
import * as nodemailer from 'nodemailer';
import * as SMTPTransport from "nodemailer/lib/smtp-transport";

const poolOptions = {
  pool: true,
  maxConnections: 1,
  maxMessages: 5
}

const smtpOptions = {
  host: 'smtp.gmail.com',
  port: 587,
  secure: false,
  auth: {
    user: SMTP_USER,
    pass: SMTP_PASSWORD
  },
  tls: {
    ciphers: 'SSLv3',
    rejectUnauthorized: false
  }
}

const nodemailerOptions: SMTPTransport.Options = {
  ...poolOptions,
  ...smtpOptions
}

const transport = nodemailer.createTransport(nodemailerOptions);
The send-email function:
export function sendMail(
    to: string, subject: string, text: string, html: string) {
  const mailOptions = {
    from: 'Bobo <no-reply#bobo.bo>', to, subject, text, html
  };
  return new Promise((resolve, reject) => {
    transport.sendMail(mailOptions, (error, info) => {
      if (error) {
        console.error(
          `Failed to send email to ${to} with body [${text}]`, error);
        reject(new Error('Email sending failed'));
      } else {
        console.log(
          `Email sent to ${to} with subject [${subject}]`, info.response);
        resolve();
      }
    });
  });
}
Meanwhile, on the server I get the following error each time I try to send an email:
{ Error: read ECONNRESET
at TCP.onStreamRead (internal/stream_base_commons.js:111:27)
errno: 'ECONNRESET',
code: 'ECONNECTION',
syscall: 'read',
command: 'CONN' }
It's the same app deployed on an Ubuntu server with the same Yarn version.
If anyone can help I would be very grateful.
NOTE: For the deployment on the server, I used Nginx to forward all requests on port 80 to port 3000 (where the app is running), and the SMTP ports (25, 587) are open.
It seems to be some kind of problem related to the Node TLS library; there is already a question addressing this problem, which you can find here.
The problem was due to a network restriction put in place by the hosting company. I adjusted the SMTP host to one they allow traffic with.
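The answer above doesn't show the adjusted configuration. Purely as an illustration (the relay hostname below is a placeholder, not the actual host used), the change amounts to pointing the transport at a host the provider permits; transport.verify() is added as a quick way to confirm the route at startup:
// Illustrative sketch only: 'smtp-relay.example-host.com' is a placeholder,
// not a real relay; use the host your hosting company allows traffic to.
const nodemailer = require('nodemailer');

const transport = nodemailer.createTransport({
  pool: true,
  host: 'smtp-relay.example-host.com',
  port: 587,
  secure: false, // STARTTLS is negotiated on port 587
  auth: {
    user: process.env.SMTP_USER,
    pass: process.env.SMTP_PASSWORD
  }
});

// verify() opens a connection and authenticates, so a blocked port or
// rejected route fails fast instead of on the first real email.
transport.verify((err) => {
  if (err) console.error('SMTP connection check failed', err);
  else console.log('SMTP server is reachable');
});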

Nodemailer connection timeout error using Godaddy SMTP server on aws

I am trying to send email with Nodemailer using the GoDaddy SMTP server (secureserver.net).
On my local machine the code works fine, but when I deploy the same code on an AWS server it gives Error: Connection timeout.
Here is my code:
const nodemailer = require('nodemailer');

let transporter = nodemailer.createTransport({
  service: 'Godaddy',
  host: 'smtpout.secureserver.net',
  secureConnection: true,
  port: 465,
  auth: {
    user: 'xxx#zzzzzz.com',
    pass: '*******'
  }
});

let mailOptions = {
  from: 'xxx#zzzzzz.com',
  to: 'aaaa#gmail.com',
  subject: 'Test sub',
  html: 'Test body'
};

transporter.sendMail(mailOptions, function (error, info) {
  if (error) {
    console.log(error);
  } else {
    console.log('Email sent: ' + info.response);
  }
});
I have added ports 465 and 25 to the outbound port list for the server.
Please let me know if there is any workaround for this.
(Solution - latest) The following has worked successfully:
static transport = nodeMailer.createTransport({
  // service: 'Godaddy',  <-- don't add this anymore
  host: 'smtpout.secureserver.net',
  port: 465,
  auth: {
    user: config.get('application.mail.MAIL_SENDER'),
    pass: config.get('application.mail.MAIL_SENDER_PASSWORD')
  },
  tls: { rejectUnauthorized: false }
});

SendMailService.transport.sendMail(mailOptions, function (error: Error, response: SentMessageInfo) {
  if (error) {
    loggerService.logError(' Error in mail send ' + error);
  }
  loggerService.logDebug(' response in mail send ' + response);
  callback(error, response);
});
This has worked in my case; I am using the GoDaddy professional mail service.
I was also getting connection refused and connection timeout errors with some configurations, but the code above worked.
If there is still an issue, check whether the DNS records are pointing correctly under "My Domain >> DNS >> DNS management"; in particular, check the A and MX records.

How to connect AWS elasticache redis from Node.js application?

How can I connect to AWS ElastiCache Redis from a Node.js application?
You're connecting to Redis; the fact that it's a managed AWS service is not really that important in this respect.
So, use a Node.js package that implements a Redis client interface, for example:
redis
node-redis
(see the sketch just below for a minimal node-redis connection)
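A minimal sketch, assuming the node-redis v4 API (createClient plus an explicit connect()); the endpoint below is a placeholder for your ElastiCache primary endpoint:
// Minimal sketch with the node-redis v4 client.
// 'your-elasticache-endpoint' is a placeholder; use the cluster's primary endpoint.
const { createClient } = require('redis');

const client = createClient({
  url: 'redis://your-elasticache-endpoint:6379'
});

client.on('error', (err) => console.error('Redis Client Error', err));

async function main() {
  await client.connect();                      // v4 clients need an explicit connect()
  await client.set('framework', 'AngularJS');
  console.log(await client.get('framework'));  // -> AngularJS
  await client.quit();
}

main();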
You can use the package ioredis
and establish a connection like this:
const Redis = require('ioredis')

const redis = new Redis({
  port: 6379,
  host: 'your-redis-host',
  connectTimeout: 10000 // optional
});
You can try connecting using ioredis.
var Redis = require('ioredis');
var config = require("./config.json");

const redis = new Redis({
  host: config.host,
  port: config.port,
  password: config.password, // if you have one
  tls: {},                   // add this empty tls field
});

redis.on('connect', () => {
  console.log('Redis client is initiating a connection to the server.');
});
redis.on('ready', () => {
  console.log('Redis client successfully initiated connection to the server.');
});
redis.on('reconnecting', () => {
  console.log('Redis client is trying to reconnect to the server...');
});
redis.on('error', (err) => console.log('Redis Client Error', err));

// Check that it works.
redis.set("framework", "AngularJS", function(err, reply) {
  console.log("redis.set ", reply);
});
redis.get("framework", function(err, reply) {
  console.log("redis.get ", reply);
});

Connect via SSH to see folders

I use the following repo to connect via SSH to some system.
I was able to do it (I got the client ready and data events).
Now I want to see the folders of the app I SSH into; any idea how I can do that?
When I use SSH via the command line I am able to connect to the app, and simply by running ls I can see the application folders etc.
I use the following repo and this sample code:
https://github.com/mscdex/ssh2
var { Client } = require('ssh2');

var conn = new Client();
conn.on('ready', function() {
  console.log('Client :: ready');
  conn.shell(function(err, stream) {
    if (err) throw err;
    stream.on('close', function() {
      console.log('Stream :: close');
      conn.end();
    }).on('data', function(data) {
      console.log('STDOUT: ' + data);
    }).stderr.on('data', function(data) {
      console.log('STDERR: ' + data);
    });
    stream.end('ls -l\nexit\n');
  });
}).connect({
  host: '192.168.100.100',
  port: 22,
  username: 'frylock',
  privateKey: require('fs').readFileSync('/here/is/my/key')
});
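If all you need is a directory listing rather than an interactive shell, a minimal sketch using ssh2's exec() can be simpler; '/path/to/app' and the connection details below are placeholders:
// A minimal sketch using exec() instead of shell(): run a single command and
// collect its output. '/path/to/app' and the connection details are placeholders.
const { Client } = require('ssh2');

const conn = new Client();
conn.on('ready', () => {
  conn.exec('ls -l /path/to/app', (err, stream) => {
    if (err) throw err;
    let output = '';
    stream.on('data', (data) => { output += data; })
          .stderr.on('data', (data) => { console.error('STDERR: ' + data); });
    stream.on('close', () => {
      console.log(output); // the directory listing of the app folder
      conn.end();
    });
  });
}).connect({
  host: '192.168.100.100',
  port: 22,
  username: 'frylock',
  privateKey: require('fs').readFileSync('/here/is/my/key')
});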

Hapi: Port is closed after moving to VPS

I moved my Hapi app from my local computer to a VPS and suddenly it stopped working. I checked and saw that even though Hapi was logging Server running at: http://localhost:8080, the port was actually closed.
I'm pretty sure it is a Hapi problem, because other ports are working, I don't have any firewall rules, and the port is even closed when I port-scan my server locally.
This is my Hapi configuration:
const hapiServer = new Hapi.Server({
  cache: {
    engine: require('catbox-redis'),
    host: redisAddress,
    port: redisPort
  }
});

hapiServer.connection({
  host: '127.0.0.1',
  port: 8080
});

hapiServer.register({
  register: yar,
  options: yarOptions
}, function (err) {
  // start the hapiServer after plugin registration
  hapiServer.start(function (err) {
    console.log('info', 'Server running at: ' + hapiServer.info.uri);
  });
});
I even changed the host to my real IP address and changed the port to 3000, but it didn't work.
Did you catch any err here?
hapiServer.register({
  register: yar,
  options: yarOptions
}, function (err) {
  // start the hapiServer after plugin registration
  hapiServer.start(function (err) {
    if (err) {
      console.log('registration failed', err); // ???
      return;
    }
    console.log('info', 'Server running at: ' + hapiServer.info.uri);
  });
});
