I have Mocha tests in a project that uses a Knex connection pool.
The issue is that when the tests end, Mocha keeps waiting until the Knex pool is drained, which adds an extra 5-10 seconds to the run.
Code example:
knex initialized:
const Knex = require('knex');

// knex is a factory function, not a constructor
const knex = Knex({
  client: 'pg',
  pool: { min: 1, max: 10 },
  connection: {}, // connection details omitted
  searchPath: 'knex,public',
  // debug: true,
});
Mocha test drains connection:
after((done) => {
  knex.destroy().then(done);
});
What I would like to accomplish is either of the following:
drain the Knex connections faster, or
have Mocha finish the run without waiting for Knex to drain.
Any suggestions?
This behavior changed a while ago in Mocha:

#2879: By default, Mocha will no longer force the process to exit once all tests complete. This means any test code (or code under test) which would normally prevent node from exiting will do so when run in Mocha. Supply the --exit flag to revert to pre-v4.0.0 behavior (@ScottFreeCode, @boneskull)
https://github.com/mochajs/mocha/blob/master/CHANGELOG.md
If the above doesn't help, you can always call process.exit(0) in the done callback to force-kill the process with a success exit code.
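For example, a minimal sketch of both options (assuming the knex instance from the question):

# on the command line: revert to the pre-v4.0.0 behavior
mocha --exit

// or in the root after hook: force the exit once the pool is destroyed
after(() => {
  return knex.destroy().then(() => process.exit(0));
});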
AFAIK, on the Knex side there is no option for force-killing the connections.
This pattern will help with the connection-draining problem: create the Knex instance per call and destroy it as soon as the query finishes, so no pool lingers afterwards.
const config = {
  client: "pg",
  connection: {
    host: hostname,
    user: username,
    password: password,
    database: database
  },
  pool: {
    min: 0, // min: 0 lets the pool close idle connections completely
    max: 10
  },
  acquireConnectionTimeout: 1000
}
var Knex = require('knex')

this.functioname = () => {
  var output = {}
  return new Promise(function (resolve) {
    // Create a fresh instance per call and destroy it when done,
    // so no open pool keeps the process alive
    var knex = Knex(config)
    knex(tablename)
      .select()
      .then((result) => {
        if (result.length > 0) {
          output.error = false
          output.result = result
        } else {
          output.error = true
        }
        resolve(output)
      })
      .catch((err) => {
        err.error = true
        resolve(err)
      })
      .finally(() => {
        knex.destroy()
      })
  })
}
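For instance, a hypothetical caller of the function above (output carries the error/result shape built inside the promise):

this.functioname().then((output) => {
  if (output.error) {
    // no rows were found, or the query failed
  } else {
    console.log(output.result)
  }
})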
I have an issue when trying to set the Knex database dynamically. I have multiple databases, incrementing every hour (e.g. db-1-hour, db-2-hour). When we switch into a new hour I want to use the next database. I created a sample function which returns a new Knex instance based on the new database, but I get a deprecation warning.
My config
import knex from 'knex';

const knexConfig = {
  client: 'pg',
  connection: {
    host: host,
    port: port,
    user: user,
    database: '',
    password: password,
  },
  pool: {
    min: 2,
    max: 10,
  },
  timezone: 'UTC',
};

export const getCurrentDb = async () => {
  const hourDb = await getLatestDbName();
  knexConfig.connection.database = hourDb; // update the database name
  return knex(knexConfig);
};
Usage
import { getCurrentDb } from "./config"

const getSomething = async () => {
  const db = await getCurrentDb()
  return db.select().from("something")
}
The code is working, but I always get this warning message:
calling knex without a tableName is deprecated. Use knex.queryBuilder() instead.
How could I connect to a database dynamically? Thank you in advance!
The warning message is not related to the DB switch mechanism.
Try to change your select statement to something like:
import { getCurrentDb } from "./config"

const getSomething = async () => {
  const db = await getCurrentDb()
  return db("something").columns('*')
}
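One caveat worth noting: getCurrentDb as written creates a new Knex instance, and therefore a new pool, on every call. A minimal sketch that reuses the instance per hourly database and destroys the stale one when the hour rolls over (the cache is an assumption; getLatestDbName and knexConfig are from the question):

import knex from 'knex';

let current = { name: null, db: null };

export const getCurrentDb = async () => {
  const hourDb = await getLatestDbName();
  if (current.name !== hourDb) {
    // Tear down the previous pool before switching databases
    if (current.db) await current.db.destroy();
    current = {
      name: hourDb,
      db: knex({
        ...knexConfig,
        connection: { ...knexConfig.connection, database: hourDb },
      }),
    };
  }
  return current.db;
};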
I am trying to kill a child process I have running within my server. Basically, the child process runs johnny-five code that I've written in an online terminal in React and sent to my server. When I run the child process the code works great, but I can't kill the child process without stopping the server. I've tried Ctrl-C and .exit(), but neither seems to work.
const { spawn } = require('child_process')

codeRouter
  .post('/codeAPI', (req, res) => {
    console.log(req.body)
    let fileName = `johnnyFiles/${req.body.currentFile}`
    // writeFileSync is synchronous and takes no callback; it throws on error
    fs.writeFileSync(fileName, req.body.currentCode)
    let id = shortid.generate()
    let fileObject = {
      fileName: req.body.currentFile,
      fileContents: req.body.currentCode,
      ID: id
    }
    data = [fileObject, ...data]
    fs.writeFileSync('data/fileData.json', JSON.stringify(data))
    res.json(data)
    ///////////////////////////////////////////
    let nodeSpawn = spawn('node', [fileName], {
      //detached: true,
      shell: true
    })
    nodeSpawn.stdout.on('data', (data) => {
      console.log("OUTPUT", data.toString())
    })
    nodeSpawn.stderr.on('data', (data) => {
      console.log("ERRORS", data.toString())
    })
    nodeSpawn.on('exit', (code) => {
      // (calling nodeSpawn.kill() here would be a no-op: the child has already exited)
      console.log(`Child exited with code ${code}`)
    })
  })
You can use the Linux command line.
To see the running node processes, use:
pgrep node
To kill a process, use:
kill <pid>
Or, to force the shutdown:
kill -9 <pid>
Or, to kill all node processes:
kill $(pgrep node)
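Alternatively, keep a reference to the spawned child on the server and expose a second route to kill it. A minimal sketch building on the code above (the /codeAPI/kill route name is hypothetical):

let nodeSpawn = null // module-level reference to the running child

codeRouter.post('/codeAPI', (req, res) => {
  // ... write the files as above, then:
  nodeSpawn = spawn('node', [fileName], { shell: true })
  res.json(data)
})

codeRouter.post('/codeAPI/kill', (req, res) => {
  if (nodeSpawn && nodeSpawn.exitCode === null) {
    nodeSpawn.kill('SIGINT') // signals the child, not the server
    return res.json({ killed: true })
  }
  res.json({ killed: false, reason: 'no running process' })
})

Note that with shell: true the signal goes to the wrapping shell rather than the node child; spawning with detached: true and killing the process group with process.kill(-nodeSpawn.pid) is the more reliable variant.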
I have code in my Node.js file which gives me the following information:
host:"147.0.40.145"
method:"aes-256-cfb"
password:"9c359ad1ebeec200"
port:38473
I need to use the above information to connect to a VPN. I used the code below to extract it.
const connectServer = async (serverId) => {
  const token = store('access_token')
  httpOptions.Authorization = token.token_type + ' ' + token.access_token
  // await only works inside an async function; wrapping axios' promise
  // in another Promise was unnecessary anyway
  const response = await axios.post(`${baseUrl}/servers/${serverId}/connect`, { 'serverId': serverId }, { headers: httpOptions })
  console.log(response.data)
  return response.data
}
So I need to know: is it possible to connect to or create a VPN using Node.js?
Thank you in advance!
Install this npm package:
npm i node-openvpn --save
const openvpnmanager = require('node-openvpn');

const opts = {
  host: '147.0.40.145',
  port: 38473,
  timeout: 1500, // timeout for connection - optional, defaults to 1500ms if undefined
  logpath: 'log.txt' // optional: write openvpn console output to a file (relative or absolute path)
};
const auth = {
  user: '{{add user name}}',
  pass: '9c359ad1ebeec200',
};

const openvpn = openvpnmanager.connect(opts)
openvpn.on('connected', () => {
  // per the node-openvpn docs, credentials are supplied here
  openvpnmanager.authorize(auth);
  console.log("Connected to VPN successfully...");
});
For more info, please read the node-openvpn documentation: https://www.npmjs.com/package/node-openvpn
It seems Protractor doesn't provide any out-of-the-box solution for starting a server before it runs. Having to run multiple commands before the functional tests can run is a bad user experience and bad for automated testing.
Angular CLI has its own solution, which is rather complicated, and which this plugin claims to duplicate, although it doesn't work for me and may be unmaintained: https://www.npmjs.com/package/protractor-webpack
EDIT: BETTER SOLUTION ACCEPTED BELOW
I came up with a solution using child_process.exec that seems to work well, although I don't like it very much. I'd like to share it in case anyone needs it and to see if anyone can come up with a better solution.
Launch the process in the beforeLaunch hook of protractor:
// `exec` comes from require('child_process')
beforeLaunch: () => {
  webpackServerProcess = exec(`webpack-dev-server --port=3003 --open=false`, null, () => {
    console.log(`Webpack Server process reports that it exited. It's possible a server was already running on port 3003`)
  });
},
Then, above the configuration block, we set up exit handlers to make sure the server gets killed when we are done.
let webpackServerProcess; // Set below in beforeLaunch hook

function cleanUpServer(eventType) {
  console.log(`Server Cleanup caught ${eventType}, killing server`);
  if (webpackServerProcess) {
    webpackServerProcess.kill();
    console.log(`SERVER KILLED`);
  }
}

[`exit`, `SIGINT`, `SIGUSR1`, `SIGUSR2`, `uncaughtException`].forEach((eventType) => {
  process.on(eventType, cleanUpServer.bind(null, eventType));
})
The various event listeners are needed to handle Ctrl+C and situations where the process is killed by PID. It's strange that Node does not provide a single event that encompasses all of these.
Protractor also has onCleanUp that will run after all the specs in the file have finished.
And you are doing the right thing by keeping a reference to your process so that you can kill it later.
let webpackServerProcess;

// In the Protractor config:
beforeLaunch: () => {
  webpackServerProcess = exec('blah'); // you could use spawn instead of exec
},
onCleanUp: () => {
  process.kill(webpackServerProcess.pid);
  // or webpackServerProcess.kill();
}
Since you are launching the serverProcess with child_process.exec, and not in a detached state, it should go away if the main process is killed with SIGINT or anything else. So you might not even have to kill it or cleanup.
I found a much more reliable way to do it using the webpack-dev-server node api. That way no separate process is spawned and we don't have to clean anything. Also, it blocks protractor until webpack is ready.
beforeLaunch: () => {
  return new Promise((resolve, reject) => {
    new WebpackDevServer(webpack(require('./webpack.config.js')()), {
      // Do stuff
    }).listen(APP_PORT, '0.0.0.0', function(err) {
      // only resolve on success; reject so Protractor aborts on a failed start
      if (err) {
        console.log('webpack dev server error is ', err)
        reject(err)
        return
      }
      resolve()
    }).on('error', (error) => {
      console.log('dev server error ', error)
      reject(error)
    })
  })
},
// conf.js
var jasmineReporters = require('jasmine-reporters');
var Jasmine2HtmlReporter = require('protractor-jasmine2-html-reporter');
const path = require('path');
const WebpackDevServer = require('webpack-dev-server');
const webpack = require('webpack');

let webpackServer; // set in beforeLaunch, closed in onCleanUp

exports.config = {
  framework: 'jasmine',
  //seleniumAddress: 'http://localhost:4444/wd/hub',
  specs: ['Test/spec.js'],
  directConnect: true,
  // Capabilities to be passed to the webdriver instance.
  capabilities: {
    'browserName': 'chrome'/*,
    chromeOptions: {
      args: [ '--headless','--log-level=1', '--disable-gpu', '--no-sandbox', '--window-size=1920x1200' ]
    }*/
  },
  beforeLaunch: () => {
    return new Promise(resolve => {
      setTimeout(() => {
        const compiler = webpack(require('./webpack.config.js'));
        webpackServer = new WebpackDevServer(compiler, {
          stats: 'errors-only'
        });
        // Listen on port 0 so the OS picks a free port, then point
        // Protractor's baseUrl at it.
        webpackServer.listen(0, 'localhost', () => {
          // `server.listeningApp` will be returned by `server.listen()`
          // in `webpack-dev-server#~2.0.0`
          const address = webpackServer.listeningApp.address();
          exports.config.baseUrl = `http://localhost:${address.port}`;
          resolve();
        });
      }, 5000);
    });
  },
  onPrepare: function() {
    jasmine.getEnv().addReporter(new jasmineReporters.JUnitXmlReporter({
      consolidateAll: true,
      filePrefix: 'guitest-xmloutput',
      savePath: 'reports'
    }));
    jasmine.getEnv().addReporter(new Jasmine2HtmlReporter({
      savePath: 'reports/',
      screenshotsFolder: 'images',
      takeScreenshots: true,
      takeScreenshotsOnlyOnFailures: true,
      cleanDestination: false,
      fileName: 'TestReport'
    }));
  },
  onCleanUp: () => {
    // WebpackDevServer instances are shut down with close(), not exit()
    webpackServer.close();
  },
}
I'm trying to deploy from GitHub via ssh2, and I want to execute more than one command, in the order of the array. The code I'm using now is included below.
async.series([
  ...
  // Deploy from GitHub
  function (callback) {
    // Console shizzle:
    console.log('');
    console.log('Deploying...'.red.bold);
    console.log();
    console.log();
    var deployFunctions = [
      {
        command: 'cd ' + envOptions.folder + ' && pwd',
        log: false
      },
      {
        command: 'pwd'
      },
      {
        command: 'su ' + envOptions.user,
        log: false
      },
      {
        command: 'git pull'
      },
      {
        command: 'chmod 0777 * -R',
        log: false
      }
    ];
    async.eachSeries(deployFunctions, function (item, callback) {
      deployment.ssh2.exec(item.command, function (err, stream) {
        deployment.logExec(item);
        stream.on('data', function (data, extended) {
          console.log(data.toString().trim());
          console.log();
        });
        // Guard so the callback only fires once even if both events arrive
        var finished = false;
        function done() {
          if (finished) return;
          finished = true;
          callback(err);
        }
        stream.on('exit', done);
        stream.on('end', done);
      });
    }, function () {
      callback();
    });
  },
  ...);
But, after I cd'ed to the right directory, it forgets where it was and starts all over again.
$ cd /some/folder && pwd
/some/folder
$ pwd
/root
@robertklep is correct about why your cd doesn't persist: each command invokes a distinct shell instance which starts in its initial state. You could prefix each command with cd /home/jansenstok/domains/alcoholtesterwinkel.com/public_html/ && as a quick fix, but really you are setting yourself up for pain. What you want is a shell script, with all the power of multiple lines, as opposed to a list of individual disconnected commands.
Look at using ssh2's sftp function to transfer a complete shell script to the remote machine as step 1, execute it via exec (/bin/bash /tmp/your_deploy_script.sh) as step 2, and then delete the script as step 3.
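A minimal sketch of that three-step approach, assuming an established ssh2 Connection conn and a local script ./deploy.sh (both names hypothetical):

function runDeployScript(conn, cb) {
  conn.sftp((err, sftp) => {
    if (err) return cb(err);
    // Step 1: upload the script
    sftp.fastPut('./deploy.sh', '/tmp/your_deploy_script.sh', (err) => {
      if (err) return cb(err);
      // Step 2: run it in a single shell, so cd/su persist between lines
      conn.exec('/bin/bash /tmp/your_deploy_script.sh', (err, stream) => {
        if (err) return cb(err);
        stream.on('data', (d) => process.stdout.write(d));
        stream.stderr.on('data', (d) => process.stderr.write(d));
        stream.on('close', () => {
          // Step 3: clean the script up again
          sftp.unlink('/tmp/your_deploy_script.sh', cb);
        });
      });
    });
  });
}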
I know this is a super old question, but I ran into this problem while trying to manage an ACE through my Node server. The accepted answer didn't work for me, but several searches later I found a wrapper that worked really well. Just wanted to share here because this was the top link in my Google search. It's called ssh2shell and can be found here: https://www.npmjs.com/package/ssh2shell
It's very simple to use: just pass an array of commands, and they run one by one, waiting for each command to complete before moving on to the next.
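A minimal ssh2shell sketch might look like this (field names taken from its README; treat the details as an assumption and check the current docs):

const SSH2Shell = require('ssh2shell');

const host = {
  server: {
    host: '127.0.0.1',
    port: 22,
    userName: 'root', // note: userName, not username
    password: 'root'
  },
  commands: [
    'cd /mnt',
    'pwd',
    'ls -lah'
  ],
  // runs after each command completes, with that command's output
  onCommandComplete: (command, response, sshObj) => {
    console.log(response);
  }
};

const SSH = new SSH2Shell(host);
SSH.connect();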
A practical example, using the underlying ssh2 client's shell directly:
const { Client } = require('ssh2');

const client = new Client();
const cmds = [
  'ls -lah \n',
  'cd /mnt \n',
  'pwd \n',
  'ls -lah \n',
  'exit \n',
];

client.on('ready', () => {
  console.log('Client :: ready');
  client.shell((err, stream) => {
    stream.on('close', (code) => {
      console.log('stream :: close\n', { code });
    }).on('data', (myData) => {
      console.log('stream :: data\n', myData.toString());
    }).on('exit', (code) => {
      console.log('stream :: exit\n', { code });
      client.end();
    }).on('error', (e) => {
      console.log('stream :: error\n', { e });
      client.end();
    });
    // Write every command into the same shell session; the final
    // 'exit \n' makes the remote shell terminate the stream.
    for (let i = 0; i < cmds.length; i += 1) {
      const cmd = cmds[i];
      stream.write(`${cmd}`);
    }
  });
}).connect({
  host: '127.0.0.1',
  port: 22,
  username: 'root',
  password: 'root',
});
All the examples in the docs use stream.end(), which caused the creation of a new session instead of reusing the current one.
You shouldn't use shell in your program, because the shell command invokes a new terminal on the system to do its job. Use the exec command instead, without emitting exit yourself: by default, exec emits an exit event once the command you gave it has finished executing.
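A minimal sketch of that, assuming conn is a connected ssh2 Client (the command itself is illustrative):

conn.exec('cd /some/folder && git pull', (err, stream) => {
  if (err) throw err;
  stream.on('data', (d) => process.stdout.write(d));
  // 'exit' fires on its own once the remote command finishes
  stream.on('exit', (code) => console.log(`remote command exited with code ${code}`));
});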