How to get the runtime url of sails app within mocha tests - javascript

I'm aware of sails.getBaseUrl and the fact that it is deprecated. In bootstrap.test.js, while doing a sails lift, I specify port 1337. sails.getBaseUrl() returns http://localhost:1337. I then run the tests using mocha (from within WebStorm, if that matters). At the same time I'm able to do a sails lift at the terminal and run the same sails app on http://localhost:1337. Both seem to be running fine without a port conflict.
So at what location is mocha running the sails app when running the tests?

If you're not setting it to something else, then your Sails app is starting on port 1337. I'd check that you don't have a PORT environment variable set somewhere that one or the other app is using to override the default. Unless one of the apps is running in a container or virtual machine (e.g. Docker), it's not possible for both of them to be bound to port 1337 without a conflict, so either your tests are failing silently or they're running on a different port.
In Sails v0.12.x and Sails v1.0, the HTTP server is available as sails.hooks.http.server, so you should be able to check the port that an app is listening on with sails.hooks.http.server.address().port.
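For example, here is a small sketch of checking that from a test (an assumption: sails is globally available and has already been lifted or loaded in your bootstrap):

it('reports the port the HTTP server is bound to', function () {
  // sketch only: assumes sails.hooks.http.server exists at this point
  var address = sails.hooks.http.server.address();
  if (address) {
    console.log('Sails is listening on port', address.port);
  } else {
    // address() returns null when the server is not bound to a port,
    // e.g. after sails.load() without an explicit listen()
    console.log('The HTTP server is not bound to any port');
  }
});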

The issue was that our tests were doing a sails load and not a sails lift. Based on the sails lift and load documentation and the source code I figured out that the port binding is not done in sails load. So I just added extra code for binding/unbinding the port as part of the test lifecycle.
before(function (done) {
  // sails.load() does not bind the HTTP server to a port, so bind it here
  server = sails.hooks.http.server.listen(sails.config.port, function (err) {
    if (err) {
      return done(err);
    }
    return done();
  });
});

after(function (done) {
  // close() is asynchronous, so let mocha wait for it to finish
  server.close(done);
});
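For context, here is a sketch of the bootstrap.test.js this pairs with (the config overrides below are illustrative, not taken from the question); sails.load() initializes the hooks but, unlike sails.lift(), never calls listen(), which is why the extra binding above is needed:

var sails = require('sails');

// Root-level mocha hooks: load the app once before all tests, tear it down after.
before(function (done) {
  this.timeout(10000); // loading sails can take a few seconds
  sails.load({ log: { level: 'warn' } }, done);
});

after(function (done) {
  sails.lower(done);
});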

Related

Update Nodejs App launched with Pm2 as Windows Service

I'm writing a small application in Node.js.
This app should run as a Windows service (so I can't use Electron or similar, because it needs to stay active even when no user is logged in), so I thought of using PM2.
It starts and works fine, but my problem now is updating my Node.js app.
The app will be deployed to many PCs and I don't want to update them one by one.
I do have a repository I can read from, so I can add a function to my app that pulls from the repo at a set interval.
For now I have created, in the package.json of my Node.js app, a scripts command like:
git pull //myrepourl.git origin
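For reference, that would sit in package.json roughly like this (prod_update being the script name invoked from index.js below):

{
  "scripts": {
    "prod_update": "git pull //myrepourl.git origin"
  }
}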
And in my index.js a function like:
const { exec } = require('child_process');

function updateApp() {
  return new Promise((resolve, reject) => {
    exec('cd app_path && npm run prod_update', (error, stdout, stderr) => {
      if (error) {
        console.error(`exec error: ${error}`);
        return reject(error); // don't fall through to resolve()
      }
      resolve(stdout);
    });
  });
}
setInterval(() => {
  updateApp()
    .then(() => console.log('------ Updated ---------'))
    .catch(() => console.error('------ Update failed ---------'));
}, 60 * 60 * 1000); // run once per hour
But this approach doesn't convince me, because my repo is private, so I'd have to expose my git credentials in the app, without even considering the problem of node_modules.
So, are there other ways to update a Node.js app launched with PM2 as a service on Windows?

Execute NPM module through cordova

I created an app using Cordova and everything is fine, except I need to use a node module which doesn't have a client-side equivalent, because I'm dealing with file write streams etc. I have found Cordova hooks to be my best shot so far, where I create an app_run hook to execute a node file that runs a socket server to listen for events from the client side.
I know it's a very long-winded solution, but it seems logically correct to me. The issue is that when I do create the server and build the app through Visual Studio 2017, the app launches on my Android phone, but VS hangs on the "deploy" stage. I guess that it has to do with the event chain, so I created an asynchronous script like this:
(async function () {
  const server = require('http').createServer()
  const io = require('socket.io')(server)
  io.on('connection', function (socket) {
    console.log('heyo')
    socket.emit('hello world', 'hi')
  })
  server.listen(3000, function (err) {
    if (err) throw err
    console.log('listening on port 3000')
  })
})();
but this doesn't seem to work either; VS still hangs on "deploy". If anyone can possibly guide me in the right direction, that would be highly appreciated.
PS: I know the title is off, but every time I use StackOverflow to get help with a particular attempt, I'm told to do it another way, so I'll leave it open.
If the goal is to use socket.io in your Cordova app, there IS a JS client for the web that you need to use, and you don't need npm for that: just add a link to the client js file in your index file (it should be in a "client" folder once you install socket.io via npm).
<script src="/socket.io/socket.io.js"></script>
<script>
  const socket = io('http://localhost');
</script>
https://socket.io/docs/client-api/
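As a rough usage sketch, and keeping in mind that a Cordova webview usually loads index.html from the local file system, so you may need an absolute URL to the machine running the node server (the host and port below are placeholders, not values from the question):

// after the socket.io client bundle has been loaded via a <script> tag:
var socket = io('http://192.168.1.10:3000'); // placeholder host/port of the node server
socket.on('hello world', function (msg) {
  console.log(msg); // the server in the question emits 'hi' on this event
});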

Can I run 2 Node.js Project in windows?

Can I run 2 Node.js projects in Windows?
If yes, how can I do that?
If no, can I run 2 Node.js projects on a dedicated host?
You have a lot of options.
If you want to run different versions of Node.js, NVM-windows is the best option for Windows.
But if instead you're talking about running different HTTP-based programs, the simplest solution is to have each project listen on a different port on your system.
E.g., if you're using the Node.js http module:
// project 1
server.listen(8080, () => { // will start up the server on port 8080
  console.log('server is listening on 8080')
})
// project 2
server.listen(8081, () => { // will start up the server on port 8081
  console.log('server is listening on 8081')
})
or if you're using an Express server:
// project 1
app.listen(3000, function () {
  console.log('app listening on port 3000 for project 1!');
})
// project 2
app.listen(3001, function () {
  console.log('app listening on port 3001 for project 2!');
})
NOTE: On a production server you would typically switch one of these to port 80, as that is the default port for HTTP requests.
If you want a more sophisticated solution wherein both applications are sandboxed into their own environment, you can go for VirtualBox or Docker. Both offer the same functionality, but in different ways: they set up an isolated environment for each application so that your applications don't interfere with each other.
To give you some perspective: let's say your application uses an environment variable that you have set to 'Abra-kadabra' for project 1. If you reference that environment variable in project 2, you'd still get 'Abra-kadabra', while you might want the second project to have the value 'whoosh'. VirtualBox or Docker would set up isolated systems in which each project can have exactly this.
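If containers feel like overkill, here is a lighter-weight sketch (the PORT variable and file names are just examples, not from the question) where each project reads its port from its own environment, so the two processes never collide:

// project1/index.js (project2/index.js would look the same with a different default)
const express = require('express');
const app = express();

// Each process has its own environment, so PORT can differ per project,
// e.g. start the first project with PORT=3000 and the second with PORT=3001.
const port = process.env.PORT || 3000;

app.get('/', (req, res) => res.send('hello from this project'));

app.listen(port, () => {
  console.log(`app listening on port ${port}`);
});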

Automate UI testing with Javascript, node - starting & stopping a web server

I'm currently investigating automated UI testing using JavaScript on Windows with node.js and phantom.js, and unsurprisingly I've found many frameworks that can help in this regard (casper.js, Buster.js, etc).
The one thing that seems to be missing from the frameworks I have looked at so far is stopping and starting a web server to serve the web pages so that the testing framework can perform its testing. One exception is WebDriver.js, which uses the Selenium standalone server, but this relies on a server written in Java and at the moment I'd prefer to find a node-based solution if at all possible.
From the node perspective I've looked at Connect.js and also Http-Server (which I particularly like), but the issue is starting and stopping these from a JavaScript test.
I've attempted to create a casper.js test that would interact with a server, run the test and then stop the server, but I can't get it to work. Here's an example script:
var childProcess = require('child_process').spawn('http-server', '/TestSite');
casper.test.begin("Load-page", 1, function suite(test) {
  casper.start('http://localhost:8080/', function () {
    test.assertTitle("test page");
  });
  casper.run(function () {
    test.done();
    childProcess.kill();
  });
});
I call this from the command line using the following command (casper is in my Path variable):
casperjs Load-page testFile.js
What I was hoping would happen is the http-server would start, casper would start the test and then after the test was run the http-server would be killed.
I've also tried something similar with Connect:
var server = connect.createServer(connect.static('/TestSite')).listen(8080);
casper.test.begin("Load-page", 1, function suite(test) {
  casper.start('http://localhost:8080/', function () {
    test.assertTitle("test page");
  });
  casper.run(function () {
    test.done();
    server.stop();
  });
});
But again with no luck.
I can run the Casper sample tests, which work, and I've also got Node in my Path and can call the REPL from the command prompt.
The directory structure is:
Code
/TestSite
/node_modules
and I run the tests from the Code folder.
Am I simply unable to do this or am I just not getting how it should work?
When you say "no luck" what do you mean?
The connect example looks mostly OK, and I'd prefer it over spawning a subprocess. Bear in mind that listen is asynchronous, so the server might not be available immediately. The second param to listen is a callback that will be run once the server is listening - maybe try running the tests in that callback instead?
Pro Tip: Don't rely on port 8080 always being free on whatever machine you're running on - passing in 0 for the port will cause the server to start on a random free port; you can then call server.address().port in the listen callback to get the port that was chosen.
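A minimal node-only sketch of that tip, reusing the connect/static setup from the question:

var connect = require('connect');

// Passing 0 lets the OS pick a free port; the callback only runs once the
// server is actually listening, so it's safe to read the port here and
// kick off the UI tests against that URL.
var server = connect.createServer(connect.static('TestSite')).listen(0, function () {
  var port = server.address().port;
  console.log('test server ready at http://localhost:' + port + '/');
});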
I managed to get it working using a combination of different scripts and child_process.spawn.
I created a script called startServer.js that would start the server using Connect:
var connect = require('connect');
var server = connect.createServer(connect.static('TestSite'));
server.listen(8081);
I created another script, runTests.js, that calls the server script via spawn and then calls Casper, again via spawn, to run all the Casper tests in a folder called tests, relative to where the script is run from.
var child_process = require('child_process');

// start the static server from startServer.js, then run the Casper tests
var server = child_process.spawn('node', ['startServer.js']);
var casper = child_process.spawn('casperjs', ['test', 'tests']);

casper.stdout.on('data', function (data) {
  console.log(data.toString());
});
casper.stderr.on('data', function (data) {
  console.log('Error: ' + data);
});
casper.on('exit', function (code) {
  // tests have finished, so shut the server down and exit
  server.kill();
  process.exit(0);
});
To use this, open a command prompt, navigate to the folder where the script is, and run node runTests.js; the server will be started and the tests run against the site.

OpenShift NodeJS deployment : socket.io index.html port assignment, etc

I locally wrote a nodeJS app using socket.io and express modules.
I wanted to use openshift for hosting.
So I renamed the main .js file to server.js, which seems to be OpenShift's equivalent of an index file, and changed the server port setting to:
var server = require('http').createServer(app).listen(process.env.OPENSHIFT_NODEJS_PORT || 3000);
as indicated in some posts.
However after git commit, I am still getting:
remote: info: socket.io started
remote: warn: error raised: Error: listen EACCES
remote: DEBUG: Program node server.js exited with code 0
remote:
remote: DEBUG: Starting child process with 'node server.js'
and the website doesn't work.
As the app serves an html file, there are two more places where the port is mentioned, both in the index.html that is served:
header:
<script src='//localhost:3000/socket.io/socket.io.js'></script>
and within javascript for the html file:
var socket = io.connect('//localhost:'+process.env.OPENSHIFT_NODEJS_PORT || 3000);
// intial vars and multi list from server
socket.on('clientConfig', onClientConfig);
All files and modules are seemingly uploaded, but the EACCES error still persists.
I get the feeling that the header link to localhost:3000 might be the sticking point, but I am not sure. Does anyone have any idea what the problem is?
Also, there is no socket.io/socket.io.js file in the socket.io module's folder, which I find confusing.
I recently developed a chat client application using socket.io that also had WebRTC in it. I was able to deploy the app on OpenShift by making the following changes to the code.
Client Side
Reference the socket.io client script with a relative path, like so:
<script src="/socket.io/socket.io.js"></script>
When calling io.connect, change the IP part so that the application points at the server, in this format:
var socket = io.connect('http://yourapp-domain.rhcloud.com:8000/', {'forceNew':true });
Port 8000 is for http and 8443 is for https.
Server Side
The io object and the server should both be listening on the same port, and the order in which the statements run also needs attention.
Step 1: Declare the http server using app (app is obtained from Express):
var express = require('express');
var app = express();
var server = require('http').Server(app);
Step 2: Declare io from socket.io and combine it with the server object:
var io = require('socket.io').listen(server);
Step 3: Now, allow the server to listen on the OpenShift port and IP:
server.listen(process.env.OPENSHIFT_NODEJS_PORT, process.env.OPENSHIFT_NODEJS_IP);
Please pay special attention to the order of the statements you write; it is usually the order that causes issues.
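Putting the three steps together, a minimal server-side sketch (the fallbacks after || are only for running locally, and the clientConfig payload is a placeholder) would be:

var express = require('express');
var app = express();

// Step 1: create the http server from the express app
var server = require('http').Server(app);

// Step 2: attach socket.io to that same server
var io = require('socket.io').listen(server);

io.on('connection', function (socket) {
  // matches the 'clientConfig' event the question's client listens for
  socket.emit('clientConfig', { /* initial vars and multi list */ });
});

// Step 3: bind to the port and IP OpenShift provides
server.listen(process.env.OPENSHIFT_NODEJS_PORT || 3000,
              process.env.OPENSHIFT_NODEJS_IP || '127.0.0.1');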
The server side of your websocket needs to listen on port 8080 on your OpenShift IP address, while the CLIENT side needs to connect to ws://app-domain.rhcloud.com:8000.
I have a few notes on how to use WebSockets here: https://www.openshift.com/blogs/10-reasons-openshift-is-the-best-place-to-host-your-nodejs-app#websockets
You don't need any additional server-side changes after adapting your code to take advantage of environment variables (when available)
OpenShift's routing layer exposes your application on several externally-accessible ports: 80, 443, 8000, 8443.
Ports 8000 and 8443 are both capable of handling websocket connection upgrades. We're hoping to add support for WebSocket connections over ports 80 and 443 soon.
