Run Node.js on IIS localhost - javascript

How do I run Node.js on IIS on localhost, port 80?
Thanks

You can't run Node.js on IIS; that would be like trying to run Apache on IIS, since Node.js does the HTTP request/response handling itself.
What you are really looking for (or at least what it sounds like based on your question) is a way to run JS files server-side on IIS, which I don't think is currently provided.
However, if you just want your application to use port 80, make sure nothing else is using that port and set your application up to listen on it.
var net = require('net');
// create a TCP server (the http module is the usual choice for actual HTTP traffic)
var server = net.createServer(function (socket) {
  // handle each incoming connection here
});
// listen on port 80 (make sure nothing else, e.g. IIS, is already bound to it)
server.listen(80);
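Since the question is about serving HTTP on port 80, here is a minimal sketch using the built-in http module (illustrative only; binding to port 80 usually requires elevated privileges and fails if IIS or anything else already owns the port):
var http = require('http');

// respond to every request with a short plain-text body
var server = http.createServer(function (req, res) {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello from Node on port 80\n');
});

// listen on port 80; this throws EACCES or EADDRINUSE if the port is blocked or taken
server.listen(80);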

Related

Moving ExpressJS API files onto Server

So I created a simple API using ExpressJS that connects to MongoDB to perform CRUD operations. Currently I can get it running on localhost by running "npm nodemon" in the source folder, and it works when I test it with Postman. I wonder how to implement it on the server, which runs a Linux system. I also have this line of code in my root file "server.js":
const port = process.env.PORT || 5000;
I think the process.env.PORT needs to be changed in order to make it work on the server?
In addition, I looked into an AWS EC2 server, but it is so complicated that I was immediately overwhelmed. I am hoping someone can recommend, to a dummy like me, a simple and very specific solution for having a server run my scripts in an ExpressJS environment. Thank you.
I'm assuming your question is "How do I deploy an Express app to a server?"
You can read some advanced topics on http://expressjs.com/, which cover best practices and other useful material. The part you want to look at now is "Things to do in your environment / setup".
The important part is:
Keep your Express app running on port 5000
Run your app in cluster mode
Run your app behind a proxy server like Nginx.
You can check this nice guide (Steps 3 and 4) on how to deploy your Express app to a Linux server with PM2 and Nginx.
So in the end, your Express app will run on port 5000 (or whatever port you prefer), Nginx will run on port 80, and Nginx will forward every request to your Express app.
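As a minimal sketch of that setup (the file name and route are illustrative), the Express app keeps listening on its own port and lets Nginx handle port 80; PM2 can then run the same file in cluster mode:
// server.js - runs behind Nginx, which listens on port 80 and proxies requests here
const express = require('express');
const app = express();

// keep the app on port 5000, or whatever PORT the environment provides
const port = process.env.PORT || 5000;

app.get('/', (req, res) => res.send('API is running'));

app.listen(port, () => console.log(`Listening on port ${port}`));

// cluster mode with PM2, run from the shell: pm2 start server.js -i max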

deploying node app on server without using localhost

I have a node app that I am trying to deploy on my server. I have an index.html file in a public folder and an app.js file. If I navigate to the project in the command line and run node app.js it runs the app on localhost:8888 and shows the index.html file.
Now that I have uploaded this to my server, I am wondering what I need to do and change (if anything) in my app.js file so that I can visit the site at the actual URL instead of at localhost:8888.
I have tried http://162.xx.xxx.xxx/folderName/app/public:8888, but this doesn't work.
var express = require('express')
var app = express();
app.use(express.static(__dirname + '/public'))
app.get('/', function (req, res) {
res.send('Hello World!')
})
app.listen(8888, function () {
console.log('Example app listening on port 8888!')
});
"Server" is a word with two primary meanings in software development.
It can mean either "A piece of software that listens on a network" or "A computer running that kind of software".
So, having uploaded the JavaScript program to the remote computer that is your server, you need to do exactly the same as you did on your own computer.
i.e. you need to get a terminal on the server and run node app.js
It will then be available at http://your.example.com:8888/
(More advanced setups would involve using software like forever or systemd to run it automatically as a background process.)
If you were using the term server with the other meaning (i.e. you mean "Apache HTTP" or "IIS" or similar), then you are out of luck.
Using Node for server side code means running a server written in JavaScript.
To use this in combination with something like Apache, you would either:
Run the Node server instead of Apache
Run the Node server on a different port and point some services at that port explicitly
Run the Node server on a different port and use something like ProxyPass to have Apache relay requests to it
Change the port number from 8888 to 80 and then use the address of your server in the browser. For example, "mysite.com" for a domain name or "123.45.67.89" for an IP address.
If there are other sites on that server, you can't run it on port 80. (Port 80 is the default port websites use.) You'd need to use a different port. So, say you kept 8888 -- the address would be yoursite.com:8888
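One way to cover both cases (a sketch; the PORT variable name is just a common convention) is to read the port from the environment, so the same file runs on 8888 locally and on 80 on the server:
var express = require('express');
var app = express();

app.use(express.static(__dirname + '/public'));

// use PORT from the environment when set (e.g. PORT=80 node app.js),
// otherwise fall back to 8888 for local development
var port = process.env.PORT || 8888;

app.listen(port, function () {
  console.log('Example app listening on port ' + port + '!');
});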

Azure Web Site starting my Hapi Node.js site with socket protocol

Whenever I deploy my Hapi.js web application to Azure, it starts the server using the socket protocol (see output below).
socket:\\.\pipe\b5c0af85-9393-4dcb-bd9a-3ba9b41ed6fb
GET /
GET /{param*}
GET /api/employees
POST /api/employees
GET /api/employees/{id}
PUT /api/employees/{id}
DELETE /api/employees/{id}
POST /api/worklog
GET /login
POST /login
Hapi server started # socket:\\.\pipe\b5c0af85-9393-4dcb-bd9a-3ba9b41ed6fb
150914/214730.270, [response], socket:\\.\pipe\b5c0af85-9393-4dcb-bd9a-3ba9b41ed6fb: get / {} 200 (316ms)
However, whenever I run this locally, it starts using http... I have not run into this issue using Express or LoopBack, only Hapi. Is there some sort of configuration that I am missing? This is the server.connection call:
var server = new Hapi.Server();
var host = process.env.host || '0.0.0.0';
var port = process.env.port || 3000;
server.connection({host: host, port: port});
The reason this is a big deal is that I cannot pass socket://<mydomain> to Google as a callback URI for OAuth.
You shouldn't need to pass socket://<domain> to Google; you'd pass the normal https://yourDomain.com, or even the https://yourSiteName.azurewebsites.net, to Google for the OAuth callback and it should work as you expect.
The fact that the node application is listening on a pipe rather than a normal TCP socket is just an implementation detail of iisnode. Basically, the problem is that Node has its own web server, so you can't use it directly with other web servers like IIS, Apache, nginx, etc. iisnode bridges the gap between IIS and Node: it lets IIS listen on the machine's HTTP port (80), and when IIS gets a request on that port it forwards it to the Node process that is listening on a named pipe. This allows you to manage your sites in IIS as you normally would on a Windows Server machine, while actually writing your app in Node.
You can think of it as two web servers running on the box: one (IIS) acts as a proxy for the other (Node), where all the work actually happens. The fact that the iisnode developers chose a named pipe instead of a normal TCP socket is odd (though somewhat understandable, since you can't reserve a port quite the way you can reserve a pipe), but it's the way it is.
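To make that concrete, here is a minimal sketch (the callback path is purely illustrative, not something Azure or Google requires): the Hapi server binds to whatever iisnode provides in process.env.PORT (a named pipe on Azure, a normal TCP port locally), while the URL registered with Google stays the public one:
var Hapi = require('hapi');
var server = new Hapi.Server();

// under iisnode, process.env.PORT is a named pipe path; locally it is
// usually unset, so fall back to a normal TCP port
server.connection({ port: process.env.PORT || 3000 });

// the OAuth callback registered with Google uses the public address,
// never the internal pipe (hypothetical path shown)
var oauthCallback = 'https://yoursitename.azurewebsites.net/auth/google/callback';

server.start(function () {
  console.log('Hapi server started at', server.info.uri);
});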

OpenShift NodeJS deployment : socket.io index.html port assignment, etc

I locally wrote a nodeJS app using socket.io and express modules.
I wanted to use openshift for hosting.
So I renamed the main .js file to server.js, which seems to be OpenShift's equivalent of an index file, and changed the server port setting to:
var server = require('http').createServer(app).listen(process.env.OPENSHIFT_NODEJS_PORT || 3000);
as indicated in some posts.
However after git commit, I am still getting:
remote: info: socket.io started
remote: warn: error raised: Error: listen EACCES
remote: DEBUG: Program node server.js exited with code 0
remote:
remote: DEBUG: Starting child process with 'node server.js'
and the website doesn't work.
As the app serves an html file, there are two more places where the port is mentioned, both in the index.html that is served:
header:
<script src='//localhost:3000/socket.io/socket.io.js'></script>
and within javascript for the html file:
var socket = io.connect('//localhost:'+process.env.OPENSHIFT_NODEJS_PORT || 3000);
// intial vars and multi list from server
socket.on('clientConfig', onClientConfig);
All files and modules are seemingly uploaded, but the EACCES error persists.
I get the feeling that the header link to localhost:3000 might be the sticking point, but I am not sure. Does anyone have any idea what the problem is?
Also, there is no socket.io/socket.io.js file in the socket.io module folder, which I find confusing.
I recently developed a chat client application using socket.io that also uses WebRTC. I was able to deploy the app on OpenShift by making the following changes to the code.
Client Side
Keep the include script tag relative, like so:
<script src="/socket.io/socket.io.js"></script>
When declaring io.connect, change the IP part so that it points the application at the server in this format:
var socket = io.connect('http://yourapp-domain.rhcloud.com:8000/', {'forceNew':true });
8000 is for http and 8443 is for https
Server Side
Both io and the server should listen on the same port, and the order in which the statements run also needs attention.
Step 1: Declare the http server using app (app is obtained from Express):
var express = require('express'); var app = express();
var server = require('http').Server(app);
Step 2:
Declare io from socket.io and combine it with the server object.
var io = require('socket.io').listen(server);
Step 3:
Now, allow the server to listen on the OpenShift port and IP.
server.listen(process.env.OPENSHIFT_NODEJS_PORT, process.env.OPENSHIFT_NODEJS_IP);
Please pay special attention to the order of the statements you write; it is the order that causes issues.
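Putting the three steps together in order, a minimal server.js sketch (the local fallbacks and the static folder name are illustrative):
// Step 1: create the app and wrap it in an http server
var express = require('express');
var app = express();
var server = require('http').Server(app);

// Step 2: attach socket.io to that same server
var io = require('socket.io').listen(server);

// serve the static client files (index.html etc.)
app.use(express.static(__dirname + '/public'));

io.on('connection', function (socket) {
  // send initial configuration to the client
  socket.emit('clientConfig', {});
});

// Step 3: listen on the OpenShift port and IP, falling back for local runs
var port = process.env.OPENSHIFT_NODEJS_PORT || 3000;
var ip = process.env.OPENSHIFT_NODEJS_IP || '127.0.0.1';
server.listen(port, ip);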
The server side of your WebSocket needs to listen on port 8080 on your OpenShift IP address; the CLIENT side needs to connect to ws://app-domain.rhcloud.com:8000.
I have a few notes on how to use WebSockets here: https://www.openshift.com/blogs/10-reasons-openshift-is-the-best-place-to-host-your-nodejs-app#websockets
You don't need any additional server-side changes after adapting your code to take advantage of the environment variables (when available).
OpenShift's routing layer exposes your application on several externally-accessible ports: 80, 443, 8000, 8443.
Ports 8000 and 8443 are both capable of handling websocket connection upgrades. We're hoping to add support for WebSocket connections over ports 80 and 443 soon.

Setting laravel to work on a port number?

I am working with nodejs, expressjs, and socket.io. I am triggering events on my web app with a mobile phone through the nodejs server.
The app is built on javascript, but I am using laravel to store data in a database. I am new to nodejs, so I am pretty sure that if I wanted to I could cut out php and build the whole app with nodejs, but I don't want to. I like laravel and php and it's already set up, so let me explain my problem.
laravel is installed on my server at http://example.com/public/ (laravel's index.php lives there). My routes for my database resources are at http://example.com/public/feeds. I can access this fine, but if I want to access my nodejs server I need to use http://example.com:3000, which obviously causes a problem.
The nodejs/expressjs files are inside http://example.com/public/MY-FILES-HERE, but since nodejs dispatches on http://example.com:3000 this throws my laravel routes off.
So what I am asking is: how do I get it all to work well together? I assume I need to set up a port somehow in laravel.
EDIT: So I am new to ports, and I didn't know there is already a default port set (80). My laravel install is on port 80, and from there I can listen to calls from port 3000 using socket.io. I did not know that, so I now have a page at http://my-server-ip:3000/test with one button and a script that sends an event to the nodejs server, which responds to my script listening for events on port 3000 and executes a function. Cool stuff; I hope I made sense, I am very new to this.
Not quite sure what you mean by
this throws my laravel routes off
In a situation where you want to host multiple servers on port 80 from the same machine, you might want to consider a reverse proxy. I recommend nginx for this (http://www.cyberciti.biz/tips/using-nginx-as-reverse-proxy.html). Nginx will listen on port 80.
Then you set up a subdomain, e.g. node.example.com, for the node.js service.
In the reverse proxy you listen for node.example.com on port 80 and direct that traffic to port 3000. You set up Laravel (on Apache?) to listen on port 4000 and have nginx listen for www.example.com on port 80 and direct that traffic to port 4000.
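On the Node side, a minimal sketch (the module setup is illustrative) of keeping the app on an internal port so that only the reverse proxy reaches it:
var express = require('express');
var app = express();
var server = require('http').createServer(app);
var io = require('socket.io')(server);

// bind to localhost:3000; nginx listens on port 80 for node.example.com
// and proxies requests here, so the port never appears in public URLs
server.listen(3000, '127.0.0.1');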
Is this what you are after?
