I'm building stateless authentication with the Double Submit Cookie pattern in Node.js/Express and am wondering how best to make all my connections go over HTTPS.
The reason is that I want to be able to handle high traffic, and with stateless authentication HTTPS is essential for security.
Should I avoid the HTTPS npm dependency and set up a reverse proxy server in Nginx?
Perhaps I can configure Nginx to enforce HTTPS without any additional in-app configuration in Express?
You have two options:
Use the built-in https module that comes with Node.js (not an npm module). It will handle the HTTPS requests itself. Good for light applications, personal websites, and development.
Set up a proxy (nginx, HAProxy, etc.) in front of your web server (which uses the http module). The proxy terminates SSL for an incoming HTTPS request, then proxies a plain HTTP request to your web server. Good for production and high-traffic applications, which can also use the load balancing these proxies provide.
For playing around, I suggest keeping it simple with the built-in https module. Unless you expect millions of requests, there is no need to introduce a proxy into your stack.
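If you do go the proxy route, the nginx side can be sketched roughly like this (the server name, certificate paths, and the app's port 3000 are placeholders for illustration). The second server block also answers the question about enforcing HTTPS with no Express-side configuration at all:

```nginx
# Terminate TLS here and proxy plain HTTP to the Express app.
server {
    listen 443 ssl;
    server_name example.com;                          # placeholder

    ssl_certificate     /etc/ssl/certs/example.crt;   # placeholder paths
    ssl_certificate_key /etc/ssl/private/example.key;

    location / {
        proxy_pass http://127.0.0.1:3000;             # your Express app's port
        proxy_set_header Host              $host;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_set_header X-Forwarded-For   $proxy_add_x_forwarded_for;
    }
}

# Enforce HTTPS: redirect all plain-HTTP traffic.
server {
    listen 80;
    server_name example.com;
    return 301 https://$host$request_uri;
}
```

The Express app itself keeps listening on plain HTTP on localhost and never touches certificates.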
Related
I deployed a web application that needs to send requests to and receive responses from a local IP (i.e. 192.168.0.123), which is actually a payment terminal.
This seems to defeat the entire purpose of HTTPS, but I have no control over the payment terminal or its address because of the third-party API.
Is there any workaround for making HTTPS requests to localhost or a local IP?
If you can install additional software on the payment terminal, then you could install another HTTP server that acts as a reverse proxy and listens on HTTPS.
You then make your requests to https://192.168.0.123, and the reverse proxy forwards them to http://localhost on the terminal.
Nginx is a popular server for this purpose.
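As a rough sketch, the nginx config on the terminal could look like this (the certificate/key paths and the internal port 8080 are assumptions; since the certificate for a bare LAN IP will typically be self-signed, your application must be configured to trust it):

```nginx
# On the payment terminal: accept HTTPS from the LAN and forward to
# the local HTTP service. Placeholder paths and internal port.
server {
    listen 443 ssl;

    ssl_certificate     /etc/nginx/terminal.crt;
    ssl_certificate_key /etc/nginx/terminal.key;

    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
    }
}
```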
If you can't install additional software then you should look at using an isolated network instead.
It doesn't really matter if communication over a local network isn't encrypted if you completely trust every device that is connected to it.
I have a node.js server made with keystone.js, but before launching it I need to add an SSL certificate. All the examples and tutorials I've seen to add an SSL certificate to a node.js server only show how to create a new https server from scratch.
I already have my entire application and api working off http, is there a simple way to make it all work on https?
I'm not sure why you think you need to do this without restarting the application; it's a pretty low cost thing to do.
That said, the simplest approach would be to put an Nginx web server in front of it, set Nginx up as the HTTPS endpoint, and proxy_pass the requests to your Node process.
We have Kerberos authentication at our nginx layer and want to connect to deepstream.io instances through it as a reverse proxy. From my reading of the docs, it looks like putting a web server in front of deepstream.io instances will hamper performance. There is also the question of who does the load balancing: usually it happens at the nginx layer, but deepstream.io seems to have built-in capabilities to ask other instances to handle load (via messaging).
What would be the best way to get deepstream.io instances to play well with a web server? Re-implementing Kerberos authentication in Node would be non-trivial.
It is possible (and a good idea) to deploy and load-balance deepstream behind Nginx, HAProxy, etc. There are a few things to be aware of though:
engine.io, websockets and sticky sessions.
Deepstream uses engine.io (the transport-layer behind socket.io) to connect to browsers.
engine.io uses a number of different transport mechanisms, most notably long-polling (keeping an HTTP request open until data needs to be sent) and WebSockets.
For long-polling it is crucial that all requests from the same client are routed to the same deepstream server, so make sure that sticky/persistent sessions are enabled in your nginx upstream group (just add the ip_hash directive). For WebSockets it is important that the HTTP upgrade headers are forwarded.
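Both points can be sketched in a single nginx config (the backend addresses and ports below are placeholders, not deepstream defaults):

```nginx
upstream deepstream_nodes {
    ip_hash;                      # sticky sessions for long-polling
    server 10.0.0.1:6020;         # placeholder addresses/ports
    server 10.0.0.2:6020;
}

server {
    listen 80;

    location / {
        proxy_pass http://deepstream_nodes;
        # Forward the upgrade headers so WebSocket connections work.
        proxy_http_version 1.1;
        proxy_set_header Upgrade    $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host       $host;
    }
}
```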
sync data / messages between your deepstream nodes
Make sure that your deepstream nodes are connected with each other, both by a cache and message-bus. You have a choice of caches and messaging systems, but the simplest is to just use Redis for both. Have a look at the section on "What’s the simplest production-ready setup?" at the bottom of https://deepstream.io/tutorials/connectors-and-deployment.html.
TCP Connections from other clients (e.g. NodeJS)
engine.io is mainly used for browser clients; backend processes can connect directly via TCP. Load-balancing TCP connections is perfectly possible with Nginx, but requires an extra stream group in your configuration.
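A stream group lives outside the http block and balances raw TCP. A sketch, assuming placeholder backend addresses and a placeholder port:

```nginx
# Requires nginx built with the stream module (nginx >= 1.9).
stream {
    upstream deepstream_tcp {
        server 10.0.0.1:6021;     # placeholder TCP endpoints
        server 10.0.0.2:6021;
    }

    server {
        listen 6021;
        proxy_pass deepstream_tcp;
    }
}
```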
I'm exploring using the node-http-proxy proxy server so that I can have our proxy server on port 80 forward requests to our app server on port 8000. However, I'm a little confused as to why this is a good idea, and what exactly this set up would protect against security-wise.
The node-http-proxy documentation discusses using it as a way to forward requests to an app with multiple ports or IP addresses. This would obviously be very useful, particularly with a basic round-robin load-balancing strategy. However, we only have one app on one port, so there is no need for us to do this.
If there is an important security reason why we should be using this proxy server, then I'd love to know what types of attacks it protects against. Also, we're using socket.io, so if the proxy does something to help the WebSocket server scale up, I'd like to understand that as well. We're having trouble figuring out how to run our app without sudo (since all ports below 1024 require root access), so if there really is no good reason to use a proxy server at this point, we're just going to scrap it. If anyone knows how to run this app with the proxy server on port 80 without root access, that'd be very helpful as well. Thanks!
The reasons for running a reverse proxy are:
You have limited IP ports open and need to run many Node services, each of which needs its own port
Your back-end service does not support HTTPS but you need it (e.g. Derby)
To add some other feature to the request that cannot be easily done with the back end such as adding Basic Authentication or some form of common logging/auditing
To enforce an addition or change to outgoing responses common across several back end services
To provide a load-balancing service
Unless your needs are quite simple, it would be better to use a dedicated proxy such as HAProxy, since node-http-proxy is rather simplistic.
Well, if you're only running one instance of the server, then there's not really a reason. The node-http-proxy docs mention using a single SSL certificate across multiple apps, which is very possible. You can also load-balance across several HTTP and WebSocket servers (say, run 10 socket.io servers for real-time data but only 1 HTTP server to serve out assets and REST APIs). Of course, with one instance these don't provide any benefits.
If you want to run Node servers without sudo, you could try setting up iptables port forwarding from port 80 to a port above 1024. See Can I run Node.JS with low privileges?
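The iptables approach is a one-time root operation; afterwards the Node process itself runs unprivileged. A sketch, assuming the app listens on port 8000:

```shell
# Redirect incoming TCP traffic on privileged port 80 to port 8000,
# where the Node app listens without root. Run once as root.
sudo iptables -t nat -A PREROUTING -p tcp --dport 80 -j REDIRECT --to-port 8000
```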
We mainly use http-proxy to put multiple back-end servers behind a single IP, but we also use it to terminate HTTPS and forward plain HTTP. It hardens our app as well.
Security-wise, you can have more confidence in the quality of http-proxy than in your own app. The proxy built by Nodejitsu is production-ready, and it should be harder for attackers to gain privileges (like reading private key files) through an http-proxy instance than through your own app (of course, this depends on your security development skills and your trust in the open-source http-proxy project).
I have been tasked with setting up a server which uses a web based control interface using kerberos and active directory for authentication. I am using twisted.web as the web server. The issue is that I do not want user passwords coming through this server, but I don't know if it is possible for firefox and chrome to get access keys from the kerberos key server. Specifically it must work with firefox, other browsers would be a bonus. Is there a javascript library, possibly using HTML5 or a firefox plugin that allows for authentication to an untrusted server using kerberos? A flash application might also be possible.
Maybe you could put a reverse proxy in front of twisted, use HTTP auth from the web app, and delegate the authentication itself to Kerberos via an Apache or nginx module.
While the proxy will receive the password, the twisted server won't, in line with your use case. Requests would be intercepted by the proxy and delegated to your back end (proxy_pass) following a successful authentication.
This way your solution would work independently from any http client/web browser.
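As one possible sketch, using Apache with the third-party mod_auth_kerb module (the realm, keytab path, and back-end port are placeholders; nginx has a comparable third-party SPNEGO auth module):

```apache
# Apache as the Kerberos-authenticating reverse proxy (mod_auth_kerb
# and mod_proxy must be enabled). Placeholder realm, keytab, and port.
<Location "/">
    AuthType Kerberos
    AuthName "Kerberos Login"
    KrbMethodNegotiate On       # SPNEGO when the browser has a ticket
    KrbMethodK5Passwd On        # fall back to HTTP Basic, validated
                                # against the KDC; the password reaches
                                # the proxy but never the twisted server
    KrbAuthRealms EXAMPLE.COM
    Krb5KeyTab /etc/apache2/http.keytab
    Require valid-user

    # Only authenticated requests reach the twisted back end.
    ProxyPass "http://127.0.0.1:8080/"
</Location>
```

For ticket-based (passwordless) login, Firefox additionally needs the proxy's hostname added to network.negotiate-auth.trusted-uris in about:config; Chrome honors similar policy settings.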