Node.js HTTP server, detect when clients disconnect - JavaScript

I am using Express with Node.js for an HTTP server. I store the response object so I can stream events down to the client on that channel. Is there a way to detect when the client disconnects? When I kill my client I can still write to the response object without getting any kind of exception/error. It looks like the server keeps the underlying TCP channel open as long as I keep writing to the response object.

req.on("close", function() {
// request closed unexpectedly
});
req.on("end", function() {
// request ended normally
});
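A minimal sketch of how this can be wired into a streaming route, assuming Express and a periodic push (the route name and interval are illustrative). On newer Node versions the 'close' event on the response object tends to be the more reliable disconnect signal, since 'close' on the request may fire as soon as the request itself has completed:

const express = require('express');
const app = express();

app.get('/events', (req, res) => {
    res.writeHead(200, { 'Content-Type': 'text/event-stream' });

    // Push something every second until the client goes away.
    const timer = setInterval(() => {
        res.write('data: ping\n\n');
    }, 1000);

    // Fires when the connection is torn down (or the response finishes),
    // so stop writing to a response nobody is reading.
    res.on('close', () => clearInterval(timer));
});

app.listen(8080);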

Related

Can I interface with Laravel websockets using just JavaScript?

We currently have a Laravel CMS server that sends a request to another Laravel websocket server which then broadcasts to multiple IoT devices listening on their individual channels for requests. Everything works perfectly.
So it looks like this:
CMS server -> Socket server -> devices
I am trying to initiate these websocket requests from a third server which is not running Laravel; it's just running core PHP and JavaScript. I looked in my browser's Network tab to spy on the websocket connection from a legitimate request and copied it exactly. The connection and subscription succeed, but when I send the requests over the socket from my test script, nothing happens. It's literally the exact same socket request over the same channel on the same connection, just with a different socket ID. How could this fail?
I thought maybe it was only accepting requests from that specific server, so I copied my test script to that server, and still got no response. I don't know much about WebSockets; could someone please help me understand how I can spoof these messages without using Laravel?
If you're wondering what I'm actually up to: we're decommissioning this CMS server in favour of our new CMS built in core PHP, so Laravel is no longer an option, but I don't want to rewrite the socket server, which uses Laravel; that can stay. I just need to interface with it any way possible. I have full SSH and DNS access to anything I need to configure.
Here is my test script. I've tried about 50 of them that I found on Google, and none of them get a response from my specific socket server, apart from the "connection successful" and "subscription successful" responses.
<script>
    let socket = new WebSocket('wss://subdomain.domain.ca:6001/app/apikeyyyyy?protocol=7&client=js&version=4.4.0&flash=false');

    var message = {
        channel: "aaaaaa.bbbbbbbb",
        event: "1000",
        data: "{\"channelName\":\"aaaaaa.bbbbbbbb\",\"message\":{\"msgId\":\"\",\"cmd\":\"help\",\"timestamp\":1663616905529,\"request_id\":\"\",\"device_id\":\"\"},event:\"1000\",timestamp:1663616905529,request_id:\"\",device_id:\"\"}"
    };

    socket.onopen = function(e) {
        var data_json = {
            event: 'pusher:subscribe',
            data: {
                channel: 'aaaaaa.bbbbbbbb',
            }
        };
        socket.send(JSON.stringify(data_json));
        socket.send(JSON.stringify(message));
    };

    socket.onmessage = function(event) {
        console.log("Response: " + event.data);
    };

    socket.onclose = function(event) {
        if (event.wasClean) {
            alert(`[close] Connection closed cleanly, code=${event.code} reason=${event.reason}`);
        } else {
            // e.g. server process killed or network down
            // event.code is usually 1006 in this case
            alert('[close] Connection died');
        }
    };

    socket.onerror = function(error) {
        alert(`[error] ${error.message}`)
    };
</script>
I obviously removed the channel name, domain and application key for security but the rest is accurate. And here's what I see in my browser console:
Response: {"event":"pusher:connection_established","data":"{\"socket_id\":\"166323189.236668505\",\"activity_timeout\":30}"}
Response: {"event":"pusher_internal:subscription_succeeded","channel":"aaaaaa.bbbbbbbb"}
When I send a request from the (working) CMS server, I get an OK response from the socket server and the related device will reboot or do whatever I asked it to do. When I send the exact same message from my test script, I see the request get logged in the websockets.log file on the socket server, but no response is logged like it is for the requests from the CMS server. And no response prints in the browser console, even though I do see the responses print in the browser console for my test script when I send them from the CMS, so I know the subscription is working correctly. Also, the devices do not reboot.
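No accepted answer is shown here, but one possible explanation (an assumption, not something confirmed in the thread) is that Laravel's broadcasts reach laravel-websockets through its Pusher-compatible HTTP API, signed with the app secret, rather than as raw client messages over the WebSocket, which would explain why a perfectly copied frame gets logged and then ignored. Under that assumption, a sketch of triggering the same event from plain JavaScript with the pusher npm package could look like this; every credential and name below is a placeholder:

// Sketch only: assumes the socket server accepts the Pusher-compatible HTTP API
// that Laravel broadcasting uses; all values below are placeholders.
const Pusher = require('pusher');

const pusher = new Pusher({
    appId: 'your-app-id',
    key: 'apikeyyyyy',
    secret: 'your-app-secret',    // the secret the Laravel CMS uses
    host: 'subdomain.domain.ca',  // the laravel-websockets host
    port: 6001,
    useTLS: true,
});

// Publish the same payload the CMS would have broadcast on that channel.
pusher.trigger('aaaaaa.bbbbbbbb', '1000', {
    channelName: 'aaaaaa.bbbbbbbb',
    message: { msgId: '', cmd: 'help', timestamp: Date.now(), request_id: '', device_id: '' },
});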

Sending two responses for the same request

In Node.js, suppose serving a request takes time. So upon receipt of the request, the server wants to respond with "I received your request and I will get back to you soon". Then, once processing of the request is over, the server wants to get back to the client with the actual response. How can we do that?
The code snippet below throws an error (connection closed, or something similar).
'use strict';
var http = require('http');
var numberOfRequests = 0;

http.createServer(function (request, response) {
    request.n = numberOfRequests;
    console.log('Request number ' + numberOfRequests + ' received!');

    response.writeHead(200);
    response.write("Soon we will get back to you for your request# " + request.n);
    response.end();

    setTimeout(function () {
        // This second attempt to respond is where the error occurs:
        response.writeHead(200);
        response.write("Response to your request# " + request.n);
        console.log("Response to the request# " + request.n);
        response.end();
    }, 5000);

    numberOfRequests++;
}).listen(8080);

console.log('listening ...');
console.log('listening ...');
You can't send two responses to the same request. You can only send one fully formed response per request. That's the http specification.
The only work-arounds for your issue that I know of are:
Use http in a flushed streaming mode. Here, you send part of the response, flush it out so you know it is sent and you have a special type of client on the other end that is reading partial responses and interpreting them (not the usual way that http responses are read and not what a browser does on its own).
Use a websocket or socket.io connection to update the client with progress before you finally send the actual http response. The user would open a socket.io connection, then make the long-running http request, then receive regular progress updates on the socket.io connection, then receive the http response when it is done. This can be done in any web page without much difficulty. The main problem to solve is that you need a way of associating a webSocket or socket.io connection with an incoming http connection (which can usually be done via a session cookie) so you know which webSocket or socket.io connection is associated with the http request that just arrived.
Use some other server-push scheme to "push" the second response to the client.
Implement client polling. Send an initial http response that instructs the client to check back in NN ms to see if there is more to the response. When the client checks back NN ms later, you either have the final response ready and send it, or you respond again telling them to check back in another NN ms, etc. (a sketch of this approach follows below).
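A rough sketch of that polling approach, assuming Express and an in-memory job store (the route names, the jobs map, and the 5-second setTimeout standing in for the slow work are all illustrative, not from the answer):

const express = require('express');
const app = express();

const jobs = new Map();   // jobId -> { done, result }
let nextId = 0;

// Kick off the slow work and immediately tell the client where to poll.
app.post('/api/work', (req, res) => {
    const id = String(nextId++);
    jobs.set(id, { done: false, result: null });

    setTimeout(() => {    // stand-in for the long-running task
        jobs.set(id, { done: true, result: 'the actual response' });
    }, 5000);

    res.json({ jobId: id, retryInMs: 1000 });
});

// The client checks back until the job is finished.
app.get('/api/work/:id', (req, res) => {
    const job = jobs.get(req.params.id);
    if (!job) return res.sendStatus(404);
    if (!job.done) return res.json({ done: false, retryInMs: 1000 });
    res.json({ done: true, result: job.result });
});

app.listen(8080);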
You are looking for Server-Sent Events (SSE). Of course you can send chunked data; just don't call res.end() until the final piece of data is sent. And by the way, you can only send headers once. See this example:
res.writeHead(200);
res.write("The first piece of data");

setTimeout(() => {
    res.write("This piece of data will be sent in 1 second.");
    res.end("OK, all data has been sent, now close the connection.");
}, 1000);

What happens to the request stream in node before it is consumed?

I am working on a node application where I pipe a post request's body into a writable stream that saves the data to disk. I realized while building this application that I have no idea what actually happens to the request stream before it is consumed. Say I did something like this:
app.post('/api/data', (req, res) => {
    const writableStream = fs.createWriteStream('data.txt');
    setTimeout(() => {
        req.pipe(writableStream);
    }, 3000);
});
What is actually happening to the stream in the 3 seconds between when the request is initially received and when the stream starts being piped? Is it being loaded into memory?
Streams support a buffer for incoming data, but when the buffer fills up, they tell the sender to stop sending more data until they are ready for some more.
Since an incoming request is actually a TCP connection and incoming data is data arriving on the TCP connection, this probably turns into more a question about what happens to incoming TCP data when you aren't reading the data as fast as it wants to arrive. The answer is that TCP supports flow/control where the receiver tells the sender to stop sending data for the moment and then when incoming buffers clear, it tells the sender it can start sending data again.
Here's a quick overview of TCP flow control.
In your specific stream coding example, until you issue req.pipe(), there are no data listeners on the stream, so it has nothing to do with the incoming data. Thus, it will fill up its buffers from the incoming TCP stream and then stop reading from the TCP socket (which triggers TCP flow control). Then, when you run req.pipe(), that automatically registers handlers for data events and the stream starts emitting those events. As data is read out of the stream's buffer, the stream can accept more incoming data from the TCP socket, which allows TCP to tell the other end to resume the flow of new data, and so on.
There's a lot more about how readable streams work, and how they can be paused or resumed, here: http://www.sitepoint.com/basics-node-js-streams/
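To make the buffering visible, here's a small sketch (assuming Express and a local data.txt target, both illustrative) that logs the stream's flowing state before and after the pipe; before .pipe() attaches a consumer the request simply buffers, and a slow writable will pause it again through backpressure:

const express = require('express');
const fs = require('fs');
const app = express();

app.post('/api/data', (req, res) => {
    // No consumer yet: readableFlowing is null and incoming data just buffers
    // (until the buffers fill and TCP flow control kicks in).
    console.log('before pipe, readableFlowing =', req.readableFlowing);

    setTimeout(() => {
        const writableStream = fs.createWriteStream('data.txt');
        req.pipe(writableStream);   // attaches data handling; buffered bytes drain first
        console.log('after pipe, readableFlowing =', req.readableFlowing);

        writableStream.on('finish', () => res.sendStatus(200));
    }, 3000);
});

app.listen(8080);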

Keep a Node.js connection open and write (GET request)

I was trying to implement a Node.js GET method where I could encode parameters in the URL and send back responses like in Server-Sent Events.
For example, when I used:
curl -D- 'http://localhost:8030/update'
The server would return a message, and then keep the connection open to return more messages (like push).
I was using require('connect'); I tried with require('express') but couldn't get it working.
Here's part of my code:
var http = require('http');
var connect = require('express');
var bodyParser = require('body-parser');
var cors = require('cors');

var app = connect();
app.use(bodyParser.urlencoded({ extended: false }))
    .use(bodyParser.json()) // JSON
    .use(cors(corsOpts))
    .get('/update', updateMiddleware);

var server = http.createServer(app);
server.listen("twserver.alunos.dcc.fc.up.pt", 8030);

function updateMiddleware(req, res) {
    res.setHeader('Connection', 'keep-alive');
    res.setHeader('Content-Type', 'text/event-stream');
    res.setHeader('Cache-Control', 'no-cache');
    res.writeHead(200);

    setTimeout(function () {
        res.write("this is an event");
        res.flushHeaders();
    }, 1000);

    setTimeout(function () {
        res.write("this is another event");
    }, 2000);
    // should print without ending
}
EDIT: I found it was working, but only in Chrome. In the terminal, I only receive it after waiting a long time, and the messages arrive in chunks.
You can't use a single HTTP request to listen for multiple events. If you are really stuck with HTTP (i.e. WebSocket or WebRTC is not an option), then the technique you are looking for is called long polling. It basically works this way:
Client sends request to server
Server waits until an event happens (or until a specific but not too long timeout, so the client application does not throw a timeout error for the request)
Server responds with a complete HTTP response containing the details of the event
Client receives the event details and immediately sends another request to listen for further events.
This method really takes advantage of HTTP keep-alive.
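A minimal long-polling sketch under those assumptions (Express on the server; the /poll route, the 25-second timeout, and the publish() helper are illustrative, not from the answer):

const express = require('express');
const app = express();

let waiting = [];   // pending long-poll responses

// Steps 1-2: the client calls this and the server holds the response open.
app.get('/poll', (req, res) => {
    const timer = setTimeout(() => {
        // Nothing happened within the timeout: answer anyway so the client can re-poll.
        waiting = waiting.filter((w) => w.res !== res);
        res.json({ event: null });
    }, 25000);
    waiting.push({ res, timer });
});

// Step 3: something elsewhere in the app produces an event.
function publish(event) {
    for (const { res, timer } of waiting) {
        clearTimeout(timer);
        res.json({ event });
    }
    waiting = [];
}

app.listen(8030);

On the client side (step 4), each response is handled and a new request is sent immediately, so there is effectively always one request in flight.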
EDIT:
To me it looks like your code does not follow the Server-Sent Events protocol. Here is a tutorial: Server-Sent Events in Node.js.
Following another tutorial on MDN about Server-Sent Events, the structure of the messages should be the following:
: this is a test stream
data: some text
data: another message
data: with two lines
Note that the data to be sent must be followed by a double new-line \n\n.
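Putting that together, a hedged sketch of what the route could look like with proper SSE framing (assuming Express; the event payloads and interval are placeholders):

function updateMiddleware(req, res) {
    res.writeHead(200, {
        'Content-Type': 'text/event-stream',
        'Cache-Control': 'no-cache',
        'Connection': 'keep-alive',
    });

    // Each message is one or more "data:" lines terminated by a blank line.
    res.write('data: this is an event\n\n');

    const timer = setInterval(() => {
        res.write('data: this is another event\n\n');
    }, 2000);

    // Stop writing once the connection goes away.
    res.on('close', () => clearInterval(timer));
}

With this framing, curl -N http://localhost:8030/update should print each event as it arrives (the -N flag disables curl's output buffering).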
In general, http endpoints in Express aren't supposed to do things like that. If you want live event data, the best way is to use a web socket.
That being said, this thread has an example on how to force Express to do this.
Socket.IO or WebRTC is the best choice.

WebSockets: Make javascript wait until server responds

My web application uses JavaScript to communicate with a server by HTTP requests. However, the server software has changed so that it uses WebSocket communication instead of HTTP requests.
Rewriting the entire web application to use open communication (i.e. WebSockets) instead of a request-respond mechanism would be an incredibly tedious job, and I don't really have anything to gain from using WebSockets for the web application.
What I want to achieve is to add a JavaScript module that communicates over WebSockets, but still in a request-respond manner. I guess this means that I have to "pause" the JavaScript until the server responds to my WebSocket messages? Because with HTTP requests my web application waited until the request was responded to.
So I guess it boils down to: How can I make javascript wait until the server sends a WebSocket message?
Cheers
EDIT: To specify, my Javascript code uses HTTP requests throughout the source code via a function I named "get(URL)". I want to modify this function so that it sends a WebSocket message, waits for the response and returns the response. Any ideas?
If I understand you correctly from the comments, you do not in fact want to "pause" your JavaScript, but merely to replace async Ajax calls with WebSocket communication.
You would emit a message to the server with a pre-defined key, matching what the server listens for. You can include some sort of payload in the message if you want, and pass a callback function that receives the response data. Following is a sample where I did something similar for a simple chat application I created.
My client code:
socket.emit('join', options, function (res) {
    _printToChat(res);
});
Server code:
socket.on('join', function (roomname, fn) {
    socket.join(roomname);
    fn('You joined ' + roomname);
});
This sends a message with the key "join", and the callback is called with the response from the server. In this case res in the client would be "You joined " + roomname.
You would likely do something like:
Client code:
socket.emit('getSomeResource', options, function (res) {
    // do something with the response in the res variable.
});
Server code:
socket.on('getSomeResource', function (yourOptions, fn) {
    fn(resourceToReturn);
});
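To get the request-respond shape described in the question (a get(URL) that waits for the reply), a common pattern is to wrap the emit-with-acknowledgement call in a Promise. A hedged sketch, assuming a Socket.IO client and a server-side 'get' handler like the one above (the event name, timeout, and payload shape are illustrative):

// Client side: resolves when the server acknowledges the request.
function get(url) {
    return new Promise(function (resolve, reject) {
        const timer = setTimeout(function () {
            reject(new Error('Request timed out: ' + url));
        }, 10000);
        socket.emit('get', { url: url }, function (res) {
            clearTimeout(timer);
            resolve(res);
        });
    });
}

// Usage: the rest of the code keeps its request-respond style.
get('/api/data').then(function (res) {
    console.log('server replied with', res);
});

The server keeps using the socket.on(..., function (options, fn) { fn(result); }) acknowledgement pattern shown above.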
