Is it necessary to call response.end() on every connection? - javascript

I am using SSE, so the only methods I need are response.writeHead and response.write, since the response never really ends. The only way it will end is by the client disconnecting, and this brings me to my question ...
Do I need to listen for the "close" event and manually call response.end, or does it make no difference? Technically the connection is already gone anyway, so the response cannot be flushed or anything. Therefore I am confused about whether Node.js will properly clean everything up even if response.end was never called for a specific response.

When setting up a connection for SSE, you just set the headers and then call res.write() whenever you want to send data. Keep in mind that you need to send some sort of delimiter (such as a blank line) so that the client knows when it has received a "chunk" of data that it can process.
Here are a couple of examples of Node.js and server-sent events, and you can see that they just use res.write() whenever they want to send data:
https://jasonbutz.info/2018/08/server-sent-events-with-node
https://masteringjs.io/tutorials/express/server-sent-events
Do I need to listen for the "close" event and manually call response.end, or does it make no difference?
You do need to listen for the close event if you have resources or timers on your end that need to be cleaned up when the connection is done (as you will see in some of the examples above). You do not need to call res.end() when the connection closes, as it has already been closed by the client.
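For reference, here is a minimal sketch of such an endpoint, assuming Express (the /events route and the payload are made up for illustration):
const express = require('express');
const app = express();

app.get('/events', (req, res) => {
    // SSE headers: keep the connection open and tell the client this is an event stream
    res.writeHead(200, {
        'Content-Type': 'text/event-stream',
        'Cache-Control': 'no-cache',
        'Connection': 'keep-alive'
    });

    // push a chunk every few seconds; the blank line ("\n\n") delimits each event
    const timer = setInterval(() => {
        res.write('data: ' + JSON.stringify({ time: Date.now() }) + '\n\n');
    }, 3000);

    // clean up our own resources when the client disconnects; no res.end() needed
    res.on('close', () => clearInterval(timer));
});

app.listen(3000);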

Related

What will happen to an ajax request which has been initiated and after that browser is closed?

This is an interview question.
For example:
Let's say there is an MVC application.
In the View we have written an AJAX method that calls an MVC controller:
$("#SubmitCompany").click(function(){
var jqxhr = $.post( "\Company\Save", function() {
alert( "success" );
})
.done(function() {
alert( "second success" );
})
.fail(function() {
alert( "error" );
})
jqxhr.always(function() {
alert( "second finished" );
});
});
In the Save method we have written the functionality to process the data and store it in the database. We also return a flag indicating whether it succeeded or failed.
So the question is:
What will happen if we initiate the request and close the browser after that?
What will the output be in this scenario (as per the AJAX call written)?
In my opinion, if the request has been made and sent to the server and then the browser is suddenly closed, the changes will still be made on the server, but since there is no client any more nothing is displayed and no output is produced on the client side.
The other case is that if the request never reaches the server and the browser is closed, then nothing happens on either end.
As long as the user agent (browser, etc.) has fully sent the request, the server has everything it needs and will complete the request and attempt to send back a response.
In fact, this kind of "pinging" behaviour is often used for heartbeat-like processes that keep a service warm or perform periodic maintenance.
IMHO, different languages have different ways of handling client connection state on the server. To answer your questions: it depends on how far the request has got into server-side execution. If the server has all the input it needs to execute the request, then it will continue to do so.
However, if there is a reliable way to detect that the client has aborted in the middle of the execution, then the server code can decide whether to continue or abort further work. I'm no .NET expert, but from this answer it looks like you can use Response.IsClientConnected to detect whether the client browser has aborted the request and decide on the course of action and output.
In PHP, for example, a connection status is maintained. There are 4 possible states, based on which you can decide whether or not you want a client disconnect to cause your script to be aborted.
Sometimes it is handy to always have your scripts run to completion even if there is no remote browser receiving the output. The default behaviour, however, is for your script to be aborted when the remote client disconnects.
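The same idea in Node.js/Express would look roughly like this (the route and the saveToDatabase helper are hypothetical stand-ins, not part of the question):
const express = require('express');
const app = express();
app.use(express.json());

app.post('/company/save', async (req, res) => {
    let clientGone = false;
    // fires if the connection is terminated before we respond -- roughly what
    // Response.IsClientConnected or PHP's connection_status() tell you
    res.on('close', () => {
        if (!res.writableEnded) clientGone = true;
    });

    await saveToDatabase(req.body);   // hypothetical helper: work already under way is finished either way

    // decide here whether a response (or any further work) still makes sense
    if (!clientGone) {
        res.json({ success: true });
    }
});

app.listen(3000);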

Client/Server operation detection in NodeJS

I would like to have a spinner appear when a form has been submitted and then be replaced by a checkmark to indicate the operation has been completed. I already have jQuery detect when the button has been pressed and show the spinner, like so:
$('.scrape').on('click', function () {
    $('.spinner').removeClass('hidden');
});
How can I have the client/server side detect that the operation has been completed? An AJAX call wouldn't work because the POST request has already been made, right? Note: I am working with the express framework. I'm not rendering anything because it's all on the same page.
Thanks!
Each time you make a request to your server, you should get a response from the server. Receiving a response with a non-error HTTP status means your operation succeeded.
"How can I have the client/server side detect that the operation has been completed?" You mean client. The client ultimately controls what is displayed on the screen, so the client needs to know when it's time for the checkmark. (So I don't care that the server is a NodeJS process using express.)
"An AJAX call wouldn't work because the POST request has already been made, right?" But you must have an handler somewhere for the completion of the request. You haven't said whether you're doing this by $.ajax or vanilla javascript or something else, but whatever you're doing, you just need to add something to that handler so that it adds the checkbox. If you want a more specific answer, feel free to share some more code :-)

Symfony does not send a response while a previous request is still running

I send a few AJAX requests to my site. The first one creates a document, the others check its status. But Symfony does not respond while the first request is being processed. All requests wait in a queue until the first one finishes. If I look in the log I see:
request.INFO: Matched route "start creating" (first request)
request.INFO: Matched route "check" (while the first is still working)
request.INFO: Matched route "check" (while the first is still working)
...
event.DEBUG: Notified event "kernel.response" to listener (for the first process)
event.DEBUG: Notified event "kernel.response" to listener (for the second process)
etc.
Why does Symfony process the requests in this strange way?
The problem is in the PHP session mechanism: PHP locks the session for the duration of each request, so concurrent requests that share a session are handled one at a time. Calling
session_write_close();
as soon as you no longer need to write to the session fixes this problem.
It is not Symfony but your browser that limits the number of simultaneous connections to the same domain. See What's the maximum number of simultaneous connections a browser will make?

Server Side Broadcast Update PHP

I want to build an application which will automatically broadcast notification(s) to a user when data on the server changes. So far, I know only one method of doing this: polling with setInterval. With this approach, every client requests data from the server through AJAX at a fixed interval, asking whether something has changed.
The weakness of this method is that every client must send a request at every interval, so my server receives a huge number of requests. It's frustrating to manage the server. Are there any alternatives besides setInterval polling?
If WebSockets are not an option for you, you could use a single AJAX request to the server. Then, on the server side, go into an (effectively) infinite loop. Use the sleep function so you don't overload the server. On each iteration, check whether something has changed. If so, break out of the loop and return the data. On the client side, send the next request immediately after each response arrives.
After a bit of research, it turns out this is called "AJAX long polling".
Here is an explanation.
The PHP code would look something like this:
$prevHash = $_GET['hash'];
while (true) {
    $currHash = GetHashFromTable('myTable');
    // break out of the wait loop as soon as the data has changed
    if ($prevHash != $currHash) break;
    sleep(3);
}
$response[0] = GetDataFromTable('myTable');
$response[1] = GetHashFromTable('myTable');
echo json_encode($response);
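A minimal client-side counterpart might look like this (jQuery assumed; poll.php and handleNewData are placeholders that mirror the sketch above):
var lastHash = '';
function poll() {
    $.getJSON('poll.php', { hash: lastHash })
        .done(function (response) {
            handleNewData(response[0]);   // response[0] = data, response[1] = new hash
            lastHash = response[1];
        })
        .always(function () {
            poll();                       // immediately start the next long-polling request
        });
}
poll();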
Update
Long polling is not the best option. It is better to use WebSockets.
If you want to compare the differences, see this answer: https://stackoverflow.com/a/10029326/3269816

Asynchronous Servlet Client, Server push

Hello guys, I want to process some server pushes. I have an asynchronous servlet that processes something, pushes it to the client, then processes something else and pushes that to the client as well (over the same connection). The servlet just returns data (JSON in this case, but that does not really matter), nothing more.
So my problem is the client. How do I build a client for that? If I make an AJAX request with jQuery, for example, how can I react to the data that comes after the first response?
To make it clearer what I want, here is a comparison: with WebSockets I have the onmessage method.
var ws = new WebSocket("ws://myserver.com");
ws.onmessage = function (event) {
    var x = event.data;
    // some other code here
};
So all I want is an onmessage method :). I guess it is not as easy as it is with WebSockets, but maybe someone has an idea.
Greetings, Aleks
You can have your server generate a response which is loaded into a hidden iframe by the client. The generated response would contain occasional JavaScript statements that call out to the "outside" (the containing document). You can get your hands on the containing document using parent.
But please note that this technique is pretty hackish (at least it seems that way to me). You might want to reconsider and just use XMLHttpRequest, especially because it gives you simple and robust error handling. You can just make more requests (instead of appending to an "old" response on the server side). This will probably introduce additional lag, but that iframe trick is really troublesome in practice.
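As a sketch of that XMLHttpRequest alternative: each completed request plays the role of one onmessage, and the client simply reconnects for the next push (the /push URL and the handleMessage function are made up):
function listen() {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/push');
    xhr.onload = function () {
        if (xhr.status === 200) {
            handleMessage(JSON.parse(xhr.responseText));  // behaves like ws.onmessage
        }
        listen();                                         // reconnect and wait for the next push
    };
    xhr.onerror = function () {
        setTimeout(listen, 3000);                         // simple retry after a short delay
    };
    xhr.send();
}
listen();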
