I have the following problem:
In JavaScript, I am using a WebSocket to send requests to the server and receive data in real time. When I send a request I get a response, but only once: one request yields exactly one response.
If I send a JSON message like this,
websocket.send(jsonmessage);
I receive the response in this handler:
websocket.onmessage = function(evt) { console.log(evt); onMessage(evt); };
But I would like to receive responses continuously for that single request.
Is this possible, or is there any way to achieve it?
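For what it's worth, a WebSocket connection is bidirectional and stays open, so whether you get one message back or a continuous stream is entirely up to the server: the browser's onmessage handler fires for every message the server pushes, with no extra send() calls needed. Below is a minimal sketch of such a server, assuming Node.js with the ws package; the port, the shape of the JSON request, and the one-second interval are made-up illustration values.
// Sketch: push a fresh message every second for a single client request
// (assumes Node.js and the ws package: npm install ws).
const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: 8080 });

wss.on('connection', (socket) => {
  socket.on('message', (raw) => {
    const request = JSON.parse(raw.toString()); // the single JSON request from the client
    const timer = setInterval(() => {
      socket.send(JSON.stringify({ topic: request.topic, data: Date.now() }));
    }, 1000);
    socket.on('close', () => clearInterval(timer)); // stop pushing when the client disconnects
  });
});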
I'm making a chat application that uses long polling with Express (I'm aware WebSockets are better for this, I just wanted to make something specifically with long polling).
This is the handler for the GET request the client sends when it's waiting for a new chat message:
app.get('/api', (req, res) => {
  listeners.push(res)
})
It pushes the response objects to an array, and once a new chat message is available it calls .json() on the stored responses. When the client receives the response, it sends another GET request, waiting for new messages.
Here's the code for when someone sends a message (simplified):
app.post('/api', (req, res) => {
  listeners.forEach(listener => {
    listener.json(req.body)
  })
  listeners = []
  res.status(200).end()
})
I noticed that when I open two tabs of my site, only the 1st tab's request gets put in the array at first. That makes the 2nd tab not receive the 1st chat message. The 2nd tab's request only gets pushed into the array after the 1st chat message. So when the second chat message is sent it works fine. This means that the 2nd tab is always one request behind.
Here's an example in case that wasn't clear:
Client 1 connects
Client 2 connects
---
Client X sends a message:
Client 1's request 1 receives the message
Client 2's request 1 is still pending
---
Client X sends another message
Client 1's request 2 receives the message
Client 2's request 1 receives the message
---
Client X sends another message
Client 1's request 3 receives the message
Client 2's request 2 receives the message
...
Another thing I noticed is that when I restart the server after clients 1 and 2 connect, client 2's request only gets pushed into the array after the restart.
Could someone explain this behavior? Thanks in advance.
This sounds like the browser is hoping to reuse a cached result: it holds the second request back while an earlier request to the exact same URL is still pending, to see whether it can use that cached result rather than firing the same request again.
You can test this hypothesis by adding a unique query parameter to every request, as that will disable any attempt at caching since each request would then have a different URL.
Or, if these requests are sent using the fetch() API in the browser, you can directly tell that API not to use caching with the cache: 'no-cache' option, as in:
const response = await fetch(url, {cache: 'no-cache'});
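For completeness, here is roughly what the client's polling loop could look like with both safeguards applied; the /api path matches the endpoint from the question, and renderMessage is a hypothetical UI helper.
// Long-poll loop with caching defeated two ways: a throwaway query parameter
// (so every URL is unique) and the cache: 'no-cache' fetch option.
async function poll() {
  while (true) {
    const response = await fetch('/api?t=' + Date.now(), { cache: 'no-cache' });
    const message = await response.json();
    renderMessage(message); // hypothetical function that displays the chat message
  }
}
poll();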
I want to send an HTTP GET request and receive a response only when something changes. That is, is it possible to add something to the request so that the server checks whether anything has changed on the site, and I only get a response if it has?
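Plain HTTP does have a mechanism in this spirit: conditional requests. If the server sends an ETag (or Last-Modified) header, the client can send If-None-Match on the next request, and the server answers 304 Not Modified when nothing has changed. A rough sketch, assuming the server supports ETags and using fetch with caching disabled so the 304 is visible to the script:
let etag = null;

// Ask the server for the resource only if it differs from what we saw last time.
async function fetchIfChanged(url) {
  const headers = etag ? { 'If-None-Match': etag } : {};
  const response = await fetch(url, { headers, cache: 'no-store' });
  if (response.status === 304) {
    return null;                           // nothing has changed on the site
  }
  etag = response.headers.get('ETag');     // remember the new version tag
  return response.text();                  // first or changed content
}
Note that this only saves bandwidth: the client still has to poll, and it only gets a response body when something changed. To be pushed a response only on change, you would need long polling, Server-Sent Events, or a WebSocket, as in the other questions here.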
In Node.js, suppose serving a request takes time. Upon receiving the request, the server wants to respond with "I received your request and I will get back to you soon". Then, once processing is over, the server wants to get back to the client with the actual response. How can we do that?
The code snippet below throws an error (connection closed, or something similar).
'use strict';
var http = require('http');
var numberOfRequests = 0;
http.createServer(function (request, response) {
  request.n = numberOfRequests;
  console.log('Request number ' + numberOfRequests + ' received!');
  response.writeHead(200);
  response.write("Soon we will get back to you for your request# " + request.n);
  response.end();
  setTimeout(function () {
    response.writeHead(200);
    response.write("Response to your request# " + request.n);
    console.log("Response to the request# " + request.n);
    response.end();
  }, 5000);
  numberOfRequests++;
}).listen(8080);
console.log('listening ...');
You can't send two responses to the same request. You can only send one fully formed response per request. That's the HTTP specification.
The only work-arounds for your issue that I know of are:
Use HTTP in a flushed streaming mode. Here you send part of the response and flush it out so you know it has been sent, and you have a special client on the other end that reads and interprets partial responses (not the usual way HTTP responses are read, and not what a browser does on its own).
Use a webSocket or socket.io connection to update the client with progress before you finally send the actual HTTP response. The user would connect a socket.io connection, then make the long-running HTTP request, then receive regular progress on the socket.io connection, then receive the HTTP response when it is done. This can be done in any web page without much difficulty. The main problem to solve is that you need a way of associating a webSocket or socket.io connection with an incoming HTTP request (which can usually be done via a session cookie) so you know which webSocket or socket.io connection belongs to the HTTP request that just arrived.
Use some other server-push scheme to "push" the second response to the client.
Implement client polling. Send an initial HTTP response that instructs the client to check back in NN ms to see if there is more to the response. When the client checks back NN ms later, you either have the final response and send it, or you respond again telling them to check back in another NN ms, and so on (a rough sketch of this polling loop follows this list).
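As an illustration of that last option, here is a rough client-side sketch; the /status/:id endpoint and the shape of its JSON body are assumptions for illustration, not an existing API.
// Poll a hypothetical /status/<jobId> endpoint every intervalMs milliseconds
// until the server reports that the long-running work is finished.
async function pollForResult(jobId, intervalMs) {
  while (true) {
    const response = await fetch('/status/' + jobId);
    const body = await response.json();             // assumed shape: { done, result }
    if (body.done) {
      return body.result;                           // the "second response" arrives here
    }
    await new Promise((resolve) => setTimeout(resolve, intervalMs)); // check back later
  }
}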
You are looking for Server-Sent Events (SSE). Of course you can also just send chunked data: don't call res.end() until the final piece of data has been sent. And by the way, you can only send the headers once. See this example:
res.writeHead(200);
res.write("The first piece of data");
setTimeout(() => {
  res.write("This piece of data will be sent in 1 second.");
  res.end("OK, all data has been sent, now close the connection.");
}, 1000);
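Since the answer mentions SSE by name, here is what a dedicated Server-Sent Events version could look like; this is a bare-bones sketch using plain Node's http module, with the port and the one-second interval chosen arbitrarily.
const http = require('http');

http.createServer((req, res) => {
  // SSE is a long-lived chunked response with a special content type.
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    Connection: 'keep-alive'
  });
  let n = 0;
  const timer = setInterval(() => {
    res.write('data: update number ' + n++ + '\n\n'); // each "data:" block is one event
  }, 1000);
  req.on('close', () => clearInterval(timer));         // stop when the client disconnects
}).listen(8080);
On the browser side, new EventSource('/').onmessage = e => console.log(e.data); receives each pushed event.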
I have a JS (Angular) client that makes a PUT request (REST API) to the server, and the server sends back a large payload that I'm not currently using in the client.
Is there a way to just fire the request and ignore any response that comes back? The main need here is to avoid the data cost incurred by receiving that payload. I've looked at closing the connection once the request is fired, but am not sure if that's the best way to handle this.
If you're able to, I think the only way to change this would be to change the API endpoint so that it doesn't include a payload in the response to the PUT request.
I'm assuming you are using Angular's HTTP class and Observables. But even if you aren't, your Angular client is going to need to read the response status sent back from the server to determine whether or not the PUT request was successful. In order to read the status, you need the response, and unfortunately that means the full response sent from the server.
You could close the connection right after the request, but as I've mentioned you'll have no way of knowing whether or not the request was successful.
To ignore the response, just don't do anything with it when the request succeeds.
If you don't want the payload to exist at all, then handle it on the backend, as in the sketch below.
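A small sketch of that backend change, assuming an Express server you control; updateResource is a hypothetical persistence helper, and the route path is made up.
// Reply with a status and no body instead of echoing the large payload.
// (Assumes express.json() middleware is in place so req.body is parsed.)
app.put('/api/resource/:id', (req, res) => {
  updateResource(req.params.id, req.body); // hypothetical call that saves the update
  res.sendStatus(204);                     // 204 No Content: success, empty body
});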
Is there a way to make an $.ajax POST request:
without requiring a response
so the server doesn't even try to return anything
Are there some HTTP headers to accomplish this? The goal would be to track statistics with minimal server and client request processing.
The server should return an HTTP 204 No Content response. That's as close as you can get.
If it is no problem in terms of security, and the amount of data you send is at most 2K minus the length of your URL, use a GET request instead. A GET request sends only one TCP packet, whereas a POST request sends two (first the headers, then the data).
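Putting the 204 suggestion together, here is a small sketch of what the server route and the fire-and-forget client call could look like; the /stats route, the recordStats helper, and the event payload are made up for illustration, and the server side assumes Express with its body-parsing middleware enabled.
// Server: acknowledge with 204 and send no body at all.
app.post('/stats', (req, res) => {
  recordStats(req.body);   // hypothetical call that stores the statistic
  res.sendStatus(204);     // No Content: nothing for the client to download or parse
});

// Client: fire and forget; no success handler, nothing to process.
$.ajax({ url: '/stats', method: 'POST', data: { event: 'page_view' } });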