Symfony does not send a response while a previous request is still running - javascript

I send a few AJAX requests to my site. The first one creates a document; the others check its status. But Symfony does not respond to the status checks while the first process is running: all requests wait in a queue until the first one finishes. In the log I see:

request.INFO: Matched route "start creating"   (first request)
request.INFO: Matched route "check"            (second request)
request.INFO: Matched route "check"            (third request)
...
event.DEBUG: Notified event "kernel.response" to listener   (first process)
event.DEBUG: Notified event "kernel.response" to listener   (second process)
...

Why does Symfony process requests in this strange way?

The problem is in the PHP session mechanism: PHP locks the session file for the whole request, so concurrent requests that share a session are handled one at a time. Calling
session_write_close();
as soon as you no longer need to write to the session releases the lock and fixes this problem.

It is not Symfony but your browser that limits the number of simultaneous connections to the same domain. See: What's the maximum number of simultaneous connections a browser will make?


Is it necessary to call response.end() on every connection?

I am using SSE, so the only methods needed are response.writeHead and response.write, since the response never really ends. The only way it will end is by the client disconnecting, and this brings me to my question ...
Do I need to listen for the "close" event and manually call response.end, or does it make no difference? Technically the connection is already gone anyway, so the response cannot be flushed or anything. Therefore I am confused about whether Node.js will properly unload everything even if response.end was never called for a specific response.
When setting up a connection for SSE, you just set the headers and then call res.write() whenever you want to send data. Keep in mind that you need to send a delimiter (for SSE, a blank line) so that the client knows when it has received a "chunk" of data that it can process.
Here are a couple of examples of Node.js and server-sent events; you can see that they just call res.write() whenever they want to send data:
https://jasonbutz.info/2018/08/server-sent-events-with-node
https://masteringjs.io/tutorials/express/server-sent-events
Do I need to listen for the "close" event and manually call response.end, or does it make no difference?
You do need to listen for the close event if you have resources or timers on your end that need to be cleaned up when the connection is done (as you will see in some of the examples above). You do not need to call res.end() when the connection closes, as it has already been closed by the client.
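For illustration, here is a minimal sketch of both points in plain Node.js (the port, path, and payload are arbitrary choices, not from the original posts): res.write() pushes each event, the blank line terminates it, and the close handler only clears the timer; res.end() is never called.

const http = require('http');

http.createServer(function (req, res) {
  // SSE headers: keep the connection open, tell the client it is an event stream
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    'Connection': 'keep-alive'
  });

  // Push an event every 3 seconds; the trailing blank line ("\n\n")
  // is the delimiter that lets the client process each chunk
  var timer = setInterval(function () {
    res.write('data: ' + JSON.stringify({ time: Date.now() }) + '\n\n');
  }, 3000);

  // Clean up server-side resources when the client disconnects;
  // no res.end() is needed, the connection is already gone
  req.on('close', function () {
    clearInterval(timer);
  });
}).listen(3000);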

On ajax error, cache and try later

I have a mobile project where I have to send AJAX requests one after the other. The project runs over a mobile internet connection (EDGE, 3G), so it can happen that I lose the connection; then I have to cache the failed request (in localStorage), check at intervals for a valid connection, and retry the request.
At the same time other requests come in (from the browser), so I have to cache the requests in a queue and send them one after another.
Sorry for my bad English, I hope you can understand my problem.
Any suggestions? Are there any libraries for my problem?
Maybe you can use the logic below (a sketch follows the list):
1. Create an array that holds the status of each of your AJAX requests.
2. Once you make a request, add it to the array with its result (response received) set to false.
3. Once you receive the response for that request, update the array and set its result (response received) to true.
4. Read the array at intervals and resend any request whose result is still false.
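A minimal sketch of that idea in jQuery, using localStorage so the queue survives page reloads, as the question asks. All names here (QUEUE_KEY, enqueue, flushQueue, the 10-second interval) are made up for illustration:

var QUEUE_KEY = 'pendingRequests';

function loadQueue() {
  return JSON.parse(localStorage.getItem(QUEUE_KEY) || '[]');
}

function saveQueue(queue) {
  localStorage.setItem(QUEUE_KEY, JSON.stringify(queue));
}

// Add a request to the queue; it stays there until it succeeds
function enqueue(url, data) {
  var queue = loadQueue();
  queue.push({ id: Date.now() + '-' + Math.random(), url: url, data: data });
  saveQueue(queue);
}

// Remove a single entry once its request has succeeded
function dequeue(id) {
  saveQueue(loadQueue().filter(function (e) { return e.id !== id; }));
}

// Retry everything still in the queue
function flushQueue() {
  loadQueue().forEach(function (entry) {
    $.post(entry.url, entry.data).done(function () {
      dequeue(entry.id);
    });
  });
}

// Check for a working connection every 10 seconds
setInterval(flushQueue, 10000);

Requests that fail simply stay in localStorage and are retried on the next tick.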

Is it a bad idea to make an AJAX post call every 2 secs?

If I make an AJAX $.post call (with jQuery) to a PHP file to update a certain parameter/number, is that considered bad practice, dangerous, or similar?
$.post('file.php', { num: num }, function (data) {
  // something
}, 'json');
It would be a single user on a single page updating a number by clicking on an object. For example, if user A updates a certain number by clicking on an object, user B should see this update immediately, without reloading the page.
It depends on 3 main factors:
How many users will you have at any given time?
How much data is being sent per request on average?
Given 1 and 2, is your server set up to handle that kind of load?
I have a webapp that's set up to handle up to 10-20k users simultaneously; it makes a request each time the user changes a value on their page (which can be more than one request per second), and it sends roughly 1000 bytes per request. I get an average response time of 10 ms, but that's with Node.js. Originally I started the project in PHP, but it turned out to be too slow for my needs.
I don't think WebSockets are the right tool for what you're doing, since you don't need the server to push to the client, and a constant connection can be much more expensive than sending a request every few seconds.
Just be sure to do lots of testing and then you can make judgements on whether it'll work out or not for your specific needs.
tl;dr - It's not a good idea if your server can't handle it. Otherwise, there's nothing wrong with it.
Another solution could be to cache user actions in localStorage or in variables, send them all at once every 10-15 seconds or so, and clear the cache when sending was successful (a sketch of this batching idea follows).
In this case you should also validate the data from local storage on the server to prevent tampering.
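A hedged sketch of that batching idea in jQuery (the endpoint update.php, the element id #counter, and the 10-second interval are invented for illustration):

// Collect clicks in memory and flush them in one request per interval
var pendingClicks = [];

$('#counter').on('click', function () {
  pendingClicks.push({ at: Date.now() });
});

setInterval(function () {
  if (pendingClicks.length === 0) return;
  var batch = pendingClicks.splice(0);  // take everything, clearing the cache
  $.post('update.php', { clicks: JSON.stringify(batch) }, null, 'json')
    .fail(function () {
      // sending failed: put the batch back so it is retried on the next tick
      pendingClicks = batch.concat(pendingClicks);
    });
}, 10000);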

Is making AJAX requests every 2 seconds doable?

I have to build a results page that shows the results of a voting process live. I am planning to check the DB for changes every 2 seconds. Is that doable? Will the page crash or get stuck after 1 or 2 hours?
I think you can use setTimeout to call the AJAX function; a sketch follows.
A second, better way, if it is possible in your existing architectural setup, is to open a socket.io connection and listen for an event; that event is nothing other than a change in the DB, emitted by a Node.js server.
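A minimal sketch of the setTimeout approach (results.php and renderResults are placeholder names, not from the original post):

function pollResults() {
  $.getJSON('results.php', function (data) {
    renderResults(data);           // placeholder: update the page with new results
  }).always(function () {
    setTimeout(pollResults, 2000); // schedule the next poll only when this one is done
  });
}

pollResults();

Unlike setInterval, this recursive setTimeout pattern never lets requests overlap when the server responds slowly, which is exactly the kind of pile-up that makes a long-running page degrade after hours.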

How to make an auto-updating (AJAX) counter correctly? Or how to disable the network log?

I'm trying to make an auto-reloading counter (for example: Messages [num]).
So I just fetch JSON from test_ajax.php in a setTimeout() loop. I think this is not the right way..
Can the server push the info to me instead (I think not, but maybe there is something I don't know..)?
Why I think it's not right: when I look at the Chrome network log (F12 -> Network tab), I see a lot of requests (to test_ajax.php), but when I visit vk.com (a great example of AJAX) or facebook.com, I don't see any requests until something actually changes.
So, what is incorrect in my solution (or what is bad about it)?
UPD: Sorry, vk.com does send requests, to q%NUM%.queue.vk.com, every 25 s, but until then the last request's status is "Pending". When someone sends me a message, for example, it is displayed immediately. And the request has a parameter "wait" which equals 25. So this delay is implemented on the server side.. But how?
An AJAX counter can be done easily; just include the files below:
index.html
counter.php (the AJAX file)
the necessary images
a JS file (for the jQuery polling call)
download link: https://docs.google.com/open?id=0B5dn0M5-kgfDcE0tOVBPMkg2bHc
What you are looking for is called COMET (also sometimes called Reverse Ajax).
Doing what you are doing now, i.e. regular polling, is one way of doing it.
A lot actually happens on the server side: to avoid recreating a new connection on every poll, some servlet containers like Jetty implement techniques such as Continuations, which basically keep a two-way connection open.
In the Java world, with Servlet 3, asynchronous calls are part of the spec.
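On the client side, long polling (the technique vk.com's "wait" parameter hints at) can look like the sketch below. It assumes a hypothetical endpoint queue.php that holds the request open for up to 25 seconds and responds early as soon as there is a new event; updateCounter is a placeholder:

function longPoll() {
  $.getJSON('queue.php', { wait: 25 }, function (events) {
    updateCounter(events);  // placeholder: apply the new events to the page
  }).always(function () {
    longPoll();             // reconnect immediately; the server does the waiting
  });
}

longPoll();

This is why the network tab shows one request sitting in "Pending": the server simply does not answer until it has something to say or the timeout expires.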
