I have tested this, and it seems that even if the page unloads, the async action completes regardless of whether the tab is closed or navigates to a new URL. The server it's calling, however, is pretty snappy.
What happens behind the scenes when async requests are running and tabs are closed? At what point will the browser just call it quits on an async process that the page has started?
You cannot rely on the behavior of async calls made from within onbeforeunload and onunload handlers; it can differ from one server configuration to another.
We had an app that ran on an Apache server: Windows in our dev environment and Unix in production. It turned out that when the server was configured to handle requests in threads (the default on our Windows/dev boxes), the Ajax would always complete; when it was configured to handle requests in processes (the default in our Unix/prod environment), it would always cancel!
What happens is that the Ajax request fires, and then the browser unloads the page, which closes the connection for the Ajax reply. We set up a test where the Ajax call would execute a 4-second "sleep" on the server to avoid any timing issue. It looked like with a threaded back end, Apache did not notice the client closing the connection until after it returned from the "sleep", whereas with the child-process back end, it aborted the Ajax call immediately.
The answer is to use synchronous requests in your on[before]unload event handler. That will keep the page open until the request completes. That does mean an additional delay when the page switches/reloads/goes back...
...and you also have no guarantee that the next page will see the results of that request; it seems that some browsers will GET the next page before firing the onunload event! But that's a topic for another question.
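For illustration, here is a minimal sketch of that pattern. The endpoint and payload are assumptions, not from the original app, and note that browsers have since deprecated synchronous XHR on the main thread, so treat this as a demonstration rather than a recommendation:

window.onbeforeunload = function () {
    var xhr = new XMLHttpRequest();
    // The third argument `false` makes the request synchronous, so the
    // page stays alive until the server responds.
    xhr.open('POST', '/log-unload', false); // '/log-unload' is a hypothetical endpoint
    xhr.setRequestHeader('Content-Type', 'application/json');
    xhr.send(JSON.stringify({ event: 'unload' }));
};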
I'm using Aurelia and have multiple promises calling my API. If I repeatedly click the button that fires these promises, the back end will eventually time out. How can I stop/halt the promises I am firing and keep just the newest one, so as not to overwork the API and cause a timeout?
This is not specific to aurelia, it happens with any asynchronous event implementation.
When you click on a button, the event handler is called.
The event handler sends an asynchronous AJAX request to your server. Because it's async, your code and your application continues to run while the request is happening.
The user can click the button again, and a second AJAX request can be sent at the same time, assuming the clicks come faster than the requests complete.
To make it worse, concurrent requests may even complete out of order (i.e., a later request completes before an earlier one). Your code should be prepared to handle that properly.
If you don't want this behaviour, it is up to you to prevent the user from submitting again until the AJAX request completes. For example you could:
Disable the button when sending the request; or
Display a modal "loading" screen / spinner to prevent any interaction with the application while the request is in flight.
Note that giving user feedback is good UX anyway. A network request might be delayed for a whole lot of reasons, and it is a good idea to give some feedback to let the user know that something is happening.
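If you specifically want only the newest request to win, a minimal sketch (not Aurelia-specific; '/api/data' and render are placeholder names) is to tag each request with a sequence number and drop stale responses:

var latestRequest = 0;

function loadData() {
    var requestId = ++latestRequest; // tag this request
    fetch('/api/data')
        .then(function (response) { return response.json(); })
        .then(function (data) {
            if (requestId !== latestRequest) {
                return; // a newer click superseded this request; ignore the result
            }
            render(data); // `render` is a hypothetical UI update function
        });
}

Note that this only ignores stale responses; it does not stop the requests themselves, so pairing it with a disabled button or a spinner is still the way to avoid overloading the API.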
I have a problem with the Chrome scrollbar; on Mozilla there is no such problem.
I have a couple of synchronous AJAX requests and then some info is appended to the page; they need about 2 seconds to load. During this time the scrollbar freezes and is unusable; when the AJAX calls end, the scroll works fine.
Javascript is completely single-threaded.
If you make multiple AJAX calls, you will receive each response as soon as the server sends it; the order depends on the amount of time that it takes the server to send each reply.
If your code is still running when the server replies, the reply will only be processed after your code finishes.
You should try to load all of the data in a single request.
Source: SLaks
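As a rough sketch of that advice (the endpoint names here are hypothetical), one combined request avoids any dependence on the order in which separate replies arrive:

// Three separate calls can complete in any order...
$.ajax({ url: '/api/user' });
$.ajax({ url: '/api/settings' });
$.ajax({ url: '/api/messages' });

// ...whereas a single combined request delivers everything together.
// '/api/bootstrap' is a hypothetical endpoint returning all three.
$.ajax({
    url: '/api/bootstrap',
    success: function (data) {
        // data.user, data.settings and data.messages arrive in one reply
    }
});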
I might hit a wall here: in other programming languages you can start a new thread for connections and things like that, and if you run multiple threads (kinda like layers) your interface stays the way it is/was.
When you use synchronous AJAX, the page stops until the AJAX call finishes, so if you want the page not to stop, the call must be asynchronous.
You can see more here:
AJAX documentation on W3Schools
The problem you are describing is not a browser behavior problem.
When you make a synchronous request, the code blocks while waiting for the response. Since JavaScript is a single-threaded language (let's ignore web workers for now), UI processing and manipulation are blocked as well, and this is why the browser and its scrollbar get "stuck".
The reason it works in Firefox is that synchronous calls are deprecated there (precisely because of this "stuck" behavior), so what you are actually making is an asynchronous request.
You can read more about it here: https://developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequest/Synchronous_and_Asynchronous_Requests
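To see the difference concretely, here is a bare XMLHttpRequest sketch of both modes ('/slow-endpoint' is a placeholder URL):

// Synchronous: open()'s third argument is false, so the whole tab,
// scrollbar included, freezes until the server responds.
var syncXhr = new XMLHttpRequest();
syncXhr.open('GET', '/slow-endpoint', false);
syncXhr.send();

// Asynchronous: the request runs in the background, the UI stays
// responsive, and the handler fires when the response arrives.
var asyncXhr = new XMLHttpRequest();
asyncXhr.open('GET', '/slow-endpoint', true);
asyncXhr.onload = function () {
    console.log(asyncXhr.responseText);
};
asyncXhr.send();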
In my webpage I have many on-load AJAX calls, and those work fine.
Each action takes as long as its own processing; whichever action completes first sends its response, without waiting for the others to finish.
But if I try the same on-click or on any other user event, all the AJAX calls run one after another. I want them not to wait for the first running action to finish; all should start and complete independently.
I am using jQuery 1.8, where the default is async: true.
Please help me here to resolve this.
The issue may be due to a session lock.
You can find more detail here:
http://php.net/manual/en/function.session-write-close.php
Call the session_write_close() function once there is no more session write activity in your script or function.
Such issues are observed with many concurrent AJAX calls when an earlier call has session write activity; in that case the session stays locked until that call completes its execution.
If you set async to true, they will all fire at the same time.
Explicitly declare the async: true option in each AJAX call.
Useful jsfiddle link
http://jsfiddle.net/jquerybyexample/tS349/
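A minimal sketch of that (the URLs and handlers are placeholders): with async: true set explicitly, the requests go out concurrently instead of queueing behind each other in the browser.

$.ajax({ url: '/action-one', async: true, success: handleOne }); // fired immediately
$.ajax({ url: '/action-two', async: true, success: handleTwo }); // fired immediately, not queued
// Both requests are now in flight at the same time; whichever response
// arrives first has its success callback run first.

Keep in mind that if the server side locks the session (as described above), the requests can still be serialized on the server even though the browser sent them concurrently.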
I've been searching for any reasonable example of a situation where synchronous AJAX makes sense. I've found this SO question, where the author mentions window.onbeforeunload and claims that "the request would never stop" if the AJAX call were asynchronous.
Can anybody explain this? I mean, window.onbeforeunload is fired when the user wants to close the tab. What would have to be going on for the tab to still be alive, even though somebody clicked to close it? Can somebody give a more specific example?
He didn't say the request will never stop; he said it will never complete. That is, your code has no hope of ever getting the response back, because the execution environment (the window) would disappear from existence before that happened.
The tab will close when window.onbeforeunload exits with a truthy value. Thus, as long as it is running, the page is waiting for the return value, and not closing. This allows a synchronous AJAX to be sent, and for the response to be received and processed. If the request is asynchronous, the code constructs XHR object, then exits, and the page (and your code) goes away.
I have never tested this, but the answerer apparently believes (and I don't think it unreasonable) that the page might not stick around long enough for an async XHR to even be sent, let alone to receive a response. Thus, if you want to be sure the server receives the information that the user closed the page, you want to have the request synchronous.
What would have to be going on for the tab to still be alive, even though somebody clicked to close it? Can somebody give a more specific example?
Sending a synchronous XMLHttpRequest on unload is the only way to guarantee delivery of session data when a user-agent unloads the page (and may never re-visit your site again). There are two specific cases for this:
Tracking - Tracking and reporting the total session time for a user's page visit.
Batching - Coalescing and deferring delivery of batched session data to reduce the number of server requests.
The Beacon spec (navigator.sendBeacon) was designed to optimize this specific case, making it possible to send asynchronous requests guaranteed to still complete even after the page unloads.
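A minimal sketch of the Beacon approach (the '/analytics' endpoint is a placeholder): the browser queues the data and delivers it in the background, without blocking unload the way a synchronous XHR does.

var startTime = Date.now(); // recorded when the page loads

window.addEventListener('unload', function () {
    var payload = JSON.stringify({ sessionTime: Date.now() - startTime });
    // sendBeacon queues the data for background delivery; it returns
    // false if the browser could not queue it (e.g. the payload is too large).
    navigator.sendBeacon('/analytics', payload);
});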
Is it possible to stop all the requests/scripts that are happening on a page, when the refresh button is hit?
I have a $.ajax request that takes a few minutes. I have tried window.stop() in the unload event; I have tried xhr = $.ajax(...); with xhr.abort(); and every other method I found on the internet, and nothing works.
Is this even possible? Why does the browser still wait for that request to finish and send back a response instead of refreshing the page?
LATER EDIT (SOLVED): it seems that the waiting problem actually comes from the server. If the script behind the AJAX call uses the SESSION, then the web page will freeze until the request is finished, even if we abort the xhr.
Why is that? Explanation:
Session data is usually stored after your script terminated without the need to call session_write_close(), but as session data is locked to prevent concurrent writes only one script may operate on a session at any time. When using framesets together with sessions you will experience the frames loading one by one due to this locking. You can reduce the time needed to load all the frames by ending the session as soon as all changes to session variables are done.
Maybe try canceling it right before the refresh, as shown here?
window.onbeforeunload = function (event) {
    // `xhr` is assumed to hold the jqXHR object returned by $.ajax(...)
    xhr.abort(); // cancel the in-flight request before the page unloads
};