jQuery: How to call a function before window unload - javascript

I want to call a function before the window is closed.
For example, I want to make an Ajax call before the page closes. I am using the following code, but it's not working:
$(window).bind('beforeunload', function(){
    $.ajax({
        url: 'logout.php',
        type: 'POST'
    });
});
What is the best way to call a function before the window closes?

Your code is most likely being run, but because it starts an asynchronous Ajax request and the browser tears down the page immediately after firing onbeforeunload, the request probably never gets sent (or, if it does get sent, is probably aborted).
The event handler for onbeforeunload is only allowed to do a very limited set of things (the list varies from browser to browser, and it changes frequently as browsers update). For that reason, using onbeforeunload for anything other than the one purpose for which it was intended, giving you a last-ditch chance to warn the user about losing information by leaving the page, is not a good idea.
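For illustration, a minimal sketch of that one intended use, warning the user about unsaved changes (hasUnsavedChanges is a hypothetical flag your app would maintain; modern browsers ignore the string and show their own generic message):
window.onbeforeunload = function (event) {
    if (hasUnsavedChanges) {
        var message = 'You have unsaved changes that will be lost.';
        event.returnValue = message; // legacy mechanism, still checked by some browsers
        return message;              // older browsers display this string
    }
};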

The code you have written just makes an Ajax call, but the window may already be gone before the call can even be made. And even if the request is sent, it may be aborted mid-flight.

Try this code:
function WinClose() {
    $.ajax({
        url: 'logout.php',
        type: 'POST'
    });
    return false; // note: returning a value from onbeforeunload may trigger a leave-confirmation prompt in some browsers
}
window.onbeforeunload = WinClose;
Place this code in the <head> of the page.

Related

Are XHR requests created in an anchor tags click handler guaranteed to be sent?

I'm sure this information exists out there, but my googling abilities are failing me here.
What I want to know is: if I trigger an XHR request in an anchor tag's click event handler, is that request guaranteed to be sent? I don't care about handling the response, but I don't know whether it's possible for the request to never leave the unsent state before navigating away.
For example, say I want to track all anchor tag clicks so I do something like this:
$("a").on("click", () => $.ajax({ url: "/track_link_click", method: "POST" }));
Is there a situation where the request in the event handler never gets sent? I could prevent the location change from taking place until the response is received, but I'd much rather not.
Thanks in advance!
There's no guarantee, by design, but in practice I'm fairly sure all browsers will at least get the request sent.
I suggest you look into navigator.sendBeacon() which is designed for exactly this type of thing. Its purpose is to guarantee that the request gets sent even if it has to happen after the page unloads.
https://developer.mozilla.org/en-US/docs/Web/API/Navigator/sendBeacon
It's not supported in IE, and at the time of writing not in Safari, so you'll need to fall back to the regular Ajax method where it's unavailable.
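A sketch of that combination, feature-detecting sendBeacon and falling back to a plain Ajax POST (the /track_link_click endpoint is the one from the question; the fallback is best-effort and may still be aborted on navigation):
$("a").on("click", function () {
    var url = "/track_link_click";
    if (navigator.sendBeacon) {
        // The browser queues the request and sends it even if the page unloads.
        navigator.sendBeacon(url);
    } else {
        // Best-effort fallback; the request may be cut off by the navigation.
        $.ajax({ url: url, method: "POST" });
    }
});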

JS BeforeUnload event doesn't fire sometimes

I need to make an async HTTP POST to an external URL when the user closes the browser or navigates to a different page.
So, I attached a JS handler to the beforeunload event.
Sometimes it gets triggered and sometimes it doesn't. I'm testing in Chrome. Whenever I try to debug the script via the inspector, it always works fine.
I must use plain JavaScript only (no external libraries like jQuery, etc.).
It fires only if there has been SOME interaction between the user and the page. Without ANY interaction event (e.g. a click; hovering doesn't count), onbeforeunload won't be fired.
You can't rely on async operations in the beforeunload or unload events. For your script to work, you have to make the request synchronous; with jQuery that would be as easy as adding the async: false property to your $.ajax() options (or the equivalent if you are using a different approach). It probably works while you are debugging because the debugger postpones the disposal of the window.
I've also read that, in order for the handler to take effect, you need to return something at the end of the event; usually, if you return a string, that text is displayed in a popup. You can find more useful info here: window.onbeforeunload not working
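A minimal sketch of that synchronous approach with a raw XMLHttpRequest, since the question rules out jQuery (https://example.com/track is a placeholder URL; note that synchronous XHR during unload is deprecated and increasingly blocked by current browsers):
window.onbeforeunload = function () {
    var xhr = new XMLHttpRequest();
    // The third argument (false) makes the request synchronous,
    // so the unload is held up until the request completes.
    xhr.open('POST', 'https://example.com/track', false);
    xhr.send('event=page_close');
};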

Using beforeunload to execute task before page exit

I'm trying to get an action executed before the user leaves the page; this requires a GET request to be made.
I would like to give the GET request at least 50-100 ms so that it actually reaches the web server.
I'm capturing the beforeunload action using:
$(window).bind('beforeunload', function() {
    send_data();
});
Any tips?
You have the highest chance of getting your request sent if you make it synchronous (async: false in the $.ajax() options if you are using jQuery).
But you cannot be sure it always gets through.
You should prevent the default action with event.preventDefault().
Note that not every browser supports beforeunload.
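Alternatively, a sketch of send_data() built on navigator.sendBeacon() (suggested in an answer above), which is designed to survive the page being torn down; /collect is a placeholder endpoint:
function send_data() {
    // sendBeacon hands the request to the browser, which delivers it
    // even after the page unloads; no response is available to the page.
    var payload = JSON.stringify({ event: 'page_exit', time: Date.now() });
    navigator.sendBeacon('/collect', payload);
}

$(window).bind('beforeunload', function() {
    send_data();
});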

Javascript: wait until ajax request finishes to close page [duplicate]

This question already has answers here:
JavaScript, browsers, window close - send an AJAX request or run a script on window closing
(9 answers)
Closed 6 years ago.
I would like the browser to keep the page open until the Ajax requests are sent. This is what I imagine it would look like:
var requestsPending = 0;

window.onbeforeunload = function() {
    showPleaseWaitMessage();
    while (requestsPending > 0);
};

// called before making ajax request, atomic somehow
function ajaxStarted() {
    requestsPending++;
}

// called when ajax finishes, also atomic
function ajaxFinished() {
    requestsPending--;
}
Unfortunately, JS doesn't do multi-threading. To my understanding, the callback (ajaxFinished) would never be executed, because the browser would wait until the while loop finishes before running it, and so it would loop forever.
What's the right way to do this? Is there maybe a way to force JS to evaluate the next thing in its to-do list and then come back to the while loop? Or some syntax to "join" with an ajax call? I'm using DWR for my ajax.
Thanks,
-Max
Edit: Based on your comment below, a revised answer:
If you want to block until a previously-initiated request completes, you can do it like this:
window.onbeforeunload = function(event) {
    var s;
    event = event || window.event;
    if (requestsPending > 0) {
        s = "Your most recent changes are still being saved. " +
            "If you close the window now, they may not be saved.";
        event.returnValue = s;
        return s;
    }
};
The browser will then prompt the user to ask whether they want to leave the page or stay on it, putting them in control. If they stay on the page and the request has completed while the prompt was up, the next time they go to close the page, it'll let them close it without asking.
Note that on modern browsers, your message will not be shown; instead, the browser will use a generic message. So on modern browsers, returning any non-blank string is sufficient. Still, you may want to return a useful string (such as the above) in case your user is using an obsolete browser that will still show it.
More on asking the user whether to cancel close events here and here.
Old answer:
Ideally, if possible, you want to avoid doing this. :-)
If you can't avoid it, it's possible to make an Ajax request synchronous, so that it blocks the onbeforeunload process until it completes. I don't know DWR, but I expect it has a flag to control whether the request is synchronous or not. In the raw XMLHttpRequest API, this is the third parameter to open:
req.open('GET', 'http://www.mozilla.org/', false); // third argument false = synchronous
Most libraries will have an equivalent. For instance, in Prototype, it's the asynchronous: false flag in the options.
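For example, a one-line sketch of the Prototype equivalent (from memory; check the docs for your Prototype version):
new Ajax.Request('http://www.mozilla.org/', { method: 'get', asynchronous: false });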
But again, if you can possibly avoid firing off Ajax requests as part of the page unload, I would. There will be a noticeable delay while the request is set up, transmitted, and completed. Much better to have the server use a timeout to close down whatever it is that you're trying to close down. (It can be a fairly short timeout; you can keep the session alive by sending asynchronous Ajax requests periodically while the page is open, say once a minute, and have the server time out after two minutes.)
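A sketch of that keep-alive idea, assuming a hypothetical /keepalive endpoint and a two-minute session timeout on the server:
// Ping the server once a minute while the page is open; when the user
// leaves, the pings stop and the server times the session out by itself.
setInterval(function () {
    var req = new XMLHttpRequest();
    req.open('GET', '/keepalive', true); // asynchronous, fire-and-forget
    req.send();
}, 60 * 1000);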
In short, you cannot (and shouldn't) do this. If a user closes the browser, it's closing; no unload-style events are guaranteed to finish, and anything involving the latency of an Ajax call is even less likely to finish.
You should look at firing your events at another point, or change the approach altogether; making an Ajax call in an unload event is going to be unreliable at best.
As an addendum to the above on the "shouldn't" part, think about it this way: how many tabs do you usually have open in any given window? I typically have 4-6 Chrome windows open with 5-12 tabs each. Should my browser window hang open because one of those tabs wants to make some Ajax request I don't care about? I wouldn't want it to as a user, so I wouldn't try to do it as a developer. This is just an opinion, of course, but food for thought.

JavaScript/jQuery: How to make sure cross-domain click tracking event succeeds before the user leaves the page?

I'm implementing click tracking from various pages in our corporate intranet in order to add some sorely needed crowd-sourced popular link features ("most popular links in your department in the last 24 hours", etc.)
I'm using jQuery's .live() to bind to the mousedown event for all link elements on the page, filter the event, and then fire off a pseudo-ajax request with various data to a back-end server before returning true so that the link action fires:
$("#contentarea a").live("mousedown", function(ev) {
    //
    // detect event, find closest link, process it here
    //
    $.ajax({
        url: 'my-url',
        cache: false,
        dataType: 'jsonp',
        jsonp: 'cb',
        data: myDataString,
        success: function() {
            // silence is golden -- server does send success JSONP but
            // regardless of success or failure, we allow the user to continue
        }
    });
    return true; // allow event to continue, user leaves the page.
});
As you can probably guess from the above, I have several constraints:
The back-end tracking server is on a different sub-domain from the calling page. I can't get round this. That's why I am using JSONP (and GET) as opposed to proper AJAX with POST. I can't implement an AJAX proxy as the web servers do not have outbound network access for scripts.
This is probably not relevant, but in the interest of full disclosure, the content and script are inside a "main content" iframe (and this is not going to change; I will likely eventually move the event listener to the parent frame to monitor its links and all child content, but step 1 is getting it to work properly in the simplified case of "one child window"). Parent and child are on the same domain.
The back-end is IIS/ASP (again, a constraint -- don't ask!), so I can't immediately fork the back-end process or otherwise terminate the response but keep processing, as I could on a better platform.
Despite all this, for the most part, the system works -- I click links on the page, and they appear in the database pretty seamlessly.
However, it isn't reliable: for a large number of links, particularly off-site links that have their target set to "_top", the clicks don't appear. If the link is opened in a new tab or window, it registers OK.
I have ruled out script errors -- it seems that either:
(a) the request is never making it to the back-end in time; or
(b) the request is making it, but ASP is detecting that the client is disconnecting shortly afterwards, and as it is a GET request, is not processing it.
I suspect (b), since latency to the server is very fast and many links register OK. If I put in an alert pop-up after the event fires, or set the return value to false, the click is registered OK.
Any advice on how I can solve this (in the context that I cannot change my constraints)? I can't make the GET request synchronous as it is not true AJAX.
Q: Would it work better if I was making a POST request to ASP? If (b) is the culprit, would it behave differently for POST vs. GET? If so, I could use a hidden iframe/form to POST the data. However, I suspect this would be slower and clunkier, and might still not make it in time. I wouldn't be able to tell whether the request completes, because it is cross-domain.
Q: Can I just add a delay to the script after the GET request is fired off? How do I do this in a single-threaded way? I need to return true from my function, to ensure the default event eventually fires, so I can't use setTimeout(). Would a tight loop waiting for 'success' to fire and set some variable work? I'm worried that this would freeze up things too much and the response would be slowed down. I assume the jQuery delay() plugin is just a loop too?
Or is something else I haven't thought of likely to be the culprit?
I don't need bullet-proof reliability. If all links are equally catchable 95% of the time it is fine. However right now, some links are catchable 100% of the time, while others are uncatchable -- which isn't going to cut it for what I want to achieve.
Thanks in advance.
I would try a different approach. You can bind to a different event like:
$(window).unload(function(event) {
    // tracking code here
});
I would try returning false from the link event handler, remembering the URL, and navigating away only when the JSONP request succeeds. Hopefully it shouldn't add too much latency. Considering you are on an intranet, it might be OK.
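A sketch of that approach, reusing the question's handler (the 300 ms fallback timer is an added safety net, not part of the original suggestion, so a slow or lost JSONP call can't strand the user):
$("#contentarea a").live("click", function () {
    var href = this.href;
    var navigate = function () { window.location = href; };

    // Safety net: navigate anyway if the tracking call is slow or lost.
    var fallback = setTimeout(navigate, 300);

    $.ajax({
        url: 'my-url',
        cache: false,
        dataType: 'jsonp',
        jsonp: 'cb',
        data: myDataString,
        success: function () {
            clearTimeout(fallback);
            navigate();
        }
    });

    return false; // cancel the default navigation for now
});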
Solved!
The short answer is: there is no reliable way to do this cross-domain with a GET request. I tried all sorts, including storing the event and trying to replay the event later, and all manner of hacks to try to get that to work.
I then tried tight loops, and they weren't reliable either.
Finally, I just gave in and used a dynamically created form that POSTed the results, with its target set to a hidden iframe.
That works reliably -- it seems the browser pauses to finish its POST request before moving on, and ASP honours the POST. Turns out it's not 'clunky' at all. Sure, due to the browser security model I can't see the result... but it doesn't matter in this case.
I am now kicking myself that I didn't try that option first.
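For reference, a minimal sketch of that hidden-iframe form POST (the element names and the /track URL are illustrative, not from the original solution):
// Create a hidden iframe to swallow the response.
var sink = $('<iframe name="track-sink" style="display:none"></iframe>')
    .appendTo('body');

// Build a form targeting the iframe and submit it; the browser
// completes the POST even as the main window navigates away.
var form = $('<form method="POST" action="/track" target="track-sink"></form>')
    .append($('<input type="hidden" name="data">').val(myDataString))
    .appendTo('body');

form[0].submit(); // native submit, bypasses jQuery submit handlers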
