Capturing onbeforeunload when page is contained in iframe - javascript

I've set up a fiddle to try to verify whether the beforeunload event is triggered when the page is used in an iframe.
Since the fiddle shows its result in an iframe, I figured it would be easy to verify by just closing the page. I've set up a request bin at pipedream just to see if any requests get sent, but it doesn't seem to trigger in Chrome.
window.onbeforeunload = function() {
    fetch('https://eoaczcjrpegb7wv.m.pipedream.net');
};
Is it possible to use this event from an iframe, or do I need to look into a different approach?
After a closer look it seems to capture some of the requests. Is this prone to race conditions? If so, are there any more robust alternatives?

Yes, there is a race condition. Since your event handler does nothing to stop the page unload in any way (it does not even trigger a confirmation prompt to delay it), immediately after your event handler is processed, the browser will proceed to unload the page. This aborts most pending requests; if a request did not manage to be submitted at that point, it will not be sent to the server at all.
Sending a request from a beforeunload event handler is a poor idea anyway. For starters, you are not even guaranteed that the event will fire at all; the browser may be unable or unwilling to trigger it. MDN warns that it will only fire after the user has interacted with the page, and that it may fail when the session is terminated outside the browser's control. The only legitimate purpose of beforeunload is to check whether the page contains any unsaved state that the user might lose, and to trigger a confirmation prompt; even that should be understood to work on a best-effort basis. Anything else is prone to abuse and suspect; I would not be surprised if a future browser plug-in or vendor 'intervention' were to block all web requests when a page is about to unload.
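For reference, that legitimate use looks roughly like the sketch below (hasUnsavedChanges is a hypothetical app-specific check; modern browsers show their own generic prompt and ignore any custom text):
document.addEventListener('beforeunload', function (event) {
    if (!hasUnsavedChanges()) return;   // hypothetical check for unsaved state
    event.preventDefault();             // standard way to request the confirmation prompt
    event.returnValue = '';             // legacy equivalent, still needed by some browsers
});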
However, if you insist, there are ways to make the request survive unload. You can use navigator.sendBeacon:
window.onbeforeunload = function () {
    navigator.sendBeacon('https://example.net', '');
};
or the keepalive fetching option:
window.onbeforeunload = function () {
    fetch('https://example.net', { keepalive: true });
};
Chrome provides both, but Firefox, as of version 106, only implements the former. Using those APIs comes with some restrictions: at any given moment, the total amount of data sent by active keep-alive requests must fit within 64 KiB, as per the Fetch specification.
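If you want to cope with that quota, navigator.sendBeacon returns a boolean telling you whether the data was successfully queued, so you can at least detect a refusal. A sketch (the payload and URL are illustrative):
window.onbeforeunload = function () {
    var payload = JSON.stringify({ event: 'unload' });   // illustrative payload
    if (!navigator.sendBeacon('https://example.net', payload)) {
        // The beacon was not queued (for example, the keep-alive quota is full);
        // there is little that can reliably be done at this point.
        console.warn('sendBeacon refused the payload');
    }
};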
Note that using those APIs from a beforeunload handler is still not recommended usage: merely registering the handler can keep the page out of the back/forward cache, which worsens the performance of navigating back to it with the back button. MDN suggests listening for the visibilitychange event instead, but that is of course not the same thing.
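A sketch of that visibilitychange alternative (it fires whenever the tab becomes hidden, not only on unload, so it reports more than a strict unload handler would):
document.addEventListener('visibilitychange', function () {
    if (document.visibilityState === 'hidden') {
        // Flush whatever needs reporting; this fires on tab switch,
        // minimize, and usually before unload as well.
        navigator.sendBeacon('https://example.net', '');
    }
});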
Last but not least, nothing stops the user from having the browser lie to you that the request has been sent, like with this uBlock filter:
##+js(no-fetch-if, keepalive:true)
##+js(set, navigator.sendBeacon, trueFunc)
So try not to be too obnoxious with your spyware ‘analytics’.

Related

JS BeforeUnload event doesn't fire sometimes

I need to do an async HTTP POST to an external URL when the user closes the browser or navigates to a different page.
So, I attached a JS handler to the BeforeUnload event.
Sometimes it gets triggered and sometimes it doesn't. I'm checking in Chrome. Whenever I try to debug the script via the inspector, it always works fine.
I must use JavaScript only (no external library like jQuery etc.).
It is fired only if there was ANY interaction of the user with the site. Without ANY interaction event (e.g. a click, not a hover), onbeforeunload won't be fired.
You can't use async operations in the beforeunload or unload events. In order for your script to work you have to make it synchronous, which would be as easy as adding the async: false property to your ajax call (or the equivalent if you are using a different approach). It probably works while debugging because the debugger postpones the disposal of the window. I've also read that in order for the handler to do anything you need to return something at the end of the event; usually, if you return a string, that text is displayed in a confirmation popup. You can find more useful info here: window.onbeforeunload not working
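For illustration only, here is a sketch of the synchronous request that answer describes, using a plain XMLHttpRequest (the URL and payload are placeholders; note that modern browsers increasingly disallow synchronous requests during unload, so the sendBeacon/keepalive approaches above are more reliable):
window.onbeforeunload = function () {
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/track', false);   // third argument false = synchronous
    xhr.setRequestHeader('Content-Type', 'application/json');
    xhr.send(JSON.stringify({ event: 'unload' }));   // blocks until the request finishes
};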

Correct way to use onbeforeunload?

I am trying to post data when a user leaves my page. This works in Chrome and Internet Explorer, and also Firefox; however, on Firefox it strangely only works if the user closes the page. If they go back, or go to another site (by typing in the address bar or whatever), it doesn't post the data. My question is, what is the correct way to use onbeforeunload to post data?
$(window).bind('beforeunload', function () {
    $.post("track.php", {
        async: false,   // note: inside $.post this is sent as a form field, not treated as an ajax option
        ip: ip,
        referer: referer,
        clicks: kactane2,
        scrolls: kactane,
        time: time,
        refid: refid,
        country: country
    });
});
There isn't a good way because that is not how onbeforeunload was meant to be used.
The correct way to use onbeforeunload is to listen for this event and then unload any data or resources you might be using because the user is leaving the page. You should not use it to try to start new things. According to the HTML5 specification showModalDialog(), alert(), confirm() and prompt() are explicitly not allowed and the idea is to give you a moment to clean up any event handlers, web workers and other stuff cleanly.
If an event handler is defined, the user may be presented with a dialog that asks "Are you sure you want to leave?", but for security reasons the prompt generally cannot be customized; the exact behaviour depends on the browser.
You will probably be better off setting the data in a cookie, or doing something else that is quick and happens purely in the browser, and then looking for that data on the next page load.
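A rough sketch of that cookie approach (the cookie name, payload and /track endpoint are made up for illustration):
window.onbeforeunload = function () {
    // Record the pending data locally; cheap and purely in-browser.
    document.cookie = 'pendingTracking=' + encodeURIComponent(JSON.stringify({ left: Date.now() })) + '; path=/';
};

window.onload = function () {
    var match = document.cookie.match(/(?:^|; )pendingTracking=([^;]*)/);
    if (match) {
        // Send it now that we have a full page lifetime to work with, then clear the cookie.
        fetch('/track', { method: 'POST', body: decodeURIComponent(match[1]) });
        document.cookie = 'pendingTracking=; expires=Thu, 01 Jan 1970 00:00:00 GMT; path=/';
    }
};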

Evil Firefox Error -- "A parameter or an operation is not supported by the underlying object"

I'm trying to figure out what is going on here. I've been at it for hours now and can't seem to get a grip on why this is happening.
I'm making a few AJAX calls, and I keep getting this error back only in Firefox (version 21) on Mac OS X.
Here is the error:
"[Exception... "A parameter or an operation is not supported by the underlying object"
code: "15" nsresult: "0x8053000f (InvalidAccessError)" location:
"https://ajax.googleapis.com/ajax/libs/jquery/2.0.0/jquery.min.js Line: 6"
I'm making a CORS call, so I set up my AJAX like so:
$.ajaxSetup({
    crossDomain: true,
    xhrFields: {
        withCredentials: true
    }
});
And continue calls henceforth. Basically, does anyone out there have ANY experience with this error? I see some posts online but they all seem to do with Cross-Domain CSS, which I'm not using.
Okay, so after hours of testing (and great discussion from #Dave and #danronmoon), I've finally figured out what's going on.
The CORS (Cross-Origin Resource Sharing) calls I was making were set to 'async: false' (which I did not include in my original post, as I thought it was inconsequential). This seems to operate fine in all browsers except Firefox, where jQuery will bark at you and your ajax call will fail.
Thank you all for your help and I hope this helps someone else!
Since this is the first duckduckgo result for InvalidAccessError: A parameter or an operation is not supported by the underlying object, I will add another source for this.
If you run into this error when doing iframe/window actions, then you're probably being blocked by the iframe's sandbox attribute (see https://html.spec.whatwg.org/multipage/iframe-embed-object.html#attr-iframe-sandbox ), even when on the same origin.
In my case, an iframe was trying to do a window.top.location.href = ... after a form submission success. The allow-top-navigation sandbox option is mandatory to do so.
Funny thing, this sandbox option is not mandatory to reload the top browsing context... it's only required for navigating in it.
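As a sketch, this is the kind of sandbox token list the embedding page would need for that redirect to be allowed (the src and the exact tokens beyond allow-top-navigation are illustrative):
var frame = document.createElement('iframe');
frame.src = 'https://example.net/form';   // placeholder inner page
// allow-scripts lets the inner page run JS at all;
// allow-top-navigation lets it set window.top.location.href.
frame.setAttribute('sandbox', 'allow-scripts allow-forms allow-top-navigation');
document.body.appendChild(frame);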
For me, I was using WebSockets and called WebSocket.close(1001). It doesn't like my status code. Changing it to 1000 or not specifying a code (default 1005) works just fine.
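A sketch of that fix; per the WebSocket API, close() called from script only accepts code 1000 or a code in the 3000-4999 range, so 1001 throws InvalidAccessError (the endpoint URL is a placeholder):
var socket = new WebSocket('wss://example.net/socket');   // placeholder endpoint

// socket.close(1001);   // throws InvalidAccessError: only 1000 or 3000-4999 are allowed from script
socket.close(1000);      // normal closure, works
// socket.close();       // also fine; the peer then sees 1005 (no status received)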
This is the real solution, by Diogo Cardoso: the XHR object or its parent seems to lack a toString() method. See CORS synchronous requests not working in firefox.
Yes, it is a CORS problem caused by using ajax. But as user320550 asks, what if you NEED to use the property 'async: false'? I found that using 'withCredentials: false' as a workaround fixes the issue on Firefox and doesn't affect other browsers.
Just want to add a somewhat nasty intermittent variant of Xenos's answer. As he mentioned, you can get this problem if you try and navigate the window by setting window.top.location.href = ... from within a sandboxed iframe, and that this can be prevented if your iframe has the allow-top-navigation option set.
But you might also find your iframe has the more restrictive allow-top-navigation-by-user-activation option. This will allow navigation, but only in response to a user action such as clicking a link or a button. For example, it will be allowed within a form submit event handler, but you can't just trigger it at an arbitrary point in time, such as from a setTimeout() callback with a long delay.
This can be problematic if you are (for example) using AJAX form submission before performing a redirect. The browser needs to decide if the navigation is in response to a user action or not. It does this by only allowing the navigation if it is considered to have happened within an acceptable time period of the user interaction. The HTML standard refers to this as transient activation.
The bottom line is that if your AJAX call is too slow, or if your user has a poor network connection, the navigation will fail. How slow is too slow? I have only tested Firefox, but it appears to allow 5 seconds before it considers the user interaction to have expired.
Possible solutions:
Ask whoever is responsible for the iframe options to upgrade to the blanket allow-top-navigation option
Don't perform async work such as AJAX requests in between user actions and top navigation. For example, use old-school POST form submission directly to the back-end, rather than using an AJAX request
Make sure your responses are as fast as possible. Catch any errors, and prompt the user to click something to trigger the navigation manually. For example:
async function submitForm() {
    await doPotentiallySlowAsyncFormSubmit();
    try {
        window.top.location.href = ...
    } catch (e) {
        // Show message to user, e.g. "Form submitted, click here to go to the next step"
    }
}

JavaScript/jQuery: How to make sure cross-domain click tracking event succeeds before the user leaves the page?

I'm implementing click tracking from various pages in our corporate intranet in order to add some sorely needed crowd-sourced popular link features ("most popular links in your department in the last 24 hours", etc.)
I'm using jQuery's .live() to bind to the mousedown event for all link elements on the page, filter the event, and then fire off a pseudo-ajax request with various data to a back-end server before returning true so that the link action fires:
$("#contentarea a").live("mousedown", function(ev) {
//
// detect event, find closest link, process it here
//
$.ajax({
url: 'my-url',
cache: false,
dataType: 'jsonp',
jsonp: 'cb',
data: myDataString,
success: function() {
// silence is golden -- server does send success JSONP but
// regardless of success or failure, we allow the user to continue
}
});
return true; // allow event to continue, user leaves the page.
}
As you can probably guess from the above, I have several constraints:
The back-end tracking server is on a different sub-domain from the calling page. I can't get round this. That's why I am using JSONP (and GET) as opposed to proper AJAX with POST. I can't implement an AJAX proxy as the web servers do not have outbound network access for scripts.
This is probably not relevant, but in the interest of full disclosure, the content and script are inside a "main content" iframe (and this is not going to change; I will likely eventually move the event listener to the parent frame to monitor its links and all child content, but step 1 is getting it to work properly in the simplified case of one child window). Parent and child are on the same domain.
The back-end is IIS/ASP (again, a constraint -- don't ask!), so I can't immediately fork the back-end process or otherwise terminate the response but keep processing like I could on a better platform
Despite all this, for the most part, the system works -- I click links on the page, and they appear in the database pretty seamlessly.
However it isn't reliable -- for a large number of links, particularly off-site links that have their target set to "_top", they don't appear. If the link is opened in a new tab or window, it registers OK.
I have ruled out script errors -- it seems that either:
(a) the request is never making it to the back-end in time; or
(b) the request is making it, but ASP is detecting that the client is disconnecting shortly afterwards, and as it is a GET request, is not processing it.
I suspect (b), since latency to the server is very fast and many links register OK. If I put in an alert pop-up after the event fires, or set the return value to false, the click is registered OK.
Any advice on how I can solve this (in the context that I cannot change my constraints)? I can't make the GET request synchronous as it is not true AJAX.
Q: Would it work better if I was making a POST request to ASP? If (b) is the culprit, would it behave differently for POST vs GET? If so, I could use a hidden iframe/form to POST the data. However, I suspect this would be slower and clunkier, and might still not make it in time. I wouldn't be able to listen to see if the request completes because it is cross-domain.
Q: Can I just add a delay to the script after the GET request is fired off? How do I do this in a single-threaded way? I need to return true from my function, to ensure the default event eventually fires, so I can't use setTimeout(). Would a tight loop waiting for 'success' to fire and set some variable work? I'm worried that this would freeze up things too much and the response would be slowed down. I assume the jQuery delay() plugin is just a loop too?
Or is something else I haven't thought of likely to be the culprit?
I don't need bullet-proof reliability. If all links are equally catchable 95% of the time it is fine. However right now, some links are catchable 100% of the time, while others are uncatchable -- which isn't going to cut it for what I want to achieve.
Thanks in advance.
I would try a different approach. You can bind to a different event like:
$(window).unload(function(event) {
    // tracking code here
});
I would try to return false from the link event handler, remember the URL, and navigate away only when the JSONP request succeeds. Hopefully it shouldn't add too much latency. Considering you are on the intranet, it might be OK.
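A sketch of that idea, reusing the selector, URL and data from the question (binding to click rather than mousedown here, since cancelling mousedown would not stop the navigation):
$("#contentarea a").live("click", function (ev) {
    var href = this.href;   // remember where the link was going
    $.ajax({
        url: 'my-url',
        dataType: 'jsonp',
        jsonp: 'cb',
        data: myDataString,
        complete: function () {
            // Navigate only after the tracking call has finished (or failed).
            window.location.href = href;
        }
    });
    return false;   // cancel the default navigation for now
});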
Solved!
The short answer is: there is no reliable way to do this cross-domain with a GET request. I tried all sorts, including storing the event and trying to replay the event later, and all manner of hacks to try to get that to work.
I then tried tight loops, and they weren't reliable either.
Finally, I just gave in and used a dynamically created form that POSTed the results, with the target set to a hidden iFrame.
That works reliably -- it seems the browser pauses to finish its POST request before moving on, and ASP honours the POST. Turns out it's not 'clunky' at all. Sure, due to the browser security model I can't see the result... but it doesn't matter in this case.
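A minimal sketch of that hidden-iframe POST (the field names and the back-end URL are just placeholders):
function trackViaHiddenIframe(data) {
    var frame = $('<iframe name="trackframe" style="display:none"></iframe>').appendTo('body');
    var form = $('<form method="POST" target="trackframe"></form>')
        .attr('action', 'http://tracking.example.net/track.asp');   // placeholder back-end URL
    $.each(data, function (name, value) {
        $('<input type="hidden">').attr('name', name).val(value).appendTo(form);
    });
    form.appendTo('body').submit();   // the browser finishes the POST before moving on
}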
I am now kicking myself that I didn't try that option first.

Javascript: onrefresh or onreload?

I want an event handler that fires when the user hits reload. Is onrefresh or onreload the correct handler to add? Also, will this fire before or after onunload? Are there any browser inconsistencies? Thanks.
I don't think there are events called onrefresh or onreload. You can know when the page is unloading, but knowing why (i.e. where the user is going next) is outside JavaScript's security sandbox. The only way to know whether the page has been reloaded is to know where the user was on the last page request, which is also outside the scope of JavaScript. You can sometimes get that via document.referrer, but it relies on the browser's security settings to permit access to that information.
The WindowEventHandlers.onbeforeunload event handler property contains the code executed when the beforeunload event is sent. This event fires when a window is about to unload its resources.
window.onbeforeunload = function () {
    return 'Are you sure you want to leave?';
};
This will show a confirm dialog to the user with the message you returned from your function, giving them the option to leave the page or cancel.
There is no way around the confirm as it could be used for malicious reasons.
https://developer.mozilla.org/en-US/docs/Web/API/WindowEventHandlers/onbeforeunload
If you combine setting a cookie for the specific page with a check in the onload event, you can simulate the nonexistent event you seek. You might adjust the cookie expiration so that a reload is counted only if the initial onload was a certain time interval ago.
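A sketch of that simulation (the cookie name, the 30-second window and handleReload are arbitrary, illustrative choices):
window.onload = function () {
    var wasHereRecently = /(?:^|; )seen=1/.test(document.cookie);
    if (wasHereRecently) {
        // The same page was loaded less than 30 seconds ago: treat this as a reload.
        handleReload();   // hypothetical app-specific callback
    }
    // (Re)set the marker cookie with a short expiration, scoped to this page.
    document.cookie = 'seen=1; max-age=30; path=' + location.pathname;
};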
There are no onreload or onrefresh events that I'm aware of. Certainly for JavaScript running in a browser this makes little sense: the existing window and all its contents are effectively discarded. Hence you either need to use onunload in the existing context, or the load event of the new context that is created as a result of the reload.
I do believe artlung may have indeed found a way. His version, however, relies on cookies, and those can be cut off from use in numerous ways; the solution, then, is to use a server-side language of your choice to save the timestamp of when the page is unloaded via JavaScript (still a vulnerability, yes, but why not throw another idea out there?) and then test it again upon every page load; if you detect a difference of less than a few seconds, the user probably just reloaded your page. : )
You could use a session: easier than a cookie, and you don't have to worry about expiration or a database. That would cover you for any page except the first one; I don't think the session superglobal is available until the second page. If that's a problem, you could start a session and reload the page immediately if there is no active session.
It's onunload, because when you hit refresh the browser "unloads" and then loads again.
