I'm trying to execute the following code
window.onunload = function () {
    var sTag = document.createElement('script');
    sTag.src = 'mysrc';
    document.getElementsByTagName('head')[0].appendChild(sTag);
    return false;
};
It seems to work fine in Firefox, but in Chrome the request status shows as cancelled as soon as the unload event is fired. I saw some posts on SO with Ajax solutions, but I'm executing this script inside a cross-domain iframe. I'm just trying to log the time for which my API was live on a page, per visitor, so I'm sending some timing information on unload of the page. Is there any workaround for the above?
For the purpose of what you described, you can have that script loaded in your page or parent window ahead of time (you are saying it is an iframe, right?) and just call a function on window.onunload:
window.onunload = function(){
    window.top.logtime(); // if it is in the parent, or
    window.logtime();     // if it is in the same window
};
Also, don't return false; the unload event cannot be cancelled. (At best, the user gets an alert dialog that overrides the return false statement.)
I think what makes the difference here is how quickly the handler finishes before the body gets unloaded: manipulating the DOM and loading a new script is much slower than making a plain function call.
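A minimal sketch of that setup, assuming the parent page is same-origin with the iframe (otherwise window.top.logtime() is blocked); logtime, pageLoadedAt and the /logtime URL are placeholder names, not part of the original code:

// In the parent page
var pageLoadedAt = Date.now();
window.logtime = function () {
    // An image beacon is one simple way to get the data out before the page goes away.
    (new Image()).src = '/logtime?livedFor=' + (Date.now() - pageLoadedAt);
};

// In the iframe
window.onunload = function () {
    window.top.logtime();
};

If the iframe really is cross-domain, the function has to live in the iframe's own window instead, and window.logtime() is the call to use.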
I'm using the Twilio JavaScript SDK to place and receive calls in the browser. As part of this I have a requirement to make and receive calls in a new popup window. This is so that a user can continue browsing the site without disconnecting the call.
I have got this working for outgoing calls (a user clicks a number, and on the back of this I call window.open which initiates the call).
However for incoming calls, I'm attempting to do the following in the initiating browser window:
Twilio.Device.incoming(function (connection) {
    $('#callPopup').show();
    $('.js-answer-call').on('click', function (event) {
        event.preventDefault();
        var receiveCallWindow = window.open('/call/incoming', '', 'width=350,height=200');
        receiveCallWindow.connection = connection;
        $('#callPopup').hide();
    });
    $('.js-reject-call').on('click', function (event) {
        event.preventDefault();
        connection.reject();
        $('#callPopup').hide();
    });
});
This passes the connection object to the popup window, and then the popup window runs the following code:
var connection = window.connection;
$(document).ready(function () {
    connection.accept();
});
This does answer the call; however, the call context is still within the parent window, and if the user navigates away from that page it will end the call.
I understand that I could achieve this via a master iframe containing just the call logic, with the main web app inside the frame; however, I think that's a very messy implementation and want to avoid it.
Twilio developer evangelist here.
Your problem in this case is indeed that the call context is in the parent window. The only thing that comes to mind right now is to pop up the call window when your user logs on and is prepared to answer calls, and then initialise the Twilio.Device in the popup window.
You could still connect this popup window to the code for outbound calls too.
Does this help at all?
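A rough sketch of that idea, assuming the classic twilio.js Device API already used in the question; the /call/console page and /twilio/token endpoint are placeholder names:

// In the main window, when the user logs on and is ready to take calls:
var callWindow = window.open('/call/console', 'callConsole', 'width=350,height=200');

// In the popup window (/call/console), so the call context lives here:
$(document).ready(function () {
    $.getJSON('/twilio/token', function (data) {    // placeholder endpoint returning a capability token
        Twilio.Device.setup(data.token);
    });
    Twilio.Device.incoming(function (connection) {
        // Show your own answer/reject UI here; accepting keeps the audio in this window,
        // so navigation in the main window no longer drops the call.
        connection.accept();
    });
});

The same popup could also host the outbound-call code mentioned above.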
I have set up an unload listener which sets a flag that error handlers from an ajax request check.
jQuery(window).unload(function() {
    unloadhappening = true;
});
However, the ajax request can be aborted (when the user navigates to another page) and the error handler for the ajax request invoked before the unload event is fired.
I was wondering, could I get an event earlier than unload? Obviously I could put a listener on every link that navigates away from the page, but I was looking for a neater way if there is one.
Thanks
You could probably use onbeforeunload or $(window).on('beforeunload'), which fires before the Ajax requests get aborted. To avoid the prompt about navigation, don't return a string from the handler (returning one, even an empty string, can trigger the prompt in some browsers).
window.onbeforeunload = function(e) {
    unloadhappening = true;
    // maybe other logic
    // return nothing here, or the browser may show the "leave this page?" prompt
}
I haven't tested the prompt behaviour in all browsers, so your mileage may vary.
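For completeness, the Ajax error handler from the question could then check the flag before reporting anything; handleRealError and the URL below are placeholders for your own code:

var unloadhappening = false;        // set to true by the beforeunload handler above

$.ajax({
    url: '/some/endpoint',              // placeholder URL
    error: function (xhr, status) {
        if (unloadhappening) {
            return;                     // aborted because the user is leaving; ignore it
        }
        handleRealError(xhr, status);   // placeholder for your real error handling
    }
});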
I need to call a JavaScript/jQuery function, which has a few lines of code in it, on a PHP page when the user closes their window/tab or navigates away by clicking a link. I've tried the onbeforeunload handler, but only the return "blah blah blah"; part executes and everything else is ignored. I've also tried the .unload method from jQuery, but for some reason this code doesn't run.
$(window).unload(function() {
    alert('blah blah blah');
});
Please suggest alternatives. Thanks.
Here is a simple working example. Whatever you return from the beforeunload callback will be displayed in a browser confirmation popup.
Working example sending Ajax request before unload
http://jsfiddle.net/alexflav23/hujQs/7/
The easiest way to do this:
window.onbeforeunload = function(event) {
    // do stuff here
    return "You have unsaved changes. Are you sure you want to navigate away?";
};
in jQuery:
$(window).on("beforeunload", function() {
$.ajax("someURL", {
async: false,
data: "test",
success: function(event) {
console.log("Ajax request executed");
}
});
return "This is a jQuery version";
});
Look at the Network tab of the browser and you will see the request being sent, as you wanted. Just send the appropriate data.
Bear in mind that all operations triggered here must be synchronous, so you can only make synchronous Ajax requests, for instance. However, the above is not entirely reliable for any purpose.
Opt for periodic back-up of user data to localStorage and sync it with the server automatically. Keep window.onbeforeunload just as an extra precaution, not as the main mechanism; it's well known to cause problems.
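A minimal sketch of that periodic back-up idea; the key name, interval values, collectUserData helper and /save endpoint are all placeholders:

var BACKUP_KEY = 'unsavedChanges';      // placeholder key name

// Back up the current state locally every few seconds.
setInterval(function () {
    localStorage.setItem(BACKUP_KEY, JSON.stringify(collectUserData()));   // collectUserData is your own function
}, 5000);

// Periodically push the backed-up state to the server.
setInterval(function () {
    var saved = localStorage.getItem(BACKUP_KEY);
    if (saved) {
        $.post('/save', { data: saved }).done(function () {    // placeholder endpoint
            localStorage.removeItem(BACKUP_KEY);
        });
    }
}, 15000);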
This is an old question, but I wanted to share an alternative approach that has the benefit of working with high consistency:
Establish a WebSocket connection to the server, and when the client navigates away the WebSocket connection will be closed. Server-side, you can detect the closed connection in a callback and run whatever code you need on the server.
Executing Javascript on page unload is often unreliable (as discussed in the other answer) because it's inherently at odds with the user's intention. This method will always work, although it is admittedly quite a bit more cumbersome to implement.
This does change the context of your "run before leaving" code from client-side to server-side, but I imagine for most cases the difference is inconsequential. Anything you want to run client-side before the client leaves your page is probably not going to change anything the client sees, so it's probably fine to run it server side. If there is specific data you need from the client you can send it through the WebSocket to the server.
The only situation I can think of off the top of my head where this might cause unexpected behavior is if the user loses the WS connection without actually navigating away, e.g. they lose internet or put their computer to sleep. Whether or not that's a big deal is probably highly dependent on what kind of code you're trying to execute.
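A minimal sketch of this approach, assuming a Node.js server with the ws package (the library choice and the timing logic are assumptions, not part of the original answer):

// Client: open a socket when the page loads; closing the tab closes the socket.
var socket = new WebSocket('wss://example.com/presence');  // placeholder URL

// Server (Node.js with the ws package):
const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: 8080 });

wss.on('connection', (ws) => {
    const connectedAt = Date.now();
    ws.on('close', () => {
        // The client navigated away, closed the tab, or lost its connection.
        console.log('client left after', Date.now() - connectedAt, 'ms');
    });
});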
In many of my projects, the methods mentioned here are unstable. The only thing that works for me is to bind the event as a plain attribute on the body element.
<body onunload="my_function_unload()">
jQuery method:
$('body').attr('onunload', 'my_function_unload()');
From an iframe:
<body onunload="window.parent.my_function_unload()">
jQuery method:
$('<iframe />').load(function(){
    var $body = $(this).contents().find('body');
    $body.attr('onunload', 'window.parent.my_function_unload()');
});
Also important: pass no arguments in the attribute, and the function must be in the global window scope, otherwise nothing happens.
A common mistake, for example, is that your my_function_unload() is wrapped inside ;( function( $ ) {... or $(document).ready(function(){..., whereas my_function_unload() must be outside that private scope. And don't forget to use the jQuery prefix instead of $ then (when working with WordPress, for example).
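A small sketch of that scoping point; the image beacon inside the handler is a hypothetical example body, not part of the original answer:

<script>
// Must be defined in the global window scope, outside any IIFE or ready() wrapper.
function my_function_unload() {
    (new Image()).src = '/log-unload?t=' + Date.now();     // placeholder URL
}

jQuery(document).ready(function ($) {
    // DOM-ready work can live in here, but my_function_unload itself must not.
});
</script>

<body onunload="my_function_unload()">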
This is kind of a pain, as Chrome, at least in Version 92.0.4515.131, seems to be clamping the security screws on what you can get away with in beforeunload. I'm unable to make a synchronous ajax request, for example.
If there's any chance the user will be back to your site and you can wait until then to deal with their having closed a window (which does work in my use case), know that setting cookies does currently seem to be fair game during the beforeunload event. Even works when I close the browser. Covers most anything but power cycling the computer, it appears.
Here's a reference example (with getCookie stolen from this SO question):
function setCookie(name, value) {
    document.cookie =
        '{0}={1};expires=Fri, 31 Dec 9999 23:59:59 GMT;path=/;SameSite=Lax'
            .replace("{0}", name)
            .replace("{1}", value);
}

// https://stackoverflow.com/a/25490531/1028230
function getCookie(cookieName) {
    var b = document.cookie.match('(^|;)\\s*' + cookieName + '\\s*=\\s*([^;]+)');
    return b ? b.pop() : '';
}
window.addEventListener('beforeunload', function (e) {
    console.log('cookie value before reset: ' + getCookie('whenItHappened'));
    var now = +new Date();
    console.log("value to be set: " + now);
    setCookie('whenItHappened', now);
    return "some string if you want the 'are you sure you want to leave' dialog to appear";
});
We're creating a click tracking app that builds heatmaps. I'm writing a script which users are supposed to insert into their pages for tracking to work.
It works fine on elements which don't require a redirect or form submit. For example, if I click on an h1 or a p or whatever, it works perfectly. But if I click on an a, the request to our server never happens before the normal redirect.
In the last couple of days I tried a lot of ways to do that. First off, I tried a normal Ajax call; since it was a cross-domain request I had to use JSONP, but again, that Ajax call did not have time to execute before the redirect. Adding async: false would have solved the problem, but it doesn't work with JSONP requests. So I decided to add a flag variable which indicates that it is safe to move on with the redirect, and used an empty while loop to wait until it becomes true in the Ajax callback. But the while loop was blocking the execution flow, so the callback never got a chance to set that variable to true. Here is some simplified code:
$(document).on('click', function (e) {
    // part of the code is omitted
    $.ajax({
        url: baseUrl,
        data: data,
        type: "get",
        dataType: "jsonp",
        crossDomain: true,
        complete: function (xhr, status) {
            itsSafeToMoveOn = true;
        }
    });
    while (!itsSafeToMoveOn) {}
    return true;
});
The next thing I tried was to use the page unload event to wait until the total number of Ajax calls in progress became zero (I had a counter implemented) and then move on with the redirect. It worked in Firefox and IE, but in WebKit there was this error:
Error: Too much time spent in unload handler
After that I realized that I don't care about the server response, and that using img.src for the request would be an ideal fit for this case. So at this point the code looks like this:
$(document).click(function (e) {
    // part of the code is omitted
    (new Image).src = baseUrl + '?' + data;
    if (tag === "a" || clickedElement.parents().has("a")) {
        sleep(100);
    }
    return true;
});
That way I increased the overall script performance slightly, but the problem with links remains unchanged. The sleep function also appears to block the execution flow, and the request never happens.
The only idea left is to return false from the event handler and then redirect manually to the clicked element's href, or to call submit() on the form, but it will complicate things too much, and believe me, it's already a huge pain in the ass to debug this script in different browsers.
Does anyone have any other ideas?
var globalStopper = true;
$(document).on('click', function (e) {
    if (globalStopper === false) {
        globalStopper = true;   // reset so the next click gets tracked again
        return true;            // proceed with the click if the stopper is NOT set
    } else {
        globalStopper = false;  // release the brakes
        $.ajax({
            //blahblah
            complete: function (xhr, status) {
                $(e.target).click();    // when the ajax request is done - "rerun" the click
            }
        });
        return false;           // DO NOT let the browser process the click
    }
});
Also, instead of adding an image, try adding a script and appending it to the HEAD section. This way the browser will "wait" until it's loaded.
$(document).on('click', function (e) {
    var scriptTag = document.createElement("script");
    scriptTag.setAttribute("type", "text/javascript");
    scriptTag.setAttribute("src", url);
    document.getElementsByTagName("head")[0].appendChild(scriptTag);
    return true;
});
I would take a look at the navigator.sendBeacon API mentioned in this Stack Overflow answer or directly linked to here.
From the description on the site:
navigator.sendBeacon(url, data) - This method addresses the needs of analytics and diagnostics code that typically attempts to send data to a web server prior to the unloading of the document.
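A quick sketch of how it might be used for click tracking; the endpoint and payload shape are placeholders:

$(document).on('click', 'a', function (e) {
    var payload = JSON.stringify({
        href: this.href,
        x: e.pageX,
        y: e.pageY
    });
    // Queues the request in the browser and lets the navigation proceed immediately.
    navigator.sendBeacon('https://tracker.example.com/click', payload);     // placeholder URL
});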
You can save the information for the Ajax request in cookies or localStorage and have a worker that sends it later. Saving to cookies or localStorage is faster than an Ajax request. You can do the following:
$(document).click(function (e) {
    // localStorage only stores strings, so (de)serialize the queue with JSON
    var queue = JSON.parse(localStorage.getItem('requestQueue') || '[]');
    queue.push(data);
    localStorage.setItem('requestQueue', JSON.stringify(queue));
});
$(function(){
    setInterval(function(){
        var queue = JSON.parse(localStorage.getItem('requestQueue') || '[]');
        while (queue.length > 0) {
            var data = queue.pop();
            $.ajax({
                ...
                success: function(){
                    localStorage.setItem('requestQueue', JSON.stringify(queue));
                }
            });
        }
    }, intervalToSendData);
});
So when the user clicks a link or submits a form, the data is saved to storage, and after the user lands on the next page this worker starts and sends the data to your server.
JavaScript is basically executed in a single thread. It is not possible to have your callback function executed while an infinite loop is waiting for a flag variable from it: the infinite loop will occupy the single execution thread and the callback will never be called.
The best approach is to cancel the default handler of your event and its bubbling (basically return false if you are really building your tracking code with jQuery), and then do the necessary actions yourself (redirect the page to the necessary address if a link was clicked, or trigger the other default actions), but this would take a lot of careful work to recreate all the possible combinations of actions and callbacks.
Another approach is to:
1) Look for something specific to your code in the event data
2) If it is not present - make an AJAX call and in its callback re-trigger the same event on the same element, but this time with your specific bit added to the event data; after the AJAX call return false
3) If your specific bits are present in the data - simply do nothing, allowing the default event processing to take place.
Either approach may bite, however.
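A rough, untested sketch of the second approach; here the marker lives on the element rather than in the event data, purely to keep the sketch short, and the tracking URL is a placeholder:

$(document).on('click', 'a', function (e) {
    var link = this;
    if ($(link).data('tracked')) {
        $(link).removeData('tracked');
        return;                             // our marker is present: allow the default navigation
    }
    e.preventDefault();
    $.ajax({
        url: 'https://tracker.example.com/click',   // placeholder tracking URL
        data: { href: link.href },
        dataType: 'jsonp',
        complete: function () {
            $(link).data('tracked', true);  // mark the element so we don't loop
            link.click();                   // native click re-runs the handler and the default action
        }
    });
});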
So if I understand right, you want your ajax logs completed before the page unloads and follows a link href. This sounds like a perfect case where you could consider using Deferreds in jQuery.
When your user clicks on anything that's supposed to take them away from the page, just check your promise status. If it's not resolved, you could throw a modal window over the page and ask the user to wait until the progress is complete. Then add a new pipe to your deferred, telling it to change the location href once everything is complete.
Let me know if this is the scenario. If it is, I'll explain in more detail. No use continuing if I didn't understand your requirement properly.
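In the meantime, a very rough sketch of that idea; showPleaseWaitModal is a hypothetical helper, and baseUrl/data are the variables from the question:

$(document).on('click', 'a', function (e) {
    e.preventDefault();
    var href = this.href;
    showPleaseWaitModal();                                      // hypothetical modal helper
    $.ajax({ url: baseUrl, data: data, dataType: 'jsonp' })     // the tracking call from the question
        .always(function () {
            window.location.href = href;                        // follow the link once logging has finished
        });
});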
My JavaScript code is added to random websites. I would like to be able to report to my server when a (specific) link/button on the website is clicked. However, I want to do it without any possible interruption to the website's execution under any circumstances (such as an error in my code, my server being down, etc.). In other words, I want the site to do its default action regardless of my code.
The simple way to do it is to add an event listener for the click event, call the server synchronously to make sure the call is registered, and then execute the click. But I don't want my site and code to be able to cause the click not to complete.
Any other ideas on how to do that?
As long as you don't return false inside your callback and your AJAX call is asynchronous, I don't think you'll have any problems with your links not working.
$("a.track").mousedown(function(){ $.post("/tools/track.php") })
I would also suggest encapsulating this whole logic inside a try{} catch() block so that any errors encountered will not prevent the normal click behaviour from continuing.
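Something along these lines, as a sketch wrapping the handler above (the tracking URL stays the same placeholder):

$("a.track").mousedown(function () {
    try {
        $.post("/tools/track.php");     // asynchronous, so the navigation is never blocked
    } catch (e) {
        // swallow any error so the link's default behaviour always continues
    }
});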
Perhaps something like this? I haven't tested it, so it may contain some typos, but the idea is the same...
<script type="text/javascript">
function mylinkwasclicked(id){
try{
//this function is called asynchronously
setTimeOut('handlingfunctionname('+id+');',10);
}catch(e){
//on whatever error occured nothing was interrupted
}
//always return true to allow the execution of the link
return true;
}
</script>
then your link could look like this:
<a id="linkidentifier" href="somelink.html" onclick="mylinkwasclicked(5)" >click me!</a>
or you could add the onclick dynamically:
<script type="text/javascript">
var link = document.getElementById('linkidentifier');
link.onclick=function(){mylinkwasclicked(5);};
</script>
Attach this function:
(new Image()).src = 'http://example.com/track?url=' + escape(this.href)
+ '&' + Math.random();
It is asynchronous (the 'pseudo image' is loaded in the background)
It can be cross domain (unlike ajax)
It uses the most basic Javascript functionalities
It can, however, miss some clicks, due to site unloading before the image request is done.
The click should be processed normally.
1) If your JavaScript code has an error, the page might show an error icon in the status bar, but it will continue processing; it won't hang.
2) If your Ajax request is asynchronous, the page will make that request and process the click simultaneously. If your server is down and the Ajax request happening in the background times out, it won't cause the click event to go unprocessed.
If you do the request to your server synchronously, you'll block the execution of the original event handler until the response is received. If you do it asynchronously, the original behaviour of the link or button may be to do a form post or change the URL of the document, which will interrupt your asynchronous request.
Delay the page exit just long enough to ping your server URL:
function link_clicked(el)
{
    try {
        var img = new Image();
        img.src = 'http://you?url=' + escape(el.href) + '&rand=' + Math.random();
        window.onbeforeunload = wait;
    } catch(e) {}
    return true;
}

function wait()
{
    for (var a = 0; a < 100000000; a++) {}
    // do not return anything or a message window will appear
}
So what we've done is add a small delay to the page exit to give the outbound ping enough time to register. You could also just call wait() in the click handler, but that would add an unnecessary delay to links that don't exit the page. Set the delay to whatever gives good results without slowing the user down noticeably. Anything more than a second would be rude, but a second is a long time for a request round trip that returns no data. Just make sure your server doesn't need any longer to process the request, or simply dump to a log and process it later.