Trigger an event when user navigates away - javascript

I need to call a JavaScript/jQuery function with a few lines of code in it, on a PHP page, when the user closes the window/tab or navigates away by clicking a link. I've tried the onbeforeunload handler, but only the return "blah blah blah"; part executes and everything else is ignored. I've also tried the .unload method from jQuery, but for some reason this code doesn't run.
$(window).unload(function() {
    alert('blah blah blah');
});
Please suggest alternatives. Thanks.

Here is a simple working example. Whatever you return from the beforeunload callback will be displayed in a browser confirmation popup (modern browsers show a generic message rather than your custom text).
Working example sending Ajax request before unload
http://jsfiddle.net/alexflav23/hujQs/7/
The easiest way to do this:
window.onbeforeunload = function(event) {
    // do stuff here
    return "you have unsaved changes. Are you sure you want to navigate away?";
};
in jQuery:
$(window).on("beforeunload", function() {
$.ajax("someURL", {
async: false,
data: "test",
success: function(event) {
console.log("Ajax request executed");
}
});
return "This is a jQuery version";
});
Look at the Network tab in the browser's developer tools: you will see the request being sent, just as you wanted. Simply send the appropriate data.
Bear in mind that everything triggered here must be synchronous, so you can only make synchronous ajax requests, for instance. Even so, the above is not entirely reliable.
A better option is to back up user data to localStorage periodically and sync it with the server automatically. Keep window.onbeforeunload only as an extra precaution, not as the main mechanism; it's well known to cause problems.
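A minimal sketch of that back-up approach (the "/save-draft" endpoint, the 10-second interval, and the collectUnsavedChanges() helper are assumptions, not part of the original answer):
setInterval(function () {
    // Snapshot unsaved state locally, then try to sync it.
    var state = collectUnsavedChanges(); // hypothetical serializer
    localStorage.setItem("backup", JSON.stringify(state));
    $.post("/save-draft", { payload: JSON.stringify(state) });
}, 10000);

// On the next page load, recover anything that never reached the server.
var leftover = localStorage.getItem("backup");
if (leftover) {
    $.post("/save-draft", { payload: leftover }, function () {
        localStorage.removeItem("backup");
    });
}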

This is an old question, but I wanted to share an alternative approach that has the benefit of working with high consistency:
Establish a WebSocket connection to the server, and when the client navigates away the WebSocket connection will be closed. Server-side, you can detect the closed connection in a callback and run whatever code you need on the server.
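As a rough sketch of the server side (assuming Node.js with the ws package; the port and the helper functions are placeholders):
// Server side, for illustration only (Node.js with the "ws" package).
var WebSocket = require("ws");
var wss = new WebSocket.Server({ port: 8080 });

wss.on("connection", function (socket) {
    // Optionally receive any data the client wants persisted.
    socket.on("message", function (msg) {
        stashClientData(msg); // hypothetical helper
    });
    // Fires when the tab is closed, the user navigates away,
    // or the connection otherwise drops.
    socket.on("close", function () {
        runUserLeftHandler(); // hypothetical "user left" handler
    });
});
The client simply opens new WebSocket("ws://example.com:8080") on page load and keeps the connection alive for the lifetime of the page.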
Executing JavaScript on page unload is often unreliable (as discussed in the other answer) because it's inherently at odds with the user's intention. This method will always work, although it is admittedly quite a bit more cumbersome to implement.
This does change the context of your "run before leaving" code from client-side to server-side, but I imagine for most cases the difference is inconsequential. Anything you want to run client-side before the client leaves your page is probably not going to change anything the client sees, so it's probably fine to run it server side. If there is specific data you need from the client you can send it through the WebSocket to the server.
The only situation I can think of off the top of my head where this might cause unexpected behavior is if the user loses the WS connection without actually navigating away, e.g. they lose internet or put their computer to sleep. Whether or not that's a big deal is probably highly dependent on what kind of code you're trying to execute.

In many of my projects, the methods mentioned here have been unstable. The only thing that works for me is to bind the event as an original attribute on the body element.
<body onunload="my_function_unload()">
jQuery method:
$('body').attr('onunload', 'my_function_unload()');
From an iframe:
<body onunload="window.parent.my_function_unload()">
jQuery method:
$('<iframe />').load(function(){
$body = $(this).contents().find('body');
$body.attr('onunload', 'window.parent.my_function_unload()');
}
Also important: use no arguments in the attribute, and the function must be in the global window scope, otherwise nothing happens.
A common mistake, for example: if your my_function_unload() is wrapped inside a ;( function( $ ) {... or a $(document).ready(function(){... wrapper, it is stuck in that private scope, and my_function_unload() must be outside it. And don't forget to use the jQuery prefix instead of $ then (when working with WordPress, for example).
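For example, you can keep the wrapper and still satisfy the global-scope requirement by attaching the function to window explicitly:
;(function( $ ) {
    // Attach the handler to window so the inline onunload attribute can find it.
    window.my_function_unload = function() {
        // cleanup code here
    };
})( jQuery );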

This is kind of a pain, as Chrome, at least in Version 92.0.4515.131, seems to be clamping the security screws on what you can get away with in beforeunload. I'm unable to make a synchronous ajax request, for example.
If there's any chance the user will be back to your site and you can wait until then to deal with their having closed a window (which does work in my use case), know that setting cookies does currently seem to be fair game during the beforeunload event. Even works when I close the browser. Covers most anything but power cycling the computer, it appears.
Here's a reference example (with getCookie stolen from this SO question):
function setCookie(name, value) {
    document.cookie =
        '{0}={1};expires=Fri, 31 Dec 9999 23:59:59 GMT;path=/;SameSite=Lax'
            .replace("{0}", name)
            .replace("{1}", value);
}

// https://stackoverflow.com/a/25490531/1028230
function getCookie(cookieName) {
    var b = document.cookie.match('(^|;)\\s*' + cookieName + '\\s*=\\s*([^;]+)');
    return b ? b.pop() : '';
}
window.addEventListener('beforeunload', function (e) {
    console.log('cookie value before reset: ' + getCookie('whenItHappened'));
    var now = +new Date();
    console.log("value to be set: " + now);
    setCookie('whenItHappened', now);
    // Returning a string only works with the window.onbeforeunload assignment
    // form; with addEventListener, set e.returnValue if you want the
    // "are you sure you want to leave" dialog to appear.
    e.returnValue = "some string if you want the 'are you sure you want to leave' dialog to appear";
});
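On the user's next visit you can then pick the value up and report it. A sketch ("/log-unload" is a made-up endpoint; substitute your own):
if (getCookie('whenItHappened')) {
    // Report when the previous session ended.
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/log-unload');
    xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
    xhr.send('whenItHappened=' + encodeURIComponent(getCookie('whenItHappened')));
}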

simultaneous runs of two JS webworkers: one gets stuck

I'm working on a closed system web application to aid companies in their everyday online commerce chores. That means on the one hand that it won't be open to the public, on the other: it will have to deal with large amounts of data while maintaining a fluent work experience.
This is why I turned to web workers in JS to run all sorts of database access and data loading in the background.
My understanding is that not only does the main UI/main JS remain uninterrupted, but also that the different web workers run without hindering each other.
I now have the following setup:
mainJS: function statusCheck which runs on pageload:
function statusCheck() {
    if (typeof(w__statusCheck) == "undefined") {
        var w__statusCheck = new Worker("...statusCheck.js");
        w__statusCheck.postMessage("go");
        w__statusCheck.onmessage = function(e) {
            var message = JSON.parse(e.data);
            if (message.text != undefined) displayMessage(message.text);
        }
    }
}
statusCheck.js, which is the worker, simply goes like this:
function checkStatus() {
    console.log("statusCheck started");
    // I will leave standard parts out:
    // creating and testing the ajax variable against different browsers
    ajaxRequest.onreadystatechange = function() {
        if (ajaxRequest.readyState == 4) {
            self.postMessage(ajaxRequest.responseText);
            var timer;
            timer = self.setTimeout(function(){
                checkStatus();
            }, 1000);
        }
    }
    ajaxRequest.open("GET", "...worker_statusCheck.php", true);
    ajaxRequest.send(null);
}

this.onmessage = function(e){
    checkStatus();
};
As you can see, this restarts itself every second (for now). The interval might be longer in production.
worker_statusCheck.php simply gets different things from the database and knits them into a JSON object which gives me the system status.
This works beautifully.
Now I have another worker which is supposed to get initiated by a click on a link to effectively call some php to perform actions:
mainJS loadWorker
function loadWorker(url="") {
console.log("loadWorker started");
if(url!="") {
var uniqueID = "XXX" // creating a random ID based on timestamp and Math.random()
if(typeof(window[uniqueID]) == "undefined") {
var variables = { ajaxURL: url };
window[uniqueID] = new Worker("....loadWorker.js");
window[uniqueID].postMessage(JSON.stringify(variables));
window[uniqueID].onmessage = function(e) {
var message = JSON.parse(e.data);
if(message["success"]!=undefined) {
variables["close"] = "yes";
window[uniqueID].postMessage(JSON.stringify(variables));
}
}
}
With every click on a certain link this gets called, creates a uniquely named worker, runs it, receives the data and tells the worker to close().
The php again does its thing and writes a progress update in the DB after each step of the lengthy procedure. These progress updates I fetch from the DB with the above repeating statusCheck.
Now, I can see the entries in the DB with timestamp, so I know they get written each at their time.
So, both workers do their job and run reliably. But I have noticed that whenever I initiate the manual (randomly named) worker, the statusCheck actually stops performing. It just gets stuck... I was able to confirm this with console output from both workers. So it's not the main JS that seems stuck; the statusCheck actually pauses... and resumes when loadWorker is done.
Am I missing something fundamental here? Any insight would be appreciated since I'm new to this concept of web workers.
Thanx :)
Your question lacks the resources to truly figure out what exactly goes wrong. I can confirm that two web workers can operate at the same time, even with synchronous operations; I tested this with both for loops and synchronous XHR requests.
There are multiple things I would recommend though.
First - unless you're processing the data with some CPU-heavy algorithm, web workers are a waste of time. XHR requests do not block the main thread (unless you explicitly ask them to).
In statusCheck() you declare var w__statusCheck, which means a local variable. Therefore it will always be undefined as seen from the outer scope, and the worker might get garbage-collected once no code is running in it.
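A sketch of the fix, keeping the reference in the outer scope:
var w__statusCheck; // declared outside, so the typeof check is meaningful

function statusCheck() {
    if (typeof w__statusCheck === "undefined") {
        w__statusCheck = new Worker("...statusCheck.js");
        w__statusCheck.postMessage("go");
    }
}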
Do not use XMLHttpRequest.onreadystatechange. Use onload and onerror.
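For instance, the worker's request handler could look like this (a sketch based on the question's code):
ajaxRequest.onload = function() {
    // readyState checks are unnecessary; onload fires only on completion.
    self.postMessage(ajaxRequest.responseText);
};
ajaxRequest.onerror = function() {
    self.postMessage(JSON.stringify({ error: "status request failed" }));
};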
Random unique IDs for variables are almost always wrong. If you need to store the worker reference at all, either give it a reasonable name (e.g. the URL it's supposed to load) or use an incremental id.
Do NOT stringify data that you post to a web worker. The browser already serializes it for you (via structured clone), possibly in a more optimal manner. Needlessly converting the data back and forth is the single most common mistake people make with web workers.
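For example, with the question's loadWorker code you can post the object itself; the structured clone handles the copy:
// Main thread: post the object directly, no JSON.stringify needed.
window[uniqueID].postMessage({ ajaxURL: url });

// Worker side: the clone arrives ready to use, no JSON.parse needed.
self.onmessage = function(e) {
    var ajaxURL = e.data.ajaxURL;
};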
Also, when posting a question, at least make sure the code makes sense; in your post the curly braces do not match.
Alright.. I figured it out:
I was looking in all the wrong places. It turns out I had initialized my PHP session in all the PHP scripts called by the workers, and my two parallel workers each called one. So the session file was locked by the first PHP script, and the second had to wait until it was released again. It was not the workers or the JS being hindered; it was the PHP.
I have now taken the session initialization out of my statusCheck.php and it works like a charm. I will keep it in the scripts that handle the user input responses, because there it actually makes sense: the user clicks the button "compile data XY", which is run by the worker and takes a while. Impatient as he is, he already clicks the next button, "show this data"... and thanks to the locked session file I get a sort of neat queue for those actions. :)
I will still take the above recommendations to heart and see to it that I improve my code. :)

Firefox return false; (preventDefault();) and window.location.reload(); together

I think it's a very stupid question, but after hours of googling I have no idea or solution.
The point is that I need to reload the page after handling a "click" event on my website. In Chrome/Opera it works without problems, but in Firefox I have a bug. My JS code with comments:
$('#project_create').click(function() {
    var projectname = $("#project_name").val();
    var projectdescription = $("#project_description").val();
    $.post("projects/add_project.php", {
        project_name: projectname,
        project_description: projectdescription
    });
    $("#create_project_form").hide("slow");
    $("#project_list").show("slow");
    //return false; - if uncommented, everything works, but the page reload below never runs.
    window.location.reload(); // but if this runs, Firefox doesn't send the $.post query
});
I need both methods to work, because after the click the script puts a variable into $_SESSION['my_var'], and that variable is only available after a page reload. Maybe there are other ways to do it? As I understand it, the problem has to do with how Firefox handles preventDefault();
Thanks!
The issue is simply that you reload the page before the ajax request has completed.
Try reloading the page in the ajax success callback handler:
$.post("projects/add_project.php", {
project_name: projectname,
project_description: projectdescription
}, function(){
window.location.reload();
});
And remove your old window.location.reload()
When you do a return, code after that line will not be reached anymore and is considered "dead code". One does not simply put code after a return.
Another thing: there's an issue when using return false to prevent default actions. It also prevents delegation/bubbling: event handlers hooked higher up in the DOM tree (especially ones hooked with on()) won't be executed. If delegation matters, don't use it.
If your goal is to prevent the default action of the link and do stuff in JS, use event.preventDefault instead. The event object is passed in as the first argument in the handler.
$('#project_create').click(function(event) {
    event.preventDefault();
    // rest of the code
});
In addition to what the other answers suggest, you can also execute the location.reload method asynchronously using setTimeout:
setTimeout(function() { location.reload(); }, 1);
return false;
EDIT: The entire idea of running an asynchronous AJAX request and reloading the page immediately afterwards is flawed, of course, since the AJAX request may not have completed by the time you reload the page.
You should therefore use a callback, as suggested by the accepted answer. Alternatively, you could use a synchronous AJAX request, but that would freeze execution until the request has completed, which is generally not desirable.

doing an ajax call on window.unload

In my application, there's an object that needs to be ajaxed back to the server before the user switches to another page or closes his browser.
For the moment, I'm using something like this:
$(window).on('unload', function () {
    $.ajax(....);
});
Will the ajax call fire in all browsers or are there situations where this will not work and where this situation needs to be handled differently? I don't need to deal with anything in terms of a success function, I'm only concerned about the information making it to the server.
Thanks.
If you're using jQuery, you can set async to false in the ajax call. And it might work, but your results may vary by browser. Here's a jsFiddle example. http://jsfiddle.net/jtaylor/wRkZr/4/
// Note: I came across a couple of articles saying we may need to use window.onbeforeunload instead of, or in addition to, jQuery's unload. Keep an eye on this.
// http://vidasp.net/jQuery-unload.html
// https://stackoverflow.com/questions/1802930/setting-onbeforeunload-on-body-element-in-chrome-and-ie-using-jquery
var doAjaxBeforeUnloadEnabled = true; // We hook into window.onbeforeunload and bind some jQuery events to confirmBeforeUnload. This variable is used to prevent us from showing both messages during a single event.
var doAjaxBeforeUnload = function (evt) {
    if (!doAjaxBeforeUnloadEnabled) {
        return;
    }
    doAjaxBeforeUnloadEnabled = false;
    jQuery.ajax({
        url: "/",
        success: function (a) {
            console.debug("Ajax call finished");
        },
        async: false /* Not recommended. This is the dangerous part. Your mileage may vary. */
    });
}

$(document).ready(function () {
    window.onbeforeunload = doAjaxBeforeUnload;
    $(window).unload(doAjaxBeforeUnload);
});
In Google Chrome, the ajax call always completes before I navigate away from the page.
However, I would VERY MUCH NOT RECOMMEND going that route. The "a" in ajax is for "asynchronous", and if you try to force to act like a synchronous call, you're asking for trouble. That trouble usually manifests as freezing the browser -- which might happen if the ajax call took a long time.
If viable, consider prompting the user before navigating away from the page if the page has data that needs to be posted via ajax. For an example, see this question: jquery prompt to save data onbeforeunload
No, unfortunately your Ajax call will not complete, as the document will unload during the async call.
You cannot do many things when the user closes the window.
Instead of doing a synchronous ajax call (deprecated in the latest browsers and liable to throw an exception), you can open a popup:
$(window).on('unload', function () {
    window.open("myscript.php");
});
You can obviously add parameters to the link, and you can automatically close the popup if you like.
The popup blocker must be deactivated for your domain in the browser options.
You have to use the onbeforeunload event and make a synchronous AJAX call.
$.ajax({
    ...
    "url": "http://www.example.com",
    "async": false,
    ...
});

Send information about clicked link to the server before redirect

We're creating a click-tracking app that builds heatmaps. I'm writing a script that users are supposed to insert into their pages for the tracking to work.
It works fine on elements that don't require a redirect or form submit. For example, if I click on an h1 or a p or whatever, it works perfectly correctly. But if I click on an a, the request to our server never happens before the normal redirect.
In the last couple of days I tried a lot of ways to do this. First off, I tried a normal AJAX call; since it was a cross-domain request I had to use JSONP, but again, that AJAX call did not have time to execute before the redirect. Adding async: false would have solved the problem, but it doesn't work with JSONP requests. So I decided to add a flag variable indicating that it is safe to move on with the redirect, and used an empty while loop to wait until it becomes true in the ajax callback. But the while loop was blocking the execution flow, so the callback never got a chance to set that variable to true. Here is some simplified code:
$(document).on('click', function (e) {
    //part of the code is omitted
    $.ajax({
        url: baseUrl,
        data: data,
        type: "get",
        dataType: "jsonp",
        crossDomain: true,
        complete: function (xhr, status) {
            itsSafeToMoveOn = true;
        }
    });
    while (!itsSafeToMoveOn) {}
    return true;
});
The next thing I tried was to use the page's unload event to wait until the number of ajax calls in progress became zero (I had a counter implemented) and then move on with the redirect. It worked in Firefox and IE, but in WebKit there was this error:
Error: Too much time spent in unload handler
After that I realized that I don't care about the server response, and using img.src for the request would be an ideal fit for this case. So at this point the code looks like this:
$(document).click(function (e) {
    //part of the code is omitted
    (new Image).src = baseUrl + '?' + data;
    if (tag === "a" || clickedElement.parents().has("a").length) {
        sleep(100);
    }
    return true;
});
That way I increased the overall script performance slightly, but the problem with links remains unchanged. The sleep function also appears to block the execution flow, so the request never happens.
The only idea left is to return false from the event handler and then redirect manually to the clicked element's href, or call submit() on the form, but that would complicate things too much, and believe me, it's already a huge pain in the ass to debug this script in different browsers.
Does anyone have any other ideas?
var globalStopper = true;

$(document).on('click', function (e) {
    if (globalStopper === false) {
        return true; // proceed with the click if the stopper is NOT set
    } else {
        globalStopper = false; // release the brakes
        $.ajax({
            //blahblah
            complete: function (xhr, status) {
                $(e.target).click(); // when the ajax request is done, "rerun" the click
            }
        });
        return false; // do NOT let the browser process the click
    }
});
Also, instead of adding an image, try adding a script, and append the script to the HEAD section. This way the browser will "wait" until it's loaded.
$(document).on('click', function (e) {
    var scriptTag = document.createElement("script");
    scriptTag.setAttribute("type", "text/javascript");
    scriptTag.setAttribute("src", url);
    document.getElementsByTagName("head")[0].appendChild(scriptTag);
    return true;
});
I would take a look at the navigator sendBeacon API mentioned in this stack overflow answer or directly linked to here.
From the description on the site
navigator.sendBeacon(url, data) - This method addresses the needs of analytics and diagnostics code that typically attempts to send data to a web server prior to the unloading of the document.
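A minimal sketch for the click-tracking case, reusing baseUrl and data from the question's code (sendBeacon queues the request so it survives the navigation):
$(document).on('click', function (e) {
    // The browser queues this POST and sends it in the background,
    // even if the page unloads immediately afterwards.
    navigator.sendBeacon(baseUrl, data);
    return true; // let the redirect or form submit proceed normally
});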
You can save the information for the ajax request in cookies or localStorage and have a worker send it later; saving to cookies or localStorage is faster than an ajax request. You can do the following:
$(document).click(function (e) {
    // localStorage stores strings, so the queue must be (de)serialized.
    var queue = JSON.parse(localStorage.getItem('requestQueue') || '[]');
    queue.push(data);
    localStorage.setItem('requestQueue', JSON.stringify(queue));
});

$(function(){
    setInterval(function(){
        var queue = JSON.parse(localStorage.getItem('requestQueue') || '[]');
        while (queue.length > 0) {
            var data = queue.pop();
            $.ajax({
                ...
                success: function(){
                    localStorage.setItem('requestQueue', JSON.stringify(queue));
                }
            });
        }
    }, intervalToSendData);
});
So, when the user clicks a link or submits a form, the data is saved to storage, and after the user reaches the next page this worker starts and sends the data to your server.
JavaScript is basically executed in a single thread. It is not possible to have your callback function executed and at the same time have an infinite loop waiting for a flag variable from it: the infinite loop occupies the single execution thread, so the callback will never be called.
The best approach is to cancel the default handling of your event and its bubbling (basically return false, if you are really building your tracking code with jQuery), do the necessary actions, and then trigger the redirect to the necessary address (if a link was clicked) or the other default actions yourself. But this would take a lot of careful work to recreate all the possible combinations of actions and callbacks.
Another approach is to:
1) Look for something specific to your code in the event data.
2) If it is not present, make an AJAX call and in its callback re-trigger the same event on the same element, but this time with your specific bit added to the event data; after the AJAX call, return false.
3) If your specific bits are present in the data, simply do nothing, allowing the default event processing to take place.
Either approach may bite, however.
So if I understand right, you want your ajax logs completed before the page unloads and follows a link href. This sounds like a perfect case where you could consider using Deferreds in jQuery.
When your user clicks on anything that's supposed to take him away from the page, just check your promise status. If it's not resolved, you could throw a modal window over the page and ask the user to wait until the process is complete. Then add a new pipe to your deferred, telling it to change the location href once everything is complete.
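A rough sketch of that idea (variable names reused from the question's code; the link selector is an assumption):
// Keep a promise for the in-flight tracking call.
var logged = $.ajax({ url: baseUrl, data: data, dataType: "jsonp" });

$(document).on('click', 'a', function (e) {
    if (logged.state() === 'pending') {
        e.preventDefault();
        var href = this.href;
        // Follow the link once the tracking request settles either way.
        logged.always(function () { window.location.href = href; });
    }
});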
Let me know if this is the scenario. If it is, I'll explain in more detail. No use continuing if I didn't understand your requirement properly

What is the best method to detect offline mode in the browser?

I have a web application where there are number of Ajax components which refresh themselves every so often inside a page (it's a dashboard of sorts).
Now, I want to add functionality to the page so that when there is no Internet connectivity, the current content of the page doesn't change and a message appears on the page saying that the page is offline (currently, as these various gadgets on the page try to refresh themselves and find that there is no connectivity, their old data vanishes).
So, what is the best way to go about this?
navigator.onLine
That should do what you're asking.
You probably want to check that in whatever code you have that updates the page. Eg:
if (navigator.onLine) {
    updatePage();
} else {
    displayOfflineWarning();
}
It seems like you've answered your own question. If the gadgets send an async request and it times out, don't update them. If enough of them do so, display the "page is offline" message.
See the HTML 5 draft specification. You want navigator.onLine. Not all browsers support it yet. Firefox 3 and Opera 9.5 do.
It sounds as though you are trying to cover up the problem rather than solve it. If a failed request causes your widgets to clear their data, then you should fix your code so that it doesn't attempt to update your widgets unless it receives a response, rather than attempting to figure out whether the request will succeed ahead of time.
One way to handle this might be to extend the XmlHTTPRequest object with an explicit timeout method, then use that to determine if you're working in offline mode (that is, for browsers that don't support navigator.onLine). Here's how I implemented Ajax timeouts on one site (a site that uses the Prototype library). After 10 seconds (10,000 milliseconds), it aborts the call and calls the onFailure method.
/**
 * Monitor AJAX requests for timeouts
 * Based on the script here: http://codejanitor.com/wp/2006/03/23/ajax-timeouts-with-prototype/
 *
 * Usage: If an AJAX call takes more than the designated amount of time to return, we call the onFailure
 * method (if it exists), passing an error code to the function.
 *
 */
var xhr = {
    errorCode: 'timeout',
    callInProgress: function (xmlhttp) {
        switch (xmlhttp.readyState) {
            case 1: case 2: case 3:
                return true;
            // Case 4 and 0
            default:
                return false;
        }
    }
};

// Register global responders that will occur on all AJAX requests
Ajax.Responders.register({
    onCreate: function (request) {
        request.timeoutId = window.setTimeout(function () {
            // If we have hit the timeout and the AJAX request is active, abort it and let the user know
            if (xhr.callInProgress(request.transport)) {
                var parameters = request.options.parameters;
                request.transport.abort();
                // Run the onFailure method if we set one up when creating the AJAX object
                if (request.options.onFailure) {
                    request.options.onFailure(request.transport, xhr.errorCode, parameters);
                }
            }
        },
        // 10 seconds
        10000);
    },
    onComplete: function (request) {
        // Clear the timeout, the request completed ok
        window.clearTimeout(request.timeoutId);
    }
});
Hmm, actually, now that I look into it a bit, it's more complicated than that. Have a read of these links on John Resig's blog and the Mozilla site. The above poster may also have a good point: you're making requests anyway, so you should be able to work out when they fail. That might be a much more reliable way to go.
Make a call to a reliable destination, or perhaps a series of calls, ones that should go through and return if the user has an active net connection - even something as simple as a token ping to Google, Yahoo, and MSN, or something like that. If at least one comes back green, you know you're connected.
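A hedged sketch of such a probe (the URL and the 5-second timeout are arbitrary choices, not part of the original answer):
function checkConnectivity(callback) {
    var img = new Image();
    var settled = false;
    var timer = setTimeout(function () {
        if (!settled) { settled = true; callback(false); }
    }, 5000);
    img.onload = function () {
        if (!settled) { settled = true; clearTimeout(timer); callback(true); }
    };
    img.onerror = function () {
        if (!settled) { settled = true; clearTimeout(timer); callback(false); }
    };
    // Cache-buster so a cached copy doesn't fake a success.
    img.src = "https://www.google.com/favicon.ico?_=" + new Date().getTime();
}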
I think Google Gears has such functionality; maybe you could check how they did that.
Use the relevant HTML5 API: online/offline status/events.
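For example, pairing those events with the update handlers shown above:
// Fired when the browser's connectivity status changes.
window.addEventListener('online', function () {
    updatePage();
});
window.addEventListener('offline', function () {
    displayOfflineWarning();
});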
One possible solution: if the live page and the cached page have different URLs, just look at which URL you are on. If you are on the cached page's URL, then you are in offline mode. This blog makes a good point about why navigator.onLine is broken.
