onBeforeUnload with ajax does not work with IE - javascript

I'm just wondering why this is not working with IE. It works fine with Chrome and Firefox.
window.onbeforeunload = function()
{
    fetch("http://" + window.location.hostname + "/process_sc.php?cC=" + 1);
}

function fetch(url) {
    var x = (window.ActiveXObject) ? new ActiveXObject('Microsoft.XMLHTTP') : new XMLHttpRequest();
    x.open("GET", url, false);
    x.send(null);
}

How can you tell it isn't working?
In general, there's very little time between the beforeunload event, the unload event and the actual page exit. At page unload all running scripts are dropped (the browser then closes the window or navigates to the address the user entered, for example).
What might be happening here is that the browser doesn't have time to send the ajax request before the page is unloaded.
I've seen a couple of ways to ensure your final request before page unload completes. One of them is sending the request and then introducing a loop that runs for X number of milliseconds, postponing the unload and ensuring the ajax request can complete.
window.onbeforeunload = function() {
    fetch("http://" + window.location.hostname + "/process_sc.php?cC=" + 1);
    // here we force the browser to wait 300 milliseconds before proceeding with the unload
    var t = Date.now() + 300;
    while (Date.now() < t) {}
}

The problem is that you use a GET instead of a POST request. The browser may use a cached result from the very first request.
This explains the fact that "it works only on the first time I open IE", as written in response to another answer.
By the way, the AJAX call in onunload seems to work reliably in IE10 only if you use the parameter async = false in XMLHttpRequest.open(...), which you already did.
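If you want to keep the GET, defeating the cache also works. A minimal sketch based on the handler above (the _ts parameter name is arbitrary, not part of the original code):
window.onbeforeunload = function() {
    // a unique timestamp per request prevents the browser from serving a cached response
    fetch("http://" + window.location.hostname + "/process_sc.php?cC=1&_ts=" + new Date().getTime());
}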

Related

Post method in $window.unload never called on change url, closing the tab, but ran on closing the browser

My unlock method never gets called on changing URL. Closing the browser executes the method. Closing the tab does not execute the method... I draw this conclusion since no post is received at the server-side (from examining the console).
/* Callback function that unlocks the current time report when leaving the angular app */
$window.onbeforeunload = function (event) {
    event.preventDefault();
    if ($scope.reportData != undefined && $scope.reportData.superId != undefined && !archive) {
        $http.post(Settings.timereportBaseURLhttp + 'monthlyreport/' + $routeParams.office + '/current/' + $routeParams.tmsstep + '/' + $scope.reportData.superId + '/unlock');
    }
};
First of all, unless $window refers to window, try removing the $. Secondly, event.preventDefault() won't prevent the browser from unloading the window; in most browsers it will just ask the user whether he wants to leave the page. If the user chooses to leave and the AJAX call hasn't finished yet, the browser can cancel it immediately, and thus you won't receive any POST request server-side.
To achieve what I think you're trying to do, I'd ping the server every once in a while and unlock the report if the server hasn't received a ping within a reasonable amount of time.
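A rough sketch of that idea, assuming a hypothetical 'ping' route and a 30-second period (neither exists in the original app):
// client: ping while a report is open and locked
var pingTimer = setInterval(function () {
    if ($scope.reportData && $scope.reportData.superId && !archive) {
        $http.post(Settings.timereportBaseURLhttp + 'monthlyreport/ping/' + $scope.reportData.superId);
    }
}, 30000);
// server side: periodically unlock any report whose last ping is older than, say, 90 seconds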

Firefox Extension : Stopping the page load when suspicious url found

I am working on a simple Firefox extension that tracks the requested URL and calls a web service in the background, which detects whether the URL is suspicious or not. Based on the result returned by the service, the extension decides whether to stop the page load and alert the user about possible forgery; if the user still wishes to go to that page, he can be redirected to the page he originally requested.
I have added an http-on-modify-request observer
var observerService = Components.classes["@mozilla.org/observer-service;1"].getService(Components.interfaces.nsIObserverService);
observerService.addObserver(requestObserverListener.observe, "http-on-modify-request", false);
and the observer
var requestObserverListener = {
    observe: function (subject, topic, data) {
        //alert("Inside observe");
        if (topic == "http-on-modify-request") {
            subject.QueryInterface(Components.interfaces.nsIHttpChannel);
            var url = subject.URI.spec; // url being requested. you might want this for something else
            //alert("inside modify request");
            var urlbarvalue = document.getElementById("urlbar").value;
            urlbarvalue = processUrl(urlbarvalue, url);
            //alert("url bar: " + urlbarvalue);
            //alert("url: " + url);
            document.getElementById("urlbar").style.backgroundColor = "white";
            if (urlbarvalue == url && url != "") {
                var browser = getWindowForRequest(subject);
                if (browser != null) {
                    //alert("" + browser.contentDocument.body.innerHTML);
                    alert("inside browser: " + url);
                    getXmlHttpRequest(url);
                }
            }
        }
    },
}
So when the URL in the URL bar matches the requested URL, the REST service is called through the ajax getXmlHttpRequest(url) method.
Now when I run this extension, the call is made to the service, but the page gets loaded before the service returns any response. That is not acceptable, because the user might enter his credentials in the meantime and get compromised.
I want to first display a warning message on the browser tab, and if the user still wants to visit the page, he can be redirected to it by clicking a link in the warning message window.
I haven't tried this code out, so I'm not sure that suspend and resume will work well, but here's what I would try. You're working with an nsIRequest object as your subject, so you can call subject.suspend() on it. From there, use callbacks on your XHR call to either cancel() or resume() the nsIRequest.
Here's the relevant (untested) snippet of code. My XHR assumes some kind of promise .then() return, but hopefully you understand the intention:
if (urlbarvalue == url && url != "") {
    var browser = getWindowForRequest(subject);
    if (browser != null) {
        // suspend the pending request
        subject.suspend();
        getXmlHttpRequest(url).then(
            function success() { subject.resume(); },
            function failure() { subject.cancel(Components.results.NS_BINDING_ABORTED); });
    }
}
Just some fair warning that you actually don't want to implement an add-on in this way.
It's going to be extremely slow to do a remote call for every HTTP request. The Safe Browsing module makes a single call to download a database of sites considered 'unsafe'; it can then quickly check HTTP requests against that database without making individual calls every time.
Here's some more info on this kind of intercepting worth reading: https://developer.mozilla.org/en-US/docs/XUL/School_tutorial/Intercepting_Page_Loads#HTTP_Observers
Also I'd worry that your XHR request will actually loop, because each XHR call creates an http-on-modify-request event of its own, so your code might end up checking whether your own XHR request is valid before it can check the current URL. You probably want a safety check for the domain your URL-checking service uses.
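A minimal guard, assuming the checking service lives at a hypothetical checker.example.com (substitute your real service host):
var CHECKER_HOST = "checker.example.com"; // hypothetical host of the URL-checking service
// right after the QueryInterface call in observe():
if (subject.URI.host == CHECKER_HOST) {
    return; // don't re-check the extension's own validation requests
}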
And here's another stackoverflow similar question to yours that might be useful: How to block HTTP request on a particular tab?
Good luck!

Check response time of a website with ajax

Hi, I want to check the response time of a website. Here is my code; I get some values, but they don't reflect reality. What is the problem with this code? Is it something related to the cache? Furthermore, how can I tell if the page doesn't exist or is unavailable?
<script type="text/javascript" src="jquery.min.js"></script>
<script type="text/javascript">
    var start = new Date();
    $.ajax({
        url: 'http://www.example.com',
        complete: function() {
            alert(new Date() - start);
        }
    });
</script>
The code itself is fine, assuming it's running on the same origin as the one it's checking; you can't use ajax cross-origin unless both ends (client and server) support and use CORS.
It could be caching, yes, you'd have to refer to the browser tools (any decent browser has a Network tab or similar in its developer tools) to know for sure. You can also disable caching by setting cache: false in the ajax call (see the ajax documentation for details), although that's a somewhat synthetic way to do it. A better way would be to ensure that whatever URL you're using for this timing responds with cache headers telling the browser (and proxies) not to cache it.
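For example, here is the same timing call with jQuery's built-in cache busting turned on (it appends a throwaway _={timestamp} parameter to the URL):
var start = new Date();
$.ajax({
    url: 'http://www.example.com',
    cache: false, // tells jQuery to append a cache-busting parameter
    complete: function() {
        alert(new Date() - start);
    }
});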
You can tell if the page doesn't exist or is "unavailable" (whatever that means) by hooking the error function and looking at the information it gives you:
var start = new Date();
$.ajax({
    url: 'http://www.example.com',
    error: function(jqxhr, status, ex) {
        // Look at status here
    },
    complete: function() {
        alert(new Date() - start);
    }
});
The arguments given to error are also described in the docs linked above.
You can't do this due to the same origin policy.
One trick would be to create an image and measure the time until the onerror event fires.
var start = new Date();
var img = new Image();
img.onerror = function() {
    alert(new Date() - start);
};
img.src = 'http://www.example.com/';
Append a random number to the url to prevent caching.
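For instance (the parameter name doesn't matter, only that the URL differs each time):
img.src = 'http://www.example.com/?nocache=' + Math.random();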
I'd use browser extensions such as Firebug or Chrome Developer Tools to measure Ajax response times.

Ajax Request Not Loading New Data

My application uses polling to update the status of a music player. I'm using setInterval to make an Ajax call every half a second to do this. It works on many browsers (Chrome, Firefox, Safari...) except the Nook Color's browser. When the page loads it shows the correct information, but after that it always loads the same information. This was confirmed using alert. Here's the original code:
function getStatus() {
    var request = new XMLHttpRequest();
    request.open("GET", SOME_URL, true);
    request.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
    request.onreadystatechange = function () {
        if (request.readyState === 4 && request.status === 200)
            updateStatus(request.responseText);
    };
    request.send();
}
setInterval(getStatus, 500);
Any ideas why it is always loading the same info (the info it fetches initially) ?
Also: it only loads the most current information if you clear the cache. This Nook was rooted and also had Firefox and it would work just fine. It's the Nook native browser that is doing this (rooted or unrooted).
Internet Explorer has a weird quirk where it caches AJAX content. I imagine you are seeing the same issue in the Nook browser. The solution is to add a "cache buster" parameter, which is basically just a random parameter so the URL is treated freshly:
"SOME_URL?random=" + Math.random()

Link click tracking does not work on Safari browser

I have a basic html page which has links that point to a different site. What I want to do is track the clicks. I do so by sending a 0-pixel image request on the click event of the link, without returning false in the click handler.
This works fine on all browsers except Safari (on the Windows OS).
When a link is clicked, I use javascript to delay the redirect and send an image request over to the server, which logs the click server-side. I have tried increasing the delay, but with no success... The trackers work great on all browsers except Safari, which does not send the request at all.
I don't know why, but possibly Safari waits for the complete js to be executed before making the request, and once the whole js is executed the redirect happens...
=========================================================
<html>
<head>
<script type="text/javascript">
function logEvent(){
    image = new Image(1,1);
    // note: the property names must be lowercase (onload/onerror),
    // and the second handler should be onerror, not a second onload
    image.onload = function(){ alert("Loaded"); };
    image.onerror = function(){ alert("Error"); };
    image.src = 'http://#path_to_logger_php#/log.php?' + Math.random(0, 1000) + '=' + Math.random(0, 1000);
    pauseRedirect(500);
}
function pauseRedirect(millis){
    var date = new Date();
    var curDate = null;
    do { curDate = new Date(); }
    while (curDate - date < millis);
}
</script>
</head>
<body>
<a href="http://#site_1_url#" onclick="logEvent()">Site 1</a><br/>
<a href="http://#site_2_url#" onclick="logEvent()">Site 2</a><br/>
</body>
</html>
=========================================================
The code works in Chrome, Firefox, IE and Opera. It does not work in Safari only... any clues?
I had the same issue with all WebKit browsers. In all others you only need to do new Image().src = "url", and the browser will send the request even when navigating to a new page. WebKit will stop the request before it's sent when you navigate to a new page right afterwards. I tried several hacks that inject the image into the document and even force a re-paint through img.clientHeight. I really don't want to use event.preventDefault, since that causes a lot of headaches when a link has target="_blank", a form submit, etc.
I ended up using a synchronous XmlHttpRequest for browsers supporting Cross-Origin Resource Sharing, since it will send the request to the server even though you don't get to read the response. A synchronous request has the unfortunate side effect of locking the UI thread while waiting for the response, so if the server is slow the page/browser will lock up until it receives one.
var supportsCrossOriginResourceSharing = (typeof XMLHttpRequest != "undefined" && "withCredentials" in new XMLHttpRequest());

function logEvent() {
    var trackUrl = 'http://#path_to_logger_php#/log.php?' + Math.random(0, 1000) + '=' + Math.random(0, 1000);
    if (supportsCrossOriginResourceSharing) {
        xhrTrack(trackUrl);
    } else {
        imgTrack(trackUrl);
    }
}

function xhrTrack(trackUrl) {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", trackUrl, false); // synchronous, so the request goes out before the page unloads
    xhr.onreadystatechange = function() {
        if (xhr.readyState >= this.OPENED) xhr.abort(); // we never need the response
    };
    try { xhr.send(); } catch (e) {}
}

function imgTrack(trackUrl) {
    var trackImg = new Image(1, 1);
    trackImg.src = trackUrl;
}
