How to detect negative user response for geolocation - javascript

When using the geolocation API's navigator.geolocation.getCurrentPosition(), how do you deal with a negative response?
The documentation says the second callback function is called when there is an error. However, when the user chooses not to reveal their location by cancelling the request, that function never fires.
It seems that getCurrentPosition() waits for an answer indefinitely (at least in Firefox 4).
How can I tell when the user presses cancel (or "no", etc.)?
Any ideas?

See edit below
You are correct, the error handler should fire when a user denies the location request. The error object passed into the error handler contains an error code and message letting you know the user denied the request. However, I'm not seeing this in FF4 when selecting the Not Now option from the location request dialog.
In Chrome, the API/callbacks work exactly as expected, but Chrome has no third option.
EDIT
Ahhh okay I found a little quirk in the behavior of this in FF4. In normal mode (not private browsing), the user will be presented 3 options:
Always share
Never share
Not Now
Never share triggers the error handler correctly, but Not Now does not.
What does this mean, and how should you handle it?
Well, it looks like if the user hits Not Now, you aren't going to get a response. Therefore, I would set a timeout that checks a flag set by one of the handlers. If this flag is not set (meaning the handlers didn't fire in the allotted time), you can do one of two things:
Assume that the user denied the request (even though the denial was temporary)
You can ask the user for permission again (via the same call) and the user will be presented with the dialog again.
Option 2 is probably bad usability (and annoying), so it is probably best to assume they denied temporarily and ask them again (politely!) the next time they visit the site.
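A minimal sketch of that timeout-plus-flag approach (the 10-second window and the fallback handling are placeholders, not part of the API):
var answered = false;

function onGeoSuccess(position) {
    answered = true;
    console.log('Location:', position.coords.latitude, position.coords.longitude);
}

function onGeoError(error) {
    answered = true;   // fires for "Never share" (PERMISSION_DENIED) and other errors
    console.log('Geolocation error:', error.code, error.message);
}

navigator.geolocation.getCurrentPosition(onGeoSuccess, onGeoError);

setTimeout(function () {
    if (!answered) {
        // Neither callback fired: treat it as a temporary denial ("Not Now").
        console.log('No response yet - assuming the user declined for now.');
    }
}, 10000);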
I created a JsFiddle to play around with this API:
http://jsfiddle.net/7yYpn/11/

I don't think it's a bug, but an intentional choice meant to make it difficult to build websites with undesirable behaviour (as the top answer implied, asking again when someone has already said no is rather annoying).
The difference between "not now" and "never" is that if the programmer of the website KNEW "not now" had been triggered, he would also know that sending the request again would put an actual prompt in front of the user, and could use that to force the user's hand: either accept, or have content withheld until they agree.
Decent and respectful programmers would use such information to provide a better service (and to avoid waiting for a response that will never come), but the truth is that there are enough spammers out there to overwhelm the end user.
(There is also no need to even TRY sending the request again once it has been answered with "never": the user cannot be pestered in the same manner, and if the site becomes sluggish and unresponsive as a result, the user will just close it.)
PS: serious programmers might take a rejection as an actual rejection and store that choice somewhere, despite the fact that "not now" is not intended as an ABSOLUTE rejection but rather as "I have decided not to take any definite stand yet". So if the server records a "not now" as a "no", there might NEVER be another request sent, despite the person WANTING to be able to reconsider at a later date.

Related

window.onbeforeunload, closing browser and synchronous AJAX

I've been searching for any reasonable example of a situation where synchronous AJAX makes sense. I found this SO question, where the author mentions window.onbeforeunload and claims that "the request would never stop" if the AJAX call were asynchronous.
Can anybody explain this? I mean, window.onbeforeunload is fired when the user wants to close the tab. What would have to be going on for the tab to still be alive, even though somebody clicked to close it? Can somebody give a more specific example?
He didn't say the request will never stop; he said it will never complete. That is, your code has no hope of ever getting the response back, because the execution environment (the window) would disappear from existence before that happened.
The tab will close when window.onbeforeunload exits with a truthy value. Thus, as long as it is running, the page is waiting for the return value and not closing. This allows a synchronous AJAX request to be sent, and the response to be received and processed. If the request is asynchronous, the code constructs the XHR object, then exits, and the page (and your code) goes away.
I have never tested this, but the answerer apparently believes (and I don't think it unreasonable) that the page might not stick around long enough for an async XHR to even be sent, let alone to receive a response. Thus, if you want to be sure the server receives the information that the user closed the page, you want to have the request synchronous.
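For concreteness, a rough sketch of the pattern being described (the URL is hypothetical; note that modern browsers increasingly disallow synchronous XHR during unload):
window.onbeforeunload = function () {
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/log-unload', false);   // third argument false = synchronous
    xhr.setRequestHeader('Content-Type', 'application/json');
    xhr.send(JSON.stringify({ closedAt: Date.now() }));
    // No return value: we don't want a confirmation dialog,
    // we only want the request to finish before the page goes away.
};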
What had to be going on to make the tab still alive, even though somebody clicked to close it? Can somebody give more specific example?
Sending a synchronous XMLHttpRequest on unload is the only way to guarantee delivery of session data when a user-agent unloads the page (and may never re-visit your site again). There are two specific cases for this:
Tracking - Tracking and reporting the total session time for a user's page visit.
Batching - Coalescing and deferring delivery of batched session data to reduce the number of server requests.
The Beacon spec (navigator.sendBeacon) was designed to optimize this specific case, making it possible to send asynchronous requests guaranteed to still complete even after the page unloads.
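A sketch of what that looks like (the endpoint is hypothetical); the browser queues the beacon and delivers it even after the page has unloaded:
window.addEventListener('unload', function () {
    var payload = JSON.stringify({ sessionTime: performance.now() });
    navigator.sendBeacon('/analytics/session-end', payload);   // returns true if the request was queued
});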

Javascript Abort POST Request on Retry/Refresh (HIGH CPU Usage)

Recently my Apache server's CPU has been at 100% (all day, that is). I think I have found my issue.
On a page, a user has the ability to click a button and send a POST request to a PHP file on my server. Sometimes the PHP file takes a LONG (and I mean VERY long) time to respond or sometimes does not respond at all:
function buttonFunction() {
    $.post("http://ipaddress/core/file.php", {username: username, password: pword, coins: coins}, function(data) {
        // Stuff
    });
}
Hypothesis 1:
I believe that sometimes people might click this button again (while it is still waiting for the result/response from file.php after the previous click), causing two simultaneous PHP processes on the Apache server and therefore higher CPU usage (I think that's what happens; correct me if I'm wrong, because I'm new to this server stuff).
Hypothesis 2:
Another thing that may be causing the high CPU usage (I believe) is the user refreshing the page while it is still waiting for the result/response from file.php. After 12 seconds with no response/result, I show a message saying "Please refresh if this takes too long." In that case (after refreshing the page), the user once again sends a POST request to file.php while the old one may still be running, again causing higher CPU usage (again, I think that's what happens; correct me if I'm wrong, because I'm new to this server stuff).
Reasoning:
I'm saying this because my site may report that there are only 12 people online (and therefore probably only 12 people sending the POST requests), yet when I run the top command in PuTTY to see which processes are currently running, it shows 30-40+ processes (some running as long as 17 minutes).
So, is there a way I can abort the request on a refresh (if it's still going on) or on a click of the button (again, if the request is still going on)? In fact, can somebody either confirm or deny whether my hypotheses (especially hypothesis 2) are correct, i.e. whether those actually ARE causing the high CPU? Furthermore, if anybody has an idea for a more efficient way to go about sending these requests, it would be highly appreciated.
Edit 1:
I can fix the possible issue stated in my first hypothesis. However, can somebody please either confirm or deny whether my second hypothesis is true/valid?
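One way to address hypothesis 1 (duplicate clicks), assuming jQuery: keep a reference to the in-flight request and abort it before starting a new one (names mirror the question's code; the pendingRequest variable is hypothetical):
var pendingRequest = null;

function buttonFunction() {
    if (pendingRequest) {
        pendingRequest.abort();   // cancel the previous request before starting a new one
    }
    pendingRequest = $.post("http://ipaddress/core/file.php", {username: username, password: pword, coins: coins}, function(data) {
        // Stuff
    }).always(function() {
        pendingRequest = null;    // clear the reference once the request finishes or is aborted
    });
}
Bear in mind that aborting on the client only closes the connection; a PHP script that is already running may keep going unless it checks connection_aborted() or hits a time limit, so this alone may not fully resolve the CPU load.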

Evil Firefox Error -- "A parameter or an operation is not supported by the underlying object"

I'm trying to figure out what is going on here. I've been at it for hours now and can't seem to get a grip on why this is happening.
I'm making a few AJAX calls, and I keep getting this error back only in Firefox (version 21) on Mac OS X.
Here is the error:
"[Exception... "A parameter or an operation is not supported by the underlying object"
code: "15" nsresult: "0x8053000f (InvalidAccessError)" location:
"https://ajax.googleapis.com/ajax/libs/jquery/2.0.0/jquery.min.js Line: 6"
I'm making a CORS call, so I set up my AJAX like so:
$.ajaxSetup({
    crossDomain: true,
    xhrFields: {
        withCredentials: true
    }
});
And continue calls henceforth. Basically, does anyone out there have ANY experience with this error? I see some posts online but they all seem to deal with cross-domain CSS, which I'm not using.
Okay, so after hours of testing (and great discussion from #Dave and #danronmoon), I've finally figured out what's going on.
The CORS (Cross-Origin Resource Sharing) calls I was making were set to 'async: false' (which I realize I did not include in my original post, as I thought it was inconsequential). This seems to operate fine in all browsers except Firefox, where jQuery will bark at you and your AJAX call will fail.
Thank you all for your help and I hope this helps someone else!
Since this is the first DuckDuckGo result for "InvalidAccessError: A parameter or an operation is not supported by the underlying object", I will add another source for this.
If you deal with such error when doing iframe/window actions, then you're probably prevented by the iframe's sandbox attribute (see https://html.spec.whatwg.org/multipage/iframe-embed-object.html#attr-iframe-sandbox ) even when being on the same origin.
In my case, an iframe was trying to do a window.top.location.href = ... after a form submission success. The allow-top-navigation sandbox option is mandatory to do so.
Funny thing, this sandbox option is not mandatory to reload the top browsing context... it's only required for navigating in it.
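For reference, a sketch (with hypothetical pages) of the sandbox requirement described above; without allow-top-navigation, the assignment from the child throws InvalidAccessError:
var frame = document.createElement('iframe');
frame.src = '/child-page.html';   // hypothetical same-origin page
frame.setAttribute('sandbox', 'allow-scripts allow-same-origin allow-forms allow-top-navigation');
document.body.appendChild(frame);

// Inside /child-page.html, after the form submission succeeds:
// window.top.location.href = '/next-step.html';   // only allowed with allow-top-navigation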
For me, I was using WebSockets and called WebSocket.close(1001). It doesn't like my status code. Changing it to 1000 or not specifying a code (default 1005) works just fine.
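For anyone hitting the same thing, a quick sketch (the endpoint is hypothetical): close() only accepts 1000 or a code in the 3000-4999 range; anything else throws InvalidAccessError:
var socket = new WebSocket('wss://example.com/socket');

socket.close(1000, 'Normal closure');   // OK
// socket.close();                      // also OK - the browser sends no code, and 1005 is reported
// socket.close(1001);                  // throws InvalidAccessError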
This is the real solution, by Diogo Cardoso: the XHR object or its parent seems to lack a toString() method.
CORS synchronous requests not working in Firefox
Yes, it is a CORS problem caused by using AJAX. But as user320550 asks, what if you NEED to use the property 'async: false'? I found that using 'withCredentials: false' as a workaround fixes the issue in Firefox and doesn't affect other browsers.
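A sketch of that workaround (assuming jQuery; the URL is hypothetical), keeping async: false but dropping the credentials flag:
$.ajax({
    url: 'https://api.example.com/data',
    async: false,
    crossDomain: true,
    xhrFields: {
        withCredentials: false
    },
    success: function (data) {
        // handle the response
    }
});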
Just want to add a somewhat nasty intermittent variant of Xenos's answer. As he mentioned, you can get this problem if you try to navigate the window by setting window.top.location.href = ... from within a sandboxed iframe; this can be prevented if your iframe has the allow-top-navigation option set.
But you might also find your iframe has the more restrictive allow-top-navigation-by-user-activation option. This will allow navigation, but only in response to a user action such as clicking a link or a button. For example, it will be allowed within a form submit event handler, but you can't just trigger it at an arbitrary point in time, such as from a setTimeout() callback with a long delay.
This can be problematic if you are (for example) using AJAX form submission before performing a redirect. The browser needs to decide if the navigation is in response to a user action or not. It does this by only allowing the navigation if it is considered to have happened within an acceptable time period of the user interaction. The HTML standard refers to this as transient activation.
The bottom line is that if your AJAX call is too slow, or if your user has a poor network connection, the navigation will fail. How slow is too slow? I have only tested Firefox, but it appears to allow 5 seconds before it considers the user interaction to have expired.
Possible solutions:
Ask whoever is responsible for the iframe options to upgrade to the blanket allow-top-navigation option
Don't perform async work such as AJAX requests in between user actions and top navigation. For example, use old-school POST form submission directly to the back-end, rather than using an AJAX request
Make sure your responses are as fast as possible. Catch any errors, and prompt the user to click something to trigger the navigation manually. For example:
async function submitForm() {
    await doPotentiallySlowAsyncFormSubmit()
    try {
        window.top.location.href = ...
    } catch (e) {
        // Show message to user, e.g. "Form submitted, click here to go to the next step"
    }
}

Behaviour of navigator.geolocation.getCurrentPosition

I know that when I deny the browser access to my location, it will call the error callback.
However, that doesn't seem to be the case in FF4.
Can anyone enlighten me as to how to control what my JS does if the user clicks "don't share"?
Thanks =D
Is the user clicking "don't share for now" (which is equivalent to just not having made a decision yet, and hence doesn't trigger either the error or the success callback), or "never share" (which will give you an error callback, last I checked)?
As for your JS... just do whatever you would do if the user completely ignores the geolocation notification, which the user is completely free to do.
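If it is the "never share" case, you can tell from the error code in the error callback; a small sketch:
navigator.geolocation.getCurrentPosition(
    function (pos) {
        console.log('Shared:', pos.coords.latitude, pos.coords.longitude);
    },
    function (err) {
        if (err.code === err.PERMISSION_DENIED) {
            console.log('User chose never to share their location.');
        } else {
            console.log('Geolocation failed:', err.message);
        }
    }
);
"Not now" fires neither callback, so the only option there is a timeout of your own, as discussed earlier on this page.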

JavaScript/jQuery: How to make sure cross-domain click tracking event succeeds before the user leaves the page?

I'm implementing click tracking from various pages in our corporate intranet in order to add some sorely needed crowd-sourced popular link features ("most popular links in your department in the last 24 hours", etc.)
I'm using jQuery's .live() to bind to the mousedown event for all link elements on the page, filter the event, and then fire off a pseudo-ajax request with various data to a back-end server before returning true so that the link action fires:
$("#contentarea a").live("mousedown", function(ev) {
//
// detect event, find closest link, process it here
//
$.ajax({
url: 'my-url',
cache: false,
dataType: 'jsonp',
jsonp: 'cb',
data: myDataString,
success: function() {
// silence is golden -- server does send success JSONP but
// regardless of success or failure, we allow the user to continue
}
});
return true; // allow event to continue, user leaves the page.
}
As you can probably guess from the above, I have several constraints:
The back-end tracking server is on a different sub-domain from the calling page. I can't get round this. That's why I am using JSONP (and GET) as opposed to proper AJAX with POST. I can't implement an AJAX proxy as the web servers do not have outbound network access for scripts.
This is probably not relevant, but in the interest of full disclosure, the content and script are inside a "main content" iframe (and this is not going to change; I will likely eventually move the event listener to the parent frame to monitor its links and all child content, but step 1 is getting it to work properly in the simplified case of one child window). Parent and child are on the same domain.
The back-end is IIS/ASP (again, a constraint -- don't ask!), so I can't immediately fork the back-end process or otherwise terminate the response but keep processing like I could on a better platform
Despite all this, for the most part, the system works -- I click links on the page, and they appear in the database pretty seamlessly.
However, it isn't reliable: a large number of links, particularly off-site links that have their target set to "_top", don't appear. If the link is opened in a new tab or window, it registers OK.
I have ruled out script errors -- it seems that either:
(a) the request is never making it to the back-end in time; or
(b) the request is making it, but ASP is detecting that the client is disconnecting shortly afterwards, and as it is a GET request, is not processing it.
I suspect (b), since latency to the server is very fast and many links register OK. If I put in an alert pop-up after the event fires, or set the return value to false, the click is registered OK.
Any advice on how I can solve this (given that I cannot change my constraints)? I can't make the GET request synchronous, as it is not true AJAX.
Q: Would it work better if I were making a POST request to ASP? If (b) is the culprit, would it behave differently for POST vs GET? If so, I could use a hidden iframe/form to POST the data. However, I suspect this would be slower and clunkier, and might still not make it in time. I wouldn't be able to listen for the request completing, because it is cross-domain.
Q: Can I just add a delay to the script after the GET request is fired off? How do I do this in a single-threaded way? I need to return true from my function to ensure the default event eventually fires, so I can't use setTimeout(). Would a tight loop waiting for 'success' to fire and set some variable work? I'm worried that this would freeze things up too much and slow the response down. I assume the jQuery delay() plugin is just a loop too?
Or is something else I haven't thought of likely to be the culprit?
I don't need bullet-proof reliability. If all links are equally catchable 95% of the time it is fine. However right now, some links are catchable 100% of the time, while others are uncatchable -- which isn't going to cut it for what I want to achieve.
Thanks in advance.
I would try a different approach. You can bind to a different event like:
$(window).unload(function(event) {
    // tracking code here
});
I would try returning false from the link event handler, remembering the URL, and navigating away only when the JSONP request succeeds. Hopefully it shouldn't add too much latency. Considering you are on the intranet, it might be OK.
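A sketch of that approach, reusing the question's JSONP call but switching to the click event so the default navigation can actually be cancelled (names come from the question):
$("#contentarea a").live("click", function (ev) {
    ev.preventDefault();
    var destination = this.href;

    $.ajax({
        url: 'my-url',
        cache: false,
        dataType: 'jsonp',
        jsonp: 'cb',
        data: myDataString,
        complete: function () {
            // Navigate whether or not the tracking call succeeded.
            window.location.href = destination;
        }
    });
});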
Solved!
The short answer is: there is no reliable way to do this cross-domain with a GET request. I tried all sorts, including storing the event and trying to replay the event later, and all manner of hacks to try to get that to work.
I then tried tight loops, and they weren't reliable either.
Finally, I just gave in and used a dynamically created form that POSTed the results, with its target set to a hidden iframe.
That works reliably; it seems the browser pauses to finish its POST request before moving on, and ASP honours the POST. It turns out it's not 'clunky' at all. Sure, due to the browser security model I can't see the result... but it doesn't matter in this case.
I am now kicking myself that I didn't try that option first.
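For completeness, a rough sketch of that hidden-iframe POST (the element names and tracking URL are hypothetical):
function trackClick(data) {
    $('<iframe name="track-sink" style="display:none"></iframe>').appendTo('body');
    var form = $('<form method="POST" target="track-sink" style="display:none"></form>')
        .attr('action', 'http://tracking.example.intra/track')
        .appendTo('body');

    $.each(data, function (key, value) {
        $('<input type="hidden">').attr('name', key).val(value).appendTo(form);
    });

    form[0].submit();   // native submit: the browser finishes the POST before following the link
}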
