I am working with cross-domain remote resources that require locking. CORS headers are set appropriately.
I am trying to solve the case where the resource is not released by the client (remains locked until the lock expires) when the browser window is closed.
I had hoped to send a synchronous DELETE request on window unload. I am using jQuery (the answer can be plain JavaScript if necessary; I mention jQuery for context), and noticed their docs say "Cross-domain requests ... do not support synchronous operation", and I became very sad.
Is it possible to make a synchronous cross-domain ajax request? Is the jQuery limitation due to older browsers? Everything I've read indicates the unload event listener will not be around long enough for the ajax call to complete if it is async, and suggests using a synchronous request for this type of cleanup. Unfortunately the call is cross-domain... what can I do?
EDIT
So I am curious whether I am just getting lucky during local development (i.e. client on 127.0.0.1:8080 and API on 127.0.0.1:8081) or whether the jQuery docs are simply misleading. Will the following end up causing me problems down the road?
This appears to be working in Chrome 45:
var unload_event = "unload." + lock.id;

function release_lock(sync) {
    $.ajax({
        method: "DELETE",
        async: !sync, // synchronous when invoked from the unload handler
        data: lock,
        url: lock.url,
        error: function () {
            console.log("failed to release lock " + JSON.stringify(lock));
        },
        success: function () {
            console.log("lock " + lock.id + " released");
            lock = null;
            $(window).off(unload_event);
        }
    });
}

$(window).on(unload_event, function () {
    release_lock(true);
});
It does generate the following warning in the console:
Synchronous XMLHttpRequest on the main thread is deprecated because of
its detrimental effects to the end user's experience.
For more help, check http://xhr.spec.whatwg.org/.
I would avoid doing this in the unload event, because synchronous AJAX is the only approach that will reliably work there, and synchronous AJAX requests on the main thread are deprecated in modern browsers.
Alternatives include:
keepalive requests
This would involve periodically sending a request to the server indicating that the user is still editing the resource. The downside of this technique is that the resource remains locked until the timeout elapses, so if your keepalive is set to an interval of 1 minute with a 3 minute lock timeout, the resource can stay locked for up to 3 minutes after the user has left the page. Additionally, if the user loses their network connection for 3 minutes or longer, the lock will also be released.
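A rough sketch of the heartbeat idea, assuming the lock object from the question above and a hypothetical refresh endpoint (lock.refresh_url is an assumption, not part of the original code):
// Heartbeat sketch: refresh the lock once a minute while the page is open.
// lock.refresh_url is a hypothetical endpoint; adjust to your API.
var KEEPALIVE_INTERVAL = 60 * 1000; // 1 minute
var keepalive_timer = setInterval(function () {
    $.ajax({
        method: "PUT",
        url: lock.refresh_url,
        data: lock,
        error: function () {
            console.log("failed to refresh lock " + lock.id);
        }
    });
}, KEEPALIVE_INTERVAL);
// Call clearInterval(keepalive_timer) once the lock is released.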
websockets
This would create an open connection between the client and the server, and while this connection is open you can keep the resource locked. As soon as the client disconnects, you can assume the client has closed the page and unlock it. The downside here is that if the client loses its network connection, the resource will also become unlocked.
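A minimal sketch of the WebSocket variant, assuming the server releases the lock whenever it sees the connection close (the URL and message format are placeholders):
// Hold the lock for as long as this socket stays open.
// wss://example.com/locks/... is a placeholder endpoint.
var socket = new WebSocket("wss://example.com/locks/" + lock.id);
socket.onopen = function () {
    // Hypothetical protocol: tell the server which lock this connection owns.
    socket.send(JSON.stringify({ action: "acquire", lock: lock }));
};
socket.onclose = function () {
    // Nothing to do client-side: the server notices the disconnect
    // (page closed, network lost) and releases the lock.
};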
Related
I am trying to detect whether a user is running some kind of browser extension that could prevent my site from working properly and, if so, display a message to the user.
I am testing with an extension called uMatrix, but there are others with a similar approach.
My problem is that this kind of extension blocks my HTTP request without returning a proper status code (like 403, 404 or 500). Instead, when I catch the error from my request, I just get:
Error: Network Error
at [my file name and line number here]
at XMLHttpRequest.handleError (xhr.js:83)
I believe this same error would be thrown in other circumstances, like a lack of internet connection, so I can't assume this Network Error means the user has an "HTTP request blocker".
I have read a lot about detecting ad blockers in this thread and elsewhere, but I don't think it applies to my issue.
Any ideas on how to identify that a user is blocking my HTTP Requests? (Either on purpose or through a browser extension)
I thought I would share the solution I found, even though I don't think it's the best answer yet.
I am using Axios for my API requests and I found this thread here:
https://github.com/axios/axios/issues/383#issuecomment-234079506
What they suggest is to check whether the error has a response attached (in the latest Axios, a blocked request doesn't produce a response object at all). If there is no response, the server was never reached. That could still mean there is no internet connection rather than an extension blocking the request, so I adjusted my message to cover both scenarios.
My final code was something like this:
// Note the use of "HEAD", since that will be faster than a "GET"
axios.head(server_url_for_testing).catch(function (error) {
    if (!error.response) {
        // Display my warning message
    }
});
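One way to at least separate the two scenarios (this is a refinement of my own, not from the Axios thread) is to also look at navigator.onLine; the two message helpers below are hypothetical placeholders:
axios.head(server_url_for_testing).catch(function (error) {
    if (!error.response) {
        if (navigator.onLine === false) {
            // The browser reports no network connection at all.
            showOfflineMessage(); // hypothetical helper
        } else {
            // We appear to be online, yet the server was never reached:
            // most likely something (an extension, a proxy) blocked the request.
            showBlockerMessage(); // hypothetical helper
        }
    }
});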
If my web app is suffering from head-of-line blocking and I'd like to prioritize the incoming requests that are being blocked, is there a way I can cancel the blocking requests from JavaScript? Thanks
Yes.
The XMLHttpRequest (XHR) object you hold a reference to can be used to abort the request. For example, if you want to cancel a request that has already been running for a second:
var xhr = new XMLHttpRequest();
xhr.open(...);
xhr.send(...);
...
setTimeout(function () {
    if (xhr) xhr.abort(); // cancel the request if it is still in flight
}, 1000);
Very important: aborting an in-flight request on the client does not cancel it on the server. For example, a request to delete a record might take a long time to respond, but the server will honor the request regardless of whether the client aborted it.
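For code that uses fetch instead of XMLHttpRequest, the equivalent cancellation mechanism is AbortController; a minimal sketch (the URL and timeout are placeholders), with the same caveat that the server may still process the request:
// Cancel a fetch after one second using an AbortController.
var controller = new AbortController();
fetch("/slow-endpoint", { signal: controller.signal }) // placeholder URL
    .then(function (response) { /* handle the response */ })
    .catch(function (err) {
        if (err.name === "AbortError") {
            // The request was cancelled client-side; the server may still finish it.
        }
    });
setTimeout(function () {
    controller.abort();
}, 1000);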
Have you taken a look at the abort() method? In theory you can listen for the event and abort the requests.
Also, could you please elaborate further upon your question.
I have a navigator.sendBeacon request being sent during a pagehide event on Safari with some analytics data to an endpoint on the same domain as the current page. This works fine when the tab is being closed, but when navigating to a new URL, Safari throws Beacon API Cannot load <url> due to access control checks while trying to make the request.
This issue does not occur on Chrome, and there are no other logs shown. I don't think this is a CORS request, all domains and subdomains are the same.
Has anyone else seen this or know how to fix?
Using any sort of Asynchronous HTTP request, whether it is sendBeacon, fetch, or XMLHttpRequest seems to have problems in both desktop and iOS Safari at the moment when inside a pagehide event. I have received versions of the same error such as Fetch API cannot load ... due to access control checks when I use different types of HTTP requesters within the pagehide event. I am sure that it is not a CORS error, since the exact same request does not have a problem outside of a pagehide event.
While not recommended due to their blocking of the main thread, I am using synchronous requests until the bug is patched in Safari. For my use case, it is more critical that the analytics data from pagehide is successfully sent, even though it causes a small delay for the end user. Synchronous HTTP requests are a meh workaround until the bug is remediated, which hopefully is soon, since the link from @Phillip Walton suggests that a patch has been accepted but obviously has not been released yet.
if (isSafari && pageHideBroken) {
    $.ajax({
        type: "POST",
        async: false, //The most important line
        url: `https://`,
        data: 'Goodbye',
        timeout: 5000
    });
} else {
    navigator.sendBeacon(`https://`, 'Goodbye');
}
I have confirmed on both desktop Safari and iOS Safari that my backend successfully receives the data using this approach. jQuery is not required to make a synchronous HTTP request; I just used $.ajax as the example because it is more concise than XMLHttpRequest. If you make this workaround conditional like I have, it is easy to swap back to navigator.sendBeacon once the bug is fixed! This type of browser-dependent behavior is never fun to code around.
If I make an AJAX request, it will be displayed in the Network tab in Chrome. If I make a client-side redirect at the same moment, the AJAX request will be canceled. But will the request make it to the server and execute as normal? Is there something in HTTP/TCP that knows the client has canceled the request? I don't think so, but I want to be sure.
If you're running PHP server-side, it will stop processing in the event of a client-side abort. (From what I've read, this isn't the case with other server-side technologies, which will continue processing after a client aborts.) See:
http://php.net/manual/en/features.connection-handling.php
But, it's best not to assume anything one way or another. The browser may cancel the request. And this cancellation may occur in time to stop processing server-side. But, that's not necessarily the case. The client could cancel at any stage during the request -- from right before the request is actually sent to just after a response body is sent. Also bear in mind, there are other things which can interrupt server-side request processing (hardware, power, OS failure, etc.). Expect some unpredictability.
From this, I'd make two recommendations:
Write your code to be as transaction-safe as possible. If a request makes data changes, don't commit them until all changes have been piped to the database. And if your application relies on multiple AJAX requests to change some data, don't commit any of the changes until the end of the "final" AJAX request.
Do not assume, even if a request finishes, that the client receives the response. Off the top of my head, that means if your application is AJAX-heavy, always rely on client-side state to tell the server what information it has, rather than relying on server-side state to assume the client "knows" something.
This is one of the few cases where synchronous requests (async: false in the $.ajax(...) options) are appropriate. This usually prevents the browser from navigating to the other page until the request has finished.
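A minimal sketch of that idea (the cleanup URL, payload and destination page are placeholders):
// Block navigation until the cleanup request has completed, then redirect.
$.ajax({
    type: "POST",
    async: false, // synchronous on purpose: hold up the redirect
    url: "/cleanup", // placeholder endpoint
    data: { reason: "leaving page" } // placeholder payload
});
window.location.href = "/next-page"; // placeholder destination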
I have a JavaScript web app set up to use long polling with a .NET server, using this AJAX call:
var _listenTimeout = 5 * 60 * 1000;
//...
$.ajax({
    type: "GET",
    url: url,
    async: true,
    cache: false,
    dataType: "jsonp",
    timeout: _listenTimeout,
    success: callback
});
However, there is an issue when the page has been open for a while: requests to the server stop responding. They seem to be in a limbo state, stuck between the web browser and actually being sent out onto the network to the server.
I've dug around, and I know a couple of things:
The request gets issued, because it shows up in the Chrome developer tools Network tab (also chrome://net-internals) and in Firebug. The status of the request is always pending, and it just stalls in that state.
The request url is correct, because sending the same url through curl returns an immediate response.
It hasn't left my network, because there aren't any packets detected using Wireshark. The server logs (and Wireshark on that side) don't show anything either.
The strange thing is, sometimes it works sporadically, but with a delay of a couple of minutes, i.e. the browser network log shows the request getting a response (a 204 in my case), and Wireshark detects the request being sent (and received on the server end). The delay is the same for both the browser and Wireshark, i.e. the request is made, a delay occurs, then Chrome detects a response and Wireshark detects packets sent.
A clean refresh of the page also fixes the issue, with the same request/response occurring almost immediately (with the relevant browser network and Wireshark logs).
I've tried it on Chrome 21.0.1180.89 and Firefox 14.01.
Is there some kind of browser queue that gets clogged up with AJAX requests? Or maybe too many concurrent requests? No idea right now...
$.ajax won't make periodic requests on its own. The timeout option sets a timeout (in milliseconds) for a single request; it is not a polling interval. Read about it here: http://api.jquery.com/jQuery.ajax/. That's why you are seeing only one AJAX request. If you want repeated requests, you have to schedule them yourself with JavaScript's setTimeout.
You can call setTimeout in the success and error callbacks, so that each new request only starts after the previous one finishes and you never have concurrent AJAX requests.
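A minimal sketch of that pattern, reusing url, callback and _listenTimeout from the question plus a hypothetical one-second pause between polls:
var _retryDelay = 1000; // hypothetical pause before the next poll
function poll() {
    $.ajax({
        type: "GET",
        url: url,
        cache: false,
        dataType: "jsonp",
        timeout: _listenTimeout,
        success: function (data) {
            callback(data);
            // Only schedule the next poll once this one has finished,
            // so there is never more than one request in flight.
            setTimeout(poll, _retryDelay);
        },
        error: function () {
            // Back off and try again after errors/timeouts as well.
            setTimeout(poll, _retryDelay);
        }
    });
}
poll();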