Force a reload of page in Chrome using Javascript [no cache] - javascript

I need to reload a page using JavaScript and ensure that it does not pull from the browser cache but instead reloads the page from the server.
[As elements of the page will have changed in the interim]
On IE and FF I found that the following code worked fine:
window.location.reload(true);
However it does not work on Chrome or Safari.
I tried the following, but also to no avail:
window.location.replace(location.href);
document.location.reload(true);
document.location.replace(location.href);
Is there a solution to this issue?
Findings
After looking into this I have found that the issue comes down to HTTP protocol handling:
Chrome sends a request with Pragma: no-cache HTTP field
Server responds with Last-Modified: DATE1 field
JS uses location.reload(true) to force a reload from server not cache
Chrome sends a request with If-Modified-Since: DATE1 field
Server responds with HTTP Status 304 Not Modified
The server application is at fault for not noticing the state change in the dynamic page content, and thus not returning a 200.
However, Chrome/WebKit is the only browser that sends an If-Modified-Since field when the JS location.reload(true) is called.
I thought I would put my findings here in case someone else comes across the same issue.
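For completeness, a minimal sketch (not the original server, just a hypothetical Node.js stand-in) of the usual server-side fix: mark the dynamic page as uncacheable so the If-Modified-Since / 304 round-trip described above never happens:
const http = require('http');

http.createServer((req, res) => {
    res.writeHead(200, {
        'Content-Type': 'text/html',
        // No-cache headers prevent the conditional request / 304 path entirely
        'Cache-Control': 'no-store, no-cache, must-revalidate',
        'Pragma': 'no-cache',
        'Expires': '0'
    });
    res.end('<html><body>Generated at ' + new Date().toISOString() + '</body></html>');
}).listen(8080);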

You can use this hack:
$.ajax({
    url: window.location.href,
    headers: {
        "Pragma": "no-cache",
        "Expires": -1,
        "Cache-Control": "no-cache"
    }
}).done(function () {
    window.location.reload(true);
});

To ensure the page isn't loaded from cache you can add a unique number to the query string:
window.location = location.href + '?upd=' + 123456;
You can also use a date/timestamp instead of 123456.
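For example, a minimal sketch using the current time as the cache-buster (the upd parameter name is just an illustration):
// Strip any existing query string, then append a unique timestamp
window.location = location.href.split('?')[0] + '?upd=' + new Date().getTime();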

This is what I do to ensure my application file is force-reloaded on Chrome:
var oAjax = new XMLHttpRequest();
oAjax.open('GET', '/path/to/my/app.js');
oAjax.setRequestHeader('Pragma', 'no-cache');
// Attach the handler before sending so no readyState change is missed
oAjax.onreadystatechange = function() {
    if (oAjax.readyState === 4) {
        self.location.reload();
    }
};
oAjax.send();

Try window.location = window.location

Great findings! I just encountered the same issue and this really helps a lot!
However, in addition to your finding, it seems that Chrome always sends a GET request for location.reload(), whereas IE/FF repeat the last request instead.

Related

XMLHttpRequest doesn't update in javascript's interval

I use AJAX to check for user updates every 2 seconds, but my JavaScript does not update the response.
I have one JavaScript file with an XMLHttpRequest object, and every 2 seconds it sends a request to another file (.php) which returns XML with updates. For some reason it doesn't always get the newest content and seems to return old, cached data.
My javascript file contains this code (simplified):
var updates = new XMLHttpRequest();
updates.onreadystatechange = function(){
    "use strict";
    if (updates.readyState === 4 && updates.status === 200) {
        console.log(updates.responseXML);
    }
};
var timer = 0;
clearInterval(timer);
timer = setInterval(function(){
    "use strict";
    updates.open('GET', 'scripts/check_for_notifications.php', true);
    updates.send();
}, 2000);
Then I have the PHP file (check_for_notifications.php), where I have this code:
$response = new SimpleXMLElement('<xml/>');
$update = $response->addChild('update');
$update->addChild('content', 'New message');
$update->addChild('redirect', 'some link');
$update->addChild('date', '1.1.2019 12:00');
header('Content-type: text/xml');
print($response->asXML());
Every two seconds I receive a log in my console, but when I change the PHP file while the interval is in progress (e.g. I change the date to '1.1.2019 11:00' and save it), I still receive '12:00' in the console. It seems that the responseXML is cached and never updates. Is there any way I could "flush" the output, or am I doing it wrong?
It's probably a cache problem. In the browser's network console, you should see a 304 Not Modified response.
To be sure, you can add an element in the url to bypass the cache:
updates.open('GET', 'scripts/check_for_notifications.php?nocache=' + new Date().getTime(), true);
If this works, you will need to configure the server (Apache or nginx) to prevent the file from being cached. That is cleaner than the timestamp solution, since the browser does not store cached files unnecessarily.
Apache .htaccess or Apache conf, something like
<Files "check.php">
<IfModule mod_headers.c>
Header set Cache-Control "no-cache, no-store, must-revalidate"
Header set Pragma "no-cache"
Header set Expires 0
</IfModule>
</Files>
Nginx conf, something like
location = /check.php {
    add_header 'Cache-Control' 'no-cache, no-store, must-revalidate';
    expires off;
}
You can also look at the fetch API: https://hacks.mozilla.org/2016/03/referrer-and-cache-control-apis-for-fetch/
Be careful with your code: it can launch several requests simultaneously, and you can end up with a self-inflicted DoS if there are too many users.
If a request takes more than 2 seconds because the server is slow, other requests will be sent in the meantime, which will slow the server down even more.
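One way to avoid that pile-up is to schedule the next poll only after the current response has arrived, using setTimeout instead of setInterval. A minimal sketch, reusing the same endpoint and the cache-busting parameter from above:
"use strict";
function poll() {
    var updates = new XMLHttpRequest();
    updates.onreadystatechange = function () {
        if (updates.readyState === 4) {
            if (updates.status === 200) {
                console.log(updates.responseXML);
            }
            // Only schedule the next request once this one has finished
            setTimeout(poll, 2000);
        }
    };
    updates.open('GET', 'scripts/check_for_notifications.php?nocache=' + new Date().getTime(), true);
    updates.send();
}
poll();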

how to make a cross domain api call using jQuery [duplicate]

I'm trying to load a cross-domain HTML page using AJAX, but unless the dataType is "jsonp" I can't get a response. However, using jsonp the browser expects a script MIME type but receives "text/html".
My code for the request is:
$.ajax({
    type: "GET",
    url: "http://saskatchewan.univ-ubs.fr:8080/SASStoredProcess/do?_username=DARTIES3-2012&_password=P#ssw0rd&_program=%2FUtilisateurs%2FDARTIES3-2012%2FMon+dossier%2Fanalyse_dc&annee=2012&ind=V&_action=execute",
    dataType: "jsonp",
}).success(function (data) {
    $('div.ajax-field').html(data);
});
Is there any way of avoiding using jsonp for the request? I've already tried using the crossDomain parameter but it didn't work.
If not is there any way of receiving the html content in jsonp? Currently the console is saying "unexpected <" in the jsonp reply.
jQuery Ajax Notes
Due to browser security restrictions, most Ajax requests are subject to the same origin policy; the request can not successfully retrieve data from a different domain, subdomain, port, or protocol.
Script and JSONP requests are not subject to the same origin policy restrictions.
There are some ways to overcome the cross-domain barrier:
CORS Proxy Alternatives
Ways to circumvent the same-origin policy
Breaking The Cross Domain Barrier
There are some plugins that help with cross-domain requests:
Cross Domain AJAX Request with YQL and jQuery
Cross-domain requests with jQuery.ajax
Heads up!
The best way to overcome this problem is to create your own proxy in the back end, so that your proxy points to the services in the other domains, because the same-origin policy restriction does not exist in the back end. But if you can't do that in the back end, then pay attention to the following tips.
Warning!
Using third-party proxies is not a secure practice, because they can keep track of your data, so use them only with public information, never with private data.
The code examples shown below use jQuery.get() and jQuery.getJSON(), both are shorthand methods of jQuery.ajax()
CORS Anywhere
2021 Update
The public demo server (cors-anywhere.herokuapp.com) will be very limited by January 31st, 2021
The demo server of CORS Anywhere (cors-anywhere.herokuapp.com) is meant to be a demo of this project. But abuse has become so common that the platform where the demo is hosted (Heroku) has asked me to shut down the server, despite efforts to counter the abuse. Downtime becomes increasingly frequent due to abuse and its popularity.
To counter this, I will make the following changes:
The rate limit will decrease from 200 per hour to 50 per hour.
By January 31st, 2021, cors-anywhere.herokuapp.com will stop serving as an open proxy.
From February 1st, 2021, cors-anywhere.herokuapp.com will only serve requests after the visitor has completed a challenge: the user (developer) must visit a page at cors-anywhere.herokuapp.com to temporarily unlock the demo for their browser. This allows developers to try out the functionality, to help with deciding on self-hosting or looking for alternatives.
CORS Anywhere is a node.js proxy which adds CORS headers to the proxied request.
To use the API, just prefix the URL with the API URL. (Supports https: see github repository)
If you want to automatically enable cross-domain requests when needed, use the following snippet:
$.ajaxPrefilter(function (options) {
    if (options.crossDomain && jQuery.support.cors) {
        var http = (window.location.protocol === 'http:' ? 'http:' : 'https:');
        options.url = http + '//cors-anywhere.herokuapp.com/' + options.url;
        //options.url = "http://cors.corsproxy.io/url=" + options.url;
    }
});

$.get(
    'http://en.wikipedia.org/wiki/Cross-origin_resource_sharing',
    function (response) {
        console.log("> ", response);
        $("#viewer").html(response);
    });
Whatever Origin
Whatever Origin provides cross-domain JSONP access. It is an open-source alternative to anyorigin.com.
To fetch the data from google.com, you can use this snippet:
// It is good to specify the charset you expect.
// You can use the charset you want instead of utf-8.
// See details for the scriptCharset and contentType options:
// http://api.jquery.com/jQuery.ajax/#jQuery-ajax-settings
$.ajaxSetup({
    scriptCharset: "utf-8", // or "ISO-8859-1"
    contentType: "application/json; charset=utf-8"
});

$.getJSON('http://whateverorigin.org/get?url=' +
    encodeURIComponent('http://google.com') + '&callback=?',
    function (data) {
        console.log("> ", data);
        // If the expected response is text/plain
        $("#viewer").html(data.contents);
        // If the expected response is JSON
        // var response = $.parseJSON(data.contents);
    });
CORS Proxy
CORS Proxy is a simple node.js proxy to enable CORS request for any website.
It allows javascript code on your site to access resources on other domains that would normally be blocked due to the same-origin policy.
CORS-Proxy gr2m (archived)
CORS-Proxy rmadhuram
How does it work?
CORS Proxy takes advantage of Cross-Origin Resource Sharing, which is a feature that was added along with HTML 5. Servers can specify that they want browsers to allow other websites to request resources they host. CORS Proxy is simply an HTTP Proxy that adds a header to responses saying "anyone can request this".
This is another way to achieve the goal (see www.corsproxy.com). All you have to do is strip http:// and www. from the URL being proxied, and prepend the URL with www.corsproxy.com/
$.get(
    'http://www.corsproxy.com/' +
    'en.wikipedia.org/wiki/Cross-origin_resource_sharing',
    function (response) {
        console.log("> ", response);
        $("#viewer").html(response);
    });
The http://www.corsproxy.com/ domain now appears to be an unsafe/suspicious site. NOT RECOMMENDED TO USE.
CORS proxy browser
Recently I found this one; it involves various security-oriented Cross-Origin Resource Sharing utilities. But it is a black box with Flash as the backend.
You can see it in action here: CORS proxy browser
Get the source code on GitHub: koto/cors-proxy-browser
You can use Ajax-cross-origin, a jQuery plugin.
With this plugin you can use jQuery.ajax() cross-domain. It uses Google services to achieve this:
The AJAX Cross Origin plugin uses Google Apps Script as a proxy JSON getter where JSONP is not implemented. When you set the crossOrigin option to true, the plugin replaces the original url with the Google Apps Script address and sends it as an encoded url parameter. The Google Apps Script uses Google servers' resources to get the remote data, and returns it back to the client as JSONP.
It is very simple to use:
$.ajax({
    crossOrigin: true,
    url: url,
    success: function(data) {
        console.log(data);
    }
});
You can read more here:
http://www.ajax-cross-origin.com/
If the external site doesn't support JSONP or CORS, your only option is to use a proxy.
Build a script on your server that requests that content, then use jQuery ajax to hit the script on your server.
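As a rough illustration of that back-end proxy idea (a sketch under assumptions, not code from any answer here), a tiny Node.js endpoint that fetches a remote page and serves it from your own origin; the port and the url query parameter are made up for the example:
// proxy.js - run with: node proxy.js
const http = require('http');
const https = require('https');

http.createServer((req, res) => {
    // Expect requests like /proxy?url=http%3A%2F%2Fexample.com%2Fpage
    const target = new URL(req.url, 'http://localhost').searchParams.get('url');
    if (!target) {
        res.writeHead(400);
        return res.end('Missing url parameter');
    }
    const client = target.indexOf('https') === 0 ? https : http;
    client.get(target, function (upstream) {
        res.writeHead(upstream.statusCode, {
            'Content-Type': upstream.headers['content-type'] || 'text/html'
        });
        upstream.pipe(res); // stream the remote body back to the browser
    }).on('error', function () {
        res.writeHead(502);
        res.end('Upstream error');
    });
}).listen(3000);
Your page, served from the same origin as the proxy, can then call it with plain jQuery and no same-origin problems:
$.get('/proxy?url=' + encodeURIComponent('http://en.wikipedia.org/wiki/Cross-origin_resource_sharing'), function (html) {
    $('#viewer').html(html);
});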
Just put this at the top of your PHP page and it will work without an API:
header('Access-Control-Allow-Origin: *'); //allow everybody
or
header('Access-Control-Allow-Origin: http://codesheet.org'); //allow just one domain
or
$http_origin = $_SERVER['HTTP_ORIGIN']; //allow multiple domains
$allowed_domains = array(
    'http://codesheet.org',
    'http://stackoverflow.com'
);
if (in_array($http_origin, $allowed_domains)) {
    header("Access-Control-Allow-Origin: $http_origin");
}
I'm posting this in case someone faces the same problem I am facing right now. I've got a Zebra thermal printer, equipped with the ZebraNet print server, which offers an HTML-based user interface for editing multiple settings, seeing the printer's current status, etc. I need to get the status of the printer, which is displayed on one of those HTML pages offered by the ZebraNet server, and, for example, alert() a message to the user in the browser. This means that I have to get that HTML page in JavaScript first.
Although the printer is within the LAN of the user's PC, the same-origin policy is still firmly in my way. I tried JSONP, but the server returns HTML and I haven't found a way to modify its functionality (if I could, I would have already set the magic header Access-Control-Allow-Origin: *). So I decided to write a small console app in C#. It has to be run as Admin to work properly, otherwise it throws an exception. Here is some code:
// Create a listener.
HttpListener listener = new HttpListener();
// Add the prefixes.
//foreach (string s in prefixes)
//{
//    listener.Prefixes.Add(s);
//}
listener.Prefixes.Add("http://*:1234/"); // accept connections from everywhere,
// because the printer is accessible only within the LAN (no port forwarding)
listener.Start();
Console.WriteLine("Listening...");
// Note: The GetContext method blocks while waiting for a request.
HttpListenerContext context;
string urlForRequest = "";
HttpWebRequest requestForPage = null;
HttpWebResponse responseForPage = null;
string responseForPageAsString = "";

while (true)
{
    context = listener.GetContext();
    HttpListenerRequest request = context.Request;
    // Remove the leading slash, which separates the port number from the URL argument sent
    urlForRequest = request.RawUrl.Substring(1, request.RawUrl.Length - 1);
    Console.WriteLine(urlForRequest);

    // Request the html page:
    requestForPage = (HttpWebRequest)WebRequest.Create(urlForRequest);
    responseForPage = (HttpWebResponse)requestForPage.GetResponse();
    responseForPageAsString = new StreamReader(responseForPage.GetResponseStream()).ReadToEnd();

    // Obtain a response object.
    HttpListenerResponse response = context.Response;
    // Send back the response.
    byte[] buffer = System.Text.Encoding.UTF8.GetBytes(responseForPageAsString);
    // Get a response stream and write the response to it.
    response.ContentLength64 = buffer.Length;
    response.AddHeader("Access-Control-Allow-Origin", "*"); // the magic header in action ;-D
    System.IO.Stream output = response.OutputStream;
    output.Write(buffer, 0, buffer.Length);
    // You must close the output stream.
    output.Close();
    //listener.Stop();
}
All the user needs to do is run that console app as Admin. I know it is way too ... frustrating and complicated, but it is sort of a workaround to the Domain Policy problem in case you cannot modify the server in any way.
Edit: from JS I make a simple AJAX call:
$.ajax({
    type: 'POST',
    url: 'http://LAN_IP:1234/http://google.com',
    success: function (data) {
        console.log("Success: " + data);
    },
    error: function (e) {
        alert("Error: " + e);
        console.log("Error: " + e);
    }
});
The html of the requested page is returned and stored in the data variable.
To get the data from an external site using a local proxy, as suggested by jherax, you can create a PHP page that fetches the content from the respective external URL for you, and then send a GET request to that PHP page:
var req = new XMLHttpRequest();
req.open('GET', 'http://localhost/get_url_content.php', false); // synchronous request
req.send();
if (req.status == 200) {
    alert(req.responseText);
}
As a PHP proxy you can use https://github.com/cowboy/php-simple-proxy
Your URL doesn't work these days, but your code can be updated with this working solution:
var url = "http://saskatchewan.univ-ubs.fr:8080/SASStoredProcess/do?_username=DARTIES3-2012&_password=P#ssw0rd&_program=%2FUtilisateurs%2FDARTIES3-2012%2FMon+dossier%2Fanalyse_dc&annee=2012&ind=V&_action=execute";
url = 'https://google.com'; // TEST URL
$.get("https://images"+~~(Math.random()*33)+"-focus-opensocial.googleusercontent.com/gadgets/proxy?container=none&url=" + encodeURI(url), function(data) {
$('div.ajax-field').html(data);
});
<div class="ajax-field"></div>
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
You need a CORS proxy which proxies your request from your browser to the requested service with the appropriate CORS headers. A list of such services is in the code snippet below. You can also run the provided code snippet to see the ping to such services from your location.
$('li').each(function() {
    var self = this;
    ping($(this).text()).then(function(delta) {
        console.log($(self).text(), delta, ' ms');
    });
});
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<script src="https://cdn.rawgit.com/jdfreder/pingjs/c2190a3649759f2bd8569a72ae2b597b2546c871/ping.js"></script>
<ul>
<li>https://crossorigin.me/</li>
<li>https://cors-anywhere.herokuapp.com/</li>
<li>http://cors.io/</li>
<li>https://cors.5apps.com/?uri=</li>
<li>http://whateverorigin.org/get?url=</li>
<li>https://anyorigin.com/get?url=</li>
<li>http://corsproxy.nodester.com/?src=</li>
<li>https://jsonp.afeld.me/?url=</li>
<li>http://benalman.com/code/projects/php-simple-proxy/ba-simple-proxy.php?url=</li>
</ul>
Figured it out.
Used this instead.
$('.div_class').load('http://en.wikipedia.org/wiki/Cross-origin_resource_sharing #toctitle');

POST going as OPTIONS in firebase node js http request [duplicate]

I am working on an internal web application at work. In IE10 the requests work fine, but in Chrome all the AJAX requests (which there are many) are sent using OPTIONS instead of whatever defined method I give it. Technically my requests are "cross domain." The site is served on localhost:6120 and the service I'm making AJAX requests to is on 57124. This closed jquery bug defines the issue, but not a real fix.
What can I do to use the proper http method in ajax requests?
Edit:
This is in the document load of every page:
jQuery.support.cors = true;
And every AJAX is built similarly:
var url = 'http://localhost:57124/My/Rest/Call';
$.ajax({
    url: url,
    dataType: "json",
    data: json,
    async: true,
    cache: false,
    timeout: 30000,
    headers: { "x-li-format": "json", "X-UserName": userName },
    success: function (data) {
        // my success stuff
    },
    error: function (request, status, error) {
        // my error stuff
    },
    type: "POST"
});
Chrome is preflighting the request to look for CORS headers. If the request is acceptable, it will then send the real request. If you're doing this cross-domain, you will simply have to deal with it or else find a way to make the request non-cross-domain. This is why the jQuery bug was closed as won't-fix. This is by design.
Unlike simple requests (discussed above), "preflighted" requests first
send an HTTP request by the OPTIONS method to the resource on the
other domain, in order to determine whether the actual request is safe
to send. Cross-site requests are preflighted like this since they may
have implications to user data. In particular, a request is
preflighted if:
It uses methods other than GET, HEAD or POST. Also, if POST is used to send request data with a Content-Type other than
application/x-www-form-urlencoded, multipart/form-data, or text/plain,
e.g. if the POST request sends an XML payload to the server using
application/xml or text/xml, then the request is preflighted.
It sets custom headers in the request (e.g. the request uses a header such as X-PINGOTHER)
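To make the distinction concrete, a hedged sketch: the first call below stays a "simple" request (default form-urlencoded body, no custom headers) and goes out without a preflight, while the second adds a custom header and a JSON content type and therefore triggers the OPTIONS round-trip first:
// Simple request: no preflight expected
$.ajax({
    type: "POST",
    url: "http://localhost:57124/My/Rest/Call",
    data: { name: "value" } // sent as application/x-www-form-urlencoded by default
});

// Preflighted request: the custom header and JSON content type force an OPTIONS request first
$.ajax({
    type: "POST",
    url: "http://localhost:57124/My/Rest/Call",
    contentType: "application/json",
    headers: { "X-UserName": "someone" },
    data: JSON.stringify({ name: "value" })
});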
Because the Ajax call is sent to a different port than the one the page was served from, it is considered a cross-origin (CORS) request, which in other words means that the browser automatically issues an OPTIONS request to check for CORS headers on the server's/servlet's side.
This happens even if you set
crossOrigin: false;
or even if you omit it.
The reason is simply that localhost:6120 != localhost:57124. Try sending it only to localhost without the port - it will fail because the requested target won't be reachable, but notice that if the origins are equal the request is sent without an OPTIONS request before the POST.
I agree with Kevin B, the bug report says it all. It sounds like you are trying to make cross-domain ajax calls. If you're not familiar with the same origin policy you can start here: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Same_origin_policy_for_JavaScript.
If this is not intended to be a cross-domain ajax call, try making your target url relative and see if the problem goes away. If you're really desperate, look into JSONP, but beware, mayhem lurks. There really isn't much more we can do to help you.
If possible, pass the params through a regular GET/POST with a different name and let your server-side code handle it.
I had a similar issue with my own proxy to bypass CORS, and I got the same POST->OPTIONS error in Chrome. It was the Authorization header in my case ("x-li-format" and "X-UserName" here in your case). I ended up passing it as a dummy GET parameter (e.g. AuthorizationJack) and changed my proxy's code to turn that parameter into a header when making the call to the destination. Here it is in PHP:
if (isset($_GET['AuthorizationJack'])) {
    $request_headers[] = "Authorization: Basic " . $_GET['AuthorizationJack'];
}
In my case I'm calling an API hosted by AWS (API Gateway). The error happened when I tried to call the API from a domain other than the API own domain. Since I'm the API owner I enabled CORS for the test environment, as described in the Amazon Documentation.
In production this error will not happen, since the request and the api will be in the same domain.
I hope it helps!
As answered by @Dark Falcon, I simply dealt with it.
In my case, I am using node.js server, and creating a session if it does not exist. Since the OPTIONS method does not have the session details in it, it ended up creating a new session for every POST method request.
So in my app routine to create-session-if-not-exist, I just added a check to see if method is OPTIONS, and if so, just skip session creating part:
app.use(function(req, res, next) {
    if (req.method !== "OPTIONS") {
        if (req.session && req.session.id) {
            // Session exists
            next();
        } else {
            // Create session
            next();
        }
    } else {
        // If the request method is OPTIONS, just skip the session part and move on.
        next();
    }
});
"preflighted" requests first send an HTTP request by the OPTIONS method to the resource on the other domain, in order to determine whether the actual request is safe to send. Cross-site requests
https://developer.mozilla.org/en-US/docs/Web/HTTP/Access_control_CORS
Consider using axios
axios.get(url, { headers: { "Content-Type": "application/json" } })
    .then(res => {
        if (res.data.error) {
            // handle an error reported by the API
        } else {
            doAnything(res.data);
        }
    })
    .catch(function (error) {
        doAnythingError(error);
    });
I had this issue using fetch and axios worked perfectly.
I've encountered a very similar issue. I spent almost half a day trying to understand why everything works correctly in Firefox and fails in Chrome. In my case it was because of duplicated (or maybe mistyped) fields in my request header.
Use fetch instead of XHR; then the request will not be preflighted even if it's cross-domain, as long as it remains a simple request.
$.ajax({
    url: '###',
    contentType: 'text/plain; charset=utf-8',
    async: false,
    xhrFields: {
        withCredentials: true,
        crossDomain: true,
        Authorization: "Bearer ...."
    },
    method: 'POST',
    data: JSON.stringify(request),
    success: function (data) {
        console.log(data);
    }
});
The contentType: 'text/plain; charset=utf-8', or just contentType: 'text/plain', works for me!
Regards!

Random API calls always give same result [duplicate]

How do I prevent browsers from caching Ajax results? I have an event-triggered Ajax script that displays results only once the browser's data has been cleared.
Tested in IE6 and Firefox 3.0.10
The random URL works, but it's kind of a hack. HTTP has solutions built in that should work. Try using the solution indicated here. Basically, set the headers:
"Pragma": "no-cache",
"Cache-Control": "no-store, no-cache, must-revalidate, post-check=0, pre-check=0",
"Expires": 0,
"Last-Modified": new Date(0), // January 1, 1970
"If-Modified-Since": new Date(0)
Add a random query string to the URL you are sending.
E.g. if the Ajax request is sent to "http://www.xyz.com/a"
then add a random string at the end: "http://www.xyz.com/a?q=39058459ieutm39"
I've used the jQuery {cache: false} method and it worked like a charm.
The complete code example is like this:
$.ajaxSetup({cache: false});
There are two techniques for this that I'm aware of.
Add some sort of query string to the AJAX request URL so that it's always unique. A millisecond timestamp (perhaps combined with a random value) is good for this
Set HTTP cache control headers on the AJAX response so that the browser doesn't cache it
Using jQuery you can set the global ajax setting { cache: false }. See the jQuery ajax docs.

Navigator.sendBeacon() to pass header information

I am using navigator.sendBeacon() for communicating with the server, but the problem is that we need to pass some header information, because there is a filter which checks that the request comes from a valid source.
Can anybody help on this?
Thanks.
See the Navigator.sendBeacon MDN documentation for further information.
Create a blob to provide headers. Here is an example:
window.onunload = () => {
    const body = {
        id,
        email,
    };
    const headers = {
        type: 'application/json',
    };
    const blob = new Blob([JSON.stringify(body)], headers);
    navigator.sendBeacon('url', blob);
};
navigator.sendBeacon will send a POST request with the Content-Type request header set to whatever is in headers.type. This seems to be the only header you can set in a beacon though, per W3C:
The sendBeacon method does not provide ability to customize the request method, provide custom request headers, or change other processing properties of the request and response. Applications that require non-default settings for such requests should use the [FETCH] API with keepalive flag set to true.
I was able to observe some of how this worked through this Chromium bug report.
As written in the Processing Model of sendBeacon:
Extract object's byte stream (transmittedData) and content type (contentType).
How extraction is performed is described here
What I've gathered is that the content type of the transmitted data is extracted, and it is set as the Content-Type of the HTTP request.
1) If a Blob object is sent, the Content-Type becomes the Blob's type.
2) If a FormData object is sent, the Content-Type becomes multipart/form-data
3) If a URLSearchParams object is sent, the Content-Type becomes application/x-www-form-urlencoded
4) If a normal string is sent, the Content-Type becomes text/plain
Javascript code to implement different objects can be found here
If you're using Chrome and you're trying to set the content-type header, you'll probably have some issues due to security restrictions:
Uncaught DOMException: Failed to execute 'sendBeacon' on 'Navigator': sendBeacon() with a Blob whose type is not any of the CORS-safelisted values for the Content-Type request header is disabled temporarily. See http://crbug.com/490015 for details.
See sendBeacon API not working temporarily due to security issue, any workaround?
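For illustration, a rough sketch of how each payload type maps to the request's Content-Type (the /log URL is a placeholder; the Chrome Blob restriction just mentioned applies to the first case):
// Blob: Content-Type comes from the blob's type
navigator.sendBeacon('/log', new Blob([JSON.stringify({ id: 1 })], { type: 'application/json' }));

// FormData: Content-Type becomes multipart/form-data
var fd = new FormData();
fd.append('id', '1');
navigator.sendBeacon('/log', fd);

// URLSearchParams: Content-Type becomes application/x-www-form-urlencoded
navigator.sendBeacon('/log', new URLSearchParams({ id: '1' }));

// Plain string: Content-Type becomes text/plain
navigator.sendBeacon('/log', 'id=1');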
I want to call an API when someone closes the tab, so I tried to use navigator.sendBeacon(), but the problem is we need to pass an Authorization token with it and sendBeacon does not provide that, so I found another solution that is more effective and very easy to implement.
The solution is the native fetch API with the keepalive flag in a pagehide event.
Code
window.addEventListener('pagehide', () => {
    fetch(`<URL>`, {
        keepalive: true,
        method: '<METHOD>',
        headers: {
            'content-type': 'application/json',
            // any header you can pass here
        },
        body: JSON.stringify({ data: 'any data' }),
    });
});
FAQs / TL;DR Version
Why should we need to use the keepalive flag?
The keepalive option can be used to allow the request to outlive the page. Fetch with the keepalive flag is a replacement for the Navigator.sendBeacon() API.
Learn more about it, please visit https://developer.mozilla.org/en-US/docs/Web/API/fetch#parameters
What is PageLifecycle API
Learn more about it, please visit https://developer.chrome.com/blog/page-lifecycle-api/
From the Page Lifecycle image, shouldn't unload be considered as the best choice?
unload would be the best event for this case, but unload does not fire in some cases on mobile and it also does not support the bfcache functionality.
I also noticed that when I use unload I do not get proper output in the server log. Why? I don't know; if you know, comments are welcome.
Nowadays it is also not recommended by browser developers.
Learn more about why unload is not recommended: https://developer.mozilla.org/en-US/docs/Web/API/Window/unload_event#usage_notes
Learn more about pagehide: https://developer.mozilla.org/en-US/docs/Web/API/Window/pagehide_event
Because the sendBeacon() method does not allow header manipulation, I added the values to the form as normal fields:
const formData = new FormData();
formData.append('authorization', myAuthService.getCachedToken());
navigator.sendBeacon(myURL, formData);
Then on the host side I added a simple Middleware class (.Net) which catches POST requests without headers and copies them from the body:
public class AuthMiddleware
{
    ...
    ...
    public async Task Invoke(HttpContext context)
    {
        string authHeader = context.Request.Headers["Authorization"];
        if (authHeader == null && context.Request.Method == "POST")
        {
            context.Request.Headers["Authorization"] = string.Format("Bearer {0}",
                context.Request.Form["authorization"].ToString());
        }
        await _next.Invoke(context);
    }
}
Posting as an answer as I'm not allowed to post a comment under the answer:
For Chrome, the issue with navigator.sendBeacon sending a Blob with a non CORS-safelisted type was fixed in Chrome version 81, so this should be safe to use now.
https://bugs.chromium.org/p/chromium/issues/detail?id=724929
For IE, an alternative in the unload event is to use a synchronous ajax request, as IE doesn't support sendBeacon but did support a synchronous ajax call in my case.
You can't send JSON data after Chrome 39; it has been disabled due to a security concern.
You can try to send the data as plain text instead, but don't forget to parse the text on the backend.
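For example (the /analytics endpoint is a placeholder), the string payload goes out as text/plain and the backend parses it itself:
navigator.sendBeacon('/analytics', JSON.stringify({ event: 'tab-closed' }));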
After searching for an answer to this question, I found out that to pass a header with navigator.sendBeacon we need to pass a Blob object.
For example
var headers = { type: 'application/json' };
var blob = new Blob([JSON.stringify(request)], headers); // the payload must be wrapped in an array
navigator.sendBeacon('url/to/send', blob);
