I have a bunch of REST APIs which send cache-control as public, only-if-cached, max-stale=2419200, max-age=60
max-age=60 makes sure that if the client sends an identical API request within 60 seconds, it picks the response from its cache instead of forwarding the request to the server for a fresh response.
Now, this works perfectly for our mobile app because it is read-only and a one-minute delay in data updates is acceptable. However, the website, which uses the same set of APIs to add/delete data, needs to be real-time, i.e. it must always ignore the max-age=60 part of the header. Since the caching header is currently not ignored, all PUT, PATCH or DELETE requests reflect the changes only after a minute.
Is there any way I can make my website ignore cache-control? The web pages are written in plain JS and HTML. JS uses fetch for all REST requests.
You can disable caching in fetch by appending request headers:
var headers = new Headers();
headers.append('pragma', 'no-cache');
headers.append('cache-control', 'no-cache');

var init = {
  method: 'GET',
  headers: headers,
};

var request = new Request(YOUR_URL);

fetch(request, init)
  .then(response => response.json()) // handle the response as usual
  .then(data => console.log(data));
You can also use cache-control: no-store, which tells the browser never to store a cached version at all.
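The Fetch API also exposes a cache option on the request itself; a minimal sketch (YOUR_URL is the same placeholder as above):

// cache: 'no-store' bypasses the HTTP cache entirely, both for
// reading a stored response and for storing the new one.
fetch(YOUR_URL, { method: 'GET', cache: 'no-store' })
  .then(response => response.json())
  .then(data => console.log(data));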
Alternatively, you can use a dynamic string in the URL; note that the browser will still store a version in its cache:
const ms = Date.now();
const response = await fetch(YOUR_URL + "?t=" + ms);
Implement Timestamped Requests Logic:
Append a timestamp to each request, with the help of a utility method like the one below. It appends a timestamp to every request URL, preserving any existing query params.
private getTimeStampedUrl(url: string) {
  var timestamp = Date.now();
  var timestampedUrl = (url.indexOf('?') == -1) ? url + '?' : url + '&';
  timestampedUrl += 'timestamp=' + timestamp;
  return timestampedUrl;
}
This makes the browser treat every request as a new one, so caching never applies. In general, the browser cache only kicks in when we send identical requests, which we prevent here by appending a unique timestamp to each request.
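A hedged usage sketch (the /api/items endpoint is hypothetical, and the call assumes it runs inside the same class that defines getTimeStampedUrl):

// Each call produces a unique URL, so the browser cannot reuse a cached entry.
fetch(this.getTimeStampedUrl('/api/items?limit=10'))
  .then(response => response.json())
  .then(items => console.log(items));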
Related
I use AJAX to check for user updates every 2 seconds, but my JavaScript does not update the response.
I have one JavaScript file with an XMLHttpRequest object, and every 2 seconds it sends a request to another file (.php) where it gets XML with updates. For some reason, it doesn't always get the newest content and seems to serve some old cached version.
My javascript file contains this code (simplified):
var updates = new XMLHttpRequest();
updates.onreadystatechange = function () {
  "use strict";
  if (updates.readyState === 4 && updates.status === 200) {
    console.log(updates.responseXML);
  }
};

var timer = 0;
clearInterval(timer);
timer = setInterval(function () {
  "use strict";
  updates.open('GET', 'scripts/check_for_notifications.php', true);
  updates.send();
}, 2000);
Then I have the PHP file (check_for_notifications.php), where I have this code:
$response = new SimpleXMLElement('<xml/>');
$update = $response->addChild('update');
$update->addChild('content', 'New message');
$update->addChild('redirect', 'some link');
$update->addChild('date', '1.1.2019 12:00');
header('Content-type: text/xml');
print($response->asXML());
Every two seconds I receive a log in my console, but when I change the PHP file while the interval is in progress (e.g. I change the date to '1.1.2019 11:00' and save it), I still receive the '12:00' in the console. To me, it seems that it doesn't update and still has the responseXML cached. Is there any way I could "flush" the output, or am I doing it wrong?
It's probably a cache problem. In the browser's network console, you should see responses with status 304 Not Modified.
To be sure, you can add an element in the url to bypass the cache:
updates.open('GET', 'scripts/check_for_notifications.php?nocache=' + new Date().getTime(), true);
If this works, you will need to configure the server (Apache or nginx) to prevent the file from being cached. It's cleaner than the timestamp solution, as the browser does not store cached files unnecessarily.
Apache .htaccess or Apache conf, something like
<Files "check_for_notifications.php">
<IfModule mod_headers.c>
Header set Cache-Control "no-cache, no-store, must-revalidate"
Header set Pragma "no-cache"
Header set Expires 0
</IfModule>
</Files>
Nginx conf, something like
location = /scripts/check_for_notifications.php {
add_header 'Cache-Control' 'no-cache, no-store, must-revalidate';
expires off;
}
You can also look at the Fetch API: https://hacks.mozilla.org/2016/03/referrer-and-cache-control-apis-for-fetch/
Be careful with your code: it can launch several requests simultaneously, and you could end up with a DoS if there are too many users.
If a request takes more than 2 seconds because the server is slow, other requests will be sent in the meantime, which will slow the server down even more; a safer polling pattern is sketched below.
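Here is a hedged sketch of such a pattern, chaining setTimeout instead of using setInterval, so the next poll is only scheduled once the previous response has arrived:

function poll() {
  "use strict";
  var updates = new XMLHttpRequest();
  updates.onreadystatechange = function () {
    if (updates.readyState === 4) {
      if (updates.status === 200) {
        console.log(updates.responseXML);
      }
      setTimeout(poll, 2000); // next poll is scheduled only after this one finishes
    }
  };
  updates.open('GET', 'scripts/check_for_notifications.php?nocache=' + Date.now(), true);
  updates.send();
}
poll();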
I'm trying to load a cross-domain HTML page using AJAX but unless the dataType is "jsonp" I can't get a response. However using jsonp the browser is expecting a script mime type but is receiving "text/html".
My code for the request is:
$.ajax({
  type: "GET",
  url: "http://saskatchewan.univ-ubs.fr:8080/SASStoredProcess/do?_username=DARTIES3-2012&_password=P#ssw0rd&_program=%2FUtilisateurs%2FDARTIES3-2012%2FMon+dossier%2Fanalyse_dc&annee=2012&ind=V&_action=execute",
  dataType: "jsonp",
}).success(function (data) {
  $('div.ajax-field').html(data);
});
Is there any way of avoiding using jsonp for the request? I've already tried using the crossDomain parameter but it didn't work.
If not, is there any way of receiving the HTML content via JSONP? Currently the console says "unexpected <" in the JSONP reply.
jQuery Ajax Notes
Due to browser security restrictions, most Ajax requests are subject to the same origin policy; the request can not successfully retrieve data from a different domain, subdomain, port, or protocol.
Script and JSONP requests are not subject to the same origin policy restrictions.
There are some ways to overcome the cross-domain barrier:
CORS Proxy Alternatives
Ways to circumvent the same-origin policy
Breaking The Cross Domain Barrier
There are some plugins that help with cross-domain requests:
Cross Domain AJAX Request with YQL and jQuery
Cross-domain requests with jQuery.ajax
Heads up!
The best way to overcome this problem is to create your own proxy in the back-end, so that your proxy points to the services in other domains, because the same-origin policy restriction does not exist in the back-end. But if you can't do that in the back-end, then pay attention to the following tips.
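For illustration, a minimal sketch of such a back-end proxy, assuming Node.js and a hypothetical https://api.example.com target (your real third-party service goes there):

// Minimal same-origin proxy: the browser calls this server, and the
// server forwards the request to the third-party service.
const http = require('http');
const https = require('https');

http.createServer((req, res) => {
  https.get('https://api.example.com' + req.url, (upstream) => {
    // Relay status, headers and body straight back to the browser.
    res.writeHead(upstream.statusCode, upstream.headers);
    upstream.pipe(res);
  }).on('error', () => {
    res.writeHead(502);
    res.end('Upstream request failed');
  });
}).listen(3000);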
Warning!
Using third-party proxies is not a secure practice, because they can keep track of your data, so they can be used with public information, but never with private data.
The code examples shown below use jQuery.get() and jQuery.getJSON(), both are shorthand methods of jQuery.ajax()
CORS Anywhere
2021 Update
The public demo server (cors-anywhere.herokuapp.com) will be very limited by January 31st, 2021
The demo server of CORS Anywhere (cors-anywhere.herokuapp.com) is meant to be a demo of this project. But abuse has become so common that the platform where the demo is hosted (Heroku) has asked me to shut down the server, despite efforts to counter the abuse. Downtime becomes increasingly frequent due to abuse and its popularity.
To counter this, I will make the following changes:
The rate limit will decrease from 200 per hour to 50 per hour.
By January 31st, 2021, cors-anywhere.herokuapp.com will stop serving as an open proxy.
From February 1st, 2021, cors-anywhere.herokuapp.com will only serve requests after the visitor has completed a challenge: the user (developer) must visit a page at cors-anywhere.herokuapp.com to temporarily unlock the demo for their browser. This allows developers to try out the functionality, to help with deciding on self-hosting or looking for alternatives.
CORS Anywhere is a node.js proxy which adds CORS headers to the proxied request.
To use the API, just prefix the URL with the API URL. (Supports https: see github repository)
If you want to automatically enable cross-domain requests when needed, use the following snippet:
$.ajaxPrefilter(function (options) {
  if (options.crossDomain && jQuery.support.cors) {
    var http = (window.location.protocol === 'http:' ? 'http:' : 'https:');
    options.url = http + '//cors-anywhere.herokuapp.com/' + options.url;
    //options.url = "http://cors.corsproxy.io/url=" + options.url;
  }
});

$.get(
  'http://en.wikipedia.org/wiki/Cross-origin_resource_sharing',
  function (response) {
    console.log("> ", response);
    $("#viewer").html(response);
  });
Whatever Origin
Whatever Origin is a cross-domain JSONP access service; it is an open-source alternative to anyorigin.com.
To fetch the data from google.com, you can use this snippet:
// It is good to specify the charset you expect.
// You can use the charset you want instead of utf-8.
// See details for scriptCharset and contentType options:
// http://api.jquery.com/jQuery.ajax/#jQuery-ajax-settings
$.ajaxSetup({
  scriptCharset: "utf-8", // or "ISO-8859-1"
  contentType: "application/json; charset=utf-8"
});

$.getJSON('http://whateverorigin.org/get?url=' +
  encodeURIComponent('http://google.com') + '&callback=?',
  function (data) {
    console.log("> ", data);
    // If the expected response is text/plain
    $("#viewer").html(data.contents);
    // If the expected response is JSON
    //var response = $.parseJSON(data.contents);
  });
CORS Proxy
CORS Proxy is a simple node.js proxy to enable CORS request for any website.
It allows javascript code on your site to access resources on other domains that would normally be blocked due to the same-origin policy.
CORS-Proxy gr2m (archived)
CORS-Proxy rmadhuram
How does it work?
CORS Proxy takes advantage of Cross-Origin Resource Sharing, a feature that was added along with HTML5. Servers can specify that they want browsers to allow other websites to request resources they host. CORS Proxy is simply an HTTP proxy that adds a header to responses saying "anyone can request this".
This is another way to achieve the goal (see www.corsproxy.com). All you have to do is strip http:// and www. from the URL being proxied, and prepend the URL with www.corsproxy.com/
$.get(
  'http://www.corsproxy.com/' +
  'en.wikipedia.org/wiki/Cross-origin_resource_sharing',
  function (response) {
    console.log("> ", response);
    $("#viewer").html(response);
  });
The http://www.corsproxy.com/ domain now appears to be an unsafe/suspicious site. NOT RECOMMENDED TO USE.
CORS proxy browser
Recently I found this one; it involves various security-oriented Cross-Origin Resource Sharing utilities. But it is a black box with Flash as the backend.
You can see it in action here: CORS proxy browser
Get the source code on GitHub: koto/cors-proxy-browser
You can use Ajax-cross-origin, a jQuery plugin. With this plugin you can use jQuery.ajax() cross-domain. It uses Google services to achieve this:
The AJAX Cross Origin plugin uses Google Apps Script as a proxy JSON
getter where JSONP is not implemented. When you set the crossOrigin
option to true, the plugin replaces the original URL with the Google
Apps Script address and sends it as an encoded URL parameter. The Google
Apps Script uses Google Server resources to get the remote data, and
returns it back to the client as JSONP.
It is very simple to use:
$.ajax({
  crossOrigin: true,
  url: url,
  success: function (data) {
    console.log(data);
  }
});
You can read more here:
http://www.ajax-cross-origin.com/
If the external site doesn't support JSONP or CORS, your only option is to use a proxy.
Build a script on your server that requests that content, then use jQuery ajax to hit the script on your server.
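A hedged sketch of that approach (the /proxy.php endpoint and its url parameter are hypothetical, standing in for whatever script you build):

// The browser only ever talks to its own origin; the server-side script
// fetches the remote page and echoes it back.
$.get('/proxy.php', { url: 'http://example.com/page.html' }, function (html) {
  $('div.ajax-field').html(html);
});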
Just put this in the header of your PHP page and it will work without an API:
header('Access-Control-Allow-Origin: *'); //allow everybody
or
header('Access-Control-Allow-Origin: http://codesheet.org'); //allow just one domain
or
$http_origin = $_SERVER['HTTP_ORIGIN']; //allow multiple domains

$allowed_domains = array(
  'http://codesheet.org',
  'http://stackoverflow.com'
);

if (in_array($http_origin, $allowed_domains)) {
  header("Access-Control-Allow-Origin: $http_origin");
}
I'm posting this in case someone faces the same problem I am facing right now. I've got a Zebra thermal printer, equipped with the ZebraNet print server, which offers an HTML-based user interface for editing multiple settings, seeing the printer's current status, etc. I need to get the status of the printer, which is displayed in one of those HTML pages offered by the ZebraNet server, and, for example, alert() a message to the user in the browser. This means that I have to get that HTML page in JavaScript first.
Although the printer is within the LAN of the user's PC, the Same Origin Policy is still staying firmly in my way. I tried JSONP, but the server returns HTML and I haven't found a way to modify its functionality (if I could, I would have already set the magic header Access-Control-Allow-Origin: *). So I decided to write a small console app in C#. It has to be run as Admin to work properly, otherwise it throws an exception. Here is some code:
// Create a listener.
HttpListener listener = new HttpListener();

// Add the prefixes.
//foreach (string s in prefixes)
//{
//    listener.Prefixes.Add(s);
//}

// Accept connections from everywhere, because the printer is accessible
// only within the LAN (no port forwarding).
listener.Prefixes.Add("http://*:1234/");
listener.Start();
Console.WriteLine("Listening...");

// Note: The GetContext method blocks while waiting for a request.
HttpListenerContext context;
string urlForRequest = "";
HttpWebRequest requestForPage = null;
HttpWebResponse responseForPage = null;
string responseForPageAsString = "";

while (true)
{
    context = listener.GetContext();
    HttpListenerRequest request = context.Request;

    // Remove the leading slash, which separates the port number from the argument sent.
    urlForRequest = request.RawUrl.Substring(1, request.RawUrl.Length - 1);
    Console.WriteLine(urlForRequest);

    // Request the html page:
    requestForPage = (HttpWebRequest)WebRequest.Create(urlForRequest);
    responseForPage = (HttpWebResponse)requestForPage.GetResponse();
    responseForPageAsString = new StreamReader(responseForPage.GetResponseStream()).ReadToEnd();

    // Obtain a response object and send back the response.
    HttpListenerResponse response = context.Response;
    byte[] buffer = System.Text.Encoding.UTF8.GetBytes(responseForPageAsString);

    // Get a response stream and write the response to it.
    response.ContentLength64 = buffer.Length;
    response.AddHeader("Access-Control-Allow-Origin", "*"); // the magic header in action ;-D
    System.IO.Stream output = response.OutputStream;
    output.Write(buffer, 0, buffer.Length);

    // You must close the output stream.
    output.Close();
}
//listener.Stop();
All the user needs to do is run that console app as Admin. I know it is way too ... frustrating and complicated, but it is sort of a workaround to the Same Origin Policy problem in case you cannot modify the server in any way.
Edit: from JS I make a simple AJAX call:
$.ajax({
  type: 'POST',
  url: 'http://LAN_IP:1234/http://google.com',
  success: function (data) {
    console.log("Success: " + data);
  },
  error: function (e) {
    alert("Error: " + e);
    console.log("Error: " + e);
  }
});
The html of the requested page is returned and stored in the data variable.
To get the data from an external site using a local proxy, as suggested by jherax, you can create a PHP page that fetches the content from the respective external URL for you, and then send a GET request to that PHP page.
var req = new XMLHttpRequest();
req.open('GET', 'http://localhost/get_url_content.php', false);
req.send(null); // synchronous request: execution resumes once the response is in
if (req.status == 200) {
  alert(req.responseText);
}
As a PHP proxy you can use https://github.com/cowboy/php-simple-proxy
Your URL doesn't work these days, but your code can be updated with this working solution:
var url = "http://saskatchewan.univ-ubs.fr:8080/SASStoredProcess/do?_username=DARTIES3-2012&_password=P#ssw0rd&_program=%2FUtilisateurs%2FDARTIES3-2012%2FMon+dossier%2Fanalyse_dc&annee=2012&ind=V&_action=execute";
url = 'https://google.com'; // TEST URL

$.get("https://images" + ~~(Math.random() * 33) + "-focus-opensocial.googleusercontent.com/gadgets/proxy?container=none&url=" + encodeURI(url), function (data) {
  $('div.ajax-field').html(data);
});
<div class="ajax-field"></div>
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
You need a CORS proxy, which proxies your request from your browser to the requested service with the appropriate CORS headers. A list of such services is in the code snippet below. You can also run the snippet to see the ping to those services from your location.
$('li').each(function () {
  var self = this;
  ping($(this).text()).then(function (delta) {
    console.log($(self).text(), delta, ' ms');
  });
});
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<script src="https://cdn.rawgit.com/jdfreder/pingjs/c2190a3649759f2bd8569a72ae2b597b2546c871/ping.js"></script>
<ul>
<li>https://crossorigin.me/</li>
<li>https://cors-anywhere.herokuapp.com/</li>
<li>http://cors.io/</li>
<li>https://cors.5apps.com/?uri=</li>
<li>http://whateverorigin.org/get?url=</li>
<li>https://anyorigin.com/get?url=</li>
<li>http://corsproxy.nodester.com/?src=</li>
<li>https://jsonp.afeld.me/?url=</li>
<li>http://benalman.com/code/projects/php-simple-proxy/ba-simple-proxy.php?url=</li>
</ul>
Figured it out.
Used this instead.
$('.div_class').load('http://en.wikipedia.org/wiki/Cross-origin_resource_sharing #toctitle');
Using Angular v1.3.1, I ran into the following problem while trying to implement a facade for making HTTP requests to a REST + JSON interface in the backend of the web app.
I got something like this in the code:
findSomething(value: number): ng.IPromise<api.DrugIndication[]> {
  const getParams = { 'param': 'value' };
  const config: ng.IRequestShortcutConfig = {
    headers: {
      "Content-Type": "application/json"
    },
    data: getParams
  }
  return this.$http.get(url, config);
}
And when the time comes to invoke it, I get a 400 Bad Request (btw: great name for a band!) because the backend (made with Play for Scala) rejects the request immediately. Inspecting the request, I see that no data is being sent in the body of the message.
So how can I send some data in the body of an HTTP GET request using Angular's $http.get?
Additional info: this doesn't happen if I make the request using the curl command from an Ubuntu shell, so it is probably a problem between Chrome and angular.js.
If you inspect the network tab in the Chrome development tools, you will see that this is a pre-flight OPTIONS request (Cross-Origin Resource Sharing (CORS)).
You have two ways to solve this.
Client side (this requires that your server does not require the application/json value; a sketch follows this list)
GET, POST, HEAD methods only
Only browser set headers plus these
Content-Type only with:
application/x-www-form-urlencoded
multipart/form-data
text/plain
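A hedged sketch of the client-side route, assuming the Play backend can be made to accept form-encoded data instead of JSON:

// A form-encoded POST uses a CORS-safelisted Content-Type, so the
// browser sends it directly, with no pre-flight OPTIONS round trip.
// 'param=value' mirrors the getParams object from the question.
this.$http.post(url, 'param=value', {
  headers: { 'Content-Type': 'application/x-www-form-urlencoded' }
});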
Server side
Set something like this as a middleware on your server framework:
if r.Method == "OPTIONS" {
    w.Header().Set("Access-Control-Allow-Origin", "*")
    w.Header().Set("Access-Control-Allow-Methods", "GET")
    w.Header().Set("Access-Control-Allow-Headers", "Content-Type,Authorization")
    w.Header().Set("Access-Control-Max-Age", "86400") // firefox: max 24h, chrome: 10 min
    return
}
For your specific framework this should work
Using config.data will send the data in the request body. Use instead:
config.params = getParams
This is from the documentation:
params – {Object.<string|Object>} – Map of strings or objects which will be serialized with the paramSerializer and appended as GET parameters
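A hedged rewrite of the facade above using config.params (url is the same variable as in the question):

// The parameters are serialized onto the query string (?param=value),
// and no request body is sent, so the backend no longer rejects the call.
findSomething(value) {
  const config = { params: { param: 'value' } };
  return this.$http.get(url, config);
}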
I am using navigator.sendBeacon to communicate with the server, but the problem is that we need to pass some header information, as there is a filter which verifies that the request comes from a valid source.
Can anybody help on this?
Thanks.
See the Navigator.sendBeacon MDN documentation for further information.
Create a blob to provide headers. Here is an example:
window.onunload = () => {
  const body = {
    id,
    email,
  };
  const headers = {
    type: 'application/json',
  };
  const blob = new Blob([JSON.stringify(body)], headers);

  navigator.sendBeacon('url', blob);
};
navigator.sendBeacon will send a POST request with the Content-Type request header set to whatever is in headers.type. This seems to be the only header you can set in a beacon though, per W3C:
The sendBeacon method does not provide ability to customize the request method, provide custom request headers, or change other processing properties of the request and response. Applications that require non-default settings for such requests should use the [FETCH] API with keepalive flag set to true.
I was able to observe some of how this worked through this Chromium bug report.
As written in the Processing Model of sendBeacon:
Extract object's byte stream (transmittedData) and content type (contentType).
How extraction is performed is described here
What I've gathered is that the content type of the transmitted data is extracted, and it is set as the Content-Type of the HTTP request.
1) If a Blob object is sent, the Content-Type becomes the Blob's type.
2) If a FormData object is sent, the Content-Type becomes multipart/form-data
3) If a URLSearchParams object is sent, the Content-Type becomes application/x-www-form-urlencoded
4) If a normal string is sent, the Content-Type becomes text/plain
Javascript code to implement different objects can be found here
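For illustration, a hedged sketch of two of those cases (the /log endpoint is hypothetical):

// A URLSearchParams body is sent as application/x-www-form-urlencoded.
const params = new URLSearchParams({ event: 'unload' });
navigator.sendBeacon('/log', params);

// A Blob body is sent with the Blob's own type, here application/json
// (but see the Chrome caveat below).
const blob = new Blob([JSON.stringify({ event: 'unload' })], { type: 'application/json' });
navigator.sendBeacon('/log', blob);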
If you're using Chrome and you're trying to set the content-type header, you'll probably have some issues due to security restrictions:
Uncaught DOMException: Failed to execute 'sendBeacon' on 'Navigator': sendBeacon() with a Blob whose type is not any of the CORS-safelisted values for the Content-Type request header is disabled temporarily. See http://crbug.com/490015 for details.
See sendBeacon API not working temporarily due to security issue, any workaround?
I want to call an API when someone closes the tab, so I tried to use navigator.sendBeacon(), but the problem is that we need to pass an Authorization token with it, and sendBeacon does not provide that. So I found another solution that is more effective and very easy to implement.
The solution is the native fetch API with the keepalive flag, in a pagehide event.
Code
window.addEventListener('pagehide', () => {
  fetch(`<URL>`, {
    keepalive: true,
    method: '<METHOD>',
    headers: {
      'content-type': 'application/json',
      // any header you can pass here
    },
    body: JSON.stringify({ data: 'any data' }),
  });
});
FAQs / TL;DR Version
Why should we need to use the keepalive flag?
The keepalive option can be used to allow the request to outlive the page. Fetch with the keepalive flag is a replacement for the Navigator.sendBeacon() API.
To learn more, visit https://developer.mozilla.org/en-US/docs/Web/API/fetch#parameters
What is the Page Lifecycle API?
To learn more, visit https://developer.chrome.com/blog/page-lifecycle-api/
From the Page Lifecycle image, shouldn't unload be considered as the best choice?
unload is the best fit for this case, but it does not fire in some cases on mobile, and it also does not support the bfcache functionality.
I also noticed that when I use unload I do not get proper output in the server log. Why? IDK; if you know about it, then comments are welcome.
Nowadays, it's also not recommended by the developers.
Learn more about why unload is not recommended: https://developer.mozilla.org/en-US/docs/Web/API/Window/unload_event#usage_notes
Learn more about pagehide: https://developer.mozilla.org/en-US/docs/Web/API/Window/pagehide_event
Because the sendBeacon(..) method does not allow header manipulation, I added them to the form as normal fields:
const formData = new FormData();
formData.append('authorization', myAuthService.getCachedToken());
navigator.sendBeacon(myURL, formData);
Then on the host side I added a simple middleware class (.NET) which catches POST requests without the header and copies it from the body:
public class AuthMiddleware
{
    ...
    ...
    public async Task Invoke(HttpContext context)
    {
        string authHeader = context.Request.Headers["Authorization"];
        if (authHeader == null && context.Request.Method == "POST")
        {
            context.Request.Headers["Authorization"] = string.Format("Bearer {0}",
                context.Request.Form["authorization"].ToString());
        }
        await _next.Invoke(context);
    }
}
Posting as an answer as I'm not allowed to post a comment under the answer:
For Chrome, the issue with navigator.sendBeacon sending a Blob with non-CORS-safelisted types was fixed in Chrome version 81, so this should be safe to use now.
https://bugs.chromium.org/p/chromium/issues/detail?id=724929
For IE, an alternative in the unload event is to use a synchronous AJAX request, as IE doesn't support sendBeacon but does support synchronous AJAX calls in my case.
You can't send JSON data after Chrome 39; it has been disabled due to a security concern.
You can try to send the data as plain text instead, but don't forget to parse the text on the backend.
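A hedged sketch (the /log endpoint is hypothetical):

// A plain string body goes out as text/plain, which is CORS-safelisted,
// so Chrome allows it; the backend then parses the JSON out of the text.
navigator.sendBeacon('/log', JSON.stringify({ id: 42, event: 'close' }));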
After searching for an answer to this question, I found out that to pass a header with navigator.sendBeacon we need to pass a Blob object. For example:
var headers = { type: 'application/json' };
var blob = new Blob([JSON.stringify(request)], headers); // the Blob constructor takes an array of parts
navigator.sendBeacon('url/to/send', blob);
I am trying to make a POST request to the server (which is a REST service) via JavaScript, and in my request I want to send a cookie. My code below is not working, as I am not able to receive the cookie at the server side. Below are my client-side and server-side code.
Client side:
var client = new XMLHttpRequest();
var request_data=JSON.stringify(data);
var endPoint="http://localhost:8080/pcap";
var cookie="session=abc";
client.open("POST", endPoint, false);//This Post will become put
client.setRequestHeader("Accept", "application/json");
client.setRequestHeader("Content-Type","application/json");
client.setRequestHeader("Set-Cookie","session=abc");
client.setRequestHeader("Cookie",cookie);
client.send(request_data);
Server Side:
public @ResponseBody ResponseEntity getPcap(HttpServletRequest request, @RequestBody PcapParameters pcap_params) {
    Cookie cookies[] = request.getCookies(); // It's coming as NULL
    String cook = request.getHeader("Cookie"); // It's coming as NULL
}
See the documentation:
Terminate these steps if header is a case-insensitive match for one of the following headers … Cookie
You cannot explicitly set a Cookie header using XHR.
It looks like you are making a cross origin request (you are using an absolute URI).
You can set withCredentials to include cookies.
True when user credentials are to be included in a cross-origin request. False when they are to be excluded in a cross-origin request and when cookies are to be ignored in its response. Initially false.
Such:
client.withCredentials = true;
This will only work if http://localhost:8080 has set a cookie using one of the supported methods (such as in an HTTP Set-Cookie response header).
Failing that, you will have to encode the data you wanted to put in the cookie somewhere else.
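For instance, a hedged sketch that carries the session token in a custom header instead (X-Session is a made-up name; the server-side filter would have to read it, and as a non-safelisted header it will trigger a CORS pre-flight):

var client = new XMLHttpRequest();
client.open("POST", "http://localhost:8080/pcap", false);
client.setRequestHeader("Accept", "application/json");
client.setRequestHeader("Content-Type", "application/json");
// The Cookie header is forbidden, but a custom header is allowed.
client.setRequestHeader("X-Session", "abc");
client.send(request_data);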
The withCredentials approach can also be done with the more modern fetch:
fetch(url, {
  method: 'POST',
  credentials: 'include'
  // other options
}).then(response => console.log("Response status: ", response.status));