XMLHttpRequest doesn't update in JavaScript's interval

I use AJAX to check for user updates every 2 seconds, but my JavaScript does not update the response.
I have one JavaScript file with an XMLHttpRequest object, and every 2 seconds it sends a request to another file (.php) that returns XML with updates. For some reason it doesn't always get the newest content and seems to be serving an old cached response.
My JavaScript file contains this code (simplified):
var updates = new XMLHttpRequest();
updates.onreadystatechange = function(){
    "use strict";
    if(updates.readyState === 4 && updates.status === 200){
        console.log(updates.responseXML);
    }
};

var timer = 0;
clearInterval(timer);
timer = setInterval(function(){
    "use strict";
    updates.open('GET', 'scripts/check_for_notifications.php', true);
    updates.send();
}, 2000);
Then I have the PHP file (check_for_notifications.php), where I have this code:
$response = new SimpleXMLElement('<xml/>');
$update = $response->addChild('update');
$update->addChild('content', 'New message');
$update->addChild('redirect', 'some link');
$update->addChild('date', '1.1.2019 12:00');
header('Content-type: text/xml');
print($response->asXML());
Every two seconds I receive a log in my console, but when I change the PHP file while the interval is in progress (e.g. I change the date to '1.1.2019 11:00' and save it), I still receive '12:00' in the console. It seems that it doesn't update and still has the responseXML cached. Is there any way I could "flush" the output, or am I doing it wrong?

It's probably a cache problem. In the browser's network console, you should see a 304 Not Modified response.
To be sure, you can add an element to the URL to bypass the cache (note the ?, since the URL has no query string yet):
updates.open('GET', 'scripts/check_for_notifications.php?nocache=' + new Date().getTime(), true);
If this works, you will need to configure the server (Apache or nginx) to prevent the file from being cached. That's cleaner than the timestamp solution: the browser does not store cached files unnecessarily.
Apache (.htaccess or Apache conf), something like:
<Files "check_for_notifications.php">
    <IfModule mod_headers.c>
        Header set Cache-Control "no-cache, no-store, must-revalidate"
        Header set Pragma "no-cache"
        Header set Expires 0
    </IfModule>
</Files>
Nginx conf, something like:
location = /check_for_notifications.php {
    add_header 'Cache-Control' 'no-cache, no-store, must-revalidate';
    expires off;
}
You can also look at the Fetch API: https://hacks.mozilla.org/2016/03/referrer-and-cache-control-apis-for-fetch/
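For example, a minimal fetch sketch using its cache option (reusing the question's endpoint; fetch has no responseXML, so the XML is parsed by hand):
fetch('scripts/check_for_notifications.php', { cache: 'no-store' })
    .then(function (res) { return res.text(); })
    .then(function (xmlText) {
        // 'no-store' tells the browser to bypass its HTTP cache entirely
        var doc = new DOMParser().parseFromString(xmlText, 'text/xml');
        console.log(doc);
    });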
Be careful with your code: it can launch several requests simultaneously, and you can end up DoS-ing your own server if there are too many users.
If requests take more than 2 seconds because the server is slow, more requests will be sent in the meantime, which will slow the server down even more...
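One way to avoid that pile-up is to re-arm a setTimeout only after each response has arrived, instead of firing blindly on an interval. A minimal sketch, reusing the question's endpoint:
function poll() {
    var xhr = new XMLHttpRequest();
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4) {
            if (xhr.status === 200) {
                console.log(xhr.responseXML);
            }
            // Schedule the next request only once this one has finished,
            // so requests can never overlap on a slow server.
            setTimeout(poll, 2000);
        }
    };
    xhr.open('GET', 'scripts/check_for_notifications.php?nocache=' + Date.now(), true);
    xhr.send();
}
poll();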

Related

How to prevent HTTP caching of REST calls in browser?

I have a bunch of REST APIs which send cache-control as public, only-if-cached, max-stale=2419200, max-age=60
max-age=60 makes sure that if the client sends an identical API request, the client picks the response from the cache instead of forwarding it to the server for a fresh response.
Now, this works perfectly for our mobile app because it is read-only and a minute's delay in data updates is acceptable. However, the website, which uses the same set of APIs to add/delete data, needs to be real-time, i.e. it should always ignore the max-age=60 part of the header. Since the caching header is currently not ignored, all PUT, PATCH or DELETE requests reflect the changes only after a minute.
Is there any way I can make my website ignore cache-control? The web pages are written in plain JS and HTML. JS uses fetch for all REST requests.
You can disable caching in fetch by appending headers:
var headers = new Headers();
headers.append('pragma', 'no-cache');
headers.append('cache-control', 'no-cache');

var init = {
    method: 'GET',
    headers: headers,
};

var request = new Request(YOUR_URL);

fetch(request, init)
    .then(function (response) {
        // handle the fresh response here
    });
You can also use cache-control: no-store, which will never store a cached version.
Alternatively, you can use a dynamic string in the URL; it will still store a version in your browser's cache, like:
const ms = Date.now();
const data = await fetch(YOUR_URL+"?t="+ms)
Implement Timestamped Requests Logic:
Append a timestamp to each of the requests, with the help of a JavaScript utility method like the one below. The method appends a timestamp to every request URL, alongside any existing query params:
private getTimeStampedUrl(url: string) {
    var timestamp = Date.now();
    var timestampedUrl = (url.indexOf('?') == -1) ? url + '?' : url + '&';
    timestampedUrl += 'timestamp=' + timestamp;
    return timestampedUrl;
}
This makes the browser treat each request as a new request, so caching never kicks in. In general, the browser cache only works when we send identical requests, which we break here by appending a unique timestamp to each request.
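In plain JS (outside a class), a hedged sketch of the same idea applied to fetch; the helper name is illustrative:
function timestamped(url) {
    var sep = url.indexOf('?') === -1 ? '?' : '&';
    return url + sep + 'timestamp=' + Date.now();
}

// Every call produces a unique URL, so the browser cache never matches.
fetch(timestamped('/api/users')).then(function (res) { return res.json(); });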

how to make a cross domain api call using jQuery [duplicate]

I'm trying to load a cross-domain HTML page using AJAX, but unless the dataType is "jsonp" I can't get a response. However, using jsonp the browser expects a script MIME type but receives "text/html".
My code for the request is:
$.ajax({
    type: "GET",
    url: "http://saskatchewan.univ-ubs.fr:8080/SASStoredProcess/do?_username=DARTIES3-2012&_password=P#ssw0rd&_program=%2FUtilisateurs%2FDARTIES3-2012%2FMon+dossier%2Fanalyse_dc&annee=2012&ind=V&_action=execute",
    dataType: "jsonp",
}).success(function (data) {
    $('div.ajax-field').html(data);
});
Is there any way of avoiding using jsonp for the request? I've already tried using the crossDomain parameter but it didn't work.
If not, is there any way of receiving the HTML content in jsonp? Currently the console says "unexpected <" in the jsonp reply.
jQuery Ajax Notes
Due to browser security restrictions, most Ajax requests are subject to the same origin policy; the request can not successfully retrieve data from a different domain, subdomain, port, or protocol.
Script and JSONP requests are not subject to the same origin policy restrictions.
There are some ways to overcome the cross-domain barrier:
CORS Proxy Alternatives
Ways to circumvent the same-origin policy
Breaking The Cross Domain Barrier
There are some plugins that help with cross-domain requests:
Cross Domain AJAX Request with YQL and jQuery
Cross-domain requests with jQuery.ajax
Heads up!
The best way to overcome this problem is to create your own proxy in the back-end, so that your proxy points to the services in other domains, because the same-origin policy restriction does not exist in the back-end. But if you can't do that in the back-end, then pay attention to the following tips.
Warning!
Using third-party proxies is not a secure practice, because they can keep track of your data; use them with public information, but never with private data.
The code examples shown below use jQuery.get() and jQuery.getJSON(), both are shorthand methods of jQuery.ajax()
CORS Anywhere
2021 Update
The public demo server (cors-anywhere.herokuapp.com) will be very limited by January 31st, 2021
The demo server of CORS Anywhere (cors-anywhere.herokuapp.com) is meant to be a demo of this project. But abuse has become so common that the platform where the demo is hosted (Heroku) has asked me to shut down the server, despite efforts to counter the abuse. Downtime becomes increasingly frequent due to abuse and its popularity.
To counter this, I will make the following changes:
The rate limit will decrease from 200 per hour to 50 per hour.
By January 31st, 2021, cors-anywhere.herokuapp.com will stop serving as an open proxy.
From February 1st, 2021, cors-anywhere.herokuapp.com will only serve requests after the visitor has completed a challenge: the user (developer) must visit a page at cors-anywhere.herokuapp.com to temporarily unlock the demo for their browser. This allows developers to try out the functionality, to help with deciding on self-hosting or looking for alternatives.
CORS Anywhere is a node.js proxy which adds CORS headers to the proxied request.
To use the API, just prefix the URL with the API URL. (Supports https: see github repository)
If you want to automatically enable cross-domain requests when needed, use the following snippet:
$.ajaxPrefilter(function (options) {
    if (options.crossDomain && jQuery.support.cors) {
        var http = (window.location.protocol === 'http:' ? 'http:' : 'https:');
        options.url = http + '//cors-anywhere.herokuapp.com/' + options.url;
        //options.url = "http://cors.corsproxy.io/url=" + options.url;
    }
});

$.get(
    'http://en.wikipedia.org/wiki/Cross-origin_resource_sharing',
    function (response) {
        console.log("> ", response);
        $("#viewer").html(response);
    });
Whatever Origin
Whatever Origin is a cross-domain JSONP service. It is an open source alternative to anyorigin.com.
To fetch the data from google.com, you can use this snippet:
// It is good to specify the charset you expect.
// You can use the charset you want instead of utf-8.
// See details for scriptCharset and contentType options:
// http://api.jquery.com/jQuery.ajax/#jQuery-ajax-settings
$.ajaxSetup({
    scriptCharset: "utf-8", // or "ISO-8859-1"
    contentType: "application/json; charset=utf-8"
});

$.getJSON('http://whateverorigin.org/get?url=' +
    encodeURIComponent('http://google.com') + '&callback=?',
    function (data) {
        console.log("> ", data);
        // If the expected response is text/plain
        $("#viewer").html(data.contents);
        // If the expected response is JSON
        //var response = $.parseJSON(data.contents);
    });
CORS Proxy
CORS Proxy is a simple node.js proxy to enable CORS request for any website.
It allows javascript code on your site to access resources on other domains that would normally be blocked due to the same-origin policy.
CORS-Proxy gr2m (archived)
CORS-Proxy rmadhuram
How does it work?
CORS Proxy takes advantage of Cross-Origin Resource Sharing, which is a feature that was added along with HTML 5. Servers can specify that they want browsers to allow other websites to request resources they host. CORS Proxy is simply an HTTP Proxy that adds a header to responses saying "anyone can request this".
This is another way to achieve the goal (see www.corsproxy.com). All you have to do is strip http:// and www. from the URL being proxied, and prepend the URL with www.corsproxy.com/
$.get(
    'http://www.corsproxy.com/' +
    'en.wikipedia.org/wiki/Cross-origin_resource_sharing',
    function (response) {
        console.log("> ", response);
        $("#viewer").html(response);
    });
The http://www.corsproxy.com/ domain now appears to be an unsafe/suspicious site. NOT RECOMMENDED TO USE.
CORS proxy browser
Recently I found this one; it involves various security-oriented Cross-Origin Resource Sharing utilities. But it is a black box with Flash as the backend.
You can see it in action here: CORS proxy browser
Get the source code on GitHub: koto/cors-proxy-browser
You can use Ajax-cross-origin, a jQuery plugin. With this plugin you can use jQuery.ajax() cross-domain. It uses Google services to achieve this:
The AJAX Cross Origin plugin uses Google Apps Script as a proxy JSON getter where JSONP is not implemented. When you set the crossOrigin option to true, the plugin replaces the original URL with the Google Apps Script address and sends it as an encoded URL parameter. The Google Apps Script uses Google Servers resources to get the remote data, and returns it back to the client as JSONP.
It is very simple to use:
$.ajax({
    crossOrigin: true,
    url: url,
    success: function(data) {
        console.log(data);
    }
});
You can read more here:
http://www.ajax-cross-origin.com/
If the external site doesn't support JSONP or CORS, your only option is to use a proxy.
Build a script on your server that requests that content, then use jQuery ajax to hit the script on your server.
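As a rough sketch of such a proxy in Node.js (assuming only the built-in http/https modules; the /proxy?url= route and port are illustrative, not part of the original answer):
// Minimal same-origin proxy: the browser calls /proxy?url=...,
// the server fetches the remote page and relays it back.
const http = require('http');
const https = require('https');

http.createServer(function (req, res) {
    const target = new URL(req.url, 'http://localhost').searchParams.get('url');
    if (!target) { res.writeHead(400); return res.end('missing url'); }
    const client = target.indexOf('https') === 0 ? https : http;
    client.get(target, function (remote) {
        res.writeHead(remote.statusCode, { 'Content-Type': remote.headers['content-type'] || 'text/html' });
        remote.pipe(res); // stream the remote body straight through
    }).on('error', function () { res.writeHead(502); res.end(); });
}).listen(3000);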
Just put this in the header of your PHP page and it will work without an API:
header('Access-Control-Allow-Origin: *'); //allow everybody
or
header('Access-Control-Allow-Origin: http://codesheet.org'); //allow just one domain
or
$http_origin = $_SERVER['HTTP_ORIGIN']; //allow multiple domains

$allowed_domains = array(
    'http://codesheet.org',
    'http://stackoverflow.com'
);

if (in_array($http_origin, $allowed_domains))
{
    header("Access-Control-Allow-Origin: $http_origin");
}
I'm posting this in case someone faces the same problem I am facing right now. I've got a Zebra thermal printer, equipped with the ZebraNet print server, which offers an HTML-based user interface for editing multiple settings, seeing the printer's current status, etc. I need to get the status of the printer, which is displayed in one of those HTML pages offered by the ZebraNet server, and, for example, alert() a message to the user in the browser. This means that I have to get that HTML page in JavaScript first.

Although the printer is within the LAN of the user's PC, the Same Origin Policy is still standing firmly in my way. I tried JSONP, but the server returns HTML and I haven't found a way to modify its functionality (if I could, I would have already set the magic header Access-Control-Allow-Origin: *). So I decided to write a small console app in C#. It has to be run as Admin to work properly, otherwise it throws an exception. Here is some code:
// Create a listener.
HttpListener listener = new HttpListener();
// Add the prefixes.
//foreach (string s in prefixes)
//{
//    listener.Prefixes.Add(s);
//}
listener.Prefixes.Add("http://*:1234/"); // accept connections from everywhere,
// because the printer is accessible only within the LAN (no port forwarding)
listener.Start();
Console.WriteLine("Listening...");

// Note: The GetContext method blocks while waiting for a request.
HttpListenerContext context;
string urlForRequest = "";
HttpWebRequest requestForPage = null;
HttpWebResponse responseForPage = null;
string responseForPageAsString = "";

while (true)
{
    context = listener.GetContext();
    HttpListenerRequest request = context.Request;
    // remove the slash, which separates the port number from the arg sent
    urlForRequest = request.RawUrl.Substring(1, request.RawUrl.Length - 1);
    Console.WriteLine(urlForRequest);

    // Request the html page:
    requestForPage = (HttpWebRequest)WebRequest.Create(urlForRequest);
    responseForPage = (HttpWebResponse)requestForPage.GetResponse();
    responseForPageAsString = new StreamReader(responseForPage.GetResponseStream()).ReadToEnd();

    // Obtain a response object.
    HttpListenerResponse response = context.Response;
    // Send back the response.
    byte[] buffer = System.Text.Encoding.UTF8.GetBytes(responseForPageAsString);
    // Get a response stream and write the response to it.
    response.ContentLength64 = buffer.Length;
    response.AddHeader("Access-Control-Allow-Origin", "*"); // the magic header in action ;-D
    System.IO.Stream output = response.OutputStream;
    output.Write(buffer, 0, buffer.Length);
    // You must close the output stream.
    output.Close();
}
//listener.Stop();
All the user needs to do is run that console app as Admin. I know it is way too ... frustrating and complicated, but it is sort of a workaround to the Same Origin Policy problem in case you cannot modify the server in any way.
Edit: from JS I make a simple AJAX call:
$.ajax({
    type: 'POST',
    url: 'http://LAN_IP:1234/http://google.com',
    success: function (data) {
        console.log("Success: " + data);
    },
    error: function (e) {
        alert("Error: " + e);
        console.log("Error: " + e);
    }
});
The html of the requested page is returned and stored in the data variable.
To get the data from an external site using a local proxy, as suggested by jherax, you can create a PHP page that fetches the content from the respective external URL for you, and then send a GET request to that PHP page:
var req = new XMLHttpRequest();
req.open('GET', 'http://localhost/get_url_content.php', false); // synchronous request
req.send(); // without send() the request never fires and status stays 0
if (req.status == 200) {
    alert(req.responseText);
}
As a PHP proxy you can use https://github.com/cowboy/php-simple-proxy
Your URL doesn't work these days, but your code can be updated with this working solution:
var url = "http://saskatchewan.univ-ubs.fr:8080/SASStoredProcess/do?_username=DARTIES3-2012&_password=P#ssw0rd&_program=%2FUtilisateurs%2FDARTIES3-2012%2FMon+dossier%2Fanalyse_dc&annee=2012&ind=V&_action=execute";
url = 'https://google.com'; // TEST URL

$.get("https://images" + ~~(Math.random() * 33) + "-focus-opensocial.googleusercontent.com/gadgets/proxy?container=none&url=" + encodeURI(url), function (data) {
    $('div.ajax-field').html(data);
});
<div class="ajax-field"></div>
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
You need a CORS proxy which proxies your request from your browser to the requested service with appropriate CORS headers. A list of such services is in the code snippet below. You can also run the provided snippet to see the ping to those services from your location.
$('li').each(function () {
    var self = this;
    ping($(this).text()).then(function (delta) {
        console.log($(self).text(), delta, ' ms');
    });
});
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<script src="https://cdn.rawgit.com/jdfreder/pingjs/c2190a3649759f2bd8569a72ae2b597b2546c871/ping.js"></script>
<ul>
    <li>https://crossorigin.me/</li>
    <li>https://cors-anywhere.herokuapp.com/</li>
    <li>http://cors.io/</li>
    <li>https://cors.5apps.com/?uri=</li>
    <li>http://whateverorigin.org/get?url=</li>
    <li>https://anyorigin.com/get?url=</li>
    <li>http://corsproxy.nodester.com/?src=</li>
    <li>https://jsonp.afeld.me/?url=</li>
    <li>http://benalman.com/code/projects/php-simple-proxy/ba-simple-proxy.php?url=</li>
</ul>
Figured it out.
Used this instead.
$('.div_class').load('http://en.wikipedia.org/wiki/Cross-origin_resource_sharing #toctitle');

how to remove cache from http get request on node js

I am working on a Node.js web application and I am doing an HTTP GET request which addresses a DB and fetches data via a query.
The HTTP GET request works fine in Chrome, but in IE each GET is not updated and is returned from some sort of cache, so the result of the DB query is not up to date (because it is taken from cache).
I can see in the F12 developer tools in IE that it is being taken from cache.
My code is below. I know that I should add something like:
res.setHeader("Cache-Control", "no-cache, no-store, must-revalidate");
to my request, but I think I may have put this line in the wrong place, because the GET request is still taken from cache and gives me a bad result...
client
$http.get('/users')
    .success(function(data) {
        $scope.usersNumber = data.length;
    })
    .error(function(data) {
        console.log('Error: ' + data);
    });
server
app.get('/users', function(req, res){
    get_user(req, res);
});

var get_user = function(req, res){
    var query = User_session.find();
    query.exec(function(err, docs){
        res.setHeader("Cache-Control", "no-cache, no-store, must-revalidate");
        res.json(docs);
        //mongoose.connection.close();
    });
}
Try adding max-age to your header:
Cache-Control:private, no-cache, no-store, must-revalidate, max-age=0
Pragma:no-cache
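If the server is Express (which the app.get style in the question suggests), a minimal sketch applies these headers to every route through a middleware, so no individual handler can be missed:
// Send no-cache headers on every response so IE never reuses a stale copy.
app.use(function (req, res, next) {
    res.setHeader('Cache-Control', 'private, no-cache, no-store, must-revalidate, max-age=0');
    res.setHeader('Pragma', 'no-cache');
    next();
});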
The response that you captured from the developer tools shows that the response is picked from cache. The cached response had an Expires header, which specified the duration for which the response can be picked from cache without being considered stale.
You would want to include a no-cache directive for End-to-end reload
The request includes a "no-cache" cache-control directive or, for
compatibility with HTTP/1.0 clients, "Pragma: no-cache". Field names
MUST NOT be included with the no-cache directive in a request. The
server MUST NOT use a cached copy when responding to such a request.
Reference: https://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html
You can also try another solution to force client reloads: add a query string parameter to your URL.
E.g.: if you have /users, instead make the URL /users?uniqstamp=<timestamp>
The browser treats the entire path, including the query string, as the URL, so you will always get a fresh 200 OK. Make sure the timestamp you use is the JavaScript time in millis, so that it is always unique.
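Applied to the AngularJS client from the question, that could look like this (uniqstamp mirrors the example above):
$http.get('/users?uniqstamp=' + Date.now())
    .success(function (data) {
        $scope.usersNumber = data.length;
    });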

QTcpSocket does not flush HTTP headers (Comet pattern)

Implementing a simple HTTP server in Qt, with the purpose of streaming real-time data to an XMLHttpRequest object (AJAX/JavaScript).
The problem is that the design pattern requires partial transmission of data over the socket connection, changing the readyState in the XHR from '1' (request sent) to '2' (headers received), and then to '3' (data received), while keeping the request pending. This is also known as "long-polling" or "Comet", and should be possible in most browsers.
However, it stays in the request state until the connection is closed, and only then are readyState '2' and '4' received. This is normal for HTTP GET, but not desired for this application.
JavaScript:
var request = new XMLHttpRequest();
request.onreadystatechange = function() {
    console.log('readyState: ' + this.readyState + ' ' + this.status)
}
request.open("get", "localhost:8080/", true);
request.send();
Qt:
connect(socket, &QTcpSocket::readyRead, [=]()
{
    QByteArray data = socket->read(1000);
    socket->write("HTTP/1.1 200 OK\r\n");
    socket->write("Content-Type: text/octet-stream\r\n");
    socket->write("Access-Control-Allow-Origin: *\r\n");
    socket->write("Cache-Control: no-cache, no-store, max-age=0, must-revalidate\r\n");
    socket->flush();
});
So the big question is: How can I make the network system underneath the QTcpSocket flush pending data after writing the headers (and later, the data), without the need to disconnect first?
A side note: I originally implemented this using WebSockets, but the browser I have to use does not support this.
EDIT:
The HTTP header block must be terminated by an extra "\r\n". Now it works:
connect(socket, &QTcpSocket::readyRead, [=]()
{
    QByteArray data = socket->read(1000);
    socket->write("HTTP/1.1 200 OK\r\n");
    socket->write("Content-Type: text/octet-stream\r\n");
    socket->write("Access-Control-Allow-Origin: *\r\n");
    socket->write("Cache-Control: no-cache, no-store, max-age=0, must-revalidate\r\n");
    socket->write("\r\n"); // blank line ends the headers
    socket->flush();
});
Got it working now after a full day of trying different HTTP header configurations. It seems user 'peppe' was on to something: the only thing I needed was to add "\r\n" after the headers! (See edit.)

Cookie management after 302 redirection with XMLHttpRequest

I'm developing a Firefox OS client for ownCloud. When I try to log in and send the user credentials to the server, I need to obtain in the response the cookie that I will use to authenticate with ownCloud on each request.
My problem is that, as I've seen in Wireshark, the cookie is sent in an HTTP 302 message, but I cannot read this message in my code because Firefox handles it automatically, and I read the final HTTP 200 message, which has no cookie information in:
request.responseText;
request.getAllResponseHeaders();
So my question is whether there is any way to read the headers of this HTTP 302 message, or whether I can obtain the cookie from Firefox OS before I send the next request, or even make Firefox OS add the cookie automatically. I use the following code to make the POST:
request = new XMLHttpRequest({mozSystem: true});
request.open('post', serverInput, true);
request.withCredentials = true;
request.addEventListener('error', onRequestError);
request.setRequestHeader("Cookie", cookie_value);
request.setRequestHeader("Connection", "keep-alive");
request.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
request.send(send_string);

if(request.status == 200 || request.status == 302){
    response = request.responseText;
    var headers = request.getAllResponseHeaders();
    document.getElementById('results').innerHTML = "Server found";
    loginSuccessfull();
}else{
    alert("Response not found");
    document.getElementById('results').innerHTML = "Server NOT found";
}
"mozAnon
Boolean: Setting this flag to true will cause the browser not to expose the origin and user credentials when fetching resources. Most important, this means that cookies will not be sent unless explicitly added using setRequestHeader.
mozSystem
Boolean: Setting this flag to true allows making cross-site connections without requiring the server to opt-in using CORS. Requires setting mozAnon: true, i.e. this can't be combined with sending cookies or other user credentials." [0]
I'm not sure if you're an ownCloud developer, but if you are and have access to the server, you should try setting CORS headers. [1] Alternatively, maybe you can stand up a proxy server that does have CORS enabled and have your app connect to that proxy?
There's also a withCredentials property [2] you can set on instances of xhr objects. It looks like it will add the header Access-Control-Request-Headers: "cookies" and send an HTTP OPTIONS request, which is the preflight [3]. So this would still require server side support for CORS. [4]
Though it seems like this shouldn't work based on internal comments [5], I was able to run this from a simulator and see the request and response headers:
var x = new XMLHttpRequest({ mozSystem: true });
x.open('get', 'http://stackoverflow.com');
x.onload = function () { console.log(x.getResponseHeader('Set-Cookie')); };
x.setRequestHeader('Cookie', 'hello=world;');
x.send();
You'd probably want to reassign document.cookie in the onload event, rather than logging it, if the response header exists (not every site sets cookies on every request). You'd also want to set the request header to document.cookie itself.
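Putting those two suggestions together, a hedged sketch (whether Set-Cookie is actually exposed to a mozSystem XHR can vary by platform, so treat this as a starting point; serverInput and send_string come from the question's code):
var x = new XMLHttpRequest({ mozSystem: true });
x.open('post', serverInput, true);
x.setRequestHeader('Cookie', document.cookie); // resend whatever we stored earlier
x.onload = function () {
    var setCookie = x.getResponseHeader('Set-Cookie');
    if (setCookie) {
        document.cookie = setCookie; // keep the session cookie for later requests
    }
};
x.send(send_string);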
[0] https://developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequest#XMLHttpRequest%28%29
[1] https://developer.mozilla.org/en-US/docs/Web/HTTP/Access_control_CORS
[2] https://developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequest#Properties
[3] https://developer.mozilla.org/en-US/docs/Web/HTTP/Access_control_CORS#Preflighted_requests
[4] http://www.html5rocks.com/en/tutorials/cors/#toc-making-a-cors-request
[5] https://bugzilla.mozilla.org/show_bug.cgi?id=966216#c2
