jQuery $.getScript last modified date

I've read that it can be wise (for caching purposes) to request a file using its last-modified date and let the server resolve it to the original file. That way you can set caching to, for example, 10 years and use a stable name for a given version of the file.
However, since I also load JavaScript asynchronously on my site, I need to be able to do the same from JavaScript/jQuery.
This is my current code; how would I get the last-modified date of the script being loaded?
//load new js
if (typeof loadedScripts[url] === 'undefined') {
    $.getScript("javascript/" + url + ".js", function() {
        if (typeof window["load_" + url] !== 'undefined') {
            promises = promises.concat(window["load_" + url](html));
        }
        loadedScripts[url] = 1;
    });
}
else {
    if (typeof window["load_" + url] !== 'undefined') {
        promises = promises.concat(window["load_" + url](html));
    }
}
(It also executes a function on load, but that is not relevant to this question.)
I know it is possible to get the last-modified date of the current document with document.lastModified, but I'm unsure how that would translate into a $.getScript call.
I have also enabled caching:
//set caching to true
$.ajaxSetup({
    cache: true
});

For caching purposes, I would rather suggest the ETag:
http://en.wikipedia.org/wiki/HTTP_ETag
We use it for busting the cache when needed, and it works great.
However, jQuery's Ajax function provides an ifModified param:
http://api.jquery.com/jquery.ajax/
Here's the explanation:
Allow the request to be successful only if the response has changed
since the last request. This is done by checking the Last-Modified
header. Default value is false, ignoring the header. In jQuery 1.4
this technique also checks the 'etag' specified by the server to catch
unmodified data.
Using this param, the first request for the script would look like this:
GET /script.js HTTP/1.1
Host: www.orange-coding.net
HTTP/1.1 200 OK
Last-Modified: Wed, 02 Jan 2013 14:20:58 GMT
Content-Length: 4096
And a second request would look like this:
GET /script.js HTTP/1.1
Host: www.orange-coding.net
If-Modified-Since: Wed, 06 Oct 2010 08:20:58 GMT
HTTP/1.1 304 Not Modified
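To wire this into the asker's loader, the settings for a conditional script request can be built like this (a sketch assuming jQuery is loaded in the browser; buildScriptRequest is my own helper name, not part of jQuery):

```javascript
// Sketch: $.ajax settings equivalent to $.getScript, but with
// conditional-request support enabled.
function buildScriptRequest(name) {
    return {
        url: "javascript/" + name + ".js",
        dataType: "script", // evaluated as JS, like $.getScript
        ifModified: true,   // send If-Modified-Since / If-None-Match
        cache: true         // keep the browser cache in play
    };
}
```

Usage in the browser would then be something like $.ajax(buildScriptRequest(url)).done(...).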

Related

XMLHttpRequest doesn't update in javascript's interval

I use AJAX to check for user updates every 2 seconds, but my JavaScript does not pick up the new response.
I have one JavaScript file with an XMLHttpRequest object, and every 2 seconds it sends a request to another file (.php) which returns XML with updates. For some reason it doesn't always get the newest content and seems to serve some old cached response.
My javascript file contains this code (simplified):
var updates = new XMLHttpRequest();
updates.onreadystatechange = function(){
    "use strict";
    if(updates.readyState === 4 && updates.status === 200){
        console.log(updates.responseXML);
    }
};
var timer = 0;
clearInterval(timer);
timer = setInterval(function(){
    "use strict";
    updates.open('GET','scripts/check_for_notifications.php', true);
    updates.send();
},2000);
Then I have the PHP file (check_for_notifications.php), where I have this code:
$response = new SimpleXMLElement('<xml/>');
$update = $response->addChild('update');
$update->addChild('content', 'New message');
$update->addChild('redirect', 'some link');
$update->addChild('date', '1.1.2019 12:00');
header('Content-type: text/xml');
print($response->asXML());
Every two seconds I receive a log in my console, but when I change the PHP file while the interval is running (e.g. I change the date to '1.1.2019 11:00' and save it), I still receive '12:00' in the console. It seems the responseXML is cached and never updated. Is there any way I could "flush" the output, or am I doing it wrong?
It's probably a cache problem. In the browser's network console, you should see a 304 Not Modified response.
To be sure, you can add an element in the url to bypass the cache:
updates.open('GET','scripts/check_for_notifications.php?nocache=' + new Date().getTime(), true);
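Note that the separator must be '?' when the URL has no query string yet, and '&' when it already has one; a small helper (the name is mine) can choose it:

```javascript
// Sketch: append a cache-busting parameter with the correct separator.
function noCacheUrl(url) {
    var sep = url.indexOf('?') === -1 ? '?' : '&';
    return url + sep + 'nocache=' + new Date().getTime();
}

// e.g. updates.open('GET', noCacheUrl('scripts/check_for_notifications.php'), true);
```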
If this works, you will need to configure the server (Apache or nginx) to prevent the file from being cached. That is cleaner than the timestamp workaround: the browser does not store cache entries unnecessarily.
Apache .htaccess or Apache conf, something like
<Files "check.php">
    <IfModule mod_headers.c>
        Header set Cache-Control "no-cache, no-store, must-revalidate"
        Header set Pragma "no-cache"
        Header set Expires 0
    </IfModule>
</Files>
Nginx conf, something like
location = /check.php {
    add_header 'Cache-Control' 'no-cache, no-store, must-revalidate';
    expires off;
}
You can also look at the Fetch API: https://hacks.mozilla.org/2016/03/referrer-and-cache-control-apis-for-fetch/
Be careful with your code: it can launch several requests simultaneously, and you can end up with a self-inflicted DoS if there are too many users.
If requests take more than 2 seconds because the server is slow, further requests will be sent in the meantime, slowing the server down even more.
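One way to avoid that pile-up is to queue the next request only after the previous one has finished, using setTimeout instead of setInterval. A sketch (fetchUpdates and poll are my own names; fetchUpdates stands in for the XHR call and must invoke its callback when the response has been handled):

```javascript
// Sketch: self-rescheduling poll. The next request is only queued once the
// current one completes, so slow responses cannot overlap.
function poll(fetchUpdates, delayMs, schedule) {
    schedule = schedule || setTimeout; // injectable for testing
    (function tick() {
        fetchUpdates(function done() {
            schedule(tick, delayMs); // reschedule only after completion
        });
    })();
}
```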

Chrome/IE abort POST response

Chrome and Microsoft IE are aborting a POST response but it's working fine in Firefox. When I run a test through Fiddler I notice the POST headers in Chrome/IE do not contain Cache-Control: max-age=0 and Origin: ... but are otherwise the same.
In Chrome when I press the submit button the POST is sent to the server, processed, and the client aborts the response. After a few reposts, the client finally accepts the response; however the server has already processed the request so it results in duplicate info. This never happens on Firefox, it just always accepts the first response.
It seems to only happen if the request is large (i.e. it contains a lot more fields for the server to process), leading me to think this has something to do with the time it takes the server to process the request (in Firefox the request takes about 9 seconds).
Is there something in Firefox that would cause it to wait longer for this response? Or vice-versa, something in IE/Chrome that could be making it end prematurely?
It may not be relevant but this is a Perl Mason site. The response headers in the page which has the form being submitted:
HTTP/1.1 200 OK
Date: Tue, 07 Aug 2018 19:08:57 GMT
Server: Apache
Set-Cookie: ...; path=/
Set-Cookie: TUSKMasonCookie=...; path=/
Expires: Mon, 1 Jan 1990 05:00:00 GMT
Pragma: no-cache
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0, max-age=0
Keep-Alive: timeout=5, max=100
Connection: Keep-Alive
It turns out it was JavaScript on the page that was responsible for the reposting. A setTimeout() which recursively called its own function kept being set even after the form data had been posted within the method.
In Chrome/IE/Edge the form submission would be POSTed and the function would continue to set another timeout calling itself. The subsequent call would again POST and abort the connection waiting on the original.
Firefox however would not repost although it too would continue to set and recall the function.
The fix was to add a flag to track when the POST was submitted and when set, to stop the timeout cycle:
function countdown(secs, starttime) {
    var d = new Date();
    if (!starttime) {
        starttime = d.getTime() / 1000;
    }
    var nowtime = d.getTime() / 1000;
    var timeleft = (starttime + secs) - nowtime;
    timeleft = Math.max(0, timeleft);
    var isposted = false; // <-- Added as fix
    if (timeleft == 0) {
        isposted = true; // <-- Added as fix
        alert("Time is up. Click OK.");
        var frm = document.getElementById("frm");
        frm.submit();
    }
    if (!isposted) { // <-- Added as fix
        timeout = setTimeout(function () {
            countdown(secs, starttime);
        }, 1000);
    }
}
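The guard-flag idea can also be factored out into a tiny wrapper that makes any function callable at most once (a generic sketch; makeOnce is my own name, not from the original page):

```javascript
// Sketch: wrap a function so repeated calls are ignored after the first.
function makeOnce(fn) {
    var done = false;
    return function () {
        if (done) return; // already ran, e.g. form already POSTed
        done = true;
        return fn.apply(this, arguments);
    };
}

// e.g. var submitOnce = makeOnce(function () { frm.submit(); });
```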

How to mock a 501 error from ajax call

I've got an HTTP 501 error bubbling up in production from this call:
return $.ajax({
    url: finalUrl,
    success: function (result) {
        console.log('success call result');
        console.log(result);
        finalResult = result;
    },
    data: JSON.stringify(data),
    type: 'PATCH',
    contentType: 'application/json'
});
How can I return a mock simulating the error so I can test a fix outside of production? I looked at the response tab in Chrome and I see an HTML message:
<HTML><HEAD>
<TITLE>Unsupported Request</TITLE>
</HEAD><BODY>
<H1>Unsupported Request</H1>
PATCH to http://demo.site.com/serviceName/v1/requests/9305e338-666a-e611-8516-000c291891bb not supported.<P>
Reference #8.f0fd717.1472154919.9959c96
</BODY></HTML>
We suspect that the API is not being hit at all, blocked by a firewall. Should I expect a string or an object in this case? If an object, what are its members?
Fiddler says:
Server: AkamaiGHost
Mime-Version: 1.0
Content-Type: text/html
Content-Length: 350
Expires: Thu, 25 Aug 2016 20:21:49 GMT
Date: Thu, 25 Aug 2016 20:21:49 GMT
Connection: close
From the error, it looks like you're hitting a back-end that isn't configured for that request-type.
See these questions for a potential solution, assuming you control the back-end too.
Does tomcat support http PATCH method?
How do I stop Apache httpd from rejecting HTTP PATCH requests?
If you do control the back-end, but the above doesn't help, make sure that your controller function supports that request method. In Spring, for example, you have to declare that explicitly:
@Controller
public class MyController {
    @RequestMapping(value = "/api/patchTheThing", method = RequestMethod.PATCH)
    @ResponseBody
    public String patchTheThing(....) {
        ...
    }
}
Just set up an endpoint on a server of your choice and use your web server configuration or the language of your choice to issue a 501 response. But the problem seems pretty clear: the server is not expecting a PATCH request. If you had a networking problem you would not be getting a response at all.
You should also strongly consider making sure your JavaScript code handles not just the happy-path/success outcome but other types of server response as well, so your application can better recover from such errors.

Something is breaking my Node.js HTTP requests and I don't know what is breaking it

I am parsing in about 4000 URLs with a generic Node.js HTTP request script:
(function (i){
http.get(options, function(res) {
var obj = {};
obj.url = hostNames[i];
obj.statusCode = res.statusCode;
obj.headers = res.headers;
db.scrape.save(obj);
}).on('error',function(e){
console.log("Error: " + hostNames[i] + "\n" + e.stack);
})
})(i);
Around 1300 URLs in, I get this error, which stops the entire script. I don't know what page.ly is, as I do not have that in my list of URLs. I've done a lot of research, but I could not pin-point what's causing this error.
If someone is familiar with HTTP requests on Node.js - could you help me out?
Error: key page.ly must not contain '.'
at Error (unknown source)
at Function.checkKey (/Users/loop/node_modules/mongojs/node_modules/mongodb/node_modules/bson/lib/bson/bson.js:1421:11)
at serializeObject (/Users/loop/node_modules/mongojs/node_modules/mongodb/node_modules/bson/lib/bson/bson.js:355:14)
at packElement (/Users/loop/node_modules/mongojs/node_modules/mongodb/node_modules/bson/lib/bson/bson.js:854:23)
at serializeObject (/Users/loop/node_modules/mongojs/node_modules/mongodb/node_modules/bson/lib/bson/bson.js:359:15)
at Function.serializeWithBufferAndIndex (/Users/loop/node_modules/mongojs/node_modules/mongodb/node_modules/bson/lib/bson/bson.js:332:10)
at BSON.serializeWithBufferAndIndex (/Users/loop/node_modules/mongojs/node_modules/mongodb/node_modules/bson/lib/bson/bson.js:1502:15)
at InsertCommand.toBinary (/Users/loop/node_modules/mongojs/node_modules/mongodb/lib/mongodb/commands/insert_command.js:132:37)
at Connection.write (/Users/loop/node_modules/mongojs/node_modules/mongodb/lib/mongodb/connection/connection.js:198:35)
at __executeInsertCommand (/Users/loop/node_modules/mongojs/node_modules/mongodb/lib/mongodb/db.js:1745:14)
at Db._executeInsertCommand (/Users/loop/node_modules/mongojs/node_modules/mongodb/lib/mongodb/db.js:1801:5)
Loops-MacBook-Air:JS loop$
What could prevent this? It seems my script does not scale very well.
EDIT: From the answers I am getting - there exists a key somewhere that has a ".", which isn't allowed in MongoDB, and I am supposed to escape it. But the question remains - if my keys are only url, statusCode, and headers, what is causing the key with a . in it to show up?
EDIT: Bug is found. Answer below.
This error is raised when you attempt to persist an object in MongoDB and one (or more) of its keys contains the character '.', e.g.:
{
    "name": "bob",
    "url": "http://example.com",
    "some.field": "value"
}
would raise the error Error: key some.field must not contain '.'.
Scrub your object keys of '.'s before saving to MongoDB!
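A sketch of such scrubbing, replacing dots in top-level keys before insertion (the helper name and the '_' replacement character are my choices):

```javascript
// Sketch: replace '.' in object keys so MongoDB will accept the document.
// Only handles top-level keys; nested documents would need recursion.
function sanitizeKeys(obj) {
    var clean = {};
    Object.keys(obj).forEach(function (key) {
        clean[key.replace(/\./g, '_')] = obj[key];
    });
    return clean;
}
```

In the question's script this would become, for example, obj.headers = sanitizeKeys(res.headers) before db.scrape.save(obj).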
The site "divensurf.com" has a header called page.ly: v4.0.
I have no idea what that is, but it broke my import into MongoDB, since keys cannot contain the '.' character. I found the culprit by printing the output to a .txt file, searching for the header page.ly, finding the site, and deleting it.
I will be sanitizing the headers before importing.
Thanks for the help guys.
HTTP/1.1 304 Not Modified
X-Varnish: 2236761436 2236710300
Vary: Accept-Encoding,Cookie,X-UA-Device
Cache-Control: max-age=7200, must-revalidate
X-Cache: V1HIT 5
Content-Type: text/html; charset=UTF-8
Page.ly: v4.0
Content-Encoding: gzip
X-Pingback: http://divensurf.com/xmlrpc.php
Date: Thu, 21 Mar 2013 19:45:35 GMT
Accept-Ranges: bytes
Via: 1.1 varnish
Connection: keep-alive
Last-Modified: Thu, 21 Mar 2013 19:40:57 GMT
Age: 278

Force a reload of page in Chrome using Javascript [no cache]

I need to reload a page using JavaScript and ensure that it does not pull from the browser cache but instead reloads the page from the server.
[As elements of the page will have changed in the interim]
On IE and FF I found that the following code worked fine;
window.location.reload(true);
However it does not work on Chrome or Safari.
I tried the following, but also to no avail;
window.location.replace(location.href);
document.location.reload(true);
document.location.replace(location.href);
Is there a solution to this issue?
Findings
After looking into this I have found that this issue is HTTP Protocol handling;
Chrome sends a request with Pragma: no-cache HTTP field
Server responds with Last-Modified: DATE1 field
JS uses location.reload(true) to force a reload from server not cache
Chrome sends a request with If-Modified-Since: DATE1 field
Server responds with HTTP Status 304 Not Modified
The server application is at fault for not noticing the state change in the dynamic page content, and thus not returning a 200.
However, Chrome/WebKit is the only browser that sends an If-Modified-Since field when location.reload(true) is called from JS.
I thought I would put my findings here in-case someone else comes across the same issue.
You can use this hack:
$.ajax({
    url: window.location.href,
    headers: {
        "Pragma": "no-cache",
        "Expires": -1,
        "Cache-Control": "no-cache"
    }
}).done(function () {
    window.location.reload(true);
});
To ensure the page isn't loaded from cache, you can add a unique number to the query string:
window.location = location.href + '?upd=' + 123456;
You can also use a timestamp (e.g. new Date().getTime()) instead of 123456.
This is what I do to ensure my application file is force reloaded on chrome:
var oAjax = new XMLHttpRequest();
oAjax.open( 'get', '/path/to/my/app.js' );
oAjax.setRequestHeader( 'Pragma', 'no-cache' );
oAjax.onreadystatechange = function() {
    if( oAjax.readyState === 4 ) {
        self.location.reload();
    }
};
oAjax.send();
Try window.location = window.location
Great findings! I just encountered the same issue and this really helps a lot!
However, in addition to your findings, it seems that Chrome always sends a GET request for location.reload(), while IE/FF repeat the last request instead.
