I'm using a method where I create a .js file on server #1 containing document.write calls that output HTML, and then a simple JS include in the HTML on server #2 loads that HTML (there are multiple server #2s). This basically replaces an iframe approach, with the advantage that each server #2 owner controls their own CSS.
The method works perfectly as is. My question is about caching. Each time the page is loaded on server #2, I want the .js reloaded, since it will change frequently on server #1. This appears to be the case in every browser I tested, but can I rely on it as the default behaviour, or does it depend on browser settings? Despite everything I've read on caching, I can't figure out what triggers the reload in a case like this.
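For context, the pattern being described boils down to something like this (file names and markup are placeholders, not the actual code):

```js
// widget.js - served from server #1; each server #2 page pulls it in with
// a plain include such as <script src="http://server1.example.com/widget.js"></script>
document.write('<div class="shared-content">');
document.write('  <p>HTML generated and maintained on server #1</p>');
document.write('</div>');
```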
You can control browser caching using HTTP headers on the server side, such as Cache-Control and Expires. More here - http://www.w3.org/Protocols/rfc2616/rfc2616-sec13.html
In a case like this, caching is triggered by the cache policy of the .js file, not the HTML file.
The browser doesn't cache the rendered page (well, it does for the back button, but that's not what we're talking about); it caches the source files. Therefore, even if the HTML page is configured to be cached for a long time, the JavaScript-injected content will only be cached for as long as the .js file itself has been configured to be.
To configure caching policy you need to set specific headers on the server side. Sometimes you can do this in a CGI script. Sometimes you can do this in the server configuration files.
Google "http caching" and read up on how to configure a page to be cached or not cached (also google "json disable caching" or "ajax disable caching" because this issue crops up a lot with ajax).
I have a locally-stored project whose directory structure is the following (I minimized non-relevant folders):
What I want to do is add a <header> to an HTML file like index.html whose contents are loaded from an external HTML file, so that all I have to write in index.html is <header> and my solution loads the content automatically.
To do this, I'd like to use JavaScript (preferably jQuery, but I'll accept other solutions if they work where jQuery doesn't, or if they work and execute faster than jQuery).
I don't think I should use an <iframe>, since it would probably increase loading times more than jQuery/JavaScript (which, as I said, is what works now, when the website is live).
Right now, I'm using the jQuery .load() function. I don't know much about jQuery, but I've been told that it should work locally - and it doesn't, for me.
My browser's console shows me the problem:
jquery-3.1.1.min.js:4 XMLHttpRequest cannot load file:///C:/Users/GalGr/Desktop/eiomw/header.html. Cross origin requests are only supported for protocol schemes: http, data, chrome, chrome-extension, https, chrome-extension-resource.
And I'm trying to overcome it.
This code works on my live website. It might not be fully up to date with the files I linked to below, but that doesn't matter - it's the linked files' code that matters.
This is the index.html file:
index.html
This is the header.html file:
header.html
This is the main_script.js file:
main_script
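The linked files themselves aren't reproduced here, but the pattern boils down to something like this (a sketch; the selector and file name are assumed from the description above):

```js
// main_script.js - sketch: fetch header.html via AJAX and inject it into the <header> element
$(function () {
  $('header').load('header.html');
});
```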
The reason you're having a problem with this locally is mainly down to security measures in your browser.
Essentially, whenever you use jQuery's load() function, it makes a separate HTTP request (an approach known as AJAX) for the file or URL you give it.
Modern browsers enforce that the URL you request using AJAX methods is from the same origin (server) as a security feature to stop pages randomly loading content from anywhere on the internet in the background. In your case it seems like this shouldn't affect you because you're browsing your pages locally and the request you're making using load() is also for a local file (header.html).
However, I am assuming you're just opening up the page directly in your browser, so your browser's URL will look something like 'file:///C:/Users...' (similar example in the error message you gave). This means your browser is directly reading the file from disk and interpreting it as HTML to display the page. It seems likely you don't actually have a local HTTP server hosting the page, otherwise the URL would start with 'http://'. It is for this reason that the browser is giving the security error, even though your AJAX request for header.html is technically from the same source as the page it is executed on.
Your live server has an HTTP server hosting the pages, so everything works fine there: you're using HTTP as normal, and this security feature doesn't get in your way.
I would suggest that you simply install an HTTP server locally on your dev machine. You don't even need to 'install' one per se; there are loads of development HTTP servers that just run standalone, so you can start them up whenever you want to browse your local HTML files. As you appear to be on Windows, I'd check out either IIS (Windows' HTTP server) or IIS Express (like IIS, but runs standalone). There are also many others available, like Apache, Nginx, etc.
If you do this, you can host your pages on something like 'http://localhost/index.html'. Then, any AJAX requests you make for local files will work fine, just like your server.
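If a full IIS install feels heavy and Node.js happens to be available, a throwaway static file server is only a few lines; this is a rough sketch (only a handful of MIME types are mapped), not something specific to the setup above:

```js
// static-server.js - run with "node static-server.js", then browse http://localhost:8000/index.html
const http = require('http');
const fs = require('fs');
const path = require('path');

const types = { '.html': 'text/html', '.js': 'application/javascript', '.css': 'text/css' };

http.createServer((req, res) => {
  const file = path.join(__dirname, req.url === '/' ? 'index.html' : req.url);
  fs.readFile(file, (err, data) => {
    if (err) { res.writeHead(404); res.end('Not found'); return; }
    res.writeHead(200, { 'Content-Type': types[path.extname(file)] || 'application/octet-stream' });
    res.end(data);
  });
}).listen(8000);
```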
Hope that makes sense, and I'm not telling you something you already know?
Why not use something more straightforward, like mustache.js?
I found a solution:
Using PhpStorm's built-in local web server, I was able to emulate a server that handles my requests and responses.
I have a whole lot of CSS changes for my site, so I have used query-string versioning to load the updated CSS files. But I read in an article that when some browsers, like IE, see a question mark in the URL, they always hit the server to get the file and don't use the cache.
Is this true?
It varies. The main concern is not IE, but rather proxy servers between you and the client.
Personally, I use links of the form //example.com/t=12345/css/main.css
That t=12345 is the file's modification time, inserted by my "static resource management" class.
Then, a simple .htaccess rewrite rule strips that part out, leaving just /css/main.css as the target file.
From the browser's perspective, it's just a weirdly named folder, and it will cache according to the headers it receives. This will work for proxy servers too. Anything that can cache, will cache.
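The "static resource management" class mentioned above isn't shown, but the idea is easy to sketch; here is a hypothetical Node.js helper that builds such a URL from the file's modification time (paths and names are assumptions):

```js
// versioned-url.js - hypothetical helper: embed the file's mtime in the URL path
const fs = require('fs');
const path = require('path');

// versionedUrl('/css/main.css') -> '/t=1700000000/css/main.css'
function versionedUrl(webPath) {
  const filePath = path.join(__dirname, 'public', webPath); // document root is an assumption
  const mtime = Math.floor(fs.statSync(filePath).mtimeMs / 1000);
  return '/t=' + mtime + webPath;
}
```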
I version all of my client side JS files like "/js/myfile.js?v=3903948" so that my clients don't need to clear their browser cache to ensure they get the updated files. But every time I push an update, without fail, at least one person runs into a problem where they are running the old version and get some kind of error. I used to think that this was just them having already been on the page during the release and just needing to reload the browser, but this happened to me today when I was definitely not previously on the page. I browsed to the live site and was running the old code. I needed to do a browser refresh on that page to get the new file.
What can cause this?
PS I was using Chrome on Win7, but I have seen clients report this before on all different browsers.
If your main web page can also be cached, then the old version of that page can be requesting the old version of the JS file. JS file versioning works best if the page that actually refers to the JS file cannot be cached or has a very short caching time.
I agree with jfriend00 that the web page itself is being cached and is thus requesting the old JavaScript version.
To prevent this, you can have the JavaScript file loaded by an AJAX (POST) request, either asking the server for the current (latest) version number to download, or requesting the JavaScript itself and inserting it, e.g. into the head of the page.
Edit: see for example here
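The linked example isn't reproduced here, but a sketch of that approach with jQuery might look like this (the URL is an assumption; cache: false makes jQuery append a timestamp parameter so a stale copy is never reused):

```js
// Fetch the script over AJAX and let jQuery evaluate it, bypassing the cached copy
$.ajax({
  url: '/js/myfile.js',
  dataType: 'script', // jQuery executes the response as JavaScript
  cache: false        // appends a "_=<timestamp>" parameter to defeat caching
});
```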
I make a quick AJAX request to the server for the version it expects them to have, then force them to refresh the page if the client's script is old.
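A minimal sketch of that check might look like the following; the endpoint name and version constant are assumptions, not the poster's actual code:

```js
// version-check.js - ask the server which script version it expects, reload if ours is stale
var CURRENT_VERSION = '3903948'; // baked into the deployed script

$.get('/api/script-version', function (latest) {
  if (latest !== CURRENT_VERSION) {
    // force a full reload so the browser fetches the freshly versioned files
    window.location.reload(true);
  }
});
```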
It seems that a proxy or load balancer is serving old content instead of the new version. Also check your IIS/web server settings for how these files are cached and expired.
You can check what is going on on the wire with tools like Fiddler.
I have a web app written in ASP.NET MVC 3.0. There are some largish scripts (jQuery, jQuery UI) which I want to ensure are cached for best performance. In the Chrome Developer Tools Network tab the scripts always take around 1.5 seconds to be received when a page is loaded. I would assume if they are cached this would be near instant.
Is there any way to ensure javascript is being cached and how to tell if it is or isn't?
For jQuery in particular, it is better to use someone else's CDN - you will not have to stream this content from your server, AND caching is properly handled by someone else. See http://docs.jquery.com/Downloading_jQuery for recommended CDNs.
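For example, pulling jQuery from Google's CDN is just a script tag (the version number here is illustrative; use whichever version your site actually targets):

```html
<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.6.0/jquery.min.js"></script>
```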
For files that you have to host yourself make sure you set correct caching headers.
For static content you need to rely on the server (likely IIS in the ASP.NET case) to set the correct headers - see http://support.microsoft.com/kb/247404 for some details, and search for "iis cache control" for more links.
For dynamic content, choose the values you need for the OutputCache attribute, or set the headers yourself (e.g. see http://www.asp.net/mvc/tutorials/improving-performance-with-output-caching-cs).
We have an internal web application that acts as a repository to which users can upload files. These files can be any format, including HTML pages.
We have tested that in IE8, if you download an HTML file that contains some script that tries to access your cookies and, after downloading, you choose the "Open" option, the script executes and gets your cookie information with no problems at all.
In fact, that script could use the XMLHttpRequest object to call the server and perform malicious operations within the session of the user who downloaded the file.
Is there any way to avoid this? We have tested that both Chrome and Firefox do not let this happen. How could this behaviour be avoided in any browser, including IE8?
Don't allow the upload of arbitrary content. It's simply a terrible idea.
One potential "solution" could be to only host the untrusted uploads on a domain that doesn't have any cookies and that the user doesn't associate any trust with in any way. This would be a "solution", but certainly not the ideal one.
Some more practical options could be an authorisation-based process, where each file goes through an automated review and then a manual confirmation of the automated cleaning/analysis phase.
All in all though, it's a very bad idea to allow the general public to do this.
That's a really bad idea from a security point of view. Still, if you wish to do this, include the HTTP response header Content-Disposition: attachment. It will force the browser to download the file instead of opening it. In Apache, this is done by adding Header set Content-Disposition "attachment" to the .htaccess file.
Note that it's a bad idea just to add Content-Type: text/plain as mentioned in one of the answers, because it won't work in Internet Explorer. When IE receives a file with a text/plain Content-Type header, it turns on its MIME sniffer, which tries to determine the file's real content type (because some servers send all files as text/plain). If it finds HTML code inside the file, it will force the browser to treat the file as text/html and render it.
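For illustration only, here is a minimal Node.js sketch of serving the uploaded files as forced downloads; the uploads directory and URL layout are assumptions, not part of the answer:

```js
// download.js - serve uploaded files as attachments so the browser saves them instead of rendering
const http = require('http');
const fs = require('fs');
const path = require('path');

http.createServer((req, res) => {
  const name = path.basename(req.url);                // strip any directory components
  const file = path.join(__dirname, 'uploads', name); // assumed uploads folder
  fs.readFile(file, (err, data) => {
    if (err) { res.writeHead(404); res.end(); return; }
    res.writeHead(200, {
      'Content-Disposition': 'attachment; filename="' + name + '"',
      'Content-Type': 'application/octet-stream'      // never let the browser render it
    });
    res.end(data);
  });
}).listen(8080);
```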
If you really need to have the users upload HTML files, you should make sure the HTML files in this directory are served with the mime type text/plain rather than text/html or similar.
This will prevent the opened files from executing scripts in the browser. If you're using Apache, see the AddType directive.