Reasons why Google Chrome is ignoring Cache-Control header - javascript

I have a specific asset on my server that responds to requests with the following header:
Cache-Control: public, max-age=2592000, immutable
Sometimes the site might have to request the same file more than 10 times (the reason is not important); that's why I configured this header.
Almost all mobile devices, desktops, and tablets (along with their browsers) respect it, but some do not: they just ignore it and request the file from the server again, E-V-E-R-Y single time. I'm using BrowserStack to test, so maybe the problem is there; I'm not sure of anything now. Have you ever experienced such a thing? Is there a workaround, or something I could do to debug it?
Thank you.

immutable is an Extension Cache-Control directive and is not supported by all browsers.
According to http://developer.mozilla.org:
Extension Cache-Control directives are not part of the core HTTP caching standards document.
If you check on the same page you can see that Chrome does not support it.
You may be better off using a simpler directive, like:
Cache-Control: public, max-age=31536000
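If you control the server, here is a minimal sketch of sending that simpler header, assuming a Node/Express stack (the question never names the server, so the framework, path, and port are placeholders):

// Minimal Node/Express sketch (assumed stack): serve an asset with a
// long-lived Cache-Control header, as suggested above.
var express = require('express');
var app = express();

app.get('/asset.js', function (req, res) {
  res.set('Cache-Control', 'public, max-age=31536000');
  res.sendFile('/var/www/asset.js'); // placeholder path to the asset
});

app.listen(8080);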

Related

Ignore X-frame-header using javascript

I would like to ignore the X-Frame-Options header on my website so that an iframe can load external websites. There are Chrome extensions like this one which work perfectly. How can I implement the same concept through JavaScript?
You can’t use frontend JavaScript code running in a browser to cause the X-Frame-Options response header to be ignored. X-Frame-Options is a security feature designed in part as a defense against clickjacking attacks. If any site could just use some JavaScript code to cause browsers to ignore X-Frame-Options, that would pretty much make it completely useless.
That’s the reason why the only way you can cause it to be ignored in your own browser is by intentionally opting-in to insecure browsing by installing an extension as mentioned in the question.
But you can’t use JavaScript to force insecure browsing on other users by bypassing security features like X-Frame-Options that browsers have built-in support for.

WebSocket on IE10 giving a SecurityError

I am currently developing a website under IE10 (on Windows 8), using WebSockets in JavaScript. It runs fine under Firefox 18 and Chrome 25, but on IE10 I get a SecurityError when I establish the connection.
What I am doing seems pretty straightforward:
websocket = new WebSocket('wss://hello.dev.mydomain.net');
But IE doesn't like it:
SCRIPT5022: SecurityError
The script is on "https://test.dev.mydomain.net" (not the real address obviously).
What bothers me is that if I just double-click the file on my local computer (e.g. file://...) it just works. Even worse: if I use Fiddler to monitor HTTP traffic, it also works, whereas there seems to be no connection attempt at all without Fiddler, as detailed in the API's specs. (See below.)
Judging by the WebSocket spec, the exception should also appear on Chrome/Firefox... but it does not. So I doubt it has anything to do with HTTP/HTTPS. In any case, I am using a wss socket on an https page... Moreover, when I replace the wss address with another valid server found in an online example, it works.
I don't know if this is relevant, but the IP for test.dev.mydomain.net is 10.14.x.x whereas hello.dev.mydomain.net is 194.247.x.x. I don't know if that could trigger some kind of security behavior in IE only...
One more thing: I have a certificate for *.dev.mydomain.net, and IE does not seem to have a problem with it. The script originally resides on a server called my.name.dev.mydomain.net, but since I am accessing it from another URL (I got a redirect since we first thought it could have been some kind of Same Origin Policy issue), I don't see how it could matter. At least I hope it does not...
Any idea is welcome.
EDIT: adding the sites to the trusted zone does not work either.
It looks like IE throws a SecurityError if you're trying to open a websocket on a local (intranet) domain. To overcome this, you may disable IE's automatic algorithm for recognizing local sites. This can be done in Tools > Internet Options > Security > Local Intranet > Sites.
Uncheck all checkboxes (or only a particular one, if you know exactly how your domain ended up among the intranet sites).
Note that IE uses (among other things) its proxy settings to determine local sites: if your domain is listed as excluded from proxying in proxy settings, then it will probably be treated as intranet one. This is why WebSockets work if you enable Fiddler: it modifies IE proxy settings and thus the list of intranet sites changes.
I had this problem in Windows 7/IE11 after applying a security patch. For Windows 10/Edge it's the same story.
As this is a local websocket (ws://localhost), you have to add ws://localhost to the Internet Explorer configuration (Tools > Internet Options > Security > Local Intranet > Sites > Advanced).
In Windows 10/Microsoft Edge you will find this configuration in Control Panel > Internet Options.
UPDATE
The address of your webapp (https://test.dev.mydomain.net) must be added to the local intranet zone too.
Well, my question wasn't that successful, so I'll post the "workaround" I found.
I got another address for the website, in 194.247.x.x too. This, magically, solved it. I guess IE doesn't like mixing local and external stuff and watches the IP.
Anyways, I hope this may come in handy to anyone who's got the same issue.
If you have a solution to solve the "real" issue by configuring IE, let me know :)
Cheers,
Browsers have a WebSocket connection limit. For example, Internet Explorer has a default limit of 6 WebSocket connections per host header name; the same limit applies to the WinForms WebBrowser component.
One solution is to add values under the registry key Computer\HKEY_LOCAL_MACHINE\SOFTWARE\WOW6432Node\Microsoft\Internet Explorer\Main\FeatureControl\FEATURE_WEBSOCKET_MAXCONNECTIONSPERSERVER. Just add a DWORD value named after the executable, for example iexplore.exe (or your application's executable name if you use the WebBrowser component), and set it to a value in the range 2..128.
A second way to avoid the SecurityError is to spread the connections across multiple subdomains, as sketched below.
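The answer doesn't include code for the subdomain approach, but a minimal sketch (hostnames are hypothetical and must all resolve to the same server) could rotate new sockets across several subdomains so no single host name hits IE's per-host limit:

// Sketch: spread WebSocket connections over multiple subdomains to stay
// under IE's per-host connection limit. Hostnames are placeholders.
var hosts = ['ws1.example.com', 'ws2.example.com', 'ws3.example.com'];
var next = 0;

function openSocket() {
  var host = hosts[next % hosts.length];
  next++;
  return new WebSocket('wss://' + host);
}

var socket = openSocket(); // each call rotates to the next subdomain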
The hostname/IP address the client connects to should be the same as the hostname/IP address the server is listening on; otherwise you will get the above error.
1) Make sure the server is configured to listen on the intended IP/localhost, and if not, explicitly specify the hostname on the server.
2) Use the same hostname in the client. This will solve the issue. It worked for me...
I encountered the error (although it did not say the SCRIPT5022 part; it just reported "ScriptError"). I got around the issue by clicking on "Trusted Sites" and then adding the machine hosting the remote websocket. Note that to add it to the trusted sites:
1) I had to supply the address without the "ws://" part (just mymachine.mydomain.com).
2) I had to uncheck the box that says "Require server verification (https:)".
After I was done adding the domain, I re-checked the "Require server verification (https:)" box, and I would recommend everyone do the same. Unchecking the box is only a workaround to add sites that don't begin with https (rather ws:// in my case).
I had the same issue in one of my customers' environments.
It turned out that they had a proxy configuration that did not allow the connection to the WebSocket endpoint directly and did not support the WebSocket protocol.
The temporary solution was to disable the proxy, and everything started working. To do that, go to: Internet Explorer Options > Connections tab > LAN settings button > un-check Automatically detect settings.
The long-term solution is to edit the proxy's configuration (.pac file) to exclude the address of the WebSocket endpoint, as sketched below.
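A .pac file is plain JavaScript, so the exclusion can be a single rule. A minimal sketch, assuming a hypothetical WebSocket host ws.example.com and an existing proxy at proxy.example.com:8080:

// PAC sketch: send the WebSocket endpoint DIRECT, everything else
// through the proxy. Both hostnames are placeholders.
function FindProxyForURL(url, host) {
  if (dnsDomainIs(host, "ws.example.com")) {
    return "DIRECT"; // bypass the proxy for the WebSocket endpoint
  }
  return "PROXY proxy.example.com:8080";
}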
Hope this helps someone.
In addition to making sure that the site is not in the local intranet zone (as in the answers above), ensure that if https is used, then wss is used as well.
This is not an issue in other browsers, but IE is a bit more finicky.

Replace remote JavaScript file with a local debugging copy using Greasemonkey or userscript

While debugging a client app that uses a Google backend, I have added some debugging versions of the functions and inserted them using the Chrome Developer Tools script editor.
However, there are a number of limitations with this approach. The first is that the editor doesn't always seem to work with de-minified files, and when the JS file is 35K lines long, this is a problem.
Another issue is that all the initialization done at load time uses the original "unpatched" functions, so this is not ideal.
I would like to replace the remote javascript.js file with my own local copy, presumably using some regex on the file name, or whatever strategy is suitable. I am happy to use either Firefox or Chrome, whichever is easier.
So basically, as #BrockAdams identified, there are a couple of solutions to these types of problems, depending on the requirements, and they follow one of two methods:
1) The browser API switcharoo.
2) The proxy-based interception befiddlement.
The browser API switcharoo
Both Firefox and Chrome support browser extensions that can take advantage of platform-specific APIs to register event handlers for "onbeforeload" or "onBeforeRequest", in the case of Firefox and Chrome respectively. The Chrome APIs are currently experimental, hence these tools are likely to be better developed under Firefox.
Two tools that definitely do something like what is required are AdBlock Plus and Jsdeminifier, both of which have their source code available.
The key point for these two Firefox add-ons is that they intercept the web request before the browser gets its hands on it, and they operate on the other side of the http/https encryption stage, hence they can see the decrypted response. However, as identified in the other post, they don't do the whole thing: although Jsdeminifier was very useful, I didn't find a Firefox plugin that does exactly what I wanted. Still, I can see from those plugins that it is possible with both Firefox and Chrome.
The proxy-based interception befiddlement
This is definitely the better option in a plain HTTP environment. There is a whole bunch of proxies, such as Privoxy, Fiddler2, and Charles Web HTTP proxy, and presumably some that I didn't look at specifically, such as Snort, that support filtering of some sort.
The simplest solution for me was FoxyProxy and Privoxy on Firefox: configure a user.action and a user.filter to detect the URL of the page, and then apply a filter which swaps out the original src tag for my own.
The https case: proxy vs plugin
When the request is https, the proxy can't see the request URL or the response body, so it can't do the cool swapping stuff. However, there is one option available for those who like to mess with their browser, and that is the man-in-the-middle SSL proxy. The Charles Web HTTP proxy appears to be the main solution to this problem. Basically, the way it works is that when your browser makes a request to the remote HTTPS server, the SSL proxy intercepts the request and, from the IP address of the server, generates a server certificate on the fly, which it signs with its own root CA and sends back to the browser. The browser obviously complains about the self-signed cert, but here you can choose to install the SSL proxy's root CA cert into the browser, befuddling the browser and allowing the SSL proxy to man-in-the-middle and make replacements and filters on the raw response body.
Alternative: roll your own Chrome extension
I decided to go with rolling my own Chrome extension, which I am planning to make available. Currently it's very much hardcoded to my own requirements, but it works pretty well, even for https requests. Another benefit is that a browser plugin solution can be more tightly integrated with the browser developer tools.
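The extension source isn't included in the answer, but a minimal sketch of the interception part, assuming the Manifest V2-era chrome.webRequest API (with the "webRequest" and "webRequestBlocking" permissions) and hypothetical URLs, might look like this:

// background.js sketch: redirect a remote script to a local debugging
// copy. Both URLs are hypothetical placeholders.
chrome.webRequest.onBeforeRequest.addListener(
  function (details) {
    // Serve the patched copy from a local dev server instead.
    return { redirectUrl: 'http://localhost:8000/javascript.debug.js' };
  },
  { urls: ['*://static.example.com/javascript.js'], types: ['script'] },
  ['blocking']
);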

Cross-domain SSL handshake failure in Firefox using xhr, client-certificate

The setup is as follows:
Firefox (both 3.x and 4b) with properly set up and working certificates, including a client certificate.
Web page with an XMLHttpRequest() type of AJAX call to a different subdomain.
Custom web server in said subdomain accepting requests, responding with a permissive Access-Control-Allow-Origin header and requiring client verification.
The problem is that Firefox aborts the request abruptly (well, that's what it says in Firebug, anyway). Running the setup with openssl s_server instead hints that Firefox actually doesn't even send the client certificate:
140727260153512:error:140890C7:SSL routines:SSL3_GET_CLIENT_CERTIFICATE:peer
did not return a certificate:s3_srvr.c:2965:ACCEPT
The same exact setup works perfectly with Chrome, suggesting perhaps a bug in Firefox. However, performing the ajax call with a <script> element injected into the DOM seems to work as intended...
So, has anyone else run into this? Is it a bug? Any workarounds? Is there something obvious missing?
Chiming in 5 years later probably isn't much help to the OP, but in case someone else has this issue in the future...
Firefox appears to not send the client certificate with a cross-origin XHR request by default. Setting withCredentials=true on the XHR instance resolved the issue for me. Note that I also did not see this problem with Chrome--only Firefox.
For more info see this Mozilla Dev Network blog post. In particular, the following statement:
By default, in cross-site XMLHttpRequest invocations, browsers will not send credentials. A specific flag has to be set on the XMLHttpRequest object when it is invoked.
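Concretely, the flag in question is withCredentials. A minimal sketch, reusing the endpoint from the question:

// Sketch: cross-origin XHR that sends credentials (cookies, HTTP auth,
// and the TLS client certificate) along with the request.
var xhr = new XMLHttpRequest();
xhr.open('GET', 'https://hello.dev.mydomain.net/');
xhr.withCredentials = true; // without this, Firefox omits the client cert
xhr.onload = function () {
  console.log(xhr.status, xhr.responseText);
};
xhr.send();

Note that the server must also reply with Access-Control-Allow-Credentials: true, and Access-Control-Allow-Origin must name the origin explicitly rather than using the * wildcard.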
The reason injecting the script works, as opposed to a plain XHR request, is the Same-Origin Policy. This would probably explain why Chrome allows the XHR but not FF: Chrome considers the subdomain part of the same origin, but FF does not.
Injecting scripts from other domains (which is what Google Analytics does) is allowed, and it is one common practice for handling this situation.
The way my team handles this situation is by making a request through a server-side proxy.
I would recommend using a server-side proxy if you can, but the script injection method works fine as long as the code is coming from a trusted source.
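The answer doesn't show the proxy itself, but the idea is that the browser only ever talks to its own origin and the server forwards the call. A minimal Node/Express sketch (an assumed stack; the upstream path is a placeholder):

// Same-origin pass-through proxy sketch: the page requests /proxy on
// its own origin; the server fetches the cross-domain resource.
var express = require('express');
var https = require('https');
var app = express();

app.get('/proxy', function (req, res) {
  https.get('https://hello.dev.mydomain.net/resource', function (upstream) {
    res.status(upstream.statusCode);
    upstream.pipe(res); // stream the upstream body back to the browser
  }).on('error', function () {
    res.sendStatus(502);
  });
});

app.listen(3000);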
I also found this article which describes your situation.

Cookie not being sent by IE7

I have two copies of IE7 with the same exact security settings and the same exact builds, on two different machines, both running WinXP. On one version of IE, my application's cookie headers are properly sent to the server. On the other, no cookies are being sent at all.
What are some points to troubleshoot in this scenario?
Try Fiddler to trace what's happening; it's more appropriate (and simpler) than Wireshark for this purpose.
http://www.fiddlertool.com/fiddler/
You may want to get something like Wireshark to see what is being sent across the wire. Firebug has some net utilities like this, but your problem seems specific to IE. Still, trying another browser couldn't hurt in troubleshooting this issue.
Other items to look for are the advanced properties of the IE installation and the zone that the website is in.
Be careful which links you are accessing. It took me almost a day to discover why the same browser sometimes sent the session cookies and sometimes didn't.
Accessing the page via http://www.example.com will create different cookies than http://example.com (without the 'www'), because the browser sees them as two different access points. :)
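One standard way to avoid that split (not from the answer, just common practice) is to scope the cookie to the parent domain with an explicit Domain attribute, so both hosts receive it:

// Sketch: an explicitly scoped cookie that is sent to example.com and
// www.example.com alike. Name and value are placeholders.
document.cookie = 'session=abc123; Domain=.example.com; Path=/';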
Also be careful about your browser settings; you should make sure they are identical on both machines.
