This morning, upon upgrading my Firefox browser to the latest version (from 22 to 23), some of the key aspects of my back office (website) stopped working.
Looking at the Firebug log, the following errors were being reported:
Blocked loading mixed active content "http://code.jquery.com/ui/1.8.10/themes/smoothness/jquery-ui.css"
Blocked loading mixed active content "http://ajax.aspnetcdn.com/ajax/jquery.ui/1.8.10/jquery-ui.min.js"`
among other errors caused by the latter of the two above not being loaded.
What does the above mean and how do I resolve it?
I found this blog post which cleared up a few things. To quote the most relevant bit:
Mixed Active Content is now blocked by default in Firefox 23!
What is Mixed Content?
When a user visits a page served over HTTP, their connection is open for eavesdropping and man-in-the-middle (MITM) attacks. When a user visits a page served over HTTPS, their connection with the web server is authenticated and encrypted with SSL and hence safeguarded from eavesdroppers and MITM attacks.
However, if an HTTPS page includes HTTP content, the HTTP portion can be read or modified by attackers, even though the main page is served over HTTPS. When an HTTPS page has HTTP content, we call that content “mixed”. The webpage that the user is visiting is only partially encrypted, since some of the content is retrieved unencrypted over HTTP. The Mixed Content Blocker blocks certain HTTP requests on HTTPS pages.
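To see which resources on a page are "mixed", here is a minimal browser-console sketch (my illustration, not from the quoted post; the selectors are just the common resource-bearing tags). Run it on an HTTPS page:

// Sketch: list resources that would be requested over plain HTTP.
// src/href return resolved absolute URLs, so protocol-relative URLs
// correctly show up as https: on an HTTPS page and are not flagged.
var els = document.querySelectorAll('script[src], link[href], img[src], iframe[src]');
for (var i = 0; i < els.length; i++) {
  var url = els[i].src || els[i].href;
  if (url && url.indexOf('http://') === 0) {
    console.warn('Insecure (mixed content) resource:', url);
  }
}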
The resolution, in my case, was simply to ensure the jQuery includes were as follows (note the removal of the protocol):
<link rel="stylesheet" href="//code.jquery.com/ui/1.8.10/themes/smoothness/jquery-ui.css" type="text/css">
<script type="text/javascript" src="//ajax.aspnetcdn.com/ajax/jquery.ui/1.8.10/jquery-ui.min.js"></script>
Note that the temporary 'fix' is to click on the 'shield' icon in the top-left corner of the address bar and select 'Disable Protection on This Page', although this is not recommended for obvious reasons.
UPDATE: This link from the Firefox (Mozilla) support pages is also useful in explaining what constitutes mixed content and, as given in the above paragraph, does actually provide details of how to display the page regardless:
Most websites will continue to work normally without any action on your part.
If you need to allow the mixed content to be displayed, you can do that easily:
Click the shield icon Mixed Content Shield in the address bar and choose Disable Protection on This Page from the dropdown menu.
The icon in the address bar will change to an orange warning triangle Warning Identity Icon to remind you that insecure content is being displayed.
To revert the previous action (re-block mixed content), just reload the page.
It means you're calling HTTP content from an HTTPS page. You can use src="//url.to/script.js" in your script tag and it will auto-detect the protocol.
Alternatively, you can use https in your src even if you will be publishing the page over http. This will avoid the potential issue mentioned in the comments.
In the absence of a whitelist feature, you have to make an "all or nothing" choice: you can disable mixed content blocking completely.
The Nothing Choice
You will need to permanently disable mixed content blocking for the current active profile.
In the "Awesome Bar," type "about:config". If this is your first time you will get the "This might void your warranty!" message.
Yes you will be careful. Yes you promise!
Find security.mixed_content.block_active_content. Set its value to false.
The All Choice
iDevelApp's answer is awesome.
Put the below <meta> tag into the <head> section of your document to force the browser to upgrade insecure connections (http) to secure connections (https). This can solve the mixed content problem if the connection is able to use https.
<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">
If you want to block the mixed content instead, add the below tag into the <head>:
<meta http-equiv="Content-Security-Policy" content="block-all-mixed-content">
The error is given for security reasons: use "https", not "http", in the resource URLs.
For example :
"https://code.jquery.com/ui/1.8.10/themes/smoothness/jquery-ui.css"
"https://ajax.aspnetcdn.com/ajax/jquery.ui/1.8.10/jquery-ui.min.js"
In the page that makes the mixed-content HTTPS-to-HTTP call, we can add the following entry to the relevant <head> and get rid of the mixed content error.
<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">
If you are consuming an internal service via AJAX, make sure the URL points to https; this cleared up the error for me.
Initial AJAX URL: "http://XXXXXX.com/Core.svc/" + ApiName
Corrected AJAX URL: "https://XXXXXX.com/Core.svc/" + ApiName
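For illustration, a sketch of the corrected call as it might look with jQuery (the endpoint is the placeholder carried over from the fragment above, and the API name is hypothetical):

// Sketch only: "XXXXXX.com/Core.svc" and ApiName are placeholders, not a real service.
var ApiName = "GetSomething"; // hypothetical API name
$.ajax({
    url: "https://XXXXXX.com/Core.svc/" + ApiName, // https matches the page's scheme
    type: "GET",
    success: function (data) { console.log(data); },
    error: function (xhr) { console.error("Request failed:", xhr.status); }
});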
Simply changing HTTP to HTTPS solved this issue for me.
WRONG :
<script src="http://code.jquery.com/jquery-3.5.1.js"></script>
CORRECT :
<script src="https://code.jquery.com/jquery-3.5.1.js"></script>
I had this same problem because I bought a CSS template that grabbed an external JavaScript file via http://whatever.js.com/javascript.js. I went to that URL in my browser and changed it to https://whatever..., using SSL, and it worked. So in my HTML script tag I just changed the URL to use https instead of http, and it worked.
To force a redirect to the https protocol, you can also add this directive to the .htaccess file in the root folder:
RewriteEngine on
RewriteCond %{REQUEST_SCHEME} =http
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
@Blender's comment is the best approach. Never hard-code the protocol anywhere in the code, as it will be difficult to change if you move from http to https: you would need to manually edit and update all the files.
Protocol-relative URLs are always better, as the browser automatically detects the protocol:
src="//code.jquery.com
I've managed to fix this using these steps:
For Firefox users
Open a new tab and enter about:config in the address bar to go to the configuration page.
Search for security.mixed_content.block_active_content
Change TRUE to FALSE.
For Chrome users
Click the Not Secure Warning next to the URL
Click Site Settings on the popup box
Change Insecure Content to Allow
Close and refresh the page
I found that if you have issues including or mixing content from something like http://www.example.com in your page, you can fix that by putting //www.example.com instead.
I faced the same problem when my site went from http to https. We had added a rule to redirect all requests from http to https.
You need to add the redirection rule for intra-site requests, but you have to remove the redirection rule for external JS/CSS.
I just fixed this problem by adding the following code in header:
<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">
@if (env('APP_DEBUG'))
<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">
@endif
This is Laravel Blade syntax. Remember to use it for debugging only, to avoid MITM attacks and eavesdropping.
Also, changing http -> https for Ajax requests, normal JS scripts, or CSS will solve the issue.
If your app server is WebLogic, make sure the WLProxySSL ON entry exists (and is not commented out) in the weblogic.conf file in the web server's conf directory. Then restart the web server, and it will work.
Related
I am developing a web page that needs to display, in an iframe, a report served by another company's SharePoint server. They are fine with this.
The page we're trying to render in the iframe is giving us X-Frame-Options: SAMEORIGIN which causes the browser (at least IE8) to refuse to render the content in a frame.
First, is this something they can control or is it something SharePoint just does by default? If I ask them to turn this off, could they even do it?
Second, can I do something to tell the browser to ignore this http header and just render the frame?
If the 2nd company is happy for you to access their content in an IFrame then they need to take the restriction off - they can do this fairly easily in the IIS config.
There's nothing you can do to circumvent it and anything that does work should get patched quickly in a security hotfix. You can't tell the browser to just render the frame if the source content header says not allowed in frames. That would make it easier for session hijacking.
If the content is GET-only and you don't post data back, then you could fetch the page server-side and proxy the content without the header, but then any postback would get invalidated.
UPDATE 2019-12-30: It seems that this tool is no longer working! [Request for update!]
UPDATE 2019-01-06: You can bypass X-Frame-Options in an <iframe> using my X-Frame-Bypass Web Component. It extends the IFrame element by using multiple CORS proxies and it was tested in the latest Firefox and Chrome.
You can use it as follows:
(Optional) Include the Custom Elements with Built-in Extends polyfill for Safari:
<script src="https://unpkg.com/#ungap/custom-elements-builtin"></script>
Include the X-Frame-Bypass JS module:
<script type="module" src="x-frame-bypass.js"></script>
Insert the X-Frame-Bypass Custom Element:
<iframe is="x-frame-bypass" src="https://example.org/"></iframe>
The X-Frame-Options header is a security feature enforced at the browser level.
If you have control over your user base (e.g. an IT dept for a corporate app), you could try something like a Greasemonkey script (if you can a) deploy Greasemonkey across everyone and b) deploy your script in a shared way)...
Alternatively, you can proxy their result. Create an endpoint on your server, and have that endpoint open a connection to the target endpoint, and simply funnel traffic backwards.
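A minimal sketch of what that proxy endpoint might look like, assuming a Node.js/Express server (the route and target URL are hypothetical, not from the original answer; error handling is omitted for brevity):

// Sketch: fetch the remote page and relay it without its X-Frame-Options header.
var express = require('express');
var https = require('https');
var app = express();

app.get('/report-proxy', function (req, res) {
  // Hypothetical upstream URL; replace with the actual report address.
  https.get('https://other-company.example/report', function (upstream) {
    var headers = Object.assign({}, upstream.headers);
    delete headers['x-frame-options']; // strip the frame restriction
    res.writeHead(upstream.statusCode, headers);
    upstream.pipe(res); // funnel the body back to the client
  });
});

app.listen(3000);

As the answer notes, anything that relies on postbacks to the original host will break through such a proxy.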
Yes, Fiddler is an option for me:
Open Fiddler menu > Rules > Customize Rules (this effectively edits CustomRules.js).
Find the function OnBeforeResponse
Add the following lines:
oSession.oResponse.headers.Remove("X-Frame-Options");
oSession.oResponse.headers.Add("Access-Control-Allow-Origin", "*");
Remember to save the script!
As for the second question: you can use Fiddler filters to set the response X-Frame-Options header manually to something like ALLOW-FROM *. But, of course, this trick will only work for you; other users still won't be able to see the iframe content (unless they do the same).
Somewhere in the code, over a secure site, the following snippet is used:
var iframe = document.createElement("IFRAME");
iframe.setAttribute("src", "pugpig://onPageReady");
document.documentElement.appendChild(iframe);
iframe.parentNode.removeChild(iframe);
iframe = null;
The iframe src attribute set here actually triggers a callback, but it causes Chrome (version 54) to complain about "Mixed Content", as the src attribute is interpreted as a non-https URL on an https:// domain, and that version of Chrome does not present users with an easy option to allow the mixed content to load anyway (e.g. a shield icon in the address bar).
Changing the Chrome version / using a different browser / starting chrome with the --allow-running-insecure-content switch is not an option for certain reasons so my question is, is there a way to make the "pugpig://onPageReady" part be perceived as an https url?
You can try this:
<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests" />
Or
<meta http-equiv="Content-Security-Policy" content="block-all-mixed-content" />
Paste it between the <head>...</head> tags.
The HTTP Content-Security-Policy (CSP) block-all-mixed-content directive prevents loading any assets using HTTP when the page is loaded using HTTPS.
All mixed content resource requests are blocked, including both active and passive mixed content. This also applies to <iframe> documents, ensuring the entire page is mixed content free.
The upgrade-insecure-requests directive is evaluated before block-all-mixed-content, and if the former is set, the latter is effectively a no-op. It is recommended to set one directive or the other, not both.
As far as I know, no, there's not. If there were, it would be considered a security flaw and it would be fixed.
mixed content explanation
I'm using a service that automatically constructs and hosts launch pages. In the body of their code, they have a call to jquery:
<script type="text/javascript" src="//ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js?cache=2015-09-22-09">
Unfortunately, since I'm in China, googleapis.com is blocked, and the page has to wait for this code to time out before it will render. This portion is autogenerated as part of the template, and I can't change it. However, I can insert custom JavaScript in the head and the body.
Are there any ways I can prevent this code from making the request to googleapis.com or to force it to abort after it has already made the request?
-EDIT-
Here's a screen cap of the network tab when I try to load the page. As you can see, the call to googleapis.com hangs for 1.4 minutes until it times out, at which point DOMContentLoaded fires and the entire page loads.
Right: if you are able to put HTML in the head of the document, not just execute JavaScript, you could use a meta tag to block external script loading:
<meta http-equiv="Content-Security-Policy" content="script-src 'self'">
From the Mozilla Content-Security-Policy Meta Tag Docs:
Authors are strongly encouraged to place meta elements as early in the document as possible, because policies in meta elements are not applied to content which precedes them. In particular, note that resources fetched or prefetched using the Link HTTP response header field, and resources fetched or prefetched using link and script elements which precede a meta-delivered policy will not be blocked.
So the meta tag will only work in the head, certainly before the script which loads jQuery. You can also whitelist URLs by adding them to the content attribute of the meta tag.
If you can only execute JavaScript, you can add the meta tag dynamically. Unfortunately, it is likely the browser will have decided on its policies by the time it is added. Nevertheless, it can be added with:
var meta = document.createElement('meta');
meta.httpEquiv = "Content-Security-Policy";
meta.content = "script-src 'self'";
document.getElementsByTagName('head')[0].appendChild(meta);
More interesting reading material for solving the 'prevent an external JS request' mystery:
Use JavaScript to prevent a later `<script>` tag from being evaluated?
Good Luck!
Consider using a Content Security Policy. It would be an unusual use case, but with the CSP you can tell the browser that it is not allowed to access googleapis.com before your government even gets a say in the matter. In this way, the browser won't even try to load it, and the page will not hang.
Yeah, @Niet's suggestion seems nice. To add to his answer, here's how you can block the googleapis domain using CSP:
Content-Security-Policy: script-src 'self';
This would instruct the browser to only execute your own domain's scripts.
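If you control the server that sends the page, the same policy can be delivered as a response header; here is a minimal sketch assuming a Node.js/Express app (a hypothetical setup, not from the original answer):

// Sketch: send the CSP header so only same-origin scripts may run.
var express = require('express');
var app = express();

app.use(function (req, res, next) {
  res.setHeader('Content-Security-Policy', "script-src 'self'");
  next();
});

app.listen(3000);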
If you have access to the web server and it's Windows, you could add an entry to the hosts file to redirect the Google address to the local server web application and serve the JavaScript from there (if you match the folder structure of the JavaScript link):
c:\windows\system32\drivers\etc\hosts
127.0.0.1 ajax.googleapis.com
This question has nothing to do with the mixed content error. We're about to launch a site. When I navigate from http://example.com to https://example.com, I notice that the CSS/JS/etc. is re-downloaded, as I am using root-relative paths.
Using an HTTP sniffer, I see that the browser thinks https://www.example.com/_css/main.css is different from http://www.example.com/_css/main.css (it's not). Thus the exact same content is downloaded twice, making the site look slow when navigating from http to https (if the user doesn't have both versions cached already).
Is there any way to stop this? The user will almost always hit the non-SSL version of the site first, so is there a script that will wait until the http content is loaded and then force an https version into the user's cache? Or should I just use absolute paths (https://www.example.com/_css/main.css) on every page and on every CSS background image (only two; I use sprites)? Or do we just live with it? Thanks.
Using an HTTP sniffer, I see that the browser thinks https://www.example.com/_css/main.css is different from http://www.example.com/_css/main.css (it's not).
It is a different resource with identical content. The browser has no way to know that they are going to have the same content.
You can redirect (with a 301) from one to the other so you don't have a non SSL version.
Is there anyway to stop this?
Not really.
The user will almost always hit the non-SSL version of the site first, so is there a script that will wait until the http content is loaded and then force an https version into the user's cache?
No. It would be a horrible security problem if a URL could precache content for arbitrary other URLs.
Or should I just use absolute paths (https://www.example.com/_css/main.css) on every page and on every CSS background image (only two; I use sprites)?
That would work, but lead to mixed content issues.
Or do we just live with it?
Yes.
There are a couple of ways to fix this.
Use a base tag, then use relative paths for your resources; caching will be perceived to work for both http and https, though really the content was simply already loaded over https. Demonstration
<base href="https://example.com/" />
Redirect everything to SSL when the user hits the site the Apache way (Redirect SSL)
Redirect permanent / https://example.com/login
You can use an .htaccess RewriteRule to load the https content every time; or a redirect header specified in the http version of the HTML would work more slowly (an extra round-trip) but otherwise just as well, I believe.
Use protocol-relative paths.
Instead of this
<link rel="stylesheet" href="http://domain.com/style.css">
<link rel="stylesheet" href="https://domain.com/style.css">
use this
<link rel="stylesheet" href="//domain.com/style.css">
then it will use the protocol of the parent page.
You can reference the files without the protocol specifier, e.g.:
<link rel="stylesheet" type="text/css" href="//mysite.com/_css/main.css" />
See this post for more details:
Can I change all my http:// links to just //?
I am using a mobile-network-based internet connection, and the source code is being rewritten when they present the site to the end user.
On localhost my website looks fine, but when I browse the site from the remote server via the mobile network connection, the site looks bad.
Checking the source code, I found that a piece of JavaScript code is being injected into my pages which disables some of the CSS and makes the site look bad.
I don't want image compression or bandwidth compression at the cost of my well-designed CSS.
How can I prevent or stop the mobile network provider (Vodafone in this case) from proxy injecting their JavaScript into my source code?
You can use this on your pages. It still compresses and puts everything inline, but it won't break scripts like jQuery, because it will escape everything based on W3C standards:
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
On your server you can set the cache control:
"Cache-Control: no-transform"
This will stop ALL modifications and present your site as it is!
Reference docs here
http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.9.5
http://stuartroebuck.blogspot.com/2010/08/official-way-to-bypassing-data.html
Web site exhibits JavaScript error on iPad / iPhone under 3G but not under WiFi
You're certainly not the first. Unfortunately many wireless ISPs have been using this crass and unwelcome approach to compression. It comes from Bytemobile.
What it does is have a proxy recompress all the images you fetch to a smaller size by default (making image quality significantly worse). Then it crudely injects a script into your document that adds an option to load the proper image for each recompressed image. Unfortunately, since the script is horribly written 1990s-style JS, it craps all over your namespace, hijacks your event handlers, and stands a high chance of messing up your own scripts.
I don't know of a way to stop the injection itself, short of using HTTPS. But what you could do is detect or sabotage the script. For example, if you add a script near the end of the document (between the 1.2.3.4 script inclusion and the inline script trigger) to neuter the onload hook it uses:
<script type="text/javascript">
bmi_SafeAddOnload= function() {};
</script>
then the script wouldn't run, so your events and DOM would be left alone. On the other hand the initial script would still have littered your namespace with junk, and any markup problems it causes will still be there. Also, the user will be stuck with the recompressed images, unable to get the originals.
You could try just letting the user know:
<script type="text/javascript">
if ('bmi_SafeAddOnload' in window) {
var el= document.createElement('div');
el.style.border= 'dashed red 2px';
el.appendChild(document.createTextNode(
'Warning. Your wireless ISP is using an image recompression system '+
'that will make pictures look worse and which may stop this site '+
'from working. There may be a way for you to disable this feature. '+
'Please see your internet provider account settings, or try '+
'using the HTTPS version of this site.'
));
document.body.insertBefore(el, document.body.firstChild);
}
</script>
I'm surprised no one has put this up as an answer yet. The real solution is:
USE HTTPS!
This is the only way to stop ISPs (or anyone else) from inspecting all your traffic, snooping on your visitors, and modifying your website in flight.
With the advent of Let's Encrypt, getting a certificate is now free and easy. There's really no reason not to use HTTPS in this day and age.
You should also use a combination of redirects and HSTS to keep all of your users on HTTPS.
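As an illustration, a minimal sketch of that redirect-plus-HSTS combination, assuming a Node.js/Express app behind a TLS-terminating proxy (the max-age value is a common choice, not prescribed by the answer):

// Sketch: send HTTP visitors to HTTPS and tell browsers to stay there via HSTS.
// 'trust proxy' makes req.secure reflect the original scheme behind a proxy.
var express = require('express');
var app = express();
app.set('trust proxy', 1);

app.use(function (req, res, next) {
  if (!req.secure) {
    return res.redirect(301, 'https://' + req.headers.host + req.url);
  }
  // Instruct browsers to use HTTPS for the next year (31536000 seconds).
  res.setHeader('Strict-Transport-Security', 'max-age=31536000; includeSubDomains');
  next();
});

app.listen(3000);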
Your provider might have enabled a Bytemobile Unison feature called "clientless personalization". Try accessing the fixed URL http://1.2.3.50/ups/ - if it's configured, you will end up on a page which will offer to let you disable any features you don't like, including JavaScript injection.
Good luck!
Alex.
If you're writing your own websites, adding a header worked for me:
PHP:
Header("Cache-Control: no-transform");
C#:
Response.Cache.SetNoTransforms();
VB.Net:
Response.Cache.SetNoTransforms()
Be sure to use it before any data has been sent to the browser.
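A Node.js equivalent might look like this (a sketch, not from the original answer; as noted above, the header must be set before any body bytes are sent):

// Sketch: set Cache-Control: no-transform before writing the response body.
var http = require('http');
http.createServer(function (req, res) {
  res.setHeader('Cache-Control', 'no-transform');
  res.end('<html><body>Untransformed content</body></html>');
}).listen(8080);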
I found a trick. Just add:
<!--<![-->
After:
<html>
More information (in German):
http://www.programmierer-forum.de/bmi-speedmanager-und-co-deaktivieren-als-webmaster-t292182.htm#3889392
The BMI JS isn't only on Vodafone. Virgin Media UK and T-Mobile UK also give you this extra feature, enabled by default and for free. ;-)
In T-Mobile it's called "Mobile Broadband Accelerator".
You can Visit:
http://accelerator.t-mobile.co.uk
or
http://1.2.3.50/
to configure it.
In case the above doesn't apply to you, or for some reason it's not an option,
you could potentially set up your own local proxy (Polipo, with or without Tor).
There is also a Firefox add-on called "BlockSite",
or, as a more drastic approach, reset TCP connections to 1.2.3.0/24:80 on your firewall.
But unfortunately that wouldn't fix the damage.
Funnily enough, T-Mobile and Virgin Media mobile/broadband support are not aware of this feature! (2011-10-11)
PHP: Header("Cache-Control: no-transform"); Thanks!
I'm glad I found this page.
That injector script was messing up my PHP page's source code, making me think I had made an error in my PHP coding when viewing the page source. Even though the script was blocked with the Firefox NoScript add-on, it was still messing up my code.
Well, after that irritating dilemma, I wanted to get rid of it completely, not just block it with the Adblock or NoScript Firefox add-ons or on my PHP pages alone.
To stop http:// 1.2.3.4 completely in Firefox, get the add-on: Modify Headers.
Go to the Modify Headers add-on options, now on the Headers tab.
Select Action: Choose ADD.
For Header Name type in: cache-control
For Header Value type in: no-transform
For Comment type in: Block 1.2.3.4
Click add... Then click Start.
The 1.2.3.4 script will not be injected into any more pages! Yeah!
I no longer see 1.2.3.4 being blocked by NoScript, because it's not there. Yeah.
But I will still add: PHP: Header("Cache-Control: no-transform"); to my php pages.
If you are getting it on a site that you own or are developing, then you can simply override the function by setting it to null. This is what worked for me just fine.
bmi_SafeAddOnload = null;
As for getting it on other sites you visit, you could probably open the devtools console and enter that there to wipe it out if a page is taking a long time to load. I haven't tested that yet, though.
OK, nothing was working for me. So I replace the image URLs every second, because when my DOM updates the problem is there again. Another solution would be to only use background styles auto-included in the pages. Neither is clean.
setInterval(function () { imageUpdate(); }, 1000);

function imageUpdate() {
  console.log('######imageUpdate');
  // The injecting proxy rewrites image URLs to go through its own host;
  // strip that prefix so the originals load directly.
  var images = document.querySelectorAll("img");
  for (var num = 0; num < images.length; num++) {
    var str = images[num].src;
    if (stringBeginWith(str, "http://1.1.1.1/bmi/***yourfoldershere***")) {
      var res = str.replace("http://1.1.1.1/bmi/***yourfoldershere***", "");
      images[num].src = res;
      console.log("replaced " + str + " with " + res);
      // An alternative is to keep the original URL in a data-src attribute
      // and copy it back into src once the DOM has loaded.
    }
  }
}

function stringEndsWith(string, suffix) {
  return string.indexOf(suffix, string.length - suffix.length) !== -1;
}

function stringBeginWith(string, prefix) {
  return string.indexOf(prefix) === 0;
}
An effective solution that I found was to edit your hosts file (/etc/hosts on Unix/Linux-type systems, C:\Windows\System32\drivers\etc\hosts on Windows) to have:
null 1.2.3.4
which effectively maps all requests to 1.2.3.4 to null. Tested with my Crazy John's (owned by Vodafone) mobile broadband. If your provider uses a different IP address for the injected script, just change it to that IP.
Header("Cache-Control: no-transform");
Use the above PHP code in each of your PHP files and you will get rid of the 1.2.3.4 code injection.
That's all.
I too was suffering from the same problem; now it is rectified. Give it a try.
I added to /etc/hosts
1.2.3.4 localhost
Seems to have fixed it.