Page throws TLS error instead of loading - javascript

I've researched this up the wazoo but to no avail. I have a page internally that requires TLS 1.1 or 1.2. If they're not on, you get an error:
This page can’t be displayed
Turn on TLS 1.0, TLS 1.1, and TLS 1.2 in Advanced settings and try
connecting to https://SITEADDRESS again. If this error
persists, it is possible that this site uses an unsupported protocol
or cipher suite such as RC4 (link for the details), which is not
considered secure. Please contact your site administrator.
Obviously the fix is to turn on those protocols in IE. However, what I'd like to put into the page is a check: preload something from the site, make sure it's visible/readable/loadable, and then let the user move forward, OR, if whatever I'm checking can't be loaded, direct them elsewhere.
So what I tried was a PHP file_get_contents of the SAME address:
<?php
$contents = file_get_contents('https://SITEADDRESS');
echo "<pre>";
var_dump($contents);
echo "</pre>";
?>
However, it ends up being able to pull the page code and dump it out?! Which means PHP CAN access the page, yet I know the browser can't, because I get a TLS error when trying to load it normally.
The question is, how can I precheck a URL with TLS as a consideration before forwarding the user on to a page that might not render?

It appears you're mixing server-side and client-side.
I assume PHP is running server-side, and it definitely will not be affected by IE's (which I'm assuming is running client-side) choice of ciphers.
If that doesn't make sense, let me know and I'll give further utterance...
If I'm understanding correctly, you want your own logic running on the client (IE) to detect if the client (IE) can reach the URL.
You could try firing an asynchronous request via Javascript (e.g. https://api.jquery.com/jquery.get/ )
YMMV, I'm not sure if IE will pass the TLS error into your Javascript code or not. Assuming it does, you should be able to at least handle any error events (arising from the Javascript HTTP request) and assume the client should not try to proceed.
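For example, something along these lines (a minimal sketch only; the /ping.txt resource and the /tls-help.html fallback URL are made up, and a cross-origin target would also need CORS headers for the response to be readable):
$.get('https://SITEADDRESS/ping.txt')
    .done(function () {
        // The browser negotiated TLS successfully -- let the user proceed.
        window.location.href = 'https://SITEADDRESS';
    })
    .fail(function () {
        // The request failed (possibly the TLS handshake) -- send them elsewhere.
        window.location.href = '/tls-help.html';
    });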

Loading this file in the head...
<script type="text/javascript" src="https://SITEADDRESS/api/start_session.js"></script>
Then I check for the existence of the variable BG, where BG is a var defined in the loaded file. If BG is undefined (doesn't exist), then something blocked access to the external domain, so I show/hide the DIVs appropriately and the user is never sent to the wrong site!
window.onload = function() {
    if (typeof BG !== 'undefined') {
        document.getElementById('test1').style.display = 'block';
        document.getElementById('test2').style.display = 'none';
    } else {
        document.getElementById('test1').style.display = 'none';
        document.getElementById('test2').style.display = 'block';
    }
};

Related

Detect and log when external JavaScript or CSS resources fail to load

I have multiple <head> references to external js and css resources. Mostly, these are for things like third party analytics, etc. From time to time (anecdotally), these resources fail to load, often resulting in browser timeouts. Is it possible to detect and log on the server when external JavaScript or CSS resources fail to load?
I was considering some type of lazy-loading mechanism whereby, upon failure, a special URL would be called to log the failure. Any suggestions out there?
What I think happens:
The user hits our page and the server side processes successfully and serves the page
On the client side, the HTML header tries to connect to our 3rd party integration partners, usually by a javascript include that starts with "http://www.someothercompany.com...".
The other company cannot handle our load or has shitty up-time, and so the connection fails.
The user sees a generic IE Page Not Found, not one from our server.
So even though my site is up and everything else is running fine, just because this one call out to the third-party servers fails (the one in the HTML page header), we get a whole failure to launch.
If your app/page depends on JS, you can load the content with JS as well (I know that sounds confusing). When you load these resources with JS, you can attach callbacks so that you only enable the functionality of the content that actually loaded and don't have to worry about what didn't load.
var script = document.createElement("script");
script.type = "text/javascript";
script.src = 'http://domain.com/somefile.js';
script.onload = CallBackForAfterFileLoaded;
document.body.appendChild(script);
function CallBackForAfterFileLoaded(e) {
    // Do your magic here...
}
I usually have this be a bit more complex by having arrays of JS and files that are dependent on each other, and if they don't load then I have an error state.
I forgot to mention, obviously I am just showing how to create a JS tag, you would have to create your own method for the other types of files you want to load.
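To make that error state concrete, here is a minimal sketch of the same tag creation with an onerror handler added (the URL and comments are just placeholders):
var script = document.createElement("script");
script.type = "text/javascript";
script.src = "http://domain.com/somefile.js";
script.onload = function () {
    // Safe to use whatever somefile.js defines.
};
script.onerror = function () {
    // The file never loaded -- flip into your error state or load a fallback here.
};
document.body.appendChild(script);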
Hope maybe that helps, cheers
You can look for the presence of an object in JavaScript, e.g. to see if jQuery is loaded or not...
if (typeof jQuery !== 'function') {
// Was not loaded.
}
You could also check for CSS styles missing, for example, if you know a certain CSS file sets the background colour to #000.
if ($('body').css('backgroundColor') !== 'rgb(0, 0, 0)') {
// Was not loaded.
}
When these checks fail, you can make an XHR to the server to log the failures.
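For instance, a rough sketch (the /log-load-failure endpoint and the resource labels are hypothetical; point them at whatever logging URL you set up):
function reportLoadFailure(resource) {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/log-load-failure?resource=" + encodeURIComponent(resource), true);
    xhr.send(null);
}

if (typeof jQuery !== 'function') {
    // jQuery itself never arrived.
    reportLoadFailure('jquery');
} else if ($('body').css('backgroundColor') !== 'rgb(0, 0, 0)') {
    // jQuery is here, but the expected CSS is missing.
    reportLoadFailure('main-css');
}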
What about a ServiceWorker? We can use one to intercept HTTP requests and inspect the response codes to log when an external resource fails to load.
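A rough sketch of that idea (sw.js would be registered from the page with navigator.serviceWorker.register('/sw.js'); the /log-resource-failure endpoint is made up):
self.addEventListener('fetch', function (event) {
    event.respondWith(
        fetch(event.request).then(function (response) {
            // Opaque cross-origin responses always report ok === false, so skip them.
            if (!response.ok && response.type !== 'opaque') {
                fetch('/log-resource-failure?url=' + encodeURIComponent(event.request.url) +
                      '&status=' + response.status);
            }
            return response;
        }).catch(function (err) {
            // Network-level failure (DNS, TLS, timeout, ...).
            fetch('/log-resource-failure?url=' + encodeURIComponent(event.request.url) +
                  '&status=network-error');
            throw err;
        })
    );
});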
Make a hash of the JS file name and the session cookie, and send both the plain file name and the hash. Server-side, compute the same hash: if the two match, log the failure; if not, assume it's abuse.

Stop mobile network proxy from injecting JavaScript

I am using a mobile network based internet connection and the source code is being rewritten when they present the site to the end user.
In the localhost my website looks fine, but when I browse the site from the remote server via the mobile network connection the site looks bad.
Checking the source code, I found that a piece of JavaScript is being injected into my pages which disables some of the CSS and makes the site look bad.
I don't want their image compression or bandwidth compression at the expense of my well-designed CSS.
How can I prevent or stop the mobile network provider (Vodafone in this case) from proxy injecting their JavaScript into my source code?
You can use this on your pages. It still compresses and puts everything inline, but it won't break scripts like jQuery because it will escape everything based on W3C standards:
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
On your server you can set the cache control:
"Cache-Control: no-transform"
This will stop ALL modifications and present your site as it is!
Reference docs here
http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.9.5
http://stuartroebuck.blogspot.com/2010/08/official-way-to-bypassing-data.html
See also: Web site exhibits JavaScript error on iPad / iPhone under 3G but not under WiFi
You're certainly not the first. Unfortunately many wireless ISPs have been using this crass and unwelcome approach to compression. It comes from Bytemobile.
What it does is to have a proxy recompress all images you fetch smaller by default (making image quality significantly worse). Then it crudely injects a script into your document that adds an option to load the proper image for each recompressed image. Unfortunately, since the script is a horribly-written 1990s-style JS, it craps all over your namespace, hijacks your event handlers and stands a high chance of messing up your own scripts.
I don't know of a way to stop the injection itself, short of using HTTPS. But what you could do is detect or sabotage the script. For example, if you add a script near the end of the document (between the 1.2.3.4 script inclusion and the inline script trigger) to neuter the onload hook it uses:
<script type="text/javascript">
bmi_SafeAddOnload= function() {};
</script>
then the script wouldn't run, so your events and DOM would be left alone. On the other hand the initial script would still have littered your namespace with junk, and any markup problems it causes will still be there. Also, the user will be stuck with the recompressed images, unable to get the originals.
You could try just letting the user know:
<script type="text/javascript">
if ('bmi_SafeAddOnload' in window) {
var el= document.createElement('div');
el.style.border= 'dashed red 2px';
el.appendChild(document.createTextNode(
'Warning. Your wireless ISP is using an image recompression system '+
'that will make pictures look worse and which may stop this site '+
'from working. There may be a way for you to disable this feature. '+
'Please see your internet provider account settings, or try '+
'using the HTTPS version of this site.'
));
document.body.insertBefore(el, document.body.firstChild);
}
</script>
I'm surprised no one has put this as an answer yet. The real solution is:
USE HTTPS!
This is the only way to stop ISPs (or anyone else) from inspecting all your traffic, snooping on your visitors, and modifying your website in flight.
With the advent of Let's Encrypt, getting a certificate is now free and easy. There's really no reason not to use HTTPS in this day and age.
You should also use a combination of redirects and HSTS to keep all of your users on HTTPS.
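As a rough illustration only (Node/Express shown here, but the same redirect plus Strict-Transport-Security header can be produced by any server stack):
const express = require('express');
const app = express();

// If TLS is terminated by a proxy in front of the app, trust its X-Forwarded-Proto header.
app.set('trust proxy', true);

app.use(function (req, res, next) {
    if (!req.secure) {
        // Permanent redirect from http:// to https://
        return res.redirect(301, 'https://' + req.headers.host + req.originalUrl);
    }
    // Tell browsers to stay on HTTPS for the next year, including subdomains.
    res.setHeader('Strict-Transport-Security', 'max-age=31536000; includeSubDomains');
    next();
});

app.get('/', function (req, res) {
    res.send('Served over HTTPS, out of reach of the injecting proxy.');
});

app.listen(8080);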
Your provider might have enabled a Bytemobile Unison feature called "clientless personalization". Try accessing the fixed URL http://1.2.3.50/ups/ - if it's configured, you will end up on a page that lets you disable any features you don't like, including JavaScript injection.
Good luck!
Alex.
If you're writing your own websites, adding a header worked for me:
PHP:
Header("Cache-Control: no-transform");
C#:
Response.Cache.SetNoTransforms();
VB.Net:
Response.Cache.SetNoTransforms()
Be sure to use it before any data has been sent to the browser.
I found a trick. Just add:
<!--<![-->
After:
<html>
More information (in German):
http://www.programmierer-forum.de/bmi-speedmanager-und-co-deaktivieren-als-webmaster-t292182.htm#3889392
The BMI JS isn't only on Vodafone. Virgin Media UK and T-Mobile UK also give you this extra feature, enabled by default and for free. ;-)
In T-mobile it's called "Mobile Broadband Accelerator"
You can Visit:
http://accelerator.t-mobile.co.uk
or
http://1.2.3.50/
to configure it.
In case the above doesn't apply to you, or for some reason it's not an option,
you could potentially set up your own local proxy (Polipo, with or without Tor).
There is also a Firefox add-on called "BlockSite",
or, as a more drastic approach, reset TCP connections to 1.2.3.0/24:80 on your firewall.
But unfortunately that wouldn't undo the damage already done to the page.
Funnily enough, T-Mobile and Virgin Media mobile/broadband support are not aware of this feature! (2011.10.11)
PHP: Header("Cache-Control: no-transform"); Thanks!
I'm glad I found this page.
That injector script was messing up my PHP page source code, making me think I had made an error in my PHP coding when viewing the page source. Even though the script was blocked with the Firefox NoScript add-on, it was still messing up my code.
Well, after that irritating dilemma, I wanted to get rid of it completely, and not just block it with the Adblock or NoScript Firefox add-ons or only on my PHP pages.
STOP http://1.2.3.4 completely in Firefox: get the add-on "Modify Headers".
Go to the modify header add on options... now on the Header Tab.
Select Action: Choose ADD.
For Header Name type in: cache-control
For Header Value type in: no-transform
For Comment type in: Block 1.2.3.4
Click add... Then click Start.
The 1.2.3.4 script will not be injected into any more pages! yeah!
I no longer see 1.2.3.4 being blocked by NoScript, because it's not there anymore. yeah.
But I will still add: PHP: Header("Cache-Control: no-transform"); to my php pages.
If you are getting it on a site that you own or are developing, then you can simply override the function by setting it to null. This is what worked for me just fine.
bmi_SafeAddOnload = null;
As for getting it on other sites you visit, then you could probably open the devtools console and just enter that into there and wipe it out if a page is taking a long time to load. Haven't yet tested that though.
OK, nothing worked for me, so I replace the image URLs every second, because whenever my DOM updates the problem comes back. Another solution is to only use background-image styles included in the pages. Neither approach is clean.
setInterval(function() { imageUpdate(); }, 1000);

function imageUpdate() {
    console.log('######imageUpdate');
    var images = document.querySelectorAll("img");
    for (var num = 0; num < images.length; num++) {
        if (stringBeginWith(images[num].src, "http://1.1.1.1/bmi/***yourfoldershere***")) {
            var str = images[num].src;
            var res = str.replace("http://1.1.1.1/bmi/***yourfoldershere***", "");
            images[num].src = res;
            console.log("replace " + str + " by " + res);
            /*
            Another option is to store the original URL in a data-src attribute
            and, after the DOM has loaded, copy each data-src back into src:
            images[num].src = images[num].getAttribute("data-src");
            */
        }
    }
}

function stringEndsWith(string, suffix) {
    return string.indexOf(suffix, string.length - suffix.length) !== -1;
}

function stringBeginWith(string, prefix) {
    return string.indexOf(prefix) === 0;
}
An effective solution that I found was to edit your hosts file (/etc/hosts on Unix/Linux type systems, C:\Windows\System32\drivers\etc\hosts on Windows) to have:
null 1.2.3.4
Which effectively maps all requests to 1.2.3.4 to null. Tested with my Crazy Johns (owned by Vodafone) mobile broadband. If your provider uses a different IP address for the injected script, just change it to that IP.
Header("Cache-Control: no-transform");
Use the above PHP code in each of your PHP files and you will get rid of the 1.2.3.4 code injection.
That's all.
I too was suffering from the same problem; now it is fixed. Give it a try.
I added to /etc/hosts
1.2.3.4 localhost
Seems to have fixed it.

Read a text file

I have looked everywhere and surprisingly can't find a good solution to this! I've got the following code that is supposed to read a text file and display its contents. But it's not reading, for some reason. Am I doing something wrong?
FTR, I can't use PHP for this. It's gotta be Javascript.
var txtFile = new XMLHttpRequest();
txtFile.open("GET", "http://www.mysite.com/todaysTrivia.txt", true);
txtFile.send(null);
txtFile.onreadystatechange = function() {
    if (txtFile.readyState == 4) { // Makes sure the document is ready to parse.
        alert(txtFile.responseText + " - " + txtFile.status);
        //if (txtFile.status === 200) { // Makes sure it's found the file.
        var doc = document.getElementById("Trivia-Widget");
        if (doc) {
            doc.innerHTML = txtFile.responseText;
        }
        //}
    }
    txtFile.send(null);
}
Any good ideas what I'm doing wrong? It just keeps giving me a zero status.
EDIT: I guess it would be a good idea to explain why I need this code. It's basically a widget that other folks can put on their own websites that grabs a line of text from my website and displays it on theirs. The problem is that it really can't be server-side since I've got zero control over everyone else's sites that use this.
If this is cross domain, you won't be able to do this with an xmlhttprequest due to the same origin policy.
This example contains jQuery code.
var text;
$.get("proxy.php", function(data) {
    text = data;
});
Then in proxy.php:
<?php
header('Content-type: application/xml');
$daurl = 'http://www.mysite.com/todaysTrivia.txt';
$handle = fopen($daurl, "r");
if ($handle) {
    while (!feof($handle)) {
        $buffer = fgets($handle, 4096);
        echo $buffer;
    }
    fclose($handle);
}
Example taken from here:
http://jquery-howto.blogspot.com/2009/04/cross-domain-ajax-querying-with-jquery.html
As explained before, XMLHttpRequest is designed to forbid cross-domain requests for security reasons. But nothing prevents you from doing this on your server in PHP.
Another example can be found here: http://usejquery.com/posts/9/the-jquery-cross-domain-ajax-guide
Your problem could be with the fact that you can only request XML data from the same domain via Javascript. This is the biggest issue with AJAX calls - if the text file is on another server, you can't get it via AJAX. If it's on the same server, make your request using a relative URL (no http://).
EDIT
Now that I know what you're trying to accomplish ... my recommendation would be to use an iFrame. Build the system on your server using server-side code and allow remote sites to embed an iFrame to display the output on their own sites. NetworkedBlogs uses this for displaying Facebook features on remote sites. iGoogle uses it extensively with their various Apps and Gadgets. It's a fairly tried-and-true method.
The advantage of using an iFrame is that you'll still have control over most of the content of the widget, but you can give end-users control over the styling (just have your iFrame application accept arguments via query variables to change colors, positions, and sizes).
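For instance, the embed snippet you hand to other site owners could look roughly like this (widget.example.com, the query parameters, and the trivia-widget container id are all made up for illustration):
var frame = document.createElement('iframe');
// Styling options travel as query variables so each site can tweak the look.
frame.src = 'https://widget.example.com/trivia?bg=ffffff&color=333333';
frame.width = '300';
frame.height = '80';
frame.style.border = 'none';
document.getElementById('trivia-widget').appendChild(frame);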
Assuming the AJAX code is right (which I haven't confirmed): you say you can't use PHP for this. If you just mean the page needs to use JavaScript asynchronously but you can still use server code in some places, what about using PHP (or any server-side language) to do the actual work and return the result to the page through AJAX/JavaScript? This would solve the problem Alex brings up.
So instead of getting from mysite.com/something.txt from javascript, get it from SomeAjaxHelper.php (or aspx or whatever).
For cross domain, you would have to use dynamic script tags to fetch data asynchronously. The todaysTrivia file would be a .js file that stores the data as JSON. Google for "dynamic script tags cross domain" if you want to use this technique.
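A minimal sketch of that approach (the showTrivia callback name is an assumption; todaysTrivia.js on your server would wrap the data in a call to it, e.g. showTrivia({"text": "..."})):
function showTrivia(data) {
    // Runs when the script from your server arrives and calls it.
    document.getElementById("Trivia-Widget").innerHTML = data.text;
}

var s = document.createElement("script");
s.src = "http://www.mysite.com/todaysTrivia.js";
document.getElementsByTagName("head")[0].appendChild(s);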

Check in JavaScript if an SSL Certificate is valid

Is there a way to check in JavaScript whether a given host's SSL certificate is valid? (non-blocking)
In my case, I want to display: "you can also use https://.." if via JavaScript I can make a request to https://my_url without being asked to accept an untrusted certificate.
Can this be done asynchronously?
Take a look here: https://support.mozilla.org/pl/questions/923494
<img src="https://the_site/the_image" onerror="redirectToCertPage()">
This solution is tested and working in current versions of FF and Chrome (as of 2022):
<script> var sslCertTrusted = false; </script>
<script src="https://example.com/ssltest.js"></script>
<script>
if (!sslCertTrusted)
{
alert('Sorry, you need to install the certificate first.');
window.location.replace('http://example.com/cert_install_instructions/');
}
else
{
// alert('Redirecting to secure connection')
window.location.replace('https://example.com/');
}
</script>
You of course need to make your web server return this code under the URL https://example.com/ssltest.js:
sslCertTrusted = true;
I'm not exactly sure about the details. But I've seen similar technology used to detect adblocking etc. You may need to piggyback on the window object maybe, if the variable can't be modified by another script, but generally making the above proof of concept work is left as an exercise to the reader.
What I've found up to now - it is possible with Firefox, don't know yet about other browsers:
https://developer.mozilla.org/En/How_to_check_the_security_state_of_an_XMLHTTPRequest_over_SSL
The straight answer is no. Javascript does not provide any means of validating certificates. This is a job left to the browser.
A better approach to this problem is from the server side. If you control the site, then you can render down a variable on the page with information gleaned on the server side.
In .Net something like
var canSecure = <%= MySiteHasSsl ? "true" : "false" %>;
if (canSecure) {
    if (confirm("This site supports SSL encryption. Would you like to switch to a secure connection?")) {
        location.href = "https://mysite.com";
    }
}
I'm not quite sure what your use case is. If you are just trying to "check ahead of time" before you provide a link to someone for another website then the other answers here will be more relevant than mine.
If you are expecting mysite.com to use an SSL certificate that isn't trusted by default in the browser but you have another way of knowing it should be trusted, then you could use a JavaScript TLS implementation to make cross-domain requests to that other site. However, this requires that your website be served on https and trusted in the browser to begin with and the other site to provide a Flash cross-domain policy file.
If this sounds anything like what you want to do, check out the open source Forge project at github:
http://github.com/digitalbazaar/forge/blob/master/README.md
Useful notice: navigator.clipboard will be undefined on Chrome browsers if there's no valid SSL certificate.
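So a crude feature-detection along those lines would be (illustration only; this tells you about the current page's context, not about some other host's certificate):
if (navigator.clipboard === undefined) {
    // Likely not a secure (trusted HTTPS) context in Chrome.
    console.warn('Clipboard API unavailable - probably not a trusted HTTPS page.');
}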
The question doesn't make sense. You can't get the server's SSL certificate without opening an SSL connection to it, and once you've done that, telling the user they can do that too is a bit pointless.
You could run a server elsewhere that handles certificate checks based on whatever you want, then your javascript application sends a request to that server asking for a checkup. This does require that you have at least one server somewhere in the world that you can trust.
A query of this nature can be done in the background quite easily.
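A sketch of that setup, assuming you run a trusted checker service somewhere (checker.example.com and its response shape are invented for the example):
fetch('https://checker.example.com/check?host=my_url')
    .then(function (res) { return res.json(); })
    .then(function (result) {
        if (result.valid) {
            // Safe to show the "you can also use https://..." hint.
        }
    })
    .catch(function () {
        // Checker unreachable - say nothing rather than guessing.
    });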

Dashboard Cross-domain AJAX with jquery

Hey everyone, I'm working on a widget for Apple's Dashboard and I've run into a problem while trying to get data from my server using jquery's ajax function. Here's my javascript code:
$.getJSON("http://example.com/getData.php?act=data",function(json) {
$("#devMessage").html(json.message)
if(json.version != version) {
$("#latestVersion").css("color","red")
}
$("#latestVersion").html(json.version)
})
And the server responds with this json:
{"message":"Hello World","version":"1.0"}
For some reason though, when I run this the fields on the widget don't change. From debugging, I've learned that the widget doesn't even make the request to the server, so it makes me think that Apple has some kind of external URL block in place. I know this can't be true though, because many widgets phone home to check for updates.
Does anyone have any ideas as to what could be wrong?
EDIT: Also, this code works perfectly fine in Safari.
As requested by Luca, here's the PHP and Javascript code that's running right now:
PHP:
echo $_GET["callback"].'({"message":"Hello World","version":"1.0"});';
Javascript:
function showBack(event)
{
    var front = document.getElementById("front");
    var back = document.getElementById("back");
    if (window.widget) {
        widget.prepareForTransition("ToBack");
    }
    front.style.display = "none";
    back.style.display = "block";
    stopTime();
    if (window.widget) {
        setTimeout('widget.performTransition();', 0);
    }
    $.getJSON('http://nakedsteve.com/data/the-button.php?callback=?', function(json) {
        $("#devMessage").html(json.message)
        if (json.version != version) {
            $("#latestVersion").css("color", "red")
        }
        $("#latestVersion").html(json.version)
    })
}
In Dashcode, click Widget Attributes, then make sure the Allow Network Access option is checked. I've built something that simply refused to work, and this was the solution.
Cross-domain Ajax requests (using the XMLHttpRequest / ActiveX object) are not allowed in the current standard, as per the W3C spec:
"This specification does not include the following features which are being considered for a future version of this specification: Cross-site XMLHttpRequest;"
However, there is one technique for doing AJAX requests cross-domain: JSONP, which works by including a script tag on the page plus a little server configuration.
jQuery supports this, but instead of responding on your server with this
{"message":"Hello World","version":"1.0"}
you'll want to respond with this:
myCallback({"message":"Hello World","version":"1.0"});
myCallback must be the value in the "callback" parameter you passed in the $.getJSON() function. So if I was using PHP, this would work:
echo $_GET["callback"].'({"message":"Hello World","version":"1.0"});';
Apple has some kind of external URL block in place.
In your Info.plist you need to have the key AllowNetworkAccess set to true.
<key>AllowNetworkAccess</key>
<true/>
Your code works in Safari because it is not constrained as it is in the Dashboard, and Safari is not standards-compliant in that it DOES allow cross-site AJAX. FF IS standards-compliant in that it DOES NOT allow cross-site AJAX.
If you are creating a dashboard widget, why don't you use the XMLHttpRequest Setup function in the code library of DashCode. Apple built these in so you don't need to install 3rd party JS libraries. I'm not sure about JSON support but perhaps starting here will lead you in a better direction.
So another solution is to create your own server-side web service whose CORS settings you control. The user's web browser can't access another site directly, but if you wrap that other site in your own web service (on the same domain), it does not cause an issue.
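A very rough sketch of that wrapper (Node/Express with the built-in fetch of Node 18+ assumed; the /proxy/data route name is made up):
const express = require('express');
const app = express();

app.get('/proxy/data', async function (req, res) {
    // The server fetches the other site; the browser only ever talks to your domain.
    const upstream = await fetch('http://example.com/getData.php?act=data');
    res.set('Access-Control-Allow-Origin', '*'); // or restrict to your widget's origin
    res.type('application/json').send(await upstream.text());
});

app.listen(3000);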
Interesting that it works in Safari. As far as I know, to do cross-domain AJAX requests you need to use the jsonp dataType.
http://docs.jquery.com/Ajax/jQuery.getJSON
http://bob.pythonmac.org/archives/2005/12/05/remote-json-jsonp/
Basically you need to add callback=? to your query string and jQuery will automatically replace it with the correct method name, e.g.:
$.getJSON("http://example.com/getData.php?act=data&callback=?",function(){ ... });
EDIT: put the callback=? bit at the end of the query string just to be safe.
