Favicons and mixed content WHYYYY? - javascript

I am building a web app that is served over https. I am getting a lot of console warnings like these:
Mixed Content: The page at 'https://www.sharewalks.com/' was loaded over
HTTPS, but requested an insecure image
'http://yandex.st/lego/_/pDu9OWAQKB0s2J9IojKpiS_Eho.ico?1493850556643'.
This content should also be served over HTTPS.
There are 14 of these, from the following URLs (the numbers change):
FAVICON ERRORS:
http://www.google.com/favicon.ico?1493850556625
http://www.baidu.com/favicon.ico?1493850556625
http://www.cloudflare.com/favicon.ico?1493850556625
http://www.yandex.ru/favicon.ico?1493850556633
OTHER?:
http://yandex.st/lego/_/pDu9OWAQKB0s2J9IojKpiS_Eho.ico?1493850556633
I need all content to be served over HTTPS because I want to use geolocation, and I've read that some browsers won't allow it unless ALL content is served over HTTPS. In testing, navigator.geolocation works in Chrome on my laptop, but not in mobile browsers (Chrome, Safari and Firefox).
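For reference, a stripped-down version of the kind of call I'm making (an illustrative sketch, not my exact code):

    // Geolocation is only available in a secure context, so check for that first.
    if (window.isSecureContext && 'geolocation' in navigator) {
      navigator.geolocation.getCurrentPosition(
        function (pos) {
          console.log('lat/lng:', pos.coords.latitude, pos.coords.longitude);
        },
        function (err) {
          console.error('geolocation failed:', err.message);
        }
      );
    } else {
      console.warn('Not a secure context - the browser will refuse geolocation.');
    }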
But I'm not requesting these favicons, and I don't even know where they are being called from.
My question is: what are these favicons, and why are they messing with me? Is there any way around this?

OK, I did a global search within my project for some of these URL names, and lo! It turns out I was using a library called is-online, which calls some of these sites as "tests" to see whether you're online. I changed the 'hostnames' file to use the full https URLs, and voilà, the errors disappeared. Thanks to Barmar for taking the time to answer me without merely downvoting my question!
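For anyone hitting the same thing, the change was along these lines (an illustrative sketch only - the layout of is-online's hostnames file may differ between versions):

    // Illustrative: make every connectivity-probe URL explicitly https so the
    // checks no longer fire mixed-content warnings on an https page.
    module.exports = [
      'https://www.google.com/favicon.ico',
      'https://www.baidu.com/favicon.ico',
      'https://www.cloudflare.com/favicon.ico',
      'https://www.yandex.ru/favicon.ico'
    ];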

In Firefox, press F12 and open the Console to see the mixed-content errors.
In Chrome you get the same results under Developer Tools -> Console.

Related

Delphi TWebBrowser JavaScript Errors and Cookies

I have been stuck on this for a couple of weeks now, and this is a follow-on from the SO question Delphi REST Debugger Returns Error 429 Too Many Requests but Browser Returns JSON as Expected.
I wanted to get the content of a URL response using the TNetHTTPRequest and TNetHTTPClient components. I was continually getting 429 “too many requests” errors. Using Firefox's Inspect Element to look at network and storage, I discovered that I needed to receive cookies and then send those cookies with my request. Unfortunately, one of the cookies essential to the website content seems to depend (I think) on the execution of JavaScript. I went back to first principles and dropped a TWebBrowser on a form (VCL), and sure enough the browser shows a JavaScript error, “Expected Identifier”.
When I use the TWebBrowser in FMX, it does not throw an error; it just does not return the website contents at all and remains blank. I need FMX as I will be in a cross-platform mobile environment.
The URL is https://shop.coles.com.au/a/national/home
I use Delphi Community Edition 10.3.3 Rio.
The URL returns perfectly in commercial browsers: Firefox, Safari, Chrome, and even CEF4Delphi. Unfortunately, I can’t use CEF as I need cross-platform support.
I would like to know how to get the website content returned to the browser (or, even better, to NetHTTPClient) without script errors, and how to access the browser's current cookies.
Any help will be most appreciated.
Thanks,
John.
URL returns perfectly in commercial browsers ... without script errors and how to access the browsers current cookies
If you inspect the network traffic (F12 > Network, then request your URL) or use uMatrix (to block everything that doesn't belong to the domain by default), you'll see the JS does at least one XHR to amazonaws.com. Your HTTP transfer alone (as done by TNetHTTP*) works fine, and you get the same resource that every internet browser gets.
However, you don't operate on what you got, in contrast to the internet browser, which also automatically parses the HTML, sees the JS resources, and executes them. TWebBrowser does not do what you take for granted, most likely due to security settings (try to get an error console in there, preferably F12 again). You need to do the same: parse the HTML resource for JS URIs, request those, and execute what you get, while still providing the same cookie environment.
For executing JS you could use Chakra, mORMot or BESEN. It's challenging at first, but the more you understand about HTTP (including cookies) and a JS engine, the more you'll see why "things work" in one situation and not in another. There's a reason an internet browser is a very complex piece of software and not just a downloader.
As per this, forcing IE11 Quirks mode might already cure your problem when using TWebBrowser:
TBrowserEmulationAdjuster.SetBrowserEmulationDWORD(TBrowserEmulationAdjuster.IE11_Quirks);
(Helpers like this write the FEATURE_BROWSER_EMULATION registry value so the embedded browser control uses a modern IE engine instead of the ancient default.)

safari loads css/js very slow on client certificate SSL website

I have a website that requires client certificates for access and runs with SSL required. It runs on IIS 8.5 on Windows Server 2012 R2.
All my CSS and JavaScript is minified into 4 separate files:
app.js -> Our own javascript
app.css -> Our own css
vendor.js -> External javascript libraries
vendor.css -> External css libraries
All of these files are minified and placed locally on the server.
The site works very well in Chrome or IE on a computer, but in Safari (I have only tried Safari 5 on a PC and the latest Safari on an iPhone 6/7) the page can get stuck in a "loading" state. This does not happen every time, and when it does, clearing Safari's cache and trying again often helps.
The website also uses local storage to save some user data, and a cookie that stores a token for authentication. Not sure if this is useful information; just throwing it out there.
It can connect to the web server, since we can see the EV+ certificate.
When debugging the phone on a Mac, or Safari on a PC, and looking at the network tab in the developer window, I can see that it sometimes takes a really long time for the browser to load some of the CSS and/or JavaScript files.
Sometimes it appears to be vendor.js, sometimes app.css, and sometimes one of the other files. I can see no pattern, such as it always being the same files.
The site is a .NET 4.6 site, running with AngularJS, SignalR 2.2.1 and HTML5 on the front end.
We have tried:
Monitoring IIS logs and network traffic
Removing source maps on CSS/JS to reduce file size
Referencing signalr/hubs (the generated JS file), and also copying the content into vendor.js so there is a local version instead
All without any success ATM. I would really appreciate help; I'm feeling stuck on this one.
Many Thanks!
It may be dynamic compression. Are you using Brotli compression on the server?
I suggest a detailed analysis of the HTTP request and response headers. There may be a discrepancy resulting in this unexpected behaviour. I would follow this up by scouring the Safari bug tracker.
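For example, a quick check you could run from the browser's console to see what the server actually negotiated for one of the bundles (the /vendor.js path is taken from the question; adjust as needed):

    // Log the negotiated encoding and caching headers for one bundle.
    fetch('/vendor.js').then(function (res) {
      console.log('status:', res.status);
      console.log('content-encoding:', res.headers.get('content-encoding'));
      console.log('cache-control:', res.headers.get('cache-control'));
    });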
The SSL certificate itself may be the issue, or rather Safari's interpretation of its policies.
Hope it helps.

Chrome blocking iframe requests as cross-origin request even when origins are the same

This one has me stumped.
I have a web app with a file upload/download area. Files are downloaded in the background via a temporary iframe element. This is a single-page AJAX application; the UI is written in JavaScript and jQuery and uses jQuery.FileDownloader.js to manage the iframe. The application runs over HTTPS, and the site and download URL are on the exact same domain. The back-end is a RESTful application. This has worked great for months. Until today.
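For context, the mechanism is essentially the usual hidden-iframe download pattern, roughly like this (a simplified sketch - the endpoint is hypothetical, and the real plugin adds callbacks and cleanup):

    // Create an invisible iframe and point it at the download URL on the same
    // domain; the browser saves the response as a file because the server
    // replies with a Content-Disposition: attachment header.
    var $frame = $('<iframe>', { css: { display: 'none' } }).appendTo('body');
    $frame.attr('src', '/api/files/123/download'); // hypothetical endpoint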
All of a sudden, when attempting to download a file in Chrome, the browser reports the error "Blocked a frame with origin https://example.com from accessing a cross-origin frame."
The problem is that the origin of the main site and that of the iframe are the exact same domain. I have ensured that the domains are the same, as well as the protocol. Chrome is the only browser that throws the cross-origin error; IE, Firefox, Opera, Safari... all work as expected. It's only Chrome, and only as of today. To make things worse, no updates were made to the browser; it truly is spontaneous. I've also ruled out plugins as the cause by running in Incognito mode, where none are allowed to run by my settings, and by disabling my anti-virus software. The problem shows up on other computers, in other locations (not on our LAN or subnet), all running Chrome.
And, again, the domains of the parent frame and the embedded iframe are identical. This only happens against the production server, which runs over HTTPS. Non-HTTPS sites (e.g. our dev environment, localhost) don't have the problem. Our SSL certificate is valid. Since this is a single-page AJAX application, we're trying to avoid popping up another window for the download.
Hopefully, someone can offer some advice. Thanks in advance.
Update: After additional research, I found that the solution to this problem is to enclose the filename in the response header in double quotes.
I have found the cause of the problem. It turns out that Google Chrome has problems with files that have commas in their filename. When downloading such a file via a direct link, Chrome reports that duplicate headers were sent by the server. This is a long-standing, unaddressed problem with Chrome; other browsers are not susceptible to it. Still, it's a fairly easy problem to troubleshoot and, indeed, when I searched on this error, the first search result had the solution: remove commas from filenames when handling a request from Google Chrome.
However, this wasn't a direct link, it was an AJAX request, which results in a different exception. In this case, the error Chrome gives is the cross-origin exception, and that is what made it so difficult to troubleshoot.
So, the tl;dr of it all is to strip commas out of the names of uploaded files.
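Per the update above, quoting the filename in the header also resolves it. A minimal sketch of that server-side fix (Node/Express is assumed here purely for illustration - the question's back-end stack isn't specified, and the route and paths are hypothetical):

    var express = require('express');
    var app = express();

    app.get('/download/:id', function (req, res) {
      // A name containing a comma: without the surrounding double quotes,
      // Chrome parses the comma as a header separator and reports
      // "duplicate headers received from server".
      var filename = 'sales, final.csv'; // hypothetical filename
      res.setHeader('Content-Disposition',
        'attachment; filename="' + filename + '"');
      res.sendFile('/data/' + req.params.id + '.csv'); // hypothetical path
    });

    app.listen(3000);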
Another instance where I found this issue occurring was after executing code similar to:
document.domain = '[the exact same domain that the iframe originally had]'
Removing this line of code got rid of the error for me. (Even assigning document.domain its current value has a side effect: it marks the document as having explicitly set its domain, and two frames only count as same-origin if both have done so, which is why the seemingly redundant assignment breaks the parent/iframe relationship.)

Facebook App is not working in Chrome but well in Firefox

I have just started making Facebook apps using Heroku. I made a test app and uploaded a page to Heroku that uses HTML5, CSS and JavaScript. The app does not show in Google Chrome at https://apps.facebook.com/shrytestapp/ but works well in Mozilla Firefox. The page also works well when opened directly on the Heroku server at http://salty-shelf-6707.herokuapp.com/.
When you access the app within Facebook, HTTPS is used to transfer the data, and Chrome has blocked the content delivered over plain HTTP as a result, insisting that everything be transferred securely, whereas Firefox isn't so fussy.
Here's what the Console is showing in Chrome
[blocked] The page at https://salty-shelf-6707.herokuapp.com/
ran insecure content from http://www.google.com/jsapi.
Uncaught ReferenceError: google is not defined
Google's JS API has been blocked, so the JavaScript fails to run.
(You also have some not-found errors, but those are unrelated.)
As you mentioned, the app works fine through http://salty-shelf-6707.herokuapp.com/, but not through https://salty-shelf-6707.herokuapp.com/.
Try using the following instead to load the API:
<script type="text/javascript" src="//www.google.com/jsapi"></script>
The // at the start of the src value makes the URL protocol-relative, or, to use the correct technical term, scheme-relative: the browser fetches it over whichever scheme the page itself was loaded with.
Paul Irish, the lead developer of HTML5 Boilerplate, has more information about this in a post on his site.

Get JS file via HTTPS from a HTTP page

Okay, so what are the ramifications of fetching a JS file via an HTTPS call while on an HTTP page?
I assume it would just add a little overhead. Would any browser show warnings about such a call?
Don't ask why. It's just hypothetical.
There shouldn't be any warnings in any browser. You can try it out at this URL: http://www.530geeks.com/mixed-content.html. I tested with IE6 and Firefox 3.5; they don't complain.
It's logical, too: the host page is being served over HTTP, so there is no implied trust in the connection, and therefore nothing to warn the user about.
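For example, a page served over plain HTTP can pull in a script over HTTPS without triggering anything (the CDN URL is just an illustration):

    // On an http:// page, requesting a script over https is not mixed content;
    // the upgrade only adds security, so the browser loads it silently.
    var s = document.createElement('script');
    s.src = 'https://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js';
    document.getElementsByTagName('head')[0].appendChild(s);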
This will probably trigger the "mix of secure and insecure content" alert in Internet Explorer.
Sometimes IE (and maybe other browsers) gets nervous and complains about pages having a mix of secure and insecure content. Whether what you're doing would cause that would require a simple test.
