Image Resource Loaded via JS Gets Automatically Upgraded to HTTPS

When loading an image from a secure server via JS with the following code:
var preloadImage = new Image();
preloadImage.src = 'http://some/resource.png';
The request gets automatically upgraded to HTTPS. Presumably it's a well-intentioned feature to stop mixed content, but the server I'm pointing to can only do HTTP. I've been browsing the methods and properties available on Image to no avail. Ideally, yes, the image would be served over HTTPS, but it's just a temp server spun up on AWS, so if we can avoid that for now it'd be much easier.
Does anyone have a workaround to stop JS automatically upgrading the request?

It is not triggered by the JS code; it's up to the server to check whether the client supports HTTPS and force the upgrade.
My suggestion would be to put an intermediate server/proxy in between: it makes the request to the remote server over plain HTTP and serves the result to the page over HTTPS.
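Something like this, for example (just a sketch using Node's built-in http module; the host name, path prefix and port are placeholders, and in practice this would sit behind whatever terminates TLS for your HTTPS origin):

const http = require('http');

// The page requests /proxy/<path> from its own HTTPS origin; this process
// fetches the same path from the HTTP-only host and streams it back.
http.createServer((req, res) => {
  const upstreamPath = req.url.replace(/^\/proxy/, '') || '/';
  http.get({ host: 'http-only-image-host.example', path: upstreamPath }, (upstream) => {
    res.writeHead(upstream.statusCode, upstream.headers);
    upstream.pipe(res);
  }).on('error', () => {
    res.writeHead(502);
    res.end();
  });
}).listen(8080);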

It turns out a recent update to most browsers has stopped mixed content altogether. It used to just throw a warning; now it's simply not possible. We got around it with some nginx reverse-proxy magic where we passed the IP in the URL, like https://normal-domain/preview/IPADDRESS, and it's working now.


Cross-origin request for local file

I need to open a local HTML file in the browser. The JavaScript works fine, but AJAX stops working and XMLHttpRequest gives a cross-origin error. Is there a way to run AJAX from a local directory? It is necessary for me that it runs from a local file only.
Thanks
For anyone who wants to know, I had to run a server in the app to serve the js files. Seems like it's not possible to do it without running a server.
If anyone knows of another way, do tell.
The simplest way to allow this in Firefox is to navigate to about:config, look for the privacy.file_unique_origin setting and toggle it to false.
Essentially, Firefox used to treat local files from the same directory as being from the same source, thus CORS was happily satisfied. That behavior changed after CVE-2019-11730 was discovered.
It worked fine for me on 84.0.1 on Arch. Just be sure to toggle it back when you're not debugging locally.
If you are using VS Code, the Live Server extension might help you. It resolved a cross-origin issue I was having when editing a webpage.
https://marketplace.visualstudio.com/items?itemName=ritwickdey.LiveServer
If you are using Chrome, try this extension.
CORS allows web applications on one domain to make cross-domain AJAX requests to another domain. It's dead simple to enable, only requiring a single response header to be sent by the server.
What this extension does is add the response header Access-Control-Allow-Origin: * to every response.
You can also do that manually by sending the response header from your own server.
For simple CORS requests, the server only needs to add the following header to its response: Access-Control-Allow-Origin: *
Read this for more info.
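For example, a minimal sketch with a plain Node.js server (an assumption here; any server-side stack can set the same header):

const http = require('http');

http.createServer((req, res) => {
  // Allow any origin for simple CORS requests; tighten this before going live.
  res.setHeader('Access-Control-Allow-Origin', '*');
  res.setHeader('Content-Type', 'application/json');
  res.end(JSON.stringify({ ok: true }));
}).listen(8000);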
If you are able to change the server code, you can try adding the string "null" to the allowed origins. Chrome sends the origin as "null" if the page is running from a local file.
Here's an example in ASP.NET Core for setting up the "null" origin:
services.AddCors(options =>
{
    options.AddPolicy("InsecurePolicy",
        builder => builder
            .WithOrigins("null")
            .AllowAnyMethod()
            .AllowAnyHeader()
            .AllowCredentials());
});
Note that this is not recommended, but might be good enough if you just need it during development.

"Mixed content" Error when accessing HTTP within HTTPS page

Our intranet website has to communicate with a client .NET app. We're using an HttpListener (on http://localhost:[port]) in the client app and an iframe that refers to this URL in the page. It's working like a charm when the page is served over HTTP.
Problem:
When the site is HTTPS, a 'Mixed content' JavaScript error is displayed in newer browsers and the request doesn't arrive at the client.
I believe this error would also occur when using an Ajax request instead of an iframe.
I also tried binding a self-signed certificate to the listener and listening on https://localhost:[port] (which works for IE), but since Firefox has its own certificate store it's really tough to install the certificate there automatically (IE uses the Windows certificate store, which is easy to install into).
So, does anyone know any possibility to make a request to http://localhost:[port] when the site itself is HTTPS that works for both FF and IE?
Thanks!
Change the iframe to:
<script>
var request = new XMLHttpRequest();
request.open("GET", "http://localhost:[port]/?action=doStuff");
request.send();
</script>
You will also need to make some minor modifications to your app.
It needs to implement an OPTIONS method and it needs to return CORS (cross-origin resource sharing) headers. This sounds a lot harder than it is; it just needs to return a reply with the Access-Control-Allow-Origin header set to *.
The response of the GET request must also have this header.
If you know all the domains that try to communicate with your app on localhost you can change the * to a whitelist or even just a single value.
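To illustrate the shape of that exchange (the app in the question is a .NET HttpListener; this sketch uses Node.js purely to show which responses and headers are involved):

const http = require('http');

http.createServer((req, res) => {
  // Both the preflight (OPTIONS) and the actual GET need the CORS header.
  res.setHeader('Access-Control-Allow-Origin', '*'); // or a whitelisted origin
  if (req.method === 'OPTIONS') {
    // Preflight: say what the real request may do, then reply with no body.
    res.setHeader('Access-Control-Allow-Methods', 'GET, OPTIONS');
    res.setHeader('Access-Control-Allow-Headers', 'Content-Type');
    res.writeHead(204);
    res.end();
    return;
  }
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('doStuff done'); // placeholder response for ?action=doStuff
}).listen(8080); // the [port] the page points at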

How to test GET/POST or better way?

I am writing an app that will send POST/GET requests from a remote client to a server. As a way to test and also educate myself, I was trying to make these requests by writing a simple HTML file that resides on my desktop, with a form that would POST to a server-side PHP file. I also tried AJAX-style requests (XMLHttpRequest), both independent of and connected to a form, but I received an error in the console:
Cross Origin Request Blocked
This happened for all attempts, form or no form. I have read that this is because I am making the request from file:// and this is not allowed by browsers unless CORS is enabled in some way. I have also read that using a web server to host the file might fix the problem, but I don't understand what is meant by web server (separate or same domain? LAMP, XAMPP, MAMP?), and I am not interested in overriding security (allowing CORS with headers) if I am eventually going to have to find a secure way when it goes live anyway.
As I said I am really just trying to test and I would like my html files to communicate with the server-side php from another machine instead of just putting all files together in the same domain/folder. Is there a way to do this using html/javascript or Websockets or anything html5 has to offer that might be useful?
code examples are welcome but if anyone could help me grasp this concept better it would be greatly appreciated. I am a noob XD
I am open to a better approach entirely if one exists, the only constraint I have is that everything on the UI/Client end is going to be written in html/javascript but I can utilize either or both intel XDK api and Cordova api as well.
Please help and thank you.
If you need to send some HTTP requests to test the server-side of your app I would strongly recommend you use an HTTP client like Fiddler:
http://www.telerik.com/fiddler
Also, read this:
GUI HTTP client
A desktop-based client will have a nice GUI with plenty of features to tweak, save, send, resend your requests.
One thing you can do is use Python's SimpleHTTPServer to serve the HTML file. Then when you go to your browser and go to 127.0.0.1:8000 the origin will be the same.
You can run the server by going into the directory that has the html file and running the command python -m SimpleHTTPServer 8000. This will serve the content of that directory on port 8000 and it should allow the requests to be made without a CORS exception.
Here's the documentation: https://docs.python.org/2/library/simplehttpserver.html
I recommend that you disable the same-origin policy in your browser in order to test cross-domain AJAX requests from a local file.
For example, with Google Chrome on Windows you can disable this by launching Chrome with the following command:
C:\Users\YOUR_USER\AppData\Local\Google\Chrome\Application\chrome.exe --allow-file-access-from-files --disable-web-security
Together, both of these flags will allow you to test cross-domain ajax requests from a local file. These flags are relevant across Mac, Windows and Linux.
This is not about how to write the requests, but you can skip writing them if you are using the latest version of the Intel XDK. If you go to the Services tab in the latest version of the Intel XDK, there is a service by the name of Sandbox Explorer. It has a GET and POST method UI. Just plug in the URL and you will see the response immediately. You can use this to debug the server that you are writing. Once you have the server returning the right response, create a data binding to use the GET API in your client-side JavaScript or HTML code. Cross-origin is taken care of.

Client-side socket.io without a node.js server

To use socket.io on the client side, usually we start a node.js server and go like this:
<script src="/socket.io/socket.io.js"></script>
or with specific port:
<script src="http://localhost:3700/socket.io/socket.io.js"></script>
Question is:
Is it necessary to use a node.js server to serve socket.io.js?
...or is it possible to
make a local copy of socket.io.js instead of going to the server every single time we need socket.io?
Like, we go to view source, copy everything we get from the source of the script tag,
paste it and save it as socket.io-local.js, so that next time we use:
<script src="socket.io-local.js"></script>
Will that work?
Updates
Thanks for everyone's great response,
I'm asking this because, in the case I'm involved in, I don't actually have access to the server:
I am writing the client side to connect to another developer's socket server, which is written in Java.
Therefore I'll have to think of a way to work around the fact that I don't have a server there for me.
From what I've been testing,
this way seems to work, but I really don't know what's happening behind the scenes.
You obviously can host the socket.io client library anywhere and pull it in to a page. However, it will almost certainly not work with your Java-based server.
To understand why, you need to understand what socket.io is really doing behind the scenes; the client library is only a small part of it.
Socket.io actually defines and implements its own protocol for realtime communication between a browser and a server. It does so in a way that supports multiple transports: if—for example—a user's browser or proxy doesn't support WebSockets, it can fall back to long polling.
What the socket.io client actually does is:
Makes an XHR GET request for /socket.io/1. The server responds with a session ID, configured timeouts, and supported transports.
The client chooses the best transport that the user browser supports. In modern browsers, it will use WebSockets.
If WebSockets are supported, it creates a new WebSocket to initiate a WebSocket connection (HTTP GET with Upgrade: websocket header) to a special URL – /socket.io/1/websocket/<session id>.
If WebSockets aren't supported by the browser or fail to connect (there are lots of intermediaries in the wild like proxies, filters, network security devices, and so forth that don't support WebSocket requests), the library falls back to XHR long polling and makes an XHR request to /socket.io/1/xhr-polling/<session id>. The server does not respond to the request until a new message is available or a timeout is reached, at which point the client repeats the XHR request.
Socket.io's server component handles the other end of that mess. It handles all the URLs under /socket.io/, setting up sessions, parsing WebSocket upgrades, actually sending messages, and a bunch of other bookkeeping.
Without all of the services provided by the socket.io server, the client library is pretty useless. It will just make an XHR request to a URL that doesn't exist on your server.
My guess is that your Java-based server just implements the WebSockets protocol. You can connect directly to it using the browser-provided WebSocket APIs.
It is possible that your server does implement the socket.io protocol – there are a few abandoned Java projects to do that – but that's unlikely. Talk with the developer of your server to find out exactly how he's implemented a "socket server."
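If it really is a plain WebSocket server, the browser side needs no library at all; a minimal sketch (the URL and message format here are assumptions, so check them with the server's developer):

// Hypothetical endpoint; the Java server defines the real URL and message format.
var socket = new WebSocket('ws://localhost:8080/endpoint');

socket.onopen = function () {
  socket.send(JSON.stringify({ type: 'hello' }));
};

socket.onmessage = function (event) {
  console.log('message from server:', event.data);
};

socket.onclose = function () {
  console.log('connection closed');
};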
A standalone build of socket.io-client is exposed automatically by the socket.io server as /socket.io/socket.io.js. Alternatively you can serve the file socket.io-client.js found at the root of this repository.
https://github.com/LearnBoost/socket.io-client
I have a module called shotgun-client that actually wraps socket.io. I needed to serve a custom client script as well as the socket.io client script, but I didn't want every user of my module to have to include multiple script references on their pages.
I found that, when installed, you can serve the generated client script from socket.io by reading the file /node_modules/socket.io/node_modules/socket.io-client/dist/socket.io.js. So my module adds a listener for its own URL, and when it serves my custom client script it also serves the socket.io client script with it. Voila! Only a single script reference for the users of my module :)
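A rough sketch of that trick (assuming a plain Node http server; the dist path is the one above, which can move between socket.io versions, and the custom client path is made up):

const fs = require('fs');
const http = require('http');

http.createServer((req, res) => {
  if (req.url === '/shotgun/client.js') {
    // Serve socket.io's generated client and the module's own client script
    // as a single response, so pages need only one script reference.
    const socketIoClient = fs.readFileSync(
      './node_modules/socket.io/node_modules/socket.io-client/dist/socket.io.js');
    const customClient = fs.readFileSync('./client.js'); // hypothetical custom script
    res.writeHead(200, { 'Content-Type': 'application/javascript' });
    res.end(socketIoClient + '\n' + customClient);
    return;
  }
  res.writeHead(404);
  res.end();
}).listen(3000);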
While this is technically possible, I don't see why you'd need to do that. If you're concerned about reducing the data that goes over the wire, this change won't actually do much beyond the few characters saved in the shorter src tag. Simply changing the location of the JS file on the server won't actually improve performance - the JS has to be sent.
Proper caching (which Socket.IO has) will return a 304 Not Modified (and not re-send the JS file every time you load a page).

Is there a way to bypass Javascript / jQuery's same origin policy for local access?

Trying to use ajax, getJSON, and functions like that to fetch an external URL from a local (non-server) development computer. Is there a way to bypass the same origin policy, so that I can test locally, instead of having to upload to a server?
Here's the simple answer: chrome --disable-web-security
From the source code (chrome_switches.h):
// Don't enforce the same-origin policy. (Used by people testing their sites.)
const char kDisableWebSecurity[] = "disable-web-security";
I wanted to use jquery.js to send AJAX calls to a Google Apps python server running on port 8080. Just for testing, I wanted to run the browser and the server on the same machine.
I don't understand all the security nuances, but for temporary development it seems like a reasonable workaround. So long as I only use chrome for testing with this flag, it shouldn't be a problem.
Here's the whole command for Mac OS X:
/Applications/Google\ Chrome.app/Contents/MacOS/Google\ Chrome --disable-web-security
We had the same need when developing our web app. Here's how we did it:
The browser and the server communicate only through JSON.
All the HTML is rendered in the browser using PURE (our JS template engine).
The browser code is developed locally like this:
We add a host parameter in the url of the app:
http://localhost/app.html?host=test.beebole-apps.com
In production, the JSON is sent to the server with a POST.
But here the function in charge of the AJAX call reacts to the host parameter and makes a JSONP injection (GET) instead.
<script src="http://test.beebole-apps.com/?callback=f2309892&json={...}" />
f2309892 is a temporary function, with a random name, that points to the method that will handle the response
json is the JSON we send to the server
It means you will need some cooperation from the backend to serve you the json wrapped in a callback function like:
f2309892( /*the json here*/ );
Apart from a size limitation (you can't send a big JSON payload to the server with a GET), it works like a breeze.
Another advantage is that you can call all the different systems (development and test) from the same localhost.
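For reference, the client side of that JSONP fallback can be sketched roughly like this (the helper and its names are made up for the example):

// Hypothetical helper: sends `data` as a GET and routes the wrapped response
// to a temporary, randomly named global callback, as described above.
function jsonpRequest(host, data, handleResponse) {
  var callbackName = 'f' + Math.floor(Math.random() * 1e8);
  var script = document.createElement('script');
  window[callbackName] = function (response) {
    delete window[callbackName];
    script.parentNode.removeChild(script);
    handleResponse(response);
  };
  script.src = host + '/?callback=' + callbackName +
    '&json=' + encodeURIComponent(JSON.stringify(data));
  document.head.appendChild(script);
}

// Usage: the server must reply with callbackName({...}) as shown above.
jsonpRequest('http://test.beebole-apps.com', { action: 'load' }, function (response) {
  console.log(response);
});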
There are different ways to get around this, depending on which browser you're using for development. For example:
In Firefox (Gecko), set security.fileuri.strict_origin_policy to false
In Chrome, start the browser with the option --allow-file-access-from-files
References: Firefox, Chrome
Without touching the server -
The quickest and easiest way to bypass the same-origin security policy in Firefox is to install the Force CORS add-on. This works with any service by inserting the proper headers into every response.
https://addons.mozilla.org/en-US/firefox/addon/forcecors/
Since this is a development issue and not an end-user/functionality issue, rather than focusing on getting AJAX to cross domains, set your development environment up as a proxy that fetches the most recent data from the production servers. This is actually really easy to do.
You'd need to set up a web server in your dev environment (if it doesn't have one already), and then configure the server to respond to 404 requests by fetching and then echoing production data. You can set up your server so that only the AJAX data files are picked up (otherwise, it will be confusing to debug other files if production assets start showing up on your development pages). So if http://dev.myserver.com/data/json/mydata.json is missing, your 404 script will get http://prod.myserver.com/data/json/mydata.json and echo it to the client.
The nice thing about this set-up is that you can use mock data very easily: if the file is there in your dev environment, your AJAX script will get that; but if you then erase or rename that file, you'll get the production data instead. This feature has been so useful I can't recommend it enough.
If you're working with XML, I'd recommend duplicating the HTTP headers in the 404. If your 404 process responds with a Content-Type of text/html, you won't get any responseXML to parse.
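The answer doesn't name a server stack, so here is just a sketch of that 404 fallback with a small Node.js dev server (host names are the example ones from above):

const fs = require('fs');
const http = require('http');

http.createServer((req, res) => {
  const localPath = '.' + req.url; // e.g. ./data/json/mydata.json
  if (fs.existsSync(localPath) && fs.statSync(localPath).isFile()) {
    // A local (possibly mock) file wins when it exists.
    res.writeHead(200);
    fs.createReadStream(localPath).pipe(res);
    return;
  }
  // "404" case: fetch the same path from production and echo it back,
  // copying the headers so XML responses keep their Content-Type.
  http.get('http://prod.myserver.com' + req.url, (upstream) => {
    res.writeHead(upstream.statusCode, upstream.headers);
    upstream.pipe(res);
  }).on('error', () => {
    res.writeHead(502);
    res.end();
  });
}).listen(8080);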
Try these (PHP cURL AJAX cross-domain proxies, found via Google):
http://www.iacons.net/writing/2007/08/02/ajax-cross-domain-proxy/
http://www.phpfour.com/blog/2008/03/cross-domain-ajax-using-php/
http://jquery-howto.blogspot.com/2009/04/cross-domain-ajax-querying-with-jquery.html
I had that problem too, using Chrome, and the --allow-file-access-from-files option didn't really help. Going back to the script my server needed to return, I added these headers to the response and it worked fine:
'Access-Control-Allow-Origin: http://localhost/'
and another one for allowing a sort of key exchange
'Access-Control-Allow-Headers: X-KEY'
localhost is not allowed in CORS (see http://code.google.com/p/chromium/issues/detail?id=67743); use lvh.me instead.
