I have set up an Apache server hosting a website.
On my website you can, for example, play Atari ROMs through an online JavaScript emulator. Whenever you load a ROM on the website, the JavaScript temporarily downloads it to the browser's cache.
If you browse directly to website.com/roms/atari.zip, you can download the ROM. I do not want this.
Is there a way to forbid direct access to this file while still allowing access from the JavaScript requests?
Many thanks in advance.
Requiring an Authorization header is one way to do this.
Creating an authentication user:
/path/to/htpasswd -c /etc/htpasswd/.htpasswd downloaduser
And you'd supply the password. Note that the command above will create a new file, overwriting a previous one.
You would configure it in httpd.conf like:
<Directory "/var/www/html/roms">
    AuthType Basic
    AuthName "Authentication Required"
    AuthUserFile "/etc/htpasswd/.htpasswd"
    Require valid-user
</Directory>
Then, with an XHR request in JavaScript:
req.setRequestHeader('Authorization', 'Basic ' + base64StringOfUserColonPassword);
base64StringOfUserColonPassword is what the name implies; you can get it with window.btoa(username + ":" + password), or with the base64 command-line tool.
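For example, a minimal sketch of the full request (the ROM path, user name, and how the page obtains the password are assumptions; since the credentials end up in the page's JavaScript, this only deters casual direct downloads rather than a determined user):

var user = 'downloaduser';
var pass = prompt('Password for ROM downloads'); // placeholder: however your page obtains it
var req = new XMLHttpRequest();
req.open('GET', '/roms/atari.zip', true);
req.responseType = 'arraybuffer';
// Base64-encode "user:password" for HTTP Basic auth
req.setRequestHeader('Authorization', 'Basic ' + window.btoa(user + ':' + pass));
req.onload = function () {
    if (req.status === 200) {
        // req.response now holds the ROM bytes for the emulator to load
    }
};
req.send();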
Further reading:
https://wiki.apache.org/httpd/PasswordBasicAuth
How to assign basic authentication header to XMLHTTPREQUEST?
Edit: there are XAMPP-specific instructions, for example here: http://chandanpatra.blogspot.com/2013/08/basic-authentication-with-htpasswd-in.html. The process for XAMPP is the same as I outlined above.
I have two services running locally in Docker containers. One is an nginx server with configuration to proxy requests to various other services, and the other is a simple React GraphiQL UI.
The nginx server is not explicitly set up to serve localhost, but when making requests with curl/Postman I can explicitly set the Host header to the actual domain (rather than localhost); nginx then finds the correct config and the request succeeds.
The issue is that I would like to call the server from a local instance of my UI, but that fails because I can't override the Host header. I've tried adding it manually to my React fetch request, but when I inspect the request in the browser the header isn't there. After some searching I found some Slack posts saying it's not possible, although without any references as to why.
return fetch(
    edgeUrl(environment) + "/some/endpoint",
    {
        method: "POST",
        headers: {
            'Authorization': 'Bearer ' + getApiKey(partner, environment),
            'host': 'actual.host.com',
            'origin': 'http://localhost/'
        },
        body: JSON.stringify({ query })
    }
)
Is there any other way to override the host used in requests? Possibly another HTTP library I could use? I'd prefer not to have to configure the nginx server for localhost, as it is owned by another team.
You should not try to change the Host header. The browser won't allow you to, and it's not the right way to do it.
As I see it, you have 2 options:
Configure nginx to accept requests to localhost, if that is where it actually runs (a minimal sketch follows below).
Change your hosts file so that your domain points to 127.0.0.1, which is the local equivalent of adding it to DNS.
On Windows the hosts file is located at C:\Windows\System32\drivers\etc\hosts.
Add the following line to your hosts file, after the commented lines (the ones starting with #):
127.0.0.1 actual.host.com
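If the other team is open to option 1, the nginx-side change can be as small as adding localhost to the relevant server block's server_name (a minimal sketch; the block contents and upstream address are assumptions, not their actual config):

server {
    listen 80;
    # adding localhost lets requests with Host: localhost match this block too
    server_name actual.host.com localhost;

    location / {
        proxy_pass http://127.0.0.1:3000;  # placeholder for their real upstream
    }
}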
For anyone interested, here's some information on possible attacks using the Host header and why it's useful to validate it, which is what this service is doing.
https://portswigger.net/web-security/host-header
https://infosecwriteups.com/identifying-escalating-http-host-header-injection-attacks-7586d0ff2c67
I'm going to ask the other team if I can add a localhost configuration to their nginx config so that I can make requests locally. It looks like my coworker was misinformed in suggesting I override the Host header.
I have a page with some D3 JavaScript on it. This page sits within an HTTPS website, but the certificate is self-signed.
When I load the page, my D3 visualisations do not show, and I get the error:
Mixed Content: The page at 'https://integration.jsite.com/data/vis' was loaded over HTTPS, but requested an insecure XMLHttpRequest endpoint 'http://integration.jsite.com/data/rdata.csv'. This request has been blocked; the content must be served over HTTPS.
I did some research, and all I found was that the JavaScript should make the call with the same protocol the page was loaded with. So if the page was loaded via HTTPS then rdata.csv should also be requested via HTTPS; instead it is requested over HTTP.
Is this because the certificate on the server is self-signed? What can I do to fix this, other than installing a real SSL certificate?
What can I do to fix this (other than installing a real SSL certificate)?
You can't.
On an HTTPS page you can only make AJAX requests to HTTPS URLs (with a certificate trusted by the browser; if you use a self-signed one, it will not work for your visitors).
Steps to Allow Insecure Content in Chrome
To allow insecure content on individual sites within Chrome, click on the lock icon in the URL bar, then click 'Site settings'.
There you will see a list of various permissions the page has. Choose 'Allow' next to 'Insecure content'.
Now your HTTPS site can access HTTP endpoints.
I had the same issue with my Angular project, and I made it work in Chrome by changing a setting. Go to Chrome Settings --> Site Settings --> Insecure content --> click the Add button under Allow, then add your domain name:
[*.]XXXX.biz
The problem should then be solved.
You will be able to solve the error by adding this code to your HTML file:
<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests" />
If the other solutions don't work, try this one.
I solved the problem by adding a slash at the end of the request URL.
This way: '/data/180/'
instead of: '/data/180'
In my case, I had the same warning.
I fixed it in the request URL: I had an extra '/'.
Before:
const url = `${URL}search/movie/?api_key=${API_KEY}&query=${movie}`;
After:
const url = `${URL}search/movie?api_key=${API_KEY}&query=${movie}`;
I had the same problem, but with IIS in Visual Studio. I went to Project Properties -> Web -> Project URL and changed http to https.
One solution here is a server-side endpoint that you access via HTTPS, which then makes the call to whichever HTTP URL you need and returns the result. In other words, make your own little HTTPS proxy to access the HTTP resource.
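For illustration, a minimal Node.js sketch of that idea (the certificate paths, proxy path, and upstream URL are placeholders, not a definitive implementation):

// relay an HTTP-only resource through your own HTTPS origin
const https = require('https');
const http = require('http');
const fs = require('fs');

const options = {
    key: fs.readFileSync('server.key'),
    cert: fs.readFileSync('server.crt')
};

https.createServer(options, (req, res) => {
    if (req.url === '/proxy/rdata.csv') {
        // fetch the insecure resource server-side and stream it back over HTTPS
        http.get('http://integration.jsite.com/data/rdata.csv', (upstream) => {
            res.writeHead(upstream.statusCode, { 'Content-Type': 'text/csv' });
            upstream.pipe(res);
        }).on('error', () => {
            res.writeHead(502);
            res.end('Upstream request failed');
        });
    } else {
        res.writeHead(404);
        res.end();
    }
}).listen(8443);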
If you are on Magento, set the SSL offloader header so HTTPS is detected correctly behind the proxy:
update core_config_data
set value = 'X-Forwarded-Proto'
where path = 'web/secure/offloader_header';
This is easy.
If you use .htaccess, check for http: and change it to https:.
If you use CodeIgniter, check the config: base_url -> change your URL from http to https.
That solved my problem.
I've started to write an HTML file which displays data with JavaScript. Since it should be as simple as possible, I don't want to run Node.js or any other local HTTP server. I've just opened the HTML file in a browser (the URL is file:///home/visu/index.htm).
Everything is fine until index.htm makes a jQuery AJAX request to an online API. The browser blocks the request with the message:
Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at http://x.x.x.x. (Reason: CORS header ‘Access-Control-Allow-Origin’ missing).
How can I get rid of this problem without starting a local HTTP server?
A possible solution is to start the browser with some "no security" flags or to disable CORS with plugins, but I'd have to do that manually every time, so I don't like it.
When your browser performs an AJAX request to a different server than the one hosting the current page, it may first send an OPTIONS HTTP message (a preflight). In that message it sends the following header:
origin: http://my-web-server.com
And the backend server will respond with:
access-control-allow-origin: http://my-web-server.com
But when you don't have a web server, there is no address for your browser to put in that origin header. That's why your browser disallows any AJAX request from a local file (you may be able to disable the browser's CORS security as someone mentioned in the comments, but that can put you at risk from malicious sites).
Another option
You can tell your browser to allow connections from localhost to a backend if you change the backend to return the following header:
access-control-allow-origin: https://localhost:8888
You also need to tell your localhost server to serve your page over HTTPS instead of HTTP. Once both conditions are met, the CORS validations won't fail.
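For illustration, a minimal sketch of a backend returning that header (Node.js; the port, allowed origin, and response body are assumptions):

const http = require('http');

http.createServer((req, res) => {
    // allow the page served from https://localhost:8888 to call this backend
    res.setHeader('Access-Control-Allow-Origin', 'https://localhost:8888');
    if (req.method === 'OPTIONS') {
        // answer the preflight with the allowed methods and headers
        res.setHeader('Access-Control-Allow-Methods', 'GET, POST, OPTIONS');
        res.setHeader('Access-Control-Allow-Headers', 'Content-Type');
        res.writeHead(204);
        res.end();
        return;
    }
    res.writeHead(200, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify({ ok: true }));
}).listen(3000);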
Notice that to enable HTTPS you'll need an SSL cert and key; you can generate them with the following command:
openssl req -x509 -out localhost.crt -keyout localhost.key \
-newkey rsa:2048 -nodes -sha256 \
-subj '/CN=localhost' -extensions EXT -config <( \
printf "[dn]\nCN=localhost\n[req]\ndistinguished_name = dn\n[EXT]\nsubjectAltName=DNS:localhost\nkeyUsage=digitalSignature\nextendedKeyUsage=serverAuth")
The source of that command and more information can be found on this page from Let's Encrypt.
On Firefox, you can install this addon: https://addons.mozilla.org/en-US/firefox/addon/cors-everywhere/ to disable CORS for the respective tab. Then, any request will also work on file:/// URIs. Be careful though!
Either mock the Ajax calls, or start a web server with a reverse proxy and HTTP rewriting configured, since I'm sure you don't want to, or don't have access to, configure the API server's CORS headers.
If you don't want to mock the Ajax calls, then use either:
node-http-proxy (a sketch follows below)
nginx - if you don't have Node.js and you don't want to install it.
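A hedged sketch of the node-http-proxy approach (the target API address, local port, and the injected wildcard CORS header are assumptions, not a definitive setup):

// forward local requests to the API and add the CORS header it doesn't send
var httpProxy = require('http-proxy');

var proxy = httpProxy.createProxyServer({
    target: 'http://x.x.x.x',  // the online API the page needs to reach
    changeOrigin: true         // rewrite the Host header to match the target
});

// inject the header before the proxied response is written back
proxy.on('proxyRes', function (proxyRes) {
    proxyRes.headers['Access-Control-Allow-Origin'] = '*';
});

proxy.listen(8000);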
Not Possible By Design
CORS requests are always blocked when attempted from a file on disk (web pages using the file:// protocol). There is nothing you can do to make it work from a file. It is simply impossible.
The reasoning for this is that files on disk have no real "origin" to allow the backend server to determine the validity of the request. You can have an issue-tracking HTML file on the same disk as a blog HTML file. The server cannot know which HTML file requested the data (you could even have someone else's file, shared via Dropbox, with embedded JavaScript that attempts to access your server's data when you open it - nobody expects a hacking attempt when they simply open a plain HTML file!).
This is why no browser vendor will allow you to make CORS requests from a file.
You Need a Server
To make it work you will need an HTTP server. There are lots of options for this, from installing Apache/nginx on your machine to running dev servers like webpack-dev-server or local-web-server. As long as the protocol is http:// or https://, you are allowed to make CORS requests.
Once you have a server serving your HTML file you can configure CORS on your backend as usual.
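For example, any simple static server gives the page a real http:// origin (this assumes Python 3 is installed; the port is arbitrary):

cd /home/visu
python3 -m http.server 8000
# then open http://localhost:8000/index.htm instead of file:///home/visu/index.htm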
If you cannot set up access-control-allow-origin on the server, you can try this.
Use a "callback" function if your data is not on the same domain.
The server wraps the data as jsonCallback( ... data ... ), as in my example: http://www.ceducation.cz/akce-plnytext.json?short=1&callback=?
function jsonCallback(json) {
    $.each(json, function(key, val) {
        // your data is here
        console.log('date: ' + val.date);
        console.log('address: ' + val.address);
        console.log('---');
    });
}

$(document).ready(function() {
    $.getJSON("http://www.ceducation.cz/akce-plnytext.json?short=1&callback=?", jsonCallback);
});
Working example
I am trying to upload to S3 from my Meteor app such that the data is encrypted at rest. I'm using this package, but I modified it because it doesn't yet support specifying SSE (I created this issue for it).
In my forked version of the code, I added these lines to the uploadFile function near the bottom of this file:
if ops.server_side_encryption
    form_data.append "x-amz-server-side-encryption", "AES256"
and a way to set ops.server_side_encryption to true.
This is all very simple, and I successfully add "x-amz-server-side-encryption": "AES256" to the form_data that gets posted. The problem is that adding this parameter causes a 403 Forbidden response from S3.
The AWS docs don't say the bucket needs anything special to allow this new parameter, and thus SSE. They talk about enforcing that client requests specify encryption, and I also tried adding that policy, to no avail (though I wouldn't expect that to work, because the docs don't say you need a special policy to allow this parameter).
Am I missing some configuration that needs to be in place to allow the SSE parameter in client upload requests?
The answer (which I found thanks to Mark B's comment) is this:
I was using the form POST method of uploading a file to S3. I forgot that the policy document in the POST can contain certain restrictions. The policy document being sent needed to allow encryption, with {"x-amz-server-side-encryption": "AES256"} as one of its conditions.
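For illustration, a hedged sketch of what such a POST policy document can look like (the bucket name, expiration, and key prefix are placeholders):

{
    "expiration": "2025-01-01T00:00:00Z",
    "conditions": [
        {"bucket": "my-upload-bucket"},
        ["starts-with", "$key", "uploads/"],
        {"x-amz-server-side-encryption": "AES256"}
    ]
}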
I want to extract the source code of a webpage which is hosted on another website, but the problem is that I get an empty response. I tried to pull the source of multiple websites, but the problem seems to be in my code:
$(document).ready(function() {
    $.get('http://www.xxxx.com', function(xdata) {
        alert("content: " + xdata);
    });
});
Is there any mistake?
Note: when I try to get the source of a local page, it works, but I don't know why it doesn't for an external one.
Thanks
This isn't allowed, according to the same-origin policy.
The only way to approach this is to use some server-side pull of the data, which you would then process with your AJAX requests; this is known as a cross-domain proxy.
You can't use AJAX across domains.
You cannot load content from another domain because of the same-origin policy.
Please look into JSONP.
The Access-Control-Allow-Origin: * header needs to be set on the external site to allow cross-domain access.
Because of the SOP (same-origin policy), you can't use URLs from other domains. Try accessing a page from your local server, and don't use http.
If you're not interested in building your own proxy, there's a very easy-to-use public proxy (hosted on AppEngine) for this, with a JavaScript library. CurlJS: http://curljs.azoffdesign.com/
Your example could be done like this (after including the library):
curl("http://www.xxxx.com", function (status, xdata) {
alert("content:" + xdata);
});
Hope that helps!
I have a server using a virtual domain and created an Apache proxy.
Super fast, works, no quirks.
Copy this, fix the paths (mod_proxy location, domains, etc.), add it to your .conf file, and restart the server:
LoadModule proxy_module /usr/local/zend/apache2/modules/mod_proxy.so
LoadModule proxy_http_module /usr/local/zend/apache2/modules/mod_proxy_http.so
ProxyRequests Off
ProxyPreserveHost On
ProxyPass /datadomain http://datadomain.com/webservices
ProxyPassReverse /datadomain http://datadomain.com/webservices
Now http://datadomain.com/webservices/data.php = http://yourdomain.com/datadomain/data.php
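With that in place, the page can call the proxied path on its own origin, so the same-origin policy no longer gets in the way (a minimal sketch; data.php is just the example endpoint from the mapping above):

$.get('/datadomain/data.php', function (data) {
    // same-origin request, so no CORS restrictions apply
    console.log(data);
});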
Enjoy!!!