Satellizer not passing Authorization header to my API - javascript

So I am currently working locally. I have an API (Laravel) and everything is working great: I can log in using Facebook, I get a JWT from my API, and it is saved in local storage. However, after logging in, my API calls do not contain the 'Authorization: Bearer <token>' header.
From what I understand in the docs, this should all be set up and ready to go without any config on the app side of things?
Here is my code:
app.js
$authProvider.tokenPrefix = '';

// Facebook
$authProvider.facebook({
  clientId: '219883618025157',
  url: APICONFIG.url + APICONFIG.version + 'auth/facebook/callback'
});
Example API Call:
$http.get(APICONFIG.url + APICONFIG.version + 'auth/logout').then(function(response) {}, function(error) {});
The request headers in the above request:
GET /v1/auth/logout HTTP/1.1
Host: api.myapp.app
Connection: keep-alive
Pragma: no-cache
Cache-Control: no-cache
Accept: application/json, text/plain, */*
Origin: http://myapp.app
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/48.0.2564.109 Safari/537.36
Referer: http://myapp.app/
Accept-Encoding: gzip, deflate, sdch
Accept-Language: en-US,en;q=0.8
Any ideas what is going on here?

OK, so this was my silly mistake: I was logging the user out of my AngularJS application (removing the token) and then POSTing to my API, which of course wouldn't contain the token, as it was being unset before my call.
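In other words, the call order matters. A minimal runnable sketch of the corrected flow (the names here are illustrative, not Satellizer's actual API): perform the API logout while the token is still in storage, and unset it only after the request has gone out with the Authorization header.

```javascript
// Sketch with illustrative names: the token must still exist when the
// logout request is built, or no Authorization header can be attached.
const storage = { token: "my-jwt" };

function apiLogout(http) {
  const header = storage.token ? "Bearer " + storage.token : null;
  return http({ url: "/v1/auth/logout", authorization: header }).then(() => {
    storage.token = null; // clear the token *after* the call, not before
  });
}

// Fake transport that records the Authorization header it saw.
const seen = [];
const fakeHttp = (req) => {
  seen.push(req.authorization);
  return Promise.resolve();
};

apiLogout(fakeHttp).then(() => console.log(seen[0])); // "Bearer my-jwt"
```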


CORS errors in react application

I'm using the webdav library inside a React application. As of now I'm at an early stage of development and I'm just logging the results of a call that should list the contents of a directory. To do that, I created a method that instantiates the client, lists the directories, and logs them to the console.
import { createClient, AuthType } from "webdav";
// (Router, Switch, Route come from react-router-dom; Header and ROUTES are local modules.)

const getItems = async () => {
  const client = createClient("https://mywebdavurl.com/", {
    authType: AuthType.Password,
    username: "user",
    password: "passwd",
    headers: {
      "Access-Control-Allow-Origin": "*",
      "Access-Control-Allow-Methods": "DELETE, POST, GET, OPTIONS",
      "Access-Control-Allow-Headers":
        "Content-Type, Access-Control-Allow-Headers, Authorization, X-Requested-With"
    }
  });
  console.log(client);
  const directoryItems = await client.getDirectoryContents("/");
  console.log(directoryItems);
};

function App() {
  getItems();
  return (
    <Router>
      <Header />
      <Switch>
        <Route exact path={ROUTES.HOME}>
          Hello Sputnik
        </Route>
      </Switch>
    </Router>
  );
}
When I open the app on the browser (localhost) using Chrome, I receive some strange errors that I can't fix.
Network tab in the inspector
First of all, I can't fix the CORS problem. I tried adding the headers, but the result doesn't change and I receive this error in my console:
Access to XMLHttpRequest at 'https://mywebdavurl.com/' from origin 'http://localhost:3001' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource.
Just for debugging purposes, I tried a Chrome extension that disables CORS. If I use it, the error is quite different and says:
Access to XMLHttpRequest at 'https://mywebdavurl.com/' from origin 'http://localhost:3001' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: It does not have HTTP ok status.
Well, if you look at the image, you can see that I have both a CORS problem and an Unauthorized problem. If I check my WebDAV server logs they state this:
2021/10/21 13:44:03 [error] 12#0: *106 no user/password was provided for basic authentication, client: 100.70.4.1, server: , request: "OPTIONS / HTTP/1.1", host: "https://mywebdavurl.com/", referrer: "http://localhost:3001/ "
2021/10/21 13:44:14 [error] 12#0: *106 no user/password was provided for basic authentication, client: 100.70.4.1, server: , request: "OPTIONS / HTTP/1.1", host: "https://mywebdavurl.com/", referrer: "http://localhost:3001/ "
2021/10/21 13:44:14 [error] 12#0: *105 no user/password was provided for basic authentication, client: 100.64.6.1, server: , request: "OPTIONS / HTTP/1.1", host: "https://mywebdavurl.com/", referrer: "http://localhost:3001/ "
2021/10/21 13:49:20 [error] 12#0: *109 no user/password was provided for basic authentication, client: 100.70.6.1, server: , request: "OPTIONS /results/ HTTP/1.1", host: "https://mywebdavurl.com/", referrer: "http://localhost:3001/ "
2021/10/21 13:49:20 [error] 12#0: *110 no user/password was provided for basic authentication, client: 100.64.8.1, server: , request: "OPTIONS /results/ HTTP/1.1", host: "https://mywebdavurl.com/", referrer: "http://localhost:3001/ "
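These log lines point at the root cause: browsers send the preflight OPTIONS request without credentials, so a server that enforces basic auth on every method answers 401 and the preflight fails. A server-side sketch of a fix, assuming nginx with auth_basic in front of the WebDAV location (directive placement and the allowed origins/methods would need adjusting for the real config):

```nginx
# Sketch: answer preflight OPTIONS requests before basic auth is enforced,
# since the browser never attaches credentials to a preflight.
location / {
    if ($request_method = OPTIONS) {
        add_header Access-Control-Allow-Origin  "http://localhost:3001";
        add_header Access-Control-Allow-Methods "GET, POST, OPTIONS, PROPFIND, DELETE";
        add_header Access-Control-Allow-Headers "Authorization, Content-Type, Depth";
        return 204;
    }
    auth_basic "Restricted";
    auth_basic_user_file /etc/nginx/.htpasswd;
    # ... WebDAV config ...
}
```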
If I check the contents of the request and responses from the network tab in the inspector, I get the following:
Requests with type XHR and status CORS error
Referrer Policy: strict-origin-when-cross-origin
Provisional headers are shown
Accept: text/plain
Access-Control-Allow-Headers: Content-Type, Access-Control-Allow-Headers, Authorization, X-Requested-With
Access-Control-Allow-Methods: DELETE, POST, GET, OPTIONS
Access-Control-Allow-Origin: *
Authorization: Basic dXNlcjpwYXNzd2Q=
Depth: 1
Referer: http://localhost:3001/
sec-ch-ua: "Google Chrome";v="95", "Chromium";v="95", ";Not A Brand";v="99"
sec-ch-ua-mobile: ?0
sec-ch-ua-platform: "Windows"
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/95.0.4638.54 Safari/537.36
Requests with type Preflight and status 401
Request Method: OPTIONS
Status Code: 401 Unauthorized
Remote Address: 10.107.50.24:443
Referrer Policy: strict-origin-when-cross-origin
access-control-allow-headers: *
access-control-allow-methods: GET, HEAD, POST, PUT, PATCH, DELETE, OPTIONS
access-control-allow-origin: *
access-control-expose-headers: *
Connection: close
Content-Length: 605
Content-Type: text/html; charset=utf-8
Date: Thu, 21 Oct 2021 11:44:14 GMT
Server: nginx/1.4.6 (Ubuntu)
Set-Cookie: d8522de4f35024212f5d0e7f0b289d29=3570640034998b52c04921bcd836972d; path=/; HttpOnly; Secure; SameSite=None
WWW-Authenticate: Basic realm="Restricted"
Accept: */*
Accept-Encoding: gzip, deflate, br
Accept-Language: en,de-DE;q=0.9,de;q=0.8,en-US;q=0.7
Access-Control-Request-Headers: access-control-allow-headers,access-control-allow-methods,access-control-allow-origin,authorization,depth
Access-Control-Request-Method: PROPFIND
Connection: keep-alive
Host: https://mywebdavurl.com/
Origin: http://localhost:3001
Referer: http://localhost:3001/
Sec-Fetch-Dest: empty
Sec-Fetch-Mode: cors
Sec-Fetch-Site: cross-site
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/95.0.4638.54 Safari/537.36
From what I can see, the XHR requests have the CORS headers and the Authorization header, yet I receive the CORS error. The preflight request, instead, does not carry those headers.
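One thing worth noting about the headers above: Access-Control-Allow-* are response headers that the server sends; putting them on the request does nothing except enlarge the preflight's Access-Control-Request-Headers list. A sketch of a helper (hypothetical, not part of the webdav API) that strips them from a request-header object:

```javascript
// Hypothetical helper: drop Access-Control-Allow-* from a request-header
// object, since those are response headers and only bloat the preflight.
function stripCorsResponseHeaders(headers) {
  return Object.fromEntries(
    Object.entries(headers).filter(
      ([name]) => !/^access-control-allow-/i.test(name)
    )
  );
}

const cleaned = stripCorsResponseHeaders({
  "Access-Control-Allow-Origin": "*",
  Depth: "1",
  Authorization: "Basic dXNlcjpwYXNzd2Q=",
});
console.log(Object.keys(cleaned)); // [ 'Depth', 'Authorization' ]
```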
In the end, I tried one last thing that gave me, once again, an unexpected result: I opened Chrome using the following command-line flags:
C:\Users\Francesco\AppData\Local\Google\Chrome\Application\chrome.exe --disable-web-security --disable-gpu --user-data-dir=~/chromeTemp
When I do that, everything works fine; the codebase is, of course, the same. In this case I see just two entries in the network tab in Chrome (same values; maybe I'm doing something wrong in the way I make the call) with the following data:
Type: XHR
Status: 207
Request Method: PROPFIND
Status Code: 207 Multi-Status
Remote Address: 10.107.50.24:443
Referrer Policy: strict-origin-when-cross-origin
Connection: close
Date: Thu, 21 Oct 2021 12:44:55 GMT
Server: nginx/1.4.6 (Ubuntu)
Transfer-Encoding: chunked
Accept: text/plain
Accept-Encoding: gzip, deflate, br
Accept-Language: en,de-DE;q=0.9,de;q=0.8,en-US;q=0.7
Access-Control-Allow-Headers: Content-Type, Access-Control-Allow-Headers, Authorization, X-Requested-With
Access-Control-Allow-Methods: DELETE, POST, GET, OPTIONS
Access-Control-Allow-Origin: *
Authorization: Basic dXNlcjpwYXNzd2Q=
Connection: keep-alive
Cookie: d8522de4f35024212f5d0e7f0b289d29=3570640034998b52c04921bcd836972d
Depth: 1
Host: https://mywebdavurl.com/
Referer: http://localhost:3001/
sec-ch-ua: "Google Chrome";v="95", "Chromium";v="95", ";Not A Brand";v="99"
sec-ch-ua-mobile: ?0
sec-ch-ua-platform: "Windows"
Sec-Fetch-Dest: empty
Sec-Fetch-Mode: cors
Sec-Fetch-Site: cross-site
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/95.0.4638.54 Safari/537.36
Of course, I need to get rid of this hack, which was only for debugging purposes; but it let me verify that I can use the WebDAV library inside a React application and that the problems lie elsewhere.
Any help will be appreciated. I have provided all the information I have, but if you need more, I can look for other details.

XMLHttpRequest returns 403 on new server

I use a library (orakupload) to upload photos. It makes a request through XMLHttpRequest to a PHP file from the same library, which saves the image.
On the old server it worked perfectly, but we have changed servers and it has stopped working.
When calling the PHP file I get a 403 Forbidden. The file is in exactly the same folder as the JS that calls it (and, logically, on the same server).
I have searched and tried everything, but I can't stop it from returning the error; let's see if you can help me.
The XMLHttpRequest:
var xhr = new XMLHttpRequest();
var query =
    "filename=" + encodeURIComponent(file.name) +
    "&path=" + settings.orakuploader_path +
    "&resize_to=" + settings.orakuploader_resize_to +
    "&thumbnail_size=" + settings.orakuploader_thumbnail_size +
    "&main_path=" + settings.orakuploader_main_path +
    "&thumbnail_path=" + settings.orakuploader_thumbnail_path +
    "&watermark=" + settings.orakuploader_watermark +
    "&orakuploader_crop_to_width=" + settings.orakuploader_crop_to_width +
    "&orakuploader_crop_to_height=" + settings.orakuploader_crop_to_height +
    "&orakuploader_crop_thumb_to_width=" + settings.orakuploader_crop_thumb_to_width +
    "&orakuploader_crop_thumb_to_height=" + settings.orakuploader_crop_thumb_to_height;
xhr.open("POST", settings.orakuploader_path + "orakuploader.php?" + query, true);
xhr.send(file);
xhr.onreadystatechange = function()
{
...
If I take the URL it builds and open it directly in Chrome, the PHP file executes without problem.
So I don't think it is an .htaccess problem.
I thought it might be a CORS problem for some reason, but I added these headers to the PHP file and the problem persists:
header('Access-Control-Allow-Origin: *');
header("Access-Control-Allow-Headers: Origin, X-Requested-With, Content-Type, Accept");
header('Access-Control-Allow-Methods: GET, POST, PUT, DELETE');
Finally, I leave here the information of the headers in the query, in case it can be of help:
General
Request URL: https://www.XXXXXX.com/intranet/orakuploader/orakuploader.php?filename=aaaa.png&path=/intranet/orakuploader/&resize_to=0&thumbnail_size=0&main_path=/intranet/files&thumbnail_path=/intranet/files/tn&watermark=&orakuploader_crop_to_width=1920&orakuploader_crop_to_height=1420&orakuploader_crop_thumb_to_width=200&orakuploader_crop_thumb_to_height=200
Request Method: POST
Status Code: 403 Forbidden
Remote Address: 52.121.xx.xx:xxx
Referrer Policy: strict-origin-when-cross-origin
Response Header
Connection: Keep-Alive
Content-Length: 318
Content-Type: text/html; charset=iso-8859-1
Date: Wed, 21 Jul 2021 12:33:40 GMT
Keep-Alive: timeout=5, max=53
Server: Apache
X-Frame-Options: SAMEORIGIN, SAMEORIGIN
Request Header
Accept: */*
Accept-Encoding: gzip, deflate, br
Accept-Language: es,ca;q=0.9
Connection: keep-alive
Content-Length: 200316
Content-Type: image/png
Cookie: xxxxxxx
Host: www.XXXXXX.com
Origin: https://www.XXXXXX.com
Referer: https://www.XXXXXX.com/intranet/prop/1193
sec-ch-ua: " Not;A Brand";v="99", "Google Chrome";v="91", "Chromium";v="91"
sec-ch-ua-mobile: ?0
Sec-Fetch-Dest: empty
Sec-Fetch-Mode: cors
Sec-Fetch-Site: same-origin
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36
EDIT
If I disable the SSL certificate, the image is not saved, but I no longer receive the 403 error. Could the cause be here?
I have found the problem; I leave it here in case it helps someone:
It turned out to be the Apache mod_security module. One of its rules was blocking the request.
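For anyone hitting the same wall, the usual remedy (a sketch, assuming Apache with ModSecurity 2.x; the rule ID below is illustrative, the real one appears in the ModSecurity audit log) is to whitelist the offending rule for just that path rather than disabling the module:

```apache
# Sketch: remove only the rule that blocked the upload, scoped to the
# uploader script. Find the real rule ID in the ModSecurity audit log;
# 960010 here is a placeholder.
<LocationMatch "/intranet/orakuploader/orakuploader.php">
    SecRuleRemoveById 960010
</LocationMatch>
```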

Flasgger / Swagger - apidocs works with localhost but not Openshift (You need to enable JavaScript to run this app)

I am running a Python Flask server. A co-worker added Flasgger/Swagger support and I can successfully display the API using
http://localhost:5000/apidocs
Similarly, I can get the json version of the API
http://localhost:5000/api_documentation.json
Python code configures that filename
That same code is deployed in an Openshift project and uses Traefik to route external requests to the Python Flask server
https://my-openshift-url-here/apidocs
does not display the API; it only displays "Powered by Flasgger 0.9.5"
https://my-openshift-url-here/api_documentation.json
works the same as the localhost request
Traefik keys off of "apidocs" and "api_documentation.json" and routes them directly to the Python Flask server:
rule: PathPrefix(`/apidocs`)
entryPoints:
  - web
middlewares:
  - gzip
  - mysecurity-no-token
service: my-python-server
apidocumentation:
  rule: PathPrefix(`/api_documentation.json`)
  entryPoints:
    - web
  middlewares:
    - gzip
    - mysecurity-no-token
  service: my-python-server
What I see in my Chrome browser debugger (F12 - Network) for the swagger-ui-bundle.js response is "You need to enable JavaScript to run this app."
Why does this work in Chrome for the localhost version but not when accessing the server deployed on Openshift? Both are being accessed from the same Chrome window - just different tabs.
This is the Headers content of the apidocs request for the localhost version
Request Method: GET
Status Code: 200 OK
Remote Address: 127.0.0.1:5000
Referrer Policy: strict-origin-when-cross-origin
Content-Length: 3041
Content-Type: text/html; charset=utf-8
Date: Thu, 19 Aug 2021 12:08:41 GMT
Server: Werkzeug/2.0.1 Python/3.7.4
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9
Cache-Control: no-cache
Connection: keep-alive
Host: localhost:5000
Pragma: no-cache
sec-ch-ua: "Chromium";v="92", " Not A;Brand";v="99", "Google Chrome";v="92"
sec-ch-ua-mobile: ?0
Sec-Fetch-Dest: document
Sec-Fetch-Mode: navigate
Sec-Fetch-Site: none
Sec-Fetch-User: ?1
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/92.0.4515.131 Safari/537.36
This is the Headers content of the apidocs request for the Openshift version
Request Method: GET
Status Code: 200 OK
Remote Address: 123.456.789.123:443 (obfuscated, of course)
Referrer Policy: origin
content-encoding: gzip
content-length: 1264
content-type: text/html; charset=utf-8
date: Thu, 19 Aug 2021 12:08:59 GMT
server: istio-envoy
vary: Accept-Encoding
x-envoy-upstream-service-time: 28
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9
Cache-Control: no-cache
Connection: keep-alive
Host: my-openshift-server-url-here
Pragma: no-cache
sec-ch-ua: "Chromium";v="92", " Not A;Brand";v="99", "Google Chrome";v="92"
sec-ch-ua-mobile: ?0
Sec-Fetch-Dest: document
Sec-Fetch-Mode: navigate
Sec-Fetch-Site: none
Sec-Fetch-User: ?1
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/92.0.4515.131 Safari/537.36
Turns out it was a Traefik routing issue. After the initial response to the /apidocs request, the page also makes requests to /flasgger_static/foo, so I had to add a route for flasgger_static to my Traefik routing table.
flassger:
  rule: PathPrefix(`/flasgger_static`)
  entryPoints:
    - web
  middlewares:
    - gzip
    - mysecurity-no-token
  service: my-python-server

Code is not working when sending request behind "Proxies"

I am sending a request to a website (using the request module) and it returns data in the response. Everything works fine until I send the request through a proxy (even when the proxy is not banned by the site).
My problem is the same as in this post:
How to stop NodeJS "Request" module changes request when using proxy
I tried every solution there, but nothing helped.
The headers I am using in the request:
Host: www.somesite.com
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:69.0) Gecko/20100101 Firefox/69.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate, br
Connection: keep-alive
Upgrade-Insecure-Requests: 1
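For context on why a proxy changes things: with an HTTP proxy, the client connects to the proxy, puts the absolute URL in the request line, and the Host header names the real target; intermediate layers can rewrite or reorder headers along the way. A minimal runnable sketch using only Node's stdlib (`proxiedOptions` is a hypothetical helper, not part of the request module) showing how to pin these explicitly:

```javascript
// Hypothetical helper: build http.request options for going through an
// HTTP proxy, pinning the absolute URL and Host header explicitly so a
// proxy layer has less to rewrite.
function proxiedOptions(targetUrl, proxyHost, proxyPort, headers) {
  const target = new URL(targetUrl);
  return {
    host: proxyHost,   // connect to the proxy...
    port: proxyPort,
    path: target.href, // ...but put the absolute target URL in the request line
    headers: { Host: target.host, ...headers }, // pin Host plus custom headers
  };
}

const opts = proxiedOptions("http://www.somesite.com/", "127.0.0.1", 8080, {
  "User-Agent": "Mozilla/5.0",
});
console.log(opts.path);         // "http://www.somesite.com/"
console.log(opts.headers.Host); // "www.somesite.com"
```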

burpsuite intruder and hashed passwords

In order to learn how to use Burp Suite, I am trying to use it to break into the management GUI on an IP camera of mine. The access credentials are currently admin/admin. From Burp Suite's proxy I select the following POST request for an Intruder attack:
POST /login/ HTTP/1.1
Host: 10.XXX.XXX.173
User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:60.0) Gecko/20100101 Firefox/60.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
Referer: http://10.XXX.XXX.173/
Content-Type: application/x-www-form-urlencoded
Content-Length: 29
DNT: 1
Connection: close
Upgrade-Insecure-Requests: 1
username=admin&password=admin
When I run a sniper attack using a list of passwords, including admin, all the tries come back with the exact same 200 success response. Investigating further, I see that after each attempt and response, there is a subsequent GET request from the browser that looks like this:
GET /login/?_=1551456400210&_username=admin&_login=true&_signature=§de6af126fa27f887c20ca2de02411aa913815d9b§ HTTP/1.1
Host: 10.XXX.XXX.173
User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:60.0) Gecko/20100101 Firefox/60.0
Accept: */*
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
Referer: http://10.XXX.XXX.173/
X-Requested-With: XMLHttpRequest
DNT: 1
Connection: close
This generates a success response in the case of admin/admin and a failure response in the other cases. So it appears the camera hashes the password and passes the hash in the "signature" element of the GET request. If I use the GET request for the sniper attack and include the hashed value in the password list, the attack identifies the hash of "admin" as successful.
At this point, it appears I need a two-part attack: part one gets the hashes of the password list from the GUI, and part two tells me when a hash/password unlocks the GUI. Am I right? How do I do this in Burp Suite?
