React.js / Express.js end-to-end encryption - javascript

I am not sure whether this question has already been asked here, but I could not find such a question so far.
I have a REST API written in Express.js.
That REST API is connected to a React.js web app.
The REST API is secured with JWT authentication tokens.
In the web app, we display thousands of product items with daily prices. Someone logging into the web app and viewing a few products' prices is okay.
What is not okay is someone automating a daily fetch of every item's price, storing the data, and analyzing our pricing strategy.
Basically, what I want is that anyone trying to access the API with a tool like Postman should be blocked; only a web browser should be able to access the API. This can be achieved to some extent by blocking user agents. We can block the Postman user agent, but how do we block all tools like Postman?
Even if we block all such tools, users can still see the response in the browser's dev tools network tab.
Is there a way to encrypt the response, so that the network tab displays something the user cannot understand, but that can still be decrypted by React?
Hope my question is clear to all!
Any help is appreciated.
Thanks in advance. =)

Basically, what I want is that anyone trying to access the API with a tool like Postman should be blocked; only a web browser should be able to access the API.
Can't be done.
This can be achieved to some extent by blocking user agents. We can block the Postman user agent, but how do we block all tools like Postman?
People can lie about the user-agent very easily.
Even if we block all such tools, users can still see the response in the browser's dev tools network tab.
Yes. Browsers are designed to work for their users' benefit above the benefit of the owners of the websites they access.
Is there a way to encrypt the response, so that the network tab displays something the user cannot understand, but that can still be decrypted by React?
Not really. You'd have to give the decryption key to the browser, and if you give it to the browser then the user can access it.
What is not okay is someone automating a daily fetch of every item's price, storing the data, and analyzing our pricing strategy.
Consider rate limiting instead.
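A minimal sketch of rate limiting, assuming the express-rate-limit package (one common choice, not named above):
const express = require('express')
const rateLimit = require('express-rate-limit')

const app = express()

// Allow at most 100 requests per 15 minutes per IP: plenty for a human
// browsing prices, far too few for a daily scrape of thousands of items.
const limiter = rateLimit({
  windowMs: 15 * 60 * 1000,
  max: 100,
})

app.use('/products', limiter)
A determined scraper can still rotate IPs, so this raises the cost of scraping rather than eliminating it.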
Essentially, however, what you are trying to do is make information public without giving it to some people.
That's a contradiction and thus impossible.

What you can do with Express is use CORS and allow only your website to reach the API.
Check https://www.npmjs.com/package/cors
You will have something like:
var express = require('express')
var cors = require('cors')
var app = express()

var corsOptions = {
  origin: 'http://example.com',
  optionsSuccessStatus: 200 // some legacy browsers (IE11, various SmartTVs) choke on 204
}

app.get('/products/:id', cors(corsOptions), function (req, res, next) {
  res.json({msg: 'This is CORS-enabled for only example.com.'})
})
But to use CORS for that goal you would probably also need a restriction based on the user agent, to make sure that all requests come from a browser and carry the CORS headers.
Still, this is not a very effective mechanism.
The other idea is to implement some sort of API key, generated by your client side and validated on the backend; each request needs to carry the API key as a parameter.
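A minimal sketch of that API-key idea, assuming a header named X-Api-Key and an Express middleware (the names are illustrative, not from the answer):
const express = require('express')
const app = express()

// The key would be issued to your client; it is read from the environment
// here purely for illustration.
const API_KEY = process.env.API_KEY

// Middleware: reject any request that does not carry the expected key.
function requireApiKey(req, res, next) {
  if (req.get('X-Api-Key') === API_KEY) {
    return next()
  }
  res.status(401).json({ error: 'Invalid or missing API key' })
}

app.get('/products/:id', requireApiKey, (req, res) => {
  res.json({ id: req.params.id })
})
Keep in mind that anything the browser can send, a scraper can send too, so a key embedded in client-side code only deters casual automation.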

You need to use a CORS policy together with HTTPS. If the policy is set correctly, the request will be handled only if it comes from the UI app; otherwise the app will reject it.
I hope this code snippet will assist you:
const express = require('express');
const cors = require('cors');

const router = express.Router();

const whitelist = ['http://www.example.com', 'https://www.example.com'];
const corsOptions = {
  origin(origin, callback) {
    if (whitelist.indexOf(origin) !== -1 || !origin) {
      // !origin allows requests with no Origin header (like mobile apps or curl requests)
      callback(null, true);
    } else {
      console.error(`Not allowed by CORS, Origin ${origin}`);
      callback(new Error('Not allowed by CORS'));
    }
  },
  exposedHeaders: 'Authorization',
};

router.use(cors(corsOptions));
router.use('/api/auth', auth);

Related

CORS error on Linkedin oauth/v2/accessToken API from frontend

I am trying to hit a LinkedIn accessToken API but always face a CORS error in React.js (frontend). The same thing works when hitting the URL directly in the address bar or through Postman.
This is the error I am getting:
Access to fetch at 'https://www.linkedin.com/oauth/v2/accessToken' from origin 'http://localhost:3000' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
My code is:
const queryParams = querystring.stringify({
  redirect_uri: process.env.REACT_APP_LINKEDIN_REDIRECT_URI,
  client_id: process.env.REACT_APP_LINKEDIN_CLIENT_ID,
  client_secret: process.env.REACT_APP_LINKEDIN_CLIENT_SECRET,
  grant_type: 'authorization_code',
  code: code,
});

const headers = {
  'Content-Type': 'application/x-www-form-urlencoded',
};

const response = await fetch(`https://www.linkedin.com/oauth/v2/accessToken`, {
  method: 'POST',
  headers: headers,
  body: queryParams,
});
The API responses do not include an Access-Control-Allow-Origin header, so your browser is prohibiting requests to those APIs.
You have two choices here:
obtain the access token on the backend using a 2-legged OAuth flow
or use a 3-legged OAuth flow, which requires you to redirect the user's browser to LinkedIn's site
From a security perspective, you should not distribute the client secret in HTML/JS files.
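A minimal sketch of doing the token exchange on the backend with an Express route (the route name and the use of Node 18+'s built-in fetch are assumptions, not from the answer):
const express = require('express');
const app = express();

// The browser sends only the authorization code; the client_secret stays on the server.
app.get('/api/linkedin/token', async (req, res) => {
  const params = new URLSearchParams({
    grant_type: 'authorization_code',
    code: req.query.code,
    redirect_uri: process.env.LINKEDIN_REDIRECT_URI,
    client_id: process.env.LINKEDIN_CLIENT_ID,
    client_secret: process.env.LINKEDIN_CLIENT_SECRET,
  });

  const upstream = await fetch('https://www.linkedin.com/oauth/v2/accessToken', {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: params,
  });

  res.status(upstream.status).json(await upstream.json());
});

app.listen(3001);
Because the call is server-to-server, no CORS is involved, and the secret never reaches the browser.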
The accepted answer (adambene) is misleading, as is the official documentation. The next answer (edarv) is technically correct but too brief to really learn from.
Referencing https://learn.microsoft.com/en-us/linkedin/shared/authentication/authorization-code-flow :
There is already a mandatory step in the 3-legged auth flow where, as noted, you do need to simply provide a user-facing link or redirect the browser window, at which point the flow is resumed by LinkedIn redirecting back to your website with additional data. But the /accessToken endpoint referenced by the OP comes after this step of the flow.
One Microsoft support thread (https://techcommunity.microsoft.com/t5/sharepoint-developer/cors-error-occuring-after-accessing-linkedin-share-api-through/m-p/787679) also indirectly suggests that you need to add your domain to your app's Widgets list in the App Settings portal, but this too is spurious to the OP's use case.
The ultimate answer appears to be, completely unmentioned in the main auth flow doc, that you simply cannot use any LinkedIn API past the initial oauth/v2/authorization redirect from a web client context. Full stop. You'll always get CORS'd. This makes sense if you dig into the side documentation on how and why to protect your client secret specifically for the /accessToken call (https://learn.microsoft.com/en-us/linkedin/shared/api-guide/best-practices/secure-applications?context=linkedin/context), but in my opinion it makes less sense for subsequent calls once you have the access token. Whether it makes sense or not, you'll need to set up a web server or other standalone app to make subsequent API calls.
The LinkedIn API doesn't allow requests from origins such as localhost, via the CORS header 'Access-Control-Allow-Origin'. https://en.wikipedia.org/wiki/Cross-origin_resource_sharing
If you really need it, you could always run your own proxy server or use something like https://cors-anywhere.herokuapp.com/.

How do you send a cookie to a cross origin request using react and maybe electron?

I need to make a request to another website and I need to send a cookie along with it. I have no control over this other website, so I cannot modify its CORS policy or allow credentials in any meaningful way.
So I decided to convert my website into a React app that runs inside an Electron browser. I thought that since Electron has full control over the computer, I would be able to send a normal request without having to deal with CORS, but it seems that CORS and all sorts of other protections are still enabled.
To clarify my current setup, I have:
A React website at localhost:3000
An Electron app showing the React website.
There is no other communication happening between the two.
All requests are made solely by the React website.
The React website needs to communicate with foreignwebsite.com and send a cookie to it.
Normally, in any server-side type of application, including Electron itself, this would be no problem, but React cannot make direct requests.
One way I could do this would be to create a middle server that talks to the foreign API and hands my front-end app the data it needs. But I don't want to host a server, and if I create a server on the client machine I think the app would be too heavy.
I could use the Electron process as a server and make Node.js requests with it, but communicating between React and Electron seems weirdly complicated; it feels like it would be a mess of events.
I found an answer online that tries to partly deactivate CORS in the Electron browser. It partially worked, or at least it seems to have worked: it shows no error, but the cookie simply isn't transferred. Maybe it just tricks the browser and doesn't work behind the scenes, I don't know...
const filter = {
  urls: ['http://foreignwebsite.com/*']
}

mainWindow.webContents.session.webRequest.onBeforeSendHeaders(
  filter,
  (details, callback) => {
    details.requestHeaders.Origin = `http://foreignwebsite.com/`
    details.requestHeaders.Referer = `http://foreignwebsite.com/`
    callback({ requestHeaders: details.requestHeaders })
  }
)

mainWindow.webContents.session.webRequest.onHeadersReceived(
  filter,
  (details, callback) => {
    details.responseHeaders['access-control-allow-origin'] = ['http://localhost:3000']
    details.responseHeaders['access-control-allow-credentials'] = 'true'
    callback({ responseHeaders: details.responseHeaders })
  }
)
So the code above sets Access-Control-Allow-Origin and Access-Control-Allow-Credentials on the responses accordingly, and changes the Origin and Referer of the requests.
And when I make requests from the React website, I do it like this:
document.cookie = 'OmegaCookie=' + cookieVal + '; path=/; SameSite=None; Secure';
let response = await fetch('http://foreignwebsite.com', {credentials: 'include'});
But the cookie is not sent. Do you know what could be going wrong, or how to handle this situation better?
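A minimal sketch of the alternative the question itself mentions (making the request in Electron's main process and handing the result to React over IPC); the channel name, the cookie handling, and the availability of fetch in the main process (recent Electron versions) are assumptions:
// main.js (Electron main process): not subject to browser CORS rules.
const { ipcMain } = require('electron')

ipcMain.handle('fetch-foreign', async (_event, cookieVal) => {
  const res = await fetch('http://foreignwebsite.com', {
    headers: { Cookie: `OmegaCookie=${cookieVal}` },
  })
  return res.text()
})

// preload.js: expose a narrow wrapper to the renderer.
const { contextBridge, ipcRenderer } = require('electron')

contextBridge.exposeInMainWorld('foreignApi', {
  fetch: (cookieVal) => ipcRenderer.invoke('fetch-foreign', cookieVal),
})

// In the React app:
// const data = await window.foreignApi.fetch(cookieVal)
This keeps React and Electron communication down to a single invoke/handle pair rather than a mess of events.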

How to create a live website thumbnail as preview

I'm trying to create a gallery in React with live previews of many websites (like a portfolio); all the linked websites belong to me as well.
I already tried to use iframe and embed, but I didn't get the result I would like. I'm trying to get a miniature website like on https://codesandbox.io/explore.
Even though that site shows the thumbnails as images, if you update your sandbox the images update too.
I tried iframe and embed, but they don't show a small version of the website; they show the site as it looks on mobile, cropped to the frame size.
Any ideas on how I could generate such images, or solve this problem some other way?
You can't do this on the front end in a web page. You need to run something like Puppeteer on your backend to screenshot the pages. An example can be found at
https://bitsofco.de/using-a-headless-browser-to-capture-page-screenshots/
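A minimal sketch of that approach with Puppeteer (the viewport size and output path are arbitrary choices, not from the answer):
const puppeteer = require('puppeteer')

async function captureThumbnail(url, outputPath) {
  const browser = await puppeteer.launch()
  const page = await browser.newPage()

  // Render at a desktop-sized viewport so the capture is not the mobile layout.
  await page.setViewport({ width: 1280, height: 800 })
  await page.goto(url, { waitUntil: 'networkidle2' })
  await page.screenshot({ path: outputPath })

  await browser.close()
}

captureThumbnail('https://example.com', 'thumbnail.png').catch(console.error)
The resulting image can then be scaled down to thumbnail size before serving it to the gallery.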
Because of the same-origin policy, browsers do not allow you to make requests to a different domain, so you cannot request a different domain from your web app.
The Cross-Origin Resource Sharing standard works by adding new HTTP headers that let servers describe which origins are permitted to read that information from a web browser.
As an alternative solution you can set up an Express server and use the cors package to add permission to send requests to your other site.
If you control both sites, then configure Cross-Origin Resource Sharing (CORS) in the server settings by adding new HTTP headers like Access-Control-Allow-Origin so it accepts requests from your other servers:
var express = require('express')
var cors = require('cors')
var app = express()

const whitelist = ['http://example1.com', 'http://example2.com']
const corsOptions = {
  origin: function (origin, callback) {
    if (whitelist.indexOf(origin) !== -1) {
      callback(null, true)
    } else {
      callback(new Error('Not allowed by CORS'))
    }
  },
}

// Apply the whitelist-based CORS options to all routes.
app.use(cors(corsOptions))

app.listen(4000, function () {
  console.log('CORS-enabled web server listening on port 4000')
})
There is also a library you can use, capture-website, that captures a screenshot of a given URL and saves it to a given output file path.
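A minimal usage sketch, assuming a recent (ESM) version of capture-website; the file names are illustrative:
import captureWebsite from 'capture-website'

// Saves a screenshot of the URL to the given output file path.
await captureWebsite.file('https://example.com', 'thumbnail.png', {
  width: 1280,
  height: 800,
})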

How to handle CORS in a service account call to Google oAuth?

I access a Google calendar using a service account. This means that the owner of the calendar will not be prompted for authorization.
This works fine in Python, where I make a requests call to https://accounts.google.com/o/oauth2/token with a specific body. This gets me back a token I can use later.
I now need to have an application running in a standalone browser (Chrome, without user interaction) and tried to port this call directly as:
fetch('https://accounts.google.com/o/oauth2/token',
  {
    method: 'POST',
    body: JSON.stringify(body),
  })
  .then(function(res) { return res.json(); })
  .then(function(data) { alert(JSON.stringify(data)) })
but I get a reply from Google:
Failed to load https://accounts.google.com/o/oauth2/token: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://devd.io' is therefore not allowed access. The response had HTTP status code 400. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
My limited understanding of CORS (a previous answer was a very good read) is that
No 'Access-Control-Allow-Origin' header is present on the requested resource
means that it is not present in the response headers, which means that Google does not want me to access its resources via JS run from the browser (from an origin other than https://accounts.google.com).
This may be a good idea, but I control all the elements, from the code to the browser, and would like to get that token the same way I get it in a non-browser environment, specifically my working Python code.
How can I tell https://accounts.google.com to send back an Access-Control-Allow-Origin header telling my browser that it is OK to accept the call?
You can't.
Client side and server side code need to interact with OAuth in different ways.
Google provides documentation explaining the client-side process.
Importantly, part of it involves redirecting to Google's servers instead of accessing them with fetch or XMLHttpRequest.
@Quentin's answer ("You can't") is the right one for my question ("how can I force the server to send back the right header?").
This is a decision at Google not to provide this header, effectively cutting off any non-interactive applications.
As a solution, I will look at:
how to force the browser not to take into account the security mechanisms provided by CORS (there seem to be some ways through extensions or command-line arguments; I will update this answer once I find them)
or writing an intermediate layer which will query the data for me and pass it verbatim to the application (this is equivalent, in my case, to just making the query from JS, but it adds an extra layer of code and a server)
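A minimal sketch of that intermediate layer, assuming the google-auth-library package and an Express server (neither is mentioned above); the browser asks your own server for a token, and the server performs the service-account exchange, so no CORS is involved:
const express = require('express')
const { JWT } = require('google-auth-library')

// Service-account key file downloaded from the Google Cloud console.
const key = require('./service-account-key.json')

const app = express()

app.get('/api/calendar-token', async (req, res) => {
  const client = new JWT({
    email: key.client_email,
    key: key.private_key,
    scopes: ['https://www.googleapis.com/auth/calendar.readonly'],
  })

  // The token exchange happens server-to-server.
  const { token } = await client.getAccessToken()
  res.json({ access_token: token })
})

app.listen(3001)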

Cross domain Ajax JSON POST support against RESTful WCF service using transportCredentialOnly security

I've posted before on this subject, but after a year of getting on with other things, I've managed to get into a pickle once again. I'll try to give a brief overview of the scenario and the current attempts to make things work:
IIS web server hosting HTML, JS etc. on host: iis.mycompany.com (referred to as foo)
WCF RESTful web services hosted via a Windows Service on host: wcf.mycompany.com (referred to as bar)
The JavaScript served from foo works by making RESTful ajax calls (GET or POST depending on the action) to the WCF services on bar; obviously these are cross-domain calls as they aren't on the same host.
The JavaScript uses the jQuery (1.7.2) framework to manipulate the DOM and perform ajax calls to bar; the expected content type for POSTs is JSON, and the response from GETs is expected to be JSON too (application/json).
Bar has its WCF services configured using TransportCredentialOnly as the security mode, and the transport client credential type is NTLM, so only authenticated users can contact the services.
CORS support has been added to bar's WCF services using an extension to WCF:
http://blogs.msdn.com/b/carlosfigueira/archive/2012/05/15/implementing-cors-support-in-wcf.aspx
We have added additional headers and modified some that the post already contained, based on numerous internet articles:
property.Headers.Add("Access-Control-Allow-Headers", "Accept, Content-Type");
property.Headers.Add("Access-Control-Allow-Methods", "POST, GET, OPTIONS");
property.Headers.Add("Access-Control-Max-Age", "172800");
property.Headers.Add("Access-Control-Allow-Origin", "http://iis.mycompany.com");
property.Headers.Add("Access-Control-Allow-Credentials", "true");
property.Headers.Add("Content-type", "application/json");
Sites giving information on enabling CORS suggest that the Access-Control-Allow-Origin response header should be set to "*"; however, this is not possible in our case as we make jQuery ajax calls using the following setup:
$.ajaxSetup({
  cache: "false",
  crossDomain: true,
  xhrFields: {
    withCredentials: true
  }
});
As it turns out you cannot use "*" for the accepted origin when you are using "withCredentials" in the ajax call:
https://developer.mozilla.org/en/http_access_control
"Important note: when responding to a credentialed request, server
must specify a domain, and cannot use wild carding."
Currently in our development lab, this doesn't matter as we can hard code the requests to the IIS (foo) server URL.
The main problem now appears to be attempting POST requests (GET is working using the above configuration). When the browser attempts the POST, it first sends an OPTIONS request to the server asking which options are allowed for the subsequent POST. This is where we would like to see the headers we've configured in the CORS support WCF extension being passed back; however, we aren't getting that far: the response comes back as "401 Unauthorized". I believe this is to do with the transport security binding configuration requesting NTLM, but I'm not sure.
Also, I'm not very experienced with this, but I haven't seen much information about POSTing with the application/json content type, as opposed to text/plain, when performing cross-domain requests.
I know that people will probably suggest JSONP as the one true solution. I'm not against different approaches; indeed I encourage anyone to suggest best practices, as it would help others reading this question later. However, please attempt to answer the question before suggesting alternatives to it.
Many thanks in advance for anyone who contributes.
peteski
:)
UPDATE:
It appears that Chrome (20.x.x) doesn't suffer from the problem of failing to negotiate NTLM when retrieving the OPTIONS response from the server, but Firefox (13.0.1) does.
We've also noticed that someone has already posted a bug up on the Firefox forum, which we've added information to:
http://bugzilla.mozilla.org/show_bug.cgi?id=751552
Please vote for this bug to be fixed on the bugzilla site!
Using the following code, we can watch the network trace to see Firefox failing and Chrome working fine:
var url = "http://myWebServiceServer/InstantMessagingService/chat/message/send";
var data = '{ "remoteUserUri" : "sip:foo.bar@mydomain.com", "message" : "This is my message" }';

var request = new XMLHttpRequest();
request.open("POST", url, true);
request.withCredentials = true;
request.setRequestHeader("Content-Type", "application/json");
request.send(data);
console.log(request);
On a separate note, IE8 doesn't support XMLHttpRequest for cross-domain calls, favouring its own magical XDomainRequest object, so we've got some work to do in changing the client-side code to handle the IE8-versus-the-world cases. (Thanks, IE8.)
/me crosses fingers that Mozilla fix the Firefox bug.
UPDATE 2:
After some digging it appears that IE8's XDomainRequest cannot be used to make cross-domain requests where NTLM must be negotiated; this basically means that the security on our WCF binding can't be used, thanks to limitations in a web browser.
http://blogs.msdn.com/b/ieinternals/archive/2010/05/13/xdomainrequest-restrictions-limitations-and-workarounds.aspx
"No authentication or cookies will be sent with the request"
So, I guess we've taken this as far as it is going to go for now. It looks like we're going to have to create our own custom token authentication and pass it across to the WCF service in a cookie, or in IE8's case, POST it with the JSON. The WCF service will then have to handle decrypting the data and using it instead of the ServiceSecurityContext.Current.WindowsIdentity we previously had access to with NTLM auth.
I know you said you would rather have the problem itself addressed, but you may consider using a "reverse proxy."
I don't know what technologies you are using, but we use Apache web server and have a Java RESTful API running on a different server that required authentication. For a while, we messed with JSONP and CORS, but were not satisfied.
In the end, we setup an Apache Reverse Proxy and it worked miracles. The web browser believes it is communicating with its own domain and acts appropriately. The RESTful API doesn't know it is being used via a proxy. Therefore, everything just works. And Apache does all the magic.
Hopefully, all web servers have a feature like Apache's reverse proxy.
Here is some documentation on the feature: http://httpd.apache.org/docs/2.2/mod/mod_proxy.html
All we had to do was ensure the mod_proxy module was installed, then add the following lines to our Apache config file:
ProxyPass /restapi http://restfulserver.com/restapi
ProxyPassReverse /restapi http://restfulserver.com/restapi
Then restart the web server and voila!
