Microsoft single tenant authentication with Firebase - javascript

For a work-related app I use Firebase authentication with Microsoft. In this case, however, it is important that only people from my company (we use Office 365) can sign into this application. I have everything set up and working in a non-Firebase context. But when I use Firebase for authentication, it seems to always point to the /common/ endpoint. This causes problems with my single-tenant application. If I set the application to accept all tenants, the app works again. But obviously, now everyone can log into my application.
The pop-up is called with a rather conventional:
const provider = new auth.OAuthProvider("microsoft.com");
provider.setCustomParameters({
  tenant: "[tenantName].com"
});
auth()
  .signInWithPopup(provider)
  .then(result => {
    // ...
  });
But I can't find any instructions on changing the OAuth endpoint to use the single-tenant endpoint.
How would I go about doing this?

But I can't find any instructions on changing the OAuth endpoint to use the single tenant endpoint.

We cannot change the OAuth endpoint, even though we add the tenant information to customParameters. The endpoint always uses common as the value of tenant. This is the default design.
If we enable Microsoft as a sign-in provider, users using Microsoft accounts (Azure Active Directory and personal Microsoft accounts) can sign in.

Turns out the above is not exactly true. I've switched to signing in with a redirect, and now it (mysteriously) works.
const provider = new auth.OAuthProvider("microsoft.com");
provider.setCustomParameters({
  tenant: "[tenant].com"
});
auth().signInWithRedirect(provider);
I have tested this. The tenant is named in the redirect, and people from other tenants cannot log in.
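For completeness, a minimal sketch of handling the redirect result with the v8 namespaced Firebase SDK; the check of the Microsoft tid claim is an extra defensive step assumed here, not part of the original answer:

// After the redirect back to the app, pick up the result of the sign-in.
auth()
  .getRedirectResult()
  .then(result => {
    if (!result.user) return; // no redirect sign-in in progress

    // result.credential is an OAuthCredential carrying the Microsoft ID token.
    const idToken = result.credential && result.credential.idToken;
    if (idToken) {
      // Decode the payload (no signature check here) to inspect the tenant ID claim.
      const base64 = idToken.split('.')[1].replace(/-/g, '+').replace(/_/g, '/');
      const payload = JSON.parse(atob(base64));
      console.log('Signed in from tenant:', payload.tid);
    }
  })
  .catch(error => console.error('Redirect sign-in failed:', error));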

Related

Active Directory access Azure Storage from browser

I want to use Azure Active Directory to allow users to read and write to Azure storage (specifically all Blobs and Tables) from a single-page web app.
I started like this:
import { InteractiveBrowserCredential } from '@azure/identity';
import { TableClient, TableServiceClient } from '@azure/data-tables';

const credentials = new InteractiveBrowserCredential({
  clientId: myAuthConfig.clientId,
  tenantId: myAuthConfig.tenantId,
});

const client = new TableServiceClient(
  `https://${myAuthConfig.storageAccountName}.table.core.windows.net`,
  credentials
);

client.listTables().byPage().next().then(console.log);
This works totally fine! I can see all the tables on the account. But then I wanted to list some of the data in one of the tables. So I did:
const client = new TableClient(
  `https://${myAuthConfig.storageAccountName}.table.core.windows.net`,
  '<table name>',
  credentials
);

client.listEntities().byPage().next().then(console.log);
But this gives an error:
{
  "odata.error": {
    "code": "AuthorizationPermissionMismatch",
    "message": {
      "lang": "en-US",
      "value": "This request is not authorized to perform this operation using this permission.\nRequestId:<uuid>\nTime:2021-10-28T18:04:00.0737419Z"
    }
  }
}
I'm very confused by this error. As far as I can tell I've done everything right. I followed every tutorial. I've set up Active Directory permissions for my app to use the storage API, my Microsoft account has permission to access the tables, CORS is enabled, etc.
I'm not sure why I would have access to see a table but not see what's in it. I tried to use InteractiveBrowserCredential.authenticate to explicitly set scopes like this:
const scopes = ["User.Read"]
credentials.authenticate(scopes).then(console.log);
It works fine for User.Read, but I couldn't figure out which scopes correspond to Storage read/write access. If I added a scope like "Microsoft.Storage", it told me that it didn't exist.
Has anyone got an error like this before? What am I supposed to do here?
Thank you @gaurav mantri, posting your suggestion from the comments as an answer.
From the error, it looks like your service principal does not have permission to access your table storage data. You should either grant permission using an RBAC role on the storage account resource (adding a contributor or reader role, as below), or use Storage Explorer to grant permission.
In your storage account, please check whether you have the Storage Table Data Contributor / Storage Table Data Reader roles assigned, as commented by @gaurav mantri.
If not, you can add them:
Go into your storage account > Access control (IAM) > Add role assignment, and add the required permissions.
If the roles are already assigned, the issue might be that the storage account is protected by a firewall. Try configuring Firewall and virtual networks on your storage account to add an existing virtual network or create a new one. If that is not a concern, you may allow access from all networks.
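On the scope question from the post: the Azure Storage data plane uses the https://storage.azure.com resource, so a delegated scope such as https://storage.azure.com/.default (or .../user_impersonation) is what the credential needs; the tables SDK normally requests it for you. A hedged sketch with placeholder values, assuming the data role above has been assigned and has propagated:

import { InteractiveBrowserCredential } from '@azure/identity';
import { TableClient } from '@azure/data-tables';

const credentials = new InteractiveBrowserCredential({
  clientId: '<app-client-id>', // placeholders
  tenantId: '<tenant-id>'
});

async function listMyEntities() {
  // Optional: pre-authenticate explicitly against the Storage data-plane scope.
  await credentials.authenticate(['https://storage.azure.com/.default']);

  const client = new TableClient(
    'https://<storage-account>.table.core.windows.net',
    '<table name>',
    credentials
  );

  // Role assignments can take a few minutes to propagate before this succeeds.
  for await (const entity of client.listEntities()) {
    console.log(entity);
  }
}

listMyEntities().catch(console.error);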
References:
Authorize access to tables using Active Directory - Azure Storage | Microsoft Docs
Assign an Azure role for access to table data using PowerShell - Azure Storage | Microsoft Docs

How can we separate route of admin and user?

I am creating routes in Node.js for a dashboard.
Users log in and get a JWT token.
By sending the token, a user can access some user-related routes (edit, delete, logout, etc.).
But for admins, I want to create routes that can list users, edit or remove users, and check users' logout times. I have also set a flag in the table to identify whether a person is a user or an admin.
How do I authenticate the admin routes on the backend side?
You can be inspired by this logic; no further explanation is given here, but following these steps may help:
First) define a role field in your DB, MongoDB or MySQL (for example):
enum: ['user', 'admin']
Second) create a function checkRole(role) that checks the role after sign-in and JWT verification, then loads the user (a sketch follows these steps)
Third) create a separate route for the admin panel (for example):
router.route('/admin-panel').use(authController.checkRole('admin'))
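A minimal sketch of such a checkRole middleware, assuming Express, the jsonwebtoken package, a role claim inside the JWT payload, and a JWT_SECRET environment variable (all names are illustrative):

const jwt = require('jsonwebtoken');

// Returns Express middleware that only lets requests through when the JWT's
// role claim matches the required role.
function checkRole(requiredRole) {
  return (req, res, next) => {
    const header = req.headers.authorization || '';
    const token = header.startsWith('Bearer ') ? header.slice(7) : null;
    if (!token) return res.status(401).json({ error: 'Missing token' });

    try {
      const payload = jwt.verify(token, process.env.JWT_SECRET); // throws if invalid/expired
      if (payload.role !== requiredRole) {
        return res.status(403).json({ error: 'Insufficient role' });
      }
      req.user = payload; // make the decoded user available to later handlers
      next();
    } catch (err) {
      return res.status(401).json({ error: 'Invalid or expired token' });
    }
  };
}

// Usage, matching the route above:
// router.route('/admin-panel').use(checkRole('admin'));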
You can put your authorization flag in your JWT. When a user logs in, your server generates the corresponding JWT, which includes authentication info (e.g. userId). You can put additional authorization info in the token (e.g. auth).
Based on the auth field, your server can identify whether the request was sent by a general user or an admin. Of course, securing the JWT against hijacking is another story.
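For illustration, a sketch of issuing such a token with jsonwebtoken; the claim names, JWT_SECRET, and the one-hour lifetime are assumptions:

const jwt = require('jsonwebtoken');

// Issue a token that carries both identity (userId) and authorization (role) claims.
function issueToken(user) {
  return jwt.sign(
    { userId: user.id, role: user.isAdmin ? 'admin' : 'user' },
    process.env.JWT_SECRET,
    { expiresIn: '1h' }
  );
}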

Use Open ID connect with auth code flow Azure AD

I am working on authentication for a Node.js / Express web application where I want users to be routed to Microsoft SSO. I am using passport-azure-ad and OpenID Connect to do this.
However, what I want to know is:
Is it only possible to do OpenID Connect with the implicit grant?
Or can we do OpenID Connect with the auth code flow? If so, does passport-azure-ad support it?
According to the OpenID Connect documentation, an id_token is requested when the sign-in request is sent (the response_type is "id_token"). And we can see from the auth code flow documentation that the response_type is "code". But according to this tutorial, the response_type can also be "code id_token" in the auth code flow.
So we can also do OpenID Connect in the auth code flow.
If you want to use passport-azure-ad, here is a method (for the auth code flow) for your reference.
The params are items we get from the request or metadata, such as id_token, code, policy, metadata, cacheKey, etc.
The oauthConfig items are those needed for the OAuth flow (like redirection and code redemption), such as token_endpoint, userinfo_endpoint, etc.
The optionsToValidate are items we need to validate the id_token against, such as issuer, audience, etc.
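A hedged configuration sketch using passport-azure-ad's OIDCStrategy with responseType 'code id_token'; the environment variable names, redirect URL, scopes, and verify callback body are placeholders, not taken from the original answer:

const passport = require('passport');
const { OIDCStrategy } = require('passport-azure-ad');

passport.use(new OIDCStrategy(
  {
    identityMetadata: `https://login.microsoftonline.com/${process.env.TENANT_ID}/v2.0/.well-known/openid-configuration`,
    clientID: process.env.CLIENT_ID,
    clientSecret: process.env.CLIENT_SECRET, // needed to redeem the authorization code
    responseType: 'code id_token',           // auth code (hybrid) flow rather than implicit
    responseMode: 'form_post',
    redirectUrl: 'https://localhost:3000/auth/openid/return',
    scope: ['profile', 'email'],
    validateIssuer: true
  },
  (iss, sub, profile, accessToken, refreshToken, done) => {
    // Called after the code has been redeemed; look up or persist the user here.
    return done(null, profile);
  }
));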
Hope it helps~

How to find user is logged in or not from cloud function side using Firebase?

To provide dynamic content, I am using rewrites in Firebase Hosting. Whenever the website is opened with index.html, the browser requests the Firebase Cloud Function main.
"rewrites": [ {
"source": "/index.html",
"function":"main"
}]
Now I am facing a problem providing dynamic content based on the user's login status. I also looked at client-side authentication using JS.
firebase.auth().onAuthStateChanged(function(user) {
  if (user) {
    // User is signed in.
  } else {
    // No user is signed in.
  }
});
I don't know much about web development. Here I have a few questions:
How can I find the authentication status of the user in a more flexible way? Are cookies used for this? I am asking because I don't know how to pass the Firebase token to the cloud function main.
Please point me toward an idea. Thank you all.
Short answer: End users don't have a sign-in status that's visible on a backend. That's just not how Firebase Authentication works.
Auth clients provide credentials to get a token that's used to identify themselves when they invoke backend services. This token has a lifetime of 1 hour, and the client must refresh it in order to keep using it. Your backend doesn't know or care if the client has an auth token, if they use it, or if they refresh it. The client just needs to provide that token from whatever device they have signed in on so the backend can validate it. There is no way your backend can know if the client obtained a token - you just have to accept the one it is given. This means you're going to have to figure out how to pass that token and validate it with the Firebase Admin SDK, or use a callable-type function with the Firebase Client SDK, which sends that token automatically.
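A minimal sketch of that approach, assuming the client attaches the ID token in an Authorization header and the rewritten HTTPS function verifies it with the Firebase Admin SDK (the header name and rendering are illustrative; note that a plain browser navigation will not send this header, so the client has to fetch the content itself):

// Client side: fetch the dynamic page with the ID token attached.
firebase.auth().onAuthStateChanged(async (user) => {
  const headers = {};
  if (user) {
    headers.Authorization = `Bearer ${await user.getIdToken()}`;
  }
  const res = await fetch('/index.html', { headers });
  // ...render the returned HTML
});

// Cloud Function side (functions/index.js):
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.main = functions.https.onRequest(async (req, res) => {
  const header = req.headers.authorization || '';
  const idToken = header.startsWith('Bearer ') ? header.slice(7) : null;

  let decoded = null;
  if (idToken) {
    try {
      decoded = await admin.auth().verifyIdToken(idToken); // throws if invalid or expired
    } catch (e) {
      decoded = null; // treat as signed out
    }
  }
  res.send(decoded ? `<p>Hello, user ${decoded.uid}</p>` : '<p>Hello, guest</p>');
});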

Authenticating user in AWS Cognito User/Identity Pool with Google as identity provider

AWS provides two possible ways of dealing with Cognito:
the "old one" via amazon-cognito-identity-js (and possibly amazon-cognito-auth-js), and
the "new one" via aws-amplify (which includes the above one).
After quite a bit of trouble and reverse engineering, I've successfully managed to sign in (receive back CognitoIdentityCredentials) using aws-amplify locally as part of the development effort.
The steps were (bear with me, as these are important for the questions to follow, and also might help someone):
Setup
Create a User Pool in Cognito console
Create a User Pool App Client in Cognito console
Create Google Web App in Google Console
Configure Google Web App to point to http://localhost:8080 (my local dev server)
Configure User Pool to use Google as an Identity Provider, supplying it with the Google Web App Client ID and Client secret from Google Console
Create an Identity Pool in the Cognito console and configure it to work with Google as an Identity Provider, supplying the Google Web App Client ID there as well
Implementation
Configure Amplify.Auth:
Amplify.configure({
  Auth: {
    identityPoolId: '<identity-pool-id>',
    region: '<region>',
    userPoolId: '<user-pool-id>',
    userPoolWebClientId: '<user-pool-web-client-id>'
  }
});
Inject Google API script:
const script = document.createElement('script');
script.src = 'https://apis.google.com/js/platform.js';
script.async = true;
script.onload = this.initGapi;
document.body.appendChild(script);
Init Google API:
window.gapi.load('auth2', function() {
  window.gapi.auth2.init({
    client_id: '<google-web-app-client-id>',
    scope: 'profile email openid'
  });
});
Allow, on a button click, a Google user to sign in:
const ga = window.gapi.auth2.getAuthInstance();
const googleUser = await ga.signIn();
const {id_token, expires_at} = googleUser.getAuthResponse();
const profile = googleUser.getBasicProfile();
Use the profile, id_token, and expires_at from above to create a Cognito credentials session:
const user = {
  email: profile.getEmail(),
  name: profile.getName()
};
const credentials = await Auth.federatedSignIn(
  'google',
  { token: id_token, expires_at },
  user
);
At this point a CognitoIdentityCredentials object was returned, properly populated, with token and all...
Problem
Unfortunately, aws-amplify adds a whopping 190K to my application webpack bundle (GZIPped, minified, optimized), which made me choke on my coffee.
Question 1
Can this somehow be reduced by a Babel plugin I'm missing? (I'm guessing no, since AWS apparently still lives in 1995 and configures everything on singleton Amplify and Auth objects.)
Question 2
Have I made this unnecessarily complicated and there is a much more robust solution?
Question 3 (most important)
Can this be achieved using the "old way", amazon-cognito-identity-js, which is MUCH MUCH smaller?
I couldn't find, among all the use cases (https://github.com/aws/aws-amplify/tree/master/packages/amazon-cognito-identity-js/), a use case for social/federated login.
For me, the difference between
import Amplify from 'aws-amplify'
and
import Amplify from '@aws-amplify/core'
is ~500kB optimized and minified.
I think you also want
import Auth from '@aws-amplify/auth'
which adds only a little bit more.
But I agree, the aws-amplify package is really very large and it's not easy to figure out how to use the core components directly (e.g. amazon-cognito-identity-js/es and amazon-cognito-auth-js/es).
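A sketch of what that modular setup might look like, mirroring the question's federatedSignIn call; the placeholder configuration values are assumptions:

import Amplify from '@aws-amplify/core';
import Auth from '@aws-amplify/auth';

Amplify.configure({
  Auth: {
    identityPoolId: '<identity-pool-id>',
    region: '<region>',
    userPoolId: '<user-pool-id>',
    userPoolWebClientId: '<app-client-id>'
  }
});

// Same federated sign-in as in the question, now pulling in only core + auth.
async function signInWithGoogle(id_token, expires_at, profile) {
  return Auth.federatedSignIn(
    'google',
    { token: id_token, expires_at },
    { email: profile.getEmail(), name: profile.getName() }
  );
}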
You could try using modularized exports in AWS Amplify.
