Strapi + HashiCorp Vault Dynamic Database Secret - javascript

I have implemented a script and a cron task in Strapi to generate a new database secret from my Vault server.
// Fetch fresh credentials from Vault and write them into Strapi's database config
const client = new VaultEnv()
const secret = await client.secretManager()
strapi.config.set('database.connection.connection.user', secret.data.username)
strapi.config.set('database.connection.connection.password', secret.data.password)
So far the above code block only works at start-up, in the register({ strapi }) function in src/index.js. When it runs later in the cron task it does update the connection config, but Strapi seems to keep using the config that was set in register({ strapi }) at start-up.
I have also tried injecting into process.env, but with the same result; I may have done it the wrong way.
Apparently it may be impossible, but I will leave this question open in case others are searching.
A workaround would be to take advantage of container orchestration to restart the Strapi servers (agents) for zero downtime. A Knex-based alternative is sketched below.
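For reference, Strapi's database layer is built on Knex, and Knex accepts an async function as the connection setting: it is called whenever the pool opens a new connection, and an expirationChecker callback tells the pool when cached settings have gone stale. A minimal sketch of config/database.js along those lines, assuming a Strapi v4 layout and reusing the asker's VaultEnv helper (the require path and the leaseExpiresAt field are hypothetical, standing in for however the Vault lease TTL is exposed):

const VaultEnv = require('./src/vault-env'); // hypothetical path to the asker's helper

module.exports = ({ env }) => ({
  connection: {
    client: 'postgres',
    connection: async () => {
      // Called by Knex each time the pool creates a new connection,
      // so freshly rotated Vault credentials are picked up without a restart
      const vault = new VaultEnv();
      const secret = await vault.secretManager();
      return {
        host: env('DATABASE_HOST'),
        database: env('DATABASE_NAME'),
        user: secret.data.username,
        password: secret.data.password,
        // Ask the pool to discard these settings once the Vault lease expires
        expirationChecker: () => Date.now() >= secret.leaseExpiresAt, // hypothetical field
      };
    },
  },
});

Whether this plays well with every Strapi release is untested; the underlying mechanism is documented on Knex's side.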

Related

DDD: using db layer as a singleton, could this be problematic

I recently downloaded a DDD boilerplate to get an example of a JavaScript functional approach to DDD. As I went through and analyzed the code, I reached the awilix IoC setup and am a little confused by this approach. The code below is directly from the container.js file (https://github.com/joshuaalpuerto/node-ddd-boilerplate):
const { createContainer, asValue, asFunction } = require('awilix')
// you can do this
const app = require('./app')
const server = require('./interfaces/http/server')
const router = require('./interfaces/http/router')
const auth = require('./interfaces/http/auth')
const config = require('../config')
const logger = require('./infra/logging/logger')
const database = require('./infra/database')
const jwt = require('./infra/jwt')
const response = require('./infra/support/response')
const date = require('./infra/support/date')
const repository = require('./infra/repositories')

const container = createContainer()

// SYSTEM
container.register({
  app: asFunction(app).singleton(),
  server: asFunction(server).singleton(),
  router: asFunction(router).singleton(),
  logger: asFunction(logger).singleton(),
  database: asFunction(database).singleton(),
  auth: asFunction(auth).singleton(),
  jwt: asFunction(jwt).singleton(),
  response: asFunction(response).singleton(),
  date: asFunction(date).singleton(),
  config: asValue(config),
  repository: asFunction(repository).singleton()
})

module.exports = container
If endpoints that access the db layer via a repository layer are being hit, should the database layer be a singleton? I would think this could open the door to unwanted side effects if all users are accessing the same db connection. Granted, I am probably missing something, as this is my first time seeing awilix in use, but it sure looks like only one instance of the database layer is being utilized. Am I missing something here? Is this actually safe, or could it really lead to issues?
Remember that Node.js is single-threaded. There is only one Node process running, so having a singleton is not bad. This is what saves machine resources compared to other architectures (like Java) where each request gets its own thread, memory, etc. The single instance is just the means of accessing the DB.
Also remember that Node.js is good at lots of I/O happening at once. It takes quite a bit to slow the whole thing down. Even though the DB is a singleton, that may not be an issue for your application. There may never be a bottleneck at the DB level, or there may not be a (noticeable) delay with simultaneous requests. Rather than each request building a DB connection (which could be costly in time and resources), the DB connection is shared. Ease of getting started early in a project is more valuable than figuring out highly-scalable things (which could have different solutions when the time comes).
One last thing. The singleton is an instance of Sequelize. Sequelize is a really popular Node.js ORM, and it can even be configured with a connection pool. Then your single DB instance is managing requests via multiple connections, which will take you to the next level of scalability.
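As a minimal sketch (not from the boilerplate; database name, credentials, and dialect are placeholders), a pooled Sequelize singleton could look like this:

// db.js — Node's module cache makes this a de facto singleton:
// every require('./db') returns the same instance
const { Sequelize } = require('sequelize');

const sequelize = new Sequelize('mydb', 'user', 'password', {
  host: 'localhost',
  dialect: 'postgres',
  pool: {
    max: 10,        // at most 10 concurrent connections
    min: 0,
    acquire: 30000, // ms to wait for a free connection before erroring
    idle: 10000     // ms a connection may idle before being released
  }
});

module.exports = sequelize;

Concurrent requests then share the one instance while the pool multiplexes their queries across separate underlying connections.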
unwanted side effects if all users are accessing the same db connection
That's not how server apps usually work. Your users perform requests to an HTTP API, which then usually needs a single connection to a database throughout the request lifetime, fetching all kinds of information from all kinds of tables.
Even if you had a process that handles multiple users at the same time, that could all happen over the same database connection.
Unless your app needs more than one DB connection, there are no real-world adverse or positive consequences to using a singleton for it, besides the fact that it can look silly.

Connect to MongoDB Atlas Cluster db with react-native app

I'm having a hard time understanding how to connect to a MongoDB Atlas cluster from my React Native app. What I'm trying to do is basically take the data from my login component (userName and password) and connect to the Atlas cluster db to see if the data is there.
I'm using React Native with Expo to create the app. My login page opens up and I put in the data.
I want to take that data and then use the following code (from the Atlas site connection string) to connect and check.
const MongoClient = require('mongodb').MongoClient;
const uri = "mongodb+srv://<userName>:<password>@testcluster1-dbdq3.mongodb.net/test?retryWrites=true&w=majority";
const client = new MongoClient(uri, { useNewUrlParser: true });
client.connect(err => {
  const collection = client.db("test").collection("devices");
  // perform actions on the collection object
  client.close();
});
Since React Native establishes a server, do I need to involve Express? I'm new to this, so I'm still trying to figure out which packages to utilize. Should I also install mongoose or mongodb, or both (from npm)? I'm trying to wrap my head around how this works from a basic perspective and which packages are required.
I want to check the userID and PW from my login page against the DB to see if the user exists. If the user doesn't exist, I'll have them fill out some info and register, which means writing a new user to my db.
So basically, I need to understand the code logic for:
Connecting to the db through my app, and when to perform this connection (when the app loads, or each time the login button is clicked)
Taking the data from my userName and password and searching the Atlas db to see if the user exists. If so, the next page loads.
If the username and password don't exist, writing the new user and password to the db.
Thanks
I think you should rewrite the code following the format suggested by mongodb here:
https://mongodb.github.io/node-mongodb-native/api-articles/nodekoarticle1.html
So essentially:
const MongoClient = require('mongodb').MongoClient;
// make sure the connection string is correct here, since it depends on whether
// you are running a standalone, replica set, or sharded cluster
const uri = "mongodb+srv://<userName>:<password>@testcluster1-dbdq3.mongodb.net/test?retryWrites=true&w=majority";
MongoClient.connect(uri, { useNewUrlParser: true }, function(err, client) {
  if (err) {
    // handle the error
  } else {
    var collection = client.db('test').collection('devices');
    // client.close() should be called after you are done performing actions
    // such as collection.update, etc.
  }
});
You can use any npm package with Expo if it works with RN (React Native), but you may need to detach in order to do so. Any npm package that includes native iOS or Android code will not work with Expo out of the box, unfortunately. And just because the MongoDB npm package mentions Node.js in their docs doesn't mean it will work on React Native. That's why MongoDB made a page specifically about React Native: https://docs.mongodb.com/realm/tutorial/react-native/
You may need to use the Realm package to connect to MongoDB from React Native. A rough sketch follows.
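As a hedged sketch of that route (based on the Realm JS SDK's v10-era API; 'my-app-id' and the function name are placeholders, and the backing Atlas App Services app must have email/password authentication enabled):

import Realm from 'realm';

// The App ID comes from the Atlas App Services / Realm dashboard
const app = new Realm.App({ id: 'my-app-id' });

async function loginEmailPassword(email, password) {
  // Realm checks the credentials server-side; no connection string
  // or MongoDB driver is needed inside the React Native app
  const credentials = Realm.Credentials.emailPassword(email, password);
  const user = await app.logIn(credentials);
  return user;
}

The alternative is to keep the Node driver code in a small Express (or similar) backend and have the app call it over HTTP, since the MongoDB driver itself is not meant to run inside React Native.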

Authenticating user in AWS Cognito User/Identity Pool with Google as identity provider

AWS provides two possible ways of dealing with Cognito:
"old one" via amazon-cognito-identity-js (and possibly amazon-cognito-auth-js) and
"new one" via aws-amplify (which inlcudes the above one)
After quite a bit of trouble and reverse engineering, I've successfully managed to sign in (receve back CognitoIdentityCredentials) using aws-amplify locally as part of the development effort.
The steps where (bear with me, as these are important for the questions to follow, and also might help someone):
Setup
Create a User Pool in Cognito console
Create a User Pool App Client in Cognito console
Create Google Web App in Google Console
Configure Google Web App to point to http://localhost:8080 (my local dev server)
Configure User Pool to use Google as an Identity Provider, supplying it with the Google Web App Client ID and Client secret from Google Console
Create an Identity Pool in the Cognito console and configure it to work with Google as an Identity Provider, supplying the Google Web App Client ID there as well
Implementation
Configure Amplify.Auth:
Amplify.configure({
  Auth: {
    identityPoolId: '<identity-pool-id>',
    region: '<region>',
    userPoolId: '<user-pool-id>',
    userPoolWebClientId: '<user-pool-web-client-id>'
  }
});
Inject Google API script:
const script = document.createElement('script');
script.src = 'https://apis.google.com/js/platform.js';
script.async = true;
script.onload = this.initGapi;
document.body.appendChild(script);
Init Google API:
window.gapi.load('auth2', function() {
  window.gapi.auth2.init({
    client_id: '<google-web-app-client-id>',
    scope: 'profile email openid'
  });
});
Allow, on a button click, for a Google user to sign in:
const ga = window.gapi.auth2.getAuthInstance();
const googleUser = await ga.signIn();
const {id_token, expires_at} = googleUser.getAuthResponse();
const profile = googleUser.getBasicProfile();
Use the profile, id_token, and expires_at from above to create a Cognito credentials session:
const user = {
  email: profile.getEmail(),
  name: profile.getName()
};
const credentials = await Auth.federatedSignIn(
  'google',
  {token: id_token, expires_at},
  user
);
At this point a CognitoIdentityCredentials object was returned, properly populated, with token and all...
Problem
Unfortunately, aws-amplify adds a whopping 190K to my application webpack bundle (GZIPped, minified, optimized), which made me choke on my coffee.
Question 1
Can this somehow be reduced by a Babel plugin I'm missing? (I'm guessing no, since AWS is apparently still in 1995 and configures everything on singleton Amplify and Auth objects.)
Question 2
Have I made this unnecessarily complicated, and is there a much more robust solution?
Question 3 (most important)
Can this be achieved using the "old way" amazon-cognito-identity-js, which is MUCH MUCH smaller?
I couldn't find, among all the use cases (https://github.com/aws/aws-amplify/tree/master/packages/amazon-cognito-identity-js/), a use case for social/federated login.
For me, the difference between
import Amplify from 'aws-amplify'
and
import Amplify from '@aws-amplify/core'
is ~500kB optimized and minified.
I think you also want
import Auth from '@aws-amplify/auth'
which adds only a little bit more.
But I agree, the aws-amplify package is really very large, and it's not easy to figure out how to use the core components directly (e.g. amazon-cognito-identity-js/es and amazon-cognito-auth-js/es).
You could try using modularized exports in AWS Amplify, as sketched below.
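A small sketch of that approach (package names as published on npm for the scoped Amplify packages; the configure arguments keep the same shape as in the question):

// Pull in only the categories you use instead of the whole aws-amplify bundle
import Amplify from '@aws-amplify/core';
import Auth from '@aws-amplify/auth';

Amplify.configure({ /* same keys as before */ });
Auth.configure({ /* Auth-specific config, if not passed via Amplify.configure */ });

With a tree-shaking-friendly bundler this tends to shrink the bundle compared to importing aws-amplify wholesale, though results vary by version.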

Could not load the default credentials? (Node.js Google Compute Engine tutorial)

SITUATION:
I follow this tutorial: https://cloud.google.com/nodejs/tutorials/bookshelf-on-compute-engine
Everything works fine until I do npm start and go to:
http://localhost:8080
I am met with the following text on the blank page:
Could not load the default credentials. Browse to https://developers.google.com/accounts/docs/application-default-credentials for more information.
Which makes no sense, since I am using OAuth. I followed the link and read the page, but I have no GOOGLE_APPLICATION_CREDENTIALS field anywhere, and there is nothing about it in the tutorial.
QUESTION:
Could you please reproduce the steps and tell me if you get the same result?
(It takes 5 minutes.)
If not, what could I have done wrong?
Yes, I had the same error. It's annoying, because the Google Cloud Platform docs for their "getting started" bookshelf tutorial do not mention this anywhere, which means that any new developer who tries the tutorial will see this error.
Read this:
https://developers.google.com/identity/protocols/application-default-credentials
I fixed this issue by running:
gcloud auth application-default login
In order to run gcloud auth application-default login:
Visit https://cloud.google.com/sdk/install
1) Install the SDK on your computer
2) That will enable you to run the command
3) Log in with your associated Gmail account and you are good to go!
This will make you log in, and after that your local code will use that authentication.
There are 2 solutions for this problem. One option, as mentioned by others, is to use gcloud auth application-default login.
The second option is to set the environment variable GOOGLE_APPLICATION_CREDENTIALS. It should point to a file that defines the credentials. To get this file, follow these steps:
Go to the API Console Credentials page.
From the project drop-down, select your project.
On the Credentials page, select the Create credentials drop-down, then select Service account key.
From the Service account drop-down, select an existing service account or create a new one.
For Key type, select the JSON key option, then select Create. The file automatically downloads to your computer.
Put the *.json file you just downloaded in a directory of your choosing. This directory must be private (you can't let anyone get access to this), but accessible to your web server code.
Set the environment variable GOOGLE_APPLICATION_CREDENTIALS to the path of the downloaded JSON file.
See https://developers.google.com/identity/protocols/application-default-credentials for details
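Once that variable is set, the Google client libraries pick the key up automatically. As a minimal sketch (using @google-cloud/storage purely as an example library):

// No explicit credentials: the library resolves Application Default
// Credentials from GOOGLE_APPLICATION_CREDENTIALS (or gcloud's login)
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

storage.getBuckets().then(([buckets]) => {
  buckets.forEach((b) => console.log(b.name));
});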
Create a service account key and download the JSON file: https://console.cloud.google.com/apis/credentials/serviceaccountkey
Add this to your ENV file:
GOOGLE_APPLICATION_CREDENTIALS="<PATH_TO_SERVICE_ACCOUNT_JSON_FILE>"
E.g:
GOOGLE_APPLICATION_CREDENTIALS=/Users/hello/Documents/ssh/my-10ebbbc8b3df.json
I was facing the same issue. It got fixed with the following command:
gcloud auth application-default login
It stores default gcloud credentials on your system and uses those.
I got this error because initially I did the following:
var admin = require("firebase-admin");
admin.initializeApp(); // I didn't pass anything because the .firebaserc file includes the app name
It worked when I deployed the functions, but not with firebase serve. So this is how I solved it:
Go to the Firebase project settings (click on the gear icon in the side nav).
Click on Service accounts.
Copy the Admin SDK configuration snippet after selecting your programming language.
Ex (node.js):
var admin = require("firebase-admin");
var serviceAccount = require("path/to/serviceAccountKey.json");

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  databaseURL: "https://your-domain.firebaseio.com"
});
Now we need to add the serviceAccountKey.json file.
Click on Manage service account permissions in the top right corner.
You will see the service accounts for your project. In the table, find the row whose Name column is firebase-adminsdk, click the action dots in that row, and select Create key.
In the pop-up dialog, select json as the Key type and press the Create button.
You will be prompted to download the file; save it to the functions directory in your project (you can customize this as you want, and if you are pushing to GitHub, make sure to ignore that file).
Now, if you saved it in the same directory where you call initializeApp(), access the file like ./socialape-15456-68dfdc857c55.json (in my case both files live in the functions directory: functions/index.js and the service-account JSON; I initialized my Firebase Admin SDK in index.js).
Ex (node.js):
const functions = require('firebase-functions');
var admin = require("firebase-admin");
var serviceAccount = require("./myapp-15456-68dfdc857c55.json");

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  databaseURL: "https://myapp-15456.firebaseio.com"
});
It's a good idea to create a .env file, keep the key's path there, and access it as others mentioned. I leave that part to you.
Hope this helps someone on the planet. Regards!
If you're running the app locally, then the gcloud beta auth application-default login command should suffice for acquiring local credentials (I updated the tutorial to say so).
When running the app on Google Compute Engine, if the Compute Engine instance was created with the proper scopes (cloud-platform should be sufficient) then the app will authenticate with Google Cloud Platform APIs automatically without any extra work on your part.
Go here: https://firebase.google.com/docs/admin/setup#initialize_the_sdk and follow the instructions to create a private key.
Then, after you have downloaded your private key, open a command prompt in the project directory and run the following command:
set GOOGLE_APPLICATION_CREDENTIALS=C:\YOUR-PATH\YOUR-KEY.json
Use this to solve your issue; it actually works:
Just pass the credential parameter, giving it a reference to your key.
const serviceAccount = require('../key.json');

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount)
});
Another solution I found: in your package.json, add an export command like this:
"scripts": {
  "start": "export GOOGLE_APPLICATION_CREDENTIALS='./gcloud.json' && node ./bin/www --exec babel-node --presets babel-preset-env"
},
You have to create an object for your SessionsClient.
Here are some steps, so you can run your code like a charm.
Go into your Dialogflow dashboard.
Click on the settings gear icon (left navbar, top-right).
In the General tab, click the Service Account link (it will redirect you to another screen).
If you already have a service account, skip the next step.
Create a service account (top-center + icon button).
Now you have a service account in the list.
In the Actions field, press the 3 vertical dots and create a key.
Download the JSON file to your local computer.
Assign the object to your sessionClient.
const sessionClient = new dialogflow.SessionsClient({
  keyFilename: "/var/www/html/moqatrips/dialog-flow.json"
});
For all the people using Firebase: what worked for me was passing the credentials to the KeyManagementServiceClient constructor.
const serviceAccount = require('../keys/file.json'); // <- your firebase credentials

const client = new KeyManagementServiceClient({
  credentials: serviceAccount,
});
I also had this error. In my case, I had not set keyFilename (which stores the credentials of the API) in the sessionClient object for my Node.js app.
const sessionClient = new dialogflow.SessionsClient({
  keyFilename: "./keyCredentials.json"
});
const sessionPath = sessionClient.sessionPath(projectId, sessionId);
To download 'keyCredentials.json', go to:
https://console.cloud.google.com/apis/credentials/serviceaccountkey
Also add the path of this file to your system variables. On Windows, open PowerShell and type:
$env:GOOGLE_APPLICATION_CREDENTIALS = "[PATH_TO_SERVICE_ACCOUNT_JSON_FILE]"
You may find yourself in another part of the world -- and land here. I'm adding to a three-year-old question because its keywords matched my issue and the preceding answers helped me, although none describe my issue.
firebase deploy --only functions --debug
produced
[2020-12-02T08:31:50.397Z] FirebaseError: HTTP Error: 429, Unknown Error
Error: Could not read source directory. Remove links and shortcuts and try again.
I could not find anything wrong with the source directory; that message was a red herring.
Examining the error in detail, from the top, lead to:
Our systems have detected unusual traffic from your computer network. This page checks to see if it's really you sending the requests, and not a robot.
The block will expire shortly after those requests stop.
Out of curiosity and exhaustion, I waited first. The wait reset duration is more than 30 minutes, so I pursued the captcha to prove my enduring humanity, which did register eventually after some OAuth warnings.
Although this question has been answered multiple times, I found myself in a situation not explained here.
After I created the variable $GOOGLE_APPLICATION_CREDENTIALS, I was still getting the same error as Coder1000.
However, I was running both nodemon and npm run dev in two separate Terminal sessions, neither of which was aware of the variable.
Once I shut those tabs down, opened new ones, and ran the commands again, the application was able to access the variable.
Download the Cloud SDK installer from this site: https://cloud.google.com/sdk/docs/install
Then run this command: gcloud auth application-default login
If anyone ran into this issue like me and doesn't want to set the variable each time before running their code, it's best to set the environment variable manually. Name it GOOGLE_APPLICATION_CREDENTIALS and browse to the downloaded JSON file. If you don't know the steps, follow this: https://docs.oracle.com/en/database/oracle/machine-learning/oml4r/1.5.1/oread/creating-and-modifying-environment-variables-on-windows.html#GUID-DD6F9982-60D5-48F6-8270-A27EC53807D0

Google Cloud Storage change notifications with Node.js

I have a Firebase storage bucket, and I would like to use the Node.js google-cloud notification API in order to listen to changes in the storage.
What I have so far:
const gcloud = require('google-cloud');
const storage = gcloud.storage({
  projectId: 'projectId',
  credentials: serviceAccount
});
const storageBucket = storage.bucket('bucketId');
Now from what I understand I have to create a channel in order to listen to storage changes.
So I have:
const storageBucketNotificationChannel = storage.channel('channelId', 'resourceId');
This is the threshold where the docs stop being clear, as I can't figure out what channelId and resourceId stand for.
Nor do I understand how to start listening to channel changes itself. Are there any lifecycle-type methods to do so?
Can I do something like this?
storageBucketNotificationChannel.onMessage(message => { ... })
Based on the existing documentation of the Google Cloud Node.js client and the feedback in this GitHub issue, there is presently no way for the Node client to create a channel or subscribe to object change notifications.
One of the reasons is that the machine using the client may not necessarily be the machine on which the application runs, which poses a security risk. One can still, however, subscribe to object change notifications for a given bucket and have the notifications received by a Node.js GAE application.
Using Objects: watchAll JSON API
When using gsutil to subscribe, gsutil sends a POST request to https://www.googleapis.com/storage/v1/b/bucket/o/watch where bucket is the name of the bucket to be watched. This is essentially a wrapper around the JSON API Objects: watchAll. Once a desired application/endpoint has been authorized as described in Notification Authorization, one can send the appropriate POST request to said API and provide the desired endpoint URL in address. For instance, address could be https://my-node-app.example.com/change.
The Node/Express application service would then need to listen to POST requests on the path /change for notifications resembling this, and act upon that data accordingly. Note that the application should respond to the request as described in Reliable Delivery for Cloud Storage, so that delivery is retried if it failed or the retries stop once it succeeded. A sketch of such an endpoint follows.
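As a hedged sketch of that receiving endpoint (the path and port are placeholders; the header names follow the Object Change Notification documentation):

const express = require('express');
const app = express();
app.use(express.json());

// Cloud Storage POSTs change notifications to the address registered
// via Objects: watchAll; here that address path is assumed to be /change
app.post('/change', (req, res) => {
  const state = req.get('X-Goog-Resource-State'); // e.g. 'sync', 'exists', 'not_exists'
  const resourceId = req.get('X-Goog-Resource-Id');
  console.log(`Notification for ${resourceId}: ${state}`, req.body);
  // Reply with a success status so Cloud Storage stops retrying delivery
  res.status(200).end();
});

app.listen(8080);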
