Hide privateKey from Node.js on Heroku? - javascript

I've integrated the Braintree Drop-in UI for checkout payments in Node.js. It's only a demonstration that I needed to create. I had no issues; it was easy enough. The only thing I'd like to do is hide the merchantId, publicKey and privateKey from the code. I'd like to add them directly to Heroku Config Vars. I know how to do all this in Python, but I have no idea how to do it in JavaScript. Here is the code, with the keys changed:
const express = require('express');
const router = express.Router();
const braintree = require('braintree');

router.post('/', (req, res, next) => {
  const gateway = new braintree.BraintreeGateway({
    environment: braintree.Environment.Sandbox,
    // Use your own credentials from the sandbox Control Panel here
    merchantId: 'h43jgh5g3gl4543',
    publicKey: 'hj45j4h5lh45hl4h5l',
    privateKey: 'b5hbhbb45bhbh4kfndnfdkkfnd'
  });

  // Use the payment method nonce here
  const nonceFromTheClient = req.body.paymentMethodNonce;

  // Create a new transaction for $10
  const newTransaction = gateway.transaction.sale({
    amount: '10.00',
    paymentMethodNonce: nonceFromTheClient,
    options: {
      // This option requests the funds from the transaction
      // once it has been authorized successfully
      submitForSettlement: true
    }
  }, (error, result) => {
    if (result) {
      res.send(result);
    } else {
      res.status(500).send(error);
    }
  });
});

module.exports = router;
How can I assign those values to variables that I can then pass to Heroku Config Vars?
Thank you very much for your help!
EDITING THE INITIAL POST:
Sorry, I need to add more info to this post. I followed what a user suggested by changing the Node.js code this way:
router.post('/', (req, res, next) => {
  const gateway = new braintree.BraintreeGateway({
    environment: braintree.Environment.Sandbox,
    // Use your own credentials from the sandbox Control Panel here
    merchantId: process.env.MERCHANT_ID,
    publicKey: process.env.PUBLIC_KEY,
    privateKey: process.env.PRIVATE_KEY
  });
I added those values to Heroku, and it looked like it was working, but I also changed the sandbox value in index.hbs:
<script>
  var button = document.querySelector('#submit-button');
  braintree.dropin.create({
    authorization: process.env.SANDBOX_KEY,
    container: '#dropin-container',
    paypal: {
      flow: 'checkout',
      buttonStyle: {
        color: 'blue',
        shape: 'rect',
        size: 'medium'
      },
      amount: '10.00',
      currency: 'USD'
    }
  },
I replaced the value 'sanbox_34920hfjh34i23h4oi3' with process.env.SANDBOX_KEY and added that value in Heroku.
But now the interface doesn't work anymore. Why?
Thank you very much for your precious help!

You can add config vars in the Heroku dashboard or with the CLI; more on that here: Configuration and Config Vars. Once you've done that, you can access them in your Node app as process.env.NAME_OF_VARIABLE. Note that this only works in server-side code: process.env does not exist in the browser, which is why replacing the value inside index.hbs broke the Drop-in UI. The server has to pass that value into the page it renders.


Is it possible to implement a socket.io connection in an Express route?

I implemented a payment service that depends on one of my Express routes as a callback route. Whenever a user wants to make a payment, they are redirected to the payment service link, which is on an entirely different domain from my backend/frontend. After a successful payment, the user is redirected back to my Express GET route (the callback route); in this route I give users their asset and then redirect them to the frontend.
EXPECTATION
My expectation is that whenever a user makes a purchase, I want a real-time update on the frontend so others can see some details about the purchase without refreshing their browser.
WHAT I'VE TRIED
I thought socket.io would solve this, like adding a socket connection in the route to then push the data to the frontend. But after doing a lot of research, no solution seems to work for me.
HERE IS A SIMPLE CODE OF WHAT I'VE TRIED
=============================== server.js ========================
const express = require("express")
const app = express()
const http = require("http")
const cors = require("cors")
const session = require("express-session")
const runSocket = require("./runSocket")
const { Server } = require("socket.io")

app.use(cors())
app.use(express.json())

const server = http.createServer(app)
server.listen(3004, () => {
  console.log("SERVER IS RUNNING")
})

const io = new Server(server, {
  cors: {
    origin: "http://localhost:3000",
    methods: ["GET", "POST"],
  },
})

const postRoute = require("./routes/postData")(io)
app.use("/post-data", postRoute)
==================================== postData Route ======================================
module.exports = function (io) {
  router.post("/", async (req, res) => {
    const data = req?.body?.data.message
    const room = req?.body?.data?.room
    io.on("connection", (socket) => {
      console.log("Socket Running...")
      socket.to(room).emit("the_message", data)
    })
    console.log("Under socket...")
    return res.status(200).json({ data: req.body.data })
  })
  return router
}
This log in the postData route never prints: console.log("Socket Running...")
EXPECTATION
My expectation is that whenever a user makes a purchase, I would like to make a real-time update on the frontend for others to see some details about the purchase.
UPDATE: The payment gateway config looks something like this:
const { body } = await got.post("https://payment-provider-link", {
  headers: { Authorization: "Bearer token for payment" },
  json: {
    email: "email@gmail.com",
    amount: amount * 100,
    initiate_type: "inline",
    callback_url: `${BackendBaseUrl}/payment-callback`, // <<<============
  },
})
Okay, so you don't need the io.on("connection") in your route. Remove that piece of code and simply change it to io.to(room).emit("the_message", data). Also make sure the other sockets have joined the room you're trying to emit to, otherwise they won't receive the data.
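To make the fix concrete, the emit can be factored into a small helper; the function name is mine, and the route sketch follows the question's wiring:

```javascript
// The route was wrapping the emit in io.on("connection"), which only
// fires for brand-new socket connections, so nothing ran per request.
// The fix is to emit straight to the room.
function notifyRoom(io, room, data) {
  io.to(room).emit('the_message', data);
}

// Sketch of the route using it (io is injected from server.js):
//   router.post('/', async (req, res) => {
//     const data = req?.body?.data?.message;
//     const room = req?.body?.data?.room;
//     notifyRoom(io, room, data);
//     return res.status(200).json({ data: req.body.data });
//   });
```

Sockets still need to call socket.join(room) on connection (in server.js, not in the route) for the emit to reach them.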

Pusher working only after refreshing page

I'm implementing Pusher in a web application using Next.js. It works as expected in my development environment, but it does not work correctly when I deploy it to Vercel. I can only see the results when I refresh the page in the browser.
This is the implementation for the client:
useEffect(() => {
  const pusher = new Pusher(`${process.env.PUSHER_KEY}`, {
    cluster: `${process.env.PUSHER_CLUSTER}`,
    useTLS: true
  });
  const channel = pusher.subscribe('franks-auto-spa');
  channel.bind('cancel-wash', data => {
    console.log(data.date);
    removeWash(data.date);
  });
}, []);
And this is the implementation for the API:
export default async (req, res) => {
  const connection = await mysql.createConnection(process.env.DATABASE_URL);
  console.log(req.body.date);
  const result = await connection.query('DELETE FROM ongoing WHERE date = ?', [req.body.date]);
  console.log(result);
  const pusher = new Pusher({
    appId: `${process.env.PUSHER_ID}`,
    key: `${process.env.PUSHER_KEY}`,
    secret: `${process.env.PUSHER_SECRET}`,
    cluster: `${process.env.PUSHER_CLUSTER}`,
    useTLS: true
  });
  pusher.trigger('franks-auto-spa', 'cancel-wash', req.body);
  res.json({message: 'Wash deleted...'});
}
Am I missing any configuration in Vercel?
I've had this problem too, once or twice; Vercel times out after a little bit (10 seconds on the Hobby plan), so using a database with Vercel probably isn't going to fly. Have you tried switching to the Pro plan? Or you could switch to Heroku, which not only has a longer timeout (30 seconds), but also supports WebSockets.
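Independent of the plan, one thing worth ruling out on serverless (my assumption, not confirmed by the answer above): pusher.trigger returns a promise, and responding before it settles can let the platform freeze the function mid-flight. Awaiting it first is cheap insurance. A sketch, with the helper and its wiring being illustrative while the channel/event names follow the question:

```javascript
// deleteAndNotify awaits the Pusher call before replying, so a
// serverless runtime cannot suspend the function before the event
// is actually delivered. `pusher` and `res` are injected.
async function deleteAndNotify(pusher, res, body) {
  await pusher.trigger('franks-auto-spa', 'cancel-wash', body);
  res.json({ message: 'Wash deleted...' });
}
```

In the API route above, that means `await pusher.trigger(...)` before the final `res.json(...)`.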

fastify and prisma with postgres session storage

I am building a Node.js API that uses Fastify, Prisma and Postgres. I have the API working with fastify-cookies and fastify-session, and I can get cookies just fine, but I need to be able to store the session cookies in the database. I saw a tutorial on doing this, but it was without Prisma, so I'm lost on how to connect fastify-session to the Prisma database pool.
I use the Prisma client to connect to the database for my normal calls in my routes: const data = await prisma.model.create({});
server.js
const fastify = require('fastify')({ logger: true });
const PORT = process.env.PORT || 3000;

// Session state
fastify.register(require('./sessions'));

// Register all our routes here.
...

// Startup code for the fastify server.
const start = async () => {
  try {
    await fastify.listen(PORT, '0.0.0.0');
  } catch (error) {
    fastify.log.error(error);
    process.exit(1);
  }
};

// Start the fastify server.
start();
sessions.js
const cookie = require('fastify-cookie');
const session = require('fastify-session');
const fp = require('fastify-plugin');

/**
 * @param {import('fastify').FastifyInstance} fastify
 */
const plugin = async (fastify) => {
  // All plugin data here is global to fastify.
  fastify.register(cookie);
  fastify.register(session, {
    secret: process.env.SESSION_SECRET,
    store: new SessionStore({
      tableName: 'UserSession',
      pool: ???, // <--------------------------------- how to connect?
    }),
    saveUninitialized: false,
    cookie: {
      httpOnly: true,
      secure: false,
    },
  });

  fastify.addHook('preHandler', (req, reply, next) => {
    req.session.user = {};
    next();
  });
};

module.exports = fp(plugin);
If you want to use the Prisma connection pool, you would have to create a session storage library similar to connect-pg-simple, or modify its codebase to accept a Prisma connection. That is a non-trivial implementation, and I don't think it would make sense without exceptional circumstances.
I would suggest creating a new pg.Pool or pgPromise instance and connecting with that, as shown in the tutorial video you linked to. There's no reason you can't have two separate connection pools open to the same database (one with Prisma and one with pg.Pool).
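As a sketch of that suggestion, with the store class injected so the wiring is visible (the factory function, option names, and store shape are assumptions modeled on connect-pg-simple, not a confirmed API):

```javascript
// Build the fastify-session options around an injected pool and store
// class. In sessions.js you'd create the pool alongside Prisma with:
//   const { Pool } = require('pg');
//   const pool = new Pool({ connectionString: process.env.DATABASE_URL });
// Prisma keeps its own pool; two pools on one DATABASE_URL is fine.
function makeSessionConfig({ pool, secret, Store }) {
  return {
    secret,
    store: new Store({ tableName: 'UserSession', pool }),
    saveUninitialized: false,
    cookie: { httpOnly: true, secure: false },
  };
}
```

Then `fastify.register(session, makeSessionConfig({ pool, secret: process.env.SESSION_SECRET, Store: SessionStore }))` replaces the literal options object with the `???` placeholder.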

How to take clientside Javascript arrays and POST through a Node.js API into a MongoDB database?

I have a single webpage that initially has two form inputs: one for a list of names and another for the title of a game. I've written some JavaScript/jQuery that takes the X names and creates X more form inputs, one for each person's score. On clicking the names/scores form's submit button, the JavaScript then creates the following variables:
gameTitle = Monopoly
users = [Bob, Bill, Jim, Janet]
scores = [100, 110, 90, 80]
positions = [2, 1, 3, 4]
I then have a MongoDB schema set up as such:
const SessionSchema = new mongoose.Schema({
  gameTitle: String,
  users: [],
  scores: [],
  positions: []
});
And a Node.js handler as such:
const express = require('express');
const router = express.Router();
const bodyParser = require('body-parser');
const timestamps = require('mongoose-timestamp');

router.use(bodyParser.urlencoded({ extended: true }));
router.use(bodyParser.json());

const Session = require('./Session');

// Post a session to the database
router.post('/', function(req, res) {
  Session.create({
    gameTitle: req.body.gameTitle,
    users: req.body.user,
    scores: req.body.score,
    positions: req.body.position
  },
  function (err, session) {
    if (err) return res.status(500).send("There was a problem adding the information to the database");
    res.status(200).send(session);
  });
});
Using Postman I can see that posting works when I use this format (screenshots: Postman POST, Postman GET).
How do I take the created JavaScript variables and, upon clicking the names/scores form's submit button, POST them through the API and into the MongoDB database?
Apologies if I have missed any important information/code; I haven't fully wrapped my head around how the backend stuff works.
You need to register your Schema:
const mongoose = require('mongoose');
const Schema = mongoose.Schema;

const SessionSchema = new mongoose.Schema({
  gameTitle: String,
  users: [],
  scores: [],
  positions: []
});

module.exports = mongoose.model('session', SessionSchema);
And here you need to use the Mongo schema model, like this:
const express = require('express');
const router = express.Router();
const bodyParser = require('body-parser');
const timestamps = require('mongoose-timestamp');

router.use(bodyParser.urlencoded({ extended: true }));
router.use(bodyParser.json());

require('./Session'); // register the mongo model
const mongoose = require('mongoose');
const Session = mongoose.model('session');

// Post a session to the database
router.post('/', function(req, res) {
  // Build a model instance: a plain object has no .save() method
  const new_session = new Session({
    gameTitle: req.body.gameTitle,
    users: req.body.user,
    scores: req.body.score,
    positions: req.body.position
  });
  new_session.save((err, saved_session) => {
    if (err) {
      res.json(err);
    } else {
      res.json(saved_session);
    }
  });
});
Sounds like you have the backend working. What you're missing is the API request. Since your website is not under the same host:port as your API server, you'll face CORS issues when calling it from the browser; we'll get to that later.
First, you'll be making an API call. You can use axios or fetch. Let's go with fetch here:
fetch(url, {
  body: JSON.stringify(yourJavascriptVariablesAsAnObject),
  headers: {
    'Content-Type': 'application/json'
  },
  method: 'POST',
})
  .then(response => {
    // here you can check for status if you want.
    // ...
    return response.json(); // assuming server returns JSON.
  })
  .then(responseBody => {
    // Do something with the server's response body
  });
Now for the CORS problem: if your client app is from create-react-app, or at least you're using webpack-dev-server, you can proxy requests really easily.
If you're not, then you need to allow CORS on your Node.js server. The simplest way is to use a library.
PS: CORS basically means you can't make requests from a browser to a service living on a different `url:port`, unless that service explicitly says it's OK.
A third option would be putting both the UI and the server project behind a web server like Nginx and proxying the requests, but that sounds too complex for what you need.
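For the "allow CORS on your server" option, a minimal sketch of the headers involved (the header names are the standard CORS response headers; the helper itself is illustrative, and with Express you'd normally just `app.use(require('cors')())`):

```javascript
// Attach the response headers that tell the browser a cross-origin
// caller from `origin` is allowed to read this response.
function allowCors(res, origin) {
  res.setHeader('Access-Control-Allow-Origin', origin);
  res.setHeader('Access-Control-Allow-Methods', 'GET, POST, OPTIONS');
  res.setHeader('Access-Control-Allow-Headers', 'Content-Type');
  return res;
}
```

The `Access-Control-Allow-Headers` entry matters here because the fetch call above sets `Content-Type: application/json`, which triggers a preflight OPTIONS request.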

Hapi.js - adding mechanism to check every route

I am trying to implement a mechanism that will run before any route is hit. In that mechanism I want to take a value from the header and check for authentication.
I have come up with this:
server.js:
// Create a server with a host and port
'use strict';

var Hapi = require('hapi');
var mongojs = require('mongojs');

var plugins = [
  require('./routes/entities')
];

var server = new Hapi.Server();
server.connection({
  port: 3000
});

// Connect to db
server.app.db = mongojs('hapi-rest-mongo', ['entities']);

server.app.checkHeader = function (request) {
  var header = request.headers['x-authorization'];
  if (header === "letmein") {
    return true
  }
  return false
};

// Load plugins and start server
server.register(plugins, function (err) {
  if (err) {
    throw err;
  }
  // Start the server
  server.start(function (err) {
    console.log('Server running at:', server.info.uri);
  });
});
and in routes.entities:
'use strict';

var Boom = require('boom');
var uuid = require('node-uuid');
var Joi = require('joi');

exports.register = function (server, options, next) {
  var db = server.app.db;

  server.route({
    method: 'GET',
    path: '/entities',
    handler: function handler(request, reply) {
      if (!server.app.checkHeader(request)) {
        return reply(Boom.unauthorized());
      }
      //request.server.myFunc();
      db.entities.find(function (err, docs) {
        if (err) {
          return reply(Boom.wrap(err, 'Internal MongoDB error'));
        }
        reply(docs);
      });
    }
  });
So, in short: while starting the server I registered my function server.app.checkHeader, and in the routes I call it, passing the request object (which contains information about the headers).
While this works, I have a feeling I am not following best practices with Hapi.
How could I do it more elegantly?
There are a few options.
You can, of course, tap into the request lifecycle; note the events that occur in the pipeline prior to the route handler.
However, I'd urge you to consider implementing an auth strategy that can be set as the default for all routes, or selectively on appropriate routes.
The best way to require authentication for all or selected routes is to use hapi's integrated functionality.
You should set a default authentication strategy that is applied to each route handler. The sample below uses basic auth; you'd want to create a custom authentication strategy for hapi to check your x-authorization header.
const Hapi = require('hapi')
const BasicAuth = require('hapi-auth-basic')

const server = new Hapi.Server()

server.register(BasicAuth, function (err) {
  if (err) {
    console.log('error', 'failed to install plugins')
    throw err
  }

  // TODO: add authentication strategy & set as default
  server.auth.strategy('simple', 'basic', true, { validateFunc: basicValidationFn })

  // or set strategy separately as default auth strategy
  server.auth.strategy('simple', 'basic', { validateFunc: basicValidationFn })
  server.auth.default('simple')

  // TODO: add routes

  server.start(function (err) {
  })
})
You can also hook into hapi's request lifecycle and extend it at given points. Extending the request lifecycle should be done with plugins:
register: function (server, options, next) {
  // do some processing before 'onPreAuth'
  // or pick another extension point
  server.ext('onPreAuth', (request, reply) => {
    // your functionality
  })
}
Hope that helps!
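For the x-authorization header specifically, the custom strategy boils down to a scheme whose authenticate function does what checkHeader did. A sketch against hapi's callback-era API as used in the question (the factory name and option names are mine; Boom is injected to keep the sketch dependency-free):

```javascript
// makeAuthenticate builds the authenticate(request, reply) function a
// custom hapi scheme needs: reply.continue({ credentials }) on success,
// reply(Boom.unauthorized(...)) on failure.
function makeAuthenticate(options, Boom) {
  return function authenticate(request, reply) {
    const header = request.headers['x-authorization'];
    if (header === options.token) {
      return reply.continue({ credentials: { token: header } });
    }
    return reply(Boom.unauthorized(null, 'header-token'));
  };
}

// Registering it (sketch):
//   server.auth.scheme('header-token', (server, options) => ({
//     authenticate: makeAuthenticate(options, require('boom'))
//   }));
//   server.auth.strategy('header', 'header-token', { token: 'letmein' });
//   server.auth.default('header');
```

With the default set, the `if (!server.app.checkHeader(request))` guard disappears from every handler; hapi rejects unauthenticated requests before they reach the route.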
