Write file in NodeJS with credentials - javascript

I am trying to write a file in Node.js and it works fine using the code below:
writeFile() {
  // assumes fs has been required elsewhere: const fs = require("fs");
  // backslashes must be escaped in JS strings, so the UNC path
  // \\server\folder is written as "\\\\server\\folder"
  const sharedFolderPath = "\\\\server\\folder";
  fs.writeFile(sharedFolderPath, templatePath, (err) => {
    if (err) {
      console.error(err);
    } else {
      console.info("file created successfully");
    }
  });
}
It writes into the shared drive because my account has permissions, but I want to do it with a superuser account and I can't work out how to inject the superuser credentials.
The same code fails when run from a different user's machine. Is there any way to write a file to a shared folder in Node.js using explicit credentials?

It looks like you are using Windows and writing into a shared folder. The superuser in Windows is LOCAL_SYSTEM, but it only has superuser privileges on the local machine (otherwise a compromised LOCAL_SYSTEM could wreak havoc across your network). In other words, running an application as Administrator would not grant it privileges to bypass shared folder permissions.
The easiest solution would be to make that shared directory writable by the users you intend to run this application under.
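If you really do need to connect with explicit credentials, one workaround on Windows is to map the share with net use before writing to it. A rough sketch, where the share path, account and password are placeholders you would replace yourself, and keeping the password out of source code is up to you:

const { execFile } = require("child_process");
const fs = require("fs");

const share = "\\\\server\\folder";          // hypothetical UNC share
const user = "DOMAIN\\superuser";            // hypothetical account
const password = process.env.SHARE_PASSWORD; // never hard-code this

// Map the share under the given credentials, then write the file.
execFile("net", ["use", share, password, `/user:${user}`], (err) => {
  if (err) return console.error("net use failed:", err);
  fs.writeFile(`${share}\\output.txt`, "hello", (writeErr) => {
    if (writeErr) return console.error(writeErr);
    console.info("file created successfully");
  });
});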

Related

Can we call powershell command from browser extension in Javascript?

I know this is an absolute no-go security-wise. But my customer is going to use a SAPUI5 application in the browser within the intranet, upon which my preinstalled browser extension will be invoked (using "matches"). The extension should simply collect the printer names on the local workstation and supply them back to the application, which uses them for further processing. I have the Node.js code below, which returns the printer names, but how can I run this in the browser engine?
const { exec } = require('child_process');

exec('get-printer | Select-Object Name | ConvertTo-Json', { shell: 'powershell.exe' }, (error, stdout, stderr) => {
  console.log(stdout);
});
You can't. You will need to expose your Node.js code as an API which can be called from the browser environment. You could try exposing it as a REST endpoint inside a Node.js Express app.
Check this example: https://www.geeksforgeeks.org/how-to-connect-node-js-with-react-js/
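A minimal sketch of what that could look like, assuming Express is installed and the /printers endpoint name is just an example:

const express = require("express");
const { exec } = require("child_process");

const app = express();

// Hypothetical endpoint: runs the PowerShell command on the machine
// hosting this server and returns the printer names as JSON.
app.get("/printers", (req, res) => {
  exec(
    "get-printer | Select-Object Name | ConvertTo-Json",
    { shell: "powershell.exe" },
    (error, stdout) => {
      if (error) return res.status(500).json({ error: error.message });
      res.type("application/json").send(stdout);
    }
  );
});

app.listen(3000, () => console.log("listening on http://localhost:3000"));

The browser side (or the SAPUI5 app) would then call this endpoint with fetch or axios instead of trying to spawn PowerShell itself.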

How to hide my database string in a github public repo

I uploaded my repo and it has a database string named 'dbstring' which I do not want to share with anyone.
I created a repository secret on GitHub named DBSTRING and set its value, but I don't know how to access it.
This is my uploaded code, which reveals my dbstring:
const dbstring = "mongodb+srv:/***********b.net";

mongoose.connect(dbstring, { useUnifiedTopology: true, useNewUrlParser: true });
const db = mongoose.connection;
db.once('open', () => {
  console.log('Database connected:', dbstring);
});
How can I replace dbstring with the secret value I created in my GitHub repo?
What you need to do is use environment variables: you can have a .env file (if you use dotenv) for each environment. That way you keep your database credentials safe on your computer and on the server, and it also lets you target different environments such as production, dev, test, etc. Make sure the .env file is added to your .gitignore.
It's also important that this code is executed on the server side, otherwise anyone with the dev tools open will be able to see the credentials as well. On the client side you then make a request (for example with axios) to the URL backed by that database connection.
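A minimal sketch of that setup, assuming the dotenv package and a .env file containing a DBSTRING entry (the variable name here is just an example):

// .env (kept out of git via .gitignore)
// DBSTRING=mongodb+srv://user:password@cluster.example.net/mydb

require("dotenv").config();          // loads .env into process.env
const mongoose = require("mongoose");

const dbstring = process.env.DBSTRING;

mongoose.connect(dbstring, { useUnifiedTopology: true, useNewUrlParser: true });
mongoose.connection.once("open", () => {
  console.log("Database connected");
});

In GitHub Actions the same variable can be injected from the repository secret, e.g. env: DBSTRING: ${{ secrets.DBSTRING }}, so the value never appears in the repo.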
If the .env approach works for you, you can additionally encrypt the file before uploading it to GitHub, for example by creating an env-production file, encrypting it, and decrypting it wherever the repo is used. You can also add that step to your CI/CD pipeline.

How to have dynamic .env files instead of hard coded localhost?

I have 2 .env files (one for Vue and one for Laravel). Both of them have localhost hard-coded inside of them. That means I can't access my application from another computer on my network. It would be nice if it were dynamic (except for production).
For example, if I go to my other PC and access my site at http://192.168.1.100:47344, it won't work because it is hardcoded to localhost (especially the frontend calls).
But I can't write JavaScript or PHP inside .env files to change localhost to something like window.location.host (or $_SERVER['SERVER_ADDR'] for PHP). I can't find the solution.
My Vue .env
VITE_SERVER_URL=http://localhost:41166
VITE_APP_ENV=dev
And my Laravel .env
APP_ENV=local
APP_CLIENT_URL=http://localhost:47344
APP_URL=http://localhost:41166
Even if your app URL is set to localhost, the application is still accessible as long as there is no firewall restricting it. You can also reach it through 127.0.0.1 or the local IP address assigned to your machine.
The way to get your local IP address depends on your system, but you can just run this in the Node console:
const { networkInterfaces } = require("os");

const nets = networkInterfaces();
for (const name of Object.keys(nets)) {
  for (const net of nets[name]) {
    if (net.family === "IPv4" && !net.internal) {
      console.log({ name, address: net.address });
    }
  }
}
Note that if you're running this in Docker, the host setting may need to be 0.0.0.0, because 127.0.0.1 is a loopback address.
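If you want the frontend to stop depending on the hardcoded value altogether, one option is to derive the API base URL from the page's own host at runtime and keep the .env value only as a fallback. A rough sketch, assuming the Laravel API stays on port 41166 and VITE_SERVER_URL remains the production value:

// api.js - hypothetical helper for the Vue frontend
const API_PORT = 41166; // assumed backend port

// Use the host the page was actually loaded from (localhost, 192.168.x.x, ...),
// and fall back to the value from .env outside of dev.
export const serverUrl =
  import.meta.env.VITE_APP_ENV === "dev"
    ? `${window.location.protocol}//${window.location.hostname}:${API_PORT}`
    : import.meta.env.VITE_SERVER_URL;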

How to use .pem file inside Amazon Lambda to access EC2 instance

I'm currently working on a project that takes place inside the AWS environment. I have configured an S3 bucket to receive mails (the mails come from SES, but that's not relevant).
What I want to do is create a Lambda function that can access an EC2 instance and launch a Python script. So far I have the code below. The problem is that when I created my EC2 instance, I didn't create any username or password to connect via SSH. I only have a .pem file (certificate file) to authenticate to the instance.
I did some research but I couldn't find anything useful.
var SSH = require('simple-ssh');

var ssh = new SSH({
  host: 'localhost',
  user: 'username',
  pass: 'password'
});

ssh.exec('python3.6 path/to/my/python/script.py', {
  out: function(stdout) {
    console.log(stdout);
  }
}).start();
I've been thinking of several solutions, but I'm not sure about any of them:
find an SSH library in JavaScript that handles .pem files
convert the .pem into a string (not secure at all, in my opinion)
maybe create a new SSH user on the EC2 instance?
Thank you for your time.
A better option would be to use AWS Systems Manager to remotely run commands on your Amazon EC2 instances.
If you still choose to use simple-ssh then you need to supply an SSH key in config.key when creating your SSH object. You can store the private key in Parameter Store or Secrets Manager and retrieve it within the Lambda. In this case, you should definitely use passwordless SSH (with keypair).
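A minimal sketch of that approach, assuming the private key has been stored as a SecureString parameter (the parameter name, host, and user below are placeholders):

const AWS = require("aws-sdk");
const SSH = require("simple-ssh");

const ssm = new AWS.SSM();

exports.handler = async () => {
  // Fetch the .pem contents stored as a SecureString parameter (hypothetical name).
  const param = await ssm
    .getParameter({ Name: "/my-app/ec2-ssh-key", WithDecryption: true })
    .promise();

  const ssh = new SSH({
    host: "ec2-xx-xx-xx-xx.compute.amazonaws.com", // your instance's address
    user: "ec2-user",                              // default user depends on the AMI
    key: param.Parameter.Value,                    // private key instead of a password
  });

  await new Promise((resolve, reject) => {
    ssh
      .exec("python3.6 path/to/my/python/script.py", {
        out: (stdout) => console.log(stdout),
        exit: (code) => (code === 0 ? resolve() : reject(new Error(`exit code ${code}`))),
      })
      .start();
  });
};

The Lambda's execution role would also need ssm:GetParameter permission on that parameter, and network access (VPC/security group) to reach the instance on port 22.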

Convert with Imagemagick without saving a file (on Heroku)

The goal: Have a Node.js server which will take a pdf from a PUT call, and return each page converted to a .jpg. Ultimately, I don't care how it gets done, of course.
I've successfully created a Node.js server which uses imagemagick to convert a pdf to an image. It's an express server, and it uses this imagemagick npm package, which wraps the imagemagick CLI.
Here's the code for the route, which works very well when running locally:
app.put('/v1/convert-pdf', (req, res) => {
  res.status(202).send({ message: "we're working on it, alright?!" });

  console.log("beginning image conversion");
  im.convert(["-density", "144", "./content/pdf/foo.pdf", "./content/img/bar.jpg"], (err, stdout) => {
    if (err) {
      console.error(err);
    } else {
      console.log(stdout);
      console.log("finished converting pdfs");
    }
  });
});
The route above outputs a bunch of .jpg files named foo-0.jpg, foo-1.jpg, etc... Now what I want to do is take a file which is PUT to the route, convert it via ImageMagick, and then give it back to the user (eventually I'll put it in an S3 bucket, but baby steps).
If it does require saving the file to a folder in Heroku, how do I do that? I've attempted to read/write from Heroku, and I don't get an error (unless one of the files/directories doesn't exist), but I don't see the file either.
Summarising our conversation in comments here in case it's helpful to anyone:
The reason you won't be able to see the files that were created and saved by the application when you run heroku run bash is because that actually spins up a new dyno and doesn't have access to the file system of the dyno running the app.
As mentioned, the best way to do this is to upload the resultant images to something like S3 as soon as the conversion is complete, and to serve any incoming requests to retrieve those files through S3.
If running a single web dyno this wouldn't be a problem right now, but if you're running multiple then the converted files will only be available on the dyno that received and transformed the PDF, so other dynos won't have access to them.
Also, each deployment creates new dynos, so unless you store those files in something like S3, they'll be lost as soon as you push new code.
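A minimal sketch of that upload step, assuming the aws-sdk package and credentials supplied via Heroku config vars (the bucket and key names below are placeholders):

const fs = require("fs");
const path = require("path");
const AWS = require("aws-sdk");

const s3 = new AWS.S3(); // credentials come from environment/config vars

// After im.convert finishes, push every generated page image to S3
// instead of relying on the dyno's ephemeral filesystem.
async function uploadConvertedPages(dir, bucket) {
  const pages = fs.readdirSync(dir).filter((f) => f.endsWith(".jpg"));
  return Promise.all(
    pages.map((file) =>
      s3
        .upload({
          Bucket: bucket,
          Key: `converted/${file}`,
          Body: fs.createReadStream(path.join(dir, file)),
          ContentType: "image/jpeg",
        })
        .promise()
    )
  );
}

// usage: await uploadConvertedPages("./content/img", "my-pdf-pages-bucket");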
