Convert with Imagemagick without saving a file (on Heroku) - javascript

The goal: have a Node.js server which will take a PDF from a PUT call and return each page converted to a .jpg. Ultimately, I don't care how it gets done, of course.
I've successfully created a Node.js server which uses ImageMagick to convert a PDF to an image. It's an Express server, and it uses this imagemagick npm package, which wraps the ImageMagick CLI.
Here's the code for the route, which works very well running locally:
app.put('/v1/convert-pdf', (req, res) => {
  res.status(202).send({ message: "we're working on it, alright?!" })
  console.log("beginning image conversion")
  im.convert(["-density", "144", "./content/pdf/foo.pdf", "./content/img/bar.jpg"], (err, stdout) => {
    if (err) {
      console.error(err)
    } else {
      console.log(stdout)
      console.log("finished converting pdfs")
    }
  })
})
The route above outputs a bunch of .jpg files named bar-0.jpg, bar-1.jpg, etc. Now what I want to do is take a file which is PUT to the route, convert it via ImageMagick, and then give it back to the user (eventually I'll put it in an S3 bucket, but baby steps).
If it does require saving the file to a folder in Heroku, how do I do that? I've attempted to read/write from Heroku, and I don't get an error (unless one of the files/directories doesn't exist), but I don't see the file either.

Summarising our conversation in the comments here in case it's helpful to anyone:
The reason you won't be able to see the files that were created and saved by the application when you run heroku run bash is that that command actually spins up a new dyno and doesn't have access to the filesystem of the dyno running the app.
As mentioned, the best way to do this is to upload the resultant images to something like S3 as soon as the conversion is complete, and to serve any incoming requests to retrieve those files through S3.
If running a single web dyno this wouldn't be a problem right now, but if you're running multiple then the converted files will only be available on the dyno that received and transformed the PDF, so other dynos won't have access to them.
Also, each deployment creates new dynos, so unless you store those files in something like S3, they'll be lost as soon as you push up.
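To make that concrete, here is a minimal sketch of the suggested flow, assuming an express.raw body parser delivers the PDF as a Buffer and that aws-sdk is configured with credentials; the bucket name and key prefix are placeholders:
const fs = require("fs")
const os = require("os")
const path = require("path")
const im = require("imagemagick")
const AWS = require("aws-sdk")

const s3 = new AWS.S3()

app.put("/v1/convert-pdf", express.raw({ type: "application/pdf", limit: "25mb" }), (req, res) => {
  // The dyno's ephemeral filesystem is fine as scratch space within one request.
  const workDir = fs.mkdtempSync(path.join(os.tmpdir(), "pdf-"))
  const pdfPath = path.join(workDir, "input.pdf")
  fs.writeFileSync(pdfPath, req.body)

  im.convert(["-density", "144", pdfPath, path.join(workDir, "page.jpg")], async (err) => {
    if (err) return res.status(500).send({ error: err.message })
    // Multi-page PDFs come out as page-0.jpg, page-1.jpg, ...
    const pages = fs.readdirSync(workDir).filter((f) => f.endsWith(".jpg"))
    try {
      await Promise.all(pages.map((f) =>
        s3.upload({
          Bucket: "my-bucket",   // placeholder bucket name
          Key: "converted/" + f, // placeholder key prefix
          Body: fs.createReadStream(path.join(workDir, f)),
        }).promise()
      ))
      res.status(201).send({ pages: pages.length })
    } catch (uploadErr) {
      res.status(500).send({ error: uploadErr.message })
    }
  })
})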

Related

How to find full path when sending a file over a post request from Next JS functions (server)?

In the Next.js server-side pages/api, I have a function that makes a POST request to a service requiring two files.
I am trying to read these files using the following code:
var fullPhotoPath = process.cwd() + '/public/photos/photo.png'
var photoStream = fs.createReadStream(fullPhotoPath)
This works when I run the app locally, but it fails with a file-not-found error when deployed.
I assume the pathing on the server changes because of webpack, though I am not familiar with Next.js.
Things I have tried:
Use a relative path
Move the photo files to different locations, either the public or private folders
Deploy in different environments: same error on both Firebase and Vercel hosting
Workaround with getStaticProps/getServerSideProps: can't send the files to the API functions since they aren't JSON-able
Thanks for any suggestions.
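For reference, here is a sketch of the path-resolution approach the question is already attempting, written as a pages/api route using the path module instead of string concatenation. Whether photo.png actually ships with the deployed serverless bundle depends on the host, so treat this as an illustration rather than a confirmed fix:
// pages/api/photo.js -- a sketch, not a confirmed fix
import fs from "fs"
import path from "path"

export default function handler(req, res) {
  // Resolve from the project root; process.cwd() is the usual anchor
  // once webpack has moved the compiled function elsewhere.
  const fullPhotoPath = path.join(process.cwd(), "public", "photos", "photo.png")
  const photoStream = fs.createReadStream(fullPhotoPath)
  res.setHeader("Content-Type", "image/png")
  photoStream.pipe(res)
}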

JavaScript discord bot keeps going offline with keep alive

I have a JavaScript Discord bot, and it is getting bigger and bigger. The problem is that every time I close my Replit, my bot goes offline within about 30 minutes.
I use this code in my keep_alive.js file:
const http = require("http");

// Tiny HTTP server that an uptime pinger can hit to keep the repl awake.
http.createServer(function (req, res) {
  res.write("Online:)");
  res.end();
}).listen(8080);
// In index.js I have: `const keep_alive = require('./keep_alive.js')`
The best solution that I have found and use (and it is free) is pm2. It requires no code in your file at all:
https://pm2.keymetrics.io/
npm i pm2 -g
// navigate to the folder with your main bot.js file (in this example it is called index.js)
pm2 start index.js
// followed by
pm2 save
// Optionally you can monitor the whole server as well with
pm2 install pm2-server-monit
Comes with a free dashboard as well
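If you'd rather not retype the CLI flags, pm2 can also start from an ecosystem file; a minimal sketch (the app name is arbitrary):
// ecosystem.config.js -- run with: pm2 start ecosystem.config.js
module.exports = {
  apps: [
    {
      name: "discord-bot", // arbitrary display name
      script: "index.js",  // the bot's entry file
      autorestart: true,   // restart the process if it crashes
    },
  ],
};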
You could use Freshping to keep the bot online if you use Replit or something like that. You just need a website like this that it can monitor.

How to run child_process.exec correctly on an ajax request?

There is a server that I have access to, but do not have ownership of. It serves a Node.js / Express application on the default port 3000. There are several scripts that are usually run either manually from the terminal or by a cron job. What I want to do is have a button on the client side that makes an ajax request to a certain route and executes a Node.js command with inline arguments. For example:
node script.js 123
All routes are set and working. I have a CliController file that handles requests and has to run the command above. Currently I am using the following code:
cp.exec(`node script.js ${ip}`, function (err, stdout, stderr) {
  if (err) {
    console.log(err);
  }
  console.log(stdout);
  console.log(stderr);
});
The script.js file is in the root folder of the project, but the project itself was built with express-generator and is served using the node bin/www command. There is also a service/process on the server that runs nodemon to restart the project if it fails, so I do not have access to the output of that particular process.
If I run the command above in the terminal (from the root folder of the project, to be precise), it works fine and I see the output of the script. But if I press the button on the webpage to make the request, I am pretty sure the script does not execute, because it is supposed to update the database and I see no changes. I also tried child_process.spawn and child_process.fork and failed to get either working.
I also tried to kill nodemon and quickly start the project again to the see console output. If I do this, everything works.
What am I doing wrong ?
The invoked process may be in a blocking state, so the parent script is simply waiting for the child process to terminate or return something.
We can avoid this behaviour right in the shell command by adding & (the ampersand control operator) at the end.
This makes the command run in the background. (Note that you can still control the child process(es) using their PIDs and POSIX signals; that is another subject, but closely related, and you might find it handy pretty soon.)
Also notice that killing/stopping the parent script will also kill its children. This can be avoided using nohup.
This is not specific to JavaScript or Node.js but to bash, and can be used with anything in the shell.
cp.exec(`node script.js ${ip} &`, function (err, stdout, stderr) {
  if (err) {
    console.log(err);
  }
  console.log(stdout);
  console.log(stderr);
});
Bash reference manual
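On the Node side, a comparable effect can be had without shell syntax by detaching the child process; a minimal sketch, not from the original answer:
const cp = require("child_process");

// Spawn the script detached with stdio ignored, so the HTTP handler
// returns immediately and the child can outlive the parent.
const child = cp.spawn("node", ["script.js", ip], {
  detached: true,
  stdio: "ignore",
});
child.unref(); // don't keep the parent's event loop alive waiting on it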

Access a local directory to retrieve files from localhost and write back

I currently have an Angular app (MEAN stack) that I am running locally on my Windows machine. Being new to Node/Express, I would like to access, from http://localhost:3006, a local directory called /myfiles that I have set up within my main app directory.
What I am unsure of is how to create an endpoint that reads the files within the /myfiles directory so that I can display them within an Angular Material dialog.
I'm just not sure what I need to do on the Express side (setting up the route) and on the Angular side (using HttpClient) to display them.
Further to the above, I will also need to write back to the /myfiles directory, where I will need to perform a copy command based on a file selection within Angular.
You'll want to create an endpoint in Express that your Angular app can call to be served the files.
I'll assume the files you want to read and send are JSON files. Here's a really simple example of an endpoint that you can visit that will return the file to your frontend.
In your Angular code you will make a GET call to /myfile:
var fs = require("fs");
app.get('/myFile', (req, res) => {
var filepath = __dirname + '/myfiles/thefile.json';
var file = fs.readFileSync(filepath, encoding);
res.json(JSON.parse(file));
});
Then in Angular, you'll have something like
http.get('/myfile').subscribe( (data) => { console.log("The data is: ", data) });
ADDED
The example I provided above was just the basics to answer your question. Ideally, for file paths in production, you should use the Node path library, which 'knows' how to behave in different environments and file systems.
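For the write-back part of the question, here is a sketch of a copy endpoint; the route name and payload shape are assumptions, not part of the original answer:
const path = require("path");

app.post('/myfile/copy', express.json(), (req, res) => {
  // Expects a body like { "name": "thefile.json" }.
  // path.basename guards against path traversal in the supplied name.
  const name = path.basename(req.body.name);
  const src = path.join(__dirname, 'myfiles', name);
  const dest = path.join(__dirname, 'myfiles', 'copy-of-' + name);
  fs.copyFile(src, dest, (err) => {
    if (err) return res.status(500).json({ error: err.message });
    res.json({ copied: dest });
  });
});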

Execute different server.js file from current server in nodejs?

I'm developing a simple app with Node.js. The thing is that the first thing I do is run a server.js file which loads an HTML form and checks if the entered information is valid.
After authentication, I'm planning to run another server.js (starting it from the original server.js file) in a subfolder to start the real application.
Is this possible?
Definitely, check out child_process.spawn.
child_process.spawn("node", ["server.js"]);
process.exit(); // ends auth server
