How can I check if child_process can run a command?
'echo' is a valid command that can be run in a terminal, but 'echoes' is not one. For example, if I do this
const cp = require('child_process')
cp.exec('echo hello')
it will work.
If I do this, though
const cp = require('child_process')
cp.exec('echoes hello') //notice how it is echoes instead of echo
it will just error. But maybe the user has a program that adds an 'echoes' command to their terminal, and in that case it would run fine; if it errors, though, it just exits the process and I can't check whether the command works.
Is there any way to do this? Thank you so much in advance!
You have to manually loop through the directories in the $PATH environment variable and perform a lookup in each of them.
E.g. if $PATH is set to /bin:/usr/local/bin, then you have to perform
fs.access('/bin/' + command, fs.constants.X_OK)
and
fs.access('/usr/local/bin/' + command, fs.constants.X_OK)
A solution would look like this:
const { constants: fsconsts } = require('fs')
const fs = require('fs/promises')
const path = require('path')
const paths = process.env.PATH.split(':')
async function isExecutable(command) {
  const cases = []
  for (const p of paths) {
    const bin = path.join(p, command)
    cases.push(fs.access(bin, fsconsts.X_OK)) // X_OK is the bit flag that checks the file is executable
  }
  await Promise.any(cases) // resolves if the command is executable in at least one PATH directory
  return command
}
const found = (bin) => console.log('found', bin)
const notfound = (errors) => {
  console.log('not found or not executable')
  // console.error(errors)
}
// passes
isExecutable('echo').then(found).catch(notfound)
isExecutable('node').then(found).catch(notfound)
// fails
isExecutable('shhhhhh').then(found).catch(notfound)
isExecutable('echoes').then(found).catch(notfound)
NOTE: I think my solution works only on *nix-based OSs.
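If you also need this to work cross-platform, a minimal sketch of the same idea (my addition, not part of the original answer) could split PATH on path.delimiter (';' on Windows, ':' elsewhere). Note that on Windows fs.access with X_OK effectively only checks that the file exists, and executables are found via the extensions listed in PATHEXT, which this sketch does not handle:
const { constants: fsconsts } = require('fs')
const fs = require('fs/promises')
const path = require('path')

// path.delimiter is ';' on Windows and ':' on POSIX systems
const dirs = process.env.PATH.split(path.delimiter)

async function isExecutable (command) {
  // Resolves with the command name if it is accessible/executable in at least one PATH directory
  await Promise.any(dirs.map(dir => fs.access(path.join(dir, command), fsconsts.X_OK)))
  return command
}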
My current code can only read a text file. How can I make an image (base64) file open with the Photos application on Windows? Is there any way to do that? If it's impossible, please let me know!
const fs = require('fs')
fs.readFile('./Test/a.txt', 'utf8', (err, data) => {
  if (err) {
    console.error(err)
    return
  }
  console.log(data)
})
Another possible solution is like this:
const cp = require('child_process');
const imageFilePath = '/aaa/bbb/ccc'
const c = cp.spawn('a_program_that_opens_images', [ imageFilePath ]); // spawn does not go through a shell, so the path needs no extra quoting
c.stdout.pipe(process.stdout);
c.stderr.pipe(process.stderr);
c.once('exit', exitCode => {
// child process has exited
});
Do something like this:
const cp = require('child_process');
const c = cp.spawn('bash'); // 1
const imageFilePath = '/aaa/bbb/ccc'
c.stdin.end(`
program_that_opens_images "${imageFilePath}"
`); // 2
c.stdout.pipe(process.stdout); // 3
c.stderr.pipe(process.stderr);
c.once('exit', exitCode => { // 4
// child process has exited
});
What it does:
1. spawns a bash child process (use sh or zsh instead if you want)
2. writes the command to run to bash's stdin
3. pipes the child's stdio to the parent
4. captures the exit code from the child
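Since the question mentions the Photos application on Windows specifically, one more option (my addition, not part of the answers above) is to let Windows open the file with its default app by running the cmd.exe built-in start through exec:
const cp = require('child_process');

const imageFilePath = 'C:\\path\\to\\image.png'; // hypothetical path

// exec runs the command through a shell (cmd.exe on Windows), where `start` is a built-in.
// The empty "" is the window-title argument that `start` expects when the path is quoted.
cp.exec(`start "" "${imageFilePath}"`, err => {
  if (err) console.error(err);
});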
const ffmpeg = require('fluent-ffmpeg');
const videoFile = './f1.mp4';
ffmpeg.ffprobe(videoFile, (err, metaData) => {
  const { duration } = metaData.format;
  const startingTime = parseInt(duration - 60);
  const clipDuration = 20;

  ffmpeg()
    .input(videoFile)
    .inputOptions([`-ss ${startingTime}`])
    .outputOptions([`-t ${clipDuration}`])
    .output('./result.mp4')
    .on('end', () => console.log('Done!'))
    .on('error', (err) => console.error(err))
    .run();
});
So this is my Node.js code, where I cut a clip from a video of my choice and produce an output file. I run it with node index.js (the code is in the index.js file).
I want to create a script that can run on the below command line
node index.js start_time end_time input/file/path.mp4 output/directory/
I mean it should be dynamic: it should work with any input file from any directory and write the output to any directory, like a function that takes the user's inputs and runs accordingly. No manual setup.
Is there a way to do that? I have tried many approaches, but they all need manual changes on the command line or a manual Node setup. I am trying to create a dynamic JS file that will run for any input.
What you probably want is process.argv. This is the argument vector: the list of command-line arguments, where the first two entries are the node executable and the file being run. Example:
const args = process.argv.slice(2);

if (args.length !== 4) {
  console.error('Incorrect number of arguments');
  process.exit(1);
}

const [ startTime, endTime, inputFile, outputDirectory ] = args;
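Wired into the original snippet, a rough sketch (assuming start_time and end_time are given in seconds, and writing a file named result.mp4 into the given output directory) could look like this:
const path = require('path');
const ffmpeg = require('fluent-ffmpeg');

const [ startTime, endTime, inputFile, outputDirectory ] = process.argv.slice(2);
const clipDuration = Number(endTime) - Number(startTime);

ffmpeg()
  .input(inputFile)
  .inputOptions([`-ss ${startTime}`])
  .outputOptions([`-t ${clipDuration}`])
  .output(path.join(outputDirectory, 'result.mp4'))
  .on('end', () => console.log('Done!'))
  .on('error', err => console.error(err))
  .run();
It would then be run exactly as described: node index.js start_time end_time input/file/path.mp4 output/directory/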
I'm building a Discord bot with Node.js for my server and I have a bunch of commands for the bot. Each command is in a different file, so I have a lot of requires like this:
const cmd = require("../commands/cmd.js");
const kick = require("../commands/kick");
const info = require("../commands/info");
const cooldown = require("../commands/cooldown");
const help = require("../commands/help");
Is there a simpler way to do this?
Inside the commands folder, put a file called index.js.
Each time you implement a new command in a new file, require that file in index.js and add it to its exports. For example, index.js would be:
const kick = require('./kick');
const info = require('./info');
module.exports = {
kick: kick,
info: info
}
And then from any folder you can require multiple commands in one line like this:
const { kick, info } = require('../commands');
Export an object from one file instead?
const kick = require("../commands/kick");
const info = require("../commands/info");
const cooldown = require("../commands/cooldown");
const help = require("../commands/help");
const commands = {
kick,
info,
...
}
module.exports = commands;
And then:
const commands = require('mycommands')
commands.kick()
Create an index.js file inside the commands folder, and then you can export an object like this:
const kick = require("./kick");
const info = require("./info");
const cooldown = require("./cooldown");
const help = require("./help");
const command = {
kick,
info,
cooldown,
help
};
module.exports = command;
You can import and use it like this:
const {kick, info} = require('./commands');
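If you would rather not edit index.js every time you add a command, a small sketch of an alternative (my assumption: each command file is named after the command it exports) builds the export object automatically:
// commands/index.js
const fs = require('fs');
const path = require('path');

const commands = {};

// Load every .js file in this folder except index.js itself
fs.readdirSync(__dirname)
  .filter(file => file.endsWith('.js') && file !== 'index.js')
  .forEach(file => {
    const name = path.basename(file, '.js'); // e.g. 'kick'
    commands[name] = require(path.join(__dirname, file));
  });

module.exports = commands;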
I want to grab all javascript files inside the parent directory and in all sub directories for my discord.js command handler. How do I achieve that?
I have a working block of code that already grabs all .js files from the parent directory, but all sub directories are left alone.
const botConfig = require('./config/nvdconfig.json');
const Discord = require('discord.js');
const fs = require('fs');
const prefix = botConfig.prefix;
// nvdColor: #45c263
const bot = new Discord.Client({
disableEveryone: true
});
bot.commands = new Discord.Collection();
const {
readdirSync,
statSync
} = require('fs');
const {
join
} = require('path');
fs.readdir('./cmds/', (err, files) => {
  if (err) console.error(err);

  let jsfiles = files.filter(f => f.split('.').pop() === 'js');
  if (jsfiles.length <= 0) {
    return console.log('No commands to load.');
  }

  console.log(`Loading ${jsfiles.length} commands!`);

  jsfiles.forEach((f, i) => {
    let props = require(`./cmds/${f}`);
    console.log(`${i + 1}: ${f} loaded!`);
    bot.commands.set(props.help.name, props);
  });
});
I expect the js files to be loaded from the parent directory as well as from all current and future subdirectories.
My current result is that all js files in the parent directory are loaded, but the ones inside the subdirectories are left alone.
I would really appreciate if someone could help me with this! Thank you in advance.
So, I figured out how to get the result that I want.
I found a node package called fs-readdir-recursive that had everything I wanted.
Install the package: npm install fs-readdir-recursive --save
And initialize it. const <var name> = require('fs-readdir-recursive');
Then after you do that, create another variable. const <var name> = read('./<parent directory>/');
This variable will hold the files found under the parent directory, which are then iterated with a forEach loop.
const read = require('fs-readdir-recursive');
const files = read('./cmds/');
files.forEach(file => {
  let cmd = file.replace('.js', '');
  let props = require(`./cmds/${cmd}`);
  <your code here>
});
This will read each file in each directory of the parent.
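If you prefer not to add a dependency, a minimal sketch of the same idea (my addition, not part of the answer above) can recurse with the built-in fs module instead:
const fs = require('fs');
const path = require('path');

// Recursively collect the paths of all .js files under dir
function getJsFiles(dir) {
  let results = [];
  for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
    const full = path.join(dir, entry.name);
    if (entry.isDirectory()) {
      results = results.concat(getJsFiles(full));
    } else if (entry.name.endsWith('.js')) {
      results.push(full);
    }
  }
  return results;
}

getJsFiles('./cmds').forEach(file => {
  const props = require(path.resolve(file));
  bot.commands.set(props.help.name, props);
});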
I've got a script that synchronously installs non-built-in modules at startup that looks like this
const cp = require('child_process')
function requireOrInstall (module) {
  try {
    require.resolve(module)
  } catch (e) {
    console.log(`Could not resolve "${module}"\nInstalling`)
    cp.execSync(`npm install ${module}`)
    console.log(`"${module}" has been installed`)
  }
  console.log(`Requiring "${module}"`)
  try {
    return require(module)
  } catch (e) {
    console.log(require.cache)
    console.log(e)
  }
}
const http = require('http')
const path = require('path')
const fs = require('fs')
const ffp = requireOrInstall('find-free-port')
const express = requireOrInstall('express')
const socket = requireOrInstall('socket.io')
// List goes on...
When I uninstall modules, they get installed successfully when I start the server again, which is what I want. However, the script starts throwing Cannot find module errors when I uninstall the first or first two modules of the list that use the function requireOrInstall. That's right, the errors only occur when the script has to install either the first or the first two modules, not when only the second module needs installing.
In this example, the error will be thrown when I uninstall find-free-port, unless I move its require at least one spot down ¯\_(• _ •)_/¯
I've also tried adding a delay directly after the synchronous install to give it a little more breathing time with the following two lines:
var until = new Date().getTime() + 1000
while (new Date().getTime() < until) {}
The pause was there. It didn't fix anything.
@velocityzen came up with the idea to check the cache, which I've now added to the script. It doesn't show anything out of the ordinary.
@vaughan's comment on another question noted that this exact error occurs when requiring a module twice. I've changed the script to use require.resolve(), but the error still remains.
Does anybody know what could be causing this?
Edit
Since the question has been answered, I'm posting the one-liner (139 characters!). It doesn't globally define child_process, has no last try-catch and doesn't log anything to the console:
const req=async m=>{let r=require;try{r.resolve(m)}catch(e){r('child_process').execSync('npm i '+m);await setImmediate(()=>{})}return r(m)}
The name of the function is req() and it can be used as in @alex-rokabilis' answer.
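For readability, here is the same one-liner expanded into an equivalent form:
const req = async m => {
  let r = require
  try {
    r.resolve(m)
  } catch (e) {
    r('child_process').execSync('npm i ' + m)
    await setImmediate(() => {}) // awaiting a non-promise value, exactly as in the one-liner
  }
  return r(m)
}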
It seems that the require operation after an npm install needs a certain delay.
Also, the problem is worse on Windows: it will always fail if the module needs to be npm installed.
It's as if, at a specific point in time, it is already determined which modules can be required and which cannot. That's probably why require.cache was mentioned in the comments. Nevertheless, I suggest you check the two following solutions.
1) Use a delay
const cp = require("child_process");
const requireOrInstall = async module => {
  try {
    require.resolve(module);
  } catch (e) {
    console.log(`Could not resolve "${module}"\nInstalling`);
    cp.execSync(`npm install ${module}`);
    // Use one of the two awaits below
    // The first one waits 1000 milliseconds
    // The other waits until the next event cycle
    // Both work
    await new Promise(resolve => setTimeout(() => resolve(), 1000));
    await new Promise(resolve => setImmediate(() => resolve()));
    console.log(`"${module}" has been installed`);
  }
  console.log(`Requiring "${module}"`);
  try {
    return require(module);
  } catch (e) {
    console.log(require.cache);
    console.log(e);
  }
}
const main = async () => {
  const http = require("http");
  const path = require("path");
  const fs = require("fs");
  const ffp = await requireOrInstall("find-free-port");
  const express = await requireOrInstall("express");
  const socket = await requireOrInstall("socket.io");
}

main();
await always needs a promise to work with, but it's not needed to explicitly create one as await will wrap whatever it is waiting for in a promise if it isn't handed one.
2) Use a cluster
const cp = require("child_process");
function requireOrInstall(module) {
  try {
    require.resolve(module);
  } catch (e) {
    console.log(`Could not resolve "${module}"\nInstalling`);
    cp.execSync(`npm install ${module}`);
    console.log(`"${module}" has been installed`);
  }
  console.log(`Requiring "${module}"`);
  try {
    return require(module);
  } catch (e) {
    console.log(require.cache);
    console.log(e);
    process.exit(1007);
  }
}
const cluster = require("cluster");
if (cluster.isMaster) {
  cluster.fork();
  cluster.on("exit", (worker, code, signal) => {
    if (code === 1007) {
      cluster.fork();
    }
  });
} else if (cluster.isWorker) {
  // The real work here for the worker
  const http = require("http");
  const path = require("path");
  const fs = require("fs");
  const ffp = requireOrInstall("find-free-port");
  const express = requireOrInstall("express");
  const socket = requireOrInstall("socket.io");
  process.exit(0);
}
The idea here is to re-run the process in case of a missing module. This way we fully reproduce a manual npm install, so, as you can guess, it works! It also behaves more synchronously than the first option, but is a bit more complex.
I think your best option is either:
(ugly) to install the package globally instead of locally, or
(best solution?) to define YOUR own 'package repository' location, used both when installing AND when requiring
First, you may consider using the npm-programmatic package.
Then, you may define your repository path with something like:
const PATH='/tmp/myNodeModuleRepository';
Then, replace your installation instruction with something like:
const npm = require('npm-programmatic');
npm.install(`${module}`, {
  cwd: PATH,
  save: true
})
Finally, replace your fallback require instruction with something like:
return require(require.resolve(module, { paths: [ PATH ] })); // require itself takes no options, so resolve the full path first
If it is still not working, you may update the require.cache variable; for instance, to invalidate a module, you can do something like:
delete require.cache[process.cwd() + '/node_modules/bluebird/js/release/bluebird.js'];
You may need to update it manually, to add information about your new module, before loading it.
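Putting those pieces together, a rough sketch (my assumptions: npm-programmatic's install returns a promise, and require.resolve's paths option is pointed at the custom repository so that its node_modules folder is searched) might look like:
const npm = require('npm-programmatic');

const PATH = '/tmp/myNodeModuleRepository';

async function requireOrInstall(module) {
  try {
    // Resolution starts at PATH, so PATH/node_modules is checked
    return require(require.resolve(module, { paths: [ PATH ] }));
  } catch (e) {
    console.log(`Could not resolve "${module}", installing into ${PATH}`);
    await npm.install(module, { cwd: PATH, save: true });
    return require(require.resolve(module, { paths: [ PATH ] }));
  }
}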
cp.execSync is an async call, so try checking whether the module is installed in its callback function. I have tried it; the installation is clean now:
const cp = require('child_process')
function requireOrInstall (module) {
  try {
    require.resolve(module)
  } catch (e) {
    console.log(`Could not resolve "${module}"\nInstalling`)
    cp.execSync(`npm install ${module}`, () => {
      console.log(`"${module}" has been installed`)
      try {
        return require(module)
      } catch (e) {
        console.log(require.cache)
        console.log(e)
      }
    })
  }
  console.log(`Requiring "${module}"`)
}
const http = require('http')
const path = require('path')
const fs = require('fs')
const ffp = requireOrInstall('find-free-port')
const express = requireOrInstall('express')
const socket = requireOrInstall('socket.io')
(Output screenshots were attached here: one run when node_modules is not yet available, and one when it is already available.)