Spawn a process inside a process, or detach it, with the pkg builder - javascript

I'm not sure what the problem is here: the mongod process does not spawn inside the program.exe created with pkg. I tested the script before compiling, and it could launch the mongod process. After testing the compiled version, it appears spawn can't read the pkg filesystem (the snapshot).
const { spawn } = require('child_process');
const { parse } = require('path');

const processPath = parse(process.argv[0]);
const processDir = processPath.dir;

const args = [
  '-f', `${__dirname}\\configs\\mongodb.yml`,
  '--dbpath', `${processDir}\\database\\data`,
  '--logpath', `${processDir}\\database\\log\\system.log`,
];
const options = {
  cwd: `${processDir}\\bin`,
};

const mongod = spawn('mongod', args, options);

mongod.stdout.on('data', chunk => {
  console.log(chunk.toString());
});
mongod.stderr.on('data', chunk => {
  console.log(chunk.toString());
});
mongod.on('spawn', () => {
  console.log('success');
});
mongod.on('error', function (error) {
  console.log(error);
});
Build Dir
build
build/program.exe
build/bin
build/bin/mongod.exe
build/database
build/database/data
build/database/log/system.log
Package.json pkg configurations
"bin": "dist/application.js",
"pkg": {
"targets": ["node16-win-x64"],
"outputPath": "dist/build",
"assets": [
"dist/configs/*"
]
}

Here is my solution to this issue, tested on Linux Ubuntu 22.04 LTS.
Case scenario:
I needed to include an executable file hello_world as an asset at the /snapshot/project/bin/hello_world virtual path and, based on some conditions, execute it in a Linux environment.
The problem:
I was getting the following error when trying to execute the command via child_process.spawn:
/bin/sh: 1: /snapshot/project/bin/hello_world: not found
So clearly my OS is trying to execute the hello_world command via /bin/sh; however, the system is unable to access the /snapshot virtual filesystem and therefore cannot execute it.
The workaround:
Clearly, the main file system is unable to access the virtual file system, but we can do the opposite: copy our executable file from the virtual file system into the main file system and execute it from there. Basically, this is what I did:
// node packages
const fs = require('fs');
const os = require('os');
const path = require('path');
const { execSync, spawn } = require('child_process');

// executable file name
const executable = 'hello_world';
// file path to the asset executable file
const remoteControlFilePath = path.join(__dirname, `../bin/${executable}`);
let executableFileFullPath = remoteControlFilePath;

// skip the workaround if the parent process is not the pkg-ed version
if (process.pkg) {
  // create a temporary folder for our executable file
  const destination = fs.mkdtempSync(`${os.tmpdir()}${path.sep}`);
  const destinationPath = path.join(destination, executable);
  executableFileFullPath = destinationPath;
  // copy the executable file into the temporary folder
  fs.copyFileSync(remoteControlFilePath, destinationPath);
  // on Linux systems you need to manually make the file executable
  execSync(`chmod +x ${destinationPath}`);
}

// using { detached: true }, execute the command independently of its parent
// process, so the parent does not fail if the child process fails
const child = spawn(executableFileFullPath, { detached: true });

child.stdout.on('data', (data) => {
  console.log(`child stdout:\n${data}`);
});
child.stderr.on('data', (data) => {
  console.error(`child stderr:\n${data}`);
});
child.on('exit', (code, signal) => {
  console.log(`child process exited with code ${code} and signal ${signal}`);
});

Related

Load local dll with node-ffi: No such file

I'm trying to load a local .dll according to the examples on Stack Overflow and in the node-ffi documentation.
But I get the error ENOENT: no such file or directory, open '../test/user32.dll.so'. The file is there (the existence check throws no exception).
The extension '.so' is added automatically. Any idea what I'm doing wrong? Is this code platform dependent? I'm on Debian.
const path = require('path');
const fs = require('fs');
const ffi = require('ffi');

function setCursor() {
  const dllFile = path.join('../test', 'user32.dll');
  if (!fs.existsSync(dllFile)) {
    throw new Error('dll does not exist');
  }
  const user32 = ffi.Library(dllFile, {
    SetCursorPos: ['bool', ['int32', 'int32']],
  });
  console.log(user32.SetCursorPos(0, 0));
}

setCursor();
It looks like the relative path ../test is being resolved against the current working directory rather than the script's parent folder. I think path.join(__dirname, '..', 'test', 'user32.dll') should get you to the right place.

How to use node command to run multiple .js files one after the other?

I'm currently using the node command to run a few premade scripts. Right now this is what I type into Git:
node file1.js
node file2.js
node file3.js
I have to wait for each one to finish before typing the next "node file.js"
Is there a way to do that for all of the files in the folder as opposed to typing them out one after the other? Thanks!
You can use fs.readdir to first get all files in the current directory:
const fs = require('fs')
const files = fs.readdirSync('.')
Then filter out the .js files:
const jsFiles = files.filter(f => f.endsWith('.js'))
Then execute them one by one using child_process:
const { spawnSync } = require('child_process')

for (const file of jsFiles) {
  spawnSync('node', [file], { shell: true, stdio: 'inherit' })
}
I'm using spawnSync so that it'll execute the files synchronously (one-by-one).
You can use the exec() function to run commands from JavaScript. Insert the require line below at the top of file1.js and call exec() at the end of its execution, and so on.
Example
index.js
const { exec } = require("child_process");

exec("dir", (error, stdout, stderr) => {
  if (error) {
    console.log(`error: ${error.message}`);
    return;
  }
  if (stderr) {
    console.log(`stderr: ${stderr}`);
    return;
  }
  console.log(`stdout: ${stdout}`);
});
The output of node index.js:
stdout: Volume in drive C is Windows 10
Volume Serial Number is 3457-05DE
Directory of C:\dev\node
10.08.2020 20:52 <DIR> .
10.08.2020 20:52 <DIR> ..
10.08.2020 20:52 <DIR> .vscode
10.08.2020 21:39 310 index.js
10.08.2020 21:15 239 package.json
2 File(s) 549 bytes

Node JS filesystem module reading directory exclude files

I have some Node JS JavaScript code that reads the folders inside a directory; however, it's currently reading both folders and files, and I just need it to read folders. I can't figure out what I'm missing:
router.get('/check', (req, res) => {
  fs.readdir('./directory', function (err, files) {
    if (err) {
      res.status(500).send({ error: { status: 500, message: 'error' } })
      return
    }
    console.log('success')
  })
})
I was thinking about doing something like files[0].length > X, for instance, to only show names that contain more than X characters, or filtering out file extensions, etc. Ideally I just need directories, since I have a .empty file inside.
You can check the reference in the documentation: readdir() returns the contents of the directory, both files and folders, so you need to filter them yourself. For example, you can build a new array of directories with files.filter(fileName => fs.statSync(path.join(dirPath, fileName)).isDirectory()).
Update
The sample code below filters files and folders into two separate variables. You can adapt it for your project.
const fs = require('fs');
const path = require('path');
const dir = fs.readdirSync(__dirname);
const folders = dir.filter(element => fs.statSync(path.join(__dirname, element)).isDirectory());
const files = dir.filter(element => fs.statSync(path.join(__dirname, element)).isFile());
console.log('folders', folders, 'files', files);

"export '...' was not found in '...' " while webpack build

I developed this using electron-boilerplate.
I set up the same environment on my desktop as my co-worker (node#10.2.0, npm#5.6.0); the only difference was the OS (co-worker: Windows 7 64-bit; me: Windows 10 64-bit).
However, the following warning message is output at runtime:
warning in ./src/pages/config.js
"export 'login' was not found in '../lib/hoeat-api'
The starting point of the program is as follows.
npm run build/start.js
Here is the start.js code:
const childProcess = require("child_process");
const electron = require("electron");
const webpack = require("webpack");
const config = require("./webpack.app.config");

const env = "development";
const compiler = webpack(config(env));
let electronStarted = false;

const watching = compiler.watch({}, (err, stats) => {
  if (!err && !stats.hasErrors() && !electronStarted) {
    electronStarted = true;

    childProcess
      .spawn(electron, ["."], { stdio: "inherit" })
      .on("close", () => {
        watching.close();
      });
  }
});
Why does the build using webpack fail?
Thanks for your help.
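For what it's worth, this warning usually means the consumer imports a name the module never exports. The file contents below are hypothetical, purely to illustrate the mechanism in plain CommonJS terms:

```javascript
// Stand-in for ../lib/hoeat-api: suppose it exports `signIn`, not `login`.
const api = { signIn: () => 'ok' };

// Stand-in for `import { login } from '../lib/hoeat-api'`:
// destructuring a missing name just yields undefined, which is what
// webpack's "export ... was not found" warning flags at build time.
const { login } = api;

console.log(typeof login); // 'undefined'
```

Checking what `../lib/hoeat-api` actually exports (and whether the two machines have the same file contents checked out) would be the first step.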

How to view files created within a Zeit Docker/Node container

According to the Zeit docs
There are no limitations inside Docker deployments when it comes to the file system. It's always writable and readable.
And indeed my little test seems to write files successfully:
// assuming an Express `app` is in scope, plus fs, path, and the mkdirp package
const fs = require('fs');
const path = require('path');
const mkdirp = require('mkdirp');

app.get('/write', (req, res) => {
  console.log({
    __dirname,
    cwd: process.cwd(),
  });
  const text = `some bit of text`;
  const dirpath = path.resolve(process.cwd(), 'uploads');
  const fullpath = path.resolve(dirpath, `file-${+new Date()}.txt`);
  mkdirp(dirpath, function (error) {
    if (error) {
      console.error(error);
    } else {
      fs.writeFile(fullpath, text, error => {
        if (error) {
          console.error('error writing', error);
        } else {
          console.log(`file written at ${fullpath}`);
          fs.readdir(dirpath, function (err, items) {
            for (var i = 0; i < items.length; i++) {
              console.log(items[i]);
            }
          });
          res.send('File written');
        }
      });
    }
  });
});
After several refreshes of the /write route, this will print the list of files. However, within the Zeit "source" panel, I only see the files copied by my Dockerfile:
For reference, my Dockerfile:
FROM node:carbon
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3000
CMD [ "npm", "start" ]
# ^^^^^^^^^^^^ "start": "node ./build/server"
Within the Zeit/Now environment, is there any way to view/interact with these files, via ssh or some other method?
Nope. That's because you can't access the state of the deployment, only its source and logs!
It makes sense; after all, you should be running a stateless application...
