Executing mongoimport inside code with Javascript/Node.js - javascript

Is there any library available in Node.js/JavaScript that lets you use mongoimport from code?
To my understanding, mongoimport is essentially an executable: you have to run it first before you can use its text input environment.
Is it possible to execute mongoimport from my code and then pass it whatever commands I need directly?
My current algorithm involves:
fs.appendFile('log.txt', JSON.stringify(obj, null, 2), (err) => { if (err) throw err; });
obj is an object built from the res object of Node.js's HTTP response:
var obj = {};
obj.url = hostNames[i];
obj.statusCode = res.statusCode;
obj.headers = res.headers;
Then I use mongoimport to import this JSON doc into my MongoDB.
mongoimport --host localhost --db scrapeapp --collection scrape --file log.txt --jsonArray
This method is obviously inefficient. I would like to do all these steps in one go.
Help appreciated

This is how I do it in my code
let exec = require('child_process').exec
let command = 'mongoimport -d database -c collection --file import.json'
exec(command, (err, stdout, stderr) => {
// check for errors or whether it was successful
cb()
})
I exec the mongoimport command and then call cb so the rest of the code can run; if you are not using an asynchronous style you can do it synchronously with child_process.execSync(command[, options]).
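For the original question, the same idea can be combined with the file write so everything happens in one go. Below is a minimal sketch, assuming the obj and log.txt names from the question and a MongoDB instance on localhost; the importResults helper name is only for illustration:
const fs = require('fs')
const { exec } = require('child_process')

function importResults(results, cb) {
  // Write the collected objects as a JSON array, then shell out to mongoimport.
  fs.writeFile('log.txt', JSON.stringify(results, null, 2), (writeErr) => {
    if (writeErr) return cb(writeErr)
    const command = 'mongoimport --host localhost --db scrapeapp --collection scrape --file log.txt --jsonArray'
    exec(command, (err, stdout, stderr) => {
      if (err) return cb(err)
      cb(null, stdout)
    })
  })
}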

I'm in no way a Node expert, but if you have existing JSON files, you could execute mongoimport in Node as a shell command, as described here or in various answers.

Related

Mongo shell script - How to use libraries?

In a Mongo shell script I need to read a file to delete documents by _id but I can't import the FileReader library.
I launch script from bash to do simple find() and it works:
mongosh --host xxx --port 27017 --username xxx --password xxx --eval "var country='$country';var file='$inputDataFile'" --file scriptFile.js
But whenever I try to import a library in the js it shows the error:
TypeError: Cannot assign to read only property 'message' of object 'SyntaxError: 'import' and 'export' may appear only with 'sourceType
The same js file works when I call it from Node.js; the import is resolved correctly.
Right now I delete the _ids contained in a file using Node.js. I would like to find a way to use a Mongo shell script for all my queries.
I managed to solve my problem with the help of Include External Files and Modules in Scripts
Create package.json in the same directory as your js
Add the necessary libraries to package.json
Install libraries
Use require to import it
Code
var fs = require('fs');

console.log(country);
console.log(file);

db = db.getSiblingDB('mongotest');

const allFileContents = fs.readFileSync(file, 'utf-8');
var lines = allFileContents.split("\r");
for (var i = 0; i < lines.length; i++) {
    const user = JSON.parse(lines[i]);
    var userId = user._id;
    if (!(userId instanceof ObjectId)) {
        userId = new ObjectId(userId);
    }
    db.userChat.deleteOne({ _id: userId });
}
Thanks for your help

How can I use a child process to run a .class file from a parent directory?

In this case, I am using Node.js ChildProcess. Let's say the application file (index.js, for example) is in folder1. This folder also has folder2, which is where the class file is. So, when I call spawn from folder1, the command's current directory is folder1. However, I can't do java ./folder2/MyFile.
Here's what I tried:
const { spawn } = require('child_process');

async function run(path) {
    let child = spawn('java', [path], {
        stdio: [process.stdin, process.stdout, process.stderr] // for testing purposes
    });
}
Using function run on ./folder2/MyFile returns:
Error: could not find or load main class ..folder2.MyFile
I assume this has something to do with java and classpath. I saw an answer involving setting the classpath to the target directory (folder2) but it didn't do anything.
In short, how can I run a .class file from a different directory?
You can use exec instead of spawn so you can chain two commands with the && operator, which runs the second command only when the first one finishes without failing.
I think this might work for you.
const exec = require('child_process').exec;

// Change into folder2 first, then run the class from there.
exec("cd ./folder2 && java MyFile", function (error, stdout, stderr) {
    console.log(stdout);
    console.log(error);
    console.log(stderr);
});
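Alternatively (my own sketch, not part of the answer above), spawn itself can handle this if you either set the child's working directory with the cwd option or pass the class location with java's -cp flag; folder2 and MyFile are the names assumed from the question:
const { spawn } = require('child_process');

// Option 1: tell java where the class lives via the classpath flag.
const child = spawn('java', ['-cp', './folder2', 'MyFile'], { stdio: 'inherit' });

// Option 2: run java from inside folder2 by changing the child's working directory.
// const child = spawn('java', ['MyFile'], { cwd: './folder2', stdio: 'inherit' });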

Execute a JS file (with logs, etc...) inside another NodeJS process

Here is my problem: I want to create a CLI that automatically runs a test. Without the CLI, I'm able to run everything perfectly with the node command:
node test.js
Basically, I want to do the exact same thing as the command before, so I googled for a technique that does this. I found this:
#!/usr/bin/env node
'use strict';
const options = process.argv;
const { execFile } = require('child_process');
const child = execFile('node', ['../dist/test.js'], (error, stdout, stderr) => {
if (error) {
throw error;
}
console.log(stdout);
});
This method doesn't work for me because, in the test.js file, I'm using the ora package. And because this package renders real-time animations, its output doesn't show up in stdout.
Is there any way of executing my test.js in real time (without a subprocess) using Node? I'm open to other methods, but I want to publish the CLI on NPM, so keep in mind that it has to be in JavaScript 😊.
You can find every file I've talked about here on GitHub. Normally, you wouldn't need this link, but I'm giving it to you if you need to have a closer look.
You should simply call your test() function from your CLI code, after requiring the module that defines it. Have a look at mocha and jasmine: you will see that while both tools provide a CLI, they also provide instructions for invoking the test frameworks from arbitrary JS code.
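A rough sketch of that suggestion, assuming test.js exports the function to run (the question doesn't show its contents, so this export is hypothetical):
#!/usr/bin/env node
'use strict';

// Hypothetical: requires that ../dist/test.js ends with something like
//   module.exports = { test };
const { test } = require('../dist/test.js');

// Runs in the current process, so ora's spinner renders on the same terminal.
test();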
I can't think of a way without a sub-process, but this may help.
The child process exec will not work well with commands that produce continuous output, because it buffers the output and the process will halt when that buffer is full.
The more suitable solution is spawn:
var spawn = require('child_process').spawn
var child = spawn('node', ['../dist/test.js'])

child.stdout.on('data', function (data) {
    console.log(data.toString())
})
child.stderr.on('data', function (data) {
    console.log(data.toString())
})
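If the issue is just that ora's animation is lost, a variation of the spawn approach (a sketch, not part of the original answer) is to let the child share the parent's terminal instead of piping its output:
const { spawn } = require('child_process')

// 'inherit' hands the parent's stdin/stdout/stderr to the child, so TTY-based
// animations like ora's spinner render just as they do under `node test.js`.
const child = spawn('node', ['../dist/test.js'], { stdio: 'inherit' })

child.on('close', (code) => {
    console.log(`test.js exited with code ${code}`)
})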
Here is my solution: you can use the fs module to read the code of the file and then simply use eval to execute it in the same process.
const fs = require("fs");
function run(file) {
fs.readFile(file, (err, data) => {
eval(data.toString('utf8'))
})
}

Is there a way to 'pretty' print MongoDB shell output to a file?

Specifically, I want to print the results of a mongodb find() to a file. The JSON object is too large so I'm unable to view the entire object with the shell window size.
The shell provides some nice but hidden features because it's an interactive environment.
When you run commands from a javascript file via mongo commands.js you won't get quite identical behavior.
There are two ways around this.
(1) fake out the shell and make it think you are in interactive mode
$ mongo dbname << EOF > output.json
db.collection.find().pretty()
EOF
or
(2) use Javascript to translate the result of a find() into a printable JSON
mongo dbname command.js > output.json
where command.js contains this (or its equivalent):
printjson( db.collection.find().toArray() )
This will pretty print the array of results, including [ ] - if you don't want that you can iterate over the array and printjson() each element.
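For example, a minimal command.js along those lines, using the legacy shell's cursor forEach:
// Prints each document individually, without the surrounding [ ] of toArray().
db.collection.find().forEach(function (doc) {
    printjson(doc);
});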
By the way if you are running just a single Javascript statement you don't have to put it in a file and instead you can use:
$ mongo --quiet dbname --eval 'printjson(db.collection.find().toArray())' > output.json
Since you are doing this on a terminal and just want to inspect a record in a sane way, you can use a trick like this:
mongo | tee somefile
Use the session as normal - db.collection.find().pretty() or whatever you need to do, ignore the long output, and exit. A transcript of your session will be in the file tee wrote to.
Be mindful that the output might contain escape sequences and other garbage due to the mongo shell expecting an interactive session. less handles these gracefully.
Just put the commands you want to run into a file, then pass it to the shell along with the database name and redirect the output to a file. So, if your find command is in find.js and your database is foo, it would look like this:
./mongo foo find.js >> out.json
Put your query (e.g. db.someCollection.find().pretty()) into a javascript file, let's say query.js. Then run it in your operating system's shell using the command:
mongo yourDb < query.js > outputFile
The query result will be in the file named 'outputFile'.
By default Mongo prints out the first 20 documents, IIRC. If you want more you can set a new batch size in the Mongo shell, e.g.
DBQuery.shellBatchSize = 100.
Using print and JSON.stringify you can simply produce a valid JSON result.
Use --quiet flag to filter shell noise from the output.
Use --norc flag to avoid .mongorc.js evaluation. (I had to do it because of a pretty-formatter that I use, which produces invalid JSON output)
Use DBQuery.shellBatchSize = ? replacing ? with the limit of the actual result to avoid paging.
And finally, use tee to pipe the terminal output to a file:
// Shell:
mongo --quiet --norc ./query.js | tee ~/my_output.json
// query.js:
DBQuery.shellBatchSize = 2000;
function toPrint(data) {
print(JSON.stringify(data, null, 2));
}
toPrint(
db.getCollection('myCollection').find().toArray()
);
Hope this helps!
I managed to save the result with the writeFile() function.
> writeFile("/home/pahan/output.txt", tojson(db.myCollection.find().toArray()))
Mongo shell version was 4.0.9
In the new MongoDB shell 5.0+ (mongosh), the Node.js fs module is integrated, so you can simply do the following in mongosh to pretty print the output:
fs.writeFileSync('output.json', JSON.stringify(db.test.find().toArray(), null, 2))
This avoids problems such as the ObjectId being stripped, etc., which makes it better than printjson or .pretty().
The above works because, as the documentation describes:
The MongoDB Shell, mongosh, is a fully functional JavaScript and Node.js 14.x REPL environment for interacting with MongoDB deployments. You can use the MongoDB Shell to test queries and operations directly with your database.
The old mongo shell is also marked as Legacy, so you should move to this new way.
Using this answer from Asya Kamsky, I wrote a one-line bat script for Windows. The line looks like this:
mongo --quiet %1 --eval "printjson(db.%2.find().toArray())" > output.json
Then one can run it:
exportToJson.bat DbName CollectionName
There is also mongoexport for this, although I'm not sure in which version it became available.
Example:
mongoexport -d dbname -c collection --jsonArray --pretty --quiet --out output.json
As answered by Neodan, mongoexport is quite useful with the -q option for queries. It also converts ObjectId to the standard JSON "$oid" format. E.g.:
mongoexport -d yourdb -c yourcol --jsonArray --pretty -q '{"field": "filter value"}' -o output.json
You can use this command to achieve it:
mongo admin -u <userName> -p <password> --quiet --eval "cursor = rs.status(); printjson(cursor)" > output.json

How to execute shell command in Javascript

I want to write a JavaScript function which will execute the system shell commands (ls for example) and return the value.
How do I achieve this?
I'll answer assuming that when the asker said "Shell Script" he meant Node.js backend JavaScript, possibly using commander.js to frame your code :)
You could use the child_process module from node's API. I pasted the example code below.
var exec = require('child_process').exec;
exec('cat *.js bad_file | wc -l',
function (error, stdout, stderr) {
console.log('stdout: ' + stdout);
console.log('stderr: ' + stderr);
if (error !== null) {
console.log('exec error: ' + error);
}
});
I don't know why the previous answers gave all sorts of complicated solutions. If you just want to execute a quick command like ls, you don't need async/await or callbacks or anything. Here's all you need - execSync:
const execSync = require('child_process').execSync;
// import { execSync } from 'child_process'; // replace ^ if using ES modules
const output = execSync('ls', { encoding: 'utf-8' }); // the default is 'buffer'
console.log('Output was:\n', output);
For error handling, add a try/catch block around the statement.
If you're running a command that takes a long time to complete, then yes, look at the asynchronous exec function.
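A quick sketch of that try/catch, under the assumption that the command may exit with a non-zero code (in recent Node versions the thrown error carries the child's status and stderr):
const { execSync } = require('child_process');

try {
  const output = execSync('ls does-not-exist', { encoding: 'utf-8' });
  console.log('Output was:\n', output);
} catch (err) {
  // execSync throws on a non-zero exit code; the error object holds the details.
  console.error('Command failed:', err.status, err.stderr);
}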
...a few years later...
ES6 has been accepted as a standard and ES7 is around the corner, so it deserves an updated answer. We'll use ES6+ async/await with nodejs+babel as an example; the prerequisites are:
nodejs with npm
babel
Your example foo.js file may look like:
import { exec } from 'child_process';

/**
 * Execute simple shell command (async wrapper).
 * @param {String} cmd
 * @return {Object} { stdout: String, stderr: String }
 */
async function sh(cmd) {
  return new Promise(function (resolve, reject) {
    exec(cmd, (err, stdout, stderr) => {
      if (err) {
        reject(err);
      } else {
        resolve({ stdout, stderr });
      }
    });
  });
}

async function main() {
  let { stdout } = await sh('ls');
  for (let line of stdout.split('\n')) {
    console.log(`ls: ${line}`);
  }
}

main();
Make sure you have babel:
npm i babel-cli -g
Install latest preset:
npm i babel-preset-latest
Run it via:
babel-node --presets latest foo.js
This depends entirely on the JavaScript environment. Please elaborate.
For example, in Windows Scripting, you do things like:
var shell = WScript.CreateObject("WScript.Shell");
shell.Run("command here");
In a nutshell:
// Instantiate the Shell object and invoke its execute method.
var oShell = new ActiveXObject("Shell.Application");

var commandtoRun = "C:\\Winnt\\Notepad.exe";
var commandParms = "";
if (inputparms != "") { // inputparms is assumed to be defined elsewhere on the page
    commandParms = document.Form1.filename.value;
}

// Invoke the execute method.
oShell.ShellExecute(commandtoRun, commandParms, "", "open", "1");
Note: These answers are from a browser-based client to a Unix-based web server.
Run command on client
You essentially can't. Security dictates that JavaScript runs only within the browser, and its access to commands and the filesystem is limited.
Run ls on server
You can use an AJAX call to retrieve a dynamic page, passing in your parameters via a GET.
Be aware that this also opens up a security risk, as you would have to do something to ensure that Mrs. Rogue Hacker does not get your application to run, say: /dev/null && rm -rf / ......
So in a nutshell, running from JS is just a bad, bad idea.... YMMV
With NodeJS it's as simple as this!
And if you want to run this script at each boot of your server, you can have a look on the forever-service application!
var exec = require('child_process').exec;
exec('php main.php', function (error, stdOut, stdErr) {
// do what you want!
});
function exec(cmd, handler = function (error, stdout, stderr) {
    console.log(stdout);
    if (error !== null) {
        console.log(stderr);
    }
}) {
    const childfork = require('child_process');
    return childfork.exec(cmd, handler);
}
This function can be easily used like:
exec('echo test');
//output:
//test
exec('echo test', function(err, stdout){console.log(stdout+stdout+stdout)});
//output:
//testtesttest
Here is a simple example that executes the ifconfig shell command on Linux:
var process = require('child_process');

process.exec('ifconfig', function (err, stdout, stderr) {
    if (err) {
        console.log("\n" + stderr);
    } else {
        console.log(stdout);
    }
});
If you are using npm you can use the shelljs package
To install: npm install [-g] shelljs
var shell = require('shelljs');
shell.ls('*.js').forEach(function (file) {
    // do something
});
See more: https://www.npmjs.com/package/shelljs
Another post on this topic with a nice jQuery/Ajax/PHP solution:
shell scripting and jQuery
In IE, you can do this :
var shell = new ActiveXObject("WScript.Shell");
shell.run("cmd /c dir & pause");
With nashorn you can write a script like this:
$EXEC('find -type f');
var files = $OUT.split('\n');
files.forEach(...
...
and run it:
jjs -scripting each_file.js
As far as I can tell, there is no built-in function, method or otherwise, in the official ECMAScript specification to run an external process. That said, extensions are allowed, see this note from the spec, for example:
NOTE Examples of built-in functions include parseInt and Math.exp. A
host or implementation may provide additional built-in functions that
are not described in this specification.
One such "host" is Node.js which has the child_process module. Let's try this code to execute the Linux shell command ps -aux, saved in runps.js, based on the child_process documentation:
const { spawn } = require('child_process');
const ps = spawn('ps', ['-aux']);

ps.stdout.on('data', (data) => {
  console.log(`stdout: ${data}`);
});

ps.stderr.on('data', (data) => {
  console.error(`stderr: ${data}`);
});

ps.on('close', (code) => {
  console.log(`child process exited with code ${code}`);
});
Which produces the following example output, running it in docker:
$ docker run --rm -v "$PWD":/usr/src/app -w /usr/src/app node:17-bullseye node ./runps.js
stdout: USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
root 1 0.0 0.8 319312 33888 ? Ssl 11:08 0:00 node ./runps.js
root 13 0.0 0.0 6700 2844 ? R 11:08 0:00 ps -aux
child process exited with code 0
The thing I like about this module, is that it's included with the Node.js distribution, no npm install ... needed.
If you search the Node.js code in github for spawn you will find references to the implementation in C or C++ in the engine. Modern browsers like Firefox and Chrome would be reluctant to extend JavaScript with such features, for obvious security reasons, even if the underlying engine such as V8 supports it.
On that note, it's better not to run our container as root. Let's try the above example again, adding a random user this time.
$ docker run --rm -u 7000 -v "$PWD":/usr/src/app -w /usr/src/app node:17-bullseye node ./runps.js
stdout: USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
7000 1 5.0 0.8 319312 33812 ? Ssl 11:19 0:00 node ./runps.js
7000 13 0.0 0.0 6700 2832 ? R 11:19 0:00 ps -aux
child process exited with code 0
Of course that's better but not enough. If this approach is used at all, more precautions must be taken, such as ensuring that no arbitrary user commands can be executed.
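One such precaution (my own sketch, not part of the answer above) is to avoid routing user input through a shell at all, for example with execFile and a fixed argument list:
const { execFile } = require('child_process');

// No shell is involved, so metacharacters like && or ; in userInput are not interpreted.
const userInput = 'some-directory'; // hypothetical value, e.g. taken from a request
execFile('ls', ['-l', userInput], (error, stdout, stderr) => {
  if (error) {
    console.error(`execFile error: ${error}`);
    return;
  }
  console.log(stdout);
});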
Windows 10
My version of Windows 10 still has Windows Script Host which can run JScript on the console with the wscript.exe or cscript.exe programs, i.e. no browser needed. To try it out you can open a PowerShell Windows Terminal. Save the following code into a file which you can call shell.js:
WScript.StdOut.WriteLine("Hallo, ECMAScript on Windows!");
WScript.CreateObject("WScript.Shell").run("C://Windows//system32//mspaint.exe");
And on the command line, run:
cscript .\shell.js
Which shows the following and opens Paint:
Microsoft (R) Windows Script Host Version 5.812
Copyright (C) Microsoft Corporation. All rights reserved.
Hallo, ECMAScript on Windows!
Other variations exist. Find the documentation applicable to your preferred JavaScript runtime environment.
const fs = require('fs');

function ls(startPath) {
    fs.readdir(startPath, (err, entries) => {
        console.log(entries);
    })
}

ls('/home/<profile_name>/<folder_name>')
The startPath used here follows the layout of a Debian-based distro.
Js file
var oShell = new ActiveXObject("Shell.Application");
oShell.ShellExecute("E:/F/Name.bat","","","Open","");
Bat file
powershell -Command "& {ls | Out-File -FilePath E:F/Name.txt}"
Run the js file with node namefile.js
const fs = require('fs')
fs.readFile('E:F/Name.txt', (err, data) => {
    if (err) throw err;
    console.log(data.toString());
})
You can also do everything in one solution with an asynchronous function, as sketched below. Running shell commands directly like this can raise security concerns, though.
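A sketch of what that single asynchronous solution could look like, using util.promisify around exec (standard Node APIs; the runLs name is only illustrative):
const util = require('util')
const exec = util.promisify(require('child_process').exec)

async function runLs() {
  try {
    const { stdout, stderr } = await exec('ls')
    console.log('stdout:', stdout)
    if (stderr) console.error('stderr:', stderr)
  } catch (err) {
    console.error('Command failed:', err)
  }
}

runLs()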
