I have a Node.js application and I need to run a Python script in order to get a certain response. I am using python-shell to do that, but I am getting no response.
I have also tried using a child process, with the same result.
Here is where I call the Python script:
var ps = require('python-shell');

ps.PythonShell.run('./face_detect.py', array1, function (err, data) {
    if (err) req.send(err);
    req.send(data.toString());
});
This is a snippet of my Python script:
import cv2
import sys
import os
import numpy as np

students = sys.argv[1]
# get the names and put them in an array ---> subjects

imagePath = "class/welcome.jpg"
cascPath = "haarcascade_frontalface_alt.xml"
faceCascade = cv2.CascadeClassifier(cascPath)

.....

for (x, y, w, h) in faces:
    num = 0
    crop_img = cv2.UMat(image[y-40:y+h+100, x-40:x+h+40])
    cv2.imwrite("face" + str(num) + ".jpg", crop_img)
    test_img = cv2.imread("face" + str(num) + ".jpg")
    num = num + 1
    predicted_img1 = predict(test_img)
    absences["A"] = 1

for name, a in absences.items():
    if a == 0:
        noshow.append(name)

print(noshow)
cv2.waitKey(0)
I expect it to return an array.
Can anyone help me with this?
The correct syntax for passing arguments from the Node.js python-shell package to a Python script is:
ps.PythonShell.run('./face_detect.py', { args: array1 }, function (err, data) { ... })
Here the value of sys.argv[1] in your Python script will not contain the Node.js array1 value, because you don't set the args property in your PythonShell options.
Note also that this should probably be res.send instead of req.send, depending on your program, and I advise you to return if there is an error, to prevent a "headers already sent" exception.
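Putting those fixes together, a minimal sketch of the corrected handler might look like this; the Express app and the '/attendance' route are assumptions for illustration, and array1 stands in for the real student list from the question:

const express = require('express');
const ps = require('python-shell');

const app = express();

app.get('/attendance', (req, res) => {
    const array1 = ['Alice', 'Bob']; // placeholder for the real student names

    // Pass the arguments via the `args` option, and return after sending the
    // error so the response is only sent once.
    ps.PythonShell.run('./face_detect.py', { args: array1 }, (err, data) => {
        if (err) return res.send(err);
        res.send(data.toString());
    });
});

app.listen(3000);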
If I execute a certain shell command in Node.js, the output goes to the console. Is there a way I can save it in a variable so it can be POSTed to an SQLite database?
const shell = require('shelljs');
shell.exec('arp -a');
In this scenario, I want to store the IP address of a specific MAC/physical address in the database. How can this be done?
Any help would be much appreciated. Thank you.
You need to get the output of the command you're passing to exec. To do that, just read the stdout property, like this:
const shell = require('shelljs');
const stdout = shell.exec('arp -a').stdout;
Then just parse that output to get your IP address:
const entries = stdout.split('\r\n');
// entries sample
[ '',
'Interface: 10.17.60.53 --- 0xd',
' Internet Address Physical Address Type',
' 10.11.10.52 6c-4b-90-1d-97-b8 dynamic ',
' 10.10.11.254 xx-yy-53-2e-98-44 dynamic ']
Then you can filter your wanted address with some more manipulation.
EDIT:
To get the ip address, you could do:
let ipAddr = null;

for (let i = 0; i < entries.length; i++) {
    if (entries[i].indexOf('6c-4b-90-1d-97-b8') > -1) {
        ipAddr = entries[i].match(/\b\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\b/)[0];
        break;
    }
}

console.log(ipAddr); // '10.11.10.52'
I'm merely copy-pasting from the docs. You should research more.
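To tie this back to the original goal of storing the IP in SQLite, a rough sketch could look like the following. The sqlite3 npm package, the network.db file and the hosts table are assumptions, not part of the answers here; the MAC address is the example one used above.

const shell = require('shelljs');
const sqlite3 = require('sqlite3').verbose();

const mac = '6c-4b-90-1d-97-b8';
const stdout = shell.exec('arp -a', { silent: true }).stdout;

// Find the arp entry containing the MAC, then pull the IP address out of it.
const entry = stdout.split(/\r?\n/).find(line => line.indexOf(mac) > -1);
const match = entry && entry.match(/\b\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\b/);
const ipAddr = match ? match[0] : null;

if (ipAddr) {
    const db = new sqlite3.Database('network.db');
    db.serialize(() => {
        db.run('CREATE TABLE IF NOT EXISTS hosts (mac TEXT, ip TEXT)');
        db.run('INSERT INTO hosts (mac, ip) VALUES (?, ?)', [mac, ipAddr]);
    });
    db.close();
}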
You need to add a listener to stdout
var child = exec('arp -a', { async: true });

child.stdout.on('data', function (data) {
    /* ... do something with data ... */
});
Or add the callback directly when calling exec:
exec('some_long_running_process', function (code, stdout, stderr) {
    console.log('Exit code:', code);
    console.log('Program output:', stdout);
    console.log('Program stderr:', stderr);
});
You can access the result of the command run using shell.exec with the .output property. Try the code below.
var shell = require('shelljs');
var result = shell.exec('arp -a').output;
If you don't want the result in the console, you can specify the silent option.
var result = shell.exec('arp -a', {silent: true}).output;
Now, you can use regular expressions to extract the IP and MAC address from the result.
I am getting the result of the command like below:
? (xxx.xxx.xxx.xxx) at xx:xx:xx:xx:xx:xx [ether] on eth0
? (yyy.yyy.yyy.yyy) at yy:yy:yy:yy:yy:yy [ether] on eth0
You can use the following code to extract the IP and MAC:
var res = result.split("\n").map(function (item) {
    return item.match(/\((\d+\.\d+\.\d+\.\d+)\) at (..:..:..:..:..:..)/);
});
console.log(res[0][1]); //IP of first interface
console.log(res[0][2]); //MAC of first interface
console.log(res[1][1]); //IP of second interface
console.log(res[1][2]); //MAC of second interface
NOTE
I was not able to find the .output property in the documentation, but trying the shell.exec function in the Node console revealed it.
The .stdout property and the exec callback mentioned in other answers don't work for me. They give undefined errors.
I'm writing my first application in Node.js. I am trying to read some data from a file where the data is stored in the JSON format.
I get this error:
SyntaxError: Unexpected token in JSON at position 0
at Object.parse (native)
Here is the relevant part of the code:
//read saved addresses of all users from a JSON file
fs.readFile('addresses.json', function (err, data) {
    if (data) {
        console.log("Read JSON file: " + data);
        storage = JSON.parse(data);
Here is the console.log output (and I checked the .json file itself, it's the same):
Read JSON file: {
"addresses": []
}
That seems to me like a correct JSON. Why does JSON.parse() fail then?
You have a strange character at the beginning of the file.
data.charCodeAt(0) === 65279
I would recommend:
fs.readFile('addresses.json', function (err, data) {
    if (data) {
        console.log("Read JSON file: " + data);
        data = data.trim();
        //or data = JSON.parse(JSON.stringify(data.trim()));
        storage = JSON.parse(data);
    }
});
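If that leading character is in fact a byte order mark (U+FEFF, the char code 65279 shown above), an alternative sketch is to read the file as UTF-8 and strip the BOM explicitly before parsing:

const fs = require('fs');

// Read as UTF-8 and drop a leading BOM, if any, before JSON.parse.
fs.readFile('addresses.json', 'utf8', function (err, data) {
    if (err) throw err;
    const storage = JSON.parse(data.replace(/^\uFEFF/, ''));
    console.log(storage.addresses);
});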
JSON.parse() does not allow trailing commas, so you need to get rid of them:
JSON.parse(JSON.stringify(data));
You can find more about it here.
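For illustration, a trailing comma alone is enough to make JSON.parse throw:

JSON.parse('{"addresses": []}');  // parses fine
JSON.parse('{"addresses": [],}'); // throws a SyntaxError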
It might be the BOM[1].
I have done a test by saving a file with content {"name":"test"} with UTF-8 + BOM, and it generated the same error.
> JSON.parse(fs.readFileSync("a.json"))
SyntaxError: Unexpected token in JSON at position 0
And based on a suggestion here [2], you can replace it or drop it before you call JSON.parse().
For example:
var storage = {};

fs.readFile('a.json', 'utf8', function (err, data) {
    if (data) {
        console.log("Read JSON file: " + data);
        console.log(typeof(data));
        storage = JSON.parse(data.trim());
    }
});
or
var storage = {};

fs.readFile('a.json', function (err, data) {
    if (data) {
        console.log("Read JSON file: " + data);
        console.log(typeof(data));
        storage = JSON.parse(data.toString().trim());
    }
});
You can also remove the first 3 bytes (for UTF-8) by using Buffer.slice().
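A small sketch of that Buffer.slice approach, assuming the file is UTF-8 with a three-byte BOM (EF BB BF) at the start:

const fs = require('fs');

// Drop a UTF-8 BOM, if present, before decoding and parsing.
const buf = fs.readFileSync('addresses.json');
const hasBom = buf.length >= 3 && buf[0] === 0xEF && buf[1] === 0xBB && buf[2] === 0xBF;
const storage = JSON.parse((hasBom ? buf.slice(3) : buf).toString('utf8'));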
Try it like this:
fs.readFile('addresses.json', 'utf-8', function (err, data) {
    if (data) {
        console.log("Read JSON file: " + data);
        storage = JSON.parse(data);
    }
});
It's because of the BOM, which requires an encoding to be set before reading the file. This has been reported as an issue in the Node.js repository on GitHub:
https://github.com/nodejs/node-v0.x-archive/issues/186
To further explain @Luillyfe's answer:
Ah-ha! fs.readFileSync("data.json") returns a Javascript object!
Edit: Below is incorrect...But summarizes what one might think at first!
I had thought that was a string. So if the file was saved as UTF-8/ASCII, it would probably not have an issue? The JavaScript object returned from readFileSync would convert to a string JSON.parse can parse? No need to call JSON.stringify?
I am using PowerShell to save the file, which seems to save it as UTF-16 (too busy right now to check). So I get "SyntaxError: Unexpected token � in JSON at position 0."
However, JSON.stringify(fs.readFileSync("data.json")) correctly parses that returned file object into a string JSON can parse.
The clue for me was my JSON file contents looking like the below (after logging it to the console):
�{ " R o o m I D _ L o o k u p " : [
{
" I D " : 1 0 ,
" L o c a t i o n " : " f r o n t " ,
" H o u s e " : " f r o n t r o o m "
}
}
That doesn't seem like something a file would load into a string...
Incorrect being (this does not crash...but instead converts the JSON file to gibberish!):
const jsonFileContents = JSON.parse(JSON.stringify(fs.readFileSync("data.json")));
I can't seem to find this anywhere. But makes sense!
Edit: Um... That object is just a buffer. Apologies for the above!
Solution:
const fs = require("fs");
function GetFileEncodingHeader(filePath) {
const readStream = fs.openSync(filePath, 'r');
const bufferSize = 2;
const buffer = new Buffer(bufferSize);
let readBytes = 0;
if (readBytes = fs.readSync(readStream, buffer, 0, bufferSize, 0)) {
return buffer.slice(0, readBytes).toString("hex");
}
return "";
}
function ReadFileSyncUtf8(filePath) {
    const fileEncoding = GetFileEncodingHeader(filePath);
    let content = null;

    if (fileEncoding === "fffe") {
        // UTF-16 little endian (BOM FF FE)
        content = fs.readFileSync(filePath, "ucs2");
    } else if (fileEncoding === "feff") {
        // UTF-16 big endian (BOM FE FF): swap the bytes, then decode as little endian
        content = fs.readFileSync(filePath).swap16().toString("ucs2");
    } else {
        content = fs.readFileSync(filePath, "utf8");
    }

    // trimStart removes the BOM character...but there may be a better way!
    return content.trimStart();
}
function GetJson(filePath) {
    const jsonContents = ReadFileSyncUtf8(filePath);
    console.log(GetFileEncodingHeader(filePath));

    return JSON.parse(jsonContents);
}
Usage:
GetJson("data.json");
Note: I don't currently have a need for this to be async yet. Add another answer if you can make this async!
As mentioned by TamusJRoyce, I ended up using the util.TextDecoder class to come up with a robust way to read both UTF-8 (without BOM) and UTF-8 (with BOM). Here's the snippet, assuming that the file input.json is UTF-8 (with or without BOM) and contains valid JSON.
const fs = require('fs');
const util = require('util');
const rawdata = fs.readFileSync('input.json');
const textDecoder = new util.TextDecoder('utf-8');
const stringData = textDecoder.decode(rawdata);
const objects = JSON.parse(stringData);
const fs = require('fs');
const myConsole = new console.Console(fs.createWriteStream('./output.json'));
myConsole.log(object);
This will create an output file containing everything that would otherwise have been printed by console.log(object).
This is the easiest way to redirect the console.log() output into a file.
I am trying to make a Python 3 application to download weather data from my account at http://www.osanywhereweather.com. I have found JavaScript source code that does exactly this at https://github.com/zrrrzzt/osanywhereweather. I am assuming that the GitHub code works. When inspecting the source of osanywhereweather.com, it seems to me that the GitHub code resembles it very closely.
I am new to Python 3 and I have never coded in JavaScript, and I know nothing about cryptography. I have, however, done a fair share of coding over the last 35 or so years, so I read code fairly well. I therefore thought it would be relatively easy to translate the GitHub JavaScript code to Python 3. I was wrong, it seems.
The code of interest is the part that hashes the e-mail and password together with a "challenge" received from osanywhereweather.com in order to authenticate me.
I have not been able to test the JavaScript code, but as I said I think it compares well with the source of the osanywhereweather.com page. By analyzing the traffic in my web browser, I can see the information exchanged between osanywhereweather.com and my browser, so that I have got a consistent set of challenge and saltedHash.
When trying to create the same saltedHash based on the corresponding challenge with my Python 3 code, I get a different result.
I have tried internet searches to see if I can find out what I'm doing wrong, but to no avail. If anyone is proficient in JavaScript, Python and cryptography and is able to point out what I'm doing wrong, I would indeed be grateful.
JavaScript code:
'use strict';

var crypto = require('crypto');

function osaHash(email, password) {
    var shasum = crypto.createHash('sha1').update(email);
    var e = '$p5k2$2710$' + shasum.digest('hex').toString().substring(0, 8);
    var res = crypto.pbkdf2Sync(password, e, 1e4, 32, 'sha256');
    var r = res.toString('base64').replace(/\+/g, '.');
    return e + '$' + r;
}

function createHash(opts, callback) {
    if (!opts) {
        return callback(new Error('Missing required input: options'), null);
    }

    if (!opts.email) {
        return callback(new Error('Missing required param: options.email'), null);
    }

    if (!opts.password) {
        return callback(new Error('Missing required param: options.password'), null);
    }

    if (!opts.challenge) {
        return callback(new Error('Missing required param: options.challenge'), null);
    }

    var hash = osaHash(opts.email, opts.password);
    var hmac = crypto.createHmac('sha1', hash).update(opts.challenge);
    var saltedHash = hmac.digest('hex');

    return callback(null, saltedHash);
}

module.exports = createHash;
Python 3 code:
import hmac
import hashlib
import base64
e_mail = 'me@mydomain.com'
password = 'Secret'
''' challenge is received from osanywhereweather.com '''
challenge = '15993b900f954e659a016cf073ef90c1'
shasum = hashlib.new('sha1')
shasum.update(e_mail.encode())
shasum_hexdigest = shasum.hexdigest()
shasum_substring = shasum_hexdigest[0:8]
e = '$p5k2$2710$' + shasum_substring
res = hashlib.pbkdf2_hmac('sha256',password.encode(),e.encode(),10000,32)
r = base64.b64encode(res,b'./')
hashstr = str(e) + '$' + str(r)
hmac1 = hmac.new(challenge.encode(), hashstr.encode(), 'sha1')
saltedHash = hmac1.hexdigest()
hashstr = str(e) + '$' + str(r)
In the above line, str(r) will give you: "b'ZogTXTk8T72jy01H9G6Y0L7mjHHR7IG0VKMcWZUbVqQ='".
You need to use r.decode() to get "ZogTXTk8T72jy01H9G6Y0L7mjHHR7IG0VKMcWZUbVqQ=".
hashstr = str(e) + '$' + r.decode()
UPDATE 1
Arguments to hmac.new should be fixed (the key comes first, then the message):
hmac1 = hmac.new(hashstr.encode(), challenge.encode(), 'sha1')
UPDATE 2
According to OP's comment, OP doesn't need to do the following.
Another thing is that crypto.pbkdf2Sync does not seem to respect the digest argument. It seems to always use the sha1 digest (at least on my system, Node.js 0.10.25). So you would need to specify sha1 on the Python side:
res = hashlib.pbkdf2_hmac('sha1', password.encode(), e.encode(), 10000, 32)
Based on falsetru's response, the following Python 3 code has been verified to work with the osanywhereweather.com site:
import hmac
import hashlib
import base64
e_mail = 'me@mydomain.com'
password = 'Secret'
''' challenge is received from osanywhereweather.com '''
challenge = '15993b900f954e659a016cf073ef90c1'
shasum = hashlib.new('sha1')
shasum.update(e_mail.encode())
shasum_hexdigest = shasum.hexdigest()
shasum_substring = shasum_hexdigest[0:8]
e = '$p5k2$2710$' + shasum_substring
res = hashlib.pbkdf2_hmac('sha256',password.encode(),e.encode(),10000,32)
r = base64.b64encode(res,b'./')
hashstr = str(e) + '$' + r.decode()
hmac1 = hmac.new(hashstr.encode(), challenge.encode(), 'sha1')
saltedHash = hmac1.hexdigest()
Thank you to falsetru!
I have a MongoDB script in a JS file:
query.js
//conn = new Mongo();
//db = conn.getDB("dbName");
functionFoo = function (arg){
//----process arg
}
I also have an array of args, args_array (which I fetch from the database using Mongoid), for which I want to do something like this:
args_array.each do |arg|
  # somehow call functionFoo(arg) from the query.js file
end
Is this possible in Rails?
I am able to execute the file from the terminal, but I want to wrap it in my application so that I can use it from the Rails console.
I know this is an old question, but in case you still need an answer, or anyone else does: this answer works with gem mongo ~> 2.3.
The key point is that you do not need Mongoid in this case. In my case I use it for my Rails models, so I use Mongoid (5.1.0) only to get the DB connection, db = Mongoid.default_client.database, but you can also get/create the database using the mongo gem directly.
To execute JavaScript on the database you need to call the command method, db.command({ eval: 'js' }), for example: db.command({ eval: 'function(n){return db.projects.find({name: n}).toArray();}', args: ['beskhai'], nolock: true })
To get the result you can call .documents, as in db.command(...).documents. The return value is a hash { retval: <the return value of your script>, ok: 1 on success }. The object returned by the command call is a Mongo::Operation::Result (https://github.com/mongodb/mongo-ruby-driver/blob/master/lib/mongo/operation/result.rb).
I'm using Mongoid 6.0.1, and it's easy to run whatever you want like this:
db ||= Mongoid.default_client.database
f = """
functionFoo = function (arg){
//----process arg
}
"""
result = db.command({:$eval => f, args: [arg1, arg2, ...arg_n], nolock: true})
#result_data = result.first['retval']
It's not limited to a single function; you can do anything you want with command.
My example is:
db ||= Mongoid.default_client.database
f = """
var collectionNames = db.getCollectionNames(), stats = [];
collectionNames.forEach(function (n) { stats.push(db[n].stats()); });
stats = stats.sort(function(a, b) { return b['size'] - a['size']; });
return stats;
"""
result = db.command({:$eval => f, args: [], nolock: true})
#result_data = result.first['retval']
When working in Python I always have this simple utility function which returns the file name and line number from where the function is called:
from inspect import getframeinfo, stack

def d():
    """ d stands for Debug. It returns the file name and line number from where this function is called."""
    caller = getframeinfo(stack()[1][0])
    return "%s:%d -" % (caller.filename, caller.lineno)
So in my code I simply put a couple debug lines like this to see how far we get before some error occurs:
print d()
# Some buggy code here
print d()
# More buggy code here
print d(), 'here is the result of some var: ', someVar
This works really well for me because it really helps debugging quickly.
I'm now looking for the equivalent in a node backend script. I was searching around but I can't find anything useful (maybe I'm looking for the wrong words?).
Does anybody know how I can create a Javascript/nodejs function which outputs the file name and line number from where the function is called? All tips are welcome!
You can create an Error to get a stack trace showing where it was created. Put that in a function and parse the stack to get the line where the function is called.
function thisLine() {
    const e = new Error();
    const regex = /\((.*):(\d+):(\d+)\)$/;
    const match = regex.exec(e.stack.split("\n")[2]);

    return {
        filepath: match[1],
        line: match[2],
        column: match[3]
    };
}

console.log(thisLine());
This works for me in Google Chrome.
And also in node.
Note to @j08691's comment:
Both this and this seem to be using lineNumber, which is not present (as far as I could test) in NodeJS.
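For a usage pattern close to the Python d() helper from the question, the thisLine() result can simply be logged next to whatever you want to inspect (someVar below is just a placeholder):

const someVar = 'some value'; // placeholder variable, as in the question
console.log(thisLine(), 'here is the result of some var:', someVar);
// e.g. { filepath: '/path/to/app.js', line: '12', column: '13' } here is the result of some var: some value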
Printing line number with custom string
const moment = require('moment');
const path = require('path');

const log = console.log;

function getTime() { return moment().format('YYYY-MM-DD HH:mm:ss'); }

function line(num = 2) {
    const e = new Error();
    const regex = /\((.*):(\d+):(\d+)\)$/;
    const match = regex.exec(e.stack.split("\n")[num]);
    const filepath = match[1];
    const fileName = path.basename(filepath);
    const line = match[2];
    const column = match[3];

    return {
        filepath,
        fileName,
        line,
        column,
        str: `${getTime()} - ${fileName}:${line}:${column}`
    };
}
log(line().str, "mylog1");
log(line().str, "mylog2");
log(line().str, "mylog3");
OUTPUT
2021-11-22 13:07:15 - test.js:44:5 mylog1
2021-11-22 13:07:15 - test.js:45:5 mylog2
2021-11-22 13:07:15 - test.js:46:5 mylog3
You can use the gulp plugin gulp-log-line. It logs the file and line number without the extra cost of reading the stack.
You just have to install gulp and gulp-log-line using npm install gulp --save and npm install gulp-log-line. After that, create a gulpfile.js with the code below and run gulp line-log to create a duplicate file in the build folder:
var gulp = require('gulp');
var logLine = require('gulp-log-line');

gulp.task('line-log', function () {
    return gulp.src("file.js", { buffer: true })
        //Write here the loggers you use.
        .pipe(logLine(['console.log', 'winston.info']))
        .pipe(gulp.dest('./build'));
});
gulp.task('default', ['line-log'])
Example
file.js:
console.log('First log')
var someVariable
console.log('Second log')
Becomes
console.log('file.js:1', 'First log')
var someVariable
console.log('file.js:3', 'Second log')
The only way I've found to get anything relating to line numbers is to trap the window.onerror function; when there's an error, the handler gets passed the error message, the file URL and the line number:
window.onerror = function (msg, url, line) {
    alert(msg + "\n" + url + ":" + line);
};
This works for me on Chrome - I don't know about other browsers.
EDIT: when this answer was given in Feb '15, there was no mention of Node.js in the question. That was only added in November '17.