Node.js backend calling a Python function - javascript

I am in a bit of a fix - I have a node.js app running on my backend, and I have a chunk of code written in Python. This Python program needs to be running constantly in the background, and I need to call a function in this program from the JavaScript code on an API call from some client.
I was thinking of running the Python program as a daemon, but I could not find anything on how I would call a function on that daemon from my JavaScript code.
I have never worked with daemons either, so at this point, I'm clueless. I would like to know if something like this is possible.
The only other option I can think of is to switch to Django and maintain the data as part of the Django app itself. I could do this, but I would prefer not to. I cannot rewrite the Python code in JavaScript because it depends on some libraries that I couldn't find on npm.
If anyone has faced this problem before, please let me know. Thank you!

Here's a simple example (after pip install flask). Assume the function is "is this a real word?", and the heavy prep task is loading the dictionary. Here's the code:
from flask import Flask, request, jsonify

app = Flask(__name__)

# heavy loading
dictionary = frozenset(
    line.rstrip('\n') for line in open('/usr/share/dict/words'))

# lightweight processing
@app.route('/real_word', methods=['POST'])
def real_word():
    result = request.form['word'] in dictionary
    return jsonify(result)

# quick-and-dirty start
if __name__ == '__main__':
    app.run(host='127.0.0.1', port=7990)
When you run it, you can send a request to http://127.0.0.1:7990/real_word with your word as a POST parameter. For example, assuming npm install request:
var request = require('request');

function realWord(word) {
    return new Promise(function(fulfill, reject) {
        request.post(
            'http://127.0.0.1:7990/real_word', {
                form: {
                    word: word
                }
            },
            function (error, response, body) {
                if (!error && response.statusCode == 200) {
                    fulfill(JSON.parse(body));
                } else {
                    reject(error);
                }
            }
        );
    });
}
realWord("nuclear").then(console.log); // true
realWord("nucular").then(console.log); // false
(Obviously, in an example this simple, reading a list of words is hardly "heavy", and there's really no reason to JSONify a single boolean; but you can take exactly the same code structure and apply it to wrap pretty much any function, with any kind of input/output you can serialise.)
If it's just for your own needs, you can run the Python program as-is; if you want something production-quality, look up how to host a Flask app on a WSGI server, such as Gunicorn or mod_wsgi.

You can use Node's child_process module to call the Python script, and you can even pipe the result back to JavaScript if you wish.
https://nodejs.org/api/child_process.html
https://medium.freecodecamp.com/node-js-child-processes-everything-you-need-to-know-e69498fe970a
javascript.js
var exec = require('child_process').exec;

var cmd = 'python python.py'; // insert any command-line arguments here
exec(cmd, function(error, stdout, stderr) {
    console.log(stdout);
});
python.py
print("Heyo")
Even though this example only captures a single output from the console, you could implement a stream.

Related

Which is the easiest way to allow communication between NodeJS and a Python script [duplicate]

Node.js is a perfect match for our web project, but there are a few computational tasks for which we would prefer Python. We also already have Python code for them.
We are highly concerned about speed; what is the most elegant way to call a Python "worker" from node.js in an asynchronous, non-blocking way?
This sounds like a scenario where zeroMQ would be a good fit. It's a messaging framework that's similar to using TCP or Unix sockets, but it's much more robust (http://zguide.zeromq.org/py:all)
There's a library that uses zeroMQ to provide a RPC framework that works pretty well. It's called zeroRPC (http://www.zerorpc.io/). Here's the hello world.
Python "Hello x" server:
import zerorpc

class HelloRPC(object):
    '''pass the method a name, it replies "Hello name!"'''
    def hello(self, name):
        return "Hello, {0}!".format(name)

def main():
    s = zerorpc.Server(HelloRPC())
    s.bind("tcp://*:4242")
    s.run()

if __name__ == "__main__":
    main()
And the node.js client:
var zerorpc = require("zerorpc");

var client = new zerorpc.Client();
client.connect("tcp://127.0.0.1:4242");

// calls the method on the python object
client.invoke("hello", "World", function(error, reply, streaming) {
    if (error) {
        console.log("ERROR: ", error);
    }
    console.log(reply);
});
Or vice-versa, node.js server:
var zerorpc = require("zerorpc");

var server = new zerorpc.Server({
    hello: function(name, reply) {
        reply(null, "Hello, " + name, false);
    }
});

server.bind("tcp://0.0.0.0:4242");
And the Python client:
import sys
import zerorpc

c = zerorpc.Client()
c.connect("tcp://127.0.0.1:4242")
name = sys.argv[1] if len(sys.argv) > 1 else "dude"
print(c.hello(name))
For communication between node.js and a Python server, I would use Unix sockets if both processes run on the same machine and TCP/IP sockets otherwise. As a marshalling protocol I would take JSON or Protocol Buffers. If threaded Python turns out to be a bottleneck, consider using Twisted Python, which provides the same event-driven concurrency as node.js does.
If you feel adventurous, learn Clojure (ClojureScript, clojure-py) and you'll get the same language that runs and interoperates with existing code on the JVM, JavaScript (node.js included), the CLR and Python. And you get a superb marshalling protocol by simply using Clojure data structures.
If you arrange to have your Python worker in a separate process (either a long-running server-type process or a spawned child on demand), your communication with it will be asynchronous on the node.js side. UNIX/TCP sockets and stdin/stdout/stderr communication are inherently async in node.
I've had a lot of success using thoonk.js along with thoonk.py. Thoonk leverages Redis (an in-memory key-value store) to give you feed (think publish/subscribe), queue and job patterns for communication.
Why is this better than Unix sockets or direct TCP sockets? Overall performance may decrease a little, but Thoonk provides a really simple API that saves you from dealing with sockets manually. Thoonk also makes it trivial to implement a distributed computing model, allowing you to scale your Python workers to increase performance: you just spin up new worker instances and connect them to the same Redis server.
I'd also consider Apache Thrift (http://thrift.apache.org/).
It can bridge between several programming languages, is highly efficient and has support for async or sync calls. See full features here http://thrift.apache.org/docs/features/
The multi-language support can be useful for future plans; for example, if you later want to do part of the computational task in C++, it's very easy to add it to the mix using Thrift.
I'd recommend using a work queue such as the excellent Gearman, which will give you a great way to dispatch background jobs and asynchronously fetch their results once they're processed.
The advantage of this approach, used heavily at Digg (among many others), is that it provides a strong, scalable and robust way for workers in any language to speak with clients in any language.
Update 2019
There are several ways to achieve this; here is the list in increasing order of complexity:
1. Python shell: you write streams to the Python console and it writes back to you
2. Redis pub/sub: you can have a channel listening in Python while your node.js publisher pushes data
3. WebSocket connection, where Node acts as the client and Python acts as the server, or vice-versa
4. API connection, with Express/Flask/Tornado etc. running separately, each exposing an API endpoint for the other to query
Approach 1: Python shell (the simplest approach)
source.js file
const ps = require('python-shell')

// very important to add the -u option since our python script runs infinitely
var options = {
    pythonPath: '/Users/zup/.local/share/virtualenvs/python_shell_test-TJN5lQez/bin/python',
    pythonOptions: ['-u'], // get print results in real-time
    // make sure you use an absolute path for scriptPath
    scriptPath: "./subscriber/",
    // args: ['value1', 'value2', 'value3'],
    mode: 'json'
};

const shell = new ps.PythonShell("destination.py", options);

function generateArray() {
    const list = []
    for (let i = 0; i < 1000; i++) {
        list.push(Math.random() * 1000)
    }
    return list
}

setInterval(() => {
    shell.send(generateArray())
}, 1000);

shell.on("message", message => {
    console.log(message);
})
destination.py file
import json
import logging
import sys

import numpy
import talib

logging.basicConfig(format='%(asctime)s : %(levelname)s : %(message)s', level=logging.INFO)

def get_indicators(values):
    # Return the RSI of the values sent from node.js
    numpy_values = numpy.array(values, dtype=numpy.double)
    return talib.func.RSI(numpy_values, 14)

for line in sys.stdin:
    values = json.loads(line)
    print(get_indicators(values))
    # Without this step the output may not be immediately available in node
    sys.stdout.flush()
Notes: Make a folder called subscriber at the same level as the source.js file and put destination.py inside it. Don't forget to change pythonPath to your own virtualenv environment.

sails.js how to call a REST API from a script using restler or not?

I haven't figured out how to get a response from restler in a sails.js script. The goal is to import some data from a REST API into my database each day, so I chose a sails.js script so I could run the call from a crontab.
While searching for something I could use to contact the API, I came across a topic about restler, which seems pretty easy to use.
Sadly, when I try it, I cannot catch the response from the API. I use this basic example:
module.exports = {
    friendlyName: 'test',
    description: 'this is a test',
    fn: async function () {
        sails.log('Running custom shell script... (`sails run test`)');
        var rest = require('restler');
        rest.get('http://google.com').on('complete', function(result) {
            sails.log("In callback")
            if (result instanceof Error) {
                sails.log('Error:', result.message);
                this.retry(5000); // try again after 5 sec
            } else {
                sails.log(result);
            }
        });
        sails.log('finished');
    }
};
When I run the script, this is what I get:
info: Initializing hook... (`api/hooks/custom`)
info: Initializing `apianalytics` hook... (requests to monitored routes will be logged!)
info: ·• Auto-migrating... (alter)
info: Hold tight, this could take a moment.
info: ✓ Auto-migration complete.
debug: Running custom shell script... (`sails run test`)
debug: finished
I tried other methods and other URLs too, but apparently I never get into the callback.
So I assume that the script doesn't "wait" for the .on('complete') before continuing its execution.
On database calls I know to use .then to avoid that; I think this is called a promise. So from what I understand the problem is around that, but sadly, after searching, I haven't found any answer that solves my issue.
I am not married to restler; if another, better solution exists, don't hesitate to suggest it. The goal, again, is to fetch content from a REST API in a sails.js script.
Thank you for the time you take to answer my question.

Basic Node.js question with regards to serverless functions

I am developing an Alexa Skill for the first time. For the Fulfillment sections, I planned to hook them up to serverless functions (written in Node.js) on Azure. I developed the intents with Google's Dialogflow which I plan to export to Amazon Alexa's console. I am a C# programmer but willing to learn Node.js/Javascript and have a few basic questions.
I installed the azure-functions-core-tools package from GitHub and used it to create my serverless function in Node.js. It created the file below.
The name of my function is HelloWorld.
The index.js file below was created by the func new command and contains the following:
module.exports = async function (context, req) {
    context.log('JavaScript HTTP trigger function processed a request.');
    if (req.query.name || (req.body && req.body.name)) {
        context.res = {
            // status: 200, /* Defaults to 200 */
            body: "Hello " + (req.query.name || req.body.name)
        };
    }
    else {
        context.res = {
            status: 400,
            body: "Please pass a name on the query string or in the request body"
        };
    }
};
My questions are
1) Are you limited to one function per file?
2) Can I have multiple functions in the same file?
3) If so, how is that possible because just looking at this, there is no name for this function?
4) If not, how can I call other functions outside this function?
5) I am taking an online class on Node.js but wonder if I really should take a Javascript class instead. What would you recommend?
Many Thanks
Answers:
1. No.
2. Yes. You basically need to search for the basic syntax using a search engine (for example, google.com).
3. Because some js file/module will use this one as a module, importing it with require and then calling it as a method. For example:

// file name - run.js
var asyncFunction = require('./index.js'); // This fetches the function, as it's exported in index.js
asyncFunction(); // this executes the function.

4. You use exports and require.
5. Node.js is a runtime environment that uses JavaScript as its programming language, so it basically depends on the content of that course.
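To make answer 2 concrete: several named functions can live in one file and be exported together as a single object. A minimal sketch (mathUtils, add and square are hypothetical names for illustration):

```javascript
// mathUtils.js - one file, several named functions, exported together.
function add(a, b) {
    return a + b;
}

function square(x) {
    return x * x;
}

// In a real file this line would be: module.exports = { add, square };
const mathUtils = { add: add, square: square };

// A consumer would then write: const { add, square } = require('./mathUtils');
console.log(mathUtils.add(2, 3));  // 5
console.log(mathUtils.square(4));  // 16
```

The anonymous async function in index.js is just the special case where the whole module exports a single function instead of an object of them.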

Node.js alternative to PHP's exec

I want to write a piece of C software (like prime number factorization) and place it on my web server (based on Node.js).
Then I would like to have an HTML-document with a form and a button.
When the button is clicked, the C program should be executed (on the server) with a text field of the form passed as an input parameter. The output of the C program should then be written back to an HTML text field.
In PHP something like this is possible:
<?php
exec('~/tools/myprog.o');
?>
How would I go about something like this in Node.js (Express.js is fine as well)? Especially the input/output redirection?
I feel that your question has two parts:
1. PHP's exec equivalent for node.js
2. Executing a C program and serving its output using node.js
Sticking to pure node.js, you can execute a command and get access to its stdout and stderr with the following code:
var exec = require('child_process').exec;

exec("ls -l", function(error, stdout, stderr) {
    console.log(error || stdout || stderr);
});
For #2, I am not sure if you can directly execute a C program, so I guess you'd need to do something like this:
var exec = require('child_process').exec;

exec("gcc ~/tools/myprog.c -o prime && ./prime", function(error, stdout, stderr) {
    console.log(error || stdout || stderr);
});
The above code is definitely not thread-safe, and it will break if concurrent requests are made. Ideally you would not compile the program for every request: compile and cache it on the first request (compare the timestamps of the source and the compiled output; if the compiled output is absent, or its timestamp is earlier than the source code's, you need to recompile), then use the cached binary on subsequent requests.
In Express you can do something like the following:
var exec = require('child_process').exec;

router.get('/prime/:num', function(req, res, next) {
    exec("gcc ~/tools/myprog.c -o prime && ./prime " + req.params.num, function(error, stdout, stderr) {
        if (error) {
            res.status(400).json({'message': error.message});
            return;
        }
        if (stderr) {
            res.status(400).json({'message': stderr});
            return;
        }
        res.status(200).json({'result': stdout});
    });
});
You can play with the above code snippets to get what you are looking for.
Hope this helps. Please let me know if you need anything else.

Stream a command's response up to node

I'm trying to make a node package that executes a long-running script which keeps printing data, and passes that output on to the package caller.
I use exec to run the command:
var exec = require('child_process').exec;
// ...
exec("script always", function(error, stdout, stderr) {
    if (error instanceof Error) {
        throw error;
    }
    // here's where everything gets messy
    if (callback)
        callback(stream)
});
This command keeps printing data until I stop it. Now, how can I stream this console output to a node event that a package consumer can subscribe to? I've been reading about Readable streams, but I don't know how to tie it all together (the docs actually use a placeholder function called getReadableStreamSomehow() for the sake of the examples).
You should use spawn, not exec, as described here. It returns an object that gives you direct access to the program's streams, so you can easily pipe them, for example, or subscribe to their data events.
See this example for a sample of how to capture the output of the curl command.
Hope this helps :-)
