Meteor - How to Run Multiple Server Processes Simultaneously? - javascript

My Meteor app needs to run 13 separate server processes, each on a setInterval. Essentially, I am pinging 13 different external APIs for new data, and performing calculations on the response and storing the results in Mongo. Each process looks something like this:
Meteor.setInterval(function () {
  var response;
  try {
    response = Meteor.http.call("GET", <url>, {
      params: <params>,
      timeout: <timeout>
    });
  }
  catch (err) {
    console.log(err);
    return; // skip this tick if the request failed
  }
  if (response.statusCode === 200) {
    // handle the response
    ...
  }
}, 10000);
Unfortunately, Meteor chokes up after only three of these interval functions are turned on and running side by side. I start getting socket hang-up errors and "JS Allocation Failed" errors thrown in the console. I presume this has something to do with Node's single-threading. Does anybody know the solution for this? I've looked long and hard... I'm really wondering if I have to split the back-end out from 1 Meteor app with 13 processes (which doesn't seem to run) into 13 Meteor (or Node.js) apps, each with 1 process. Thanks!

Try https://atmospherejs.com/vsivsi/job-collection
Benefits:
Jobs can be added to a queue, and you have granular control over when they succeed or fail... failed jobs can easily be re-queued.
Work is automatically distributed across all of your Meteor processes that are tied to the same collection.
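A rough sketch of how that might look for the polling use case. This assumes the job-collection API as I recall it (JobCollection, startJobServer, processJobs); the collection name, job type, and handler body are made up, so check the package docs for exact signatures:
var apiJobs = new JobCollection('apiJobs'); // Mongo-backed job queue (assumed API)

Meteor.startup(function () {
  apiJobs.startJobServer(); // begin processing jobs stored in this collection

  // Queue one repeating job per external API instead of 13 raw setIntervals.
  new Job(apiJobs, 'pingApi', { url: <url> })
    .repeat({ wait: 10000 })
    .save();

  // Any Meteor process attached to the same collection can pick up this work.
  apiJobs.processJobs('pingApi', function (job, callback) {
    try {
      var response = Meteor.http.call("GET", job.data.url, { timeout: 5000 });
      // ...handle the response and store the results in Mongo...
      job.done();
    } catch (err) {
      job.fail(err.message); // failed jobs can be retried / re-queued
    }
    callback();
  });
});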

Status update: a large part of the problem has to do with Node being single-threaded. I solved the CPU limitation by splitting this monolithic Meteor app into 13 Meteor microservice apps, all connected to the same MongoDB replica set.
This way, all of the CPU's cores are utilized, rather than Meteor trying to handle every request and process on just one.

Related

Should I multithread my Node JS web server?

I have a simple Node.js HTTPS web server using socket.io. Users can send lots of different information to the server. Within a few seconds, the web server may receive multiple one-line objects (e.g. { code: '124' }) from one user, and there may be multiple users doing this at once. The web server then returns information to all users in that socket.io room.
As this information comes in, the web server intermittently saves the data to a simple MySQL database, although I am throttling saves so that there aren't lots of small MySQL writes per second.
My thought process is that as more users log on and connect to the web server, the code may become blocked since Node.js is single-threaded, and this may cause a lag in getting information back to users in real time, or it may cause problems with the latest data being saved to the database. I was thinking of doing something like this so that the database updates are handled by a separate worker thread -
const { Worker } = require('worker_threads'); // needed for the Worker class

socket.on('data', function(msg) {
  try {
    const newWorker = new Worker('./src/worker.js');
    newWorker.on('message', function(result) {
      io.to(`${socketID}`).emit('newData', result);
    });
    newWorker.on('error', (err) => {
      io.to(`${socketID}`).emit('newData', { error: err });
      console.dir(err);
    });
    newWorker.postMessage(msg);
  } catch (e) {
    console.log(e);
    io.to(`${socketID}`).emit('criticalError', "We ran into an error - try refreshing");
  }
});
I know, however, that async operations are handled in the background by Node.js and that Node.js is generally quite performant.
My question is: given that there may be multiple database writes happening simultaneously, as well as multiple pieces of information coming in that then need to be sent back to users in any given second, does it make sense to use worker threads to make this kind of processing multithreaded? Or is Node.js capable of handling all of this in the background without me needing to worry?
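For reference, the snippet above assumes a ./src/worker.js built on worker_threads. A minimal sketch of what that file might look like (the database-write logic is a hypothetical placeholder):
// ./src/worker.js - minimal sketch, assuming worker_threads
const { parentPort } = require('worker_threads');

parentPort.on('message', (msg) => {
  // Hypothetical placeholder for the MySQL write / calculation work.
  const result = { saved: true, code: msg.code };
  // Send the result back to the main thread, which emits it to the socket.io room.
  parentPort.postMessage(result);
});
Note that spawning a new Worker per incoming message is relatively expensive; a long-lived worker or a small worker pool is usually preferable.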

How to run child process in Mean Stack

I have a MEAN application which uses Node.js, AngularJS and Express.js.
Here I call my server from the Angular controller as below:
Angular Controller.js
$http.post('/sample', $scope.sample).then(function (response) {
  --
  --
});
and in Server.js as below:
app.post('/sample', userController.postsample);
In that postsample handler I do my MongoDB operations.
Where I am stuck is the calculation part: I have a big calculation which takes some time (assume 1 hour) to complete, and I trigger it from the client side through my Angular controller.
My problem is that the calculation should run separately, so that the UI and operations on other pages are not interrupted.
I have seen child processes in Node.js, but I don't understand how to trigger or exec a child process from the controller, and whether, while app.post is handling that request, it is still possible to access other pages.
EDIT:
I have planned to spawn a child_process, but I have another problem continuing from the above.
Let's say the application has 3 users and 2 of them are using it at the same time.
The first person triggers the child_process (call it the 1st operation) and it is in progress; at that moment the second person also needs to calculate, so they trigger the process as well (the 2nd operation).
Here my questions are:
What happens when another person starts the spawn command? Does it hang, get queued, or do both execute in parallel?
If the 2nd operation is queued, when will it start?
If the 2nd operation is queued, how can I know how many operations are in the queue at a given point in time?
Can anyone help me solve this?
Note: the question was edited - see updates below.
You have a few options here.
The most straightforward way would be to spawn the child process from your Express controller and return the response to the client once the calculation is done. But if it takes that long, you may have problems with socket timeouts etc. This will not block your server or the client (as long as you don't use any "Sync" functions on the server or synchronous AJAX on the client), but the connection will hang for a very long time.
Another option would be to use WebSocket or Socket.io for those requests. The client could post a message to the server saying it wants some computation started; the server could spawn the child process, go on doing other things, and when the child returns, just send a message back to the client. The disadvantage is that it's a new way of communicating, but at least there would be no problems with timeouts.
To see how to combine WebSocket or Socket.io with Express, see this answer that has examples for both WebSocket and Socket.io - it's very simple actually:
Differences between socket.io and websockets
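A minimal sketch of the Socket.io variant described above, using child_process.fork for the worker (the event names and the ./calc.js worker file are made up for illustration):
const child_process = require('child_process');
const app = require('express')();
const http = require('http').createServer(app);
const io = require('socket.io')(http);

io.on('connection', (socket) => {
  socket.on('startCalculation', (params) => {
    // fork() gives us a child Node process with a built-in message channel.
    const child = child_process.fork('./calc.js');
    child.send(params);
    child.on('message', (result) => {
      socket.emit('calculationDone', result); // notify only the client that asked
    });
    child.on('error', (err) => {
      socket.emit('calculationError', { error: err.message });
    });
  });
});

http.listen(3000);
In ./calc.js the child would listen with process.on('message', ...) and reply with process.send(result) when the calculation is finished.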
Either way, to spawn a child process you can use:
spawn
exec
execFile
fork
from the core child_process module. Just make sure to never use any functions with "Sync" in their name for what you want to do because those would block your server from serving other requests for the entire time of waiting for the child to finish - which may be an hour in your case, but even if it would be a second it could still ruin the concurrency completely.
See the docs:
https://nodejs.org/api/child_process.html
Update
Some update for the edited question. Consider this example shell script:
#!/bin/sh
sleep 5
date -Is
It waits for 5 seconds and prints the current time. Now consider this example Node app:
let child_process = require('child_process');
let app = require('express')();

app.get('/test', (req, res) => {
  child_process.execFile('./script.sh', (err, data) => {
    if (err) {
      return res.status(500).send('Error');
    }
    res.send(data);
  });
});

app.listen(3344, () => console.log('Listening on 3344'));
Or using ES2017 syntax:
let child_process = require('mz/child_process');
let app = require('express')();

app.get('/test', async (req, res) => {
  try {
    res.send((await child_process.execFile('./script.sh'))[0]);
  } catch (err) {
    res.status(500).send('Error');
  }
});

app.listen(3344, () => console.log('Listening on 3344'));
It runs that shell script for requests on GET /test and returns the result.
Now start a few requests at the same time:
curl localhost:3344/test & curl localhost:3344/test & curl localhost:3344/test &
and see what happens. If the returned times differ by 5 seconds and you get one response after another at 5-second intervals, then the operations are queued. If you get all responses at roughly the same time, with more or less the same timestamp, then they all ran in parallel.
Sometimes it's best to make an experiment like this to see what happens.

How to run simultaneous Node child processes

TL;DR: I have an endpoint on an Express server that runs some CPU-bound logic in a child_process. The problem is that if the server gets more than one request for that endpoint, it won't run both requests simultaneously: it queues them up and runs them one at a time. Is there a way to use Node child_process so that my server will run multiple child processes simultaneously?
Long version: The major downfall of Node is that it is single-threaded, and a logic-heavy (CPU-bound) request can make the server stop dead in its tracks so that it can't take any more requests until that logic has finished running. I thought I could work around this using child_process, which is working great in freeing up my server to take other requests. BUT it will only execute child processes one at a time, creating a queue that can get pretty backed up. I also have a Node cluster setup so that my server is split into 8 separate "virtual servers" (8-core machine), so I guess I can technically run 8 of these child processes at once, but I want to be able to handle more traffic than that. I'm looking for a solution that still lets me use Node and Express; please only suggest different technologies if you are absolutely sure this can't be done efficiently in my current environment. Thanks in advance for the help!
Endpoint:
app.get('/cpu-exec-file', function(req, res) {
  child_process.execFile('node', ['./blocking_tasks/mathCruncher.js'], {timeout:30000}, function(err, stdout, stderr) {
    if (err) {
      return res.status(500).send(err.message); // e.g. the 30s timeout was hit
    }
    var data = JSON.parse(stdout);
    res.send(data);
  });
});
mathCruncher.js:
var obj = {};

function myLoop(i) {
  setTimeout(function () {
    obj[i] = Math.random() * 100;
    if (--i) {
      myLoop(i);
    } else {
      var string = JSON.stringify(obj);
      console.log(string); // goes to stdout.
    }
  }, 1000);
}

myLoop(10);
Is there a way to use Node child_process so that my server will perform multiple child processes simultaneously?
Message queue and back-end process.
I do exactly what you're wanting, using RabbitMQ. There are several other great messaging systems out there, like ZeroMQ, and even Redis with some pub/sub libraries on top of it.
The gist of it is to send a request to your queueing system and have another process pick up the message and do the work.
If you need a response from the worker, you can use bi-directional messaging with either a Request/Reply setup, or use status messages for really long-running things.
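A minimal sketch of that pattern, assuming the amqplib client for RabbitMQ (the answer doesn't prescribe a library, and the queue name and task shape here are made up):
// producer side - the Express endpoint pushes work onto a queue instead of running it
const amqp = require('amqplib');

async function enqueue(task) {
  const conn = await amqp.connect('amqp://localhost');
  const ch = await conn.createChannel();
  await ch.assertQueue('math_tasks', { durable: true });
  ch.sendToQueue('math_tasks', Buffer.from(JSON.stringify(task)), { persistent: true });
  await ch.close();
  await conn.close();
}

// worker side - a separate Node process; run as many copies in parallel as you need
async function startWorker() {
  const conn = await amqp.connect('amqp://localhost');
  const ch = await conn.createChannel();
  await ch.assertQueue('math_tasks', { durable: true });
  ch.prefetch(1); // at most one unacknowledged task per worker at a time
  ch.consume('math_tasks', (msg) => {
    const task = JSON.parse(msg.content.toString());
    // ...do the CPU-bound work here, then report the result back...
    ch.ack(msg);
  });
}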
If you're interested in the RabbitMQ side of things, I have a free email course on various patterns with RabbitMQ, including Request/Reply and status messages: http://derickbailey.com/email-courses/rabbitmq-patterns-for-applications/
And if you're interested in ground-up training on RabbitMQ with Node, check out my training course at http://rabbitmq4devs.com

In NodeJS, how do I re-establish a socket connection with another server that may have gone down?

So, I have an Express Node.js server that makes a connection to another app via an upgraded WebSocket URI for a data feed. If this app goes down, the WebSocket connection obviously gets closed. I need to reconnect to this URI once the app comes back online.
My first approach was to use a while loop in the socket.onclose function to keep attempting the re-connection once the app comes back online, but this didn't seem to work as planned. My code looks like this:
socket.onclose = function() {
  while (socket.readyState != 1) {
    try {
      socket = new WebSocket("URI");
      console.log("connection status: " + socket.readyState);
    }
    catch (err) {
      // send message to console
    }
  }
};
This approach keeps giving me a socket.readyState of 0, even after the app behind the URI is back online.
Another approach I took was to use the JavaScript setTimeout function to attempt the connection with an exponential backoff algorithm. Using this approach, my code in the socket.onclose function looks like this:
socket.onclose = function() {
  var time = generateInterval(reconnAttempts); // generateInterval generates a random delay based on the exponential backoff algorithm
  setTimeout(function() {
    reconnAttempts++; // another attempt, so increment reconnAttempts
    socket = new WebSocket("URI");
  }, time);
};
The problem with this attempt is that if the app is still offline when the socket connection is attempted, I get the following error, for obvious reasons, and the node script terminates:
events.js:85
throw er; // Unhandled 'error' event
Error: connect ECONNREFUSED
at exports._errnoException (util.js:746:11)
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1010:19)
I also began using the forever node module to ensure that my node script is always running and to make sure it gets restarted after an unexpected exit. Even though I'm using forever, after a few restarts, forever just stops the script anyway.
I am basically just looking for a way to make my NodeJS server more robust and automatically re-connect with another server that may have gone down for some reason, instead of having to manually restart the node script.
Am I completely off base with my attempts? I am a noob when it comes to NodeJS so it may even be something stupid that I'm overlooking, but I have been researching this for a day or so now and all of my attempts don't seem to work as planned.
Any suggestions would be greatly appreciated! Thanks!
A few suggestions:
1) Start using domains, which prevent your app from terminating unexpectedly; i.e. your app runs inside the domain's run method. You can add an alert mechanism such as email or SMS to notify you when an error occurs.
2) Start using socket.io for the WebSocket communication; it handles reconnection automatically. Socket.io uses a keep-alive heartbeat and continuously polls the server.
3) Start using pm2 instead of forever. pm2 allows clustering for your app, which improves performance.
I think this will improve your app's performance, stability and robustness.
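As for the crash itself, the unhandled 'error' event is what terminates the script, so attaching an error handler and retrying from the 'close' event is usually enough. A rough sketch, assuming the ws client module (the URI and backoff numbers are placeholders):
const WebSocket = require('ws'); // assumption: the 'ws' client module

let reconnAttempts = 0;

function connect() {
  const socket = new WebSocket("URI");

  socket.on('open', () => {
    reconnAttempts = 0; // connected again, so reset the backoff
  });

  // Without this handler, a refused connection emits an unhandled
  // 'error' event and terminates the process.
  socket.on('error', (err) => {
    console.log('connection error: ' + err.message);
  });

  socket.on('close', () => {
    reconnAttempts++;
    const delay = Math.min(1000 * Math.pow(2, reconnAttempts), 30000); // capped exponential backoff
    setTimeout(connect, delay);
  });
}

connect();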

Node/Express pending request

I'm a bit new on the Node.js front and so far it's awesome. I've encountered a small problem while running Node (with Express) locally: every request after the 10th one hangs and is marked as Pending in the Chrome Inspect Network tab.
As for modules, I'm using less-middleware, express, jade and MySQL, and I only do one SQL query (using mysql.createPool). Why is this request stuck as Pending and how can I fix it?
Since I'm new at Node I'm not sure if I've tried everything, so any help would be appreciated!
It sounds like you're not releasing the MySQL connections you're getting back from the pool. If you don't, the pool runs out of free connections (the default connectionLimit is 10, which matches your 10 requests) and starts waiting for one to become available, stalling the request until then.
So your code should look similar to this:
var pool = mysql.createPool(...);
...
// in your request handler:
pool.getConnection(function(err, connection) {
  if (err) ...handle error...;
  connection.query('SELECT ...', function(err, results) {
    // release the connection back to the pool
    connection.release();
    // handle results
    ...
    // send back a response
    res.send(...);
  });
});
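As a side note, if you only run a single query per request, the mysql module's pool.query() shortcut acquires and releases the connection for you. A minimal sketch (the SELECT is just a placeholder):
// pool.query() = getConnection() + query() + release() in one call
pool.query('SELECT ...', function (err, results) {
  if (err) return res.status(500).send('Database error');
  res.send(results);
});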
