NodeJS cluster doesn't recognize the master worker in clustering - javascript

I'm trying to cluster my Node server, so I was testing the example code below.
The code worked the first time I tried it: I created a new JS file, ran the code, and it worked flawlessly.
Then I deleted the 'practice' JS file and moved exactly the same code into my server file to implement it.
Now it never recognizes the first process as the master... I have no idea what might have gone wrong.
I have tried setting process.env.NODE_UNIQUE_ID to undefined, but it won't reset the master! Every time I run this code I get "Application running!" without "worker loop", which should print each time the loop forks a worker, meaning it is not recognizing the first process as the master.
Does anyone know what the problem might be?
const cluster = require('cluster');

if (cluster.isMaster) {
  var cpuCount = require('os').cpus().length;
  for (var i = 0; i < cpuCount; i++) {
    cluster.fork();
    console.log(`worker loop ${i}`);
  }
} else {
  var express = require('express');
  var app = express();
  app.get('/', function (req, res) {
    res.send('Hello World!');
  });
  app.listen(3000);
  console.log('Application running!');
}
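(A side note on the attempted fix, not from the original post: assigning undefined to a process.env property does not remove it, because environment variables are always strings, so the assignment stores the literal string 'undefined'. Actually removing the variable requires delete, as this minimal snippet shows:)

// Environment variables are always strings, so this does NOT unset it:
process.env.NODE_UNIQUE_ID = undefined;
console.log(process.env.NODE_UNIQUE_ID); // 'undefined' (a string)

// To actually remove the variable:
delete process.env.NODE_UNIQUE_ID;
console.log(process.env.NODE_UNIQUE_ID); // undefined (gone)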


NodeJS cluster: is it really needed?

I decided to investigate the best way to handle a large amount of traffic with a NodeJS server, so I ran a small test on two DigitalOcean servers, each with 1GB RAM / 2 CPUs.
No-Cluster server code:
// Include Express
var express = require('express');

// Create a new Express application
var app = express();

// Add a basic route – index page
app.get('/', function (req, res) {
  res.redirect('http://www.google.co.il');
});

// Bind to a port
app.listen(3000);
console.log('Application running');
Cluster server code:
// Include the cluster module
var cluster = require('cluster');

// Code to run if we're in the master process
if (cluster.isMaster) {
  // Count the machine's CPUs
  var cpuCount = require('os').cpus().length;

  // Create a worker for each CPU
  for (var i = 0; i < cpuCount; i += 1) {
    cluster.fork();
  }

// Code to run if we're in a worker process
} else {
  // Include Express
  var express = require('express');

  // Create a new Express application
  var app = express();

  // Add a basic route – index page
  app.get('/', function (req, res) {
    res.redirect('http://www.walla.co.il');
  });

  // Bind to a port
  app.listen(3001);
  console.log('Application running #' + cluster.worker.id);
}
Then I sent stress-test requests to those servers. I expected the clustered server to handle more requests, but it didn't happen; both servers crashed under the same load, even though two Node processes were running on the clustered server and only one on the non-clustered one.
Now I wonder why? Did I do anything wrong?
Maybe something else is making the servers reach their breaking point? Both servers crashed at ~800 rps.
"Now I wonder why? Did I do anything wrong?"
Your test server doesn't do anything other than a res.redirect(). If your request handlers use essentially no CPU, then you aren't going to be CPU bound at all, and you won't benefit from involving more CPUs. Your cluster will be bottlenecked at the handling of incoming connections, which is going to be roughly the same with or without clustering.
Now, add some significant CPU usage to your request handler and you should get a different result.
For example, change to this:
// Add a basic route – index page
app.get('/', function (req, res) {
  // spin CPU for 200ms to simulate using some CPU in the request handler
  let start = Date.now();
  while (Date.now() - start < 200) {}
  res.redirect('http://www.walla.co.il');
});
Running tests is a great thing, but you have to be careful what exactly you're testing.
What @jfriend00 says is correct; you aren't actually doing enough heavy lifting to justify clustering here. However, you're also not actually sharing the load. See here:
app.listen(3001);
You can't bind two services onto the same port and have the OS magically load-balance them[1]; try adding an error handler on app.listen() and see if you get an error, e.g.
app.listen(3001, (err) => err && console.error(err));
If you want to do this, you'll have to accept everything in your master, then instruct the workers to do the task and pass the results back to the master again; a rough sketch of that pattern follows.
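Here is a rough sketch of that master/worker hand-off (illustrative names, not the poster's code) using the cluster module's built-in IPC channel:

const cluster = require('cluster');

if (cluster.isMaster) {
  const worker = cluster.fork();
  // receive results back from the worker
  worker.on('message', (msg) => {
    console.log('Result from worker:', msg.result);
  });
  // hand the worker a task over IPC
  worker.send({ input: 42 });
} else {
  process.on('message', (msg) => {
    // do the CPU-heavy work here, then report back to the master
    const result = msg.input * 2; // placeholder for real work
    process.send({ result: result });
  });
}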
It's generally easier not to do this in your Node program, though; your frontend will still be the limiting factor. An easier (and faster) way may be to put a special-purpose load balancer in front of multiple running instances of your application (e.g. HAProxy or Nginx).
[1]: That's actually a lie; sorry. You can do this by specifying SO_REUSEPORT when doing the initial bind call, but you can't explicitly specify that in Node, and Node doesn't specify it for you...so you can't in Node.

Getting data from/writing data to localhost with Express

I'm trying to create a webapp for a web art class using node (w/ npm) and express. The idea is to have the body of the site be all one color, but anyone can text the site a hexcode/CSS color at a Twilio number and the color of the site will instantly change to that color value.
Essentially how it works is that the server receives a POST request from Twilio at http://example.com/message, which contains the body of the text message. It writes it to a temporary file at ~/app/.data/color.tmp, which is accessed by the client with a jQuery .get() call to http://example.com/color, which returns the saved color value.
So here's the problem: I got a version of the app working on glitch.me, so I know that this code can work, but I'm having a lot of trouble getting it to work on my domain. I installed the app and can start it with npm, and it successfully shows me the HTML page, but the Chrome devtools show the script is receiving a 403 when it tries to access /color. Also, new texts to my site aren't changing the color value in /.data/color.tmp. I thought it might be a permissions issue but I checked them and they seem fine.
Here's the server file and the script on the index.html page:
app/server.js
var express = require('express');
var bodyParser = require('body-parser');
var fs = require('fs');

var app = express();
app.use(bodyParser.urlencoded({extended: false}));

var dataPath = '.data/color.tmp';

// set a new color (saves posted color to disk)
app.post("/message", function (request, response) {
  var dataStr = JSON.stringify(request.body.Body);
  fs.writeFile(dataPath, dataStr);
  response.end();
});

// get the saved color (reading from disk)
app.get("/color", function (request, response) {
  var dataStr = fs.readFileSync(dataPath).toString();
  response.send(JSON.parse(dataStr));
});

app.get("/", function (request, response) {
  response.sendFile(__dirname + '/views/index.html');
});

var listener = app.listen(process.env.PORT, function () {
  console.log('listening on port ' + listener.address().port);
});
app/views/index.html
<script>
  // checks server for color value and sets background
  function checkForColorChange() {
    $.get('/color', function getColorComplete(data) {
      document.body.style.backgroundColor = data;
      console.log(data);
    });
  }

  // Poll the server at 2000ms interval
  setInterval(checkForColorChange, 2000);
  checkForColorChange();
</script>
Anyway, I feel like I must be missing something really obvious if it worked so easily on Glitch and won't on my website, but I've been stuck for a few days and am not making any progress! Any help would be so appreciated. Let me know if anything's unclear too.
(See update below for a working example)
Original answer
There are a few problems with your code:
- you're not checking for errors
- you're using blocking functions
- you're implicitly relying on file permissions, but you're not checking them
- you're using string concatenation instead of path.join to join paths
- you're constantly polling for new data instead of waiting for it to change
- you're not catching exceptions from functions that can throw
- you're not waiting for async operations to finish, and you don't handle their errors
The main problem that you're experiencing right now is most likely with file permissions. The good news is that you don't need any file access for what you're doing, and using files for this is not optimal anyway. All you need is to store the color in a variable, if you don't need it to persist between server restarts - and even if you do, I would use a simple database for that.
For example:
// some initial value:
var color = '#ffffff';

app.post("/message", function (request, response) {
  color = request.body.Body;
  response.end();
});

// get the saved color (now read from the variable)
app.get("/color", function (request, response) {
  response.send(color);
});

app.get("/", function (request, response) {
  response.sendFile(__dirname + '/views/index.html');
});

var listener = app.listen(process.env.PORT, function () {
  console.log('listening on port ' + listener.address().port);
});
This is the first change that I would make - don't rely on the file system, permissions, race conditions, etc.
Another problem with your code was using blocking functions inside request handlers. You should never use blocking functions (those with "Sync" in their name) except during the first tick of the event loop, i.e. at startup.
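For illustration only, if the file-based approach were kept, a non-blocking version of the /color handler (reusing dataPath from the question) might look something like this:

app.get('/color', function (request, response) {
  // non-blocking read with explicit error handling
  fs.readFile(dataPath, 'utf8', function (err, data) {
    if (err) {
      return response.status(500).send('Could not read color file');
    }
    try {
      response.send(JSON.parse(data));
    } catch (e) {
      response.status(500).send('Stored color is not valid JSON');
    }
  });
});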
Another improvement that I would make would be using WebSocket or Socket.io instead of polling for data on regular intervals. This would be quite easy to code. See this answer for examples:
Differences between socket.io and websockets
A plus of doing that would be that all of your students would see the color change instantly and at the same time, instead of at random moments within the 2-second polling window.
Update
I wrote an example of what I was describing above.
The POST endpoint is slightly different - it uses a /color route and color=#abcdef instead of /message and Body=... - but you can easily change that if you want; see below.
Server code - server.js:
const express = require('express');
const http = require('http');
const path = require('path');
const bodyParser = require('body-parser');
const socket = require('socket.io');

const app = express();
const server = http.Server(app);
const io = socket(server);

let color = '#ffffff';

app.use(bodyParser.urlencoded({ extended: false }));
app.use('/', express.static(path.join(__dirname, 'html')));

io.on('connection', (s) => {
  console.log('Socket.io client connected');
  s.emit('color', color);
});

app.post('/color', (req, res) => {
  color = req.body.color;
  console.log('Changing color to', color);
  io.emit('color', color);
  res.send({ color });
});

server.listen(3338, () => console.log('Listening on 3338'));
HTML page - index.html:
<!doctype html>
<html lang=en>
  <head>
    <meta charset=utf-8>
    <meta name=viewport content="width=device-width, initial-scale=1">
    <title>Node Live Color</title>
    <link href="/style.css" rel=stylesheet>
  </head>
  <body>
    <h1>Node Live Color</h1>
    <script src="/socket.io/socket.io.js"></script>
    <script src="/script.js"></script>
  </body>
</html>
Style sheet - style.css:
body {
  transition: background-color 2s ease;
  background-color: #fff;
}
Client-side JavaScript - script.js:
var s = io();

s.on('color', function (color) {
  document.body.style.backgroundColor = color;
});
What is particularly interesting is how simple the client-side code is.
For your original endpoint use this in server.js:
app.post('/message', (req, res) => {
  color = req.body.Body;
  console.log('Changing color to', color);
  io.emit('color', color);
  res.end();
});
Full example is available on GitHub:
https://github.com/rsp/node-live-color
I tested it locally and on Heroku.
Enjoy.
I think the problem is in var dataStr = fs.readFileSync(dataPath).toString();. Please change your dataPath as follows:
var dataPath = __dirname + '/data/color.tmp';
And also make sure that the file has read/write permissions for the user running the Node process.

How to inject module from different app in Node.js

I have two Node apps/services running together:
1. main app
2. second app
The main app is responsible for showing all the data from the different apps at the end. For now I've put some code from the second app inside the main app and it works, but I want it decoupled. I mean that the code of the second app should not live in the main app, but somehow be injected at runtime,
as if the second service registered itself with the main app and injected its code into it.
That code is just two modules; is it possible to do this in Node.js?
const Socket = require('socket.io-client');
const client = require("./config.json");

module.exports = (serviceRegistry, wsSocket) => {
  var ws = null;

  var consumer = () => {
    var registration = serviceRegistry.get("tweets");
    console.log("Service: " + registration);

    // Check if service is online
    if (registration === null) {
      if (ws != null) {
        ws.close();
        ws = null;
        console.log("Closed websocket");
      }
      return;
    }

    var clientName = `ws://localhost:${registration.port}/`;
    if (client.hosted) {
      clientName = `ws://${client.client}/`;
    }

    // Create a websocket to communicate with the client
    if (ws == null) {
      console.log("Created");
      ws = Socket(clientName, {
        reconnect: false
      });
      ws.on('connect', () => {
        console.log("second service is connected");
      });
      ws.on('tweet', function (data) {
        wsSocket.emit('tweet', data);
      });
      ws.on('disconnect', () => {
        console.log("Disconnected from blog-twitter");
      });
      ws.on('error', (err) => {
        console.log("Error connecting socket: " + err);
      });
    }
  };

  // Check service availability
  setInterval(consumer, 20 * 1000);
};
I put this code in the main module, and I want to decouple it by injecting it somehow at runtime. An example would be very helpful...
You will have to use the vm module to achieve this. More technical info is here: https://nodejs.org/api/vm.html. Let me explain how you can use it:
You can use the vm.Script API to compile code which you want to run later. See the description from the official documentation:
Creating a new vm.Script object compiles code but does not run it. The compiled vm.Script can be run later multiple times. It is important to note that the code is not bound to any global object; rather, it is bound before each run, just for that run.
Now when you want to insert or run this code, you can use script.runInContext API.
Another good example from their official documentation:
'use strict';
const vm = require('vm');

let code =
`(function(require) {
  const http = require('http');
  http.createServer( (request, response) => {
    response.writeHead(200, {'Content-Type': 'text/plain'});
    response.end('Hello World\\n');
  }).listen(8124);
  console.log('Server running at http://127.0.0.1:8124/');
})`;

vm.runInThisContext(code)(require);
Another example of running a js file directly:
var fs = require('fs');
var vm = require('vm');

var app = fs.readFileSync(__dirname + '/' + 'app.js');
vm.runInThisContext(app);
You can use this approach for the conditional code which you want to insert.
You can create a package from one of your apps and then reference the package in the other app.
https://docs.npmjs.com/getting-started/creating-node-modules
There are several ways to decouple two applications. One easy way is the pub/sub pattern (in case you don't need a response).
(Now, if you have an application that is tightly coupled, it will be very difficult to decouple it unless you do some refactoring.)
zeromq offers a very good implementation of pub/sub and is very fast.
e.g.
// Subscriber
import zmq from "zmq";

const socket = zmq.socket('sub');
socket.connect('tcp://127.0.0.1:5545');
socket.subscribe('sendConfirmation');

socket.on('message', function (topic, message) {
  // you can get the data from message.
  // something like:
  const msg = message.toString('ascii');
  const data = JSON.parse(msg);
  // do some actions.
  // .....
});

// don't forget to close the socket.
process.on('SIGINT', () => {
  console.log("... closing the socket ....");
  socket.close();
  process.exit();
});

//-----------------------------------------

// Publisher
import zmq from "zmq";

const socket = zmq.socket('pub');
socket.bind('tcp://127.0.0.1:5545');

socket.send(['sendConfirmation', someData]);

process.on('SIGINT', function () {
  socket.close();
});
This way you could have two different containers (docker) for your modules, just be sure to open the corresponding port.
What I don't understand is why you inject wsSocket and also create a new Socket. Probably what I would do is just send the socket id, and then use it like:
const _socketId = "/#" + data.socketId;
io.sockets.connected[_socketId].send("some message");
You could also use another solution like Kafka instead of zmq; just consider that it is slower, but it will keep a log of the messages.
Hope this gives you an idea of how to solve your problem.
You can use the npm link feature.
The linking process consists of two steps:
1. Declaring a module as a global link by running npm link in the module's root folder
2. Installing the linked module in your target module (app) by running npm link <module-name> in the target folder
This works pretty well unless one of your local modules depends on another local module. In this case, linking fails because it cannot find the dependent module. In order to solve this issue, one needs to link the dependent module to the parent module and then install the parent into the app.
https://docs.npmjs.com/cli/link
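For example, with hypothetical folder and package names, the two steps look like this:

# step 1: create a global link in the module's root folder
cd ~/projects/second-app
npm link

# step 2: install the linked package by name in the consuming app
cd ~/projects/main-app
npm link second-app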

Connect to socket-io already defined in index.js using external node script

I am using hapijs in my MEAN stack and implemented socket.io (using this for reference: http://matt-harrison.com/using-hapi-js-with-socket-io/). Everything works fine, no problems there. It works great in my application!
However, there is a script I will be running separately via the command line (which will do some maintenance on the application), and I was hoping it could connect to the same web socket and push messages to clients when data needs to be refreshed.
My index.js taken straight from the example:
var Hapi = require('hapi');

var server = new Hapi.Server();
server.connection({ port: 3000 });

var io = require('socket.io')(server.listener);

io.on('connection', function (socket) {
  socket.emit('Hello');
});

server.start();
I tried to create a separate JS file and do:
var socket = require('socket.io');
var io = socket.listen(3000);
Then I passed io around to send a message. This doesn't seem right... I guess I'm wondering if this can even be done. Messing around, I've either created a separate web socket or gotten no connection to the client.
Please let me know if I need to provide more information.
Thanks.
In your provided code, you're creating 2 servers. io.listen() listens on a port as a server.
What you need to do instead, to pass messages around, is to create a socket.io client in your separate script. There's a separate module for this, socket.io-client, which you can require to act as a client:
client.js
var io = require('socket.io-client');
var socket = io('http://localhost:3000');

socket.on('beep', function () {
  console.log('beep');
  socket.emit('boop');
});
server.js
Here's a slightly updated version of your server script too (hapi v9.0.0 has a mandatory callback for server.start()):
var Hapi = require('hapi');

var server = new Hapi.Server();
server.connection({ port: 3000 });

var io = require('socket.io')(server.listener);

io.on('connection', function (socket) {
  socket.emit('beep');
  socket.on('boop', function () {
    console.log('boop');
  });
});

server.start(function () {
  console.log('Started Server!');
});
If you open up a couple of terminals and run these, you should see messages passed between them, with beep and boop logged out.
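For example, assuming the two files are saved as server.js and client.js:

node server.js   # logs "Started Server!", then 'boop' when the client answers
node client.js   # logs 'beep' on connecting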

Make HTTP request inside Web Worker

I am trying to use web-workers or threads in my node application for the first time. I am using the webworker-threads npm module.
Basically I would like each worker to make requests to a server, measure the response time and send it back to the main thread.
I tried it many different ways, but I just can't seem to get it working. The basic examples from the docs work, but when I try to require a module ("request" in my case), the workers just seem to stop, without any error messages. I saw in the docs that require doesn't work inside a worker, so I tried importScripts(), which doesn't work either. When using thread pools I tried .all.eval(), but that didn't work either.
Since this is my first time working with web workers / threads in Node, I might misunderstand how to use these things in general. Here is one example I tried:
server.js
var Worker = require('webworker-threads').Worker;
var worker = new Worker('worker.js');
worker.js
console.log("before import");
importScripts('./node_modules/request/request.js');
console.log("after import");
This basic example only prints before import and then stops.
Web workers support native JavaScript only, so you can't achieve what you want with them: these worker threads don't support the Node.js API or npm packages (like http or request.js). For concurrency you don't need any multithreading magic; just use async.js or promises. If you want to play with separate processes, then child_process is the way to go. You could also use an API to manage child processes, like https://github.com/rvagg/node-worker-farm
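For instance, a promise-based version of the same measurement (a minimal sketch built on the request package; single-threaded, no workers) could look like this:

var request = require('request');

// wrap a timed request in a promise
function timedGet(url) {
  return new Promise(function (resolve, reject) {
    var start = Date.now();
    request(url, function (err, res, body) {
      if (err) return reject(err);
      resolve({ url: url, responseTime: Date.now() - start });
    });
  });
}

var urls = ['https://www.google.com', 'http://stackoverflow.com/', 'https://github.com/'];

// fire all requests concurrently and report each response time
Promise.all(urls.map(timedGet)).then(function (results) {
  results.forEach(function (r) {
    console.log('Url ' + r.url + ' finished in ' + r.responseTime + 'ms');
  });
}).catch(function (err) {
  console.error(err);
});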
Considering your example, you could write something like this with worker-farm:
main.js
var workerFarm = require('worker-farm')
  , workers = workerFarm(require.resolve('./child'))
  , ret = 0;

var urls = ['https://www.google.com', 'http://stackoverflow.com/', 'https://github.com/'];

urls.forEach(function (url) {
  workers(url, function (err, res, body, responseTime) {
    console.log('Url ' + url + ' finished in ' + responseTime + 'ms');
    // Ugly code here, use async/promise instead
    if (++ret == urls.length)
      workerFarm.end(workers);
  });
});
child.js
var request = require('request');

module.exports = function (url, cb) {
  var start = new Date();
  request(url, function (err, res, body) {
    var responseTime = new Date() - start;
    cb(err, res, body, responseTime);
  });
};
