How does concurrency work in a Node.js + Express application?

(Solution found)
I would like to start my question with code example:
var express = require('express');
var customModule = require('./custom-module');
var app = express();
// Page A
app.get('/a', function(req, res) {
    res.type('text/plain');
    customModule.doHeavyOperation(25, function(result) {
        res.send('Page A, number = ' + result);
    });
});
// Page B
app.get('/b', function(req, res) {
    res.type('text/plain');
    res.send('Page B');
});
app.listen(8082);
For Page A, I call doHeavyOperation(), which performs some actions and, when they are done, runs my callback which sends some information to the user.
custom-module.js:
exports.doHeavyOperation = function(number, callback) {
    for (var i = 0; i < 2000000000; i++) {
        number++;
    }
    callback(number);
};
It takes doHeavyOperation() ~5 seconds to perform its logic and call my callback. When I open two pages in my browser, localhost:8082/a and localhost:8082/b, the second page has to wait until the first one has loaded. That means the requests are not handled concurrently.
Instead of doHeavyOperation there could be any other function that freezes my clients while they try to use my web application. For example, I may want to process some big images that my users upload, apply some filters to them, etc. That can take 1-3 seconds, so if 100 people use my website at the same time, it will cause big problems.
Apache + PHP, for example, deals with this situation pretty well: while script a.php performs some actions, script b.php can be loaded without problems.
How can I achieve this behaviour in Node.js + Express?
================================================================================
According to kamituel's comment, my decision was to spawn a new process to perform the "heavy" calculations and return the result to the parent process. Now the application works as I wanted.
I modified my code:
var express = require('express');
var app = express();
// Page A
app.get('/a', function(req, res) {
    // Creating child process
    var child = require('child_process').fork('custom-module.js');
    // Sending value
    child.send(25);
    // Receiving result
    child.on('message', function(result) {
        res.type('text/plain');
        res.send('Page A, number = ' + result);
    });
});
// Page B
app.get('/b', function(req, res) {
    res.type('text/plain');
    res.send('Page B');
});
app.listen(8082);
custom-module.js:
function doHeavyOperation(number, callback) {
    for (var i = 0; i < 2000000000; i++) {
        number++;
    }
    callback(number);
}
process.on('message', function(number) {
    doHeavyOperation(number, function(result) {
        process.send(result);
    });
});
More info about child_process is available in the Node.js documentation.

JavaScript (and so Node.js as well) uses one main thread for (almost) all of the JS code you write. This means that if you have a long-running task (like the huge loop in your example), it will block the execution of other code (i.e. callbacks).
What can you do about that?
Break the loop into smaller pieces and yield control occasionally (e.g. using setTimeout, setImmediate, process.nextTick or similar). For example, you can iterate from 0 to 1000, then do the next 1000 loop iterations on the next tick, and so on (see the sketch below).
Spawn a new process and handle the heavy task there. Since the new process will be outside of Node.js itself, it will not block it.
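A minimal sketch of the first option, assuming the same doHeavyOperation() shape from the question (the chunk size is an arbitrary choice for illustration):
exports.doHeavyOperation = function(number, callback) {
    var i = 0;
    var total = 2000000000;
    var chunk = 10000000; // arbitrary number of iterations per slice
    function doChunk() {
        var end = Math.min(i + chunk, total);
        for (; i < end; i++) {
            number++;
        }
        if (i < total) {
            // Yield to the event loop so other callbacks can run, then continue
            setImmediate(doChunk);
        } else {
            callback(number);
        }
    }
    doChunk();
};
Each slice still blocks for a short while, so the chunk size is a trade-off between the throughput of the heavy task and the responsiveness of other requests.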

Related

Does anyone know how to use navigator.onLine in the main process in Electron?

I know you can use navigator.onLine inside the renderer process because it's rendered inside a browser. But what I'm trying to do is something like this in the main process:
if (navigator.onLine) {
    mainWindow.loadURL("https://google.com");
} else {
    mainWindow.loadFile(path.join(__dirname, 'index.html'));
}
So basically, if the user is offline, just load a local HTML file, and if they're online, take them to a webpage. But, as expected, I keep getting the error that 'navigator is not defined'. Does anyone know how I can somehow use navigator in the main process? Thanks!
TL;DR: The easiest thing to do is to just ask Electron. You can do this via the net module from within the Main Process:
const { net } = require("electron");
const isInternetAvailable = () => net.isOnline();
// To check:
if (isInternetAvailable()) { /* do something... */ }
See Electron's documentation on the method. Note that this approach doesn't tell you whether your service is accessible via the internet, only that some network connection appears to exist (and not even that reliably, since, as the documentation mentions, a positive result only means there is a network link, not that any HTTP request will succeed).
However, this is not a reliable measurement and you might want to increase its hit rate by manually checking whether a certain connection can be made.
In order to check whether an internet connection is available, you'll have to make a connection yourself and see if it fails. This can be done from the Main Process using plain NodeJS:
// HTTP code basically from the NodeJS HTTP tutorial at
// https://nodejs.dev/learn/making-http-requests-with-nodejs/
const https = require('https');
const REMOTE_HOST = "google.com"; // Or your domain
const REMOTE_EP = "/"; // Or your endpoint
const REMOTE_PAGE = "https://" + REMOTE_HOST + REMOTE_EP;
function checkInternetAvailability() {
    return new Promise((resolve, reject) => {
        const options = {
            hostname: REMOTE_HOST,
            port: 443,
            path: REMOTE_EP,
            method: 'GET',
            // Without a timeout the 'timeout' handler below never fires;
            // 10 seconds is an arbitrary choice.
            timeout: 10000,
        };
        // Try to fetch the given page
        const req = https.request(options, res => {
            // Yup, that worked. Tell the depending code.
            resolve(true);
            req.destroy(); // This is no longer needed.
        });
        req.on('error', error => {
            reject(error);
        });
        req.on('timeout', () => {
            // No, connection timed out.
            resolve(false);
            req.destroy();
        });
        req.end();
    });
}
// ... Your window initialisation code ...
checkInternetAvailability().then(
    internetAvailable => {
        if (internetAvailable) mainWindow.loadURL(REMOTE_PAGE);
        else mainWindow.loadFile(path.join(__dirname, 'index.html'));
        // Call any code needed to be executed after this here!
    }
).catch(error => {
    console.error("Oops, couldn't initialise!", error);
    app.exit(1); // app.quit() takes no exit code; app.exit(1) quits with code 1
});
Please note that this code might not be the most desirable, since it simply "crashes" your app with exit code 1 if there is any error other than a connection timeout.
This, however, makes your startup asynchronous, which means you need to pay attention to the execution order of your app startup. Also, startup may be really slow if the timeout is reached; it may be worth looking at Node's http module documentation, for example to tune the request timeout.
Also, it makes sense to actually try to retrieve the page you want to load in the BrowserWindow (the constants REMOTE_HOST and REMOTE_EP), because that also gives you an indication of whether your server is up. It does mean the page is fetched twice in the best case (once for the connection test and once when Electron loads it into the window), but that should not be a big problem, since the test request loads no external assets (images, CSS, JS).
One last note: this is not a good metric of whether any internet connection is available; it just tells you whether your server answered within the timeout window. It might very well be that some other service works or that the connection is just very slow (i.e., expect false negatives). It should be "good enough" for your use case, though.
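If you want both signals, one possible way (a sketch, not part of the original answer) is to use net.isOnline() as a cheap pre-check and only run the slower HTTPS probe when it reports a network link, reusing the checkInternetAvailability() helper defined above:
const { net } = require('electron');
// Sketch: cheap pre-check via net.isOnline(), then confirm with a real request.
// Assumes checkInternetAvailability() from the snippet above is in scope.
async function canReachRemote() {
    if (!net.isOnline()) return false;             // no network link at all
    try {
        return await checkInternetAvailability();  // does our server answer in time?
    } catch (error) {
        return false;                              // treat request errors as "offline"
    }
}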

NodeJS cluster, is it really needed?

I decided to investigate the best possible way to handle a big amount of traffic with a NodeJS server, so I did a small test on 2 DigitalOcean servers, each with 1 GB RAM / 2 CPUs.
No-Cluster server code:
// Include Express
var express = require('express');
// Create a new Express application
var app = express();
// Add a basic route – index page
app.get('/', function (req, res) {
    res.redirect('http://www.google.co.il');
});
// Bind to a port
app.listen(3000);
console.log('Application running');
Cluster server code:
// Include the cluster module
var cluster = require('cluster');
// Code to run if we're in the master process
if (cluster.isMaster) {
    // Count the machine's CPUs
    var cpuCount = require('os').cpus().length;
    // Create a worker for each CPU
    for (var i = 0; i < cpuCount; i += 1) {
        cluster.fork();
    }
// Code to run if we're in a worker process
} else {
    // Include Express
    var express = require('express');
    // Create a new Express application
    var app = express();
    // Add a basic route – index page
    app.get('/', function (req, res) {
        res.redirect('http://www.walla.co.il');
    });
    // Bind to a port
    app.listen(3001);
    console.log('Application running #' + cluster.worker.id);
}
I sent stress-test requests to both servers. I expected the cluster server to handle more requests, but that didn't happen: both servers crashed at the same load, even though 2 Node processes were running on the cluster server and only 1 on the non-cluster one.
Now I wonder why. Did I do anything wrong?
Maybe something else is making the servers reach their breaking point? Both servers crashed at ~800 rps.
Now I wonder why? Did I do anything wrong?
Your test server doesn't do anything other than a res.redirect(). If your request handlers use essentially no CPU, then you aren't going to be CPU bound at all and you won't benefit from involving more CPUs. Your cluster will be bottlenecked at the handling of incoming connections which is going to be roughly the same with or without clustering.
Now, add some significant CPU usage to your request handler and you should get a different result.
For example, change to this:
// Add a basic route – index page
app.get('/', function (req, res) {
    // spin CPU for 200ms to simulate using some CPU in the request handler
    let start = Date.now();
    while (Date.now() - start < 200) {}
    res.redirect('http://www.walla.co.il');
});
Running tests is a great thing, but you have to be careful what exactly you're testing.
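As a rough illustration of that point, a throwaway load script like the following (hypothetical, not part of the original answers) fires a batch of concurrent requests at the handler above, so you can compare timings with and without the 200 ms of CPU work:
// Hypothetical load sketch: fire N concurrent requests and report the total time.
var http = require('http');
var total = 200;
var done = 0;
var start = Date.now();
for (var i = 0; i < total; i++) {
    // Port 3001 matches the cluster example above; adjust to the server under test.
    http.get('http://localhost:3001/', function (res) {
        res.resume(); // discard the (redirect) response body
        if (++done === total) {
            console.log(total + ' requests finished in ' + (Date.now() - start) + 'ms');
        }
    });
}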
What #jfriend00 says is correct; you aren't actually doing enough heavy lifting to justify clustering. However, you're also not actually sharing the load. See here:
app.listen(3001);
You can't bind two services onto the same port and have the OS magically load-balance them[1]; try adding an error handler on app.listen() and see if you get an error, e.g.
app.listen(3001, (err) => { if (err) console.error(err); });
If you want to do this, you'll have to accept everything in your master, then instruct the workers to do the task, then pass the results back to the master again.
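A minimal sketch of that pattern (an illustration with arbitrary port and payload, not the answer's own code): the master owns the only listening socket and hands each request to a worker over IPC.
var cluster = require('cluster');
var http = require('http');
if (cluster.isMaster) {
    var cpuCount = require('os').cpus().length;
    var workers = [];
    for (var i = 0; i < cpuCount; i++) {
        workers.push(cluster.fork());
    }
    var pending = {}; // request id -> pending response object
    var nextId = 0;
    var rr = 0;       // round-robin index
    workers.forEach(function (worker) {
        worker.on('message', function (msg) {
            var res = pending[msg.id];
            if (res) {
                res.end(msg.result);
                delete pending[msg.id];
            }
        });
    });
    // The master is the only process that listens
    http.createServer(function (req, res) {
        var id = nextId++;
        pending[id] = res;
        // Hand the work to the next worker, round-robin
        workers[rr++ % workers.length].send({ id: id });
    }).listen(3001);
} else {
    // Worker: do the heavy part and report back to the master
    process.on('message', function (msg) {
        var result = 0;
        for (var i = 0; i < 2000000000; i++) { result++; } // simulated heavy work
        process.send({ id: msg.id, result: 'done: ' + result });
    });
}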
It's generally easier not to do this in your Node program, though; your frontend will still be the limiting factor. An easier (and faster) way may be to put a special-purpose load balancer in front of multiple running instances of your application (e.g. HAProxy or Nginx).
[1]: That's actually a lie; sorry. You can do this by specifying SO_REUSEPORT when doing the initial bind call, but you can't explicitly specify that in Node, and Node doesn't specify it for you...so you can't in Node.

NodeJS cluster doesn't recognize the master worker in clustering

I'm trying to cluster my node server so I was just testing the example code below.
The below code worked the first time I tried it. I created a new js file and ran the code - worked flawlessly.
Then I deleted the 'practice' js file and moved exactly the same code into my server file to implement it.
Now it won't ever recognize the first worker as the master worker... I have no idea what might have gone wrong.
I have tried setting process.env.NODE_UNIQUE_ID to undefined, but it won't reset the master worker! Every time I run this code I get "Application running!" without "worker loop", which should show every time it loops through creating a worker. That means it is not recognising the first worker as the master worker.
Does anyone know what the problem might be?
const cluster = require('cluster');
if (cluster.isMaster) {
    var cpuCount = require('os').cpus().length;
    for (var i = 0; i < cpuCount; i++) {
        cluster.fork();
        console.log(`worker loop ${i}`);
    }
} else {
    var express = require('express');
    var app = express();
    app.get('/', function (req, res) {
        res.send('Hello World!');
    });
    app.listen(3000);
    console.log('Application running!');
}

node.js express response.write() not async in Safari

I have a very simple node.js server that I use to ping some servers I need to keep online.
Using Express I have a very simple endpoint I can access that will perform a loop of requests and report the results.
Using res.write() on each loop, the webpage I load can show me the progress as it's happening.
The problem is, this progress doesn't happen in Safari on either OS X or iOS. It waits until the process is complete and then dumps the whole output in one go.
Here's an example of my code:
router.route('/test').get(function(req, res) {
    res.write('<html><head></head><body>');
    res.write('Starting tests...<br />');
    performServerTests(req, res, function(results) {
        // Each loop within performServerTests also uses res.write()
        res.write('<br />Complete');
        res.end('</body></html>');
    });
});
Is there a known reason why Safari would wait for the res.end() call before displaying what it already has, while Chrome shows each res.write() message as it receives it?
Thanks
When using chunked transfers (which is what you're trying to do), browsers generally wait for a minimum amount of data to arrive before they start rendering. The exact size is browser-specific; see the question "Using 'transfer-encoding: chunked', how much data must be sent before browsers start rendering it?" for some fairly recent data points.
Your example could therefore be written like this (adding some headers to be explicit, and padding the first chunk so the browser's render threshold is reached early):
router.route('/test').get(function(req, res) {
    res.setHeader('Content-Type', 'text/html; charset=UTF-8');
    res.setHeader('Transfer-Encoding', 'chunked');
    res.write('<html><head></head><body>');
    res.write('Starting tests...<br />');
    // Pad the first chunk with whitespace so the browser's initial
    // render threshold is reached before the tests finish
    var buf = "";
    for (var i = 0; i < 500; i++) {
        buf += " ";
    }
    res.write(buf);
    performServerTests(req, res, function(results) {
        // Each loop within performServerTests also uses res.write()
        res.write('<br />Complete');
        res.end('</body></html>');
    });
});

Make HTTP request inside Web Worker

I am trying to use web-workers or threads in my node application for the first time. I am using the webworker-threads npm module.
Basically I would like each worker to make requests to a server, measure the response time and send it back to the main thread.
I tried it many different ways, but I just can't seem to get it working. The basic examples from the docs work, but when I try to require a module ("request" in my case), the workers just seem to stop working, without any error messages. I saw in the docs that require doesn't work inside a worker, so I tried importScripts(), which doesn't work either. When using thread pools I tried .all.eval(), but that didn't work either.
Since this is the first time working with web-workers / threads in node, I might misunderstand how to use those things in general. Here is one example I tried:
server.js
var Worker = require('webworker-threads').Worker;
var worker = new Worker('worker.js');
worker.js
console.log("before import");
importScripts('./node_modules/request/request.js');
console.log("after import");
This basic example only prints before import and then stops.
Web workers are native JavaScript only, so you can't achieve what you want with them: these worker threads don't support the Node.js API or npm packages (like http or request). For concurrency you don't need any multithreading magic; just use async.js or promises. If you want to play with separate threads of execution, then child_process is the way to go. You could also use a library to manage child processes, like https://github.com/rvagg/node-worker-farm
Considering your example you could write something like this:
main.js
var workerFarm = require('worker-farm')
  , workers = workerFarm(require.resolve('./child'))
  , ret = 0;
var urls = ['https://www.google.com', 'http://stackoverflow.com/', 'https://github.com/'];
urls.forEach(function (url) {
    workers(url, function (err, res, body, responseTime) {
        console.log('Url ' + url + ' finished in ' + responseTime + 'ms');
        // Ugly code here, use async/promise instead
        if (++ret == urls.length)
            workerFarm.end(workers);
    });
});
child.js
var request = require('request');
module.exports = function(url, cb) {
    var start = new Date();
    request(url, function(err, res, body) {
        var responseTime = new Date() - start;
        cb(err, res, body, responseTime);
    });
};
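For comparison, the "no multithreading magic needed for I/O" point above could look like this minimal sketch, using the same request library with no worker processes at all (the URLs are just the ones from the example):
// Sketch: measure response times with plain async requests, no workers.
var request = require('request');
var urls = ['https://www.google.com', 'http://stackoverflow.com/', 'https://github.com/'];
urls.forEach(function (url) {
    var start = Date.now();
    request(url, function (err, res, body) {
        if (err) return console.error('Url ' + url + ' failed: ' + err.message);
        console.log('Url ' + url + ' finished in ' + (Date.now() - start) + 'ms');
    });
});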
