How do I configure the concurrency of node.js?

Is there a way to configure the maximum capacity of Node.js? For example, say I have 5 URLs but, with limited hardware resources, I only want to process 2 at a time. Is there an option that I can set in Node.js, such that I don't need to control it in my code?
urls.txt
https://example.com/1
https://example.com/2
https://example.com/3
https://example.com/4
https://example.com/5
index.js
const readline = require('readline')
const fs = require('fs')
const rl = readline.createInterface({
  input: fs.createReadStream('urls.txt')
})

rl.on('line', (input) => {
  console.log(`Do something with: ${input}`);
})

Is there an option that I can set in Node.js, such that I don't need to control it in my code?
Nope. That's what your code is for.
Alternatively, let the kernel handle it, since you're concerned about system resources.
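For reference, a minimal sketch of what "controlling it in your code" can look like: a small worker pool that caps in-flight requests. It assumes Node 18+ for the built-in fetch; processWithLimit and the limit of 2 are illustrative, not from the question.
const fs = require('fs')

// Start `limit` workers; each pulls the next URL from a shared queue until
// it is empty, so at most `limit` requests are in flight at any time.
async function processWithLimit(urls, limit) {
  const queue = [...urls]
  const workers = Array.from({ length: limit }, async () => {
    while (queue.length > 0) {
      const url = queue.shift()
      const res = await fetch(url)
      console.log(`${url}: ${res.status}`)
    }
  })
  await Promise.all(workers)
}

const urls = fs.readFileSync('urls.txt', 'utf8').split('\n').filter(Boolean)
processWithLimit(urls, 2)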

Related

How to solve error EROFS: read-only file system, open '/var/task/db.json'?

const jsonServer = require('json-server')
const cors = require('cors')
const path = require('path')
const server = jsonServer.create()
const router = jsonServer.router(path.join(__dirname, 'db.json'))
const middlewares = jsonServer.defaults()
server.use(cors())
server.use(jsonServer.bodyParser)
server.use(middlewares)
server.use(router)
const PORT = 8000
server.listen(PORT, () => {
  console.log(`JSON Server is running on http://localhost:${PORT}`)
})
I was trying to deploy a json-server, and while doing a POST request I got this error. I am using cyclic.sh for deployment. How can I solve it?
Taking as a starting point that you shouldn't use a tool like json-server in production environments, I can understand that you might be using it for demo purposes. In that case, some serverless platforms prevent your code from opening files in write mode, which is why the jsonServer.router(...) line fails.
If you don't mind the db.json file not being updated by requests (your deployment platform doesn't seem to allow it anyway), you can pass a plain JS object instead, which gets modified in memory (the JSON file itself, of course, stays intact). So, instead of:
const router = jsonServer.router(path.join(__dirname, 'db.json'))
try using something like:
const fs = require('fs')
const db = JSON.parse(fs.readFileSync(path.join(__dirname, 'db.json')))
const router = jsonServer.router(db)

Read files from a directory with a given path in JS

Is it possible to return the contents of a static path to a directory instead of using an .
I want to write a script that reads the contents of a directory on the file system at a given time daily. This is integrated in a webapp I can't edit.
Short answer: you can't.
You need to do this server-side. Here is an answer from a similar question, using node.js.
You can use the fs.readdir or fs.readdirSync methods.
fs.readdir
const testFolder = './tests/';
const fs = require('fs');
fs.readdir(testFolder, (err, files) => {
  if (err) throw err; // don't silently ignore a failed read
  files.forEach(file => {
    console.log(file);
  });
});
fs.readdirSync
const testFolder = './tests/';
const fs = require('fs');
fs.readdirSync(testFolder).forEach(file => {
  console.log(file);
});
The difference between the two methods is that the first one is asynchronous, so you have to provide a callback function that will be executed when the read process ends.
The second is synchronous: it returns the array of file names directly, but it blocks any further execution of your code until the read process ends.
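There is also a promise-based variant; a minimal sketch using fs.promises.readdir (available in modern Node versions), which is asynchronous like the first method but avoids the callback:
const fs = require('fs').promises;

async function listFiles(folder) {
  // Awaiting the promise yields the same array of file names,
  // without blocking the rest of the program.
  const files = await fs.readdir(folder);
  for (const file of files) {
    console.log(file);
  }
}

listFiles('./tests/');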

how should I handle this nodejs code to do the right things?

this is my first question here. I need some help with my code structure. My API in Node with Express has to do the following:
receive GET /api/file/{filename}
return the file content, which could be big (a few GB).
For now, I can get the files with streams, but I don't know the best practice for handling errors in this case.
'use strict';
const fs = require('fs');
const express = require('express');
const app = express();
const path = require('path');
const filePath = path.join(__dirname, `../file`);
console.log(filePath)
app.get('/api/:filename', (req, res) => {
  let filename = req.params.filename
  const streamFile = fs.createReadStream(`${filePath}/${filename}`);
  streamFile.pipe(res);
});
module.exports = app;
Should I make another dir, maybe 'modules', write an async function there to read and pipe the files, and call that function from app.get in the routes dir?
Remember that Express is an "unopinionated, minimalist web framework for Node.js applications". Unopinionated means that it doesn't decide for you which tool to use for each specific task, and that is the main difference from frameworks like Rails. That said, you could use the classic old try/catch, in this case around your I/O operation. A module is a way to maintain separation of concerns, and a way to organize your code so you can quickly identify which part of it is causing a malfunction. In this case I don't consider it necessary, because your router's callback is doing one thing, and that is OK.
app.get('/api/:filename', (req, res) => {
  const filename = req.params.filename
  try {
    const path = `${filePath}/${filename}`;
    if (!fs.existsSync(path)) return res.status(404).send('You could send any message here...');
    const streamFile = fs.createReadStream(path);
    // Read errors are emitted asynchronously, so try/catch alone won't see them.
    streamFile.on('error', () => res.status(500).send());
    streamFile.pipe(res);
  } catch {
    res.status(500).send();
  }
});
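An alternative sketch uses stream.pipeline (available since Node 10), which reports any stream failure through a single callback and cleans up both streams. Note that once headers are sent you can no longer change the status code, so a mid-stream failure can only be logged while pipeline tears the connection down:
const { pipeline } = require('stream');

app.get('/api/:filename', (req, res) => {
  const path = `${filePath}/${req.params.filename}`;
  if (!fs.existsSync(path)) return res.status(404).send();
  // pipeline forwards errors from either stream to one callback
  // and destroys both streams on failure.
  pipeline(fs.createReadStream(path), res, (err) => {
    if (err) console.error(`streaming ${path} failed:`, err.message);
  });
});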

JavaScript can't receive child stream one line at a time

When using child_process.spawn in Node, it spawns a child process and automatically creates stdin, stdout and stderr streams to interact with the child.
const child = require('child_process');
const subProcess = child.spawn("python", ["myPythonScript.py"])
subProcess.stdout.on('data', function(data) {
  console.log('stdout: ' + data);
});
I thus implemented this in my project, but the thing is that the subprocess actually writes to the output stream only when its buffer reaches a certain size, and not whenever data is available (whatever the size of the data).
Indeed, I'd like to receive the subprocess's output as soon as it writes it to the output stream, not once it has filled the whole buffer. Any solution?
EDIT: As pointed out by t.888, it should actually be working as I expect. And it actually does if I spawn another subprocess, a C++ one this time. But I don't know why it does not work when I spawn my Python script. Actually, the Python script sends only big chunks of messages via stdout (probably when its buffer is full).
I think that you need readline instead.
const fs = require('fs');
const readline = require('readline');
async function processLineByLine() {
  const fileStream = fs.createReadStream('input.txt');

  const rl = readline.createInterface({
    input: fileStream,
    crlfDelay: Infinity
  });
  // Note: we use the crlfDelay option to recognize all instances of CR LF
  // ('\r\n') in input.txt as a single line break.

  for await (const line of rl) {
    // Each line in input.txt will be successively available here as `line`.
    console.log(`Line from file: ${line}`);
  }
}

processLineByLine();
From https://nodejs.org/api/readline.html#readline_example_read_file_stream_line_by_line
I solved my problem yesterday. It was actually due to Python itself, not the child_process function.
I had to do
const subProcess = child.spawn("python", ["-u", "myPythonScript.py"])
instead of
const subProcess = child.spawn("python", ["myPythonScript.py"])
Indeed, the -u argument tells Python to flush its output as soon as possible.
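Putting the two answers together, a minimal sketch (assuming the same myPythonScript.py) that runs Python unbuffered and receives its stdout one line at a time:
const child = require('child_process');
const readline = require('readline');

// -u disables Python's output buffering, so lines arrive as they are printed.
const subProcess = child.spawn('python', ['-u', 'myPythonScript.py']);

// readline splits the child's stdout stream into individual lines.
const rl = readline.createInterface({ input: subProcess.stdout });

rl.on('line', (line) => {
  console.log(`stdout: ${line}`);
});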

A way to call execl, execle, execlp, execv, execvP or execvp from Node.js

POSIX systems expose the family of exec functions, which allow one to load something (possibly different) into the current process while keeping open file descriptors, the process identifier, and so on.
This can be done for a variety of reasons; in my case it is bootstrapping: I want to change the command-line options of my own process and then reload it over the existing process, so there would be no child process.
Unfortunately, much to my surprise, I could not find a way to call any of the exec* functions in Node.js. So, what is the correct way to replace the currently running Node.js process with another image?
I have created a module to invoke execvp function from NodeJS: https://github.com/OrKoN/native-exec
It works like this:
var exec = require('native-exec');
exec('ls', {
  newEnvKey: newEnvValue, // environment entries passed to the new process
}, '-lsa'); // => the process is replaced with ls, which runs and exits
Since it's a native Node addon, it requires a C++ compiler to be installed. It works fine in Docker, on macOS and Linux; it probably does not work on Windows. Tested with Node 6, 7 and 8.
Here is an example using node-ffi that works with node v10. (alas, not v12)
#!/usr/bin/node
"use strict";

const ffi = require('ffi');
const ref = require('ref');
const ArrayType = require('ref-array');
const stringAry = ArrayType('string');

const readline = require("readline");
const rl = readline.createInterface({
  input: process.stdin,
  output: process.stdout
});

rl.question('Login: ', (username) => {
  username = username.replace(/[^a-z0-9_]/g, "");
  rl.close();
  execvp("/usr/bin/ssh", "-e", "none", username + '@localhost');
});

function execvp() {
  // Bind execvp and dup2 from the C library (null loads the current process's symbols).
  var current = ffi.Library(null, {
    execvp: ['int', ['string', stringAry]],
    dup2: ['int', ['int', 'int']]
  });
  // Re-attach the real stdio file descriptors before replacing the process image.
  current.dup2(process.stdin._handle.fd, 0);
  current.dup2(process.stdout._handle.fd, 1);
  current.dup2(process.stderr._handle.fd, 2);
  // argv[0] is the program name; the argument list must be NULL-terminated.
  var ret = current.execvp(arguments[0], Array.prototype.slice.call(arguments).concat([ref.NULL]));
}
I ended up using the ffi module, and exported execvp from libc.
