NodeJS WriteStream - javascript

I am learning the NodeJS WriteStream. I cannot understand the effect of the res.end() function call below. What would happen if the res.end() call did not exist in the following example? I removed it, but did not see any change, and in both cases the result was returned to the client.
const fs = require("fs");
const server = require("http").createServer();
server.on("request", (req, res) => {
  const readable = fs.createReadStream("test-file.txt");
  readable.on("data", (chunk) => {
    res.write(chunk);
  });
  // Is the following piece of code really needed?
  readable.on("end", () => {
    res.end();
  });
});
server.listen(8000, "127.0.0.1", () => {
  console.log("Listening...");
});

In some scenarios res.end() is needed to end the response, as the name says; otherwise the response would never finish.
It matters in particular when the data being read is large, when async functions need time to execute, or when output comes from shell scripts, exports, etc.
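For comparison, here is a minimal sketch of the same server using pipe(), which by default calls res.end() on the destination once the readable stream ends (pass { end: false } to opt out):
const fs = require("fs");
const server = require("http").createServer();
server.on("request", (req, res) => {
  // pipe() forwards each chunk to res and, by default,
  // calls res.end() for you when the readable emits "end"
  fs.createReadStream("test-file.txt").pipe(res);
});
server.listen(8000, "127.0.0.1", () => console.log("Listening..."));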

Node.js running a shell command from the same process

I'm trying to boot up a Minecraft server from Node.js, but I'm having trouble finding a way to run commands from Node.js.
const { spawn } = require('node:child_process')
const express = require('express');
const bodyParser = require('body-parser');
const app = express();
const fs = require('fs');
app.get('/start', (req, res) => {
  fs.writeFile('minecraftstatus.txt', 'on', (err) => {
    if (err) throw err;
  });
  const command = spawn('java', ['-jar', '-Xms2048M', '-Xmx2048M', '-Dfile.encoding=utf8', 'server.jar', 'nogui'])
  // the `data` event is fired every time data is
  // output from the command
  command.stdout.on('data', output => {
    // the output data is captured and printed in the callback
    fs.appendFile('console.txt', ("\n" + output.toString()), 'utf-8', err => {
      console.log(err)
    })
    console.log("Output: ", output.toString())
  })
  res.status(200).send("OK")
});
app.listen(80, () => {
  console.log('Server started on port 80');
});
As you can see above, whenever a user sends a GET request, it spawns the command and appends any output to a text file. I need a way to send commands to Minecraft. I need to send commands to the same shell that Node.js ran the command in.
I've tried this:
app.get('/mcstop', (req, res) => {
  try {
    const command2 = spawn('/stop')
    // the `data` event is fired every time data is
    // output from the command
    command2.stdout.on('data', output => {
      // the output data is captured and printed in the callback
      console.log("Output: ", output.toString())
    })
    res.status(200).send("OK")
  }
  catch {
    console.log("Oh no...")
  }
});
This sends /stop to a shell, but it seems like it isn't being run in the same shell the Minecraft server was created from.
How could I achieve this?
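One possible approach, sketched here under the assumption that the Express app and the spawn import from the snippet above are available, is to keep a reference to the spawned Java process and write commands to its stdin, since the Minecraft server reads console commands from standard input:
// Sketch only: keep the spawned server process in a variable both routes can see
let mcProcess = null;
app.get('/start', (req, res) => {
  mcProcess = spawn('java', ['-jar', '-Xms2048M', '-Xmx2048M',
    '-Dfile.encoding=utf8', 'server.jar', 'nogui']);
  res.status(200).send("OK");
});
app.get('/mcstop', (req, res) => {
  if (mcProcess) {
    // the Minecraft server reads console commands from its stdin,
    // so write the command there instead of spawning a new process
    mcProcess.stdin.write('stop\n');
  }
  res.status(200).send("OK");
});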

Scrape multiple domains with axios, cheerio and handlebars on node js

I am trying to make a web scraper that outputs certain data from Node.js into the JavaScript or HTML file I'm working on. It's important that the data of multiple sub-pages (that I have no code access to) can be scraped and displayed in the same HTML or JS file. The problem is that I can't output the results I get from the axios function into the global scope. If I could, my problem would be solved.
So far I have been using axios to get the data I need and cheerio to modify it. I created a const named "articles" where I push every title I need from the website I'm scraping.
const axios = require('axios')
const cheerio = require('cheerio')
const express = require('express')
const hbs = require('hbs')
const url = 'https://www.google.com/'
const articles = []
axios(url)
  .then(response => {
    const html = response.data
    const $ = cheerio.load(html)
    $('.sprite', html).parent().children('a').each(function() {
      const text = $(this).attr('title')
      articles.push({
        text
      })
    })
    console.log(articles)
    const finalArray = articles.map(a => a.text);
    console.log(finalArray)
  }).catch(err => console.log(err))
That works well so far. If I output finalArray I get the array I want. But once I'm outside of the axios function the array is empty. The only way it worked for me is when I put the following code inside the axios function, but in that case I won't be able to scrape multiple websites.
console.log(finalArray) // outputs empty array
// with this function I want to get the array displayed in my home.hbs file.
app.get('/', function(req, res){
  res.render('views/home', {
    array: finalArray
  })
})
Basically, all I need is to get finalArray into the global scope so I can use it in the app.get function to render the website with the scraped data.
There are two cases here. Either you want to re-run your scraping code on each request, or you want to run the scraping code once when the app starts and re-use the cached result.
New scrape per request:
const axios = require("axios");
const cheerio = require("cheerio");
const express = require("express");

const scrape = () =>
  axios
    .get("https://www.example.com")
    .then(({data}) => cheerio.load(data)("h1").text());

express()
  .get("/", (req, res) => {
    scrape().then(text => res.json({text}));
  })
  .listen(3000);
Up-front, one-off request:
const scrapingResultP = axios
  .get("https://www.example.com")
  .then(({data}) => cheerio.load(data)("h1").text());

express()
  .get("/", (req, res) => {
    scrapingResultP.then(text => res.json({text}));
  })
  .listen(3000);
Result:
$ curl localhost:3000
{"text":"Example Domain"}
It's also possible to do a one-off request without a callback or promise; it relies on a race condition to populate a variable that is in scope of both the request handlers and the scraping response handler. Realistically, the scrape should have resolved by the time the server receives its first request, so it's common to see this:
let result;

axios
  .get("https://www.example.com")
  .then(({data}) => (result = cheerio.load(data)("h1").text()));

express()
  .get("/", (req, res) => {
    res.json({text: result});
  })
  .listen(3000);
Eliminating the race by chaining your Express routes and listener from the axios response handler:
axios.get("https://www.example.com").then(({data}) => {
  const text = cheerio.load(data)("h1").text();
  express()
    .get("/", (req, res) => {
      res.json({text});
    })
    .listen(3000);
});
If you have multiple requests you need to complete before you start the server, try Promise.all. Top-level await or an async IIFE can work too.
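For example, a rough sketch of the Promise.all approach, reusing the requires from the first snippet (the URLs here are placeholders):
// Sketch: fetch several pages up front, then start the server
const urls = ["https://www.example.com", "https://www.example.org"];

Promise.all(urls.map(url => axios.get(url)))
  .then(responses => {
    const titles = responses.map(({data}) => cheerio.load(data)("h1").text());

    express()
      .get("/", (req, res) => res.json({titles}))
      .listen(3000);
  })
  .catch(err => console.error(err));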
Error handling has been left as an exercise.
The problem has been resolved. I used this code instead of the plain axios.get(url) call:
axios.all(urls.map((endpoint) => axios.get(endpoint))).then(
axios.spread(({data:user}, {data:repos}) => {
with "user", and "repos" I am now able to enter both URL data and can execute code regarding the URL i like to chose in that one function.

TypeError: server.listen is not a function in NodeJS

When I create a new folder I can see the server.listen method, but I can't see this method in the other folder I use for my Node.js code. I use Visual Studio Code and I can't understand why.
const http = require('http');
const server = http.createServer = ((req, res) => {
  console.log(req);
});
server.listen(3000);
http.createServer is a function which takes the handler function as an argument, so you call it like this:
const server = http.createServer((req, res) => {
  console.log(req);
});
In the code you posted, http.createServer = ((req, res) => {... attempts to assign the arrow function to http.createServer instead of calling it, so server ends up being that plain function, which has no listen method.
The http module is included with Node. Try installing @types/node; it works for me.
const http = require('http')
const server = http.createServer(() => {
  console.log('got')
})
server.listen(3000)
Also, req and res should have type annotations other than any.

Merge Two codes

I have two files in Node.js. I want to merge these two, but I am facing a problem.
This file calls a function from a Python file:
const app = express()
let runPy = new Promise(function(success, nosuccess) {
  const { spawn } = require('child_process');
  const pyprog = spawn('python', ['./ml.py']);
  pyprog.stdout.on('data', function(data) {
    success(data);
  });
  pyprog.stderr.on('data', (data) => {
    nosuccess(data);
  });
});
app.get('/', (req, res) => {
  res.write('welcome\n');
  runPy.then(function(testMLFunction) {
    console.log(testMLFunction.toString());
    res.end(testMLFunction);
  });
})
app.listen(4000, () => console.log('Application listening on port 4000!'))
Python file ml.py:
def testMLFunction():
    return "hello from Python"

print(testMLFunction())
The file below works on a button click with the POST method:
var http = require('http');
var fs = require('fs');
var server = http.createServer(function (req, res) {
  if (req.method === "GET") {
    res.writeHead(200, { "Content-Type": "text/html" });
    fs.createReadStream("./form.html", "UTF-8").pipe(res);
  } else if (req.method === "POST") {
    var result = "";
    req.on("data", function (chunk) {
      console.log(chunk.toString());
      result = chunk;
      //body=body.toUpperCase;
    });
    req.on("end", function () {
      res.writeHead(200, { "Content-Type": "text/html" });
      res.end(result);
    });
  }
}).listen(3000);
How can I do that?
There are several things wrong here. I will explain as plainly as possible.
You forgot to add var express = require('express') to your code.
The promise you made, runPy, must be wrapped in a function; your approach starts the promise immediately, as soon as the script is loaded.
You are resolving/rejecting on the first incoming output. You shouldn't do that, because you won't be able to know what really happened in the shell. You need to store those output lines; this is the only way of knowing what the script tells you.
In runPy you must resolve/reject upon pyprog's close event.
You cannot directly access a method of another script, no matter what kind of file it is (py, sh, bat, js). However, you can access its internal functions by passing arguments to the shell, and of course that script must have the logic required to deal with those arguments.
When using spawn/exec you must keep in mind that YOU ARE NOT the user executing the script, the node user is, so different outcomes may occur.
Most importantly, your targeted script must PRINT/ECHO to the shell, not return! The best approach is to print a JSON string and parse it in JavaScript after the shell has closed, so you end up with an object instead of a string (see the sketch right after this list).
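To illustrate that last point, a small sketch (not part of the demo below): if ml.py printed JSON, e.g. print(json.dumps({"msg": "hello"})), the close handler could resolve with a parsed object instead of raw buffers:
pyprog.on('close', () => {
  const raw = Buffer.concat(storeLines).toString();
  success(JSON.parse(raw)); // resolve with an object instead of a string
});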
Below you will find a demo for your use case; I changed the Python file so it prints something.
ml.py
print('I\'m the output from ml.py')
index.js
const express = require('express');
const app = express()

let runPy = function () { // the promise is now wrapped in a function so it won't trigger on script load
  return new Promise(function (success, nosuccess) {
    const {spawn} = require('child_process');
    const pyprog = spawn('python', ['./ml.py'], {shell: true}); // add shell:true so node will spawn it with your system shell.
    let storeLines = []; // store the printed rows from the script
    let storeErrors = []; // store errors that occurred

    pyprog.stdout.on('data', function (data) {
      storeLines.push(data);
    });

    pyprog.stderr.on('data', (data) => {
      storeErrors.push(data);
    });

    pyprog.on('close', () => {
      // if we have errors, reject the promise so we can catch it later
      if (storeErrors.length) {
        nosuccess(new Error(Buffer.concat(storeErrors).toString()));
      } else {
        success(storeLines);
      }
    })
  })
};

let path = require('path');

app.use(express.json());
app.use(express.urlencoded({ extended: true })); // you need to set this so you can catch POST requests

app.all('/', (req, res) => { // changed from .get to .all so you can catch both GET and POST requests here
  console.log('post params', req.body);
  if (req.body.hasOwnProperty('btn-send')) {
    runPy()
      .then(function (pyOutputBuffer) {
        let message = 'You sent these params:\n' + JSON.stringify(req.body, null, 2) + '\n';
        message += Buffer.concat(pyOutputBuffer).toString();
        res.end(message);
      })
      .catch(console.log)
  } else {
    res.sendFile(path.join(__dirname, 'form.html')); // you need an absolute path to 'form.html'
  }
});

app.listen(4000, () => console.log('Application listening on port 4000!'));
form.html
<div>hello there</div>
<form action="/" method="post">
  <input type="text" value="" name="some-text"/>
  <button type="submit" value="1" name="btn-send">Press me!</button>
</form>

How do I write the result from res.json to a proper json file using node.js? [duplicate]

This is the code snippet. The query returns JSON, but how do I write these values to a JSON file?
app.get('/users', function(req, res) {
  User.find({}, function(err, docs) {
    res.json(docs);
    console.error(err);
  })
});
If you're going to be writing to a file within a route callback handler, you should use the asynchronous writeFile() or fs.createWriteStream() functions, which are part of the fs module in the Node.js core API. Otherwise, your server will be unresponsive to any subsequent requests, because the Node.js thread will block while it writes to the file system.
Here is an example usage of writeFile within your route callback handler. This code will overwrite the ./docs.json file every time the route is called.
const fs = require('fs')
const filepath = './docs.json'

app.get('/users', (req, res) => {
  Users.find({}, (err, docs) => {
    if (err)
      return res.sendStatus(500)

    fs.writeFile(filepath, JSON.stringify(docs, null, 2), err => {
      if (err)
        return res.sendStatus(500)

      return res.json(docs)
    })
  })
})
Here is an example of writing your JSON to a file with streams. A readable stream is created from the stringified docs object (note that fs.createReadStream() expects a file path rather than data, so stream.Readable.from() is used here instead). That Readable is then piped into a Writable stream created with fs.createWriteStream() for the target file path.
const fs = require('fs')
const { Readable } = require('stream')
const filepath = './docs.json'

app.get('/users', (req, res) => {
  Users.find({}, (err, docs) => {
    if (err)
      return res.sendStatus(500)

    // build the readable stream from the stringified docs
    let reader = Readable.from([JSON.stringify(docs, null, 2)])
    let writer = fs.createWriteStream(filepath)

    reader.on('error', err => {
      // an error occurred while reading
      writer.end() // explicitly close writer
      return res.sendStatus(500)
    })

    writer.on('error', err => {
      // an error occurred while writing
      return res.sendStatus(500)
    })

    writer.on('close', () => {
      // writer is done writing the file contents, respond to requester
      return res.json(docs)
    })

    // pipe the data from reader to writer
    reader.pipe(writer)
  })
})
Use Node's file system module, fs.
const fs = require('fs');
const jsonData = { "Hello": "World" };
fs.writeFileSync('output.json', JSON.stringify(jsonData));
Docs: fs.writeFileSync(file, data[, options])
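As a non-blocking variant of the same idea, here is a sketch using fs.promises.writeFile inside an async route handler; it assumes User.find() returns a promise when called without a callback, as Mongoose queries do:
const fs = require('fs').promises;

app.get('/users', async (req, res) => {
  try {
    // assumes User.find({}) is awaitable (Mongoose query without a callback)
    const docs = await User.find({});
    await fs.writeFile('./docs.json', JSON.stringify(docs, null, 2));
    res.json(docs);
  } catch (err) {
    res.sendStatus(500);
  }
});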
