Node Winston log file forced string conversion - javascript

In a Node project, I want to show the contents of a Winston log file in a React interface. Reading the file:
let content;
fs.readFile("path", "utf-8", function read(err, data) {
    if (err) throw err;
    content = data;
});
I send them to the interface:
router.get("/", function (req, res) {
    res.status(200).send(JSON.stringify(content));
});
And I get the content in a .jsx file:
getLogs().then(res => {
    let datafromfile = JSON.parse(res);
    // Use the data
    return;
}).catch(err => {
    return err.response;
});
The issue I am having is that fs converts all the data into a single string (since I am using the utf-8 encoding and do not want to be returned a Buffer), and therefore I cannot manipulate the objects in the log file to display them structurally in the interface. Can anyone guide me on how to approach this problem?

I have not debugged this, but a lot of it depends on whether or not the Winston file you're loading actually has JSON in it.
If it does, then JSONStream is your friend, and learning through or through2 (Node streams) will help you.
The following is code/pseudocode:
router.get("/", function (req, res) {
    const logPath = 'somePath'; // maybe it comes from req.query.path
    const parsePath = null; // or the token of where you want to attempt to start parsing
    fs.createReadStream(logPath)
        .pipe(JSONStream.parse(parsePath))
        .pipe(res);
});
JSONStream
fs.createReadStream and node docs
through2
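If your Winston file transport writes newline-delimited JSON (one JSON object per line, which is what Winston's json format produces), another approach is to read the file line by line, parse each entry, and respond with real objects instead of one big string. A minimal sketch, assuming an Express router and a placeholder log path as in the question:
const fs = require("fs");
const readline = require("readline");

router.get("/", function (req, res) {
    const entries = [];
    const rl = readline.createInterface({
        input: fs.createReadStream("path/to/winston.log", { encoding: "utf-8" }),
        crlfDelay: Infinity
    });
    rl.on("line", (line) => {
        if (!line.trim()) return; // skip blank lines
        try {
            entries.push(JSON.parse(line)); // each line is one log entry
        } catch (e) {
            entries.push({ raw: line }); // keep unparseable lines visible
        }
    });
    rl.on("close", () => res.status(200).json(entries));
});
On the client, response.json() (or a single JSON.parse) then yields an array of objects that the React interface can render structurally.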

Related

How to send data from NodeJS server side to the JS client side, only when data is ready?

On my website, when the user clicks a button, some of the user's data is stored in a database, and after that I want the server to send notification data to the JavaScript frontend file to change the UI.
Right now, the JS file (index.js) receives data right after the website loads (always false). I want it to be received only when the data is ready on the server.
I searched a lot but couldn't find an answer to my problem.
I appreciate any help :)
server.js
var requestValidation = false;

app.post("/", function(req, res){
    var name = req.body.personName;
    var email = req.body.personEmail;
    var collabTopic = req.body.collabTopic;
    const newUser = new User({ // mongoDB schema
        name: name,
        email: email,
        collabTopic: collabTopic
    });
    newUser.save(function(err){ // adding data to mongoDB
        if(!err){
            requestValidation = true;
        }
    });
});

app.get("/succ", function(req, res){
    res.json(requestValidation);
});
index.js
const url = "http://localhost:3000/succ";
const getData = async (url) => {
    try {
        const response = await fetch(url);
        const json = await response.json();
        console.log(json);
    } catch (error) {
        console.log(error);
    }
};
getData(url);
I'm not sure this is completely the answer you're looking for, but it's definitely a tool/feature to consider as you rework your approach.
app.post("/", async (req, res) => {
    let result = await INSERT MONGODB UPDATE OR INSERT FUNCTION;
    res.render("YOUR TEMPLATE", result);
});
You probably can't plug and play this, but when a MongoDB operation finishes, it returns a JSON object with some details on whether or not it succeeded. For example, a MongoDB insert operation returns something like this (stored in the variable result that I created):
{ "acknowledged" : true, "insertedId" : ObjectId("5fd989674e6b9ceb8665c57d") }
and then you can pass this value on as you wish.
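As a concrete sketch of that idea applied to the question's own code (the User model and field names come from the question; the response shape is an assumption), you could wait for the save to finish and only then answer the client:
app.post("/", async (req, res) => {
    const newUser = new User({
        name: req.body.personName,
        email: req.body.personEmail,
        collabTopic: req.body.collabTopic
    });
    try {
        await newUser.save(); // resolves once MongoDB has stored the document
        res.json({ success: true }); // sent only when the data is ready
    } catch (err) {
        res.status(500).json({ success: false });
    }
});
This also removes the shared requestValidation flag, which would otherwise leak state between requests from different users.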
Edit: This is what tkausl referred to in a comment.
Here is an example if you want to pass the content of a txt file to the client with Express and jQuery:
In Express:
app.get('/get', (req, res) => {
    fs.readFile('test.txt', (err, data) => {
        if (err) throw err;
        return res.json(JSON.parse(data));
    });
});
jQuery on the client side:
$.getJSON("http://localhost:3000/get", function(data) {
    geojsondata1 = JSON.stringify(data);
});
Now you can do anything you want with the variable data.

Why downloading a file from node.js server multiple times results in empty files

I would be glad to get some help.
Here is my problem:
I have built a web server with Node.js that should send a CSV file to the client when requested through a certain route. The CSV file is created from JSON using the fast-csv package. The JSON data comes from a MongoDB database and is processed with Mongoose.
When I request this route once, it works fine. However, when it is requested a second time, an empty file is sent to the client. By the way, the headers reach the client correctly.
I have to restart the server to download my file again.
What did I try:
Basically, I have now lost track of everything I have tried. This behavior occurs both when using postman and when querying via the browser.
I've tried implementing promises in my handler function.
I've tried to somehow unsubscribe res (but yes, that was a stupid approach).
I've tried to write the file into the fs and to send it on a second request. ...
Maybe one of you can tell what's going wrong here at first glance:
const { format } = require("@fast-csv/format");
const csvStream = format({ delimiter: ";", headers: true });
const router = express.Router();

router.route("/:collection/csv").get(requireModel, createCsv);

const csvFromDatabase = (data, res) => {
    csvStream.pipe(res);
    const processData = (data) => {
        data.forEach((row) => {
            const { _id, __v, ...newRow } = row._doc;
            csvStream.write({ ...newRow });
        });
        csvStream.end();
    };
    processData(data);
};

const createCsv = async (req, res) => {
    const { model } = req;
    const items = await model.find({});
    res.setHeader("Content-disposition", "attachment; filename=file.csv");
    res.setHeader("Content-type", "text/html; charset=UTF-8");
    csvFromDatabase(items, res);
};
Thank you very much for your help. I hope I didn't bore you with too many stupid questions.
You need to recreate csvStream for each new request:
const csvFromDatabase = (data, res) => {
    const csvStream = format({ delimiter: ";", headers: true });
    csvStream.pipe(res);
    …
};
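Putting that together, a minimal corrected handler could look like the sketch below (same routes and fields as the question; untested against your models):
const { format } = require("@fast-csv/format");

const csvFromDatabase = (data, res) => {
    // a stream can only be piped and ended once, so build it per request
    const csvStream = format({ delimiter: ";", headers: true });
    csvStream.pipe(res);
    data.forEach((row) => {
        const { _id, __v, ...newRow } = row._doc;
        csvStream.write({ ...newRow });
    });
    csvStream.end();
};
The module-level csvStream in the original code was ended after the first request, which is why every later download arrived empty.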

How to send piped response in Next.js API using wkhtmltoimage?

I'm new to Next.js, and I'm trying to use wkhtmltoimage but I can't seem to send the generated image stream as a response in my Next.js API.
const fs = require('fs')
const wkhtmltoimage = require('wkhtmltoimage').setCommand(__dirname + '/bin/wkhtmltoimage');

export default async function handler(req, res) {
    try {
        await wkhtmltoimage.generate('<h1>Hello world</h1>').pipe(res);
        res.status(200).send(res)
    } catch (err) {
        res.status(500).send({ error: 'failed to fetch data' })
    }
}
I know I'm doing plenty of stuff wrong here; can anyone point me in the right direction?
Since you're concatenating __dirname and /bin/wkhtmltoimage together, that would mean you've installed the wkhtmltoimage executable to ./pages/api/bin which is probably not a good idea since the pages directory is special to Next.js.
We'll assume you've installed the executable in a different location on your filesystem/server instead (e.g., your home directory). It looks like the pipe function already sends the response, so the res.status(200).send(res) line will cause problems and can be removed. So the following should work:
// ./pages/api/hello.js
const homedir = require("os").homedir();

// Assumes the following installation path:
// - *nix: $HOME/bin/wkhtmltoimage
// - Windows: $env:USERPROFILE\bin\wkhtmltoimage.exe
const wkhtmltoimage = require("wkhtmltoimage").setCommand(
    homedir + "/bin/wkhtmltoimage"
);

export default async function handler(req, res) {
    try {
        res.status(200);
        await wkhtmltoimage.generate("<h1>Hello world</h1>").pipe(res);
    } catch (err) {
        res.status(500).send({ error: "failed to fetch data" });
    }
}
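One caveat: pipe() hands the stream to the response, but errors emitted by the stream are not turned into promise rejections, so the try/catch above won't catch them. A hedged variant (assuming generate() returns a readable stream, as the pipe usage suggests) attaches an error handler instead:
export default function handler(req, res) {
    const stream = wkhtmltoimage.generate("<h1>Hello world</h1>");
    stream.on("error", () => {
        // only send an error response if nothing has been written yet
        if (!res.headersSent) res.status(500).send({ error: "failed to generate image" });
    });
    res.status(200);
    stream.pipe(res);
}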

passing audio from mongodb to audio tag

For my project I'm trying to create an audio player. The database aspect of storing files is new to me as I've only stored strings before.
So far, what I've been able to do is:
Store the audio file in the database. (I'm linking to a file here for simplicity, but in the future it will be uploaded.)
Retrieve the audio file as an object.
Store the audio file in the public folder for use.
Server-side code (the route code is separate from the server code):
let fs = require('fs');
var bodyParser = require('body-parser')
var urlencodedParser = bodyParser.urlencoded({
    extended: false
})
const MongoClient = require('mongodb').MongoClient;
const Binary = require('mongodb').Binary;
const ObjectId = require('mongodb').ObjectId;

module.exports = function(app) {
    app.get('/music', function(req, res) {
        // STEP ONE
        var data = fs.readFileSync(__dirname + '/../public/recordings/Piso 21 - Puntos Suspensivos.mp3');
        var insert_data = {};
        insert_data.name = 'Piso 21 - Puntos Suspensivos.mp3';
        insert_data.file_data = Binary(data);
        MongoClient.connect("mongodb://localhost/songs", {
            useNewUrlParser: true
        }, function(err, db) {
            if (err) throw err;
            var dbo = db.db("songs");
            dbo.collection("song").insertOne(insert_data, function(err, res) {
                if (err) throw err;
                console.log("1 document inserted");
                db.close();
            });
        });
        // STEP TWO
        MongoClient.connect("mongodb://localhost/songs", {
            useNewUrlParser: true
        }, function(err, db) {
            if (err) throw err;
            var dbo = db.db("songs");
            dbo.collection("song").findOne({
                name: 'Piso 21 - Puntos Suspensivos.mp3'
            }, function(err, result) {
                if (err) throw err;
                db.close();
                // STEP THREE
                fs.writeFile(result.name, result.file_data.buffer, function(err) {
                    if (err) throw err;
                    console.log(result);
                });
            });
        });
        res.render('audio');
    });
};
The third step is what I don't want to do. I'd like to send the result object to the audio.ejs page and somehow give the audio tag access to it, without having to save it in the public folder and then having to delete it after use.
Something like this:
// STEP THREE
res.render('audio', result);
and then somehow give an audio tag access to it in the audio.ejs page.
UPDATE
let fs = require('fs');
var bodyParser = require('body-parser')
var urlencodedParser = bodyParser.urlencoded({ extended: false })
const MongoClient = require('mongodb');
const Binary = require('mongodb').Binary;
const ObjectId = require('mongodb').ObjectId;
const Grid = require('gridfs-stream');
const db = new MongoClient.Db('songs', new MongoClient.Server("localhost", 27017));
const gfs = Grid(db, MongoClient);
const bcrypt = require('bcryptjs');

module.exports = function(app){
    app.get('/audio/:filename', function (req, res) {
        MongoClient.connect("mongodb://localhost/songs", { useNewUrlParser: true }, function(err, db) {
            if (err) throw err;
            var dbo = db.db("songs");
            dbo.collection("song").findOne({name: req.params.filename}, function(err, result){
                if (err) throw err;
                db.close();
                const readstream = gfs.createReadStream(result.file_data);
                readstream.on('error', function (error) {
                    res.sendStatus(500);
                });
                console.log(res);
                res.type('audio/mpeg');
                readstream.pipe(res);
            });
        });
    });
};
In old-timey database lingo media objects are called BLOBS -- binary large objects. In Mongo they're handled with a subsystem known as gridfs. There's a nice npm module called gridfs-stream to make this easier.
An easy way to deliver media objects to browsers is to make them available behind URLs that look like https://example.com/audio/objectname.mp3. And they should be delivered with the appropriate Content-Type header for the codec in use (audio/mpeg for MP3). Then the audio tag's src attribute can simply name the URL and you're rockin' and rollin'. The audio tag in the browser page looks something like this:
<audio controls src="/audio/objectname.mp3" ></audio>
So, if you want to deliver audio directly via express, you need a route with a parameter, something like
app.get('/audio/:filename', ...
Then the node program uses something like this (not debugged!):
const mongo = require('mongodb');
const Grid = require('gridfs-stream');
...
const db = new mongo.Db('yourDatabaseName', new mongo.Server("host", 27017));
const gfs = Grid(db, mongo);
...
app.get('/audio/:filename', function (req, res) {
    const readstream = gfs.createReadStream({ filename: req.params.filename });
    readstream.on('error', function (error) {
        res.sendStatus(500);
    });
    res.type('audio/mpeg');
    readstream.pipe(res);
});
This is cool because streams are cool: your node program doesn't need to slurp the whole audio file into RAM. Audio files can be large.
gridfs offers the mongofiles command line utility for loading files into gridfs.
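For example, loading the question's file into GridFS from the shell might look like this (the database name songs is taken from the question):
mongofiles -d songs put "Piso 21 - Puntos Suspensivos.mp3"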
But, all that being said: Most scalable media services use static media files delivered from file systems and/or content delivery networks. Servers like apache and nginx have many programmer years invested in making file delivery fast and efficient. The database holds the pathnames for the files in the CDN.
How to troubleshoot this kind of thing?
Watch the browser's console log.
Hit the media URL directly from a browser. See what you get. If it's empty, something's wrong with your retrieval code.
In dev tools in the browser, look at the Network tab (in Google Chrome). Look for the media object, and examine what's going on.
I think what you're looking for is a stream, so you can stream data from the server to the webpage directly without saving it. Node.js comes with this functionality; see the documentation here: https://nodejs.org/api/stream.html
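For instance, if you keep the Binary field from the question rather than moving to GridFS, a minimal sketch inside the findOne callback could stream the stored bytes straight to the response, never touching the public folder (result.file_data comes from the question's code; Readable.from needs Node 12.3+):
const { Readable } = require('stream');

// inside the findOne callback, replacing the fs.writeFile step:
res.type('audio/mpeg');
// wrap the Buffer in an array so it is emitted as a single chunk
Readable.from([result.file_data.buffer]).pipe(res);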

How do I write the result from res.json to a proper json file using node.js? [duplicate]

This question already has answers here:
Writing to files in Node.js
(18 answers)
Closed 5 years ago.
This is the code snippet. The query returns JSON, but how do I write these values to a proper JSON file?
app.get('/users', function(req, res) {
    User.find({}, function(err, docs) {
        res.json(docs);
        console.error(err);
    });
});
If you're going to be writing to a file within a route callback handler, you should use the asynchronous writeFile() function or the fs.createWriteStream() function, both part of the fs module in the Node.js core API. If not, your server will be unresponsive to subsequent requests, because the Node.js thread will block while it is writing to the file system.
Here is an example usage of writeFile within your route callback handler. This code will overwrite the ./docs.json file every time the route is called.
const fs = require('fs')
const filepath = './docs.json'

app.get('/users', (req, res) => {
    Users.find({}, (err, docs) => {
        if (err)
            return res.sendStatus(500)
        fs.writeFile(filepath, JSON.stringify(docs, null, 2), err => {
            if (err)
                return res.sendStatus(500)
            return res.json(docs)
        })
    })
})
Here is an example of writing your JSON to a file with streams. Note that fs.createReadStream() expects a file path, not data, so stream.Readable.from() (Node 12.3+) is used instead to create a Readable stream from the stringified docs object. That Readable is then piped into a Writable stream created with fs.createWriteStream() for the filepath.
const fs = require('fs')
const { Readable } = require('stream')
const filepath = './docs.json'

app.get('/users', (req, res) => {
    Users.find({}, (err, docs) => {
        if (err)
            return res.sendStatus(500)
        // wrap the string in an array so it is emitted as a single chunk
        let reader = Readable.from([JSON.stringify(docs, null, 2)])
        let writer = fs.createWriteStream(filepath)
        reader.on('error', err => {
            // an error occurred while reading
            writer.end() // explicitly close writer
            return res.sendStatus(500)
        })
        writer.on('error', err => {
            // an error occurred while writing
            return res.sendStatus(500)
        })
        writer.on('close', () => {
            // writer is done writing the file contents, respond to requester
            return res.json(docs)
        })
        // pipe the data from reader to writer
        reader.pipe(writer)
    })
})
Use node's file system library 'fs'.
const fs = require('fs');
const jsonData = { "Hello": "World" };
fs.writeFileSync('output.json', JSON.stringify(jsonData));
Docs: fs.writeFileSync(file, data[, options])
