Piping console.log() contents to a .txt file in Node.js - javascript

I have a Node.js file that outputs a bunch of test results, easily >1000 lines, to Terminal. The file looks something like this:
launchChromeAndRunLighthouse('https://www.apple.com/', flags).then(results => { console.log(results); });
The console.log() is only there because I couldn't figure out another way to view the results. I need a way to create a file through Node.js, not the command line, that contains all of the CLI output/results.
I thought that fs.appendFile('example.txt', 'append this text to the file', (err) => {}); could be of use, but what I need to "append" to the file is the function's results. When I try that, the file only contains [object Object] instead of the actual results of the tests.
I'm a beginner to Node, any advice is highly appreciated.

You are close, but you need to include the appendFile call inside your other function. This assumes that your results object is a string; if not, you need to get the string version of the object.
The Lighthouse docs specify the format of the log information that is returned. If you add output: json to the flags object, you can use it like this:
const fs = require('fs');

launchChromeAndRunLighthouse('https://www.apple.com/', flags).then(results => {
  fs.appendFile('example.txt', JSON.stringify(results), (err) => {
    if (err) console.log('error appending to file example.txt');
  });
});
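For reference, a minimal sketch of the flags object the answer mentions; output: 'json' is the Lighthouse option that asks for JSON-formatted results, and the variable name matches the asker's code:
const flags = { output: 'json' }; // passed as the second argument to launchChromeAndRunLighthouse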

Related

Node.js/jade -- How can I pass mysql data as a local variable into inline javascript?

My Node script has this in it:
var connection = mysql.createConnection(...);
connection.connect();
connection.query(/*sql query*/, function(err, rows, fields){
  app.get('/', function(req, res){
    res.render('index', { data: JSON.stringify(rows) });
  });
});
Then if I do this in my Jade template:
body
  p !{data}
It displays the data from the MySql query exactly as you'd expect. But if instead I do:
body
  script(type='text/javascript').
    console.log(!{data});
It gives me [Object, Object, Object, Object....
Why is it interpreted differently if it's part of the client Javascript? And how can I fix this?
I put JSON.stringify in the local variable assignment because if I didn't, nothing would get passed through no matter where in the template I tried to put it. Is there another way I'm supposed to be transforming the data maybe?
You have to give an index, like
console.log(!{data[1]});
to view the objects in the console.
I have recently run into this issue as well. I think it would be helpful to point out a few things:
The mysql library for Node already returns your data as JSON; calling JSON.stringify on it returns it as a string, and I believe that is why you're getting that object back in the console log. Try just returning data: rows and accessing the data via dot notation in your template. This is what worked for me.
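A minimal sketch of that suggestion, assuming the same route and template names as the question; interpolating with JSON.stringify inside the template is one common way to hand the rows to inline script:
connection.query(/*sql query*/, function(err, rows, fields){
  app.get('/', function(req, res){
    res.render('index', { data: rows }); // pass the rows array itself, not a pre-stringified copy
  });
});
and in the Jade template:
script(type='text/javascript').
  var rows = !{JSON.stringify(data)};
  console.log(rows);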

How do you use Papa Parse without jquery?

I am trying to implement Papa Parse, but I do not want to use the jQuery library. Could someone please show me how to parse in normal JavaScript using a file from my local storage?
Ok, so when I do this I am getting the string value and not the csv values. What am I doing wrong? Also, where do I insert the callback function that I want to use?
function parseMe(url) {
  Papa.parse(url, {
    complete: function(results) {
      console.log(results); // results appear in dev console
    }
  });
}
parseMe('csv/test.csv');
Close, but file is not a string, it's a File object obtained from the DOM (docs). To do this, you will need to place a <input type="file"> tag in your page. The user will have to select the file. After they've chosen a file, you can obtain the File object with something like document.getElementById("file").files[0] -- assuming you've given the input tag an ID of "file" of course.
Also, you can cut out all the cruft in the config object since those are all defaults.
function parseMe(file) {
  Papa.parse(file, {
    complete: function(results) {
      console.log(results); // results appear in dev console
    }
  });
}
parseMe(document.getElementById("file").files[0]);
Parsing a file is asynchronous so you have to get the results in a callback function which executes later.
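A minimal sketch of the page wiring described above, assuming an <input type="file" id="file"> element; listening for the change event is one way to wait until the user has actually picked a file:
document.getElementById("file").addEventListener("change", function () {
  parseMe(this.files[0]); // results arrive later, in the complete callback
});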

node.js does not recognise the url in the unfluff module

Any help will be appreciated.
I need to extract data from websites and found that node-unfluff does the job (see https://github.com/ageitgey/node-unfluff). There are two ways to call this module.
First, from command line which works!
Second, from node js which doesn't work.
extractor = require('unfluff');
data = extractor('test.html');
console.log(data);
Output : {"title":"","lang":null,"tags":[],"image":null,"videos":[],"text":""}
The data comes back as an empty JSON object. It appears that it cannot read test.html.
It seems like it doesn't recognise test.html. The example says "my html data"; is there a way to get the HTML data? Thanks.
From the docs of unfluff:
extractor(html, language)
html: The html you want to parse
language (optional): The document's two-letter language code. This will be auto-detected as best as possible, but there might be cases where you want to override it.
You are passing a filename, and it expects the actual HTML of the file to be passed in.
If you are doing this in a scripting context, I'd recommend doing
data = extractor(fs.readFileSync('test.html', 'utf8'));
however, if you are doing this in the context of a server, or any time when blocking will be an issue, you should do:
fs.readFile('test.html', 'utf8', function(err, html){
  var data = extractor(html);
  console.log(data);
});

How to parse json file to dictionary

I want to parse a JSON file into a dictionary and write some data to it.
This is what I have, but I end up with an empty dictionary:
var fs = require('fs');
var users = {};
fs.readFile('login.json', function read(err, data) {
  if (err) {
    throw err;
  }
  users = JSON.parse(data);
});
In Node.js you can require JSON files, so your code could simply become:
var users = require('./login.json');
Though note the data will be cached, so if your login.json file changes without an application restart the users object will stay the same.
readFile is an asynchronous function. If you want to do anything with the data in it, you must do so in the callback function (or at some point after you know the callback has been run).
You may want to use readFileSync instead.
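A minimal sketch of both approaches, assuming the same login.json file as the question:
var fs = require('fs');

// asynchronous: the parsed object is only usable inside the callback
fs.readFile('login.json', 'utf8', function (err, data) {
  if (err) throw err;
  var users = JSON.parse(data);
  console.log(users); // use users here, not before the callback has run
});

// synchronous alternative: blocks until the file has been read
var usersSync = JSON.parse(fs.readFileSync('login.json', 'utf8'));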

fs.writeFile() only saving part of string

I'm creating a text editor using node-webkit. When the user clicks a "Save" menu item, I write a plain text file to disk using the fs.writeFile() method:
fs.writeFile(file, txt, function (err) {
  if (err) throw err;
  console.log("file saved");
});
However, it's not saving the entire string passed through the "txt" variable. It's only saving the first 300 characters or so to the file.
I've tried using this method, and the synchronous method fs.writeFileSync. Both are having the same problem. I've tried logging the txt string passed to the method to make sure there's nothing wrong there.
Any ideas why I'm not getting the full text in my saved file?
According to this post: https://groups.google.com/forum/#!topic/node-webkit/3M-0v92o9Zs in the node-webkit Google group, it is likely an encoding issue. Try changing the encoding. I was having the same problem and changed my encoding to utf16le, as specified in that thread, and it fixed the issue; the whole string was written to the file.
My code is now: fs.writeFileSync(path, data, {encoding:'utf16le'});
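For reference, the asynchronous equivalent under the same encoding, using the same path and data variables as above:
fs.writeFile(path, data, { encoding: 'utf16le' }, function (err) {
  if (err) throw err;
  console.log("file saved");
});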
