How to write console.log to a file instead - javascript

Right now I print the information with:
console.log(kraken.id, markets)
However, I want everything that would go to the console written to a file instead. How can that be done by completing the code below?
'use strict';
var ccxt = require('ccxt');
(async () => {
    let kraken = new ccxt.kraken()
    let markets = await kraken.load_markets()
    //console.log(kraken.id, markets)
    //How to write the above console.log to a file?
    const fs = require('fs');
    fs.writeFile("/Users/Andreas/Desktop/NODE/myproject/files/test.txt", "allinfoAsstring", function (err) {
        if (err) {
            return console.log(err);
        }
        console.log("The file was saved!");
    });
})()

You can try to create an object out of your variables and format it as a JSON string.
/* ... */
const obj = { kraken, markets }
const fs = require('fs');
fs.writeFile("/Users/Andreas/Desktop/NODE/myproject/files/test.txt", JSON.stringify(obj), function (err) {
    if (err) {
        return console.log(err);
    }
    console.log("The file was saved!");
});
Later, you can retrieve the values from the file by running:
fs.readFile('/Users/Andreas/Desktop/NODE/myproject/files/test.txt', 'utf8', function (err, data) {
    const obj = JSON.parse(data)
    // Pass the object as a separate argument so it isn't flattened to "[object Object]"
    console.log("The data from the file is:", obj)
})

Thanks for the diverse solutions. The simplest way for me was:
node app.js > app.log 2>&1
This redirects stdout to a file named app.log and redirects stderr to stdout.
So all my console.log output goes to app.log.

You can use JSON.stringify(obj); almost every object can be converted into a string this way (circular structures are the exception and will throw).
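For example, a minimal sketch that serializes an object and appends it to a log file (the file name and the object are just placeholders):
const fs = require('fs');

function logToFile(obj) {
    // JSON.stringify throws on circular structures, so guard if needed
    fs.appendFile('app.log', JSON.stringify(obj) + '\n', (err) => {
        if (err) console.error(err);
    });
}

logToFile({ exchange: 'kraken', status: 'ok' });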

I recommend not using console.log in production, since it is synchronous. You can use winston instead, and you easily get all the logs written to a file (if you want) by adding file transports:
const winston = require('winston');

const logger = winston.createLogger({
    level: 'info',
    format: winston.format.json(),
    defaultMeta: { service: 'user-service' },
    transports: [
        //
        // - Write all logs with level `info` and below to `combined.log`
        // - Write all logs with level `error` (and below) to `error.log`
        //
        new winston.transports.File({ filename: 'error.log', level: 'error' }),
        new winston.transports.File({ filename: 'combined.log' })
    ]
});
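With that logger in place, calls like the following replace console.log (a brief usage sketch):
logger.info('markets loaded');           // written to combined.log
logger.error('exchange unreachable');    // written to error.log and combined.log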

An addition to loicnestler's answer that means you don't need to change your console.log statements:
Update the reference to console.log, so it calls writeFile when logging:
const log = console.log;
console.log = (data) => {
    log(data);
    <loicnestler's writeFile logic>
}
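A minimal self-contained sketch of that idea, appending each log call to an assumed app.log file rather than overwriting it:
const fs = require('fs');

const log = console.log;
console.log = (...args) => {
    log(...args); // still print to the terminal
    // Append one JSON line per call; 'app.log' is just an example path
    fs.appendFile('app.log', JSON.stringify(args) + '\n', (err) => {
        if (err) log(err);
    });
};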

In Node, console.log() calls util.inspect() to print objects.
You should call that directly and write it to a file.
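A minimal sketch of that approach (unlike JSON.stringify, util.inspect copes with circular references; the file path is just an example):
const util = require('util');
const fs = require('fs');

// depth: null inspects nested objects all the way down
const output = util.inspect({ id: 'kraken', nested: { a: 1 } }, { depth: null });
fs.writeFileSync('test.txt', output);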

How to get the current module's file path in a node module?

I want to print the correct file path even if the function is imported in some other module, in order to handle errors correctly. How can I do that? I am using Serverless Stack.
Please refer to the following code:
class Logger {
    filePath: string;
    constructor(fp: string) {
        filePath = fp;
    }
    printLog(info) {
        const { timestamp, message } = info;
        return `${timestamp} ${filePath}: ${message}`;
    }
}
This is used in dbConnection.ts as:
const logger = new Logger(__filename);

export const connectToDB = () => {
    try {
        // DB connection code.
    } catch (error) {
        logger.printLog({ timestamp: new Date().toISOString(), message: error.message });
    }
};
Now, I want to connect to the DB from some other module, say test.ts; then I use it as follows:
export const test = () => {
    // some code here...
    connectToDB();
}
When an error occurs while connecting to the DB, it prints something like this:
2022-05-27T05:24:47.548Z src/test.ts: Error in connecting DB url is unreachable please check your internet connection.
To make debugging easier, I want to print the filename from which the exception is actually thrown, i.e. src/dbConnection.ts, not src/test.ts.
Try using __filename.
__filename returns the path of the file being executed.
__dirname returns the path of the directory in which the executing file is located.
Check if it does what you need, e.g.:
console.log(__filename);
Also, try changing filePath to this.filePath in your Logger class.
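Putting both fixes together, the corrected class might look like this (same TypeScript shape as the question's code):
class Logger {
    filePath: string;
    constructor(fp: string) {
        this.filePath = fp; // assign to the instance property
    }
    printLog(info) {
        const { timestamp, message } = info;
        return `${timestamp} ${this.filePath}: ${message}`;
    }
}

// Each module constructs its own logger, so errors report the file that created it
const logger = new Logger(__filename);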

how to save node js console log to .json file

I want to save the output of node.js & console.log as a JSON file.
The first part works fine, but I am unable to save it to the file. I also tried saving via:
$ node filename.js > test.json
That works, but it doesn't format the file properly.
var gplay = require('google-play-scraper');
const fs = require('fs')

function def() {
    gplay.app({ appId: 'com.sablostudio.printing3d.town.builder', country: 'us' }).then(console.log);
}
def();

fs.writeFile('./newconstomer.json', JSON.stringify(--------???, null, 2), err => {
    if (err) {
        console.log(err);
    } else {
        console.log("file done");
    }
});
You can do this with writeFile in this way:
First, you need a logger function that takes the log and saves it to the newconstomer.json file:
function logger(log) {
    fs.writeFile('./newconstomer.json', JSON.stringify(log), function (err) {
        if (err) {
            console.log(err);
        }
        console.log("file done");
    });
}
Now, replace the console.log method with the logger function:
function def() {
    gplay.app({ appId: 'com.sablostudio.printing3d.town.builder', country: 'us' }).then(logger); // <--- place the logger function here
}
def();
Explanation:
The logger function takes a log parameter and serializes it with the JSON.stringify() method.
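If you also want the pretty-printed output the question asked for, keep the extra JSON.stringify arguments (same null, 2 indentation the question used):
fs.writeFile('./newconstomer.json', JSON.stringify(log, null, 2), function (err) {
    if (err) console.log(err);
    console.log("file done");
});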

How to save a stdin data in a JSON in Nodejs

I have the following problem: I want some questions to be asked in my console, and when I receive the answers, to save them in a JSON file and then use them. I am not very familiar with Node.js; I suppose it is something simple, but it doesn't work for me.
When I try to send customer.Token = answers[1]; what happens is that "Token" no longer exists in my JSON. If I use customer.Token = "Hi"; instead, my JSON file changes perfectly.
I need the answer the user is giving at that moment to be saved.
I've been trying all morning to make this work but can't find a solution; if someone knows, it would help me a lot <3
Here below I leave my code:
const customer = require('./db.json');
const fs = require("fs");

function jsonReader(filePath, cb) {
    fs.readFile(filePath, (err, fileData) => {
        if (err) {
            return cb && cb(err);
        }
        try {
            const object = JSON.parse(fileData);
            return cb && cb(null, object);
        } catch (err) {
            return cb && cb(err);
        }
    });
}

jsonReader("./db.json", (err, customer) => {
    if (err) {
        console.log("Error reading file:", err);
        return;
    }
    customer.Token = "HI";
    fs.writeFile("./db.json", JSON.stringify(customer), err => {
        if (err) console.log("Error writing file:", err);
    });
});
var questions = ['Cual es tu nombre?',
    'Cual es tu Token?',
    'Cual es tu numero de orden?'
]
var answers = [];

function question(i) {
    process.stdout.write(questions[i]);
}

process.stdin.on('data', function (data) {
    answers.push(data.toString().trim());
    if (answers.length < questions.length) {
        question(answers.length);
    } else {
        process.exit();
    }
})

question(0);
And in my JSON:
{"name":"Jabibi","order_count":103,"Token":"HI"}
Although it's possible to get your script working with traditional callbacks, I think switching to promises and modern async/await syntax would be easier to read and follow.
readline (a built-in Node.js module) can be used to get input from the user.
You can use fs/promises instead of fs to take advantage of promises.
It seems like you were trying to create a new customer object (using the user's input) and then write it as JSON to the file system.
The script below writes to a temporary file path. Once you've tested that the right data is getting correctly written to a file, you can change the file path (I didn't want to overwrite an existing file on your machine).
const fs = require("fs/promises");
const readline = require("readline").createInterface({
input: process.stdin,
output: process.stdout,
});
function getUserInput(displayText) {
return new Promise((resolve) => {
readline.question(displayText, resolve);
});
}
const questions = [
["name", "Cual es tu nombre?"],
["token", "Cual es tu Token?"],
["order_count", "Cual es tu numero de orden?"],
];
async function main() {
const newCustomer = {};
for (const [property, question] of questions) {
const answer = await getUserInput(question.concat("\n"));
newCustomer[property] = answer;
}
console.log("New customer:", newCustomer);
const oldCustomer = await fs
.readFile("./db.json")
.then((data) => data.toString("utf-8"))
.then(JSON.parse);
// You can do something with old customer here, if you need to.
console.log("Old customer:", oldCustomer);
// I don't want to overwrite any existing file on your machine.
// You can change the file path and data below to whatever they should be
// once you've got your script working.
await fs.writeFile(
`./db_temp_${Date.now()}.json`,
JSON.stringify(newCustomer)
);
}
main()
.catch(console.error)
.finally(() => readline.close());

Replace a string in a file with nodejs

I use the md5 grunt task to generate MD5 filenames. Now I want to rename the sources in the HTML file with the new filename in the callback of the task. I wonder what's the easiest way to do this.
You could use a simple regex:
var result = fileAsString.replace(/string to be replaced/g, 'replacement');
So...
var fs = require('fs')

fs.readFile(someFile, 'utf8', function (err, data) {
    if (err) {
        return console.log(err);
    }
    var result = data.replace(/string to be replaced/g, 'replacement');

    fs.writeFile(someFile, result, 'utf8', function (err) {
        if (err) return console.log(err);
    });
});
Since replace wasn't working for me, I've created a simple npm package, replace-in-file, to quickly replace text in one or more files. It's partially based on @asgoth's answer.
Edit (3 October 2016): The package now supports promises and globs, and the usage instructions have been updated to reflect this.
Edit (16 March 2018): The package has amassed over 100k monthly downloads now and has been extended with additional features as well as a CLI tool.
Install:
npm install replace-in-file
Require module
const replace = require('replace-in-file');
Specify replacement options
const options = {
    //Single file
    files: 'path/to/file',

    //Multiple files
    files: [
        'path/to/file',
        'path/to/other/file',
    ],

    //Glob(s)
    files: [
        'path/to/files/*.html',
        'another/**/*.path',
    ],

    //Replacement to make (string or regex)
    from: /Find me/g,
    to: 'Replacement',
};
Asynchronous replacement with promises:
replace(options)
    .then(changedFiles => {
        console.log('Modified files:', changedFiles.join(', '));
    })
    .catch(error => {
        console.error('Error occurred:', error);
    });
Asynchronous replacement with callback:
replace(options, (error, changedFiles) => {
    if (error) {
        return console.error('Error occurred:', error);
    }
    console.log('Modified files:', changedFiles.join(', '));
});
Synchronous replacement:
try {
    let changedFiles = replace.sync(options);
    console.log('Modified files:', changedFiles.join(', '));
}
catch (error) {
    console.error('Error occurred:', error);
}
Perhaps the "replace" module (www.npmjs.org/package/replace) would also work for you. It would not require you to read and then write the file.
Adapted from the documentation:
// install:
npm install replace
// require:
var replace = require("replace");
// use:
replace({
    regex: "string to be replaced",
    replacement: "replacement string",
    paths: ['path/to/your/file'],
    recursive: true,
    silent: true,
});
You can also use the 'sed' function that's part of ShellJS ...
$ npm install [-g] shelljs
require('shelljs/global');
sed('-i', 'search_pattern', 'replace_pattern', file);
Full documentation ...
ShellJS - sed()
ShellJS
If someone wants to use the promise-based 'fs' module for the task:
const fs = require('fs').promises;
// The statements below must be wrapped inside an async function:
const data = await fs.readFile(someFile, 'utf8');
const result = data.replace(/string to be replaced/g, 'replacement');
await fs.writeFile(someFile, result,'utf8');
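For completeness, the same idea wrapped in an async function (the file name is just an example):
const fs = require('fs').promises;

async function replaceInFile(file) {
    const data = await fs.readFile(file, 'utf8');
    const result = data.replace(/string to be replaced/g, 'replacement');
    await fs.writeFile(file, result, 'utf8');
}

replaceInFile('someFile.txt').catch(console.error);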
You could process the file while it's being read by using streams. It's just like using buffers, but with a more convenient API.
var fs = require('fs');

function searchReplaceFile(regexpFind, replace, cssFileName) {
    var file = fs.createReadStream(cssFileName, 'utf8');
    var newCss = '';

    file.on('data', function (chunk) {
        newCss += chunk.toString().replace(regexpFind, replace);
    });

    file.on('end', function () {
        fs.writeFile(cssFileName, newCss, function (err) {
            if (err) {
                return console.log(err);
            } else {
                console.log('Updated!');
            }
        });
    });
}

searchReplaceFile(/foo/g, 'bar', 'file.txt');
On Linux or Mac, keep it simple and just use sed with the shell. No external libraries required. The following code works on Linux.
const shell = require('child_process').execSync
shell(`sed -i "s!oldString!newString!g" ./yourFile.js`)
The sed syntax is a little different on Mac. I can't test it right now, but I believe you just need to add an empty string after the "-i":
const shell = require('child_process').execSync
shell(`sed -i "" "s!oldString!newString!g" ./yourFile.js`)
The "g" after the final "!" makes sed replace all instances on a line. Remove it, and only the first occurrence per line will be replaced.
Expanding on @Sanbor's answer, the most efficient way to do this is to read the original file as a stream, stream each chunk into a new file, and then lastly replace the original file with the new file.
const fs = require('fs');

async function findAndReplaceFile(regexFindPattern, replaceValue, originalFile) {
    const updatedFile = `${originalFile}.updated`;

    return new Promise((resolve, reject) => {
        const readStream = fs.createReadStream(originalFile, { encoding: 'utf8', autoClose: true });
        const writeStream = fs.createWriteStream(updatedFile, { encoding: 'utf8', autoClose: true });

        // For each chunk, do the find & replace, and write it to the new file stream
        readStream.on('data', (chunk) => {
            chunk = chunk.toString().replace(regexFindPattern, replaceValue);
            writeStream.write(chunk);
        });

        // Once we've finished reading the original file...
        readStream.on('end', () => {
            writeStream.end(); // emits 'finish' event, executes below statement
        });

        // Replace the original file with the updated file
        writeStream.on('finish', async () => {
            try {
                // Move the updated file over the original
                await _renameFile(updatedFile, originalFile);
                resolve();
            } catch (error) {
                reject(`Error: Error renaming ${updatedFile} to ${originalFile} => ${error.message}`);
            }
        });

        readStream.on('error', (error) => reject(`Error: Error reading ${originalFile} => ${error.message}`));
        writeStream.on('error', (error) => reject(`Error: Error writing to ${updatedFile} => ${error.message}`));
    });
}

async function _renameFile(oldPath, newPath) {
    return new Promise((resolve, reject) => {
        fs.rename(oldPath, newPath, (error) => {
            if (error) {
                reject(error);
            } else {
                resolve();
            }
        });
    });
}

// Testing it...
(async () => {
    try {
        await findAndReplaceFile(/"some regex"/g, "someReplaceValue", "someFilePath");
    } catch (error) {
        console.log(error);
    }
})()
I ran into issues when replacing a small placeholder with a large string of code.
I was doing:
var replaced = original.replace('PLACEHOLDER', largeStringVar);
I figured out the problem was JavaScript's special replacement patterns, described here. Since the code I was using as the replacing string had some $ in it, it was messing up the output.
My solution was to use the function replacement option, which DOES NOT do any special replacement:
var replaced = original.replace('PLACEHOLDER', function () {
    return largeStringVar;
});
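An alternative, if you'd rather keep the plain string form, is to escape every $ in the replacement first ($$ stands for a literal dollar sign in replacement strings):
var replaced = original.replace('PLACEHOLDER', largeStringVar.replace(/\$/g, '$$$$'));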
ES2017/8 for Node 7.6+ with a temporary write file for atomic replacement.
const Promise = require('bluebird')
const fs = Promise.promisifyAll(require('fs'))

async function replaceRegexInFile(file, search, replace) {
    let contents = await fs.readFileAsync(file, 'utf8')
    let replaced_contents = contents.replace(search, replace)
    let tmpfile = `${file}.jstmpreplace`
    await fs.writeFileAsync(tmpfile, replaced_contents, 'utf8')
    await fs.renameAsync(tmpfile, file)
    return true
}
Note: this is only for smallish files, as they will be read into memory.
This may help someone:
This is a little different than just a global replace.
From the terminal, we run:
node replace.js
replace.js:
function processFile(inputFile, repString = "../") {
    var fs = require('fs'),
        readline = require('readline'),
        instream = fs.createReadStream(inputFile),
        outstream = new (require('stream'))(),
        rl = readline.createInterface(instream, outstream),
        formatted = '';

    const regex = /<xsl:include href="([^"]*)" \/>$/gm;

    rl.on('line', function (line) {
        let url = '';
        let m;
        while ((m = regex.exec(line)) !== null) {
            // This is necessary to avoid infinite loops with zero-width matches
            if (m.index === regex.lastIndex) {
                regex.lastIndex++;
            }
            url = m[1];
        }

        let re = new RegExp('^.* <xsl:include href="(.*?)" \/>.*$', 'gm');
        formatted += line.replace(re, `\t<xsl:include href="${repString}${url}" />`);
        formatted += "\n";
    });

    rl.on('close', function (line) {
        fs.writeFile(inputFile, formatted, 'utf8', function (err) {
            if (err) return console.log(err);
        });
    });
}

// The path is relative to where you run the command from
processFile('build/some.xslt');
This is what it does:
We have several files that contain xsl:includes.
However, in development we need the path to move down a level.
From this:
<xsl:include href="common/some.xslt" />
to this:
<xsl:include href="../common/some.xslt" />
So we end up running two regex patterns: one to get the href and the other to write the replacement.
There is probably a better way to do this, but it works for now.
Thanks.
Normally, I use tiny-replace-files to replace text in one or more files. This package is smaller and lighter...
https://github.com/Rabbitzzc/tiny-replace-files
import { replaceStringInFilesSync } from 'tiny-replace-files'

const options = {
    files: 'src/targets/index.js',
    from: 'test-plugin',
    to: 'self-name',
}

// synchronous variant, so no await is needed
const result = replaceStringInFilesSync(options)
console.info(result)
I would use a duplex stream instead, as documented in the Node.js docs on duplex streams:
"A Transform stream is a Duplex stream where the output is computed in some way from the input."
<p>Please click in the following {{link}} to verify the account</p>
function renderHTML(templatePath: string, object) {
    const template = fileSystem.readFileSync(path.join(Application.staticDirectory, templatePath + '.html'), 'utf8');
    // Replace each {{binding}} with the matching property of `object`
    return template.replace(/\{\{(.*?)\}\}/g, (match, property) => object[property]);
}

renderHTML(templateName, { link: 'SomeLink' })
For sure you can improve the template-reading function to read as a stream and compose the bytes line by line to make it more efficient.

Using Node.JS, how do I read a JSON file into (server) memory?

Background
I am doing some experimentation with Node.js and would like to read a JSON object, either from a text file or a .js file (which is better??) into memory so that I can access that object quickly from code. I realize that there are things like Mongo, Alfred, etc out there, but that is not what I need right now.
Question
How do I read a JSON object out of a text or js file and into server memory using JavaScript/Node?
Sync:
var fs = require('fs');
var obj = JSON.parse(fs.readFileSync('file', 'utf8'));
Async:
var fs = require('fs');
var obj;
fs.readFile('file', 'utf8', function (err, data) {
    if (err) throw err;
    obj = JSON.parse(data);
});
The easiest way I have found to do this is to just use require and the path to your JSON file.
For example, suppose you have the following JSON file.
test.json
{
    "firstName": "Joe",
    "lastName": "Smith"
}
You can then easily load this in your Node.js application using require:
var config = require('./test.json');
console.log(config.firstName + ' ' + config.lastName);
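One caveat: require caches the file, so repeated requires return the same object and later changes on disk are not picked up. If you need a fresh read, clear the cache entry first:
// Force a fresh read on the next require of this file
delete require.cache[require.resolve('./test.json')];
var config = require('./test.json');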
Asynchronous is there for a reason! Throws stone at @mihai
Otherwise, here is the code he used, but with the asynchronous version:
// Declare variables
var fs = require('fs'),
    obj

// Read the file and send to the callback
fs.readFile('path/to/file', handleFile)

// Write the callback function
function handleFile(err, data) {
    if (err) throw err
    obj = JSON.parse(data)
    // You can now play with your data
}
At least in Node v8.9.1, you can just do
var json_data = require('/path/to/local/file.json');
and access all the elements of the JSON object.
Answer for 2022, using ES6 module syntax and async/await
In modern JavaScript, this can be done as a one-liner, without the need to install additional packages:
import { readFile } from 'fs/promises';
let data = JSON.parse(await readFile("filename.json", "utf8"));
Add a try/catch block to handle exceptions as needed.
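For example, a version with basic error handling might look like this:
import { readFile } from 'fs/promises';

let data;
try {
    data = JSON.parse(await readFile('filename.json', 'utf8'));
} catch (err) {
    console.error('Could not read or parse filename.json:', err);
}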
In Node 8 you can use the built-in util.promisify() to asynchronously read a file like this:
const { promisify } = require('util')
const fs = require('fs')
const readFileAsync = promisify(fs.readFile)

readFileAsync(`${__dirname}/my.json`, { encoding: 'utf8' })
    .then(contents => {
        const obj = JSON.parse(contents)
        console.log(obj)
    })
    .catch(error => {
        throw error
    })
Using the fs-extra package is quite simple:
Sync:
const fs = require('fs-extra')
const packageObj = fs.readJsonSync('./package.json')
console.log(packageObj.version)
Async:
const fs = require('fs-extra')

// Inside an async function:
const packageObj = await fs.readJson('./package.json')
console.log(packageObj.version)
Using node-fs-extra (async/await):
const fs = require('fs-extra');

const readJsonFile = async () => {
    const myJsonObject = await fs.readJson('./my_json_file.json');
    console.log(myJsonObject);
}

readJsonFile() // prints your json object
https://nodejs.org/dist/latest-v6.x/docs/api/fs.html#fs_fs_readfile_file_options_callback
var fs = require('fs');

fs.readFile('/etc/passwd', (err, data) => {
    if (err) throw err;
    console.log(data);
});

// With options:
fs.readFile('/etc/passwd', 'utf8', callback);
https://nodejs.org/dist/latest-v6.x/docs/api/fs.html#fs_fs_readfilesync_file_options
You can find the full fs usage in the Node.js File System docs. Hope this helps!
function parseIt() {
    return new Promise(function (resolve, reject) {
        var fs = require('fs');
        const dirPath = 'K:\\merge-xml-junit\\xml-results\\master.json';
        fs.readFile(dirPath, 'utf8', function (err, data) {
            // Reject on failure rather than throwing inside the callback
            if (err) return reject(err);
            resolve(data);
        });
    });
}

async function test() {
    const jsonData = await parseIt();
    var parsedJSON = JSON.parse(jsonData);
    var testSuite = parsedJSON['testsuites']['testsuite'];
    console.log(testSuite);
}

test();
Answer for 2022, using v8 Import assertions
import jsObject from "./test.json" assert { type: "json" };
console.log(jsObject)
Dynamic import:
// The parsed JSON lives on the module's default export
const { default: jsObject } = await import("./test.json", { assert: { type: "json" } });
console.log(jsObject);
Read more at:
v8 Import assertions
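Note that in newer Node.js releases the proposal has been renamed to "import attributes", which use the with keyword instead of assert; check which form your Node version supports:
import jsObject from "./test.json" with { type: "json" };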
So many answers, and no one ever made a benchmark to compare sync vs. async vs. require. I described the difference between the use cases of reading JSON into memory via require, readFileSync and readFile here.
If you are looking for a complete solution for asynchronously loading a JSON file from a relative path, with error handling:
// Request the path module for relative paths
const path = require('path')
// Request the File System module
var fs = require('fs');

// GET request for the /listUsers page (assumes an Express router is in scope)
router.get('/listUsers', function (req, res) {
    console.log("Got a GET request for list of users");
    // Create a relative path URL
    let reqPath = path.join(__dirname, '../mock/users.json');
    // Read JSON from the relative path of this file
    fs.readFile(reqPath, 'utf8', function (err, data) {
        if (!err) {
            // Handle success
            console.log("Success: " + data);
            // Parse data to JSON if needed
            var jsonObj = JSON.parse(data)
            // Send back as the response
            res.end(data);
        } else {
            // Handle error
            res.end("Error: " + err)
        }
    });
})