I've run into a problem which I can't solve.
I'm making an app where on the first page I need to choose one of two machines. There are 2 buttons on the page, and when one of them is clicked I make a POST to /machineChoose where I pass the id of the selected machine. Then I need to change the config.js file, where I have all the params needed for the rest of the app.
const config = {
  machineName: "Machine",
  ...
So in my code I need to change machineName. Right now I use the fs module to read and then write the file, but the problem is that I can't change this name more than once. When I restart the app I'm able to change the name, but when I try to choose the second machine, nothing happens.
router.post("/machineChoose", async (req, res) => {
console.log(req.body.machineChoose);
if (req.body.machineChoose == 1) {
machineX = "Machine1";
} else {
machineX = "Machine2";
}
console.log(machineX);
fs.readFile('./config.js', 'utf-8', function (err,data){
if (err){
console.log(err);
}
var result = data.replace(config.machineName,machineX);
fs.writeFileSync('./config.js', result, 'utf-8', function(err){
if (err) return console.log(err);
});
});
return res.send("")
})
Any idea how to solve it?
After writing to the file, you need to reload the config object, as the in-memory version still holds the previous state; further calls to data.replace(...) will therefore not replace anything, since they are still called with "Machine". (Note also that require() caches modules, so rewriting config.js on disk never updates an object that has already been required.)
I would do something like this, keeping the config in a JSON file instead (although you should consider using a real database):
router.post("/machineChoose", async (req, res) => {
const chosenMachine = req.body.machineChoose == 1 ? "Machine1" : "Machine2";
const config = await readConfig();
config.machineName = chosenMachine;
await writeConfig(config);
res.status(204).end();
});
async function writeConfig(currentConfig) {
try {
await fs.promises.writeFile("./config.json", JSON.stringify(currentConfig));
} catch (e) {
console.log("Could not write config file", e)
throw e;
}
}
async function readConfig() {
try {
const rawConfig = await fs.promises.readFile("./config.json", {encoding: 'utf-8'});
return JSON.parse(rawConfig);
} catch (e) {
console.log("Could not read config file", e)
throw e;
}
}
I've been making a project recently and I basically need to check for new text in a text file.
My code was this:
const fs = require('fs');

fs.watch('./file.txt', (event, filename) => {
  fs.readFile('./file.txt', (err, data) => {
    if (err) throw err;
    data = JSON.parse(data);
    console.log(data);
  });
});
It worked great. However, sometimes I have to delete this file for various reasons, and then my code crashes!
Any idea how to handle this? Thank you for your answers.
Node's built-in fs module doesn't support file deletion detection very well. There is a workaround using a package called nsfw, which is a wrapper around a native library and provides much better support for deletion detection.
The API is a bit odd but it is a solid package nonetheless.
Here is an example of what you're attempting to do using nsfw.
const nsfw = require("nsfw");
const path = require("path");
const fs = require("fs");

const file = path.join(__dirname, "file.txt");
let watcher;

nsfw(file, ([event, ...restEvents]) => {
  switch (event.action) {
    case nsfw.actions.DELETED: {
      watcher.stop();
      return; // or handle this however you need to..
    }
    default: {
      fs.readFile(file, (err, data) => {
        if (err) throw err;
        try {
          data = JSON.parse(data);
          console.log(data);
        } catch (error) {
          console.error(error);
        }
      });
    }
  }
})
  .then((w) => {
    watcher = w;
    watcher.start();
  });
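If you'd rather avoid a dependency, one workaround is to watch the parent directory instead of the file itself, since the directory watcher survives the file being deleted and recreated. A rough sketch of that idea (assuming file.txt sits next to the script; note that filename can be null on some platforms):

const fs = require('fs');
const path = require('path');

const dir = __dirname;
const target = 'file.txt';

// Watching the directory keeps the watcher alive even when the file is deleted.
fs.watch(dir, (event, filename) => {
  if (filename !== target) return;
  fs.readFile(path.join(dir, target), 'utf8', (err, data) => {
    if (err) {
      if (err.code === 'ENOENT') return; // file was deleted; wait for it to reappear
      throw err;
    }
    try {
      console.log(JSON.parse(data));
    } catch (parseErr) {
      console.error(parseErr);
    }
  });
});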
I've been working on a Node project that involves fetching some data from BigQuery. Everything has been fine so far; I have my credential.json file (from BigQuery) and the project works as expected.
However, I want to implement a new feature in the project and this would involve fetching another set of data from BigQuery. I have an entirely different credential.json file for this new dataset. My project seems to recognize only the initial credential.json file I had (I named them differently though).
Here's a snippet of how I linked my first credential.json file:
function createCredentials() {
  try {
    const encodedCredentials = process.env.GOOGLE_AUTH_KEY;
    if (typeof encodedCredentials === 'string' && encodedCredentials.length > 0) {
      const google_auth = atob(encodedCredentials);
      if (!fs.existsSync('credentials.json')) {
        fs.writeFile("credentials.json", google_auth, function (err) {
          if (err) console.log(err);
          console.log("Successfully Written to File.");
        });
      }
    }
  } catch (error) {
    logger.warn(`Ensure that the environment variable for GOOGLE_AUTH_KEY is set correctly: the full error is given here: ${error.message}`);
    process.kill(process.pid, 'SIGTERM');
  }
}
Is there a way to fuse my two credential.json files together? If not, how can I separately declare which credential.json file to use?
"If not, how can I separately declare which credential.json file to use?"
I would create a single function that acts as the exit point to BigQuery, and pass it an identifier saying which credential to generate; that credential is then used when calling BigQuery.
The code below assumes you changed this:
function createCredentials() {
  try {
    const encodedCredentials = process.env.GOOGLE_AUTH_KEY;
To this:
function createCredentials(auth) {
  try {
    const encodedCredentials = auth;
And you can use it like this:
import BigQuery from '@google-cloud/bigquery';
import { GoogApi } from "../apiManager"; // Private code to get Token from client DB

if (!global._babelPolyfill) {
  var a = require("babel-polyfill");
}

describe('Check routing', async () => {
  it('Test stack ', async (done, auth) => {
    // Fetch client Auth from local Database
    // Replace the 2 values below with real values
    const tableName = "myTest";
    const dataset = "myDataset";
    try {
      const bigquery = new BigQuery({
        projectId: `myProject`,
        keyFilename: this.createCredentials(auth)
      });
      await bigquery.createDataset(dataset)
        .then(args => {
          console.log(`Create dataset, result is: ${args}`);
        })
        .catch(err => {
          console.log(`Error in the process: ${err.message}`);
        });
    } catch (err) {
      console.log("err", err);
    }
  });
});
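If all you need is the two datasets side by side, an even simpler option is to keep both credential files on disk and create one BigQuery client per file via the keyFilename option. A minimal sketch (the file names and query are placeholders; depending on your library version, the import may be the default export instead of a destructured one):

const { BigQuery } = require('@google-cloud/bigquery');

// One client per credential file; each client authenticates independently.
const bigqueryA = new BigQuery({ keyFilename: 'credentials-a.json' });
const bigqueryB = new BigQuery({ keyFilename: 'credentials-b.json' });

async function queryBoth() {
  const [rowsA] = await bigqueryA.query('SELECT 1 AS x');
  const [rowsB] = await bigqueryB.query('SELECT 2 AS x');
  console.log(rowsA, rowsB);
}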
I am trying to read the contents of a specific path. For that purpose, I used the following code:
code1:
const contentsOfPersonalFolder = fs.readdirSync(rootPathToPersonal);
But I know in advance that I do not have permission to read some of the contents that will be returned by the previous line of code.
To check whether or not I have permission to read a file, I would use the following code:
code2:
try {
  fs.accessSync(path, fs.constants.R_OK);
  logger.info('The directory: ', path, 'can be read');
} catch (err) {
  logger.error('The directory: ', path, 'can not be read due to inaccessibility');
}
The problem now is that the code in code1 will return an array of all available files in the specified path, and if one of these files is read-protected, the call will throw and the program will crash.
What I want to achieve is to iterate through all the available files in the specified path from code1, check each item using the code in code2, and then do some logic if the file is readable and something else if it is not.
Please let me know how to achieve that.
You could use fs.access to check the user's permissions:
https://nodejs.org/api/fs.html#fs_fs_access_path_mode_callback
const fs = require('fs');
const path = require('path');

const testFolder = './tests/';

fs.readdir(testFolder, (err, files) => {
  files.forEach(file => {
    console.log(file);
    // readdir returns bare names, so join them with the folder before checking
    fs.access(path.join(testFolder, file), fs.constants.R_OK, (err) => {
      if (err) {
        console.error("file is not readable");
        return;
      }
      // do your reading operations
    });
  });
});
const fs = require('fs');
const path = require('path');

const isAvailableToRead = file => {
  try {
    fs.accessSync(file, fs.constants.R_OK);
    return true;
  } catch (err) {
    return false;
  }
};

const readDirectory = dir => {
  const files = fs.readdirSync(dir);
  files.forEach(file => {
    // join the directory and the bare file name before checking access
    if (isAvailableToRead(path.join(dir, file))) {
      console.log(`Do some logic ${file}`);
    }
  });
};

readDirectory(__dirname);
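If you prefer the non-blocking API, here is a sketch of the same idea with fs.promises (Node 10+):

const fs = require('fs');
const path = require('path');

async function readDirectoryAsync(dir) {
  const files = await fs.promises.readdir(dir);
  for (const file of files) {
    try {
      // access() rejects when the file is not readable
      await fs.promises.access(path.join(dir, file), fs.constants.R_OK);
      console.log(`Do some logic ${file}`);
    } catch (err) {
      console.log(`Skipping unreadable file ${file}`);
    }
  }
}

readDirectoryAsync(__dirname);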
I use the md5 grunt task to generate MD5 filenames. Now I want to rename the sources in the HTML file with the new filename in the callback of the task. I wonder what's the easiest way to do this.
You could use simple regex:
var result = fileAsString.replace(/string to be replaced/g, 'replacement');
So...
var fs = require('fs');

fs.readFile(someFile, 'utf8', function (err, data) {
  if (err) {
    return console.log(err);
  }
  var result = data.replace(/string to be replaced/g, 'replacement');

  fs.writeFile(someFile, result, 'utf8', function (err) {
    if (err) return console.log(err);
  });
});
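One caveat worth adding: if the search string comes from a variable, characters like ., $ or ( are regex metacharacters, so they need escaping before being passed to new RegExp. A common helper (the function name is my own) looks like this:

// Escape regex metacharacters so a plain string can be used with new RegExp()
function escapeRegExp(str) {
  return str.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
}

var pattern = new RegExp(escapeRegExp('price is $10 (USD)'), 'g');
var result = data.replace(pattern, 'replacement');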
Since replace wasn't working for me, I've created a simple npm package, replace-in-file, to quickly replace text in one or more files. It's partially based on @asgoth's answer.
Edit (3 October 2016): The package now supports promises and globs, and the usage instructions have been updated to reflect this.
Edit (16 March 2018): The package has amassed over 100k monthly downloads now and has been extended with additional features as well as a CLI tool.
Install:
npm install replace-in-file
Require module
const replace = require('replace-in-file');
Specify replacement options
const options = {

  //Single file
  files: 'path/to/file',

  //Multiple files
  files: [
    'path/to/file',
    'path/to/other/file',
  ],

  //Glob(s)
  files: [
    'path/to/files/*.html',
    'another/**/*.path',
  ],

  //Replacement to make (string or regex)
  from: /Find me/g,
  to: 'Replacement',
};
Asynchronous replacement with promises:
replace(options)
  .then(changedFiles => {
    console.log('Modified files:', changedFiles.join(', '));
  })
  .catch(error => {
    console.error('Error occurred:', error);
  });
Asynchronous replacement with callback:
replace(options, (error, changedFiles) => {
  if (error) {
    return console.error('Error occurred:', error);
  }
  console.log('Modified files:', changedFiles.join(', '));
});
Synchronous replacement:
try {
  let changedFiles = replace.sync(options);
  console.log('Modified files:', changedFiles.join(', '));
} catch (error) {
  console.error('Error occurred:', error);
}
Perhaps the "replace" module (www.npmjs.org/package/replace) also would work for you. It would not require you to read and then write the file.
Adapted from the documentation:
// install:
npm install replace

// require:
var replace = require("replace");

// use:
replace({
  regex: "string to be replaced",
  replacement: "replacement string",
  paths: ['path/to/your/file'],
  recursive: true,
  silent: true,
});
You can also use the 'sed' function that's part of ShellJS ...
$ npm install [-g] shelljs
require('shelljs/global');
sed('-i', 'search_pattern', 'replace_pattern', file);
Full documentation ...
ShellJS - sed()
ShellJS
If someone wants to use the promise-based 'fs' module for the task:
const fs = require('fs').promises;
// Below statements must be wrapped inside the 'async' function:
const data = await fs.readFile(someFile, 'utf8');
const result = data.replace(/string to be replaced/g, 'replacement');
await fs.writeFile(someFile, result,'utf8');
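For completeness, a runnable version with the wrapper in place might look like this (someFile standing in for your path):

const fs = require('fs').promises;

async function replaceInFile(someFile) {
  const data = await fs.readFile(someFile, 'utf8');
  const result = data.replace(/string to be replaced/g, 'replacement');
  await fs.writeFile(someFile, result, 'utf8');
}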
You could process the file while it's being read by using streams. It's just like using buffers, but with a more convenient API.
var fs = require('fs');

function searchReplaceFile(regexpFind, replace, cssFileName) {
  var file = fs.createReadStream(cssFileName, 'utf8');
  var newCss = '';

  file.on('data', function (chunk) {
    newCss += chunk.toString().replace(regexpFind, replace);
  });

  file.on('end', function () {
    fs.writeFile(cssFileName, newCss, function (err) {
      if (err) {
        return console.log(err);
      } else {
        console.log('Updated!');
      }
    });
  });
}

searchReplaceFile(/foo/g, 'bar', 'file.txt');
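One caveat: because the replace runs chunk by chunk, a match that happens to straddle a chunk boundary will be missed. For short patterns in large files this is unlikely, but it can happen.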
On Linux or Mac, keep it simple and just use sed from the shell. No external libraries are required. The following code works on Linux.
const shell = require('child_process').execSync
shell(`sed -i "s!oldString!newString!g" ./yourFile.js`)
The sed syntax is a little different on Mac. I can't test it right now, but I believe you just need to add an empty string after the "-i":
const shell = require('child_process').execSync
shell(`sed -i "" "s!oldString!newString!g" ./yourFile.js`)
The "g" after the final "!" makes sed replace all instances on a line. Remove it, and only the first occurrence per line will be replaced.
Expanding on #Sanbor's answer, the most efficient way to do this is to read the original file as a stream, and then also stream each chunk into a new file, and then lastly replace the original file with the new file.
const fs = require('fs');

async function findAndReplaceFile(regexFindPattern, replaceValue, originalFile) {
  const updatedFile = `${originalFile}.updated`;

  return new Promise((resolve, reject) => {
    const readStream = fs.createReadStream(originalFile, { encoding: 'utf8', autoClose: true });
    const writeStream = fs.createWriteStream(updatedFile, { encoding: 'utf8', autoClose: true });

    // For each chunk, do the find & replace, and write it to the new file stream
    readStream.on('data', (chunk) => {
      chunk = chunk.toString().replace(regexFindPattern, replaceValue);
      writeStream.write(chunk);
    });

    // Once we've finished reading the original file...
    readStream.on('end', () => {
      writeStream.end(); // emits 'finish' event, executes below statement
    });

    // Replace the original file with the updated file
    writeStream.on('finish', async () => {
      try {
        await _renameFile(updatedFile, originalFile);
        resolve();
      } catch (error) {
        reject(`Error: Error renaming ${updatedFile} to ${originalFile} => ${error.message}`);
      }
    });

    readStream.on('error', (error) => reject(`Error: Error reading ${originalFile} => ${error.message}`));
    writeStream.on('error', (error) => reject(`Error: Error writing to ${updatedFile} => ${error.message}`));
  });
}
async function _renameFile(oldPath, newPath) {
  return new Promise((resolve, reject) => {
    fs.rename(oldPath, newPath, (error) => {
      if (error) {
        reject(error);
      } else {
        resolve();
      }
    });
  });
}
// Testing it...
(async () => {
  try {
    await findAndReplaceFile(/"some regex"/g, "someReplaceValue", "someFilePath");
  } catch (error) {
    console.log(error);
  }
})();
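A nice side effect of writing to a temporary file and then renaming it over the original is that, on POSIX filesystems, the rename is atomic, so other readers never observe a half-written file.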
I ran into issues when replacing a small placeholder with a large string of code.
I was doing:
var replaced = original.replace('PLACEHOLDER', largeStringVar);
I figured out the problem was JavaScript's special replacement patterns, described here. Since the code I was using as the replacement string had some $ in it, it was messing up the output.
My solution was to use the function replacement option, which DOES NOT do any special replacement:
var replaced = original.replace('PLACEHOLDER', function () {
  return largeStringVar;
});
ES2017/8 for Node 7.6+ with a temporary write file for atomic replacement.
const Promise = require('bluebird');
const fs = Promise.promisifyAll(require('fs'));

async function replaceRegexInFile(file, search, replace) {
  let contents = await fs.readFileAsync(file, 'utf8');
  let replaced_contents = contents.replace(search, replace);
  let tmpfile = `${file}.jstmpreplace`;
  await fs.writeFileAsync(tmpfile, replaced_contents, 'utf8');
  await fs.renameAsync(tmpfile, file);
  return true;
}
Note, only for smallish files as they will be read into memory.
This may help someone:
This is a little different than just a global replace.
From the terminal we run:
node replace.js
replace.js:
function processFile(inputFile, repString = "../") {
  var fs = require('fs'),
    readline = require('readline'),
    instream = fs.createReadStream(inputFile),
    outstream = new (require('stream'))(),
    rl = readline.createInterface(instream, outstream),
    formatted = '';

  const regex = /<xsl:include href="([^"]*)" \/>$/gm;

  rl.on('line', function (line) {
    let url = '';
    let m;
    while ((m = regex.exec(line)) !== null) {
      // This is necessary to avoid infinite loops with zero-width matches
      if (m.index === regex.lastIndex) {
        regex.lastIndex++;
      }
      url = m[1];
    }

    let re = new RegExp('^.* <xsl:include href="(.*?)" \/>.*$', 'gm');
    formatted += line.replace(re, `\t<xsl:include href="${repString}${url}" />`);
    formatted += "\n";
  });

  rl.on('close', function (line) {
    fs.writeFile(inputFile, formatted, 'utf8', function (err) {
      if (err) return console.log(err);
    });
  });
}
// the path is relative to where you're running the command from
processFile('build/some.xslt');
This is what it does:
We have several files that contain xsl:includes.
However, in development we need the path to move down a level.
From this
<xsl:include href="common/some.xslt" />
to this
<xsl:include href="../common/some.xslt" />
So we end up running two regex patterns: one to get the href and the other to write the replacement.
There is probably a better way to do this, but it works for now.
Thanks
Normally, I use tiny-replace-files to replace text in one or more files. This package is smaller and lighter...
https://github.com/Rabbitzzc/tiny-replace-files
import { replaceStringInFilesSync } from 'tiny-replace-files';

const options = {
  files: 'src/targets/index.js',
  from: 'test-plugin',
  to: 'self-name',
};

// synchronous, so no await is needed
const result = replaceStringInFilesSync(options);
console.info(result);
I would use a duplex stream instead, as documented in the Node.js docs on duplex streams:
"A Transform stream is a Duplex stream where the output is computed in some way from the input."
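A minimal sketch of that idea (my own example; note that a match spanning two chunks would be missed):

const fs = require('fs');
const { Transform } = require('stream');

// A Transform stream that rewrites each chunk as it flows through
const replaceStream = (search, replacement) => new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().replace(search, replacement));
  }
});

fs.createReadStream('in.txt', 'utf8')
  .pipe(replaceStream(/string to be replaced/g, 'replacement'))
  .pipe(fs.createWriteStream('out.txt'));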
For a simple template like
<p>Please click in the following {{link}} to verify the account</p>
you can render the bindings like this:
function renderHTML(templatePath: string, object) {
  const template = fileSystem.readFileSync(path.join(Application.staticDirectory, templatePath + '.html'), 'utf8');
  // Replace each {{binding}} in turn, starting from the raw template
  return template.match(/\{{(.*?)\}}/ig).reduce((acc, binding) => {
    const property = binding.substring(2, binding.length - 2);
    return acc.replace(/\{{(.*?)\}}/, object[property]);
  }, template);
}
renderHTML(templateName, { link: 'SomeLink' })
For sure, you can improve the template-reading function to read as a stream and compose the bytes line by line to make it more efficient.