I am trying to write data to a text file in Node.js. When initializing the write stream I get the following error message:
Be sure you have included the following at the top of your Node program:
var fs = require('fs');
Also, it looks like you spelled 'Stream' incorrectly:
fs.createWriteSteam should be fs.createWriteStream
Another issue that may come up is trying to get createWriteStream from 'fs/promises'.
It doesn't exist on 'fs/promises' (at least as of Node.js version 14).
My solution was to import both:
var fs = require('fs');
var fsPromises = require('fs/promises');
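For example, here is a minimal sketch combining the two (the file name output.txt is just a placeholder): the write stream comes from fs, while promise-based calls stay on fsPromises.
var fs = require('fs');
var fsPromises = require('fs/promises');
// createWriteStream lives on the callback-based 'fs' module,
// not on 'fs/promises'.
var stream = fs.createWriteStream('output.txt');
stream.write('hello\n');
stream.end(function () {
  // Promise-based calls still come from 'fs/promises'.
  fsPromises.readFile('output.txt', 'utf8').then(function (text) {
    console.log(text);
  });
});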
Trying to use Dom-Parser with Discord.js. I couldn't find help anywhere else.
I get an error on line 15, at fs.readFile.
I also had a lot of problems getting fs working. First it wasn't defined, then it could not be accessed before initialization; I just got that fixed (I hope).
// Discord stuff.
const Discord = require('discord.js');
const client = new Discord.Client();
const config = require('./config.json');
const token = config.token;
// DomParser
var DomParser = require('dom-parser');
var parser = new DomParser();
var data = fs.readFileSync(filepathHidden);
// Other requirements
var fs = require('fs');
// when getting online.
client.once('ready', () => {
console.log('WAHAHAHA IM ALIVE!');
}),
fs.readFile('https://url.com', 'utf8', function(err, html){
if (!err){
var dom = parser.parseFromString(html);
console.log(dom.getElementsByClassName('new_solution_box_title').innerHTML);
}
})
client.login(token);
var is hoisted.
So since you have var fs, there is a variable called fs in the function / module / global scope where you have that statement.
It starts out undefined.
When you say fs = require('fs') you assign the file system module to it. At that point it stops being undefined.
On the earlier line where you call fs.readFileSync, you haven't yet assigned it, so you get an error.
Order matters.
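For illustration, a minimal sketch of the ordering problem (the file name data.txt is just a placeholder):
// Broken: fs is hoisted, so the variable exists here, but it is still undefined,
// so calling fs.readFileSync on it throws a TypeError.
// var data = fs.readFileSync('data.txt');
// Working: assign fs first, then use it.
var fs = require('fs');
var data = fs.readFileSync('data.txt');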
See the answer to the same error in another Stack Overflow question.
The answer that helped me is from Arun Kumar Mohan.
What he recommended was to import fs as follows: import * as fs from 'fs'
Please read the Node.js documentation about URL object support in the fs API. Only the file: protocol is supported when using URLs.
However, I don't even recommend using fs.readFileSync here, because it is a blocking operation (code execution stops until it finishes) and a network transfer is involved.
Try fetching the file first using fetch, XMLHttpRequest, or your preferred library, and then parse it. Or at least use fs.readFile, which performs its work asynchronously (the file: protocol limitation still applies).
For the 'not defined' error, you must import/require fs before using it. In other words, var fs = require('fs') must appear before any other use of fs.
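As a sketch of that approach, assuming the node-fetch package (Node 14 has no built-in fetch; the URL and class name are the placeholders from the question, and dom-parser's getElementsByClassName returns a list of matching nodes):
const fetch = require('node-fetch');
const DomParser = require('dom-parser');
const parser = new DomParser();

fetch('https://url.com')
  .then(function (response) { return response.text(); })
  .then(function (html) {
    const dom = parser.parseFromString(html);
    // getElementsByClassName returns a list, so pick an element before reading innerHTML.
    const titles = dom.getElementsByClassName('new_solution_box_title');
    if (titles && titles.length > 0) {
      console.log(titles[0].innerHTML);
    }
  })
  .catch(console.error);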
I am trying to write a function that returns a Blob or File object, blob_object, given the local file path of an object, tmpFilePath. I have to use Node.js, because the function is part of a Firebase cloud function. I have tried many methods, but none of them work. They mainly revolve around the following.
Attempt 1: using stream-to-blob, inspired by a Reddit post.
const streamToBlob = require('stream-to-blob');
const fs = require('fs-extra');
const input_read_stream = fs.createReadStream(tmpFilePath);
const blob_object = await streamToBlob(input_read_stream);
Error: ReferenceError: Blob is not defined
Attempt 2: using the blob package, inspired by a Stack Overflow answer.
const Blob = require('blob');
const fs = require('fs-extra');
const file_buffer = fs.readFileSync(tmpFilePath);
const blob_object = new Blob([file_buffer]);
Error: TypeError: Blob is not a constructor.
A workable solution would mean that, after writing the code in my file, file.js, I could run node file.js and console.log a Blob or File object. Does anyone know how this can be done, step by step? I'm on Node 8.
As of writing, Blob support is still experimental, but this should work in recent versions of Node.js:
import fs from "fs";
import { Blob } from "buffer";
let buffer = fs.readFileSync("./your_file_name");
let blob = new Blob([buffer]);
So your second example should work if you upgrade Node to v16.
I am trying to get JSON values using Node.js, but it is not working. I have searched related questions on Stack Overflow, but I always get [object Object] like this. I do not know why I am getting this. Can anyone resolve this issue?
file.json:
{
"scripts": {
"mr": "place",
"kg": "time",
"bh": "sec"
}
}
extension.js:
var fs = require("fs");
var file = JSON.parse(fs.readFileSync("c:\\xampp\\htdocs\\projects\\file.json", "utf8"));
console.log(file);
This is not a duplicate. I have tried many approaches, but none of them work.
Note: I am using this code inside my Visual Studio Code extension.
In Node, you can import JSON like a JavaScript file:
const file = require('./file.json')
console.log(file)
See is there a require for json in node.js for more info
const data = require("./file.json")
console.log(data.scripts)
Try this one out; it is simple:
const fs = require('fs');
const path = require('path');
console.log(path.join(__dirname, '../file.json'));
let file = JSON.parse(fs.readFileSync(path.join(__dirname, '../file.json'), "utf8"));
__dirname gives you the directory of the current file; I used path.join so I can navigate relative to it.
I put the JSON file in the parent directory in my case.
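As a side note, [object Object] usually appears when an object is coerced into a string (for example by concatenating it into a message); logging individual properties, or JSON.stringify-ing the object, avoids that. A small sketch using the file.json shown above:
const data = require('./file.json');
// Drill into the parsed object to get individual values.
console.log(data.scripts.mr); // "place"
console.log(data.scripts.kg); // "time"
// Or stringify the whole object for readable output.
console.log(JSON.stringify(data, null, 2));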
I want to make a simple Node module that can be run from the command line and that I can pass files into; it might then change every instance of 'red' to 'blue', for example, and save the result as a new file. Is there a simple example out there somewhere that I can edit to fit my purposes? I've looked, but couldn't find one simple enough to understand how to modify. Can anyone help?
A simple example of replace.js (both old and new files are supposed to be in UTF-8 encoding):
'use strict';
const fs = require('fs');
// The source and destination paths come from the command-line arguments.
const oldFilePath = process.argv[2];
const newFilePath = process.argv[3];
const oldFileContent = fs.readFileSync(oldFilePath, 'utf8');
// Replace every occurrence of 'red' with 'blue' and write the result to the new file.
const newFileContent = oldFileContent.replace(/red/g, 'blue');
fs.writeFileSync(newFilePath, newFileContent);
How to call:
node replace.js test.txt new_test.txt
Documentation on the APIs used:
process.argv
fs.readFileSync()
fs.writeFileSync()
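If the files grow large or blocking calls become a concern, a non-blocking variant is possible; here is a minimal sketch using fs/promises (same command-line arguments, assuming a Node version that ships fs/promises):
'use strict';
const fs = require('fs/promises');

const oldFilePath = process.argv[2];
const newFilePath = process.argv[3];

async function main() {
  // Read, replace, and write without blocking the event loop.
  const oldFileContent = await fs.readFile(oldFilePath, 'utf8');
  const newFileContent = oldFileContent.replace(/red/g, 'blue');
  await fs.writeFile(newFilePath, newFileContent);
}

main().catch(console.error);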
I am trying to unzip a gzipped file in Node but I am running into the following error.
Error: incorrect header check
at Zlib._handle.onerror (zlib.js:370:17)
Here is the code that causes the issue.
'use strict'
const fs = require('fs');
const request = require('request');
const zlib = require('zlib');
const path = require('path');
var req = request('https://wiki.mozilla.org/images/f/ff/Example.json.gz').pipe(fs.createWriteStream('example.json.gz'));
req.on('finish', function() {
var readstream = fs.createReadStream(path.join(__dirname, 'example.json.gz'));
var writestream = fs.createWriteStream('example.json');
var inflate = zlib.createInflate();
readstream.pipe(inflate).pipe(writestream);
});
// Note: using the file system because the files will eventually be much larger
Am I missing something obvious? If not, how can I determine what is throwing the error?
The file is gzipped, so you need to use zlib.Gunzip instead of zlib.Inflate.
Also, streams are very efficient in terms of memory usage, so if you want to perform the retrieval without storing the .gz file locally first, you can use something like this:
request('https://wiki.mozilla.org/images/f/ff/Example.json.gz')
.pipe(zlib.createGunzip())
.pipe(fs.createWriteStream('example.json'));
Otherwise, you can modify your existing code:
var gunzip = zlib.createGunzip();
readstream.pipe(gunzip).pipe(writestream);
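Putting that together, the finish handler from the question would look roughly like this (a sketch of the original code with only the Inflate-to-Gunzip swap plus a basic error handler; req, fs, path, and zlib are the same variables as above):
req.on('finish', function() {
  var readstream = fs.createReadStream(path.join(__dirname, 'example.json.gz'));
  var writestream = fs.createWriteStream('example.json');
  var gunzip = zlib.createGunzip();
  // Log decompression errors such as 'incorrect header check'.
  gunzip.on('error', function(err) {
    console.error('Decompression failed:', err);
  });
  readstream.pipe(gunzip).pipe(writestream);
});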