I am trying to read a text file that lives in the source (src) folder of a React project (create-react-app), manipulate the values, and write the new values back to the same text file.
I am unable to read the current values from the file: the code that reads the file logs old data, and I am not sure where that data is coming from, because even if I change the contents of the text file directly, the new value is never read.
I am using a package called browserify-fs (https://www.npmjs.com/package/browserify-fs) for reading and writing to the file.
var fs = require('browserify-fs');
var reader = new FileReader();

export const getData = () => {
  let initialString = "abcd";
  fs.readFile('file.txt', function (err, data) {
    if (err) {
      return console.error(err);
    }
    console.log(initialString + data.toString());
  });
};
export const writeData = () => {
  let data = "abcd";
  fs.writeFile("file.txt", data, err => {
    // In case of an error, throw it.
    if (err) throw err;
  });
};
Does this have something to do with a webpack loader for importing this type of file into the build, or is it specific to how create-react-app defines the file and folder structure for auto-importing certain file types?
I am still not sure what is actually causing the issue. Any help would be appreciated.
P.S.: I know performing CRUD operations from the browser is not recommended practice; I am only doing this for a personal project (learning purposes).
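For reference, browserify-fs persists data in an in-browser store (level-filesystem on top of IndexedDB), not in the project's src folder, so a read only ever returns what was previously written through the same API. A minimal sketch of that round trip (the paths here are just examples):

var fs = require('browserify-fs');

// Everything below goes to browserify-fs's in-browser store,
// not to the real file in the project's src folder.
fs.mkdir('/data', function () {
  fs.writeFile('/data/file.txt', 'abcd', function (err) {
    if (err) return console.error(err);
    fs.readFile('/data/file.txt', 'utf-8', function (err, data) {
      if (err) return console.error(err);
      console.log(data); // "abcd" - the value written above, never src/file.txt
    });
  });
});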
I am trying to achieve the following:
The user selects a file on the website.
The user calls a Firebase Cloud Function and passes the file into the function.
The Cloud Function uploads the file to storage.
So far I am able to do all of the above; however, when I try to access the file in storage, a file with no extension is downloaded. The original file was a PDF, but I am still unable to open the downloaded file with PDF viewers. It appears I am storing something in storage, although I am not exactly sure what.
Here is an example of how my front-end code works:
const getBase64 = file => new Promise((resolve, reject) => {
  const reader = new FileReader();
  reader.readAsDataURL(file);
  reader.onload = () => resolve(reader.result);
  reader.onerror = error => reject(error);
});
var document_send = document.getElementById('myFile');
var send_button = document.getElementById('send_button');

send_button.addEventListener('click', async () => {
  var sendDocument = firebase.functions().httpsCallable('sendDocument');
  try {
    await sendDocument({
      docu: await getBase64(document_send.files[0])
    });
  } catch (error) {
    console.log(error.message);
  }
});
Here is an example of how my cloud function works:
const functions = require("firebase-functions");
const admin = require("firebase-admin");
admin.initializeApp();

exports.sendDocument = functions.https
  .onCall((data, context) => {
    return admin.storage().bucket()
      .file("randomLocationName")
      //.file("randomLocationName" + ".pdf") - tried this also
      .save(data.docu)
      .catch((error) => {
        console.log(error.message);
        return error;
      });
  });
I do not receive an error message as the function runs without error.
The save() function seems to take either a string or a Buffer as its first parameter.
> save(data: string | Buffer, options?: SaveOptions)
The issue arises when you pass the base64 string directly instead of a Buffer. Try refactoring the code as shown below:
return admin.storage().bucket()
  .file("randomLocationName" + ".pdf") // <-- file extension required
  .save(Buffer.from(data.docu, "base64"));
Cloud Functions also has a 10 MB maximum request size, so you won't be able to upload large files that way. You can use the Firebase client SDKs to upload files directly, restrict access using security rules, and use Cloud Storage triggers for Cloud Functions in case you want to process the file and update the database. Alternatively, use signed URLs for uploading the files if you are using GCS without Firebase.
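If you go the direct-upload route, here is a minimal sketch using the Firebase Web SDK's Cloud Storage API (the 'uploads/' path and the reuse of the myFile input are illustrative assumptions, not taken from the code above):

// Upload the selected file straight from the browser, bypassing the
// Cloud Functions request-size limit.
const fileInput = document.getElementById('myFile');
const file = fileInput.files[0];

const storageRef = firebase.storage().ref();
const fileRef = storageRef.child('uploads/' + file.name); // keeps the .pdf extension

fileRef.put(file)
  .then((snapshot) => console.log('Uploaded', snapshot.metadata.fullPath))
  .catch((error) => console.log(error.message));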
I download an OpenAPI file from localhost and convert it to .json. It becomes something like this:
"components":{"responses":{"r200":{"content":{"application/json":{"schema":{"properties" ....
I'm doing that with this JavaScript code:
const axios = require('axios');
const fs = require('fs');

const processData = async () => {
  const req = await axios.get('http://localhost:5004/swagger');
  let reqJson = JSON.stringify(req.data);
  fs.writeFile('swagger.json', reqJson, (err) => {
    if (err) throw err;
  });
};

processData();
If there are any changes in the OpenAPI file on localhost, I want to download it again, convert it to .json, and save it as a new file to compare with the original swagger.json. That will be the same as the previous code, except for:
fs.writeFile('newSwagger.txt' ....
And the changes have to be in the error field in the URL.
Question: How can I compare these files and show any changes in a popup on the website, like:
Attention: there are changes in the Backend API:
Missing api/xxx/yyy
Added api/zzz/yyy
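One rough sketch of such a comparison (the file names and the focus on the paths section are assumptions, not taken from the question), diffing the keys of the two specs' paths objects and building the message from the result:

const fs = require('fs');

// Load the previously saved spec and the freshly downloaded one.
const oldSpec = JSON.parse(fs.readFileSync('swagger.json', 'utf8'));
const newSpec = JSON.parse(fs.readFileSync('newSwagger.json', 'utf8'));

const oldPaths = Object.keys(oldSpec.paths || {});
const newPaths = Object.keys(newSpec.paths || {});

// Endpoints present before but gone now, and endpoints that are new.
const missing = oldPaths.filter((p) => !newPaths.includes(p));
const added = newPaths.filter((p) => !oldPaths.includes(p));

if (missing.length || added.length) {
  const message = [
    'Attention: there are changes in the Backend API:',
    ...missing.map((p) => 'Missing ' + p),
    ...added.map((p) => 'Added ' + p),
  ].join('\n');
  console.log(message); // or feed this string into a popup/toast in the UI
}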
I read that createReadStream doesn't load the whole file into memory; instead it works with chunks. However, I have a situation where I am simultaneously writing to and reading from a file. The write finishes first, and then I delete the file from disk. Somehow, the read stream was able to finish reading the whole file without any error.
Does anyone have an explanation for this? Am I wrong to think that streams don't load the file into memory?
Here's the code for writing to a file
const fs = require('fs');
const file = fs.createWriteStream('./bigFile4.txt');

function write(stream, data) {
  if (!stream.write(data))
    return new Promise(resolve => stream.once('drain', resolve));
  return true;
}

(async () => {
  for (let i = 0; i < 1e6; i++) {
    const res = write(file, 'a');
    if (res instanceof Promise)
      await res;
  }
  write(file, 'success');
})();
For reading I used this:
const file = fs.createReadStream('bigFile4.txt');
file.on('data', (chunk) => {
  console.log(chunk.toString());
});
file.on('end', () => {
  console.log('done');
});
At least on UNIX-type OSes, if you open a file and then remove it, the file data will still be available to read until you close the file.
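A small sketch that demonstrates this on Linux/macOS (the file name is just an example): the file is unlinked while a descriptor to it is still open, and its contents can still be read through that descriptor.

const fs = require('fs');

// Create a file, open a descriptor to it, then remove it from the directory.
fs.writeFileSync('demo.txt', 'still readable after unlink');
const fd = fs.openSync('demo.txt', 'r');
fs.unlinkSync('demo.txt'); // directory entry is gone, data blocks are not

// The open descriptor still reaches the file's contents.
const buf = Buffer.alloc(64);
const bytesRead = fs.readSync(fd, buf, 0, buf.length, 0);
console.log(buf.toString('utf8', 0, bytesRead)); // "still readable after unlink"

fs.closeSync(fd); // only now is the data actually released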
I am currently requiring a JSON file which I am reading data from.
var allUORHours = require('./UORHoursAch.json');
How do I then write to the file? The code below doesn't make any changes to the file:
allUORHours.test = {};
You may use the File System API's writeFile():
https://nodejs.org/api/fs.html#fs_fs_writefile_file_data_options_callback
No, of course it doesn't. It just changes the variable's value. To write JSON, you need to convert the object to a JSON string, then write it to a file:
var fs = require('fs');

fs.writeFile('./UORHoursAch.json', JSON.stringify(allUORHours), function (err) {
  if (err) {
    console.log(err);
  } else {
    console.log("Saved");
  }
});
I am very new to Node.js. I think I understand the basics of how it works, but I feel like I am missing something vital about how fs.write and buffers function.
I am trying to send a user-defined variable over socket.io and write it into an HTML file. I have a main site with a button; when clicked, it sends the information to the socket in a variable.
The thing I can't figure out is how to insert the variable into the HTML file.
I can save strings that I type into a file:
(e.g.) var writeBuffer = new Buffer ('13');
But not variables that I put in:
(e.g.) var writeBuffer = new Buffer ($(newval));
I even tried different encodings; I think I am missing something.
Server.js
var newval = "User String";

var fd = fsC.open(fileName, 'rs+', function (error, fd) {
  if (error) { throw error; }

  var writeBuffer = new Buffer($(newval));
  var bufferLength = writeBuffer.length;

  fsC.write(fd, writeBuffer, 0, bufferLength, 937,
    function (error, written) {
      if (error) { throw error; }
      fsC.close(fd, function () {
        console.log('File Closed');
      });
    }
  );
});
If you are using a version of jsdom 4.0.0 or later, it will not work with Node.js. As per the jsdom github readme:
> Note that as of our 4.0.0 release, jsdom no longer works with Node.js™, and instead requires io.js. You are still welcome to install a release in the 3.x series if you use Node.js™.