This question already has answers here:
Download JSON object as a file from browser
(14 answers)
Closed 3 years ago.
I want to generate a .json file in my local project folder. I want to save the fetch API call response (which is an array of objects) into a .json file. Here is my code:
ts :
getRecords() {
  this.service.getRecords()
    .subscribe((res) => {
      // res.data - want to convert this data into a .json file
    });
}
I'm not sure if you want to do this locally in the browser (i.e., so the user can download a JSON file), or whether you want to do this using Node.js, so here are both approaches.
I mention Node.js since you tagged this with Angular.
If you are using Node.js, you can use the fs module:
let json = {"foo": "bar"};
let str = JSON.stringify(json);
let fs = require("fs");
fs.writeFile("jsonFile.json", str, function(error) {
  if (error) {
    console.log("Error");
  } else {
    console.log("Success");
  }
});
But if you want to create a file from the browser and download on the client machine:
let json = {"foo": "bar"};
let str = JSON.stringify(json);
let dataUri = "data:text/json;charset=utf-8," + encodeURIComponent(str);
let dl = document.createElement("a");
dl.setAttribute("href", dataUri);
dl.setAttribute("download", "jsonFile.json");
dl.click();
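As an alternative sketch for the browser case, building a Blob and an object URL avoids the length limits that very long data URIs can run into with large JSON (the element handling mirrors the snippet above; the DOM part is guarded so the pure parts also run elsewhere):

```javascript
// Blob + object URL instead of a data URI (better suited to large payloads).
const payload = { foo: "bar" };
const blob = new Blob([JSON.stringify(payload)], { type: "application/json" });

if (typeof document !== "undefined") { // browser-only part
  const url = URL.createObjectURL(blob);
  const dl = document.createElement("a");
  dl.setAttribute("href", url);
  dl.setAttribute("download", "jsonFile.json");
  document.body.appendChild(dl); // Firefox requires the link to be in the DOM
  dl.click();
  dl.remove();
  URL.revokeObjectURL(url); // release the object URL afterwards
}
```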
Also, your code did not get added to your question.
Related
I want to change my JSON file, or add an element to my JSON file, but to the real file. I tried this code, but it doesn't change the real file; only the data in memory changes while the tab is open. How can I apply the change to the real file? It is not a user file but a server file, though I tried it locally.
let xmlreq = new XMLHttpRequest()
xmlreq.open("GET", "users.json", true)
function test() {
  const obj = JSON.parse(xmlreq.responseText);
  console.log(obj);
  obj.user1.name = "john";
  console.log('obj.user1.name: ', obj.user1.name);
  obj.user2.push("item");
  console.log('obj.user2: ', obj.user2);
}
xmlreq.onload = test // without this, test() never runs
xmlreq.send()
Another attempt:
let xmlreq = new XMLHttpRequest()
function test() {
  const obj = JSON.parse(xmlreq.responseText);
  console.log(obj);
  obj.user1.name = "john";
  console.log('obj.user1.name: ', obj.user1.name);
  obj.user2.push("item");
  console.log('obj.user2: ', obj.user2);
}
xmlreq.open("GET", "users.json", true)
xmlreq.onload = test // without this, test() never runs
xmlreq.send()
First you have to use the File API to load the file.
https://developer.mozilla.org/en-US/docs/Web/API/File
Then you have to parse the JSON data.
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/parse
Then you can do your modifications.
But you cannot modify files on your local disc directly. Instead, you have to download the modified data so it can overwrite the original file.
To do so, you have to create a data URL from your JSON data.
https://developer.mozilla.org/en-US/docs/Web/HTTP/Basics_of_HTTP/Data_URIs
And finally you can create a link to download the new JSON.
https://stackoverflow.com/a/15832662/402322
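Putting the linked steps together, a rough sketch could look like this (the input element id and the property names are assumptions for illustration, not from the question):

```javascript
// 1) load the file (File API), 2) parse, 3) modify, 4) offer the result as a download
function buildDownloadUrl(obj) {
  return "data:text/json;charset=utf-8," + encodeURIComponent(JSON.stringify(obj));
}

function downloadJson(obj, filename) {
  const a = document.createElement("a");
  a.href = buildDownloadUrl(obj);
  a.download = filename;
  document.body.appendChild(a); // Firefox needs the link attached to the DOM
  a.click();
  a.remove();
}

if (typeof document !== "undefined") { // browser-only wiring
  document.getElementById("file-input").addEventListener("change", (e) => {
    const reader = new FileReader();
    reader.onload = () => {
      const obj = JSON.parse(reader.result); // step 2
      obj.user1.name = "john";               // step 3: your modifications
      downloadJson(obj, "users.json");       // step 4: "overwrite" by re-downloading
    };
    reader.readAsText(e.target.files[0]);    // step 1
  });
}
```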
I have a dummy CSV file uploaded in my Node.js application. All I want is to convert the CSV to a JSON file.
But it gives a syntax error just from requiring the CSV file.
Below is the Node.js code to parse the file content:
const csv = require('csv-parser')
const results = [];
let csvToJson = require('convert-csv-to-json');
let fileInputName = require('./fileUpload/testingCSV.csv');
let fileOutputName = 'myproduct.json';
const fs = require('fs');
fs.createReadStream(fileInputName)
  .pipe(csv())
  .on('data', (data) => results.push(data))
  .on('end', () => {
    console.log(results);
  });
csvToJson.generateJsonFileFromCsv(fileInputName,fileOutputName);
Below is the screenshot of the console error.
Also, the actual CSV file will contain a large amount of data: products/accounts/user profiles etc.
This is a dummy CSV that I am trying to parse.
Please let me know what needs to be done.
I think your CSV file is not correctly formatted. What is the name of your Node.js file, and can you give us the error at line 14, please?
EDIT:
You are probably using an old version of Node.js. If this is the case, change const csv = require('csv-parser') to const csv = require('csv-parser/lib/es5').
Or
it may be the CSV file path that is incorrect. Pass the path directly to createReadStream, as a string, not via require().
I have a really huge JSON object (created with a JavaScript parser called espree; it contains an array of objects). I want to write it to a .json file, but it fails every time with memory allocation problems (my heap size is 22 GB).
As far as I understand, the buffer gets overloaded while the data is not yet written to the file.
If I use synchronous file operations only, the output gets written to the file, but the running time of my application explodes.
Solutions I have tried without success (first serializing the whole object, then serializing the items of the array one by one):
JSON.stringify
JSONStream
big-json (which should serialize the object as a stream, but the buffer still gets overloaded)
watching for drain events
Here is the current code:
const fs = require('fs');
const bjson = require('big-json');

function save(result) {
  let outputStream = fs.createWriteStream(/*path*/);
  const stringifyStream = bjson.createStringifyStream({
    body: result
  });
  stringifyStream.on('data', function (chunk) {
    // respect backpressure: pause the source until the write buffer drains
    if (!outputStream.write(chunk)) {
      stringifyStream.pause();
      outputStream.once('drain', () => stringifyStream.resume());
    }
  });
  stringifyStream.on('end', function () {
    outputStream.end();
  });
}

let results = [/*results as an array, containing lots of json objects*/];
for (let i = 0; i < results.length; i++) {
  save(results[i]);
}
The performance issue comes from the JSON-to-string transformation. I had the same issue and solved it by storing the data in msgpack format.
As explained here, I installed msgpack-lite in my project with npm:
npm install msgpack-lite
I wrote this for storing my JSON object:
var fs = require("fs");
var msgpack = require("msgpack-lite");
var writeStream = fs.createWriteStream("file.msp");
var encodeStream = msgpack.createEncodeStream();
encodeStream.pipe(writeStream);
// send multiple objects to stream
encodeStream.write(myBigObject);
// call this once you're done writing to the stream.
encodeStream.end();
And this for reading my file and restoring my object. I don't know why it doesn't work with streams:
var fs = require("fs");
var msgpack = require("msgpack-lite");
var buffer = fs.readFileSync("file.msp");
var myBigObject = msgpack.decode(buffer);
Hi all, I am loading an XML file with the file name 'Attachments.xml' using the following code:
var attachmentsXml = XmlBuddy.load(file, 'Attachments.xml');
Now I want to load the XML file with the file name 'attachments.xml', but I am unable to. Can anyone suggest how to do this?
Most likely you are using Node.js, so you can use the xml2json package:
const fs = require('fs');
const parser = require('xml2json');
fs.readFile('./Attachments.xml', function(err, data) {
  var json = parser.toJson(data);
  console.log("to json ->", json);
});
I just want to store my JSON data in a file in a particular directory using JS. I cannot see the created file when using the following code:
var jsonse = JSON.stringify(submitted);
var blob = new Blob([jsonse], {type: "application/json"});
var file = new File([blob], "" + workerID + ".json")
JS Documentation Link would also suffice.
Assuming you're not using a web browser, which cannot write to your file system for (hopefully obvious) security reasons; that's another question.
You can redirect output from your script to a file.
node yourfile.js > output_file.json
Or you can use the fs module.
Writing files in Node.js
// note: jsonse is the JSON string from above
var fs = require('fs');
fs.writeFile("/tmp/test", jsonse, function(err) {
  if (err) {
    return console.log(err);
  }
  console.log("The file was saved!");
});