I'm working in NodeJS and I would like to export a JSON-format object to an Excel file.
I am well aware that there are (at least) three npm packages for that purpose, but so far none of these gave me the output I'm dreaming of.
Here is the JavaScript object I have:
var myObject = {
    hashkey1: {
        keyA: dataA1,
        keyB: dataB1
    },
    hashkey2: {
        keyA: dataA2,
        keyB: dataB2
    }
};
The .xls (or .xlsx, or any spreadsheet format) file of my dreams has one line per hashkey. On each line: the first column would be hashkeyX, the second column dataAX, and the third column dataBX.
Is it possible to achieve such a result using available tools, or do I have to code it from scratch? Any advice to get anywhere near this result?
You can write a CSV (comma-separated values) text file without any additional library; files with this extension open in Excel by default.
var fs = require('fs');
// 'flags: w' overwrites any existing file; autoClose closes the descriptor when the stream ends.
var file = fs.createWriteStream('file.csv', { flags: 'w', autoClose: true });
var result = '';
for (var hashkey in myObject)
    // One row per hash key: the key, then keyA's value, then keyB's value.
    // The ';' delimiter suits Excel in many European locales; use ',' otherwise.
    result += hashkey + ';' + myObject[hashkey].keyA + ';' + myObject[hashkey].keyB + '\n';
file.write(result);
file.end();
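If you want a real .xlsx rather than CSV, a minimal sketch using the xlsx (SheetJS) npm package could look like the following; the column names and output file name are my own assumptions:
// Rough sketch using the xlsx (SheetJS) package: npm install xlsx
var XLSX = require('xlsx');
// Flatten the object into one row per hash key (column names are assumptions).
var rows = Object.keys(myObject).map(function (hashkey) {
  return {
    hashkey: hashkey,
    keyA: myObject[hashkey].keyA,
    keyB: myObject[hashkey].keyB
  };
});
var worksheet = XLSX.utils.json_to_sheet(rows);
var workbook = XLSX.utils.book_new();
XLSX.utils.book_append_sheet(workbook, worksheet, 'Sheet1');
XLSX.writeFile(workbook, 'file.xlsx');
XLSX.utils.json_to_sheet turns an array of flat objects into a worksheet with one row per object, which matches the one-line-per-hashkey layout described above.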
I am trying to write a basic JS script in NodeJS. The script should create a folder named responses_timestamp.
I have written the attached script, but when it runs I receive an error that says:
EINVAL: invalid argument.
Any ideas on how to fix this?
Test.js
const fs = require('fs');
const today = new Date();
const date = today.getFullYear()+'-'+(today.getMonth()+1)+'-'+today.getDate();
const time = today.getHours()+":"+today.getMinutes()+":"+today.getSeconds();
const dateTime = date + '_' + time;
const uniqueIdentifier = dateTime;
// const folderName = './responses' + '_' + uniqueIdentifier;
try {
if (!fs.existsSync('./responses' + '_' + uniqueIdentifier)) {
fs.mkdirSync('./responses' + '_' + uniqueIdentifier)
}
} catch (err) {
console.error(err)
}
Probably the problem is the folder name: on Windows, a folder name cannot contain special characters such as the colon (:).
As mentioned, the issue is the colon (:), because Windows (which the path in your screenshot shows you are on) does not allow such special characters in file and folder names.
No solution was given, so I wanted to add that what you're doing can be achieved easily with Moment.js, using two lines of code and substituting a dash for the colon:
const moment = require('moment')
const date = new Date()
// 'mm' is minutes here; 'MM' would repeat the month
const uniqueIdentifier = moment(date).format('YYYY-MM-DD-HH-mm')
console.log('uuid', uniqueIdentifier)
// Result: "uuid" "2021-05-13-11-05"
and even a one-liner:
const uniqueIdentifier = moment(new Date()).format('YYYY-MM-DD-HH-mm')
console.log('uuid', uniqueIdentifier)
// Result: "uuid" "2021-05-13-11-05"
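If you'd rather avoid the dependency altogether, a minimal sketch using only built-ins produces a Windows-safe folder name as well; the exact format below is just an assumption:
// No library needed: ISO timestamp with ':' and '.' replaced by '-'
// e.g. responses_2021-05-13T11-05-42-123Z
const uniqueIdentifier = new Date().toISOString().replace(/[:.]/g, '-')
const folderName = './responses_' + uniqueIdentifier
console.log(folderName)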
Using node.js, I am trying to build an array of objects and write them to a file. To do this, I'm using the built in fs library.
After calling
var file = fs.createWriteStream('arrayOfObjects.json'); and file.write('[') I run several asynchronous functions to eventually append objects like this:
file.write(JSON.stringify(objectToAppend) + ',\n')
I can determine when all of the objects have stopped appending, and this is where I run file.write(']') and file.end(). My problem is that adding the last comma to the end of the last object causes the JSON to be invalid.
It is very difficult to determine where and when the last object is being created due to the asynchronous nature of the script, so I was wondering if there is a way to strip or remove characters from a file-stream. If so, I could do this before adding the last ']' character.
I could do this manually, but I was hoping to pipe this to another application. The only solution I've thought about is using the fs.truncate() function, however this doesn't seem to work for file streams, and neither file.length nor file.length() will give me the length of the contents because it is not a string, so it's difficult to determine how or where to truncate the file.
For now I have just been adding '{}]' to the end of the array to make it valid JSON, but this empty object may cause some problems later.
Also note: the array of objects I am writing in this stream is VERY large, so I would rather not end the stream and re-open the file.
I'd recommend prepending the separator instead, so that you can adjust it dynamically after the first call:
file.write('[\n')
var sep = ""
// "objects" stands in for however your async results are collected
objects.forEach(function (objectToAppend) {
  file.write(sep + JSON.stringify(objectToAppend))
  if (!sep)
    sep = ",\n"
})
// when everything has been appended:
file.write('\n]')
Example using JSONStream:
var JSONStream = require('JSONStream');
var fs = require('fs');
var jsonwriter = JSONStream.stringify();
var file = fs.createWriteStream('arrayOfObjects.json');
// Pipe the JSON data to the file.
jsonwriter.pipe(file);
// Write your objects to the JSON stream.
jsonwriter.write({ foo : 'bar#1' });
jsonwriter.write({ foo : 'bar#2' });
jsonwriter.write({ foo : 'bar#3' });
jsonwriter.write({ foo : 'bar#4' });
// When you're done, end it.
jsonwriter.end();
Here's a snippet incorporating robertklep's answer. It converts a pipe-separated file to JSON:
var fs = require('fs');
var readline = require('readline');
var JSONStream = require('JSONStream');
// Make sure we got a filename on the command line.
if (process.argv.length < 3) {
console.log('Usage: node ' + process.argv[1] + ' FILENAME');
process.exit(1);
}
var filename = process.argv[2];
var outputFilename = filename + '.json';
console.log("Converting psv to json. Please wait.");
var jsonwriter = JSONStream.stringify();
var outputFile = fs.createWriteStream(outputFilename);
jsonwriter.pipe(outputFile);
var rl = readline.createInterface({
  input: fs.createReadStream(filename),
  terminal: false
}).on('line', function (line) {
  console.log('Line: ' + line);
  // Skip the header row.
  if (!/ADDRESS_DETAIL_PID/.test(line)) {
    var split = line.split('|');
    var line_as_json = {
      "address_detail_pid": split[0],
      "flat_type": split[1],
      "flat_number": split[2],
      "level_type": split[3],
      "level_number": split[4],
      "number_first": split[5],
      "street_name": split[6],
      "street_type_code": split[7],
      "locality_name": split[8],
      "state_abbreviation": split[9],
      "postcode": split[10],
      "longitude": split[11],
      "latitude": split[12]
    };
    jsonwriter.write(line_as_json);
  }
}).on('close', () => {
  jsonwriter.end();
  console.log('psv2json complete.');
});
The accepted answer is interesting (prepending the separator) but in my case I have found it easier to append the separator and remove the last character of the file, just as suggested in the question.
This is how you remove the last character of a file with Node.js:
import fs from 'fs'
async function removeLastCharacter(filename) {
const stat = await fs.promises.stat(filename)
const fileSize = stat.size
await fs.promises.truncate(filename, fileSize - 1)
}
Explanation:
fs.promises.stat gives us information about the file; we use its size.
fs.promises.truncate removes everything after a given position from the file.
We use the position fileSize - 1, which is the last character.
Note:
Yes, we need to wait until the stream is closed, but that is fine: truncate and stat are very fast, don't depend on the file size, and never read the file's content.
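As a rough usage sketch, assuming each object was appended as JSON.stringify(obj) + ',\n' (as in the question) and that the write stream has already been closed:
import fs from 'fs'

// Hypothetical helper: strip the trailing ",\n" left by the last object, then close the array.
async function finalizeJsonArray(filename) {
  const stat = await fs.promises.stat(filename)
  await fs.promises.truncate(filename, stat.size - 2) // drop the final ',\n'
  await fs.promises.appendFile(filename, '\n]')
}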
I modified this code to convert JSON to .xls format. The code actually works, but while opening the file in MS Excel 2013, it throws a warning that the file format and extension do not match.
This is what I have so far:
var json3 = { "d": "[{\"Id\":1,\"UserName\":\"Sam Smith\"},{\"Id\":2,\"UserName\":\"Fred Frankly\"},{\"Id\":3,\"UserName\":\"Zachary Zupers\"}]" }
DownloadJSON2CSV(json3.d);
function DownloadJSON2CSV(objArray)
{
var array = typeof objArray != 'object' ? JSON.parse(objArray) : objArray;
var str = '';
for (var i = 0; i < array.length; i++) {
var line = '';
for (var index in array[i]) {
line += array[i][index] + '\t';
}
line = line.slice(0, line.length - 1); // trim the trailing tab (slice returns a new string)
str += line + '\r\n';
}
window.open( "data:application/vnd.ms-excel;charset=utf-8," + escape(str));
}
What am I missing?
The .xls format is a far more complex (and, for that matter, proprietary) file format. The modification you made only changes the MIME type, not the actual content of the file. In other words, the file is still a plain-text CSV-style file inside; you just tricked your browser into thinking it's an XLS file.
More info on MIME types: http://en.wikipedia.org/wiki/Internet_media_type
For a solution to your problem, if you really, REALLY need XLS, the best idea is to find an online service that offers an API which converts CSV to XLS (googling "CSV to XLS online" might help).
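If CSV is acceptable, another way to avoid the extension-mismatch warning is to hand the browser a genuine .csv download instead of a data: URI with an Excel MIME type. A rough sketch (the file name and the Blob approach are my own choices, not from the original code):
// Offer the generated text as a real .csv download with a matching MIME type.
function downloadCsv(str) {
  var blob = new Blob([str], { type: 'text/csv;charset=utf-8' });
  var url = URL.createObjectURL(blob);
  var a = document.createElement('a');
  a.href = url;
  a.download = 'export.csv'; // extension now matches the content
  document.body.appendChild(a);
  a.click();
  document.body.removeChild(a);
  URL.revokeObjectURL(url);
}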
Does anyone know an easy way to change a file extension in Javascript?
For example, I have a variable with "first.docx" but I need to change it to "first.html".
This will change the string containing the file name:
let file = "first.docx";
file = file.substr(0, file.lastIndexOf(".")) + ".html";
For situations where there may not be an extension:
let pos = file.lastIndexOf(".");
file = file.substr(0, pos < 0 ? file.length : pos) + ".html";
In Node.js:
path.join(path.dirname(file), path.basename(file, path.extname(file)) + '.md')
or more readably:
// extension should include the dot, for example '.html'
const path = require('path')

function changeExtension(file, extension) {
  const basename = path.basename(file, path.extname(file))
  return path.join(path.dirname(file), basename + extension)
}
Unlike the accepted answer, this works for edge cases such as if the file doesn't have an extension and one of the parent directories has a dot in their name.
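For example (the sample paths are just illustrative):
console.log(changeExtension('/some.dir/archive.tar.gz', '.zip')) // '/some.dir/archive.tar.zip'
console.log(changeExtension('notes', '.md'))                     // 'notes.md'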
I'd use this:
path.format({ ...path.parse('/path/to/file.txt'), base: '', ext: '.md' })
to change "/path/to/file.txt" to "/path/to/file.md".
file = file.replace(/\.[^.]+$/, '.html');
This probably won't get many upvotes but I couldn't resist.
This code will deal with the edge case where a file might not have an extension already (in which case it will add it). It uses the "tilde trick": ~(-1) is 0, so a missing lastIndexOf match becomes falsy and we fall back to the full file name length.
function changeExt (fileName, newExt) {
var _tmp
return fileName.substr(0, ~(_tmp = fileName.lastIndexOf('.')) ? _tmp : fileName.length) + '.' + newExt
}
EDIT: thanks to @kylemit for a much better gist that uses the same logic in a much, much neater way:
function changeExt(fileName, newExt) {
var pos = fileName.includes(".") ? fileName.lastIndexOf(".") : fileName.length
var fileRoot = fileName.substr(0, pos)
var output = `${fileRoot}.${newExt}`
return output
}
console.log(changeExt("img.jpeg", "jpg")) // img.jpg
console.log(changeExt("img.name.jpeg", "jpg")) // img.name.jpg
console.log(changeExt("host", "csv")) // host.csv
console.log(changeExt(".graphqlrc", "graphqlconfig")) // .graphqlconfig
path.parse("first.docx").name + ".html"
var file = "first.docx";
file = file.split(".");
file = file[0]+".html";
For statistical purposes, I want an extensive analysis of a dataset. I already have a function that exports the data to Excel, but that gives me raw data: 500 lines, 35 columns, and sometimes heaps of text...
Is it possible to include a macro in the function, so that the Excel file comes out ready-made to be analyzed?
I am using ASP, JavaScript, and, at the moment, Excel 2003.
This is the current function (written by one of my predecessors):
function exporttoexcel()
{ // export to excel
  if (tableSortArray.length > 0)
  {
    var t, arr, arrr;
    for (var i = 0; i < tableSortArray.length; i++) {
      arr = tableSortArray[i].toString();
      arrr = arr.split(",");
      if (i == 0) { t = arrr[1]; }
      else { t += ',' + arrr[1]; }
    }
    document.excel.t.value = t;
  }
  // I left out some mumbojumbo about sorting here
  document.excel.submit();
}
I mean macros, so that graphs, as well as some pivot tables, are created "automatically"...
Stolen from mrexcel.com (Google + cut and paste = faster than typing):
' Delete any old stray copies of the module1
On Error Resume Next
Kill ("C:\MrXL1.bas")
On Error GoTo 0
' Export Module 1
ActiveWorkbook.VBProject.VBComponents("module1").Export ("c:\MrXL1.bas")
For x = 1 To 54
ThisBroker = Sheets("BrokerList").Range("A" & x).Value
' customization of plan omitted for brevity
Sheets(Array("Menu", "Plan")).Copy
NBName = ActiveWorkbook.Name
' new book name
' Import Module 1 to this new book
Application.VBE.ActiveVBProject.VBComponents.Import ("c:\MrXL1.bas")
ActiveWorkbook.SaveAs Filename:=ThisBroker
ActiveWorkbook.Close
Next x
Kill ("C:\MrXl1.bas")
Alternatively, you could just set up a master Excel file (say, called "analysis.xls") that references the data in the "data" Excel file. For example, enter this in a cell:
='Z:\excel-data\[Current-data.xls]Sheet1'!$A$1
The user opens the master ("analysis.xls") and it in turn pulls in the values from Z:\excel-data\Current-data.xls; just replace Current-data.xls with new data as needed.