I am converting JSON data to the Excel file format. So far I have been able to create a file with the data.
Now I would like to add a custom message (shown in the image below) in the first row, with the data and its column headers displayed beneath it.
I have used this StackBlitz link as a reference.
How can I achieve this?
New issue: the headers firstName, lastName, email and phone are missing.
I assume that when you say JSON, you mean a JavaScript object that has been parsed from a JSON file.
In my example it's myObject.
We create a worksheet using XLSX.utils.json_to_sheet(myObject);
We add a row to the start of the worksheet using: XLSX.utils.sheet_add_aoa(myWorkSheet, [["Your Message Goes Here"]], { origin: 0 });
This will insert an AOA (array of arrays) starting at the position defined by origin; note that existing cells at that position are overwritten rather than shifted down.
{ origin: 0 } means first row
{ origin: 1 } means 2nd row
{ origin: -1 } means last row
In our case we add just one cell (A1) with the content "Your Message Goes Here".
We merge the cells in the range A1:D1 (4 cells) using myWorkSheet['!merges'] = [{ s: { r: 0, c: 0 }, e: { r: 0, c: 3 } }]; here s and e are the start and end cell addresses, given as zero-based row (r) and column (c) indices.
The rest is self-explanatory, I think.
Here's a working example:
var myObject = [
  { name: "Moran", role: "back" },
  { name: "Alain", role: "front" },
  { name: "Tony", role: "back" },
  { name: "Mike", role: "back" },
  { name: "Abo", role: "back" },
  { name: "Toni", role: "back" }
];
function exportWS() {
  var myFile = "myFile.xlsx";
  var myWorkSheet = XLSX.utils.json_to_sheet(myObject);
  var myWorkBook = XLSX.utils.book_new();
  XLSX.utils.book_append_sheet(myWorkBook, myWorkSheet, "myWorkSheet");
  XLSX.writeFile(myWorkBook, myFile);
}
function exportWSPlus() {
  var myFile = "myFilePlus.xlsx";
  var myWorkSheet = XLSX.utils.json_to_sheet(myObject);
  XLSX.utils.sheet_add_aoa(myWorkSheet, [["Your Message Goes Here"]], { origin: 0 });
  myWorkSheet['!merges'] = [{ s: { r: 0, c: 0 }, e: { r: 0, c: 3 } }]; // merge A1:D1
  var myWorkBook = XLSX.utils.book_new();
  XLSX.utils.book_append_sheet(myWorkBook, myWorkSheet, "myWorkSheet");
  XLSX.writeFile(myWorkBook, myFile);
}
<script src="https://cdnjs.cloudflare.com/ajax/libs/xlsx/0.14.3/xlsx.full.min.js"></script>
<button type="button" onclick="exportWS()">Export Worksheet</button>
<button type="button" onclick="exportWSPlus()">Export Worksheet+</button>
Feel free to ask any questions you may have.
I researched this a lot and finally came up with a solution.
public exportAsExcelFile(json: Array<object>, excelFileName: string): void {
  const worksheet: XLSX.WorkSheet = XLSX.utils.aoa_to_sheet([[`${excelFileName}`]]); // message to display in row 1
  worksheet['!merges'] = [{ s: { r: 0, c: 0 }, e: { r: 0, c: 3 } }]; // merge A1:D1 (s: start, e: end, r: row, c: column)
  XLSX.utils.sheet_add_json(worksheet, json, { origin: 'A2' }); // JSON data (with its headers) starts at A2

  // Capitalize the generated column headers, which sit in row 2
  const range = XLSX.utils.decode_range(worksheet['!ref']);
  for (let C = range.s.c; C <= range.e.c; ++C) {
    const address = XLSX.utils.encode_col(C) + '2';
    if (!worksheet[address]) continue;
    worksheet[address].v = worksheet[address].v.charAt(0).toUpperCase() + worksheet[address].v.substr(1).toLowerCase();
  }

  const workbook: XLSX.WorkBook = { Sheets: { 'data': worksheet }, SheetNames: ['data'] };
  const excelBuffer: any = XLSX.write(workbook, { bookType: 'xlsx', type: 'array' });
}
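The buffer computed above is never actually saved anywhere. Here is a minimal sketch of how it could be written out from the browser at the end of the method, assuming the file-saver package is available (the MIME type below is the standard one for .xlsx files):
import { saveAs } from 'file-saver';

const EXCEL_TYPE = 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet';

// Wrap the ArrayBuffer produced by XLSX.write in a Blob and trigger the download
const blob = new Blob([excelBuffer], { type: EXCEL_TYPE });
saveAs(blob, `${excelFileName}.xlsx`);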
Related
I'm using exceljs and I need to sum all the values in a column. How can I do this?
In an issue on GitHub I found a solution, but it does not work for me:
workSheet.getCell(`B${endRow}`).value = { formula: `SUM(B4:B${endRow-1})` };
because VS Code throws: Type '{ formula: string; }' is not assignable to type 'CellValue'. Type '{ formula: string; }' is missing the following properties from type 'CellSharedFormulaValue': sharedFormula, date1904
Can somebody tell me how to sum all the values in a column?
Ciao, try to modify your code like this:
workSheet.getCell(`B${endRow}`).value = { formula: `SUM(B4:B${endRow-1})`, date1904: false };
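If the typing of the formula object keeps getting in the way, another option is to compute the total in JavaScript and write a plain number into the cell instead of a formula. A rough sketch, reusing the column letter B and start row 4 from the question:
// Sum the values of column B from row 4 up to (but not including) endRow
let total = 0;
for (let r = 4; r < endRow; r++) {
  total += Number(workSheet.getCell(`B${r}`).value) || 0;
}
workSheet.getCell(`B${endRow}`).value = total; // a plain number needs no formula typing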
I'm a bit late; I'm adding this for anyone who comes across the same issue.
Point number one: your array object values must be of type number, not string.
I created a method to do that conversion for me, convertStringToNumber(data).
Example data
[
  { ItemPrice: 69.99, name: "Kellogs Cornflakes", brand: "Kellogs", Quantity_Purchased: 2, QaunititySaleValue: 139.98 },
  { ItemPrice: 19.99, name: "Castle Lite", brand: "Castle", Quantity_Purchased: 2, QaunititySaleValue: 39.98 }
]
Code
const Excel = require('exceljs'); // assumed import at the top of the file; Excel.Workbook is used below

async createExcel(data, fileName) {
  let xlsData = this.convertStringToNumber(data);
  const fs = require('fs');
  const workbook = new Excel.Workbook();
  const worksheet = workbook.addWorksheet(fileName);
  worksheet.columns = [
    { header: 'name', key: 'name', width: 10 },
    { header: 'brand', key: 'brand', width: 32 },
    { header: 'Quantity_Purchased', key: 'Quantity_Purchased', width: 15 },
    { header: 'ItemPrice', key: 'ItemPrice', width: 15 },
    { header: 'QaunititySaleValue', key: 'QaunititySaleValue', width: 15 }
  ];
  worksheet.addRows(xlsData);

  // Write SUM formulas in the row right after the last data row
  const endRow = worksheet.lastRow.number + 1;
  worksheet.getCell(`C${endRow}`).value = { formula: `SUM(C2:C${endRow - 1})` };
  worksheet.getCell(`D${endRow}`).value = { formula: `SUM(D2:D${endRow - 1})` };
  worksheet.getCell(`E${endRow}`).value = { formula: `SUM(E2:E${endRow - 1})` };

  // save as <fileName>.xlsx
  let buffResult = await workbook.xlsx.writeBuffer();
  fs.writeFileSync(fileName + ".xlsx", buffResult);
}

convertStringToNumber(objects) {
  for (var i = 0; i < objects.length; i++) {
    var obj = objects[i];
    for (var prop in obj) {
      // coerce numeric strings ("69.99") to real numbers so SUM works
      if (obj.hasOwnProperty(prop) && obj[prop] !== null && !isNaN(obj[prop])) {
        obj[prop] = +obj[prop];
      }
    }
  }
  return objects;
}
Output
I have a function which writes values from an array in some response to a CSV file. Before the data, I need a header line with the field name for each tab-separated column, so I have written the headers with csv += '\t header 1', csv += '\t header 2', and so on.
Here is my block of code:
function exportToCsv(fName, rows) {
  var csv = 'branch';
  csv += '\t customerId';
  csv += '\t customerName';
  csv += '\t LOAN ID/UT unique ID';
  csv += '\n';
  for (var i = 0; i < rows.length; i++) {
    var row = Object.values(rows[i]);
    for (var j = 0; j < row.length; j++) {
      var val = '';
      val = row[j] === null ? '' : row[j].toString();
      if (j > 0)
        csv += '\t';
      csv += val;
    }
    csv += '\n';
  }
}
Is there a more efficient way to write those header lines in the above function? The current code works, but I'm looking for a cleaner way to replace them.
Also note that I have only listed a few header names here; I actually have 20-30 header fields.
Please share your thoughts.
If the keys in your row object are the same as the headers, you can just use .join to string them together into a CSV header line. Otherwise, you can use a mapping object to convert each row key into the appropriate header for the CSV. For example:
const row = {
  branch: 'Main',
  customerId: 45,
  customerName: 'Bill',
  'LOAN ID/UT unique ID': 'X456Y01'
};

let csvheader = Object.keys(row).join('\t');
console.log(csvheader);

const row2 = {
  branch: 'Main',
  Id: 45,
  Name: 'Bill',
  LoanId: 'X456Y01'
};

const map = {
  branch: 'branch',
  Id: 'customerId',
  Name: 'customerName',
  LoanId: 'LOAN ID/UT unique ID'
};

csvheader = Object.keys(row2).map(v => map[v]).join('\t');
console.log(csvheader);
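The same mapping idea extends to the data lines, so the whole export can be built with map and join instead of nested loops. A minimal sketch reworking the question's exportToCsv (it assumes rows is non-empty and every row object has the same keys):
function exportToCsv(fName, rows) {
  // Header line from the keys of the first row (or map them through a lookup object as above)
  var csv = Object.keys(rows[0]).join('\t') + '\n';
  // One tab-separated line per row, with nulls rendered as empty strings
  csv += rows.map(function (r) {
    return Object.values(r).map(function (v) {
      return v === null ? '' : v.toString();
    }).join('\t');
  }).join('\n');
  return csv + '\n';
}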
I'm not sure what you mean by efficient (processing time, fewer lines of code, readability, etc.).
For reference, here's a library that is commonly used as a convenience for processing/generating CSV files. I think it can also be bundled for use in browser JavaScript, not just Node.js.
https://csv.js.org/stringify/api/
It has options for things like adding headers or changing delimiters.
Sample code from their site:
const stringify = require('csv-stringify')
const assert = require('assert')

stringify([
  [ '1', '2', '3', '4' ],
  [ 'a', 'b', 'c', 'd' ]
], function(err, output){
  assert.equal(output, '1,2,3,4\na,b,c,d\n')
});
With the columns option (source: https://csv.js.org/stringify/options/columns/):
stringify([
  { a: '1', b: '2' }
], {
  columns: [ { key: 'a' }, { key: 'b' } ]
}, function(err, data){
  assert.equal(data, '1,2\n')
})
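Applied to the tab-separated export from the question, it could look roughly like this; a hedged sketch using the library's header, delimiter and columns options, with rows being the array of row objects:
const stringify = require('csv-stringify')

stringify(rows, {
  header: true,        // emit a header line first
  delimiter: '\t',     // tab-separated, like the original exportToCsv
  columns: ['branch', 'customerId', 'customerName', 'LOAN ID/UT unique ID']
}, function (err, output) {
  if (err) throw err;
  console.log(output); // or write it to a file
});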
I have this in the drilldown event of my Highcharts chart, and it works correctly.
if (!e.seriesOptions) {
  var s = e.point.name;
  var chart = this,
    drilldowns = {
      'SAR': {
        name: 'SAR',
        data: yearData,
      }
    },
    series = drilldowns[e.point.name];
  chart.addSeriesAsDrilldown(e.point, series);
}
but when I replace the string 'SAR' with e.point.name:
if (!e.seriesOptions) {
  var s = e.point.name;
  var chart = this,
    drilldowns = {
      s: {
        name: s,
        data: yearData,
      }
    },
    series = drilldowns[e.point.name];
  chart.addSeriesAsDrilldown(e.point, series);
}
it does not show any drilldown data, even though e.point.name contains the string 'SAR'.
You cannot create a JS object the way you intend to:
var s = 'SAR',
  drilldowns = {
    s: {
      name: s,
      data: [],
    }
  };
will create an object drilldowns with the key s instead of SAR:
{s: {name: "SAR", data: [] }}
You can, however, use a string as a key with bracket notation:
var s = 'SAR',
  drilldowns = {};

drilldowns[s] = {
  name: s,
  data: []
};
will create a drilldowns object with the right key for you:
{SAR: {name: "SAR", data: []}}
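As a side note, ES2015 computed property names let you build the key inline in the object literal as well, which maps directly onto the drilldown snippet from the question (e, yearData and chart are the same variables used there):
var s = e.point.name;
var drilldowns = {
  [s]: {           // computed property name: the key is the value of s, e.g. 'SAR'
    name: s,
    data: yearData
  }
};
var series = drilldowns[e.point.name];
chart.addSeriesAsDrilldown(e.point, series);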
I am trying to classify objects I receive from clients.
On the server side, I have defined my "blueprints" like so:
{ // "type1"
  type: 1,
  name: String,
  password: String
}

{ // "type2"
  type: 2,
  user_id: Number,
  action: String
}

{ // "type3", and yes, this says type: 2....
  type: 2,
  object_id: Number,
  action: String
}
Based on what the client sends I want to classify them like this:
{ type: 1, name: 'user', password: 'pass' } // -> type1
{ type: 1, name: 'user', password: 'pass', remember_me: true } // -> type1
{ type: 2, name: 'user', password: 'pass' } // -> N/A
{ type: 2, user_id: 5, action: 'hello' } // -> type2
{ type: 2, object_id: 5, action: 'hello' } // -> type3
The identification would need to be based on the key names, the data types of the values, and the actual values. There will be thousands of objects sent per second, and there may be thousands of blueprints. Therefore, it would be nice if it could be done in < O(n), where n is the number of blueprints.
I am writing this from scratch, so the blueprints and metadata can be stored in any data structures needed.
Thanks for the help. I look forward to hearing ideas on this.
Random thought on an approach that might reduce complexity:
The real limiting factor here is going to be how well you can reduce the set of candidate types. The most obvious approach would be to do something based only on the keys of the object. The problem with extra keys in the data is that we can't rely on just Object.keys( data ).sort().join(","); we must also try every combination of the keys we DO have.
// Assuming the "types" list is called "types":
// using the underscore.js api
var _ = require('underscore');

var keyMap = _.chain( types ).map(function( typeDef, typeIndex ) {
  // keep the original index alongside the definition so we can report which type matched
  return { index: typeIndex, def: typeDef };
}).groupBy(function( data ) {
  return _.keys( data.def ).sort().join(",");
}).value();

// empty map needed
keyMap[""] = [];

// assumes a sorted key list
function getPossibleMaps( keys ) {
  // if we have a map for this, use it
  if ( keyMap[ keys.join(",") ] ) {
    return keyMap[ keys.join(",") ];
  } else {
    // create a map of possible types by removing every key from the list of keys
    // and then looking for maps that match, caching our result
    return keyMap[ keys.join(",") ] = recursiveMapTest( keys );
  }
}

function recursiveMapTest( keys ) {
  return _.chain( keys )
    .map(function( key ) {
      return getPossibleMaps( _.without( keys, key ) );
    }).flatten().value();
}

// we must also include "lesser" definitions for each of the key lists we found:
_.each( keyMap, function( results, index ) {
  var keys = index.split(",");
  keyMap[index] = results.concat( recursiveMapTest( keys ) );
});

function getType( data ) {
  function checkType( typeData ) {
    var def = typeData.def;
    return _.every( def, function( value, key ) {
      // these checks are probably not quite right
      if ( value === null ) {
        return true;
      } else if ( value === Number ) {
        return typeof data[key] === "number" || data[key] instanceof Number;
      } else if ( value === String ) {
        return typeof data[key] === "string" || data[key] instanceof String;
      } else {
        return data[ key ] === value;
      }
    });
  }
  var match = _.find( getPossibleMaps( _.keys( data ).sort() ), checkType );
  return match && match.index;
}
// Retrieve
var clientTypes = [
  { type: 1, name: 'user', password: 'pass' },
  { type: 2, name: 'user', password: 'pass' },
  { type: 2, user_id: 5, action: 'hello' },
  { type: 2, object_id: 5, action: 'hello' },
  { type: 1, name: 'user', password: 'pass', remember_me: true }
];

console.log('Client types:');
for (var i = 0; i < clientTypes.length; i++) {
  var type = clientTypes[i];
  // The type object from the map
  console.log("getType", type, getType(type));
}
jsbin
Granted, this just means that the more possible incoming key lists there are, the more memory you consume storing the "quick" lookup tables.
Also, if everything has a numeric type, you can obviously use that to quickly narrow down the possible "object types" within that subtype.
I think your best bet would be to avoid needing to do any of this in the first place. Pass better type hints with your objects.
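For illustration only, here is a minimal sketch of what such a hint could look like, assuming the client can be changed to send a hypothetical blueprintId field alongside its payload:
// Server-side registry keyed by the hypothetical blueprintId the client sends
var blueprints = {
  type1: { type: 1, name: String, password: String },
  type2: { type: 2, user_id: Number, action: String },
  type3: { type: 2, object_id: Number, action: String }
};

function classify(message) {
  // O(1) lookup; fall back to structural matching only when the hint is missing
  return blueprints[message.blueprintId] || null;
}

console.log(classify({ blueprintId: 'type2', type: 2, user_id: 5, action: 'hello' }));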
The blueprints will be sent in either an object or an array. If you can send them as an object, use the type IDs as keys and the type objects as values. When determining the type, you can then access that key in O(1) time.
Even if you receive the types as an array, an O(n) pass will allow you to store them in an object internally and use it as a hash table to retrieve the required type information at runtime.
If you can't rely on the type itself as a key, generate a unique key for each type and use this same function for retrieval.
var types = [
  { // Will refer to this JSON object as type1
    type: 1,
    name: String,
    password: String
  },
  { // type2
    type: 2,
    user_id: Number,
    action: String
  },
  { // type3
    type: 2,
    object_id: Number,
    action: String
  }
];
console.log(types);

// Prepare map
var typeMap = {};
for (var i = 0; i < types.length; i++) {
  var type = types[i];
  typeMap[typeKey(type)] = type;
}
console.log(typeMap);

function typeKey(type) {
  var key = '';
  for (var i in type) {
    if (i == 'type') {
      key += type[i].toString() + ':';
    }
    key += ':' + i;
  }
  return key;
}

function getType(type) {
  return typeMap[typeKey(type)];
}

// Retrieve
var clientTypes = [
  { type: 1, name: 'user', password: 'pass' },
  { type: 2, name: 'user', password: 'pass' },
  { type: 2, user_id: 5, action: 'hello' },
  { type: 2, object_id: 5, action: 'hello' }
];
console.log('Client types:');
for (var i = 0; i < clientTypes.length; i++) {
  var type = clientTypes[i];
  // The type object from the map
  console.log(getType(type));
}
If a type isn't found for the "client" type, then undefined is returned from getType.
http://jsfiddle.net/Kt2sq/1
Output:
Client types:
Object {type: 1, name: function, password: function}
undefined
Object {type: 2, user_id: function, action: function}
Object {type: 2, object_id: function, action: function}
You can do it like this:
var obj1 = { type: 1, name: 'user', password: 'pass' };
var obj2 = { type: 2, name: 'user', password: 'pass' };

// match JSON keys
var keys1 = Object.keys(obj1);
var keys2 = Object.keys(obj2);
if (JSON.stringify(keys1) === JSON.stringify(keys2))
  console.log("matched all keys");

// match JSON value datatypes
for (var key in obj1) {
  if (typeof(obj1[key]) == typeof(obj2[key]))
    console.log(key + ' data type matched');
}

// match 'type' field
if (obj1.type == obj2.type)
  console.log("woooo total match");
Here is the time complexity:
Key match is O(n)
Field data type match is O(n)
Type field check is O(1)
So the total is O(n) if the JSON keys come in the same order; otherwise sorting them will take additional time.
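A minimal sketch wrapping those three checks into one helper that compares a client object against a blueprint (where the blueprint's values are the String/Number constructors); the function name is mine:
function matchesBlueprint(obj, blueprint) {
  // 1. Key names must match (order-sensitive, as in the comparison above): O(n)
  if (JSON.stringify(Object.keys(obj)) !== JSON.stringify(Object.keys(blueprint))) return false;
  // 2. The 'type' field must match: O(1)
  if (obj.type !== blueprint.type) return false;
  // 3. Value data types must match the blueprint's constructors: O(n)
  for (var key in blueprint) {
    if (key === 'type') continue;
    if (blueprint[key] === String && typeof obj[key] !== 'string') return false;
    if (blueprint[key] === Number && typeof obj[key] !== 'number') return false;
  }
  return true;
}

console.log(matchesBlueprint({ type: 2, user_id: 5, action: 'hello' },
                             { type: 2, user_id: Number, action: String })); // true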
So I have an object that I'm using in Node.js. It is built like this:
for (var i = 0; i < x.length; i++) {
  var sUser = x[i];
  mUsers[sUser.userid] = CreateUser(sUser);
  ++mUsers.length;
}
So I'm pulling information from an external source, and it breaks down as an array full of instances of this:
[{ name: 'Michael Lovesllamas Lankford',
   created: 1338420951.11,
   laptop: 'pc',
   laptop_version: null,
   userid: '4fc6aed7eb35c14ad6000057',
   acl: 0,
   fans: 1,
   points: 5,
   avatarid: 34 }]
and so forth.
So that information is passed as x in the function above.
global.mUsers = { length: 0 };

global.UserBase = {
  userid: -1,
  name: "noidea",
  isSuperUser: false,
  isDJ: false,
  laptop: "pc"
};

process.on("registered", OnRegistered);

global.OnRegistered = function(a) {
  // misc code here
  RegisterUsers(a.users);
  // misc code here
};

global.CreateUser = function(a) {
  var b = UserBase;
  b.userid = a.userid;
  b.name = a.name;
  b.laptop = a.laptop;
  if (a.acl > 0) b.isSuperUser = true;
  return b;
};

global.RegisterUsers = function(x) {
  for (var i = 0; i < x.length; i++) {
    var sUser = x[i];
    mUsers[sUser.userid] = sUser;
    ++mUsers.length;
  }
};
Now, I've logged it inside the loop, and mUsers[sUser.userid] does indeed equal sUser.
But when I console.log(mUsers) immediately after the loop, I get this:
{
  userid1: { userid: userid3, name: name3, item: item3 },
  userid2: { userid: userid3, name: name3, item: item3 },
  userid3: { userid: userid3, name: name3, item: item3 }
}
And I don't know why it's overwriting. Any ideas?
The main problem is that you were continuously referencing the same object each time you called CreateUser; it simply kept updating and returning the same reference throughout all the calls, which is why printing it only showed the last update.
You need to create a copy of the object.
global.CreateUser = function(a) {
  var b = Object.create(UserBase); // new object with UserBase as its prototype
  b.userid = a.userid;
  b.name = a.name;
  b.laptop = a.laptop;
  if (a.acl > 0) b.isSuperUser = true;
  return b;
};
Now CreateUser actually creates a new object each time. When you go through its properties, the default ones may not appear right away, but they are still there: they have simply been moved to __proto__ (the prototype), and you can still access them.
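If you would rather have the defaults show up as own properties of each user (so they appear directly when logging), a shallow copy works too. A small sketch, not part of the original answer, using Object.assign:
global.CreateUser = function(a) {
  // Start from a fresh object and copy UserBase's defaults onto it as own properties
  var b = Object.assign({}, UserBase);
  b.userid = a.userid;
  b.name = a.name;
  b.laptop = a.laptop;
  if (a.acl > 0) b.isSuperUser = true;
  return b;
};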
Try the below; it works for me:
var obj = {
  userid1: { userid: "userid1", name: "name3", item: "item3" },
  userid2: { userid: "userid2", name: "name3", item: "item3" },
  userid3: { userid: "userid3", name: "name3", item: "item3" }
};

var muser = {};
for (var key in obj) {
  muser[key] = obj[key];
}