I'm using ExcelJS to convert an .xlsx file to HTML.
const workbook = new ExcelJS.Workbook();
workbook.xlsx.readFile("./file.xlsx").then(wb => {
  wb.eachSheet(sheet => {
    // Sheet data here
    sheet.eachRow(row => {
      // Row data here
      row.eachCell(cell => {
        // Cell data here
      });
    });
  });
});
I can fetch all the cells with strings, but it completely skips cells with charts. Is there any way I could get the chart data using excel.js or any other library? Would appreciate any help.
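For reference, the string-cell part of that flow can already be turned into an HTML table; the sketch below only illustrates the iteration above (it relies on ExcelJS's cell.text, the string rendering of a cell) and does not touch charts at all.

// Sketch: build an HTML table from the string cells only; charts are not exposed here.
const workbook = new ExcelJS.Workbook();
workbook.xlsx.readFile("./file.xlsx").then(wb => {
  let html = "";
  wb.eachSheet(sheet => {
    html += "<table>";
    sheet.eachRow(row => {
      html += "<tr>";
      row.eachCell(cell => {
        // cell.text is the formatted string value of the cell
        html += `<td>${cell.text}</td>`;
      });
      html += "</tr>";
    });
    html += "</table>";
  });
  console.log(html);
});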
I want to retrieve the data from donation and display it in a table. I was able to retrieve the user data from Users and display it in a table, but now I don't know how to retrieve the data from donation.
This is my database structure in Firebase. Note: All of the data that was entered came from a mobile app created in Android Studio.
This is the code that I made when retrieving the User data.
function AddAllITemsToTable(User) {
  id = 0;
  tbody.innerHTML = "";
  User.forEach(element => {
    AddItemToTable(element.uid, element.fullName, element.organization, element.contactPerson, element.contactNo, element.location, element.emailAddress, element.status);
  });
}

function GetAllDataRealtime() {
  const dbRef = ref(database, 'Users');
  onValue(dbRef, (snapshot) => {
    var Users = [];
    snapshot.forEach(childSnapshot => {
      Users.push(childSnapshot.val());
    });
    AddAllITemsToTable(Users);
  });
}

window.onload = GetAllDataRealtime;
Since you're calling onValue on /Users, you already get all data for all users and all of their donations. To process the donations in your code:
const dbRef = ref(database, 'Users');
onValue(dbRef, (snapshot) => {
  var Users = [];
  snapshot.forEach(userSnapshot => {
    Users.push(userSnapshot.val());
    userSnapshot.child("donation").forEach((donationSnapshot) => {
      console.log(donationSnapshot.key, donationSnapshot.val());
    });
  });
  AddAllITemsToTable(Users);
});
As I said in my comment, I recommend reading the Firebase documentation on structuring data, as the way you nest donations under each user does not follow the guidance on nesting data and keeping your structure flat.
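To make the flat-structure advice concrete, one common layout keeps donations in their own top-level node keyed by a pushed ID, with the owner's uid stored on each donation. This is only an illustrative sketch (node and field names are made up), not a drop-in replacement for the existing structure.

// Hypothetical flattened structure: donations live in their own top-level node,
// so loading Users does not download every donation, and donations can be
// queried by uid when needed.
const exampleStructure = {
  Users: {
    uid_1: { fullName: "…", organization: "…", status: "…" }
  },
  Donations: {
    donationId_1: { uid: "uid_1", item: "…", quantity: 1 },
    donationId_2: { uid: "uid_1", item: "…", quantity: 3 }
  }
};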
I have a file upload system. It reads an Excel file and uploads the data to a database (mongoose). When I console.log(sheetData) it shows an array of arrays of objects. Each inner array contains only 100 objects, and then another array is created. Below is my code and images of the issue.
Code
// Excel upload
const handleExcelSubmit = async (e) => {
  e.preventDefault();
  const reader = new FileReader();
  reader.onload = async (e) => {
    const data = e.target.result;
    const workbook = xlsx.read(data, { type: "binary" });
    const sheetNames = workbook.SheetNames[0];
    const workSheet = workbook.Sheets[sheetNames];
    const sheetData = xlsx.utils.sheet_to_json(workSheet, {
      header: "1",
    });
    const headers = sheetData[0];
    return convertToJson(headers, sheetData); // <--- returns array of arrays???
    dispatch(importExcel(sheetData)); // currently disabled for debugging
  };
  reader.readAsBinaryString(excelFile);
};
// Converts data to json. IDK if this is useful, I received same data without this function
const convertToJson = async (headers, data) => {
  const rows = [];
  data.forEach(async () => {
    let rowData = {};
    rows.forEach(async (element, index) => {
      rowData[headers[index]] = element;
    });
    rows.push(rowData);
  });
  setTableData(rows);
  console.log(tableData);
  return rows;
};
Image - Array of Arrays
Summary
What I actually want is a single array containing all of the objects. Currently it creates two arrays: the first is limited to 100 objects, and after the 99th object a second array starts from the 100th object. Is there an option to create only one array with all 108 objects in it?
(This app is in production, so I have to hide that data. Sorry.)
Thank You
I found the issue: it was in my backend. I'm sending an array of data, so I used a for loop in the backend to iterate over it and upload each item to the database. The mistake was that I was sending a response inside the loop, which broke it. That response was only there to check the backend data and was useless, so I removed it and everything worked. If anyone faces the same issue in the future, check whether your loop is being cut short while it is still running.
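For anyone running into the same symptom, the pattern is roughly the one below. It is a guessed reconstruction assuming an Express + Mongoose backend; the route path and the Item model are placeholders, not the actual code from the question.

// Hypothetical Express + Mongoose route illustrating the mistake described above.
app.post("/import", async (req, res) => {
  const rows = req.body; // array of row objects from the Excel upload

  for (const row of rows) {
    await Item.create(row); // "Item" is a placeholder Mongoose model

    // BUG: responding inside the loop ends the request after the first row.
    // res.json(row);
  }

  // Correct: send a single response once the whole loop has finished.
  res.json({ inserted: rows.length });
});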
I want to create a single web page using HTML and JavaScript in which the user can upload an Excel (.xlsx, .xls) file as input.
After the sheet is uploaded, the page should read it and make changes using some JavaScript logic.
After writing to the sheet, the page should have a download button to download the new Excel file.
Do I need to use Node.js to do that, or is it possible to use a CDN script tag for the ExcelJS library? That isn't working for me (I don't know why):
<script src="https://cdnjs.cloudflare.com/ajax/libs/exceljs/4.2.1/exceljs.min.js" integrity="sha512-DPjFYmSXYGB7/5k/Z4h5iw1i29Vl//jj3I7v79DRy+T0o4KssDku6Hf7ImlIV87KmNIh+euT5H0LJhQmTnbC/A==" crossorigin="anonymous"></script>
Whenever I try to use some function of this library, like the one below, it throws an error:
function excelChange(event) {
  var input = event.target.files[0];
  const workbook = createAndFillWorkbook();
  const sheet = workbook.xlsx.writeFile(input);
  console.log(sheet);
}
What should my approach be to achieve this goal?
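As a side note on the error above: ExcelJS does run from that CDN build, but workbook.xlsx.writeFile is the Node.js API. In the browser the usual route is workbook.xlsx.load to read the uploaded file and workbook.xlsx.writeBuffer plus a Blob to serve the download. The sketch below assumes a file input with the id file-input and an arbitrary cell edit; both are placeholders.

// Sketch: read an uploaded .xlsx in the browser, tweak it, and offer it back as a download.
document.getElementById("file-input").addEventListener("change", async (event) => {
  const file = event.target.files[0];
  const workbook = new ExcelJS.Workbook();

  // Read the uploaded file into the workbook
  await workbook.xlsx.load(await file.arrayBuffer());

  // ...apply whatever JS logic is needed, e.g. edit a cell...
  workbook.worksheets[0].getCell("A1").value = "edited";

  // Write back to a buffer and trigger a download via a Blob URL
  const buffer = await workbook.xlsx.writeBuffer();
  const url = URL.createObjectURL(new Blob([buffer]));
  const link = document.createElement("a");
  link.href = url;
  link.download = "edited.xlsx";
  link.click();
  URL.revokeObjectURL(url);
});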
CSV files (the kind Excel can open and save) are comma separated, like this:
name, age,
john, 33,
bob, 55,
So you could just read it as a string and make an array from it. Then render the array into a table. Do the reverse to make a CSV Excel file.
const excelString = `
name, age,
john, 33,
bob, 55,
`;

function parse(string) {
  return string
    .trim()
    .split("\n")
    .map(row => row.trim().split(",").map(val => val.trim()).filter(str => str));
}

const parsed = parse(excelString);
const [head, ...body] = parsed;

function addElement(tag, data, editable) {
  const elm = document.createElement(tag);
  if (data) elm.textContent = data;
  if (editable) elm.contentEditable = true;
  return elm;
}

document.querySelector("table").appendChild(addElement("tr"));

head.forEach(header => {
  document.querySelector("tr:last-child").appendChild(addElement("th", header, true));
});

body.forEach(row => {
  document.querySelector("table").appendChild(addElement("tr"));
  row.forEach(item => {
    document.querySelector("tr:last-child").appendChild(addElement("td", item, true));
  });
});

function table2excel() {
  return Array.from(document.querySelectorAll("tr"))
    .map(row => Array.from(row.querySelectorAll("th, td")).map(elm => elm.textContent))
    .map(row => row.join(","))
    .join(",\n")
    .concat(",");
}

const newExcel = table2excel();
console.log(newExcel);

document.querySelector("button").addEventListener("click", () => console.log(table2excel()));
<table></table>
<button>Log table</button>
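To cover the upload and download parts of the original question with the same approach, a file input plus a Blob download could be wired up roughly as below; the #csv-input id and the downloadCsv name are placeholders, and the snippet reuses the parse and table2excel helpers above.

// Sketch: read an uploaded CSV as text, and download the edited table as a new CSV.
document.querySelector("#csv-input").addEventListener("change", (event) => {
  const file = event.target.files[0];
  const reader = new FileReader();
  reader.onload = () => {
    const rows = parse(reader.result); // reuse parse() from above
    console.log(rows);                 // ...then render into the table as shown above
  };
  reader.readAsText(file);
});

function downloadCsv() {
  const blob = new Blob([table2excel()], { type: "text/csv" });
  const link = document.createElement("a");
  link.href = URL.createObjectURL(blob);
  link.download = "edited.csv";
  link.click();
  URL.revokeObjectURL(link.href);
}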
I'm trying to filter GeoJSON data that I fetch with $.getJSON, but I'm stuck on how to filter the data down to what I want and then apply that to geojson = data.
Here is the code below:
// Fetch the GeoJSON file
$.getJSON(config.geojson, function (data) {
  geojson = data;
  features = $.map(geojson.features, function (feature) {
    return feature.properties;
  });
  featureLayer.addData(data);
  buildConfig();
});
To filter your features according to a specific property (such as contractor === 'Tilson'), use filter on it:
geojson = data.features.filter(function (feature) {
  return feature.properties.contractor === 'Tilson';
});
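Slotted into the original $.getJSON callback, and keeping the FeatureCollection wrapper so featureLayer.addData still receives a regular GeoJSON object, that might look something like this (just a sketch built from the code in the question; 'Tilson' is the example value from above):

$.getJSON(config.geojson, function (data) {
  // Keep only the features whose contractor property matches
  var filtered = data.features.filter(function (feature) {
    return feature.properties.contractor === 'Tilson';
  });

  // Preserve the FeatureCollection shape for downstream consumers
  geojson = { type: "FeatureCollection", features: filtered };

  features = $.map(geojson.features, function (feature) {
    return feature.properties;
  });

  featureLayer.addData(geojson);
  buildConfig();
});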
I'm working on a CSV uploader that uses PapaParse as its CSV parser. For my CSV, I would like the first column to act as the header for the parsed data, as opposed to the first row. To get the expected outcome, I've had to manually transpose the CSV in an editor before uploading.
The reason for this is that my users find it much easier to edit the CSV when the headers are in the first column and not the first row. Is there a way I can do this in PapaParse (or even JavaScript outside of PapaParse)?
if (file != null) {
  Papa.parse(file, {
    header: true,
    complete: function (results, file) {
      console.log("Parsing complete: ", results, file);
    }
  });
}
I would suggest parsing the CSV with PapaParse and then performing a transpose over the result in plain JS.
Using this method: https://stackoverflow.com/a/4492703/1625793
So it would look like this: transpose(result.data)
-- Update --
const transposed = transpose(result.data);
const headers = transposed.shift();
const res = transposed.map(row =>
  row.reduce((acc, col, ind) => {
    acc[headers[ind]] = col;
    return acc;
  }, {})
);
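For completeness, a minimal transpose helper along the lines of the linked answer could look like this; combined with the snippet above it turns first-column values into keys (this assumes the CSV was parsed without header: true, so result.data is an array of arrays):

// Swap rows and columns of a 2D array: transpose(matrix)[i][j] === matrix[j][i]
function transpose(matrix) {
  return matrix[0].map((_, colIndex) => matrix.map(row => row[colIndex]));
}

// Example: headers in the first column instead of the first row
const data = [
  ["name", "john", "bob"],
  ["age", "33", "55"]
];
console.log(transpose(data));
// [["name", "age"], ["john", "33"], ["bob", "55"]]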