I need to convert an Excel file with multiple worksheets to JSON, and I found the following script to do so. However, it just console-logs each sheet, and I want an array in which each element is a sheet. I have tried initializing an array and pushing oJS to it every time the forEach callback runs, but it doesn't work :(
function filePicked(oEvent) {
  // Get The File From The Input
  var oFile = oEvent.target.files[0];
  var sFilename = oFile.name;
  // Create A File Reader HTML5
  var reader = new FileReader();
  // Ready The Event For When A File Gets Selected
  reader.onload = function(e) {
    var data = e.target.result;
    var cfb = XLS.CFB.read(data, {type: 'binary'});
    var wb = XLS.parse_xlscfb(cfb);
    // Loop Over Each Sheet
    wb.SheetNames.forEach(function(sheetName) {
      // Obtain The Current Sheet As CSV And As An Array Of Row Objects
      var sCSV = XLS.utils.make_csv(wb.Sheets[sheetName]);
      var oJS = XLS.utils.sheet_to_row_object_array(wb.Sheets[sheetName]);
      $("#my_file_output").html(sCSV);
      console.log(oJS);
    });
  };
  // Tell JS To Start Reading The File.. You could delay this if desired
  // (readAsBinaryString returns undefined, so there is no return value to collect here)
  reader.readAsBinaryString(oFile);
}
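One possible fix (a minimal sketch reusing the same XLS helpers; the allSheets name is my own choice): declare the array before the forEach and only use it inside reader.onload, because the read is asynchronous and the array will still be empty if you inspect it outside the onload callback.

reader.onload = function(e) {
  var data = e.target.result;
  var cfb = XLS.CFB.read(data, {type: 'binary'});
  var wb = XLS.parse_xlscfb(cfb);
  var allSheets = []; // one element per sheet
  wb.SheetNames.forEach(function(sheetName) {
    var oJS = XLS.utils.sheet_to_row_object_array(wb.Sheets[sheetName]);
    allSheets.push({ name: sheetName, rows: oJS });
  });
  console.log(allSheets); // the complete array is only available here, after the loop
};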
Related
I am getting a JSON back from an API and want to add this data as a new column to an Excel file that already has some data. I wanted to ask that is this possible using just frontend Javascript (without involving Node.js)? If yes, then how?
Yes, you can do it using the library exceljs
Github: https://github.com/exceljs/exceljs
NPM: https://www.npmjs.com/package/exceljs
CDN: https://cdn.jsdelivr.net/npm/exceljs@1.13.0/dist/exceljs.min.js
<input type="file" onchange="parseChoosenExcelFile(this)">
function parseChoosenExcelFile(inputElement) {
  var files = inputElement.files || [];
  if (!files.length) return;
  var file = files[0];
  console.time();
  var reader = new FileReader();
  reader.onloadend = function(event) {
    var arrayBuffer = reader.result;
    // var buffer = Buffer.from(arrayBuffer) // Node.js alternative
    // debugger
    var workbook = new ExcelJS.Workbook();
    // workbook.xlsx.read(buffer)
    workbook.xlsx.load(arrayBuffer).then(function(workbook) {
      console.timeEnd();
      var result = '';
      workbook.worksheets.forEach(function(sheet) {
        sheet.eachRow(function(row, rowNumber) {
          result += row.values + ' | \n';
        });
      });
      // Output your result somewhere, e.g.
      // result2.innerHTML = result
    });
  };
  reader.readAsArrayBuffer(file);
}
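For the "add a new column" part, here is a rough sketch of one approach (the column letter 'E', the header text, and the apiData array are placeholders of my own, not part of the library): write the values into an unused column, then serialize the workbook back to a downloadable file with writeBuffer.

// Assumes `workbook` is the loaded ExcelJS workbook from above and `apiData`
// is an array of values returned by your API (one value per data row).
var sheet = workbook.worksheets[0];
sheet.getCell('E1').value = 'API data';           // header for the new column
apiData.forEach(function(value, index) {
  sheet.getCell('E' + (index + 2)).value = value; // rows 2..n, below the header
});
workbook.xlsx.writeBuffer().then(function(buffer) {
  var blob = new Blob([buffer], { type: 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet' });
  var link = document.createElement('a');
  link.href = URL.createObjectURL(blob);
  link.download = 'updated.xlsx';
  link.click();
});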
I’m trying to attach a file to an email I send in Google Apps Script with MailApp.sendEmail(). In the browser JavaScript I read in the file manually with this code because I’ve already processed the equivalent of the submit button in my HTML form, and it works:
var file = document.getElementById('myfile');
var fileInfo = [];
if (file.files.length) {                             // if there is at least 1 file
  if (file.files[0].size < maxEmailAttachmentSize) { // and its size is < 25M
    var reader = new FileReader();
    reader.onload = function(e) {
      fileInfo[0] = e.target.result;
    };
    reader.readAsBinaryString(file.files[0]);
    fileInfo[1] = file.files[0].name;
    fileInfo[2] = file.files[0].type;
    fileInfo[3] = file.files[0].size;
  }
  console.log(fileInfo); // here I see the full file and info. All looks correct.
}
Then I send it up to the server.
google.script.run.withSuccessHandler(emailSent).sendAnEmail(fileInfo);
On the server I pull out the fields and send the email like so:
var fname = fileInfo[1];
var mimeType = fileInfo[2];
var fblob = Utilities.newBlob(fileInfo[0], mimeType, fname);
// all looks right in the Logger at this point
try {
  GmailApp.sendEmail(emaiRecipient, emailSubject, emailBody,
    {
      name: 'Email Sender', // email sender
      attachments: [fblob]
    }
  );
} catch …
This works fine when the file is a text or HTML file, but not when the file is anything else: the attachment is sent, but it's empty and apparently corrupt. Can anyone see what's wrong with this code? (It doesn't work with MailApp.sendEmail() either.) I did see in another Stack Overflow post that the document has to be saved once, but that is something I definitely don't want to do. Isn't there any other way? What am I missing?
Modification points:
FileReader works asynchronously. This has already been mentioned in Rubén's comment.
When the binary data is sent to the Google Apps Script side, at the current stage it seems that it needs to be converted to a string or a byte array. This has already been mentioned in TheMaster's comment.
In order to keep using your Google Apps Script as it is, I think that converting the file content to a byte array (via Int8Array) is suitable in this case.
For this, I used readAsArrayBuffer instead of readAsBinaryString.
When the above points are reflected in your script, it becomes as follows.
Modified script:
HTML&Javascript side:
// Added
function getFile(file) {
  return new Promise((resolve, reject) => {
    var reader = new FileReader();
    reader.onload = (e) => resolve([...new Int8Array(e.target.result)]);
    reader.onerror = (err) => reject(err);
    reader.readAsArrayBuffer(file);
  });
}
async function main() { // <--- Modified
  var file = document.getElementById('myfile');
  var fileInfo = [];
  if (file.files.length) {
    if (file.files[0].size < maxEmailAttachmentSize) {
      fileInfo[0] = await getFile(file.files[0]); // <--- Modified
      fileInfo[1] = file.files[0].name;
      fileInfo[2] = file.files[0].type;
      fileInfo[3] = file.files[0].size;
    }
    console.log(fileInfo); // here I see the full file and info. All looks correct.
    google.script.run.withSuccessHandler(emailSent).sendAnEmail(fileInfo);
  }
}
Although I'm not sure about your whole script from your question, the modified script assumes that main() is run. When main() runs, the file is converted to a byte array and stored in fileInfo[0].
On the Google Apps Script side, var fblob = Utilities.newBlob(fileInfo[0], mimeType, fname); then builds the correct blob from that fileInfo.
Google Apps Script side:
In this modification, your Google Apps Script is not modified.
References:
FileReader
FileReader.readAsArrayBuffer()
Added:
This code looks good but we can't use it because we are running on the Rhino JavaScript engine, not V8. We don't have support for newer JavaScript syntax. Could you give us an example of how it's done with older syntax? Ref
From your above comment, I modified as follows.
Modified script:
HTML&Javascript side:
function main() {
  var file = document.getElementById('myfile');
  var fileInfo = [];
  if (file.files.length) {
    if (file.files[0].size < maxEmailAttachmentSize) {
      var reader = new FileReader();
      reader.onload = function(e) {
        var bytes = new Int8Array(e.target.result);
        var ar = [];
        for (var i = 0; i < bytes.length; i++) {
          ar.push(bytes[i]);
        }
        fileInfo[0] = ar;
        fileInfo[1] = file.files[0].name;
        fileInfo[2] = file.files[0].type;
        fileInfo[3] = file.files[0].size;
        console.log(fileInfo); // here I see the full file and info. All looks correct.
        google.script.run.withSuccessHandler(emailSent).sendAnEmail(fileInfo);
      };
      reader.onerror = function(err) {
        console.error(err); // there is no Promise here, so just report the error
      };
      reader.readAsArrayBuffer(file.files[0]);
    }
  }
}
Google Apps Script side:
In this modification, your Google Apps Script is not modified.
I have a for loop iterating over a number of files.
I have to read the first line of each file and add it to a Map, with the file name as the key and the first line of that file as the value.
I am using FileReader to read the file, but it is asynchronous.
When I open a stream to read the file, the loop gets incremented before I am done reading the file and adding my desired entry to the map.
I need a synchronous operation, i.e. read the first line, add it to the Map, and then increment the loop and proceed with the next file.
for (var i = 0; i < files.length; i++) {
  var file = files[i];
  var reader = new FileReader();
  reader.onload = function(progressEvent) {
    var lines = progressEvent.target.result.split('\n');
    firstLine = lines[0];
    alert('FirstLine' + firstLine);
    //add to Map here
  };
  reader.readAsText(file);
}
How can I modify the code to achieve the functionality described above?
You can use promises and let them run in the order you create them using reduce.
The below code shows how it could be done this way, and you can take a look at this simple JSFiddle that demos the idea.
//create a function that returns a promise
function readFileAndAddToMap(file) {
  return new Promise(function(resolve, reject) {
    var reader = new FileReader();
    reader.onload = function(progressEvent) {
      var lines = progressEvent.target.result.split('\n');
      firstLine = lines[0];
      console.log('FirstLine' + firstLine);
      //add to Map here
      resolve();
    };
    reader.onerror = function(error) {
      reject(error);
    };
    reader.readAsText(file);
  });
}
//create an array to hold your promises
var promises = [];
for (var i = 0; i < files.length; i++) {
  //push to the array
  promises.push(readFileAndAddToMap(files[i]));
}
//use reduce to wait for the promises in the order of the array
//(the callback must return the next promise, otherwise the chain would not wait for it)
promises.reduce(function(cur, next) {
  return cur.then(function() { return next; });
}, Promise.resolve()).then(function() {
  //all files read and executed!
}).catch(function(error) {
  //handle potential error
});
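As a variation on the same idea (a sketch, assuming the same files list; readFirstLine, fileMap, and the resolved [name, firstLine] pairs are my own naming), each promise can resolve with the file name and first line, and the Map can be built once everything has finished:

function readFirstLine(file) {
  return new Promise(function(resolve, reject) {
    var reader = new FileReader();
    reader.onload = function(progressEvent) {
      var firstLine = progressEvent.target.result.split('\n')[0];
      resolve([file.name, firstLine]); // resolve with a [key, value] pair
    };
    reader.onerror = reject;
    reader.readAsText(file);
  });
}

Promise.all(Array.prototype.map.call(files, readFirstLine))
  .then(function(pairs) {
    var fileMap = new Map(pairs); // file name -> first line of that file
    console.log(fileMap);
  })
  .catch(function(error) {
    //handle potential error
  });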
I was facing the same issue. What I did was remove the for loop and use a recursive function instead; that way I was able to control the sequence of the FileReader calls.
Below I have modified your code based on that approach. Feel free to ask any questions in the comments if you need more clarity.
var attachmentI = { i: files.length };

function UploadMe() {
  attachmentI.i--;
  if (attachmentI.i > -1) {
    var file = files[attachmentI.i];
    var reader = new FileReader();
    reader.onload = function(progressEvent) {
      var lines = progressEvent.target.result.split('\n');
      firstLine = lines[0];
      alert('FirstLine' + firstLine);
      //add to Map here
      UploadMe(); // read the next file only after this one has finished
    };
    reader.readAsText(file);
  }
}
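To kick things off, call UploadMe() once after the files have been selected. Note that because the counter is decremented before each read, the files are processed in reverse order (last file first).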
I am trying to run Protractor with multiCapabilities (around 30 browsers with different versions).
The data sheet is an xlsx file with one sheet, which will be consumed. After each run, the corresponding xlsx row will be updated to mark that it has been 'USED'.
I use exceljs to write the flag, but it throws an error if the file has already been opened by another process. I handled the exception, but instead of failing I would like the multiple processes to wait and retry accessing the file.
Please suggest how to read/write one xlsx file at the same time from multiple processes, where each process has to wait for any previous process to complete its access.
Write Function:
updateRunnerXLS: function(fileName) {
  var Excel = require('exceljs');      // Require the exceljs package
  var workbook = new Excel.Workbook(); // Create a workbook reader
  workbook.xlsx.readFile(fileName)
    .then(function(workbook) {
      var rowValues = [];              // Initialize empty array
      var worksheet = workbook.getWorksheet('sheet2');
      var nameCol = worksheet.getColumn('M');
      nameCol.eachCell(function(cell, rowNumber) { // Loop through the cells
        if (cell.value == 'Y') {       // Get all rows with Y
          rowValues.push(rowNumber);   // Fill the array with row numbers
        }
      });
      var row = worksheet.getRow(rowValues[0]); // Get the first row from the array
      row.getCell('M').value = 'X';    // Update the first row that has a Y to X
      row.commit();                    // Commit the row change
      workbook.xlsx.writeFile(fileName).then(function() {
        //done
      }).catch(function(err) {
        console.log('Handled the error - ' + err);
      });
    });
}
Read Function: (simpler than reading with exceljs)
var Parser = require('parse-xlsx');
sheet = new Parser(testDataFolder + 'dataSheet.xlsx', 'sheet2');
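One way to make each process wait and retry is to wrap the exceljs update in a retry loop with a delay. Below is a minimal sketch of that idea (the tryUpdate name, the retries/delayMs values, and treating any readFile/writeFile rejection as "file busy" are my own assumptions, not part of the original code):

function tryUpdate(fileName, retries, delayMs) {
  var Excel = require('exceljs');
  var workbook = new Excel.Workbook();
  return workbook.xlsx.readFile(fileName)
    .then(function(workbook) {
      var worksheet = workbook.getWorksheet('sheet2');
      // ... same column M / 'Y' -> 'X' update logic as in updateRunnerXLS above ...
      return workbook.xlsx.writeFile(fileName);
    })
    .catch(function(err) {
      if (retries <= 0) throw err;           // give up after the last attempt
      console.log('File busy, retrying in ' + delayMs + ' ms - ' + err);
      return new Promise(function(resolve) { // wait, then try again
        setTimeout(resolve, delayMs);
      }).then(function() {
        return tryUpdate(fileName, retries - 1, delayMs);
      });
    });
}

// Example: up to 10 attempts, 2 seconds apart
tryUpdate(testDataFolder + 'dataSheet.xlsx', 10, 2000);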
I'm trying to make an HTML5 web page that stores the sample data from a WAV file in an array. Is there any way to get the sample data with JavaScript?
I'm using a file-input to select the wav-file.
In the javascript I already added:
document.getElementById('fileinput').addEventListener('change', readFile, false);
but I have no idea what to do in readFile.
EDIT:
I tried to get the file into an ArrayBuffer, pass it to the decodeAudioData method, and get a typed array out of it.
This is my code:
var openFile = function(event) {
  var input = event.target;
  var audioContext = new AudioContext();
  var reader = new FileReader();
  reader.onload = function() {
    var arrayBuffer = reader.result;
    console.log("arrayBuffer:");
    console.log(arrayBuffer);
    audioContext.decodeAudioData(arrayBuffer, decodedDone);
  };
  reader.readAsArrayBuffer(input.files[0]);
};

function decodedDone(decoded) {
  var typedArray = new Uint32Array(decoded, 1, decoded.length);
  console.log("decoded");
  console.log(decoded);
  console.log("typedArray");
  console.log(typedArray);
  for (var i = 0; i < 10; i++) {
    console.log(typedArray[i]);
  }
}
The elements of typedArray are all 0. Is my way of creating the typedArray wrong, or did I do something else wrong?
EDIT:
I finally got it. This is my code:
var openFile = function(event) {
  var input = event.target;
  var audioContext = new AudioContext();
  var reader = new FileReader();
  reader.onload = function() {
    var arrayBuffer = reader.result;
    console.log("arrayBuffer:");
    console.log(arrayBuffer);
    audioContext.decodeAudioData(arrayBuffer, decodedDone);
  };
  reader.readAsArrayBuffer(input.files[0]);
};

function decodedDone(decoded) {
  var typedArray = decoded.getChannelData(0); // Float32Array of samples for channel 0
  console.log("typedArray:");
  console.log(typedArray);
}
Thanks for the answers!
You'll need to learn a lot about Web APIs to accomplish that, but in the end it's quite simple.
Get the file contents in an ArrayBuffer with the File API
Pass it to the Web Audio API's decodeAudioData method.
Get a typed ArrayBuffer with the raw samples you wanted.
Edit: If you want to implement an equalizer, you're wasting your time; the Web Audio API has a native filter node (BiquadFilterNode) you can use for that. Depending on the length of your wave file, it might be better not to load it all into memory, and instead just create a source that reads from it and connect that source to an equalizer node.
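A minimal sketch of that approach, assuming an <audio> element with id "player" as the streaming source (the element id and the filter settings are placeholders):

var audioCtx = new AudioContext();
var source = audioCtx.createMediaElementSource(document.getElementById('player'));

// One band of an equalizer: a peaking filter boosting around 1 kHz
var band = audioCtx.createBiquadFilter();
band.type = 'peaking';
band.frequency.value = 1000; // center frequency in Hz
band.gain.value = 6;         // boost in dB
band.Q.value = 1;            // width of the band

// source -> filter -> speakers
source.connect(band);
band.connect(audioCtx.destination);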
Here's a simple code example to get a Float32Array from a wav audio file in JavaScript:
let audioData = await fetch("https://example.com/test.wav").then(r => r.arrayBuffer());
let audioCtx = new AudioContext({sampleRate:44100});
let decodedData = await audioCtx.decodeAudioData(audioData); // audio is resampled to the AudioContext's sampling rate
console.log(decodedData.length, decodedData.duration, decodedData.sampleRate, decodedData.numberOfChannels);
let float32Data = decodedData.getChannelData(0); // Float32Array for channel 0
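getChannelData(0) returns one sample per frame for that channel, with values nominally in the range -1.0 to 1.0.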