Problem transferring a JSON string from the client to the ESP8266 server - javascript

As part of my WLAN Thermometer project, I am planning a small file management feature for the files stored on the ESP. In this context, I need to transfer a list of file names from the client to the server. Since the wonderful ArduinoJson library is available for the ESP8266, I would like to pass the data as a JSON object. The first excerpt from my webpage's scripts.js shows how the file list (containing all files available in the ESP filesystem) is created, and how the delete list (whose elements should be deleted) is compiled and transferred.
let fileID = 0;
for (let i = 2; i < FDatas.length; i += 2) {
    let fileInfo = {
        name: FDatas[i],
        size: FDatas[i + 1],
        fileID: fileID,
        marked: false
    };
    fileList.push(fileInfo);
    fileID += 1;
}
}
function deleteFiles() {
    let deleteFileList = [];
    let fileID = 0;
    for (let i = 0; i < fileList.length; i++) {
        if (fileList[i].marked == true) {
            let keyname = 'fileID_' + String(fileID);
            fileID += 1;
            let newEntry = {[keyname]: fileList[i].name};
            deleteFileList.push(newEntry);
        }
    }
    if (deleteFileList.length > 0) {
        var xhttp = new XMLHttpRequest();
        var formData = JSON.stringify(deleteFileList);
        xhttp.open("POST", "/deleteFiles");
        xhttp.send(formData);
    }
}
On the server side, communication is organized as follows:
In the setup part of the Arduino code:
webserver.on("/deleteFiles", HTTP_POST, deleteFiles);
In the handler:
void deleteFiles() {
    String input = webserver.arg("plain");
    Serial.println(input);
    DynamicJsonDocument doc(2048);
    DeserializationError err = deserializeJson(doc, input);
    if (err) {
        Serial.println(F("deserializeJson() failed with code "));
        Serial.println(err.f_str());
    }
    JsonObject obj = doc.as<JsonObject>();
    // Loop through all the key-value pairs in obj
    for (JsonPair p : obj) {
        Serial.println(p.key().c_str());
        if (p.value().is<const char*>()) {
            auto s = p.value().as<const char*>();
            Serial.println(s);
        }
    }
    webserver.send(200);
}
The result of these efforts is sobering. The Serial.println(input); command does output the following,
[{"fileID_0":"/settings.json"},{"fileID_1":"/tdata.js"},{"fileID_2":"/scripts.js"}]
but iterating over the JSON document does not produce any key-value pairs.
Where is my mistake? Thank you very much for your good advice.
1. Update:
After the first comment (thank you) I've changed the Arduino code to:
void deleteFiles() {
    String input = webserver.arg("plain");
    Serial.println(input);
    DynamicJsonDocument doc(2048);
    DeserializationError err = deserializeJson(doc, input);
    if (err) {
        Serial.println(F("deserializeJson() failed with code "));
        Serial.println(err.f_str());
    }
    JsonArray arr = doc.to<JsonArray>();
    for (JsonVariant v : arr) {
        Serial.println(v.as<const char*>());
    }
    webserver.send(200);
}
Unfortunately, the result is the same. No result in the loop.

Your JSON payload consists of an array (each element of an array is indexed by a number: 0, 1, 2, ...), and within the array there are these 3 JSON objects. So to access the data of each array element, you use doc[0] and so on. You can then access each key-value pair with the doc[0]["key"] notation.
StaticJsonDocument<192> doc;
DeserializationError error = deserializeJson(doc, input);
if (error) {
    Serial.print(F("deserializeJson() failed: "));
    Serial.println(error.f_str());
    return;
}
const char* element1 = doc[0]["fileID_0"]; // "/settings.json"
const char* element2 = doc[1]["fileID_1"]; // "/tdata.js"
const char* element3 = doc[2]["fileID_2"]; // "/scripts.js"

Related

How to correctly parse Json Objects from text Stream delimited by new lines

I have written some code to read JSON objects from a stream returned by a fetch API. However, I am unable to parse the JSON objects successfully. I want to return the JSON objects from the chunks in an array and feed the results to a transformer that will apply some custom transformation to the final JSON. My main problem is dealing with the partial chunks that remain after splitting each chunk on newlines; I don't know exactly what the chunking format of the data returned in the stream looks like. I have written some code which does not work entirely, and I am looking for some help to get it parsing out all the JSON objects. Here is my code that attempts to do that. First, I wrote an async iterator that works fine. Next, I use that to split the chunks into new lines.
function streamAsyncIterator(stream) {
    const reader = stream.pipeThrough(new TextDecoderStream()).getReader();
    return {
        next() {
            return reader.read();
        },
        return() {
            reader.releaseLock();
            return {};
        },
        [Symbol.asyncIterator]() {
            return this;
        }
    };
}
Next, I use the iterator to get lines to convert to JSON objects. Here is that code, which partially works:
const fetchProductsStream = async () => {
    const body = {
        searchRange: ['fragrance', 'perfumery', 'cologne'],
        exclusionCategory: ['discounted']
    }
    let time1 = performance.now();
    const response = await fetch('/products/stream', {
        method: 'POST',
        body: JSON.stringify(body)
    });
    let chunkCount = 0;
    let remaining = "";
    let currentChunk = "";
    let totalItems = 0;
    for await (const chunk of streamAsyncIterator(response.body)) {
        let items = [];
        try {
            currentChunk = chunk;
            let lastIndex;
            if (remaining.trim().length > 1) {
                currentChunk = remaining + chunk;
            }
            currentChunk.split(/\r?\n/).forEach((line, index) => {
                try {
                    items.push(JSON.parse(line));
                    console.log(`Pushed line ${line} to items array`);
                    lastIndex = index;
                } catch (err) {
                    console.error(`Error is ${err}`);
                }
            })
            totalItems += items.length;
            remaining = "";
            let i = currentChunk.indexOf(lastIndex);
            if (i != -1) {
                remaining = currentChunk.substring(lastIndex);
            }
            items = [];
        } catch (error) {
            console.log(error);
        }
        chunkCount++;
    }
    let time2 = performance.now();
    console.log(`Total number of items is ${totalItems}`);
    console.log(`Total number of chunks :: ${chunkCount} in time ${(time2 - time1) / 1000} seconds`)
}
Here is what a sample looks like
"{"productId":"1671","category":"fragrance","categoryId":1WW23,"productName":"Paco Rabane","productClass":"M","warehouseCode":"1242","alternateName":"ALT_ELL332"}\n{"productId":"1671","category":"fragrance","categoryId":1WW23,"productName":"Paco Rabane","productClass":"M","warehouseCode":"1242","alternateName":"ALT_ELL332"}\n{"productId":"1671","category":"fragrance","categoryId":1WW23,"productName":"Paco Rabane","productClass":"M","warehouseCode":"1242","alternateName":"ALT_ELL332","category":{"desc":"1654 Cologne Perfume","code":"221","feature":"1","salesCode":"S2237"}\n{"productId":"1671","category":"fragrance","categoryId":1WW23,"productName":"Paco Rabane","productClass":"M","warehouseCode":"1242","alternateName":"ALT_ELL332"}"
This is my attempt, which does not work perfectly, since there are some json objects that are not correctly parsed. Most of them are, so I'd appreciate some help in getting it to work perfectly. If there is a better way to do it, I will also be grateful for that solution.
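One common way to handle the partial-line problem is to buffer everything after the last newline and only parse complete lines. Below is a minimal sketch of that idea (it reuses the streamAsyncIterator helper above; the function name and the final flush step are illustrative additions, not part of the original code):
const parseNdjsonStream = async (stream) => {
    const items = [];
    let remaining = '';
    for await (const chunk of streamAsyncIterator(stream)) {
        const text = remaining + chunk;
        const lines = text.split(/\r?\n/);
        // The last element may be an incomplete JSON line; keep it for the next chunk.
        remaining = lines.pop();
        for (const line of lines) {
            if (line.trim()) items.push(JSON.parse(line));
        }
    }
    // Flush whatever is left once the stream ends.
    if (remaining.trim()) items.push(JSON.parse(remaining));
    return items;
};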

Split a Javascript string by comma, but ignore commas that would be inside a string [duplicate]

My CSV data looks like this:
heading1,heading2,heading3,heading4,heading5
value1_1,value2_1,value3_1,value4_1,value5_1
value1_2,value2_2,value3_2,value4_2,value5_2
...
How do you read this data and convert to an array like this using JavaScript?:
[
heading1: value1_1,
heading2: value2_1,
heading3: value3_1,
heading4: value4_1
heading5: value5_1
],[
heading1: value1_2,
heading2: value2_2,
heading3: value3_2,
heading4: value4_2,
heading5: value5_2
]
....
I've tried this code but no luck!:
<script type="text/javascript">
var allText =[];
var allTextLines = [];
var Lines = [];
var txtFile = new XMLHttpRequest();
txtFile.open("GET", "file://d:/data.txt", true);
txtFile.onreadystatechange = function()
{
allText = txtFile.responseText;
allTextLines = allText.split(/\r\n|\n/);
};
document.write(allTextLines);
document.write(allText);
document.write(txtFile);
</script>
No need to write your own...
The jQuery-CSV library has a function called $.csv.toObjects(csv) that does the mapping automatically.
Note: The library is designed to handle any CSV data that is RFC 4180 compliant, including all of the nasty edge cases that most 'simple' solutions overlook.
Like #Blazemonger already stated, first you need to add line breaks to make the data valid CSV.
Using the following dataset:
heading1,heading2,heading3,heading4,heading5
value1_1,value2_1,value3_1,value4_1,value5_1
value1_2,value2_2,value3_2,value4_2,value5_2
Use the code:
var data = $.csv.toObjects(csv);
The output saved in 'data' will be:
[
{ heading1:"value1_1",heading2:"value2_1",heading3:"value3_1",heading4:"value4_1",heading5:"value5_1" }
{ heading1:"value1_2",heading2:"value2_2",heading3:"value3_2",heading4:"value4_2",heading5:"value5_2" }
]
Note: Technically, the way you wrote the key-value mapping is invalid JavaScript. The objects containing the key-value pairs should be wrapped in curly braces ({}).
If you want to try it out for yourself, I suggest you take a look at the Basic Usage Demonstration under the 'toObjects()' tab.
Disclaimer: I'm the original author of jQuery-CSV.
Update:
Edited to use the dataset that the op provided and included a link to the demo where the data can be tested for validity.
Update2:
Due to the shuttering of Google Code, jquery-csv has moved to GitHub.
NOTE: I concocted this solution before I was reminded about all the "special cases" that can occur in a valid CSV file, like escaped quotes. I'm leaving my answer for those who want something quick and dirty, but I recommend Evan's answer for accuracy.
This code will work when your data.txt file is one long string of comma-separated entries, with no newlines:
data.txt:
heading1,heading2,heading3,heading4,heading5,value1_1,...,value5_2
javascript:
$(document).ready(function() {
    $.ajax({
        type: "GET",
        url: "data.txt",
        dataType: "text",
        success: function(data) { processData(data); }
    });
});

function processData(allText) {
    var record_num = 5; // or however many elements there are in each row
    var allTextLines = allText.split(/\r\n|\n/);
    var entries = allTextLines[0].split(',');
    var lines = [];
    var headings = entries.splice(0, record_num);
    while (entries.length > 0) {
        var tarr = [];
        for (var j = 0; j < record_num; j++) {
            tarr.push(headings[j] + ":" + entries.shift());
        }
        lines.push(tarr);
    }
    // alert(lines);
}
The following code will work on a "true" CSV file with linebreaks between each set of records:
data.txt:
heading1,heading2,heading3,heading4,heading5
value1_1,value2_1,value3_1,value4_1,value5_1
value1_2,value2_2,value3_2,value4_2,value5_2
javascript:
$(document).ready(function() {
    $.ajax({
        type: "GET",
        url: "data.txt",
        dataType: "text",
        success: function(data) { processData(data); }
    });
});

function processData(allText) {
    var allTextLines = allText.split(/\r\n|\n/);
    var headers = allTextLines[0].split(',');
    var lines = [];
    for (var i=1; i<allTextLines.length; i++) {
        var data = allTextLines[i].split(',');
        if (data.length == headers.length) {
            var tarr = [];
            for (var j = 0; j < headers.length; j++) {
                tarr.push(headers[j] + ":" + data[j]);
            }
            lines.push(tarr);
        }
    }
    // alert(lines);
}
http://jsfiddle.net/mblase75/dcqxr/
Don't split on commas -- it won't work for most CSV files, and this question has wayyyy too many views for the asker's kind of input data to apply to everyone. Parsing CSV is kind of scary since there's no truly official standard, and lots of delimited text writers don't consider edge cases.
This question is old, but I believe there's a better solution now that Papa Parse is available. It's a library I wrote, with help from contributors, that parses CSV text or files. It's the only JS library I know of that supports files gigabytes in size. It also handles malformed input gracefully.
1 GB file parsed in 1 minute:
(Update: With Papa Parse 4, the same file took only about 30 seconds in Firefox. Papa Parse 4 is now the fastest known CSV parser for the browser.)
Parsing text is very easy:
var data = Papa.parse(csvString);
Parsing files is also easy:
Papa.parse(file, {
    complete: function(results) {
        console.log(results);
    }
});
Streaming files is similar (here's an example that streams a remote file):
Papa.parse("http://example.com/bigfoo.csv", {
download: true,
step: function(row) {
console.log("Row:", row.data);
},
complete: function() {
console.log("All done!");
}
});
If your web page locks up during parsing, Papa can use web workers to keep your web site reactive.
Papa can auto-detect delimiters and match values up with header columns, if a header row is present. It can also turn numeric values into actual number types. It appropriately parses line breaks and quotes and other weird situations, and even handles malformed input as robustly as possible. I've drawn on inspiration from existing libraries to make Papa, so props to other JS implementations.
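For illustration, those options look roughly like this (a sketch using Papa Parse's documented config flags; csvString is a placeholder for your input):
Papa.parse(csvString, {
    header: true,         // match values up with the header row's column names
    dynamicTyping: true,  // turn numeric values into actual number types
    skipEmptyLines: true,
    worker: true,         // parse in a web worker so the page stays responsive
    complete: function(results) {
        console.log(results.data);   // array of row objects
        console.log(results.errors); // malformed rows are reported here
    }
});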
I am using d3.js for parsing the CSV file. It is very easy to use.
Here are the docs.
Steps:
npm install d3-request
Using ES6:
import { csv } from 'd3-request';
import url from 'path/to/data.csv';
csv(url, function(err, data) {
    console.log(data);
})
Please see docs for more.
Update: d3-request is deprecated; you can use d3-fetch instead.
Here's a JavaScript function that parses CSV data, accounting for commas found inside quotes.
// Parse a CSV row, accounting for commas inside quotes
function parse(row) {
    var insideQuote = false,
        entries = [],
        entry = [];
    row.split('').forEach(function(character) {
        if (character === '"') {
            insideQuote = !insideQuote;
        } else {
            if (character == "," && !insideQuote) {
                entries.push(entry.join(''));
                entry = [];
            } else {
                entry.push(character);
            }
        }
    });
    entries.push(entry.join(''));
    return entries;
}
Example use of the function to parse a CSV file that looks like this:
"foo, the column",bar
2,3
"4, the value",5
into arrays:
// csv could contain the content read from a csv file
var csv = '"foo, the column",bar\n2,3\n"4, the value",5',
    // Split the input into lines
    lines = csv.split('\n'),
    // Extract column names from the first line
    columnNamesLine = lines[0],
    columnNames = parse(columnNamesLine),
    // Extract data from subsequent lines
    dataLines = lines.slice(1),
    data = dataLines.map(parse);

// Prints ["foo, the column","bar"]
console.log(JSON.stringify(columnNames));
// Prints [["2","3"],["4, the value","5"]]
console.log(JSON.stringify(data));
Here's how you can transform the data into objects, like D3's csv parser (which is a solid third party solution):
var dataObjects = data.map(function(arr) {
    var dataObject = {};
    columnNames.forEach(function(columnName, i) {
        dataObject[columnName] = arr[i];
    });
    return dataObject;
});

// Prints [{"foo":"2","bar":"3"},{"foo":"4","bar":"5"}]
console.log(JSON.stringify(dataObjects));
Here's a working fiddle of this code.
Enjoy! --Curran
You can use PapaParse to help.
https://www.papaparse.com/
Here is a CodePen.
https://codepen.io/sandro-wiggers/pen/VxrxNJ
Papa.parse(e, {
    header: true,
    before: function(file, inputElem) { console.log('Attempting to Parse...') },
    error: function(err, file, inputElem, reason) { console.log(err); },
    complete: function(results, file) { $.PAYLOAD = results; }
});
If you want to solve this without using Ajax, use the FileReader() Web API.
Example implementation:
Select .csv file
See output
function readSingleFile(e) {
    var file = e.target.files[0];
    if (!file) {
        return;
    }
    var reader = new FileReader();
    reader.onload = function(e) {
        var contents = e.target.result;
        displayContents(contents);
        displayParsed(contents);
    };
    reader.readAsText(file);
}

function displayContents(contents) {
    var element = document.getElementById('file-content');
    element.textContent = contents;
}

function displayParsed(contents) {
    const element = document.getElementById('file-parsed');
    const json = contents.split(',');
    element.textContent = JSON.stringify(json);
}

document.getElementById('file-input').addEventListener('change', readSingleFile, false);
<input type="file" id="file-input" />
<h3>Raw contents of the file:</h3>
<pre id="file-content">No data yet.</pre>
<h3>Parsed file contents:</h3>
<pre id="file-parsed">No data yet.</pre>
function CSVParse(csvFile) {
    this.rows = [];
    var fieldRegEx = new RegExp('(?:\\s*"((?:""|[^"])*)"\\s*|\\s*((?:""|[^",\\r\\n])*(?:""|[^"\\s,\\r\\n]))?\\s*)(,|[\\r\\n]+|$)', "g");
    var row = [];
    var currMatch = null;
    while (currMatch = fieldRegEx.exec(csvFile)) {
        row.push([currMatch[1], currMatch[2]].join('')); // concatenate with potential nulls
        if (currMatch[3] != ',') {
            this.rows.push(row);
            row = [];
        }
        if (currMatch[3].length == 0)
            break;
    }
}
I like to have the regex do as much as possible. This regex treats all items as either quoted or unquoted, followed by either a column delimiter, or a row delimiter. Or the end of text.
Which is why that last condition -- without it it would be an infinite loop since the pattern can match a zero length field (totally valid in csv). But since $ is a zero length assertion, it won't progress to a non match and end the loop.
And FYI, I had to make the second alternative exclude quotes surrounding the value; seems like it was executing before the first alternative on my javascript engine and considering the quotes as part of the unquoted value. I won't ask -- just got it to work.
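A quick usage sketch (the input string and variable names here are illustrative, not part of the original answer):
var csvText = '"foo, the column",bar\n2,3\n"4, the value",5';
var parsed = new CSVParse(csvText);
// Expected: [["foo, the column","bar"],["2","3"],["4, the value","5"]]
console.log(parsed.rows);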
Per the accepted answer,
I got this to work by changing the 1 to a 0 here:
for (var i=1; i<allTextLines.length; i++) {
changed to
for (var i=0; i<allTextLines.length; i++) {
A file consisting of one continuous line is computed as having an allTextLines.length of 1. So if the loop starts at 1 and runs only while the index is less than 1, it never runs. Hence the blank alert box.
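A quick sanity check of that reasoning (hypothetical one-line input):
// A file with no newlines splits into a single element...
var allTextLines = 'heading1,heading2,value1_1,value2_1'.split(/\r\n|\n/);
console.log(allTextLines.length); // 1
// ...so a loop starting at i = 1 with the condition i < 1 never executes.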
$(function() {
    $("#upload").bind("click", function() {
        var regex = /^([a-zA-Z0-9\s_\\.\-:])+(.csv|.xlsx)$/;
        if (regex.test($("#fileUpload").val().toLowerCase())) {
            if (typeof(FileReader) != "undefined") {
                var reader = new FileReader();
                reader.onload = function(e) {
                    var customers = new Array();
                    var rows = e.target.result.split("\r\n");
                    for (var i = 0; i < rows.length - 1; i++) {
                        var cells = rows[i].split(",");
                        if (cells[0] == "" || cells[0] == undefined) {
                            var s = customers[customers.length - 1];
                            s.Ord.push(cells[2]);
                        } else {
                            var dt = customers.find(x => x.Number === cells[0]);
                            if (dt == undefined) {
                                if (cells.length > 1) {
                                    var customer = {};
                                    customer.Number = cells[0];
                                    customer.Name = cells[1];
                                    customer.Ord = new Array();
                                    customer.Ord.push(cells[2]);
                                    customer.Point_ID = cells[3];
                                    customer.Point_Name = cells[4];
                                    customer.Point_Type = cells[5];
                                    customer.Set_ORD = cells[6];
                                    customers.push(customer);
                                }
                            } else {
                                var dtt = dt;
                                dtt.Ord.push(cells[2]);
                            }
                        }
                    }
                };
                // The original snippet breaks off above; presumably the reader is then
                // started on the selected file, e.g.:
                reader.readAsText($("#fileUpload")[0].files[0]);
            }
        }
    });
});
Actually you can use a light-weight library called any-text.
install dependencies
npm i -D any-text
then use it to read files:
var reader = require('any-text');
reader.getText(`path-to-file`).then(function (data) {
console.log(data);
});
or use async/await:
var reader = require('any-text');
const chai = require('chai');
const expect = chai.expect;

describe('file reader checks', () => {
    it('check csv file content', async () => {
        expect(
            await reader.getText(`${process.cwd()}/test/files/dummy.csv`)
        ).to.contains('Lorem ipsum');
    });
});
This is an old question and in 2022 there are many ways to achieve this. First, I think D3 is one of the best alternatives for data manipulation. It's open source and free to use, and it's also modular, so we can import just the fetch module.
Here is a basic example. We will use the legacy mode, so I will import the entire D3 library. Now, let's call the d3.csv function and it's done. This function internally calls the fetch method, so it can open a data URL, a URL, files, blobs, and so on.
const fileInput = document.getElementById('csv')
const outElement = document.getElementById('out')

const previewCSVData = async dataurl => {
    const d = await d3.csv(dataurl)
    console.log({ d })
    outElement.textContent = d.columns
}

const readFile = e => {
    const file = fileInput.files[0]
    const reader = new FileReader()
    reader.onload = () => {
        const dataUrl = reader.result;
        previewCSVData(dataUrl)
    }
    reader.readAsDataURL(file)
}

fileInput.onchange = readFile
<script type="text/javascript" src="https://unpkg.com/d3#7.6.1/dist/d3.min.js"></script>
<div>
<p>Select local CSV File:</p>
<input id="csv" type="file" accept=".csv">
</div>
<pre id="out"><p>File headers will appear here</p></pre>
If we don't want to use any library and just want to use plain JavaScript (Vanilla JS), and we have managed to get the text content of a file as data, we can implement a simple function without d3: split the data into an array of text lines, extract the first line and split it into a headers array, and treat the remaining lines as the rows to process. Then we map each line, extract its values, and create a row object from an array built by mapping each header to its corresponding value from values[index].
NOTE:
We are also going to use a little trick: arrays in JavaScript can also have properties, so we will define a rows.headers property and assign the headers to it.
const data = `heading_1,heading_2,heading_3,heading_4,heading_5
value_1_1,value_2_1,value_3_1,value_4_1,value_5_1
value_1_2,value_2_2,value_3_2,value_4_2,value_5_2
value_1_3,value_2_3,value_3_3,value_4_3,value_5_3`

const csvParser = data => {
    const text = data.split(/\r\n|\n/)
    const [first, ...lines] = text
    const headers = first.split(',')
    const rows = []
    rows.headers = headers
    lines.map(line => {
        const values = line.split(',')
        const row = Object.fromEntries(headers.map((header, i) => [header, values[i]]))
        rows.push(row)
    })
    return rows
}

const d = csvParser(data)
// Accessing the headers attribute
const headers = d.headers
console.log({ headers })
console.log({ d })
Finally, let's implement a vanilla JS file loader using fetch and parse the CSV file.
const fetchFile = async dataURL => {
    return await fetch(dataURL).then(response => response.text())
}

const csvParser = data => {
    const text = data.split(/\r\n|\n/)
    const [first, ...lines] = text
    const headers = first.split(',')
    const rows = []
    rows.headers = headers
    lines.map(line => {
        const values = line.split(',')
        const row = Object.fromEntries(headers.map((header, i) => [header, values[i]]))
        rows.push(row)
    })
    return rows
}

const fileInput = document.getElementById('csv')
const outElement = document.getElementById('out')

const previewCSVData = async dataURL => {
    const data = await fetchFile(dataURL)
    const d = csvParser(data)
    console.log({ d })
    outElement.textContent = d.headers
}

const readFile = e => {
    const file = fileInput.files[0]
    const reader = new FileReader()
    reader.onload = () => {
        const dataURL = reader.result;
        previewCSVData(dataURL)
    }
    reader.readAsDataURL(file)
}

fileInput.onchange = readFile
<script type="text/javascript" src="https://unpkg.com/d3#7.6.1/dist/d3.min.js"></script>
<div>
<p>Select local CSV File:</p>
<input id="csv" type="file" accept=".csv">
</div>
<pre id="out"><p>File contents will appear here</p></pre>
I used this file to test it
Here is another way to read an external CSV into Javascript (using jQuery).
It's a little more long-winded, but I feel that by reading the data into arrays you can follow the process exactly, which makes for easy troubleshooting.
Might help someone else.
The data file example:
Time,data1,data2,data3
08/11/2015 07:30:16,602,0.009,321
And here is the code:
$(document).ready(function() {
    // AJAX in the data file
    $.ajax({
        type: "GET",
        url: "data.csv",
        dataType: "text",
        success: function(data) { processData(data); }
    });

    // Let's process the data from the data file
    function processData(data) {
        var lines = data.split(/\r\n|\n/);
        // Set up the data arrays
        var time = [];
        var data1 = [];
        var data2 = [];
        var data3 = [];
        var headings = lines[0].split(','); // Split up the first row to get the headings
        for (var j = 1; j < lines.length; j++) {
            var values = lines[j].split(','); // Split up the comma separated values
            // We read the key, 1st, 2nd and 3rd columns
            time.push(values[0]); // Read in as string
            // Recommended to read in as float, since we'll be doing some operations on this later.
            data1.push(parseFloat(values[1]));
            data2.push(parseFloat(values[2]));
            data3.push(parseFloat(values[3]));
        }
        // For display
        var x = 0;
        console.log(headings[0] + " : " + time[x] + headings[1] + " : " + data1[x] + headings[2] + " : " + data2[x] + headings[3] + " : " + data3[x]);
    }
})
Hope this helps someone in the future!
A bit late, but I hope it helps someone.
Some time ago I faced a problem where the string data contained \n in between, and while reading the file it was read as separate lines.
Eg.
"Harry\nPotter","21","Gryffindor"
While-Reading:
Harry
Potter,21,Gryffindor
I had used a library csvtojson in my angular project to solve this problem.
You can read the CSV file as a string using the following code and then pass that string to the csvtojson library and it will give you a list of JSON.
Sample Code:
const csv = require('csvtojson');
if (files && files.length > 0) {
const file: File = files.item(0);
const reader: FileReader = new FileReader();
reader.readAsText(file);
reader.onload = (e) => {
const csvs: string = reader.result as string;
csv({
output: "json",
noheader: false
}).fromString(csvs)
.preFileLine((fileLine, idx) => {
//Convert csv header row to lowercase before parse csv file to json
if (idx === 0) { return fileLine.toLowerCase() }
return fileLine;
})
.then((result) => {
// list of json in result
});
}
}
I use jquery-csv to do this, and I provide two examples below.
async function ReadFile(file) {
    return await file.text()
}

function removeExtraSpace(stringData) {
    stringData = stringData.replace(/,( *)/gm, ",")  // remove extra space
    stringData = stringData.replace(/^ *| *$/gm, "") // remove space at the beginning and end.
    return stringData
}

function simpleTest() {
    let data = `Name, Age, msg
foo, 25, hello world
bar, 18, "!! 🐬 !!"
`
    data = removeExtraSpace(data)
    console.log(data)
    const options = {
        separator: ",", // default ",". (You may want Tab "\t" or something else.)
        delimiter: '"', // default "
        headers: true   // default true
    }
    // const myObj = $.csv.toObjects(data, options)
    const myObj = $.csv.toObjects(data) // If you want to use the default options, you can omit them.
    console.log(myObj)
}

window.onload = () => {
    const inputFile = document.getElementById("uploadFile")
    inputFile.onchange = () => {
        const inputValue = inputFile.value
        if (inputValue === "") {
            return
        }
        const selectedFile = document.getElementById('uploadFile').files[0]
        const promise = new Promise(resolve => {
            const fileContent = ReadFile(selectedFile)
            resolve(fileContent)
        })
        promise.then(fileContent => {
            // Use the promise to wait for the file reading to finish.
            console.log(fileContent)
            fileContent = removeExtraSpace(fileContent)
            const myObj = $.csv.toObjects(fileContent)
            console.log(myObj)
        })
    }
}
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.6.0/jquery.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery-csv/1.0.11/jquery.csv.min.js"></script>
<label for="uploadFile">Demo 1</label>
<input type="file" id="uploadFile" accept=".csv"/>
<button onclick="simpleTest()">Demo 2</button>
With this csvToObjs function you can transform data entries from CSV format into an array of objects.
function csvToObjs(data) {
    const lines = data.split(/\r\n|\n/);
    let [headings, ...entries] = lines;
    headings = headings.split(',');
    const objs = [];
    entries.map(entry => {
        const obj = entry.split(',');
        objs.push(Object.fromEntries(headings.map((head, i) => [head, obj[i]])));
    })
    return objs;
}
data = `heading1,heading2,heading3,heading4,heading5
value1_1,value2_1,value3_1,value4_1,value5_1
value1_2,value2_2,value3_2,value4_2,value5_2`
console.log(csvToObjs(data));

Zapier Code (JavaStep) Action: How to convert Server XML response body to JSON or JS Object?

I am trying to use a Zapier JavaScript code step to invoke an HTTP query to a remote server. The server returns an XML response body that needs to be parsed further into a JSON object.
Firstly, I had some trouble finding a quick way to do this conversion. There are modules/packages available using jQuery and other frameworks, which are unsupported by Zapier (the Zapier engine supports NodeJS 8.10.x without many additional modules/packages). The standard documentation seems to be written for the case where the response is already a JSON object.
I have the following code, and I am experiencing an error with it as well:
EDIT (06/05):
Based on the comment, I am posting the calling code to the method xml2JSON.
let url = '<https://remote-sever-base-url?method, params etc.>'
let response = await fetch(url);
if (response.ok) { // if HTTP-status is 200-299
    // Response body is in XML format. No JSON available.
    var xmlDoc = await response.text();
    var jsonData = xmlToJson(xmlDoc);
    console.log(jsonData);
} else {
    alert("HTTP-Error: " + response.status);
}
// Changes XML to JSON
function xmlToJson(xml) {
    // Create the return object
    var obj = {};
    if (xml.nodeType == 1) { // element
        // do attributes
        if (xml.attributes.length > 0) {
            obj["#attributes"] = {};
            for (var j = 0; j < xml.attributes.length; j++) {
                var attribute = xml.attributes.item(j);
                obj["#attributes"][attribute.nodeName] = attribute.nodeValue;
            }
        }
    } else if (xml.nodeType == 3) { // text
        obj = xml.nodeValue;
    }
    // do children
    if (xml.hasChildNodes()) {
        for (var i = 0; i < xml.childNodes.length; i++) {
            var item = xml.childNodes.item(i);
            var nodeName = item.nodeName;
            if (typeof(obj[nodeName]) == "undefined") {
                obj[nodeName] = xmlToJson(item);
            } else {
                if (typeof(obj[nodeName].push) == "undefined") {
                    var old = obj[nodeName];
                    obj[nodeName] = [];
                    obj[nodeName].push(old);
                }
                obj[nodeName].push(xmlToJson(item));
            }
        }
    }
    return obj;
}
The error:
The run javascript could not be sent to Code by Zapier.
TypeError: xml.hasChildNodes is not a function
FYI - the incoming XML is pretty flat and predictable in its structure.
<result>
<record>
<field1>
<field 2>
....fields continue
</record>
...... list continues
</result>
Therefore the DOM does have child nodes, though not very deep tree.
Also, since these are records out of a relational DB, all record structures within a result set are similar, though between different POSTs they will differ. Hope this clarifies.
When strung together, you're basically calling (await response.text()).hasChildNodes(), which is a method that doesn't exist on a string (hence your error). response.text() resolves to a regular string, not any sort of XML object.
Instead of parsing it yourself, I'd recommend using the "Webhooks by Zapier" app to make your request - it'll do the XML parsing for you and return a json-like object you can use downstream.
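For completeness, here is roughly what feeding xmlToJson an actual DOM node would look like in an environment that provides DOMParser (a browser, for example; Zapier's Code step does not, which is why the Webhooks app is the easier route). Here xmlText stands for the string returned by response.text():
// Sketch, assuming DOMParser is available (it is not in Zapier's Code step)
var xmlDoc = new DOMParser().parseFromString(xmlText, "text/xml");
var jsonData = xmlToJson(xmlDoc.documentElement); // a DOM node, so hasChildNodes() exists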

Converting XML Blob in Javascript [duplicate]

I am looking for a JavaScript library that parses an XML string and converts it to a JavaScript object. What are some good ones?
The following function parses XML and returns a JavaScript object with a scheme that corresponds to the XML. XML siblings with the same name are collapsed into arrays. Nodes whose names appear in the arrayTags parameter (an array of tag name strings) always yield arrays, even if the tag occurs only once. arrayTags can be omitted. Text nodes containing only whitespace are discarded.
function parseXml(xml, arrayTags) {
    let dom = null;
    if (window.DOMParser) dom = (new DOMParser()).parseFromString(xml, "text/xml");
    else if (window.ActiveXObject) {
        dom = new ActiveXObject('Microsoft.XMLDOM');
        dom.async = false;
        if (!dom.loadXML(xml)) throw dom.parseError.reason + " " + dom.parseError.srcText;
    }
    else throw new Error("cannot parse xml string!");

    function parseNode(xmlNode, result) {
        if (xmlNode.nodeName == "#text") {
            let v = xmlNode.nodeValue;
            if (v.trim()) result['#text'] = v;
            return;
        }
        let jsonNode = {},
            existing = result[xmlNode.nodeName];
        if (existing) {
            if (!Array.isArray(existing)) result[xmlNode.nodeName] = [existing, jsonNode];
            else result[xmlNode.nodeName].push(jsonNode);
        }
        else {
            if (arrayTags && arrayTags.indexOf(xmlNode.nodeName) != -1) result[xmlNode.nodeName] = [jsonNode];
            else result[xmlNode.nodeName] = jsonNode;
        }
        if (xmlNode.attributes) for (let attribute of xmlNode.attributes) jsonNode[attribute.nodeName] = attribute.nodeValue;
        for (let node of xmlNode.childNodes) parseNode(node, jsonNode);
    }

    let result = {};
    for (let node of dom.childNodes) parseNode(node, result);
    return result;
}
Here's a nice xml2json and json2xml converter:
http://goessner.net/download/prj/jsonxml/
Related tutorial: http://www.xml.com/pub/a/2006/05/31/converting-between-xml-and-json.html
Here's another one:
http://www.kawa.net/works/js/xml/objtree-e.html
Depending on your needs, you might be able to use a standard parser (see http://www.w3schools.com/XML/tryit.asp?filename=tryxml_parsertest2) and xpath (http://www.w3schools.com/xpath/default.asp) - here's an example:
http://snippets.dzone.com/posts/show/5272
and a few nice tutorials:
http://www.nczonline.net/blog/2009/03/17/xpath-in-javascript-part-1/
https://developer.mozilla.org/en/introduction_to_using_xpath_in_javascript
Going straight to the point (using node-xml2json):
npm install xml2json
Then, use it:
const parser = require('xml2json');
const obj = parser.toJson(xml, { object: true });
Example:
const parser = require('xml2json');
const xml = '<root><person><name>Bob Dylan</name></person></root>';
const obj = parser.toJson(xml, { object: true });
const { person } = obj.root;
person.name; // Bob Dylan
You can also convert from JSON to XML, and much more.
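For the reverse direction, a short sketch using the same package's toXml call (the object below is just an illustration):
const parser = require('xml2json');

// Convert a plain JS object back into an XML string
const xml = parser.toXml({ root: { person: { name: 'Bob Dylan' } } });
console.log(xml); // e.g. <root><person><name>Bob Dylan</name></person></root>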
I wanted a simple Typescript version that didn't create additional #text objects and also disregarded attributes. If that's what you need, here's the code:
export class DomFuncs {
    static parseNode = (node: Node) => {
        const childNodes = node.childNodes;
        if (childNodes.length === 0) {
            return node.nodeValue;
        } else if (childNodes.length === 1 && childNodes[0].nodeType === Node.TEXT_NODE) {
            return childNodes[0].nodeValue;
        } else {
            const obj = {};
            childNodes.forEach(childNode => {
                const childName = childNode.nodeName;
                const childValue = obj[childName];
                if (childValue !== undefined) {
                    if (Array.isArray(childValue)) {
                        childValue.push(DomFuncs.parseNode(childNode));
                    } else {
                        obj[childName] = [childValue, DomFuncs.parseNode(childNode)];
                    }
                } else {
                    obj[childName] = DomFuncs.parseNode(childNode);
                }
            });
            return obj;
        }
    };

    static xml2obj = (str: string) => {
        const dom = (new DOMParser()).parseFromString(str, 'text/xml');
        const result = { [dom.nodeName]: DomFuncs.parseNode(dom) };
        return result;
    }
}
To use it:
DomFuncs.xml2obj(xmlString);
This script currently disregards XML attributes since my converted object didn't require them. If you need that, let me know and I could update the code.
The xml2json javascript file from https://bitbucket.org/surenrao/xml2json is all you need to do this.
Here's the download link for quick download: https://bitbucket.org/surenrao/xml2json/get/0e0989dfe48e.zip
Once included in your project, here's some sample code to get you started:
var xmlStr = "<root><person><name>Bob Dylan</name></person></root>";
var jsObj = X2J.parseXml(xmlStr);
var result = jsObj[0].root[0].person[0].name[0].jValue; //Bob Dylan

Javascript Rewrite Config File

I have a config.js file, which I believe is JSON, that is loaded when the application first starts:
var config = {};
config.user = [
    {id: 'JSMITH', priceModify: 'true'},
    {id: 'JBLOGGS', priceModify: 'false'},
];
config.price = [
    {id: "price01", name: "priceName01", primary: "57.25", secondary: "34.54"},
    {id: "price02", name: "priceName02", primary: "98.26", secondary: "139.45"},
    {id: "price03", name: "priceName03", primary: "13.87", secondary: "29.13"}
];
To pull / push data I just use the following:
// Read
var curPrice = config.price[0].primary;
// Write
config.price[0].primary = "98.24";
How do I go about exporting the config file with the new value so that it will load next time the application is opened? I can use the file system object to write the file, I just don't understand how I would export everything (and preferably keep the same format).
I originally thought about reading the whole config file into a variable, cycling through to find the required block, id, and key and replacing the value, then writing the whole thing back, but I can't seem to figure out how to replace that specific value only.
Any help would be greatly appreciated
Edit Apologies, I forgot to mention that this application is completely offline and uses local directories
Solution
I stumbled across a few solutions to different issues which, when combined, gave me the perfect solution. First, we cycle through the JavaScript object, building an array of the detail and then converting the array to a string:
vMethod.convertToText = function(obj) {
    var string = [];
    var output = '';
    var count = 0;
    var countTotal = 0;
    if (typeof(obj) == "object" && (obj.join == undefined)) {
        count = 0;
        countTotal = 0;
        string.push("{");
        for (prop in obj) {
            countTotal++;
        }
        for (prop in obj) {
            if (count == countTotal - 1) {
                string.push(prop, ": ", vMethod.convertToText(obj[prop]), '}\r\n');
            } else {
                string.push(prop, ": ", vMethod.convertToText(obj[prop]), ",");
            }
            count++;
        }
    } else if (typeof(obj) == "object" && !(obj.join == undefined)) {
        count = 0;
        countTotal = 0;
        string.push("[\r\n")
        for (prop in obj) {
            countTotal++;
        }
        for (prop in obj) {
            if (count == countTotal - 1) {
                string.push(vMethod.convertToText(obj[prop]), '];\r\n');
            } else {
                string.push(vMethod.convertToText(obj[prop]), ",");
            }
            count++;
        }
    } else if (typeof(obj) == "function") {
        string.push(obj.toString())
    } else {
        string.push(JSON.stringify(obj))
    }
    output = string.join("").toString();
    //output = output.slice(1, -1);
    return output;
}
Then we clean the output text (necessary for me to remove excess characters):
vMethod.cleanConfigText = function() {
    var outputText = vMethod.convertToText(config);
    outputText = outputText.slice(1, -1);
    outputText = 'var config = {};\r\n' + outputText;
    outputText = outputText.replace('user:', 'config.user =');
    outputText = outputText.replace(',price:', 'config.price =');
    outputText = outputText.slice(0, -2);
    outputText = outputText.replace(/"/g, "'")
    return outputText;
}
Finally a function to export the object into my config.js file:
vMethod.writeToConfig = function() {
    vObject.fileSystem = new ActiveXObject('Scripting.FileSystemObject');
    vObject.fileSystemFile = vObject.fileSystem.CreateTextFile('source\\js\\config.js', true);
    vObject.fileSystemFile.Write(vMethod.cleanConfigText());
    vObject.fileSystemFile.Close();
    delete vObject.fileSystemFile;
    delete vObject.fileSystem;
}
So when I want to export a change in the config, I just call:
vMethod.writeToConfig();
The only difference in the file format is that the commas appear at the start of a trailing line rather than the end of a preceding line but I can live with that!
Edit Turns out I'm anally retentive and the commas were bugging me
Added these to the clean up function and now the config is identical to before but without the indent
outputText = outputText.replace(/[\n\r]/g, '_');
outputText = outputText.replace(/__,/g, ',\r\n');
outputText = outputText.replace(/__/g, '\r\n');
Thank you to those that looked at the question and tried to help, very much appreciated.
Edit
DO NOT READ THE SOLUTION ABOVE, IT IS IN THE WRONG PLACE AND THEREFORE IS NOT A VALID ANSWER. YOU'VE BEEN WARNED.
You can use a very popular npm package: https://www.npmjs.com/package/jsonfile . There are many, but I've chosen this one.
Usually config stuff should be in JSON or .env files.
Now, all you have to do is use jsonfile's API to read/write the JSON (the package handles serialization/deserialization) when the application starts.
Example:
var jsonfile = require('jsonfile');
var util = require('util');
var config = null;
var file = './config.json';

// Reading
jsonfile.readFile(file, function(err, obj) {
    config = obj;
});

// Writing
// Edit your config blah blah
config.user = [
    {id: 'JSMITH', priceModify: 'true'},
    {id: 'JBLOGGS', priceModify: 'false'},
];
config.price = [
    {id: "price01", name: "priceName01", primary: "57.25", secondary: "34.54"},
    {id: "price02", name: "priceName02", primary: "98.26", secondary: "139.45"},
    {id: "price03", name: "priceName03", primary: "13.87", secondary: "29.13"}
];

jsonfile.writeFile(file, config, function(err) {
    if (err) return err;
    console.log('Config saved to file!');
});
