Converting XML data to JSON in ReactJS

I am working on a healthcare project and I have to import some medicine data from another server. The other server returns XML, and I want to convert it to JSON for my API.
fetch("http://api.com/rest/api/something&q=")
  .then(response => response.text())
  .then((response) => {
    parseString(responseText, function (err, result) {
      console.log(response)
    });
  }).catch((err) => {
    console.log('fetch', err)
  })
},
I get this error:
fetch ReferenceError: parseString is not defined
I am using ReactJS, so can someone please show me the correct way to convert XML to JSON?

Install these two npm libraries:
1. npm install react-native-xml2js
2. npm install --save xml-js
Step 1:
import axios from "axios";
const parseString = require('react-native-xml2js').parseString; // step 1: import the parser

axios.get(your api url here, {
  headers: { Authorization: this.state.token }
}).then(response => {
  console.log("response----" + response.data);
  // step 1: callback-style parsing with xml2js
  parseString(response.data, function (err, result) {
    console.log("parsed----", result);
    // step 2: conversion with xml-js
    var convert = require('xml-js');
    var xml = response.data;
    var result1 = convert.xml2json(xml, { compact: true, spaces: 4 });
    var result2 = convert.xml2json(xml, { compact: false, spaces: 4 });
    console.log("result1----" + result1);
    console.log("result2----" + result2);
    // step 2 ends here
  });
}).catch((err) => {
  console.log('fetch', err)
})
Use whichever result suits you.
This worked for me in my React Native project.
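For the original question (plain React in the browser, using fetch), the error just means parseString was never imported. Below is a minimal sketch using the xml2js package, assuming the placeholder URL from the question and a bundler (Create React App, Vite, etc.) that can bundle xml2js:
import { parseString } from "xml2js"; // npm install xml2js

fetch("http://api.com/rest/api/something&q=")
  .then((response) => response.text())        // read the body as raw XML text
  .then((xmlText) => {
    parseString(xmlText, (err, result) => {   // result is a plain JS object
      if (err) {
        console.error("XML parse error", err);
        return;
      }
      console.log(JSON.stringify(result));    // JSON, ready for your API
    });
  })
  .catch((err) => console.log("fetch", err));
The xml-js package from the answer above works the same way in the browser: pass the fetched text to convert.xml2json(xmlText, { compact: true }).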

Related

Is it possible to read/decode .avro uInt8Array Container File with Javascript in Browser?

I'm trying to decode an .avro file loaded from a web server.
Since the string version of the uInt8Array starts with
"buffer from S3 Objavro.schema�{"type":"record","name":"Destination",..."
I assume it's an Avro Container File.
I found 'avro.js' and 'avsc' as tools for working with the .avro format and JavaScript, but reading the documentation it sounds like decoding a Container File is only possible in Node.js, not in the browser.
(The FileDecoder/Encoder methods take a path to a file as a string, not a uInt8Array.)
Am I getting this wrong, or is there an alternative way to decode an .avro Container File in the browser with JavaScript?
Luckily I found a way using avsc with browserify:
avro.createBlobDecoder(blob, [{options}])
[avro.js - before browserifying]
var avro = require('avsc');

const AVRO = {
  decodeBlob(blob) {
    let schema, columnTitles, columnTypes, records = []
    return new Promise((resolve) => {
      avro.createBlobDecoder(blob, {
        // noDecode: true
      })
        .on('metadata', (s) => {
          schema = s
          columnTitles = schema.fields.map(f => f.name)
          columnTypes = schema.fields.map(f => f.type)
        })
        .on('data', (data) => {
          records.push(data)
        })
        .on('finish', () => {
          resolve({
            columnTitles: columnTitles,
            columnTypes: columnTypes,
            records: records
          })
        })
    })
  }
}
module.exports = AVRO
[package.json]
"scripts": {
"avro": "browserify public/avro.js --s AVRO > public/build/avro.js"
}
[someOtherFile.js]
//s3.getObject => uInt8array
const blob = new Blob([uInt8array]) //arr in brackets !important
const avroDataObj = await AVRO.decodeBlob(blob)
Thanks for posting!
Below is my integration of avro/binary with axios, in case it helps anyone else trying to implement browser side decoding:
[before browserifying]
const axios = require('axios')
const avro = require('avsc')

const config = {
  responseType: 'blob'
};
const url = 'https://some-url.com'

axios.get(url, config)
  .then(res => {
    avro.createBlobDecoder(res.data)
      .on('metadata', (type) => console.log(type))
      .on('data', (record) => console.log(record))
  })
  .catch(e => console.error(e))
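The two answers can also be wired together. Here is a sketch, assuming the AVRO helper module from the first answer is available (via require before browserifying, or as the AVRO global afterwards) and that the URL is a placeholder:
const axios = require('axios')
const AVRO = require('./avro')   // the decodeBlob helper module from the first answer

axios.get('https://some-url.com/data.avro', { responseType: 'blob' })   // placeholder URL
  .then(async (res) => {
    // res.data is a Blob because of responseType: 'blob'
    const { columnTitles, columnTypes, records } = await AVRO.decodeBlob(res.data)
    console.log(columnTitles, columnTypes, records.length)
  })
  .catch(e => console.error(e))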

npm package csvtojson CSV Parse Error: Error: unclosed_quote

Node version: v10.19.0
npm version: 6.13.4
npm package: csvtojson
csvtojson({
  "delimiter": ";",
  "fork": true
})
  .fromStream(fileReadStream)
  .subscribe((dataObj) => {
    console.log(dataObj);
  }, (err) => {
    console.error(err);
  }, (success) => {
    console.log(success);
  });
While trying to handle a large CSV file (about 1.3 million records) I get the error "CSV Parse Error: Error: unclosed_quote." after a certain number of records (e.g. 400+) have been processed successfully. I don't see any formatting problems in the CSV file itself; however, the parser might be raising this error because a "\n" character appears inside a column/field value.
Is there a solution already available with this package? Or
is there a workaround to handle this error? Or
is there a way to skip CSV rows with any sort of error, not only this one, so that the entire CSV-to-JSON parsing can finish without getting stuck?
Any help will be much appreciated.
I've played about with this, and it's possible to hook into it using a CSV File Line Hook (csv-file-line-hook): you can check for invalid lines and either repair or simply invalidate them.
The example below simply skips the invalid lines (missing end quotes).
example.js
const fs = require("fs");
let fileReadStream = fs.createReadStream("test.csv");
let invalidLineCount = 0;
const csvtojson = require("csvtojson");

csvtojson({ "delimiter": ";", "fork": true })
  .preFileLine((fileLineString, lineIdx) => {
    let invalidLinePattern = /^['"].*[^"'];/;
    if (invalidLinePattern.test(fileLineString)) {
      console.log(`Line #${lineIdx + 1} is invalid, skipping:`, fileLineString);
      fileLineString = "";
      invalidLineCount++;
    }
    return fileLineString
  })
  .fromStream(fileReadStream)
  .subscribe((dataObj) => {
    console.log(dataObj);
  },
  (err) => {
    console.error("Error:", err);
  },
  (success) => {
    console.log("Skipped lines:", invalidLineCount);
    console.log("Success");
  });
test.csv
Name;Age;Profession
Bob;34;"Sales,Marketing"
Sarah;31;"Software Engineer"
James;45;Driver
"Billy, ;35;Manager
"Timothy;23;"QA
This regex works better:
/^(?:[^"\\]|\\.|"(?:\\.|[^"\\])*")*$/g
Here is a more complex working script for big files that reads the input line by line:
import csv from 'csvtojson'
import fs from 'fs-extra'
import lineReader from 'line-reader'
import { __dirname } from '../../../utils.js'
// Note: log() and BaseError are project-specific helpers from the author's codebase.

const CSV2JSON = async (dumb, editDumb, headers, {
  options = {
    trim: true,
    delimiter: '|',
    quote: '"',
    escape: '"',
    fork: true,
    headers: headers
  }
} = {}) => {
  try {
    log(`\n\nStarting CSV2JSON - Current directory: ${__dirname()} - Please wait..`)
    await new Promise((resolve, reject) => {
      let firstLine, counter = 0
      lineReader.eachLine(dumb, async (line, last) => {
        counter++
        // log(`line before convert: ${line}`)
        let json = (
          await csv(options).fromString(headers + '\n\r' + line)
            .preFileLine((fileLineString, lineIdx) => {
              // if it is not the first line
              // eslint-disable-next-line max-len
              if (counter !== 1 && !fileLineString.match(/^(?:[^"\\]|\\.|"(?:\\.|[^"\\])*")*$/g)) {
                // eslint-disable-next-line max-len
                console.log(`Line #${lineIdx + 1} is invalid. It has unescaped quotes. We will skip this line.. Invalid Line: ${fileLineString}`)
                fileLineString = ''
              }
              return fileLineString
            })
            .on('error', e => {
              e = `Error while converting CSV to JSON.
Line before convert: ${line}
Error: ${e}`
              throw new BaseError(e)
            })
        )[0]
        // log(`line after convert: ${json}`)
        if (json) {
          json = JSON.stringify(json).replace(/\\"/g, '')
          if (json.match(/^(?:[^"\\]|\\.|"(?:\\.|[^"\\])*")*$/g)) {
            await fs.appendFile(editDumb, json)
          }
        }
        if (last) {
          resolve()
        }
      })
    })
  } catch (e) {
    throw new BaseError(`Error while converting CSV to JSON - Error: ${e}`)
  }
}
export { CSV2JSON }

Streaming JSON data to React results in unexpected end of JSON input

I'm trying to stream a lot of data from a NodeJS server that fetches the data from Mongo and sends it to React. Since it's quite a lot of data, I've decided to stream it from the server and display it in React as soon as it comes in. Here's a slightly simplified version of what I've got on the server:
const getQuery = async (req, res) => {
  const { body } = req;
  const query = mongoQueries.buildFindQuery(body);
  res.set({ 'Content-Type': 'application/octet-stream' });
  Log.find(query).cursor()
    .on('data', (doc) => {
      console.log(doc);
      const data = JSON.stringify(doc); // stringify the current document
      res.write(`${data}\r\n`);
    })
    .on('end', () => {
      console.log('Data retrieved.');
      res.end();
    });
};
Here's the React part:
fetch(url, { // this fetch fires the getQuery function on the backend
  method: "POST",
  body: JSON.stringify(object),
  headers: {
    "Content-Type": "application/json",
  }
})
  .then(response => {
    const reader = response.body.getReader();
    const decoder = new TextDecoder();
    const pump = () =>
      reader.read().then(({ done, value }) => {
        if (done) return this.postEndHandler();
        console.log(value.length); // !!!
        const decoded = decoder.decode(value);
        this.display(decoded);
        return pump();
      });
    return pump();
  })
  .catch(err => {
    console.error(err);
    toast.error(err.message);
  });
}

display(chunk) {
  const { data } = this.state;
  try {
    const parsedChunk = chunk.split('\r\n').slice(0, -1);
    parsedChunk.forEach(e => data.push(JSON.parse(e)));
    return this.setState({ data });
  } catch (err) {
    throw err;
  }
}
It's a 50/50 whether it completes with no issues or fails on React's side of things. When it fails, it's always because of an incomplete JSON object in parsedChunk.forEach. I did some digging, and it turns out that every time it fails, the console.log that I marked with three exclamation marks shows 65536. I'm 100% certain it's got something to do with my streams implementation and that I'm not queuing the chunks correctly, but I'm not sure whether I should be fixing it client or server side. Any help would be greatly appreciated.
Instead of implementing your own NDJSON-like streaming JSON protocol, which is basically what you are doing here (with all of the pitfalls of dividing the stream into chunks and packets, which is not always under your control), take a look at some of the existing tools that were created to do exactly what you need, e.g.:
http://oboejs.com/
http://ndjson.org/
https://www.npmjs.com/package/stream-json
https://www.npmjs.com/package/JSONStream
https://www.npmjs.com/package/clarinet
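If you want to keep the hand-rolled protocol, the immediate cause of the failures is that reader.read() hands you fixed-size chunks (hence the 65536) that can end in the middle of a line, so the last element of your split is often an incomplete JSON object and the next chunk starts mid-object. Here is a sketch of client-side buffering that only parses complete lines, assuming the \r\n-delimited format and the postEndHandler/display methods from the question:
fetch(url, {
  method: "POST",
  body: JSON.stringify(object),
  headers: { "Content-Type": "application/json" },
})
  .then((response) => {
    const reader = response.body.getReader();
    const decoder = new TextDecoder();
    let buffered = ""; // holds any partial line left over from the previous chunk

    const pump = () =>
      reader.read().then(({ done, value }) => {
        if (done) {
          if (buffered.trim()) this.display(buffered + "\r\n"); // flush the tail
          return this.postEndHandler();
        }
        // stream: true keeps multi-byte characters split across chunks intact
        buffered += decoder.decode(value, { stream: true });
        const lastBreak = buffered.lastIndexOf("\r\n");
        if (lastBreak !== -1) {
          // everything up to and including the last delimiter is complete lines
          this.display(buffered.slice(0, lastBreak + 2));
          buffered = buffered.slice(lastBreak + 2);
        }
        return pump();
      });

    return pump();
  })
  .catch((err) => console.error(err));
The existing display(chunk) method works unchanged, because each call still receives whole \r\n-terminated lines.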

Express.js response converts snake_case keys to camelCase automatically

I'm working on a small project at work, and we have an Express.js-based Node application that sends a JSON response with keys in snake_case format. We have another Node application that consumes this service, but there the response object keys are accessed in camelCase format. I'd like to know what happens in the background to make this work.
This is the code in the REST API
app.get('/api/customer/:id', (req, res) => {
  const data = {
    "arr": [{
      "my_key": "609968029"
    }]
  }
  res.send(data);
});
This is how it is consumed in the other node application
getData = (id) => {
  const options = {
    url: `api/customer/${id}`
  };
  return httpClient.get(options)
    .then(data => {
      const arr = data.arr.map(arrEntry => {
        return {
          myKey: arrEntry.myKey
        };
      });
      return {
        arr
      };
    });
};
Here myKey correctly has the data from the REST API, but I'm not sure how my_key is converted to myKey for this to work.
Turns out we had used the humps library to convert the response object keys from snake_case to camelCase.
I found this code in the library call:
const humps = require('humps');
...
axios(optionsObj)
  .then(response => {
    resolve(humps.camelizeKeys(response.data));
  })
  .catch(err => {
    reject(err);
  });
lodash can do this
_.camelCase('Foo Bar');
// => 'fooBar'
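Note that _.camelCase only converts a single string; to camelize the keys of a whole response object (which is what humps.camelizeKeys does) you would combine it with _.mapKeys. Here is a sketch with a hypothetical recursive helper:
const _ = require('lodash');

// Recursively converts every key of an object (or array of objects) to camelCase.
const camelizeKeys = (value) => {
  if (Array.isArray(value)) return value.map(camelizeKeys);
  if (_.isPlainObject(value)) {
    return _.mapValues(
      _.mapKeys(value, (v, key) => _.camelCase(key)),
      camelizeKeys
    );
  }
  return value;
};

camelizeKeys({ arr: [{ my_key: "609968029" }] });
// => { arr: [ { myKey: '609968029' } ] }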

how to load an image from url into buffer in nodejs

I am new to Node.js and am trying to set up a server where I get the EXIF information from an image. My images are on S3, so I want to be able to just pass in the S3 URL as a parameter and grab the image from it.
I am using the ExifImage project below to get the EXIF info, and according to their documentation:
"Instead of providing a filename of an image in your filesystem you can also pass a Buffer to ExifImage."
How can I load an image into a buffer in Node from a URL so I can pass it to the ExifImage function?
ExifImage Project:
https://github.com/gomfunkel/node-exif
Thanks for your help!
Try setting up request like this:
var request = require('request').defaults({ encoding: null });
request.get(s3Url, function (err, res, body) {
//process exif here
});
Setting encoding to null will cause request to output a buffer instead of a string.
Use axios:
const response = await axios.get(url, { responseType: 'arraybuffer' })
const buffer = Buffer.from(response.data) // response.data is already binary, no encoding argument is needed
import fetch from "node-fetch";
let fimg = await fetch(image.src)
let fimgb = Buffer.from(await fimg.arrayBuffer())
I was able to solve this only after reading that encoding: null is required and providing it as a parameter to request.
This will download the image from the URL and produce a buffer with the image data.
Using the request library:
const request = require('request');
let url = 'http://website.com/image.png';

request({ url, encoding: null }, (err, resp, buffer) => {
  // Use the buffer
  // buffer contains the image data
  // typeof buffer === 'object'
});
Note: omitting encoding: null will result in an unusable string, not a buffer; Buffer.from won't work correctly either.
This was tested with Node 8
Use the request library.
request('<s3imageurl>', function(err, response, buffer) {
// Do something
});
Also, node-image-headers might be of interest to you. It sounds like it takes a stream, so it might not even have to download the full image from S3 in order to process the headers.
Updated with correct callback signature.
Here's a solution that uses the native https library.
import { get } from "https";

function urlToBuffer(url: string): Promise<Buffer> {
  return new Promise((resolve, reject) => {
    const data: Uint8Array[] = [];
    get(url, (res) => {
      res
        .on("data", (chunk: Uint8Array) => {
          data.push(chunk);
        })
        .on("end", () => {
          resolve(Buffer.concat(data));
        })
        .on("error", (err) => {
          reject(err);
        });
    });
  });
}

const imageUrl = "https://i.imgur.com/8k7e1Hm.png";
const imageBuffer = await urlToBuffer(imageUrl);
Feel free to delete the types if you're looking for javascript.
I prefer this approach because it doesn't rely on 3rd party libraries or the deprecated request library.
request is deprecated and should be avoided if possible.
Good alternatives include got (only for Node.js) and axios (which also supports browsers).
Example of got:
npm install got
Using the async/await syntax:
const got = require('got');
const url = 'https://www.google.com/images/branding/googlelogo/2x/googlelogo_color_272x92dp.png';

(async () => {
  try {
    const response = await got(url, { responseType: 'buffer' });
    const buffer = response.body;
  } catch (error) {
    console.log(error.body);
  }
})();
You can do it this way:
import axios from "axios";

async function getFileContentById(
  download_url: string
): Promise<Buffer> {
  const response = await axios.get(download_url, {
    responseType: "arraybuffer",
  });
  // response.data is an ArrayBuffer, so it can be wrapped directly
  return Buffer.from(response.data);
}
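To close the loop with the original question, the downloaded buffer can then be handed to node-exif. Here is a sketch, assuming the urlToBuffer helper from the https-based answer above and node-exif's documented Buffer support (the S3 URL is a placeholder):
const { ExifImage } = require("exif"); // npm install exif (the node-exif package)

// urlToBuffer is the helper from the https-based answer above.
urlToBuffer("https://my-bucket.s3.amazonaws.com/photo.jpg") // placeholder S3 URL
  .then((imageBuffer) => {
    // Per the node-exif docs, the image option accepts a Buffer instead of a file path.
    new ExifImage({ image: imageBuffer }, (error, exifData) => {
      if (error) {
        console.error("EXIF error:", error.message);
        return;
      }
      console.log(exifData);
    });
  })
  .catch((err) => console.error(err));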
