Issues writing currency symbols to file in nodejs - javascript

I am trying to write to a file:
private async writeToFile(data: any) {
  try {
    fs.writeFile(filePath as string, JSON.stringify(data), 'utf8', (error: any) => {
      if (error) {
        logger.error(`[JSON] Error while saving file: ${error}`);
        return;
      }
      logger.info('The file has been saved!');
    });
  } catch (error) {
    logger.error(`[JSON] Error while saving file: ${error}`);
  }
}
where data is:
var data = [{label:'Egyptian Pound £', value: 'E£'}, {"label":"Albanian Lek-AL","value":"AL"}];
When I write to file, the characters are saved as {label: Egyptian Pound E�, value: E�}
The data array is created from a multi-line string returned from the server:
Egyptian Pound|E£
Albanian Lek|AL
Code to create the data array:
const currencyArr = response
  .split('\n')
  .map(val => val.trim())
  .reduce((arr, currencyString) => {
    arr.push({
      label: currencyString.split('|')[0] + '-' + currencyString.split('|')[1],
      value: currencyString.split('|')[1]
    });
    return arr;
  }, []);
this.writeToFile(currencyArr);
I am not sure why this is happening. As per the docs, Node supports UTF-8 encoding by default.

The only reason I can find for this kind of thing happening is that your JS file itself is not encoded in UTF-8.
Make sure the JS file is saved with UTF-8 encoding, so the strings in your script can be written out in the corresponding encoding.
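A quick way to narrow this down (my own sketch, not part of the original answer) is to inspect the raw bytes of the string before fs.writeFile is even involved; if the replacement character is already there, the corruption happened before the write:
// '£' is the two bytes 0xc2 0xa3 in UTF-8. If you instead see 0xef 0xbf 0xbd
// (the UTF-8 replacement character U+FFFD), the string was already corrupted
// before fs.writeFile was called, e.g. by decoding the server response with
// the wrong encoding.
console.log(Buffer.from('Egyptian Pound £', 'utf8'));
// <Buffer 45 67 79 70 74 69 61 6e 20 50 6f 75 6e 64 20 c2 a3>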

Related

How can I duplicate a file (completely) but change its name?

I need to generate smaller previews of the image files that will be uploaded, and I have to append "_preview" to the end of each file name.
Currently I'm doing this:
uploadFile.map((file) => {
  if (file.type.includes('image')) {
    console.log('Generating thumbnail for ' + file.name)
    const fileName = file.name.split('.').slice(0, -1).join('.')
    const fileExtension = file.name.split('.').pop()
    const compressedFile = new File(
      [file.slice(0, file.size, file.type)],
      fileName + '_preview.' + fileExtension,
    )
    console.log('Generated file:', compressedFile)
    convert({
      file: compressedFile,
      width: 300,
      height: 300,
      type: fileExtension,
    })
      .then((resp) => {
        uploadFile.push(resp)
      })
      .catch((error) => {
        // Error
        console.error('Error compressing ', file.name, '-', error)
      })
  }
})
The problem is that "compressedFile" is missing some fields which were present in the original file, and so the convert function throws the error "File type not supported". As you can see, "type" and "webkitRelativePath" are not copied.
Can anybody suggest how I can retain all the information from the original file and just append _preview at the end of file name?
I realized the File API allows passing an "options" object as well, which can specify the file type. For instance:
const file = new File(["foo"], "foo.txt", {
type: "text/plain",
});
Source: https://developer.mozilla.org/en-US/docs/Web/API/File/File
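Applied to the case above, a minimal sketch (untested) would pass the original file's type and lastModified through the options object, so the copy keeps its MIME type and only the name changes:
// Carry the original file's metadata over to the renamed copy; without the
// options object the new File gets an empty type, which is what broke convert().
const compressedFile = new File(
  [file.slice(0, file.size, file.type)],
  fileName + '_preview.' + fileExtension,
  { type: file.type, lastModified: file.lastModified }
)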
To copy or duplicate a file in Node.js, you can use this code:
// copyfile.js
const fs = require('fs');
// destination will be created or overwritten by default.
// Note: backslashes must be escaped in JS string literals.
fs.copyFile('C:\\folderA\\myfile.txt', 'C:\\folderB\\myfile.txt', (err) => {
  if (err) throw err;
  console.log('File was copied to destination');
});

Read CSV over SSH and convert to JSON

This is a duplicate of this question here
Here is the code I'm trying to work with:
let Client = require('ssh2-sftp-client');
let sftp = new Client();
var csv = require("csvtojson");

sftp.connect({
  host: 'HOST',
  port: 'PORT',
  username: 'USERNAME',
  password: 'PASSWORD'
}).then(() => {
  return sftp.get('/home/user/etc/testfile.csv');
}).then((data) => {
  csv()
    .fromString(data.toString()) // changed this from .fromStream(data)
    .subscribe(function (jsonObj) { // a single JSON object will be emitted for each CSV line
      // parse each json asynchronously
      return new Promise(function (resolve, reject) {
        resolve()
        console.log(jsonObj);
      })
    })
}).catch((err) => {
  console.log(err, 'catch error');
});
I can read back the CSV data and can see it going into JSON format in console.log(jsonObj), but the data is unreadable: it is all '\x00o\x00n\x00'...
I'm not sure what to do in the line:
// parse each json asynchronously
Could anyone help to figure out how to parse the CSV/JSON after it comes back from the buffer?
The null bytes (\x00) point towards an encoding/decoding issue. The CSV file is probably encoded as UTF-16, but Buffer.toString() decodes the data as UTF-8 by default. You can change that to data.toString('utf16le') (or the alias data.toString('ucs2')) to force the correct encoding.
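In context, a minimal sketch (assuming the file really is UTF-16LE encoded) would decode the buffer explicitly before handing the text to csvtojson:
// Decode the SFTP buffer as UTF-16LE instead of the default UTF-8, then parse;
// csvtojson's fromString() also resolves to an array of row objects.
sftp.get('/home/user/etc/testfile.csv').then((data) => {
  const text = data.toString('utf16le');
  return csv().fromString(text);
}).then((rows) => {
  console.log(rows); // readable objects, one per CSV line, no \x00 bytes
});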

Typescript, await for function

I'm trying to upload files to an S3 bucket using TypeScript and the aws-sdk package. The ideal outcome is:
Upload to S3 is successful, THEN => send the SQS message.
Here is the function for reading the received file and executing the S3 upload:
// on each byte of uploading
file.on("data", function (data) {
  console.log("File [" + fieldname + "] got " + data.length + " bytes");
});
// whenever the upload is finished into this microservice
file.on("end", function () {
  console.log("File [" + fieldname + "] Finished uploading");
});

const s3BucketLink = await saveToS3(filename, file);
// HERE IS IMPORTANT: I basically want to confirm that the status is 200 and only then upload to SQS
if (s3BucketLink.status === 200) {
  console.log("its done");
}
console.log("here");
console.log(s3BucketLink);
});
And in my saveToS3 I have this:
interface S3Response {
  status: number;
  filepath: string;
}

export const saveToS3 = async (filename: string, file: any): Promise<S3Response> => {
  let status = {
    status: 400,
    filepath: ""
  };
  s3.listBuckets(function (err, data) {
    if (err) {
      console.log("Error", err);
    } else {
      console.log("Success", data.Buckets);
    }
  });
  const params = {
    Bucket: bucketName,
    Key: filename, // File name you want to save as in S3
    Body: file
  };
  s3.upload(params, function (err: Error, data: any) {
    if (err) {
      throw err;
    }
    console.log(`File uploaded successfully. ${data.Location}`);
    status = {
      status: 200,
      filepath: data.Location
    };
  });
  return status;
};
I'm basically trying to set the status to 200 and, if that's the case, go ahead and send the SQS message.
The current results in console.log would look like this
[Function]
File [File] got 65002 bytes
here
{ status: 400 }
File [File] got 65536 bytes
File [File] got 65536 bytes
File [File] got 65536 bytes
File [File] got 65536 bytes
File [File] got 65536 bytes
File [File] got 65536 bytes
File [File] got 65536 bytes
File [File] got 65536 bytes
File [File] got 65536 bytes
File [File] got 65536 bytes
File [File] got 43814 bytes
File [File] Finished uploading
Success [ { Name: 'name', CreationDate: 2020-05-31T06:50:22.000Z } ]
File uploaded successfully. https://name.s3.amazonaws.com/file_example_MP3_700KB.mp3
Would really appreciate any help on how to achieve the desired outcome :)
It looks like you've got a couple of issues here.
You're executing a couple of asynchronous methods (s3.listBuckets and s3.upload) without waiting for them to complete. So you are starting those actions and then calling return status;, which is why you're getting that value back early before the file has finished uploading. You'll need to wait for those things to complete before returning.
But both of these methods use callbacks, not promises, and you want your saveToS3 method to return a Promise. So you'll need to wrap both of those method calls. Here's a simplified example of what that looks like (with some code omitted). In this example the method returns a Promise which is only resolved when the callback of s3.upload is fired, meaning that operation has completed or returned an error.
export const saveToS3 = (filename: string, file: any): Promise<S3Response> => {
  return new Promise((resolve, reject) => {
    s3.upload(params, function (err: Error, data: any) {
      if (err) {
        return reject(err);
      }
      resolve({ status: 200, filepath: data.Location });
    });
  });
}
This will cause any await saveToS3() statement to wait for the operation to complete.
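As a side note (not part of the original answer): with the AWS SDK for JavaScript v2, s3.upload(params) also exposes a .promise() method, which avoids the manual wrapper entirely. A minimal sketch, assuming the same bucketName and S3Response interface as above:
export const saveToS3 = async (filename: string, file: any): Promise<S3Response> => {
  // .promise() turns the managed upload into a native Promise (aws-sdk v2)
  const data = await s3.upload({ Bucket: bucketName, Key: filename, Body: file }).promise();
  return { status: 200, filepath: data.Location };
};
Either way, const s3BucketLink = await saveToS3(filename, file); now resolves only after the upload has completed, so the status === 200 check runs at the right time.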

npm package csvtojson CSV Parse Error: Error: unclosed_quote

Node version: v10.19.0
Npm version: 6.13.4
Npm package csvtojson Package Link
csvtojson({
  "delimiter": ";",
  "fork": true
})
  .fromStream(fileReadStream)
  .subscribe((dataObj) => {
    console.log(dataObj);
  }, (err) => {
    console.error(err);
  }, (success) => {
    console.log(success);
  });
While trying to handle a large CSV file (about 1.3 million records), I get the error "CSV Parse Error: Error: unclosed_quote." after a certain number of records (e.g. 400+) have been processed successfully. I don't see any formatting problems in the CSV file itself; however, the parser might be raising this error because of a "\n" character found inside a column/field value.
Is there a solution already available with this package? Or is there a workaround to handle this error? Or is there a way to skip CSV rows with any sort of error (not only this one), so the entire CSV-to-JSON parsing can finish without getting stuck?
Any help will be much appreciated.
I've played about with this, and it's possible to hook into the parsing using csvtojson's file-line hook (preFileLine); there you can check for invalid lines and either repair or simply invalidate them.
The example below will simply skip the invalid lines (missing end quotes)
example.js
const fs = require("fs");
let fileReadStream = fs.createReadStream("test.csv");
let invalidLineCount = 0;
const csvtojson = require("csvtojson");

csvtojson({ "delimiter": ";", "fork": true })
  .preFileLine((fileLineString, lineIdx) => {
    let invalidLinePattern = /^['"].*[^"'];/;
    if (invalidLinePattern.test(fileLineString)) {
      console.log(`Line #${lineIdx + 1} is invalid, skipping:`, fileLineString);
      fileLineString = "";
      invalidLineCount++;
    }
    return fileLineString
  })
  .fromStream(fileReadStream)
  .subscribe((dataObj) => {
    console.log(dataObj);
  },
  (err) => {
    console.error("Error:", err);
  },
  (success) => {
    console.log("Skipped lines:", invalidLineCount);
    console.log("Success");
  });
test.csv
Name;Age;Profession
Bob;34;"Sales,Marketing"
Sarah;31;"Software Engineer"
James;45;Driver
"Billy, ;35;Manager
"Timothy;23;"QA
This regex works better:
/^(?:[^"\\]|\\.|"(?:\\.|[^"\\])*")*$/g
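A quick illustration (my own sketch, reusing lines from test.csv above) of what the pattern accepts and rejects:
// The pattern only matches lines made of unquoted characters, escaped
// characters, and fully closed "..." sections, i.e. lines with balanced quotes.
const balanced = /^(?:[^"\\]|\\.|"(?:\\.|[^"\\])*")*$/;
console.log(balanced.test('Bob;34;"Sales,Marketing"')); // true  (quotes closed)
console.log(balanced.test('"Billy, ;35;Manager'));      // false (quote never closed)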
Here is a more complex working script for big files that reads the file line by line:
import csv from 'csvtojson'
import fs from 'fs-extra'
import lineReader from 'line-reader'
import { __dirname } from '../../../utils.js'

const CSV2JSON = async(dumb, editDumb, headers, {
  options = {
    trim: true,
    delimiter: '|',
    quote: '"',
    escape: '"',
    fork: true,
    headers: headers
  }
} = {}) => {
  try {
    log(`\n\nStarting CSV2JSON - Current directory: ${__dirname()} - Please wait..`)
    await new Promise((resolve, reject) => {
      let firstLine, counter = 0
      lineReader.eachLine(dumb, async(line, last) => {
        counter++
        // log(`line before convert: ${line}`)
        let json = (
          await csv(options).fromString(headers + '\n\r' + line)
            .preFileLine((fileLineString, lineIdx) => {
              // if it is not the first line
              // eslint-disable-next-line max-len
              if (counter !== 1 && !fileLineString.match(/^(?:[^"\\]|\\.|"(?:\\.|[^"\\])*")*$/g)) {
                // eslint-disable-next-line max-len
                console.log(`Line #${lineIdx + 1} is invalid. It has unescaped quotes. We will skip this line.. Invalid Line: ${fileLineString}`)
                fileLineString = ''
              }
              return fileLineString
            })
            .on('error', e => {
              e = `Error while converting CSV to JSON.
Line before convert: ${line}
Error: ${e}`
              throw new BaseError(e)
            })
        )[0]
        // log(`line after convert: ${json}`)
        if (json) {
          json = JSON.stringify(json).replace(/\\"/g, '')
          if (json.match(/^(?:[^"\\]|\\.|"(?:\\.|[^"\\])*")*$/g)) {
            await fs.appendFile(editDumb, json)
          }
        }
        if (last) {
          resolve()
        }
      })
    })
  } catch (e) {
    throw new BaseError(`Error while converting CSV to JSON - Error: ${e}`)
  }
}

export { CSV2JSON }

Getting the value of a Key from a JSON response which is made by parsing XML

I'm trying to parse an XML response from a URL and convert it to JSON. The parsed response has a lot of unwanted data; however, I only need the value of the attribute 'TEXT'.
I tried getting the key by assigning output = JSON.stringify(result); and returning output['TEXT'];, but it gives undefined.
var http = require('http');
var parseString = require('xml2js').parseString;

function parse() {
  http.get('http://data.alexa.com/data?cli=10&url=https://google.com',
    (resp) => {
      let data = '';
      // A chunk of data has been received.
      resp.on('data', (chunk) => {
        data += chunk;
      });
      // The whole response has been received. Print out the result.
      resp.on('end', () => {
        //console.log(JSON.parse(data).explanation);
        parseString(data, function (err, result) {
          console.log(JSON.stringify(result));
          output = JSON.stringify(result);
        });
      });
    }).on("error", (err) => {
      console.log("Error: " + err.message);
    });
  return output['TEXT'];
}
This is the complete JSON that the parse function returns; I think it's not valid. I'm trying to get the 'TEXT' value inside 'POPULARITY':
{"ALEXA":{"$":
{"VER":"0.9","URL":"google.com/","HOME":"0","AID":"=","IDN":
"buymeacoffee.com/"},"SD":[{"POPULARITY":[{"$":
{"URL":"google.com/","TEXT"
:"20242","SOURCE":"panel"}}],"REACH":[{"$":{"RANK":"25887"}}],"RANK":
[{"$":
{"DEL TA":"-21167"}}],"COUNTRY":
[{"$":{"CODE":"US","NAME":"United States","RANK":"2093
5"}}]}]}}
Regarding what went wrong here:
You don't need to stringify the result of parseString. It's an object already and you can use it as is (JSON.stringify returns a string, and indexing a string with ['TEXT'] gives undefined).
Since xml2js wraps repeated elements in arrays (as the JSON above shows), you can access the text via result['ALEXA']['SD'][0]['POPULARITY'][0]['$']['TEXT'].
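Put back into the question's code, a minimal sketch that reads the value inside the parseString callback (the value is only available asynchronously, so returning it from parse() as written will not work):
parseString(data, function (err, result) {
  if (err) return console.error(err);
  // xml2js wraps repeated elements in arrays; '$' holds an element's attributes.
  const text = result.ALEXA.SD[0].POPULARITY[0].$.TEXT;
  console.log(text); // e.g. "20242"
});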
Alternatively:
You can try using camaro. It is made specifically for this purpose: transforming XML to JSON and taking only the properties you're interested in.
const { transform } = require('camaro')

const xml = `
<?xml version="1.0" encoding="UTF-8"?>
<!-- Need more Alexa data? Find our APIs here: https://aws.amazon.com/alexa/ -->
<ALEXA VER="0.9" URL="google.com/" HOME="0" AID="=" IDN="google.com/">
<SD><POPULARITY URL="google.com/" TEXT="1" SOURCE="panel"/><REACH RANK="1"/><RANK DELTA="+0"/><COUNTRY CODE="US" NAME="United States" RANK="1"/></SD></ALEXA>
`

;(async function () {
  const template = {
    text: 'ALEXA/SD/POPULARITY/#TEXT'
  }
  const result = await transform(xml, template)
  console.log(JSON.stringify(result, null, 4));
})()
Output:
{
    "text": "1"
}
The text property you want is then accessible as result.text.
