I have a problem with an if statement inside a for loop.
I get a list of files from a directory (filesSAS), loop through them, and convert each one from CSV to JSON. After that I check whether the output objects have an id: if there is no id I copy the file (copyFile(dirSas, dirOut, filename)); if an id is present I add a Date and save the result as CSV.
The problem is that in the first iteration it copies the file, but it also executes the saveCSV function, which overwrites my result. What I want to achieve is: if the id is not present, copy the file and be done with that iteration, then move on to the next one. I tried putting saveCSV inside the for loop with no luck.
EDIT: when my for loop hits an object with no id I want to copy the file. When the id is present I want to add a date to it and save it as CSV.
let noId = [{
user:"Mark",
job:"Job"
}]
let withId = [{
id:1,
user:"Mark",
job:"Job"
}]
output:
let withId = [{
id:1,
user:"Mark",
job:"Job",
date: "12-09-2019"
}]
const saveNewFile = async (filesSAS, dirSas, dirOut, dirArchive) => {
  filesSAS.forEach(async filename => {
    const newData = await csv().fromFile(`${dirSas.path}/${filename}`);
    for await (const iterator of newData) {
      if (iterator.Id === null || iterator.Id === undefined) {
        await copyFile(dirSas, dirOut, filename);
      }
      iterator.Date = moment(Date.now()).format("DD-MMM-YYYY");
    }
    await saveCSV(newData, `${dirOut.path}/${filename}`, "output");
  });
};
Regards
JavaScript does have the continue statement, which jumps over one iteration of a loop:
JavaScript Break and Continue
However, I think the fact that your loop is asynchronous makes it hard to keep the loop sequential. You can always just use an if statement to skip the remainder of the loop body.
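For illustration, a minimal sketch of continue in a plain (synchronous) for...of loop, using made-up records rather than your CSV data:

const records = [{ id: 1 }, {}, { id: 3 }];

for (const rec of records) {
  if (rec.id === undefined) {
    continue; // no id: skip the rest of this iteration
  }
  console.log(`record ${rec.id} has an id`);
}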
What you want to do can be achieved with an if/else, so that saveCSV only runs for records that actually have an id:
const saveNewFile = async (filesSAS, dirSas, dirOut, dirArchive) => {
  filesSAS.forEach(async filename => {
    const newData = await csv().fromFile(`${dirSas.path}/${filename}`);
    for await (const iterator of newData) {
      if (iterator.Id === null || iterator.Id === undefined) {
        await copyFile(dirSas, dirOut, filename);
      } else {
        iterator.Date = moment(Date.now()).format("DD-MMM-YYYY");
        await saveCSV(newData, `${dirOut.path}/${filename}`, "output");
      }
    }
  });
};
Hey, so I managed to do it with a refactor:
const saveNewFile = async (filesSAS, dirSas, dirOut, dirArchive) => {
filesSAS.forEach(async filename => {
const newData = await csv().fromFile(`${dirSas.path}/${filename}`);
if (!newData[0].hasOwnProperty(pkCol)) {
copyFile(dirSas, dirOut, filename);
} else {
for await (const rec of newData) {
rec.DateCreated = moment(Date.now()).format("DD-MMM-YYYY");
}
await saveCSV(newData, `${dirOut.path}/${filename}`, "output");
}
});
};
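One caveat with both versions: Array.prototype.forEach does not wait for an async callback, so saveNewFile resolves before any file has actually been processed. If the caller needs to await the whole thing, a minimal sketch of the same refactor driven by a for...of loop instead (same assumptions about csv, copyFile, saveCSV and pkCol as above):

const saveNewFile = async (filesSAS, dirSas, dirOut, dirArchive) => {
  for (const filename of filesSAS) {
    const newData = await csv().fromFile(`${dirSas.path}/${filename}`);
    if (!newData[0].hasOwnProperty(pkCol)) {
      await copyFile(dirSas, dirOut, filename);
    } else {
      for (const rec of newData) {
        rec.DateCreated = moment(Date.now()).format("DD-MMM-YYYY");
      }
      await saveCSV(newData, `${dirOut.path}/${filename}`, "output");
    }
  }
};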
This question already has answers here:
Using async/await with a forEach loop
(33 answers)
Closed 1 year ago.
I'm making a program that consists of three different functions:
downloadPDF: download a PDF from the web
getPDF: read and parse the pdf
getData: loop through getPDF
The problem I'm having is that the third function (getData), which has a for...of loop that runs getPDF, doesn't seem to let getPDF finish before trying to console.log the result that getPDF returns.
Here are the three functions:
async function downloadPDF(pdfURL, outputFilename) {
let pdfBuffer = await request.get({uri: pdfURL, encoding: null});
console.log("Writing downloaded PDF file to " + outputFilename + "...");
fs.writeFileSync(outputFilename, pdfBuffer);
}
async function getPDF(query, siteName, templateUrl, charToReplace) {
const currentWeek = currentWeekNumber().toString();
await downloadPDF(templateUrl.replace(charToReplace, currentWeek), "temp/pdf.pdf");
var resultsArray = []
let dataBuffer = fs.readFileSync("temp/pdf.pdf");
pdf(dataBuffer).then(function(data) {
pdfContent = data.text;
const splittedArray = pdfContent.split("\n");
const parsedArray = splittedArray.map((item, index) => {
if(item.includes(query)) {
resultsArray.push({result: item, caseId: splittedArray[index-1].split(',', 1)[0], site: siteName});
}
}).filter(value => value);
return(resultsArray);
});
fs.unlinkSync("temp/pdf.pdf"); //deletes the downloaded file
}
async function getData(query, desiredSites) {
var resultsArray = []
for (const value of desiredSites) {
let result = await getPDF(query, sitesList.sites[value].name, sitesList.sites[value].templateUrl, sitesList.sites[value].charToReplace);
console.log(result)
}
}
getData("test", ['a', 'b']);
In the bottom function (getData), the console.log results in undefined.
I'm guessing this has something to do with the promises. Any ideas? Thanks a lot!
In getPDF, you should chain all your async functions with await instead of .then or vice versa.
You can mix await with .then, but then it is not easy to keep the chain linear. The reason people use await is that it makes the code read linearly and stay easy to maintain.
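To make the difference concrete, a minimal sketch of the same pdf() step written both ways (both assumed to be inside an async function):

// .then style: the surrounding async function moves on immediately,
// so code placed after this statement cannot rely on `data`
pdf(dataBuffer).then(data => {
  console.log(data.text);
});

// await style: execution pauses here until pdf() resolves
const data = await pdf(dataBuffer);
console.log(data.text);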
async function downloadPDF(pdfURL, outputFilename) {
let pdfBuffer = await request.get({ uri: pdfURL, encoding: null });
console.log("Writing downloaded PDF file to " + outputFilename + "...");
fs.writeFileSync(outputFilename, pdfBuffer);
}
async function getPDF(query, siteName, templateUrl, charToReplace) {
  const currentWeek = currentWeekNumber().toString();
  await downloadPDF(
    templateUrl.replace(charToReplace, currentWeek),
    "temp/pdf.pdf"
  );
  const dataBuffer = fs.readFileSync("temp/pdf.pdf");
  const data = await pdf(dataBuffer);
  const pdfContent = data.text;
  const splittedArray = pdfContent.split("\n");
  // map with the original index first (caseId comes from the previous line),
  // then drop the entries that did not match the query
  const resultsArray = splittedArray
    .map((item, index) =>
      item.includes(query)
        ? {
            result: item,
            caseId: splittedArray[index - 1].split(",", 1)[0],
            site: siteName,
          }
        : null
    )
    .filter(Boolean);
  fs.unlinkSync("temp/pdf.pdf"); // deletes the downloaded file
  return resultsArray;
}
async function getData(query, desiredSites) {
for (const value of desiredSites) {
let result = await getPDF(
query,
sitesList.sites[value].name,
sitesList.sites[value].templateUrl,
sitesList.sites[value].charToReplace
);
console.log(result);
}
}
getData("test", ["a", "b"])
.then(() => console.log("done"))
.catch(console.log);
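If the per-site results are independent and the order of the logs does not matter, the calls could also run concurrently; a hedged sketch using Promise.all with the same getPDF and sitesList as above (note that the shared temp/pdf.pdf path would then need to be made unique per call so the downloads don't overwrite each other):

async function getData(query, desiredSites) {
  const results = await Promise.all(
    desiredSites.map(value =>
      getPDF(
        query,
        sitesList.sites[value].name,
        sitesList.sites[value].templateUrl,
        sitesList.sites[value].charToReplace
      )
    )
  );
  results.forEach(result => console.log(result));
}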
Good Afternoon,
I am using the MERN stack to make a simple invoice application.
I have a function that runs two nested forEach() loops, going through the invoices in the DB and the users; if the emails match, it collects the invoices for that user.
When I log DBElement to the console it works and has the proper data, but when I log test1 to the console (from app.get()) it only has one object, not both.
// forEach() function
function matchUserAndInvoice(dbInvoices, dbUsers) {
dbInvoices.forEach((DBElement) => {
dbUsers.forEach((userElement) => {
if(DBElement.customer_email === userElement.email){
const arrayNew = [DBElement];
arrayNew.push(DBElement);
app.set('test', arrayNew);
}
})
})
}
// end point that triggers the function and uses the data.
app.get('/test', async (req,res) => {
const invoices = app.get('Invoices');
const users = await fetchUsersFromDB().catch((e) => {console.log(e)});
matchUserAndInvoice(invoices,users,res);
const test1 = await app.get('test');
console.log(test1);
res.json(test1);
})
function matchUserAndInvoice(dbInvoices, dbUsers) {
let newArray = [];
dbInvoices.forEach((DBElement) => {
dbUsers.forEach(async(userElement) => {
if(DBElement.customer_email === userElement.email){
newArray.push(DBElement);
app.set('test', newArray);
}
})
})
}
app.set('test', DBElement); overwrites the previously stored DBElement, so only the last matching DBElement ends up in test1.
If you want test to contain all matching DBElements, you should set it to an array, and then append a new DBElement to the array each time it matches inside the loop:
if(DBElement.customer_email === userElement.email){
let newArray = await app.get('test');
newArray.push(DBElement);
app.set('test', newArray);
}
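Note that this snippet assumes 'test' was already set to an array before the loops run; otherwise app.get('test') returns undefined and the push throws. A minimal sketch of that setup (app.set/app.get here are Express's ordinary application settings, and the await on app.get is not needed because it is synchronous):

// once, before matchUserAndInvoice is ever called
app.set('test', []);

// inside the nested forEach loops
if (DBElement.customer_email === userElement.email) {
  const newArray = app.get('test'); // synchronous, no await needed
  newArray.push(DBElement);
  app.set('test', newArray);
}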
Right now, I've coded a function that goes like this:
async function checkPlayerScam(ign) {
const UUID = await getUUID(ign);
if(MATCHING){
playerIsScammer = true
}
else {
playerIsScammer = false
}
}
The MATCHING is just a placeholder at the moment. I want to check their UUID, and make sure it isn't in this list: https://raw.githubusercontent.com/skyblockz/pricecheckbot/master/scammer.json
Any idea how? It needs to be relatively fast
EDIT: It'd also be cool if I could get the reason from the list, but that's not as necessary
https://lodash.com/docs/#find
Use lodash _.find to look up the matching entry:
const uuid = '000c97aaf948417a9a74d6858c01aaae'; // uuid you want to find
const scammer = _.find(scammersList, o => o.uuid === uuid);
if (scammer) { // if scammer found
console.log(scammer);
console.log(scammer.reason)
}
For anyone wondering, this is how I solved it:
async function checkPlayerScam(ign) {
  const UUID = await getUUID(ign);
  const response = await fetch(`https://raw.githubusercontent.com/skyblockz/pricecheckbot/master/scammer.json`);
  const result = await response.json();
  if (result[UUID] == null) {
    playerIsScammer = false;
  } else {
    playerIsScammer = true;
  }
}
This function will fetch the data, then check whether the given uuid (for example 1d0c0ef4295047b39f0fa899c485bd00) exists. Assuming that you already fetched the data somewhere else and stored it, all you need to do is check if a given uuid exists by adding the following line where you please:
!!data[uuidToCheck]
uuidToCheck should be the uuid string that you are looking for.
This line will return true if the uuid exists and false otherwise.
In terms of time and space complexity, the lookup runs in constant time [O(1)] and uses O(N) space for the fetched data. This is the fastest you can get it to run.
data[uuidToCheck].reason will return the reason.
async function playerIsScammer(uuidToCheck) {
  // e.g. uuidToCheck = '1d0c0ef4295047b39f0fa899c485bd00'
  const response = await fetch('https://raw.githubusercontent.com/skyblockz/pricecheckbot/master/scammer.json');
  if (response.ok) {
    let data = await response.json();
    if (!!data[uuidToCheck])
      return data[uuidToCheck].reason;
    return false;
  }
}
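A possible way to call it (my own sketch, assuming the function above): because it resolves to the reason string when the UUID is on the list and to false otherwise, the result can be used directly as a truthy check:

(async () => {
  const reason = await playerIsScammer('1d0c0ef4295047b39f0fa899c485bd00');
  if (reason) {
    console.log(`Player is on the scammer list: ${reason}`);
  } else {
    console.log('Player is not on the scammer list');
  }
})();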
consider this scenario:
I have 2 csv files, each one is sorted and contains the id field.
I need to join the rows using the id field. Because the files are already sorted by the id I wanted to perform merge join (https://en.wikipedia.org/wiki/Sort-merge_join).
For that I need to have a way to load some portion of both files, process it and iteratively load more again from one or both files.
(The files are big and would not fit into memory so only streaming approach will work).
The problem is which Node API to use. readline will not work because of https://github.com/nodejs/node/issues/33463. Any other ideas?
I had to do something quite similar recently and decided to use the node-line-reader module, which has a simpler interface than the built-in readline. I then created a little recursive function that determines which file to read from next by comparing the id of each CSV entry of each provided file. After that the corresponding line gets written out to the target file, and the method is called again until all lines of all files are processed. Here's the whole class I ended up with:
const fs = require('fs');
const LineReader = require('node-line-reader').LineReader;
class OrderedCsvFileMerger {
constructor(files, targetFile) {
this.lineBuffer = [];
this.initReaders(files);
this.initWriter(targetFile);
}
initReaders(files) {
this.readers = files.map(file => new LineReader(file));
}
initWriter(targetFile) {
this.writer = fs.createWriteStream(targetFile);
}
async mergeFiles() {
// initially read first line from all files
for (const reader of this.readers) {
this.lineBuffer.push(await this.nextLine(reader));
}
return this.merge();
}
async nextLine(reader) {
return new Promise((resolve, reject) => {
reader.nextLine(function (err, line) {
if (err) reject(err);
resolve(line);
});
})
}
async merge() {
if (this.allLinesProcessed()) {
return;
}
let currentBufferIndex = -1;
let minRowId = Number.MAX_VALUE;
for (let i = 0; i < this.lineBuffer.length; i++) {
const currentRowId = parseInt(this.lineBuffer[i]); // implement parsing logic if your lines do not start
// with an integer id
if (currentRowId < minRowId) {
minRowId = currentRowId;
currentBufferIndex = i;
}
}
const line = this.lineBuffer[currentBufferIndex];
this.writer.write(line + "\n");
this.lineBuffer[currentBufferIndex] = await this.nextLine(this.readers[currentBufferIndex]);
return this.merge();
}
allLinesProcessed() {
return this.lineBuffer.every(l => !l);
}
}
(async () => {
  const input = ['./path/to/csv1.csv', './path/to/csv2.csv'];
  const target = './path/to/target.csv';
  const merger = new OrderedCsvFileMerger(input, target);
  await merger.mergeFiles();
  console.log("Files were merged successfully!");
})().catch(err => {
  console.log(err);
});
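As the comment inside merge() notes, parseInt(this.lineBuffer[i]) only works when every line starts with the numeric id. If the id lives in another column, a small hedged sketch of a helper that could replace that call (assuming plain comma-separated lines and an id at a known column index):

// hypothetical helper: extract the numeric id from a given CSV column
function parseRowId(line, idColumnIndex = 0, separator = ',') {
  const columns = line.split(separator);
  return parseInt(columns[idColumnIndex], 10);
}

// inside merge():
// const currentRowId = parseRowId(this.lineBuffer[i], 2); // id in the third column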
async onSubmit(formValue) {
this.isSubmitted = true;
if(this.selectedImageArray.length > 0) { // 4 images in this array
for (let index = 0; index < this.selectedImageArray.length; index++) { // Loop through this image array
await new Promise(resolve => {
setTimeout(()=> {
console.log('This is iteration ' + index);
var filePath = `images/tours/${this.selectedImageArray[index].name.split('.').slice(0,-1).join('.')}_${new Date(). getTime()}`;
const fileRef = this.storage.ref(filePath);
this.storage.upload(filePath, this.selectedImageArray[index]).snapshotChanges().pipe(
finalize(() => {
fileRef.getDownloadURL().subscribe((url) => {
formValue[`imageUrl${index+1}`] = url;
console.log(url);
});
})
).subscribe()
resolve();
}, 3000);
});
}
console.log('After loop execution');
// this.value(formValue);
}
}
After submitting, the code downloads and prints three URLs, then prints 'After loop execution', and only after that prints the fourth URL (as shown by the line numbers in the console screenshot). I don't understand why.
What I want is for the code to execute in sequence: only after all images have been downloaded and their URLs collected should execution continue past the loop.
I wrote another version of this that hopefully works as you expect it to.
First we create an array of all the storage upload snapshot observables.
Then we use concat() to run them all in sequence. (If you change concat() to merge(), they will all go at once.)
Then we use mergeMap to jump over to getDownloadURL.
Then in the subscribe we add the url to formValue.
Finally, in the finalize we call this.value(formValue).
onSubmit(formValue) {
  // assumes: import { combineLatest, concat, of } from 'rxjs';
  //          import { mergeMap, map, finalize } from 'rxjs/operators';
  const snapshotObservables = this.selectedImageArray.map((selectedImage, index) => { // 4 images in this array
    const filePath = `images/tours/${selectedImage.name.split('.').slice(0, -1).join('.')}_${new Date().getTime()}`;
    // pair each upload's snapshot stream with its filePath and index
    return combineLatest([
      this.storage.upload(filePath, selectedImage).snapshotChanges(),
      of(filePath),
      of(index)
    ]);
  });

  concat(...snapshotObservables).pipe(
    mergeMap(([snapshot, filePath, index]) => {
      const fileRef = this.storage.ref(filePath);
      return fileRef.getDownloadURL().pipe(map(url => ({ url, index })));
    }),
    finalize(() => {
      this.value(formValue);
    })
  ).subscribe(({ url, index }) => {
    formValue[`imageUrl${index + 1}`] = url;
  });
}
I wrote a new function for multiple file upload
public multipleFileUpload(event, isEncodeNeeded?: Boolean):Array<any> {
if(!isEncodeNeeded){
isEncodeNeeded=false;
}
let fileList = [];
for (let index = 0; index < event.target.files.length; index++) {
let returnData = {};
let file: File = event.target.files[index];
let myReader: FileReader = new FileReader();
returnData['documentName'] = event.target.files[index]['name'];
returnData['documentType'] = event.target.files[index]['type'];
myReader.addEventListener("load", function (e) {
if (myReader.readyState == 2) {
returnData['document'] = isEncodeNeeded ? btoa(e.target['result']) : e.target['result'];
}
});
myReader.readAsBinaryString(file);
fileList.push(returnData);
}
return fileList;
}
In this function, event is the change event of the file input, and isEncodeNeeded indicates whether encoding is needed. If it is true, the content is converted to base64 format.
The output format is
[{
"document": documentbyte,
"documentName": document name,
"documentType": file format
}]
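One thing to be aware of: FileReader loads asynchronously, so the returned fileList may not have its document fields populated yet if the caller reads it immediately. A hedged sketch of a promise-based variant with the same output shape (same class context and parameters assumed), so callers can await it:

public multipleFileUpload(event, isEncodeNeeded: Boolean = false): Promise<Array<any>> {
  const files: File[] = Array.from(event.target.files);
  return Promise.all(files.map(file => new Promise<any>((resolve, reject) => {
    const myReader = new FileReader();
    myReader.onload = () => resolve({
      documentName: file.name,
      documentType: file.type,
      document: isEncodeNeeded ? btoa(myReader.result as string) : myReader.result
    });
    myReader.onerror = reject;
    myReader.readAsBinaryString(file);
  })));
}

The caller would then use const fileList = await this.multipleFileUpload(event) instead of reading the array synchronously.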