Consider this scenario:
I have 2 CSV files; each one is sorted and contains the id field.
I need to join the rows using the id field. Because the files are already sorted by id, I wanted to perform a merge join (https://en.wikipedia.org/wiki/Sort-merge_join).
For that I need a way to load some portion of both files, process it, and iteratively load more from one or both files.
(The files are big and would not fit into memory, so only a streaming approach will work.)
The problem is the Node API: what should I use? readline will not work because of https://github.com/nodejs/node/issues/33463. Any other ideas?
I had to do something quite similar recently and decided to use the node-line-reader module, which has a simpler interface than the built-in readline. I then created a small recursive function that determines which file to read from next by comparing the id of the current CSV entry of each provided file. The corresponding line is then written out to the target file, and the method is called again until all lines of all files are processed. Here's the class I ended up with:
const fs = require('fs');
const LineReader = require('node-line-reader').LineReader;

class OrderedCsvFileMerger {
    constructor(files, targetFile) {
        this.lineBuffer = [];
        this.initReaders(files);
        this.initWriter(targetFile);
    }

    initReaders(files) {
        this.readers = files.map(file => new LineReader(file));
    }

    initWriter(targetFile) {
        this.writer = fs.createWriteStream(targetFile);
    }

    async mergeFiles() {
        // initially read the first line from all files
        for (const reader of this.readers) {
            this.lineBuffer.push(await this.nextLine(reader));
        }
        return this.merge();
    }

    async nextLine(reader) {
        return new Promise((resolve, reject) => {
            reader.nextLine((err, line) => {
                if (err) return reject(err);
                resolve(line);
            });
        });
    }

    async merge() {
        if (this.allLinesProcessed()) {
            return;
        }
        let currentBufferIndex = -1;
        let minRowId = Number.MAX_VALUE;
        for (let i = 0; i < this.lineBuffer.length; i++) {
            // adjust the parsing logic if your lines do not start with an integer id
            const currentRowId = parseInt(this.lineBuffer[i]);
            if (currentRowId < minRowId) {
                minRowId = currentRowId;
                currentBufferIndex = i;
            }
        }
        const line = this.lineBuffer[currentBufferIndex];
        this.writer.write(line + "\n");
        this.lineBuffer[currentBufferIndex] = await this.nextLine(this.readers[currentBufferIndex]);
        return this.merge();
    }

    allLinesProcessed() {
        return this.lineBuffer.every(l => !l);
    }
}
(async () => {
    const input = ['./path/to/csv1.csv', './path/to/csv2.csv'];
    const target = './path/to/target.csv';
    const merger = new OrderedCsvFileMerger(input, target);
    await merger.mergeFiles();
    console.log("Files were merged successfully!");
})().catch(err => {
    console.log(err);
});
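One thing to watch out for: merge() calls itself once per output line, so for very large files you may prefer an iterative variant that avoids building up one pending promise per line. A minimal sketch, assuming the same class fields as above:

async merge() {
    while (!this.allLinesProcessed()) {
        let currentBufferIndex = -1;
        let minRowId = Number.MAX_VALUE;
        for (let i = 0; i < this.lineBuffer.length; i++) {
            // parseInt gives NaN for exhausted readers, which never wins the comparison
            const currentRowId = parseInt(this.lineBuffer[i]);
            if (currentRowId < minRowId) {
                minRowId = currentRowId;
                currentBufferIndex = i;
            }
        }
        this.writer.write(this.lineBuffer[currentBufferIndex] + "\n");
        this.lineBuffer[currentBufferIndex] =
            await this.nextLine(this.readers[currentBufferIndex]);
    }
}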
Hi, how can I break out of the for loop? I want to be able to break out of it in the callback, in the if statement.
I want this program to create a folder in the given directory. Every time it throws an error, I want it to change the folder name by appending a number, so that when the folder already exists it keeps trying new names until one doesn't throw an error.
I will check for the error code later; help me solve this first.
const fs = require('fs');
const path = require('path');

function folder(folderName) {
    for (let i = 1; i <= 10; i++) {
        let pathNumber = i;
        let fullPath = folderName + pathNumber;
        fs.mkdir(path.join("D:", fullPath), (err) => {
            if (!err) {
                return; // I want to break out of the loop here
            }
        });
    }
}

folder("folder");
You can't write the code that way because the for loop will already be done before any of the fs.mkdir() callbacks are called. They are asynchronous and happen LATER.
If you want to execute one iteration of the loop, including the fs.mkdir() before moving on to any other, then you can use async/await with fs.promises.mkdir().
Here's what a solution could look like with fs.promises.mkdir(). I've also added error handling for the case where all 10 sub-dir names you're trying already exist.
const fs = require('fs');
const path = require('path');

async function folder(folderName) {
    let lastError;
    for (let pathNumber = 1; pathNumber <= 10; pathNumber++) {
        let fullPath = path.join("D:", folderName + pathNumber);
        try {
            await fs.promises.mkdir(fullPath);
            return fullPath;
        } catch (e) {
            lastError = e;
            // ignore the error so we keep trying other numbers
        }
    }
    throw lastError;
}

folder("folder").then(fullPath => {
    console.log(`dir created: ${fullPath}`);
}).catch(err => {
    console.log(err);
});
Much simpler without await:
const fs = require('fs');
const path = require('path');

const numFolders = 10;
const folders = Array.from(Array(numFolders), (_, i) => `folder${i + 1}`);
let cnt = 0;

const makeFolder = () => {
    if (cnt >= folders.length) return; // stop because done
    fs.mkdir(path.join("D:", folders[cnt]), (err) => {
        cnt++;
        if (err) {
            makeFolder(); // only call again if error
        }
    });
};

makeFolder();
Problem Statement:
Our aim is to fill the array ytQueryAppJs with values returned from a time-consuming function, httpsYtGetFunc().
The values in ytQueryAppJs need to be used many times further along in the code, so the array needs to be filled before the code proceeds.
There are many other arrays like ytQueryAppJs; one of them is ytCoverAppJs, which needs to be assigned values the same way as ytQueryAppJs.
The values in ytCoverAppJs further require the use of values from ytQueryAppJs. So a solution with clean code would be highly appreciated.
(I am an absolute beginner. I have never used async, await or promises and I'm unaware of the correct way to use it. Please guide.)
Flow (to focus on):
The user submits a queryValue in index.html.
An array ytQueryAppJs is logged in console, based on the query.
Expected Log in Console (similar to): an array of extracted video ids, e.g. [ 'videoId1', 'videoId2', ... ]
Current Log in Console: ytQueryAppJs: Promise { <pending> }
Flow (originally required by the project):
User submits query in index.html.
The values of arrays, ytQueryAppJs, ytCoverAppJs, ytCoverUniqueAppJs, ytLiveAppJs, ytLiveUniqueAppJs gets logged in the console, based on the query.
Code to focus on, from 'app.js':
// https://stackoverflow.com/a/14930567/14597561
function compareAndRemove(removeFromThis, compareToThis) {
    return removeFromThis.filter(val => !compareToThis.includes(val));
}
const https = require('https');

// Declaring variables for the function 'httpsYtGetFunc'
let apiKey = "";
let urlOfYtGetFunc = "";
let resultOfYtGetFunc = "";
let extractedResultOfYtGetFunc = [];

// This function GETs data, parses it, and pushes the required values into an array.
async function httpsYtGetFunc(queryOfYtGetFunc) {
    apiKey = "AI...MI";
    urlOfYtGetFunc = "https://www.googleapis.com/youtube/v3/search?key=" + apiKey + "&part=snippet&q=" + queryOfYtGetFunc + "&maxResults=4&order=relevance&type=video";
    let promise = new Promise((resolve, reject) => {
        // GETting data and storing it in chunks.
        https.get(urlOfYtGetFunc, (response) => {
            const chunks = [];
            response.on('data', (d) => {
                chunks.push(d);
            });
            // Parsing the chunks
            response.on('end', () => {
                resultOfYtGetFunc = JSON.parse(Buffer.concat(chunks).toString());
                // console.log(resultOfYtGetFunc)
                // Extracting useful data, and allocating it.
                for (let i = 0; i < resultOfYtGetFunc.items.length; i++) {
                    extractedResultOfYtGetFunc[i] = resultOfYtGetFunc.items[i].id.videoId;
                    // console.log(extractedResultOfYtGetFunc);
                }
                resolve(extractedResultOfYtGetFunc);
            });
        });
    });
    let result = await promise;
    return result;
}
app.post("/", function(req, res) {
    // Accessing the queryValue the user submitted in index.html. We're using the body-parser package here.
    query = req.body.queryValue;
    // Fetching top results related to the user's query and putting them in the array.
    ytQueryAppJs = httpsYtGetFunc(query);
    console.log("ytQueryAppJs:");
    console.log(ytQueryAppJs);
});
Complete app.post method from app.js:
(For better understanding of the problem.)
app.post("/", function(req, res) {
    // Accessing the queryValue the user submitted in index.html.
    query = req.body.queryValue;

    // Fetching top results related to the user's query and putting them in the array.
    ytQueryAppJs = httpsYtGetFunc(query);
    console.log("ytQueryAppJs:");
    console.log(ytQueryAppJs);

    // Fetching 'cover' songs related to the user's query and putting them in the array.
    if (query.includes("cover") == true) {
        ytCoverAppJs = httpsYtGetFunc(query);
        console.log("ytCoverAppJs:");
        console.log(ytCoverAppJs);
        // Removing redundant values.
        ytCoverUniqueAppJs = compareAndRemove(ytCoverAppJs, ytQueryAppJs);
        console.log("ytCoverUniqueAppJs:");
        console.log(ytCoverUniqueAppJs);
    } else {
        ytCoverAppJs = httpsYtGetFunc(query + " cover");
        console.log("ytCoverAppJs:");
        console.log(ytCoverAppJs);
        // Removing redundant values.
        ytCoverUniqueAppJs = compareAndRemove(ytCoverAppJs, ytQueryAppJs);
        console.log("ytCoverUniqueAppJs:");
        console.log(ytCoverUniqueAppJs);
    }

    // Fetching 'live performances' related to the user's query and putting them in the array.
    if (query.includes("live") == true) {
        ytLiveAppJs = httpsYtGetFunc(query);
        console.log("ytLiveAppJs:");
        console.log(ytLiveAppJs);
        // Removing redundant values.
        ytLiveUniqueAppJs = compareAndRemove(ytLiveAppJs, ytQueryAppJs.concat(ytCoverUniqueAppJs));
        console.log("ytLiveUniqueAppJs:");
        console.log(ytLiveUniqueAppJs);
    } else {
        ytLiveAppJs = httpsYtGetFunc(query + " live");
        console.log("ytLiveAppJs:");
        console.log(ytLiveAppJs);
        // Removing redundant values.
        ytLiveUniqueAppJs = compareAndRemove(ytLiveAppJs, ytQueryAppJs.concat(ytCoverUniqueAppJs));
        console.log("ytLiveUniqueAppJs:");
        console.log(ytLiveUniqueAppJs);
    }

    // Emptying all the arrays.
    ytQueryAppJs.length = 0;
    ytCoverAppJs.length = 0;
    ytCoverUniqueAppJs.length = 0;
    ytLiveAppJs.length = 0;
    ytLiveUniqueAppJs.length = 0;
});
Unfortunately you can't use async/await with the http module when making requests. You can install and use the axios module instead. In your case it will be something like this:
const axios = require('axios');

// Declaring variables for the function 'httpsYtGetFunc'
let apiKey = "";
let urlOfYtGetFunc = "";
let resultOfYtGetFunc = "";
let extractedResultOfYtGetFunc = [];

// This function GETs data, parses it, and pushes the required values into an array.
async function httpsYtGetFunc(queryOfYtGetFunc) {
    apiKey = "AI...MI";
    urlOfYtGetFunc = "https://www.googleapis.com/youtube/v3/search?key=" + apiKey + "&part=snippet&q=" + queryOfYtGetFunc + "&maxResults=4&order=relevance&type=video";
    const promise = axios.get(urlOfYtGetFunc)
        .then(data => {
            // do your data manipulations here
        })
        .catch(err => {
            // decide what happens on error
        });
}
Or with async/await:
const data = await axios.get(urlOfYtGetFunc);
// Your data variable will become what the API has returned
If you still want to catch errors with async/await, you can use try/catch:
try {
    const data = await axios.get(urlOfYtGetFunc);
} catch (err) {
    // In case of error do something
}
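Putting it together, an async/await version of the whole function might look like this; a sketch only, with error handling left to the caller:

async function httpsYtGetFunc(queryOfYtGetFunc) {
    const apiKey = "AI...MI";
    const urlOfYtGetFunc = "https://www.googleapis.com/youtube/v3/search?key=" + apiKey +
        "&part=snippet&q=" + queryOfYtGetFunc + "&maxResults=4&order=relevance&type=video";
    const response = await axios.get(urlOfYtGetFunc);
    // axios parses the JSON body for us; extract the video ids.
    return response.data.items.map(item => item.id.videoId);
}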
I have just looked at the code, and I think the issue is how you are handling the async code in the request handler. You are not awaiting the result of the call to httpsYtGetFunc in the body, so the handler returns before the promise is finished, which is why you get Promise { <pending> }.
Another issue is that the array extractedResultOfYtGetFunc is not reinitialised between calls, and you may access indexes that don't exist. The method to add an item to an array is push.
To fix this you need to restructure your code slightly. A possible solution is something like this:
// Declaring variables for the function 'httpsYtGetFunc'
let apiKey = "";
let urlOfYtGetFunc = "";
let resultOfYtGetFunc = "";
let extractedResultOfYtGetFunc = [];

// This function GETs data, parses it, and pushes the required values into an array.
function httpsYtGetFunc(queryOfYtGetFunc) {
    apiKey = "AI...MI";
    urlOfYtGetFunc =
        "https://www.googleapis.com/youtube/v3/search?key=" +
        apiKey +
        "&part=snippet&q=" +
        queryOfYtGetFunc +
        "&maxResults=4&order=relevance&type=video";
    return new Promise((resolve, reject) => {
        // GETting data and storing it in chunks.
        https.get(urlOfYtGetFunc, (response) => {
            const chunks = [];
            response.on("data", (d) => {
                chunks.push(d);
            });
            // Parsing the chunks
            response.on("end", () => {
                // Initialising the array
                extractedResultOfYtGetFunc = [];
                resultOfYtGetFunc = JSON.parse(Buffer.concat(chunks).toString());
                // Extracting useful data, and allocating it.
                for (let i = 0; i < resultOfYtGetFunc.items.length; i++) {
                    // Adding the element to the array
                    extractedResultOfYtGetFunc.push(resultOfYtGetFunc.items[i].id.videoId);
                }
                resolve(extractedResultOfYtGetFunc);
            });
        });
    });
}
app.post("/", async function (req, res) {
    query = req.body.queryValue;
    // Fetching top results related to the user's query and putting them in the array.
    ytQueryAppJs = await httpsYtGetFunc(query);
    console.log("ytQueryAppJs:");
    console.log(ytQueryAppJs);
});
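The same await-based pattern should extend to the other arrays from the question, since each step can await the values it depends on. A rough sketch, reusing the question's compareAndRemove helper:

app.post("/", async function (req, res) {
    const query = req.body.queryValue;

    // Each call resolves before the next one starts, so dependent
    // arrays can safely use the earlier results.
    const ytQueryAppJs = await httpsYtGetFunc(query);
    const ytCoverAppJs = await httpsYtGetFunc(
        query.includes("cover") ? query : query + " cover");
    const ytCoverUniqueAppJs = compareAndRemove(ytCoverAppJs, ytQueryAppJs);

    console.log("ytQueryAppJs:", ytQueryAppJs);
    console.log("ytCoverUniqueAppJs:", ytCoverUniqueAppJs);
});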
Another option would be to use axios. The code for this would just be:
app.post("/", async function (req, res) {
    query = req.body.queryValue;
    // Fetching top results related to the user's query and putting them in the array.
    try {
        ytQueryAppJs = await axios.get(url); // replace with your URL
        console.log("ytQueryAppJs:");
        console.log(ytQueryAppJs);
    } catch (e) {
        console.log(e);
    }
});
Using axios would be quicker, as you don't need to write promise wrappers around everything; those are otherwise required because the Node HTTP(S) libraries don't support promises out of the box.
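For reference, such a wrapper is only a few lines. A minimal sketch (the helper name httpsGetJson is made up here):

const https = require('https');

// Hypothetical helper: wraps https.get in a Promise and resolves with parsed JSON.
function httpsGetJson(url) {
    return new Promise((resolve, reject) => {
        https.get(url, (response) => {
            const chunks = [];
            response.on('data', (chunk) => chunks.push(chunk));
            response.on('end', () => {
                try {
                    resolve(JSON.parse(Buffer.concat(chunks).toString()));
                } catch (err) {
                    reject(err);
                }
            });
        }).on('error', reject);
    });
}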
async onSubmit(formValue) {
    this.isSubmitted = true;
    if (this.selectedImageArray.length > 0) { // 4 images in this array
        for (let index = 0; index < this.selectedImageArray.length; index++) { // loop through this image array
            await new Promise(resolve => {
                setTimeout(() => {
                    console.log('This is iteration ' + index);
                    var filePath = `images/tours/${this.selectedImageArray[index].name.split('.').slice(0, -1).join('.')}_${new Date().getTime()}`;
                    const fileRef = this.storage.ref(filePath);
                    this.storage.upload(filePath, this.selectedImageArray[index]).snapshotChanges().pipe(
                        finalize(() => {
                            fileRef.getDownloadURL().subscribe((url) => {
                                formValue[`imageUrl${index + 1}`] = url;
                                console.log(url);
                            });
                        })
                    ).subscribe();
                    resolve();
                }, 3000);
            });
        }
        console.log('After loop execution');
        // this.value(formValue);
    }
}
After submitting, the code downloads and prints 3 urls, then prints 'After loop execution', and only then prints the 4th one; I don't understand why. (See the console output: the line numbers in the image show the order of execution.)
What I want is for the code to execute in sequence: only after all images are uploaded and their urls fetched should execution continue past the loop.
I wrote another version of this that hopefully works as you expect it to.
First we create an array of all the storage upload snapshot observables.
Then we use concat() to run them all in sequence. (If you change concat() to merge(), they will all go at once.)
Then we use mergeMap to jump over to getDownloadURL.
Then in the subscribe we add the url to the formValues.
Finally, in the finalize, we set the class property "value" equal to the formValue.
// requires: import { combineLatest, concat, of } from 'rxjs';
//           import { mergeMap, map, finalize } from 'rxjs/operators';
onSubmit(formValue) {
    const snapshotObservables = this.selectedImageArray.map((selectedImage, index) => { // 4 images in this array
        const filePath = `images/tours/${selectedImage.name.split('.').slice(0, -1).join('.')}_${new Date().getTime()}`;
        // pair each upload snapshot with its file path and index
        return combineLatest([
            this.storage.upload(filePath, selectedImage).snapshotChanges(),
            of(filePath),
            of(index)
        ]);
    });

    concat(...snapshotObservables).pipe(
        mergeMap(([snapshot, filePath, index]) => {
            const fileRef = this.storage.ref(filePath);
            return fileRef.getDownloadURL().pipe(map(url => [url, index]));
        }),
        finalize(() => {
            this.value(formValue);
        })
    ).subscribe(([url, index]) => {
        formValue[`imageUrl${index + 1}`] = url;
    });
}
I wrote a new function for multiple file upload
public multipleFileUpload(event, isEncodeNeeded?: Boolean): Array<any> {
    if (!isEncodeNeeded) {
        isEncodeNeeded = false;
    }
    let fileList = [];
    for (let index = 0; index < event.target.files.length; index++) {
        let returnData = {};
        let file: File = event.target.files[index];
        let myReader: FileReader = new FileReader();
        returnData['documentName'] = event.target.files[index]['name'];
        returnData['documentType'] = event.target.files[index]['type'];
        myReader.addEventListener("load", function (e) {
            if (myReader.readyState == 2) {
                // Populated asynchronously, once the reader has finished loading.
                returnData['document'] = isEncodeNeeded ? btoa(e.target['result']) : e.target['result'];
            }
        });
        myReader.readAsBinaryString(file);
        fileList.push(returnData);
    }
    return fileList;
}
In this function, event is the change event of the file input, and isEncodeNeeded says whether conversion is needed: if it is true, the document is converted to base64 format.
The output format is:
[{
"document": documentbyte,
"documentName": document name,
"documentType": file format
}]
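A possible way to call it (the input element id is made up, and the method is treated as a standalone function for illustration); note that the 'document' field of each entry is only populated once the corresponding FileReader finishes:

// Hypothetical wiring: pass the change event of a file input to the helper.
document.querySelector('#file-input').addEventListener('change', (event) => {
    const docs = multipleFileUpload(event, true); // true => base64-encode each document
    // 'documentName' and 'documentType' are available immediately;
    // 'document' is filled in asynchronously by the reader's 'load' handler.
    console.log(docs);
});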
In Node.js I have to read the files in a folder and, for each file, get file handler info. This is my simplest implementation using fs.readdir:
FileServer.prototype.listLocal = function (params) {
    var self = this;
    var options = {
        limit: 100,
        desc: 1
    };
    // override defaults
    for (var attrname in params) { options[attrname] = params[attrname]; }
    // media path is the media folder
    var mediaDir = path.join(self._options.mediaDir, path.sep);
    return new Promise((resolve, reject) => {
        fs.readdir(mediaDir, (error, results) => {
            if (error) {
                self.logger.error("FileServer.list error:%s", error);
                return reject(error);
            } else { // list files
                // cut to max files
                results = results.slice(0, options.limit);
                // filter default ext
                results = results.filter(item => {
                    return (item.indexOf('.mp3') > -1);
                });
                // format meta data
                results = results.map(file => {
                    var filePath = path.join(self._options.mediaDir, path.sep, file);
                    var item = {
                        name: file,
                        path: filePath
                    };
                    const fd = fs.openSync(filePath, 'r');
                    var fstat = fs.fstatSync(fd);
                    // file size in bytes
                    item.size = fstat.size;
                    item.sizehr = self.formatSizeUnits(fstat.size);
                    // "Birth Time": time of file creation, set once when the file is created.
                    item.birthtime = fstat.birthtime;
                    // "Modified Time": time when the file data was last modified.
                    item.mtime = fstat.mtime;
                    // "Access Time": time when the file data was last accessed.
                    item.atime = fstat.atime;
                    item.timestamp = new Date(item.mtime).getTime();
                    item.media_id = path.basename(filePath, '.mp3');
                    fs.closeSync(fd); // close file
                    return item;
                });
                if (options.desc) { // sort by most recent
                    results.sort(function (a, b) {
                        return b.timestamp - a.timestamp;
                    });
                } else { // sort by oldest
                    results.sort(function (a, b) {
                        return a.timestamp - b.timestamp;
                    });
                }
                return resolve(results);
            }
        });
    });
}
so that for each file I get an item like:
{
"name": "sample121.mp3",
"path": "/data/sample121.mp3",
"size": 5751405,
"sizehr": "5.4850 MB",
"birthtime": "2018-10-08T15:26:08.397Z",
"mtime": "2018-10-08T15:26:11.650Z",
"atime": "2018-10-10T09:01:48.534Z",
"timestamp": 1539012371650,
"media_id": "sample121"
}
That said, the problem is that Node.js fs.readdir is known to freeze the Node I/O loop when the folder being listed has a large number of files, say from tens of thousands to hundreds of thousands or more.
This is a known issue - see here for more info.
There are also plans to improve fs.readdir in some way, like streaming - see here about this.
In the meantime I'm searching for something like a patch for this, because my folders are pretty large.
Since the problem is that the event loop gets frozen, someone proposed a solution using process.nextTick, which I have assembled here:
FileServer.prototype.listLocalNextTick = function (params) {
    var self = this;
    var options = {
        limit: 100,
        desc: 1
    };
    // override defaults
    for (var attrname in params) { options[attrname] = params[attrname]; }
    // media path is the media folder
    var mediaDir = path.join(self._options.mediaDir, path.sep);
    return new Promise((resolve, reject) => {
        var AsyncArrayProcessor = function (inArray, inEntryProcessingFunction) {
            var elemNum = 0;
            var arrLen = inArray.length;
            var ArrayIterator = function () {
                inEntryProcessingFunction(inArray[elemNum]);
                elemNum++;
                if (elemNum < arrLen) process.nextTick(ArrayIterator);
            }
            if (elemNum < arrLen) process.nextTick(ArrayIterator);
        }
        fs.readdir(mediaDir, function (error, results) {
            if (error) {
                self.logger.error("FileServer.list error:%s", error);
                return reject(error);
            }
            // cut to max files
            results = results.slice(0, options.limit);
            // filter default ext
            results = results.filter(item => {
                return (item.indexOf('.mp3') > -1);
            });
            var ProcessDirectoryEntry = function (file) {
                // This may be as complex as you may fit in a single event loop
                var filePath = path.join(self._options.mediaDir, path.sep, file);
                var item = {
                    name: file,
                    path: filePath
                };
                const fd = fs.openSync(filePath, 'r');
                var fstat = fs.fstatSync(fd);
                // file size in bytes
                item.size = fstat.size;
                item.sizehr = self.formatSizeUnits(fstat.size);
                // "Birth Time": time of file creation, set once when the file is created.
                item.birthtime = fstat.birthtime;
                // "Modified Time": time when the file data was last modified.
                item.mtime = fstat.mtime;
                // "Access Time": time when the file data was last accessed.
                item.atime = fstat.atime;
                item.timestamp = new Date(item.mtime).getTime();
                item.media_id = path.basename(filePath, '.mp3');
                fs.closeSync(fd); // close file
                // map to file item
                file = item;
            } // ProcessDirectoryEntry
            // LP: fs.readdir() callback is finished, event loop continues...
            AsyncArrayProcessor(results, ProcessDirectoryEntry);
            if (options.desc) { // sort by most recent
                results.sort(function (a, b) {
                    return b.timestamp - a.timestamp;
                });
            } else { // sort by oldest
                results.sort(function (a, b) {
                    return a.timestamp - b.timestamp;
                });
            }
            return resolve(results);
        });
    });
} // listLocalNextTick
This seems to avoid the original issue, but I can no longer map the file list to items with file-handle info as I did before. When running AsyncArrayProcessor on the file list, and thus ProcessDirectoryEntry on each file entry, the async nature of process.nextTick means I cannot get back the modified results array, as I could in the previous listLocal function where I simply did an iterative Array.map over results.
How can I patch listLocalNextTick to behave like listLocal while keeping the process.nextTick approach?
[UPDATE]
According to the proposed solution, this is the best implementation so far:
/**
 * Scan files in directory
 * @param {String} needle
 * @param {Object} params
 * @returns {Promise}
 */
scanDirStream: function (needle, params) {
    var options = {
        type: 'f',
        name: '*'
    };
    for (var attrname in params) { options[attrname] = params[attrname]; }
    return new Promise((resolve, reject) => {
        var opt = [needle];
        for (var k in options) {
            var v = options[k];
            if (!Util.empty(v)) {
                opt.push('-' + k);
                opt.push(v);
            }
        }
        var data = '';
        var listing = spawn('find', opt);
        listing.stdout.on('data', _data => {
            var buff = Buffer.from(_data, 'utf-8').toString();
            if (buff != '') data += buff;
        });
        listing.stderr.on('data', error => {
            return reject(Buffer.from(error, 'utf-8').toString());
        });
        listing.on('close', (code) => {
            var res = data.split('\n');
            return resolve(res);
        });
    });
}
Example of usage:
scanDirStream(mediaRoot, {
    name: '*.mp3'
})
    .then(results => {
        console.info("files:%d", results.length);
    })
    .catch(error => {
        console.error("error %s", error);
    });
This could eventually be modified to add a tick callback on every stdout 'data' event emitted when a new file arrives in the directory listing.
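For instance, a variation along these lines could hand each path to a callback as soon as find emits it, instead of buffering the whole listing (a sketch only; scanDirPerLine and onFile are made-up names):

const readline = require('readline');
const { spawn } = require('child_process');

// Sketch: stream 'find' output line by line, invoking onFile per path.
function scanDirPerLine(needle, onFile) {
    return new Promise((resolve, reject) => {
        const listing = spawn('find', [needle, '-type', 'f', '-name', '*.mp3']);
        const rl = readline.createInterface({ input: listing.stdout });
        rl.on('line', filePath => onFile(filePath));
        listing.on('error', reject);
        listing.on('close', () => resolve());
    });
}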
I have created a wrapper around find for it, but you could use dir or ls in the same way.
const { spawn } = require('child_process');
/**
 * findNodeStream
 * @param {String} dir
 * @param {Array} options
 * @returns {nodeStream}
 */
const findNodeStream = (dir, options) => spawn('find', [dir, options].flat().filter(x => x));

/**
 * Usage Example:
 * let listing = findNodeStream('dir', [options]);
 * listing.stdout.on('data', d => console.log(d.toString()));
 * listing.stderr.on('data', d => console.log(d.toString()));
 * listing.on('close', (code) => {
 *     console.log(`child process exited with code ${code}`);
 * });
 */
This allows you to stream a directory in chunks, instead of reading it as a whole as fs.readdir does.
Important
NodeJS > 12.11.1 has async readdir support.
Landed in cbd8d71 (https://github.com/nodejs/node/commit/cbd8d715b2286e5726e6988921f5c870cbf74127) as fs{Promises}.opendir(), which returns an fs.Dir, which exposes an async iterator.
https://nodejs.org/api/fs.html#fs_fspromises_opendir_path_options
const fs = require('fs');

async function print(path) {
    const dir = await fs.promises.opendir(path);
    for await (const dirent of dir) {
        console.log(dirent.name);
    }
}

print('./').catch(console.error);
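Building on that, the metadata step from listLocal() could be expressed with the async iterator too. A rough sketch (listMp3 is a hypothetical name) using fs.promises.stat in place of the synchronous open/fstat calls:

const fs = require('fs');
const path = require('path');

// Sketch: list .mp3 files with basic metadata, without blocking the event loop.
async function listMp3(mediaDir, limit = 100) {
    const items = [];
    const dir = await fs.promises.opendir(mediaDir);
    for await (const dirent of dir) {
        if (!dirent.name.endsWith('.mp3')) continue;
        const filePath = path.join(mediaDir, dirent.name);
        const stat = await fs.promises.stat(filePath);
        items.push({
            name: dirent.name,
            path: filePath,
            size: stat.size,
            mtime: stat.mtime,
            timestamp: stat.mtime.getTime(),
            media_id: path.basename(filePath, '.mp3')
        });
        if (items.length >= limit) break;
    }
    return items.sort((a, b) => b.timestamp - a.timestamp); // most recent first
}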
So I'm trying to connect to an external server called Pexels to get some photos. I'm doing that from Node.js, but it is really just a JavaScript issue. Pexels unfortunately lets a user download an object with only 40 pictures per page.
https://api.pexels.com/v1/curated?per_page=40&page=1 // 40 is maximum
But actually I need more than that; I'd like to get 160 results, i.e. to combine the first four pages. In order to do that I tried looping the request:
let pexelsData = [];
for (let i = 1; i < 5; i++) {
    const randomPage = getRandomFromRange(1, 100); // pages should be randomized
    const moreData = await axios.get(
        `https://api.pexels.com/v1/curated?per_page=40&page=${randomPage}`,
        createHeaders('bearer ', keys.pexelsKey)
    );
    pexelsData = [...moreData.data.photos, ...pexelsData];
}
Now I can use pexelsData, but it works very unstably: sometimes it gets all the combined data, sometimes it crashes. Is there a correct and stable way of looping requests?
You are working with a 3rd-party API which has rate limits, so you should add rate limiting to your code. The simplest solution is to use p-limit or a similar approach from promise-fun.
It will look like this:
const pLimit = require('p-limit');

const limit = pLimit(1);

const input = [
    limit(() => fetchSomething('foo')),
    limit(() => fetchSomething('bar')),
    limit(() => doSomething())
];

(async () => {
    // Only one promise is run at once
    const result = await Promise.all(input);
    console.log(result);
})();
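Applied to the Pexels requests from the question, it might look like this (createHeaders and keys come from the question's code):

const pLimit = require('p-limit');
const axios = require('axios');

const limit = pLimit(1); // one request at a time

const requests = [1, 2, 3, 4].map(page =>
    limit(() => axios.get(
        `https://api.pexels.com/v1/curated?per_page=40&page=${page}`,
        createHeaders('bearer ', keys.pexelsKey)
    ))
);

(async () => {
    const responses = await Promise.all(requests);
    const pexelsData = responses.flatMap(res => res.data.photos);
    console.log(pexelsData.length); // up to 160 photos
})();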
You can break it into functions like this:
let images = [];

const getResponse = async i => {
    if (i < 5)
        return await axios.get(`https://api.pexels.com/v1/curated?per_page=40&page=${i}`);
};

const getImage = async i => {
    if (i < 5) {
        try {
            const response = await getResponse(i);
            images = [...images, ...response.data.photos];
            // here you will get all the images in an array
            console.log(images);
            getImage(++i);
        } catch (error) {
            console.log("catch error", error);
            // getImage(i)
        }
    }
};

getImage(1); // initial call (Pexels pages start at 1)