I'm trying to perform multiple asynchronous actions (Axios requests inside a for loop). I want to do something after everything has resolved, but there is so much going on that I don't know how to do it.
I thought of making my sourcer function async and awaiting it on each iteration (and wrapping the for loop in an async function), but one problem is that sourcer doesn't actually return anything: I don't know how to return from sourcer from inside an Axios finally clause. Another problem is that I don't want to await each sourcer call, because that would hurt performance.
Promise.all sounds like the right direction to take, but I don't know how to implement it with this for loop.
Here is the relevant part of my code (ts is a large array of objects):
.then(ts => {
  // Create an association object that determines each media item's source
  const sourcer = media => {
    // Input is either [image filename, image url]
    // or [image filename, image url, video filename, video url]
    // Test to see if the original URL works
    let validURL = true
    axios.get(media[1])
      .then(resp => {
        if (resp.status.toString()[0] !== '2') validURL = false
      })
      .catch(resp => {
        if (resp.status.toString()[0] !== '2') validURL = false
      })
      .finally(() => {
        let newSources = JSON.parse(JSON.stringify(this.state.sources))
        let newModals = JSON.parse(JSON.stringify(this.state.modals))
        if (validURL) newSources[media[0]] = media[1]
        // If the original URL does not work, pull the media item from the server
        else newSources[media[0]] = `http://serveripaddress/get_media?filename=${media[0]}`
        newModals[media[0]] = false
        this.setState({ sources: newSources, modals: newModals })
      })
    if (media.length > 2) { // If the media item is a video, do the same checks
      let validVURL = true
      axios.get(media[3])
        .then(resp => {
          if (resp.status.toString()[0] !== '2') validVURL = false
        })
        .catch(resp => {
          if (resp.status.toString()[0] !== '2') validVURL = false
        })
        .finally(() => {
          let newSources2 = JSON.parse(JSON.stringify(this.state.sources))
          let newThumbnails = JSON.parse(JSON.stringify(this.state.thumbnails))
          if (validVURL) newSources2[media[2]] = media[3]
          else newSources2[media[2]] = `http://serveripaddress/get_media?filename=${media[2]}`
          newThumbnails[media[0]] = media[2] // Add an association for the video and its thumbnail
          this.setState({ sources: newSources2, thumbnails: newThumbnails })
        })
    }
  }
  for (let t of ts) {
    if (t.media) for (let m of t.media) sourcer(m)
    if (t.preview_media) sourcer(t.preview_media)
    if (t.video) sourcer(t.video)
  }
})
I want to do something after ts has been iterated through and all sourcer calls are completed.
I'm not fishing for someone to write my code for me but a nudge in the right direction would be greatly appreciated.
axios.get returns a Promise, so simply build up your array of Promises and use Promise.all.
So, in your case, instead of executing the HTTP call and waiting on the response, just add the Promise to your array.
Something like this will work. I removed your code that handles the response of each individual get request; you can merge (or just copy/paste) that code into the placeholder below:
.then(ts => {
  // Collect every request here (declared outside sourcer so it is visible
  // to the Promise.all call below)
  const promises = [];
  // Create an association object that determines each media item's source
  const sourcer = media => {
    // Input is either [image filename, image url]
    // or [image filename, image url, video filename, video url]
    promises.push(axios.get(media[1]));
    if (media.length > 2) { // If the media item is a video, queue that request too
      promises.push(axios.get(media[3]));
    }
  }
  for (let t of ts) {
    if (t.media)
      for (let m of t.media) sourcer(m)
    if (t.preview_media) sourcer(t.preview_media)
    if (t.video) sourcer(t.video)
  }
  // Execute the Promises
  Promise.all(promises).then(results => {
    // results[i] corresponds to promises[i], in the order they were pushed
    // TODO: run your response-handling code for each result here
  })
})
Context
I'm retrieving data from the ESPN API to fetch weekly NFL matchup data, so I'm making 18 API calls each time I need this data, to account for all 18 weeks in the NFL season. I then create an array with the data I need from the responses to those calls and write out 18 files that align with each week in the NFL season (week1.json, week2.json, etc.).
Problem
The problem is that when I call my endpoint, I see 2 things intermittently, and not necessarily at the same time:
(1) Some of the JSON files (week1.json, week2.json, etc.) include only a portion of the expected array. So, instead of 16 objects in the array, I may see only 4, or only 6, etc. Why would only a portion of the response data be written to the array that's ultimately written to the .json files?
(2) Not all files are written each time the endpoint is called. So, I may see that only week1 through week5's .json files are written. Why aren't all of them updated?
Problem Code
// iterate 18 times
for (let i = 0; i < 18; i++) {
  let weekNumber = i + 1;
  const week = fs.readFileSync(`./pickem/week${weekNumber}.json`, 'utf8');
  const weekJson = JSON.parse(week);
  // empty weekJson.games array
  weekJson.games = []
  // get all items
  axios.get(`https://sports.core.api.espn.com/v2/sports/football/leagues/nfl/seasons/2022/types/2/weeks/${weekNumber}/events?lang=en&region=us`)
    .then(response => {
      // get all items from response
      const items = response.data.items
      items.forEach(item => {
        // make a get call to $ref
        axios.get(item.$ref)
          .then(response => {
            // get name
            const name = response.data.name
            // get date
            const date = response.data.date
            // get event id
            const eventid = response.data.id
            // get team ids
            let team1 = response.data.competitions[0].competitors[0].id
            let team2 = response.data.competitions[0].competitors[1].id
            // create a new object
            const newObject = {
              name: name,
              date: date,
              eventid: eventid,
              team1: team1,
              team2: team2
            }
            // add games for the week
            weekJson.games.push(newObject);
            fs.writeFileSync(`./pickem/week${weekNumber}.json`, JSON.stringify(weekJson));
          })
          .catch(error => {
            console.log(error)
          })
      })
    }).catch(error => {
      console.log(error)
    })
}
Updated Code
router.get('/getschedules', (req, res) => {
  async function writeGames() {
    // iterate 18 times
    for (let i = 0; i < 18; i++) {
      let weekNumber = i + 1;
      const week = fs.readFileSync(`./pickem/week${weekNumber}.json`, 'utf8');
      const weekJson = JSON.parse(week);
      // empty weekJson.games array
      weekJson.games = []
      // get all items
      // Add await keyword to wait for a week to be processed before going to the next one
      await axios.get(`https://sports.core.api.espn.com/v2/sports/football/leagues/nfl/seasons/2022/types/2/weeks/${weekNumber}/events?lang=en&region=us`)
        .then(async (response) => { // add async to be able to use await
          // get all items from response
          const items = response.data.items
          console.log(response.data.items)
          // Use a standard loop to be able to benefit from async/await
          for (let item of items) {
            // make a get call to $ref
            // wait for an item to be processed before going to the next one
            await axios.get(item.$ref)
              .then(response => {
                // get name
                const name = response.data.name
                // get date
                const date = response.data.date
                // get event id
                const eventid = response.data.id
                // get team ids
                let team1 = response.data.competitions[0].competitors[0].id
                let team2 = response.data.competitions[0].competitors[1].id
                // create a new object
                const newObject = {
                  name: name,
                  date: date,
                  eventid: eventid,
                  team1: team1,
                  team2: team2
                }
                // add games for the week
                weekJson.games.push(newObject);
              })
              .catch(error => {
                console.log(error)
              })
          }
          // moved out of the for loop since you only need to write this once
          fs.writeFileSync(`./pickem/week${weekNumber}.json`, JSON.stringify(weekJson));
        }).catch(error => {
          console.log(error)
        })
    }
  }
  writeGames();
})
Your issue might come from the fact that you are looping over an array of items, which triggers parallel asynchronous calls, and writing weekJson before you have all the data. (Though theoretically your code should work if writeFileSync is really synchronous; maybe there are locks on the file system that prevent Node from writing properly?)
You could try to make everything sequential and only write weekJson once, instead of every time you go over an item:
EDIT
I updated my originally proposed code to keep the parallel calls, and it worked for me (it's similar to the OP's code, but I only write the JSON file once per week).
Then I tried to run the OP's code and it was working fine as well. So this makes me think the problem isn't the code itself but rather how it's called. As a pure Node script, there doesn't seem to be any issue. But I just noticed that the OP is using it server side, as the result of an API call.
Having an API write so many JSON files concurrently is probably not the best idea (especially if the API is called multiple times almost simultaneously). You could either:
just return the games in the response
or precompute the results
or fetch and write them only once, then cache the result to be reused
Then I wonder if, due to the server context, there is some kind of timeout, since the OP said that with my initial solution only the first week was created.
const axios = require("axios");
const fs = require("fs");

async function writeGames() {
  const writeWeekGamesPromises = [];
  // iterate 18 times
  for (let weekNumber = 1; weekNumber < 19; weekNumber++) {
    // give week a default value in case the json file doesn't exist (for repro purposes)
    let week = "{}";
    try {
      week = fs.readFileSync(`./pickem/week${weekNumber}.json`, "utf8");
    } catch (e) {
      console.log(`error reading week ${weekNumber} json file:`, e);
      // file doesn't exist yet
    }
    const weekJson = JSON.parse(week);
    // empty weekJson.games array
    const games = [];
    weekJson.games = games;
    // get all items; collect the per-week promise instead of awaiting it here
    writeWeekGamesPromises.push(axios
      .get(
        `https://sports.core.api.espn.com/v2/sports/football/leagues/nfl/seasons/2022/types/2/weeks/${weekNumber}/events?lang=en&region=us`
      )
      .then(async (eventListResponse) => {
        // add async to be able to use await
        console.log(JSON.stringify(eventListResponse.data), '\n');
        // get all items from response
        const items = eventListResponse.data.items;
        // parallelize calls and wait for all games from a week to be fetched before writing the file
        await Promise.all(
          items.map((item) => {
            // return the promise so that Promise.all waits for all games to be pushed before the file is written
            return axios
              .get(item.$ref)
              .then((response) => {
                // get name, date and eventid
                const { name, date, id: eventid } = response.data;
                // get team ids
                let team1 = response.data.competitions[0].competitors[0].id;
                let team2 = response.data.competitions[0].competitors[1].id;
                games.push({ name, date, eventid, team1, team2 });
              })
              .catch((error) => {
                console.log(error);
              });
          })
        );
        // Now that all game data is ready, write the file
        fs.writeFileSync(
          `./pickem/week${weekNumber}.json`,
          JSON.stringify(weekJson)
        );
      })
      .catch((error) => {
        console.log(error);
      }));
  }
  // Wait for all games from all weeks to be processed
  await Promise.all(writeWeekGamesPromises);
}

async function runAndLogTime() {
  const start = Date.now();
  await writeGames();
  console.log(`took ${(Date.now() - start) / 1000}s to write all json files`);
}

runAndLogTime();
I'm trying to implement a function, which slices a file into chunks and then sends them to my backend one after another.
The function has to hash each file & validate if the hash is already known before starting the upload.
The following code is the code part, where my problematic function is called.
process: async (
  fieldName,
  file,
  metadata,
  load,
  error,
  progress,
  abort,
  transfer,
  options,
) => {
  // fieldName is the name of the input field - no direct relevance for us
  // logger.log(`fieldName: ${fieldName}`);
  // Usually empty - can be added with the Metadata plugin
  // logger.log(metadata);
  const source = this.$axios.CancelToken.source();
  const abortProcess = () => {
    // This function is entered if the user has tapped the cancel button
    source.cancel('Operation cancelled by user');
    // Let FilePond know the request has been cancelled
    abort();
  };
  let chunks = [];
  const {
    chunkForce,
    chunkRetryDelays,
    chunkServer,
    chunkSize,
    chunkTransferId,
    chunkUploads,
  } = options;
  // Needed parameters of file
  const { name, size } = file;
  if (chunkTransferId) {
    /** Here we handle what happens when the Retry button is pressed */
    logger.log(`Already defined: ${chunkTransferId}`);
    return { abortProcess };
  }
  this.hashFile(file)
    .then((hash) => {
      logger.log(`File hashed: ${hash}`);
      if (hash.length === 0) {
        error('Hash not computable');
      }
      return hash;
    })
    .then((hash) => {
      logger.log(`Hash passed through: ${hash}`);
      return this.requestTransferId(file, hash, source.token)
        .then((transferId) => {
          logger.log(`T-ID received: ${transferId}`);
          return transferId;
        })
        .catch((err) => {
          error(err);
        });
    })
    .then((transferId) => {
      transfer(transferId);
      logger.log(`T-ID passed through: ${transferId}`);
      // Split the file into chunks to prepare the upload
      chunks = this.splitIntoChunks(file, chunkSize);
      // Filter chunks - remove all those which have already been uploaded successfully
      const filteredChunks = chunks.filter(
        (chunk) => chunk.status !== ChunkStatus.COMPLETE,
      );
      logger.log(filteredChunks);
      return this.uploadChunks(
        filteredChunks,
        { name, size, transferId },
        progress,
        error,
        source.token,
      ).then(() => transferId);
    })
    .then((transferId) => {
      // Now everything should be uploaded -> set progress to 100% and make the item appear finished
      progress(true, size, size);
      load(transferId);
      logger.log(transferId);
    })
    .catch((err) => error(err));
  return { abortProcess };
},
uploadChunks is where the problem starts.
async uploadChunks(chunks, options, progress, error, cancelToken) {
  const { name, size, transferId } = options;
  for (let index = 0; index < chunks.length; index += 1) {
    let offset = 0;
    const chunk = chunks[index];
    chunk.status = ChunkStatus.PROCESSING;
    // eslint-disable-next-line no-await-in-loop
    await this.uploadChunk(chunk.chunk, options, offset)
      .then(() => {
        chunk.status = ChunkStatus.COMPLETE;
        offset += chunk.chunk.size;
        progress(true, offset, size);
        logger.log(offset); // This is always chunk.chunk.size, instead of getting bigger
      })
      .catch((err) => {
        chunk.status = ChunkStatus.ERROR;
        error(err);
      });
  }
},
uploadChunk(fileChunk, options, offset) {
  const { name, size, transferId } = options;
  const apiURL = `${this.$config.api_url}/filepond/patch?id=${transferId}`;
  return this.$axios.$patch(apiURL, fileChunk, {
    headers: {
      'content-type': 'application/offset+octet-stream',
      'upload-name': name,
      'upload-length': size,
      'upload-offset': offset,
    },
  });
},
As you can see, uploadChunks takes an array of chunks, some options, two callbacks (progress and error), and a cancelToken (which I currently don't use, since I'm still stuck on this problem).
Each chunk in the array has the form:
{
  status: 0, // a status flag indicating whether the chunk is completed
  chunk: ... // binary data
}
The function uploadChunks iterates over the chunk array and should, in theory, upload one chunk after another, increment offset after each upload, and then call progress. After this it should start the next iteration of the loop, where offset would be bigger than in the call before.
The calls themselves get executed one after another, but every call has the same offset, and progress does not get called repeatedly. Instead my progress bar stays stuck until everything is uploaded and then jumps to 100%, due to the load call right at the end of the first function.
So the upload itself works fine, in the correct order, but the code after the await this.uploadChunk... doesn't behave as expected after each chunk and blocks somehow.
You are setting offset to 0 inside the loop, so offset is always 0. You should move this line:
let offset = 0;
before the for statement.
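A runnable sketch of the difference (the chunk sizes are made up for the illustration):

```javascript
const chunkSizes = [10, 20, 30];

// Bug: the counter is re-declared inside the loop, so every iteration
// starts back at 0 and the offsets never accumulate.
function badOffsets() {
  const seen = [];
  for (const size of chunkSizes) {
    let offset = 0;
    offset += size;
    seen.push(offset);
  }
  return seen; // [10, 20, 30] — each value is just that chunk's size
}

// Fix: declare the counter once, before the loop, so it keeps its value
// across iterations.
function goodOffsets() {
  const seen = [];
  let offset = 0;
  for (const size of chunkSizes) {
    offset += size;
    seen.push(offset);
  }
  return seen; // [10, 30, 60] — a running total
}
```

This matches the symptom in the question: the logged offset always equals the current chunk's size instead of growing.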
Alright, so my problem is that in the first set of console.log(streamXXXX) calls (where XXXX are the various variables), the values all read as they should, while in the second set they all read as undefined. Is this a scope issue? Maybe an async issue? I tried adding await each time I make a web request, but nothing seems to work. One of the most interesting parts is that there are no errors.
Anyway, my code is listed below, as well as a link to test it out in Repl using a sample bot I created. Below that is the list of libraries required for the program to run. Thanks!
if (!message.member.voiceChannel) return message.channel.send(`You do realize you have to be in a voice channel to do that, right ${message.author.username}?`)
if (!message.member.voiceConnection) message.member.voiceChannel.join().then(async connection => {
  let streamURL = args.slice(1).join(" ")
  let streamTitle = "";
  let streamThumb = "";
  let streamAuth = "";
  let streamAuthThumb = "";
  if (streamURL.includes("https://www.youtube.com") || streamURL.includes("https://youtu.be/") && !streamURL.includes(' ')) {
    youtube.getVideo(streamURL)
      .then(async results => {
        let { body } = await snekfetch.get(`https://www.googleapis.com/youtube/v3/channels?part=snippet&id=${results.channel.id}&fields=items%2Fsnippet%2Fthumbnails&key=${ytapikey}`).query({
          limit: 800
        })
        streamTitle = results.title
        streamThumb = results.thumbnails.medium.url
        streamAuth = results.channel.title
        streamAuthThumb = body.items[0].snippet.thumbnails.medium.url
        console.log(streamURL)
        console.log(streamTitle)
        console.log(streamThumb)
        console.log(streamAuth)
        console.log(streamAuthThumb)
      })
      .catch(console.error)
  } else if (!streamURL.includes("https://www.youtube.com") || !streamURL.includes("https://youtu.be/")) {
    youtube.searchVideos(streamURL)
      .then(async results => {
        let { body } = await snekfetch.get(`https://www.googleapis.com/youtube/v3/channels?part=snippet&id=${results[0].channel.id}&fields=items%2Fsnippet%2Fthumbnails&key=${ytapikey}`).query({
          limit: 800
        })
        streamURL = results[0].url
        streamTitle = results[0].title
        streamThumb = results[0].thumbnails.default.medium.url
        streamAuth = results[0].channel.title
        streamAuthThumb = body.items[0].snippet.thumbnails.medium.url
      }).catch(console.error)
  } else {
    return message.reply("I can only play videos from YouTube (#NotSponsored).")
  }
  console.log(streamURL)
  console.log(streamTitle)
  console.log(streamThumb)
  console.log(streamAuth)
  console.log(streamAuthThumb)
  const stream = ytdl(streamURL, {
    filter: 'audioonly'
  })
  const dispatcher = connection.playStream(stream)
  dispatcher.on("end", end => {
    return
  })
  let musicEmbed = new Discord.RichEmbed()
    .setAuthor(streamAuth, streamAuthThumb)
    .setTitle(`Now Playing: ${streamTitle}`)
    .setImage(streamThumb)
    .setColor(embedRed)
    .setFooter(`${streamAuth} - ${streamTitle} (${streamURL})`)
  await message.channel.send(musicEmbed)
})
Link to test out the program on a sample bot I made
Modules you will need to test this:
discord.js
simple-youtube-api
node-opus
ffmpeg
ffbinaries
ffmpeg-binaries
opusscript
snekfetch
node-fetch
ytdl-core
Thanks again!
The reason why your output is undefined is due to the way promises work and how you structured your code:
let streamTitle = "";
// 1. Promise created
youtube.getVideo(streamURL)
  // 2. Promise still pending, skip for now
  .then(async results => {
    // 4. Promise fulfilled
    console.log(results.title); // 5. Logged actual title
  });
console.log(streamTitle); // 3. Logged ""
You already have the correct approach for your snekfetch requests; you just need to apply it to the YouTube calls as well:
let streamTitle = "";
const results = await youtube.getVideo(streamURL);
streamTitle = results.title;
console.log(streamTitle); // Desired output
I'm trying to iterate over and print, in order, an array in JavaScript that contains the titles of 2 events I obtained by web scraping a website, but it prints out of order. I know JavaScript is asynchronous, but I'm new to this world of asynchrony. How can I implement the for loop to print the array in order and give customized info?
agent.add('...') is like console.log('...'). I'm building a chatbot with Dialogflow and Node.js 8, but that's not important at the moment. I used console.log() in the return just for debugging.
I tried the following:
async function printEvent(event){
  agent.add(event)
}

async function runLoop(eventsTitles){
  for (let i = 0; i < eventsTitles.length; i++){
    aux = await printEvent(eventsTitles[i])
  }
}
But I got this error: Unexpected await inside a loop (no-await-in-loop)
async function showEvents(agent) {
  const cheerio = require('cheerio');
  const rp = require('request-promise');
  const options = {
    uri: 'https://www.utb.edu.co/eventos',
    transform: function (body) {
      return cheerio.load(body);
    }
  }
  return rp(options)
    .then($ => {
      // ** HERE THE PROBLEM STARTS **
      var eventsTitles = [] // array of event titles
      agent.add(`This month we have these events available: \n`)
      $('.product-title').each(function (i, elem) {
        var event = $(this).text()
        eventsTitles.push(event)
      })
      agent.add(`${eventsTitles}`) // The array prints out in order, but if I iterate over it, it prints out of order.
      // *** IMPLEMENT FOR LOOP ***
      agent.add(`To obtain more info click on this link https://www.utb.edu.co/eventos`)
      return console.log(`Show available events`);
    }).catch(err => {
      agent.add(`${err}`)
      return console.log(err)
    })
}
I would like to always print out event title #1 and then event title #2. Something like this:
eventsTitles.forEach((event, index) => {
  agent.add(`${index}. ${event}`) // remember this is like console.log(`${index}. ${event}`)
})
Thanks for any help and explanation!
There is no async issue here, but if you still face difficulty, use this loop:
for (let index = 0; index < eventsTitles.length; index++) {
  const element = eventsTitles[index];
  agent.add(`${index}. ${element}`)
}
I would like to flush a buffered observable based on the content of the buffer, but how can I accomplish this? A simplified example of what I want to do:
observable.buffer(() => {
  // Filter based on the buffer content.
  // Assuming an observable here because buffer()
  // needs to return an observable.
  return buffer.filter(...);
})
Here is more specifically what I am trying to do with key events (bin here):
const handledKeySequences = ['1|2']

// Mock for the document keydown event
keyCodes = Rx.Observable.from([1,2,3,4])

keyCodes
  .buffer(() => {
    /*
    The following doesn't work because it needs to be an
    observable, but how to observe what is in the buffer?
    Also would like to not duplicate the join and includes
    if possible.

    return function (buffer) {
      return handledKeySequences.includes(buffer.join('|'));
    };
    */
    // Returning an empty subject to flush the buffer
    // every time to prevent an error, but this is not
    // what I want.
    return new Rx.Subject();
  })
  .map((buffer) => {
    return buffer.join('|')
  })
  .filter((sequenceId) => {
    return handledKeySequences.includes(sequenceId);
  })
  .subscribe((sequenceId) => {
    // Expecting to be called once with 1|2
    console.log("HANDLING", sequenceId)
  })
I feel like my approach is wrong, but I can't figure out what the right approach would be. I've tried using scan, but that scans all the events in the observable, which is not what I want.
Thanks for any help!
This should be doable with bufferWithCount:
const handledKeySequences = ['1|2']

// Mock for the document keydown event
keyCodes = Rx.Observable.from([0,1,2,3,4]);

const buffer$ = keyCodes
  .bufferWithCount(2, 1) // first param = longest possible sequence, second param = param1 - 1
  .do(console.log)
  .map((buffer) => {
    return buffer.join('|')
  })
  .filter((sequenceId) => {
    return handledKeySequences.includes(sequenceId);
  });

buffer$.subscribe((sequenceId) => {
  console.log("HANDLING", sequenceId)
});
It seems that this functionality is not currently available in RxJS, so as suggested by @olsn I wrote a custom operator that works by passing a function that tells it when to flush the buffer:
(function() {
  // Buffer with function support.
  function bufferWithContent(bufferFn) {
    let buffer = [];
    return this.flatMap(item => {
      buffer.push(item);
      // Declare result outside the if/else so it is still in scope below.
      let result;
      if (bufferFn(buffer)) {
        // Flush the buffer and emit it.
        result = Rx.Observable.just(buffer);
        buffer = [];
      } else {
        // Emit nothing and retain the buffer.
        result = Rx.Observable.empty();
      }
      return result;
    });
  }
  Rx.Observable.prototype.bufferWithContent = bufferWithContent;
})();
I also opened an issue here that proposes adding this functionality.