Beginner React/JavaScript: callback hell

I'm trying to fetch some data, and THEN fetch some other data. Problem is, there's a loop with a timeout in the mix, which makes everything so complicated. And it doesn't work.
In order, what I'm trying to do:
Loop through listOfGroups and send one or more GET requests to my backend
When that is done, send one final GET request
So, here's my code:
var i = 0;
var arrayOfPostURL = [];
isFinished = false;
while (isFinished == false) {
  if (i == listOfGroups.length) {
    return fetch("/update_db_fcb_groups/" + theID + "/" + arrayOfPostURL).then((response) => {
      response.json().then((data) => {
        this.setState({onFcbGroups: arrayOfPostURL})
        isFinished = true;
      });
    });
  }
  if (i < listOfGroups.length) {
    setTimeout(function(){
      fetch("/post_to_fcb_group/" + listOfGroups[i] + "/" + theID).then((response) => {
        // console.log(response);
        response.json().then((data) => {
          arrayOfPostURL.push("+" + data.url)
          i++;
        });
      });
      // console.log(arr[i])
    }, 5000)
  }
}
This code even freezes the browser (Google Chrome)!
Any ideas?

It looks like you're using a while loop when you could be using a for.
var arrayOfPostURL = [];

for (let group of listOfGroups) {
  setTimeout(function() {
    fetch("/post_to_fcb_group/" + group + "/" + theID).then((response) => {
      response.json().then((data) => {
        arrayOfPostURL.push("+" + data.url)
      });
    });
  }, 5000)
}

fetch("/update_db_fcb_groups/" + theID + "/" + arrayOfPostURL).then((response) => {
  response.json().then((data) => {
    this.setState({onFcbGroups: arrayOfPostURL})
  });
});
Breaking your code down like this reveals a couple other issues.
Your setTimeouts will all finish around the same time. You're just queueing a bunch of fetches that will each take place 5 seconds after they were queued. If you meant to wait 5 seconds between each fetch, this is not the way to do so.
Your final fetch concatenates an array into the URL. That will end up looking something like "/your/url/+url1,+url2". Is that what you intended? It's a fairly unusual URL schema. You might want to change that call to a POST or PUT and pass in a JSON array in the body.
Because all of the fetches in the loop are deferred with setTimeout, your final fetch will actually fire as soon as the loop completes, which could be before any of the other fetches execute. You likely want to use Promise.all or something similar.
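For instance, here is a minimal sketch of that idea, assuming listOfGroups and theID are in scope and that this runs inside a React class method (so this.setState is available); the fixed 5 second delay is kept only because the original code had one:
// small helper: resolve after ms milliseconds
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// one promise per group: wait, post to the group, collect "+<url>"
const postRequests = listOfGroups.map((group) =>
  delay(5000)
    .then(() => fetch("/post_to_fcb_group/" + group + "/" + theID))
    .then((response) => response.json())
    .then((data) => "+" + data.url)
);

// only after every post has finished, send the final request
// (the array is joined with commas in the URL, as in the original)
Promise.all(postRequests).then((arrayOfPostURL) =>
  fetch("/update_db_fcb_groups/" + theID + "/" + arrayOfPostURL)
    .then((response) => response.json())
    .then(() => this.setState({ onFcbGroups: arrayOfPostURL }))
);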

Tried my best to stop duplicating data on continuous HTTP requests when clicking the refresh button (for that particular component) in Angular 5

Below is my .ts file for the Alarm Component; in the HTML I am using a simple *ngFor over criticalObject.siteList to display the records.
This is not the original code (I have simplified it), but the problem I am facing is that rapidly clicking the refresh button (which fires an HTTP request) adds duplicate siteNames to the list, and that should not happen. I have heard of debounce time and shareReplay and tried applying them here, but they don't really make sense for this case.
NOTE: I have to fire the HTTP request on every refresh button click.
Keenly Waiting for Help.
criticalObject.siteList = [];
siteList = ["c404", "c432"];

onRefresh() {
  this.criticalObject.siteList = [];
  this.siteList.forEach(elem => {
    this.getAlarmStatus(elem);
  })
}

getAlarmStatus(item) {
  critical_list = [];
  this.alarmService.getAlarmStatusBySite(item.siteName).subscribe(data => {
    if (data) {
      // do some calculations
      if (this.criticalObject.siteList.length === 0) {
        this.criticalObject.siteList.push({
          siteName: item.siteName
        })
      }
      this.criticalObject.siteList.forEach((elem, idx) => {
        if (elem.siteName === item.siteName) {
          return;
        } else if (idx === this.criticalObject.siteList.length - 1) {
          this.criticalObject.siteList.push({
            siteName: item.siteName
          })
        }
      })
    }
  })
}
I made a silly mistake. I am new to JavaScript, and I found out that you cannot return from a forEach loop; that's why I was getting duplicated records. A return statement inside a forEach callback acts like a continue in JavaScript.
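A rough sketch of the same check without relying on return inside forEach, using Array.prototype.some (the siteList shape is just what the snippet above implies):
getAlarmStatus(item) {
  this.alarmService.getAlarmStatusBySite(item.siteName).subscribe(data => {
    if (data) {
      // push only when no existing entry already has this siteName
      const alreadyListed = this.criticalObject.siteList
        .some(elem => elem.siteName === item.siteName);
      if (!alreadyListed) {
        this.criticalObject.siteList.push({ siteName: item.siteName });
      }
    }
  });
}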

How to run one getJSON after another based on the results of the first one

In my application, I am trying to gather data from two different sources. So first it has to loop (with $.each) through an internal JSON file and see if the data is found; if not, it has to make another $.getJSON() request to get the data from an external source.
So the second $.getJSON() is dependent on the first one, and sometimes does not need to be run if the data is already found in the first one.
First $.getJSON() call:
$.getJSON(InternalURL, function (data) {
  $.each(data.Source, function (index, value) {
    if (artist.indexOf(value.keyword) > -1) {
      image = value.image;
      $(".bg").css("background-image", "url(" + image + ")");
    }
  });
});
Second $.getJSON() call:
$.getJSON(ExternalURL, function (data) {
  image = data.artist.image;
  $(".bg").css("background-image", "url(" + image + ")");
});
The other consideration is the timing of this process. Of course, it has to be done as fast as possible, so that it won't be noticeable in the interface.
UPDATE
Example using Async / Await based on the answer provided by #Tiny Giant
JSFiddle
Currently this code works on JSFiddle, but in the actual application, while it works fine, it gives a console error of "Uncaught (in promise)" with an object of methods such as always, abort, fail, etc. Any idea why this error comes up?
You could use Async / Await (initially defined in the ECMAScript® 2017 Language Specification). See caniuse.com for information on current support.
This works by suspending the async function at the await once the getJSON call begins, freeing the call stack. Once the $.getJSON call resolves, the suspended function is queued to resume, and execution continues from where it left off.
(async () => {
  const post = await $.getJSON('https://jsonplaceholder.typicode.com/posts/1');
  const comments = await $.getJSON('https://jsonplaceholder.typicode.com/comments');
  post.comments = comments.filter(e => e.postId === post.id);
  console.log(post);
})();
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
To apply this to your specific example, you could use the following code:
(async () => {
  let image, data = await $.getJSON(InternalURL);
  if (data) {
    for (let value of data.Source) {
      if (!image && artist.indexOf(value.keyword) > -1) {
        image = value.image;
      }
    }
  } else {
    // request failed
  }
  if (!image) {
    let data = await $.getJSON(ExternalURL);
    if (data) {
      image = data.artist.image;
    } else {
      // request failed
    }
  }
  $(".bg").css("background-image", "url(" + image + ")");
})();
This requests the first resource, then—once that request is completed and execution is continued—it checks the response as your example does. If the script doesn't find what it is looking for in the first response, it initiates the second request. Once execution is continued, it sets the background image.
Further reading:
Async functions
Await operator
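Regarding the "Uncaught (in promise)" error mentioned in the update: if either $.getJSON request is rejected, await throws instead of returning a falsy value, so the // request failed branches above are never reached and the rejection bubbles up uncaught. A minimal sketch of catching it (the fallback behaviour here is only an assumption):
(async () => {
  let image;
  try {
    const data = await $.getJSON(InternalURL);
    for (const value of data.Source) {
      if (!image && artist.indexOf(value.keyword) > -1) {
        image = value.image;
      }
    }
    if (!image) {
      image = (await $.getJSON(ExternalURL)).artist.image;
    }
  } catch (err) {
    // one of the requests failed; decide on a fallback here
    console.error("request failed", err);
  }
  if (image) {
    $(".bg").css("background-image", "url(" + image + ")");
  }
})();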
Another option would be to use callbacks. This method is more difficult to follow, but is supported everywhere.
$.getJSON('https://jsonplaceholder.typicode.com/posts/1', post => {
  $.getJSON('https://jsonplaceholder.typicode.com/comments', comments => {
    post.comments = comments.filter(e => e.postId === post.id);
    console.log(post);
  });
});
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
$.getJSON(InternalURL, function (data) {
  $.each(data.Source, function (index, value) {
    if (artist.includes(value.keyword)) {
      if (value.image) {
        image = value.image;
        $(".bg").css("background-image", "url(" + image + ")");
      } else {
        $.getJSON(ExternalURL, function (data) {
          image = data.artist.image;
          $(".bg").css("background-image", "url(" + image + ")");
        });
      }
    }
  });
});

waitForElementsToBePresent in Protractor

So I found this little bit of code online that works pretty well for waiting for a unique identifier to be loaded on the page before you interact with it:
this.waitForElementToBePresent = function(element) {
  browser.wait(function() {
    return element.isPresent();
  }, 60000);
};
I am new to JS and Protractor. I was wondering how this code could be changed to wait for the presence of an element where there are multiple matches. I know you use $$ for the locator when there are multiple matches, but how can I change this method to recognize that, so I could then do something like:
utility.waitForElementsToBePresent(myElement).get(0);
Look at Alecxe's answer on this question. I've been using it for a while and it works perfectly. Here's my slightly modified version:
// wait for X number of elements
presenceOfAll = function (elem, num, timeout) {
  var time = timeout || 5000;
  console.log('Waiting for elements ' + elem.locator() + ' to have a count of ' + num);
  return browser.wait(function () {
    return elem.count().then(function (count) {
      return count >= num;
    });
  }, time, 'Failed waiting for ' + elem.locator() + ' to have ' + num + ' total items');
};
Rather than making a new function, I would probably just take the last element in the group and then wait for it.
var els = element.all(by.css("#id"));
waitForElementToBePresent(els.last());
As something to remember, there are "isPresent" and "isDisplayed": present means that an element exists on the page, visible or not. If you want to wait for it to actually show on the page, first wait for it to be present, then wait for it to be displayed.
http://www.protractortest.org/#/api?view=ElementArrayFinder.prototype.last
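A short sketch of that present-then-displayed wait using Protractor's ExpectedConditions (the locator is a placeholder):
var EC = protractor.ExpectedConditions;
var el = element.all(by.css("#id")).last();
browser.wait(EC.presenceOf(el), 60000);   // exists in the DOM
browser.wait(EC.visibilityOf(el), 60000); // actually visible on the page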

Twitch TV JSON API Issue

So, I am trying to use the Twitch API:
https://codepen.io/sterg/pen/yJmzrN
If you check my codepen page you'll see that each time I refresh the page the status order changes, and I can't figure out why this is happening.
Here is my JavaScript:
$(document).ready(function(){
  var ur="";
  var tw=["freecodecamp","nightblue3","imaqtpie","bunnyfufuu","mushisgosu","tsm_dyrus","esl_sc2"];
  var j=0;
  for(var i=0;i<tw.length;i++){
    ur="https://api.twitch.tv/kraken/streams/"+tw[i];
    $.getJSON(ur,function(json) {
      $(".tst").append(JSON.stringify(json));
      $(".name").append("<li> "+tw[j]+"<p>"+""+"</p></li>");
      if(json.stream==null){
        $(".stat").append("<li>"+"Offline"+"</li>");
      }
      else{
        $(".stat").append("<li>"+json.stream.game+"</li>");
      }
      j++;
    })
  }
});
$.getJSON() works asynchronously. The JSON won't be returned until the results come back, and the responses can come back in a different order than the requests were made, so you have to handle this.
One way to do this is use the promise API, along with $.when() to bundle up all requests as one big promise, which will succeed or fail as one whole block. This also ensures that the response data is returned to your code in the expected order.
Try this:
var channelIds = ['freecodecamp', 'nightblue3', 'imaqtpie', 'bunnyfufuu', 'mushisgosu', 'tsm_dyrus', 'esl_sc2'];

$(function () {
  $.when.apply(
    $,
    $.map(channelIds, function (channelId) {
      return $.getJSON(
        'https://api.twitch.tv/kraken/streams/' + encodeURIComponent(channelId)
      ).then(function (res) {
        return {
          channelId: channelId,
          stream: res.stream
        }
      });
    })
  ).then(function () {
    console.log(arguments);
    var $playersBody = $('table.players tbody');
    $.each(arguments, function (index, data) {
      $playersBody.append(
        $('<tr>').append([
          $('<td>'),
          $('<td>').append(
            $('<a>')
              .text(data.channelId)
              .attr('href', 'https://www.twitch.tv/' + encodeURIComponent(data.channelId))
          ),
          $('<td>').text(data.stream ? data.stream.game : 'Offline')
        ])
      )
    })
  })
});
https://codepen.io/anon/pen/KrOxwo
Here, I'm using $.when.apply() to use $.when with an array rather than a list of parameters. Next, I'm using $.map() to convert the array of channel IDs into an array of promises, one for each ID. After that, I have a simple helper function which handles the normal response (res) and pulls out the relevant stream data, attaching the channelId for use later on. (Without this, we would have to go back to the original array to get the ID. You can do that, but in my opinion it isn't the best practice. I'd much prefer to keep the data with the response so that later refactoring is less likely to break something. This is a matter of preference.)
Next, I have a .then() handler which takes all of the data and loops through them. This data is returned as arguments to the function, so I simply use $.each() to iterate over each argument rather than having to name them out.
I made some changes in how I'm handling the HTML as well. You'll note that I'm using .text() and .attr() to set the dynamic values. This ensures that your HTML is valid (as you're not really using HTML for the dynamic bit at all). Otherwise, someone might have the username of <script src="somethingEvil.js"></script> and it'd run on your page. This avoids that problem entirely.
It looks like you're appending the "Display Name" in the same order every time you refresh, by using the j counter variable.
However, you're appending the "Status" as each request returns. Since these HTTP requests are asynchronous, the order in which they are appended to the document will vary each time you reload the page.
If you want the statuses to remain in the same order (matching the order of the Display Names), you'll need to store the response data from each API call as they return, and order it yourself before appending it to the body.
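A rough sketch of that idea, storing each status by index and appending only once every request has settled (the counter approach is just one way to do it):
$(document).ready(function () {
  var tw = ["freecodecamp", "nightblue3", "imaqtpie", "bunnyfufuu", "mushisgosu", "tsm_dyrus", "esl_sc2"];
  var statuses = new Array(tw.length);
  var pending = tw.length;
  tw.forEach(function (name, i) {
    $.getJSON("https://api.twitch.tv/kraken/streams/" + name, function (json) {
      statuses[i] = json.stream == null ? "Offline" : json.stream.game;
    }).always(function () {
      // when the last request settles, append everything in the original order
      if (--pending === 0) {
        tw.forEach(function (name, i) {
          $(".name").append("<li>" + name + "</li>");
          $(".stat").append("<li>" + (statuses[i] || "Offline") + "</li>");
        });
      }
    });
  });
});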
At first, I changed the last else condition (the one that prints out the streamed game) to $(".stat").append("<li>"+tw[j]+": "+json.stream.game+"</li>"); - it was identical in meaning to what you tried to achieve, yet produced the same error.
There's a discrepancy between the list you've created and the data you receive: they are not directly associated.
It is preferable to use $(".stat").append("<li>"+json.stream._links.self+": "+json.stream.game+"</li>"); you may even extract the name of the user with a regex or substr in the worst case.
As long as you don't run separate loops for filling the "Display Name" and "Status" columns, you might even be able to separate them, in case you do not want to write them into the same line, as my example does.
Whichever way you choose, in the end, the problem is that the order in which the "Status" column is filled is not identical to the order in which you fill "Display Name".
This code will not preserve the order, but will preserve which array entry is being processed
$(document).ready(function() {
  var ur = "";
  var tw = ["freecodecamp", "nightblue3", "imaqtpie", "bunnyfufuu", "mushisgosu", "tsm_dyrus", "esl_sc2"];
  for (var i = 0; i < tw.length; i++) {
    ur = "https://api.twitch.tv/kraken/streams/" + tw[i];
    (function(j) {
      $.getJSON(ur, function(json) {
        $(".tst").append(JSON.stringify(json));
        $(".name").append("<li> " + tw[j] + "<p>" + "" + "</p></li>");
        if (json.stream == null) {
          $(".stat").append("<li>" + "Offline" + "</li>");
        } else {
          $(".stat").append("<li>" + json.stream.game + "</li>");
        }
      })
    }(i));
  }
});
This code will preserve the order fully - the layout needs tweaking though
$(document).ready(function() {
  var ur = "";
  var tw = ["freecodecamp", "nightblue3", "imaqtpie", "bunnyfufuu", "mushisgosu", "tsm_dyrus", "esl_sc2"];
  for (var i = 0; i < tw.length; i++) {
    ur = "https://api.twitch.tv/kraken/streams/" + tw[i];
    (function(j) {
      var name = $(".name").append("<li> " + tw[j] + "<p>" + "" + "</p></li>");
      var stat = $(".stat").append("<li></li>")[0].lastElementChild;
      console.log(stat);
      $.getJSON(ur, function(json) {
        $(".tst").append(JSON.stringify(json));
        if (json.stream == null) {
          $(stat).text("Offline");
        } else {
          $(stat).text(json.stream.game);
        }
      }).then(function(e) {
        console.log(e);
      }, function(e) {
        console.error(e);
      });
    }(i));
  }
});

Run function after another function completes JavaScript and JQuery

I need a little help. I'm trying to run my second function "likeLinks();" but only after my first function "getLikeURLs();" is finished. This is because my 2nd function relies on the links Array to execute. It seems like they are trying to run at the same time.
Any help would be appreciated.
var links = [];
var url = '/' + window.location.pathname.split('/')[1] + '/' + window.location.pathname.split('/')[2] + '/'

getLikeURLs();
likeLinks();

function getLikeURLs() {
  for (i = 1; i < parseInt(document.getElementsByClassName('PageNav')[0].getAttribute('data-last')) + 2; i++) {
    var link = $.get(url + 'page-' + i, function(data) {
      //gets the like links from current page
      $(data).find('a[class="LikeLink item control like"]').each(function() {
        links.push($(this).attr('href')); // Puts the links in the Array
      });
    });
  }
}

function likeLinks() {
  for (t = 0; t <= links.length; t++) {
    var token = document.getElementsByName('_xfToken')[0].getAttribute('value')
    $.post(links[t], {
      _xfToken: token,
      _xfNoRedirect: 1,
      _xfResponseType: 'json'
    }, function(data) {});
  }
}
The link variables are actually jQuery deferred objects. Store them in an array and then you can use $.when() to create a new deferred object that only resolves when all of the previous $.get() operations have completed:
function getLikeURLs(url) { // NB: parameter, not global
  var defs = [], links = []; // NB: links no longer global
  for (...) {
    var link = $.get(...);
    defs.push(link);
  }
  // wait for previous `$.get` to finish, and when they have create a new
  // deferred object that will return the entire array of links
  return $.when.apply($, defs).then(function() { return links; });
}
Then, to start the chain of functions:
getLikeURLs(url).then(likeLinks);
Note that likeLinks will now be passed the array of links instead of accessing it from the global state. That function should also be rewritten to allow you to wait for its $.post calls to complete, too:
function likeLinks(links) {
  // loop invariant - take it outside the loop
  var token = document.getElementsByName('_xfToken')[0].getAttribute('value');

  // create array of deferreds, one for each link
  var defs = links.map(function(link) {
    return $.post(link, {
      _xfToken: token,
      _xfNoRedirect: 1,
      _xfResponseType: 'json'
    });
  });

  // and another for when they're all done
  return $.when.apply($, defs);
}
p.s. don't put that (relatively) expensive parseInt(document.getAttribute(...)) expression within the for statement - it'll cause it to be evaluated every iteration. Calculate it once outside the loop and store it in a variable. There are a few other places where you're repeating calls unnecessarily, e.g. window.location.pathname.split().
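A quick sketch of hoisting those repeated expressions (same behaviour, just computed once):
var parts = window.location.pathname.split('/');
var url = '/' + parts[1] + '/' + parts[2] + '/';
var lastPage = parseInt(document.getElementsByClassName('PageNav')[0].getAttribute('data-last'), 10);

for (var i = 1; i < lastPage + 2; i++) {
  // ... $.get(url + 'page-' + i, ...) as before
}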
EDIT: My answer discusses the issue, but see Alnitak's answer for a much better solution.
The get in getLikeURLs and the post in likeLinks are both asynchronous. The calls to both of these functions return immediately. When data is returned from the called server at some indeterminate time later, the callback functions are then called. The posts could fire before the gets return, which would be a problem in your case. Also note that JavaScript is NOT multi-threaded, so the two methods, getLikeURLs and likeLinks, will never run at the same time. The callback functions, on the other hand, might be called at any time later with no guarantee as to the callback order. For example, the 3rd get/post might return before the 1st get/post in your loops.
You could use $.ajax to specify that the gets and posts are synchronous, but this is ill advised because the browser will hang if ANY request doesn't return in a reasonable amount of time (e.g. the server is offline). Plus you don't have the "multi-tasking" benefit of sending out a lot of requests and having the various servers working at the same time; they would do so serially.
The trick is to simply call likeLinks from the callback function in getLikeURLs. Your case is a little tricky because of the for loop, but this should work:
var links = [];
var url = '/' + window.location.pathname.split('/')[1] + '/' + window.location.pathname.split('/')[2] + '/'

getLikeURLs();
//likeLinks(); // Don't call yet. Wait for gets to all return.

function getLikeURLs() {
  var returnCount = 0; // Initialize a callback counter.
  var count = parseInt(document.getElementsByClassName('PageNav')[0].getAttribute('data-last')) + 1;
  for (i = 0; i < count; i++) {
    var link = $.get(url + 'page-' + (i + 1), function(data) {
      //gets the like links from current page
      $(data).find('a[class="LikeLink item control like"]').each(function() {
        links.push($(this).attr('href')); // Puts the links in the Array
      });
      // If all gets have returned, call likeLinks.
      returnCount++;
      if (returnCount === count) {
        likeLinks();
      }
    });
  }
}

function likeLinks() {
  for (t = 0; t <= links.length; t++) {
    var token = document.getElementsByName('_xfToken')[0].getAttribute('value')
    $.post(links[t], {
      _xfToken: token,
      _xfNoRedirect: 1,
      _xfResponseType: 'json'
    }, function(data) {});
  }
}
