I've got a for loop inside a for loop. The outer loop should get a username, get their rating, append both to the same list item, then move on to the next username, rating, and append, and so on until it's gone through every user in the friends list.
//Get usernames
var current = Parse.User.current();
var relation = current.relation("FriendRelations");
relation.query().find({
success: function(results) {
for (var i = 0; i < results.length; i++) {
//This shouldn't increment until the rating has been retrieved in the
// next function.
theuser = results[i].getUsername();
$('ul').prepend('<li id = "frienditems_' + i + '"><div id ="friendname">' +
results[i].getUsername() + '</div></li>');
//Get friend's rating.
//This is a query within a query.
var GameScore = Parse.Object.extend("Rating");
var query = new Parse.Query(GameScore);
query.equalTo("user", results[i].getUsername());
query.find({
success: function(result) {
//The log just outputs the last user's name each time, because I guess the
// other function already looped through completely, so it's forever set to the last one?
console.log(theuser);
for (var y = 0; y < result.length; y++) {
var object = result[y].get("Rating");
//Logging the rating.
console.log("Rating:" + object);
$('#frienditems_' + y).append('<p class="friendrating">' + object + '</p>');
}
},
error: function(error) {}
});
}
}
});
Here is my console.log:
John
Rating:5
John
Rating:50
John
Rating:43
John
Rating:80
I want it to be this:
George
Rating:5
Smith
Rating:50
Robert
Rating:43
John
Rating:80
Can anyone help? I've looked at other questions, but I can't figure out how to apply them to my situation. I wish I could at least access the first function's results within the second function.
UPDATE: Mihail's answer really helped me out. The console log now shows the data being retrieved in the correct order, but it's still not all appending to its respective list item.
Your issue is caused by the asynchronous call to query.find(). You are essentially telling the browser to fetch your information from the cloud database and what to do with the result if the request succeeds, while the rest of the code keeps executing (in your case, the for loop keeps iterating over the first result set).
By the time the first request finishes, the for loop has already reached its end, and because the second query's success callback is in the scope of the first, the variable theuser still exists but holds the last value it was assigned.
To prevent that, you can change the scope of the variable using a function call with your parameter. I've rewritten your code and it looks like this:
var current = Parse.User.current();
var relation = current.relation("FriendRelations");
var $ul = $('ul');
relation.query().find({
success:function(results){
for(var i = 0; i < results.length; i++) {
theuser = results[i].getUsername();
// alternatively: var $friendrating = $("<li id='frienditems_" + i + "'/>").html(...);
var $friendrating = $(document.createElement("li")).attr("id", "frienditems_" + i).html('<div class="friendname">' + results[i].getUsername() + '</div>');
$ul.prepend($friendrating);
getUserRating(theuser, $friendrating);
}
}
});
function getUserRating(theUser, $node) {
//Get friend's rating.
//This is a query within a query.
var GameScore = Parse.Object.extend("Rating");
var query = new Parse.Query(GameScore);
query.equalTo("user", theUser);
query.find({
success: function(result) {
// theUser is scoped to this function call, so each request keeps the
// correct value for its own user.
console.log(theUser);
for (var y = 0; y < result.length; y++) {
var object = result[y].get("Rating");
//Logging the rating.
console.log("Rating:" +object);
$node.append('<p class="friendrating">' + object + '</p>');
}
},
error: function(error) {
}
});
}
Please be advised that I haven't tested it and I'm not 100% sure it will work on the first try.
Because the anonymous callback success: function(result) {} is only called after the for loop finishes in this case: query.find() is an asynchronous call, so the for loop continues on to the next iteration regardless of whether query.find() has returned yet.
The variable theuser therefore always holds the value from the last iteration of the for loop, because the loop completes before any of the callback functions are triggered.
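The same effect is easy to reproduce without Parse at all. Here is a minimal sketch (not from the original code) that uses setTimeout to stand in for any asynchronous call, and shows how passing the loop variable into a function gives each callback its own copy:
for (var i = 0; i < 3; i++) {
    setTimeout(function() {
        console.log(i); // logs 3, 3, 3 - every callback sees the final value of i
    }, 0);
}

for (var i = 0; i < 3; i++) {
    (function(i) { // i is now a parameter, scoped to this function call
        setTimeout(function() {
            console.log(i); // logs 0, 1, 2
        }, 0);
    })(i);
}
This is exactly what the getUserRating(theuser, $friendrating) call above does for the Parse query.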
Related
I need a little help. I'm trying to run my second function "likeLinks();" but only after my first function "getLikeURLs();" is finished. This is because my 2nd function relies on the links Array to execute. It seems like they are trying to run at the same time.
Any help would be appreciated.
var links = [];
var url = '/' + window.location.pathname.split('/')[1] + '/' + window.location.pathname.split('/')[2] + '/'
getLikeURLs();
likeLinks();
function getLikeURLs() {
for (i = 1; i < parseInt(document.getElementsByClassName('PageNav')[0].getAttribute('data-last')) + 2; i++) {
var link = $.get(url + 'page-' + i, function(data) {
//gets the like links from current page
$(data).find('a[class="LikeLink item control like"]').each(function() {
links.push($(this).attr('href')); // Puts the links in the Array
});
});
}
}
function likeLinks() {
for (t = 0; t <= links.length; t++) {
var token = document.getElementsByName('_xfToken')[0].getAttribute('value')
$.post(links[t], {
_xfToken: token,
_xfNoRedirect: 1,
_xfResponseType: 'json'
}, function(data) {});
}
}
The link variables are actually jQuery deferred objects - store them in an array and then you can use $.when() to create a new deferred object that only resolves when all of the previous $.get() operations have completed:
function getLikeURLs(url) { // NB: parameter, not global
var defs = [], links = []; // NB: links no longer global
for (...) {
var link = $.get(...);
defs.push(link);
}
// wait for previous `$.get` to finish, and when they have create a new
// deferred object that will return the entire array of links
return $.when.apply($, defs).then(function() { return links; });
}
Then, to start the chain of functions:
getLikeURLs(url).then(likeLinks);
Note that likeLinks will now be passed the array of links instead of accessing it from the global state. That function should also be rewritten to allow you to wait for its $.post calls to complete, too:
function likeLinks(links) {
// loop invariant - take it outside the loop
var token = document.getElementsByName('_xfToken')[0].getAttribute('value');
// create array of deferreds, one for each link
var defs = links.map(function(link) {
return $.post(link, {
_xfToken: token,
_xfNoRedirect: 1,
_xfResponseType: 'json'
});
});
// and another for when they're all done
return $.when.apply($, defs);
}
p.s. don't put that (relatively) expensive parseInt(document.getElementsByClassName(...)[0].getAttribute(...)) expression inside the for statement - it will be evaluated on every iteration. Calculate it once outside the loop and store it in a variable. There are a few other places where you're repeating calls unnecessarily, e.g. window.location.pathname.split().
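For example, a sketch of that particular change (collectLinksFromPage is just a made-up name for the same anonymous success handler used in the loop above):
var nav = document.getElementsByClassName('PageNav')[0];
var lastPage = parseInt(nav.getAttribute('data-last'), 10); // evaluated once, outside the loop
for (var i = 1; i <= lastPage + 1; i++) {
    defs.push($.get(url + 'page-' + i, collectLinksFromPage));
}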
EDIT: My answer discusses the issue, but see Alnitak's answer for a much better solution.
The get in getLikeURLs and the post in likeLinks are both asynchronous. The calls to both of these functions return immediately. When data comes back from the server at some indeterminate time later, the callback functions are then called. The posts could fire before the gets have returned, which would be a problem in your case. Also note that JavaScript is not multi-threaded, so the two methods getLikeURLs and likeLinks will never run at the same time. The callback functions, on the other hand, might be called at any time later, with no guarantee as to the callback order. For example, the 3rd get/post might return before the 1st get/post in your loops.
You could use $.ajax to make the gets and posts synchronous, but this is ill-advised because the browser will hang if any request doesn't return in a reasonable amount of time (e.g. the server is offline). You also lose the "multi-tasking" benefit of sending out a lot of requests and having the various servers work on them at the same time; they would be handled serially.
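For reference only - and not recommended, for the reasons above - a synchronous version of one of those gets would look roughly like this:
// Blocks the browser until the response arrives; avoid this in real code.
$.ajax({
    url: url + 'page-' + i,
    async: false,
    success: function(data) {
        // same handler body as in the $.get version
    }
});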
The trick is to simply call likeLinks from the callback function in getLikeURLs. Your case is a little tricky because of the for loop, but this should work:
var links = [];
var url = '/' + window.location.pathname.split('/')[1] + '/' + window.location.pathname.split('/')[2] + '/'
getLikeURLs();
//likeLinks(); // Don't call yet. Wait for gets to all return.
function getLikeURLs() {
var returnCount = 0; // Initialize a callback counter.
var count = parseInt(document.getElementsByClassName('PageNav')[0].getAttribute('data-last')) + 1;
for (i = 0; i < count; i++) {
var link = $.get(url + 'page-' + (i + 1), function(data) {
//gets the like links from current page
$(data).find('a[class="LikeLink item control like"]').each(function() {
links.push($(this).attr('href')); // Puts the links in the Array
});
// If all gets have returned, call likeLinks.
returnCount++;
if (returnCount === count) {
likeLinks();
}
});
}
}
function likeLinks() {
for (t = 0; t < links.length; t++) {
var token = document.getElementsByName('_xfToken')[0].getAttribute('value')
$.post(links[t], {
_xfToken: token,
_xfNoRedirect: 1,
_xfResponseType: 'json'
}, function(data) {});
}
}
I'm trying to save different food names without duplicates on parse.com. However, when I run the code, the database consists of the same 2 or 3 foods over and over, instead of 200 or so unique names.
Below is my function. I tried logging the name of the food at two different points, and I get different values. The first point gives the correct name of the food, but the second point only shows either flaxseed muffins or raspberry pie. I think the problem has to do with the code running asynchronously, but I'm not sure how to resolve the issue.
Parse.Cloud.define("recordFavorite", function(request, response) {
var foodList = request.params.foodList; //string array of food names
var Food = Parse.Object.extend("Food");
var query = new Parse.Query(Food);
for (i = 0; i < foodList.length; i++ ) {
var name = foodList[i];
console.log("before name is " + name);
var query = new Parse.Query(Food);
query.exists("name", name);
query.find({
success: function(results) {
if(results.length == 0){
var food = new Food();
food.set("name", name);
food.save(null, {
success: function(food) {
console.log("saved with name " + name);
},
error: function(food, error) {
}
});
} else {
//don't create new food
}
},
error: function(error) {
}
});
}
});
EDIT:
I was able to make some progress by modifying it to the code pasted below. Now it saves all the objects, including duplicates. I noticed that the lines
var query = new Parse.Query(Food);
query.exists("name", name);
return an array of all the foods and don't filter the objects by "name". (To be clear, this was probably still occurring in the original code, but I hadn't noticed.)
Parse.Cloud.define("recordFavorite", function(request, response) {
var foodList = request.params.foodList; //string array of food names
var foodListCorrected = new Array();
var Food = Parse.Object.extend("Food");
// Wrap your logic in a function
function process_food(i) {
// Are we done?
if (i == foodList.length) {
Parse.Object.saveAll(foodListCorrected, {
success: function(foodListCorrected) {
},
error: function(foodListCorrected) {
}
});
return;
}
var name = foodList[i];
var query = new Parse.Query(Food);
query.exists("name", name);
query.find({
success: function(results) {
console.log(results.length);
if(results.length == 0){
//console.log("before " + foodListCorrected.length);
var food = new Food();
food.set("name", name);
foodListCorrected.push(food);
// console.log(foodListCorrected.length);
} else {
//don't create new food
}
process_food(i+1)
},
error: function(error) {
console.log("error");
}
});
}
// Go! Call the function with the first food.
process_food(0);
});
I think you're right about the problem being the async logic. The outer loop completes as quickly as it can, firing off the various, slower async calls for your food lookup queries as it goes. The outer loop doesn't wait, and because of what's known as 'variable hoisting', when you access 'name' inside your success function its value will be the latest value of 'name' from the outer loop. So by the time the success function is called, the value of name has moved on to a different food from the one it held when you first initiated the exists/save query sequence.
Here's a really simple example:
Say your foodList looked like ['Muffin', 'Cheesecake']. When you enter the loop for the first time, you have name='Muffin'. You fire off your exists query for name='Muffin' and it now runs asynchronously. Meanwhile, the outer loop happily moves on and sets name='Cheesecake', firing off another exists query. Meanwhile, your first exists query completes and you are now ready to save the first food. But, because of hoisting, the value of name within your success function is now 'Cheesecake', so it saves 'Cheesecake' when it should have saved 'Muffin'. Then the second set of async queries completes, and this one also saves 'Cheesecake'. So you get two foods, representing your two unique foods, but both are called 'Cheesecake'!
Here's the classic article on variable hoisting, it is well worth a read:
http://www.adequatelygood.com/JavaScript-Scoping-and-Hoisting.html
A way of solving this would be to only trigger the processing of the next food once all the async calls for the current food have completed. You can do this like this:
Parse.Cloud.define("recordFavorite", function(request, response) {
var foodList = request.params.foodList; //string array of food names
var Food = Parse.Object.extend("Food");
var query = new Parse.Query(Food);
// Wrap your logic in a function
function process_food(i) {
// Are we done?
if (i == foodList.length) return;
var name = foodList[i];
console.log("before name is " + name);
var query = new Parse.Query(Food);
query.exists("name", name);
query.find({
success: function(results) {
if(results.length == 0){
var food = new Food();
food.set("name", name);
food.save(null, {
success: function(food) {
console.log("saved with name " + name);
// Move onto the next food, only after all the async operations
// have completed.
process_food(i+1)
},
error: function(food, error) {
}
});
} else {
//don't create new food
}
},
error: function(error) {
}
});
}
// Go! Call the function with the first food.
process_food(0);
});
(Note, I've not tested this code, so there might be syntax errors).
I've not come across Parse before... I saw your question, went off to read about it, and thought it looked very interesting! I will remember it for my next PHP API project. I think there are some smarter things you can try to do. For example, your approach requires 2 async calls per food, one to see if it exists, and one to save it. For 200 foods, that's 400 async calls. However, the Parse API looks very helpful, and I think it will offer tools to help you cut this down. You could probably try something along the following lines:
You already have an array of strings of the names you want to save:
var foodList = request.params.foodList; //string array of food names
Say it looks like ["Cupcakes", "Muffins", "Cake"].
Now build a Parse query that gets all the food names already on the server (I don't know the exact call for this!). You should get back an array, let's say ["Cupcakes", "Cheesecake"].
Now you can strip the duplicates in JavaScript. There are some nice questions here on StackOverflow to help with this! The result will be that "Cupcakes" is a duplicate, so we are left with the array ["Muffins", "Cake"].
Now it looks like in Parse you can Batch some operations:
https://parse.com/docs/rest#objects-batch
so your goal is to save this array of ["Muffins", "Cake"] with one API call.
This approach will scale well with the number of foods, so even with 200 foods, you should be able to do it in one query, and one batch update per 50 foods (I think 50 is a batch limit, from the Parse docs), so at most you will need 5 API calls.
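Here is a rough, untested sketch of that whole idea using the JavaScript SDK (containedIn narrows the query to only the names you care about, and saveAll does the batched save; check the current docs for the exact query and batch limits):
var Food = Parse.Object.extend("Food");
var query = new Parse.Query(Food);
query.containedIn("name", foodList); // only fetch foods whose names are in our list
query.find({
    success: function(existing) {
        // names already stored on the server
        var existingNames = existing.map(function(f) { return f.get("name"); });
        // keep only the names we don't have yet (strip in-list duplicates separately, as above)
        var newFoods = foodList.filter(function(name) {
            return existingNames.indexOf(name) === -1;
        }).map(function(name) {
            var food = new Food();
            food.set("name", name);
            return food;
        });
        // one batched save for all the new foods
        Parse.Object.saveAll(newFoods, {
            success: function(saved) { /* all done */ },
            error: function(error) { console.log(error); }
        });
    },
    error: function(error) { console.log(error); }
});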
I believe this (https://www.parse.com/docs/js_guide#promises-series) is the solution you're looking for. You need to utilize promises to force synchronicity.
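For what it's worth, here is an untested sketch of that pattern applied to the original cloud function; it follows the 'series' example from the linked guide, relying on the promise form of query.find() and object.save():
Parse.Cloud.define("recordFavorite", function(request, response) {
    var foodList = request.params.foodList; //string array of food names
    var Food = Parse.Object.extend("Food");
    var promise = Parse.Promise.as(); // start with an already-resolved promise
    foodList.forEach(function(name) {
        // chain each food onto the previous one so the queries run strictly in series
        promise = promise.then(function() {
            var query = new Parse.Query(Food);
            query.equalTo("name", name);
            return query.find().then(function(results) {
                if (results.length === 0) {
                    var food = new Food();
                    food.set("name", name);
                    return food.save();
                }
            });
        });
    });
    promise.then(function() {
        response.success();
    }, function(error) {
        response.error(error);
    });
});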
I am using node.js.
I have to add new elements in the object before to send a response to client.
user.getMatch(req.user, function(err, match){
for( k=0; k<match.length; k++){
var userId = {
id : match[k].match_id
};
var user = new User(userId);
console.log('k: ' + k);
user.getUserInfo(function(err2, info){
console.log('k here: ' + k);
if(info){
match[k].foo = info[0].foo;
}
});
}
var response = {
data : match
};
res.json(response);
});
I want to add an element "foo" from user.getUserInfo to the object "match" that was returned by user.getMatch. And then send all the data as response to the client.
But I get an error because the "k" inside of user.getUserInfo is not equal to the "k" outside.
I do not know why the two "k"s are not equal.
Also, how do I send the response to the client only after the loop has completed?
Thanks for your help!
Some problems here:
First, k is not defined, so the k you're using is actually a global variable, which is not what you want. You need to declare it with 'var k'.
Second, the callback function you're passing to user.getUserInfo() is (probably) executed at some unknown time in the future. At that point your for (k ... loop has already finished, so the k variable already has a new value compared to the value it had when you called user.getUserInfo(). And here's the tricky part: the code inside your callback function will use k's most recent value, not the value that k had when the function was created.
You can solve this by adding a parameter to your callback function and binding k to it using the .bind method:
user.getMatch(req.user, function(err, match){
var k;
for(k=0; k < match.length; k++){
var userId = {
id : match[k].match_id
};
var user = new User(userId);
console.log('k: ' + k);
var callback = function(k, err2, info){
console.log('k here: ' + k);
if(info){
match[k].foo = info[0].foo;
}
}.bind(null, k);
user.getUserInfo(callback);
}
var response = {
data: match
};
res.json(response);
});
Also, you'd be better off by using .forEach for iterating over an array:
user.getMatch(req.user, function(err, match){
match.forEach(function(curr) {
var userId = {
id : curr.match_id
};
var user = new User(userId);
user.getUserInfo(function(err2, info){
if(info){
curr.foo = info[0].foo;
}
});
});
var response = {
data: match
};
res.json(response);
});
Although Array.forEach can give you the current index in the iteration, it is no longer needed here. Simply use the curr value (which gives you the current element in the iteration).
Finally, I think the code here is likely to send the response before all the user.getUserInfo() calls have completed. To avoid that, you need to know when every user.getUserInfo() call is done. This can be achieved by adding a variable numLeft which is decremented each time a user's info comes back; when it reaches zero we know that all getUserInfo() calls have completed and it is therefore safe to send the response back.
user.getMatch(req.user, function(err, match) {
var numLeft = match.length;
match.forEach(function(curr) {
var user = new User({
id : curr.match_id
});
user.getUserInfo(function(err2, info){
if(info) {
curr.foo = info[0].foo;
}
--numLeft;
if (numLeft == 0)
res.json({ data: match });
});
});
});
When you say "k inside and outside", do you mean inside and outside of user.getUserInfo(function(err2, info){})?
I am not sure of your context, however I can think of two things:
Since the function "function(err2, info)" is a callback and is executed asynchronously, the context/stack in which k is used within getUserInfo is completely different by the time it runs. You could try to pass k along with the call, i.e. user.getUserInfo(function(err2, info, k){}), but that only works if getUserInfo forwards the extra argument.
Alternatively, declare k (i.e. var k) in a closure local to each iteration, so each callback sees its own copy - see the sketch below.
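A minimal sketch of that second suggestion, capturing k per iteration with an immediately-invoked function (a generic illustration, not the poster's exact API):
for (var k = 0; k < match.length; k++) {
    (function(k) { // k is now local to this function call
        var user = new User({ id: match[k].match_id });
        user.getUserInfo(function(err2, info) {
            if (info) {
                match[k].foo = info[0].foo; // k keeps the value it had for this iteration
            }
        });
    })(k);
}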
Updating for another part of the question:
"But I got another issue: it sends a response to the client before it adds the "foo" element. So the response to the client only contains the object from "match", without the "foo" element."
That is again because your code inside getUserInfo is executed asynchronously.
For this you need to keep a global flag, or send the response from within getUserInfo, i.e.:
var mathLen = match.length - 1; // index of the last match
user.getUserInfo(function(err2, info, k, mathLen) { // assumes getUserInfo forwards k and mathLen
    console.log('k here: ' + k);
    if (info) {
        match[k].foo = info[0].foo;
    }
    if (k == mathLen) {
        var response = {
            data: match
        };
        res.json(response);
    }
});
Each time the "success" function of this code block executes, some unwanted events happen:
If I don't refresh the browser and enter the same name again, a duplicate image appears. This happens each time the code is run.
If I don't refresh the browser and type a name that doesn't exist, whereas the previous search returned a name that did, then both images are displayed on the page.
How do I stop the duplication? I've looked at alternatives to .append in jQuery, but none are having the desired result.
I think I also need the query to reset each time it's run, otherwise it appears this also causes complications.
var friendName;
function findFriend() {
friendName = $('#friendsearch').val();
console.log(friendName);
var query = new Parse.Query(Parse.User);
query.equalTo("username", friendName); // find users that match
query.find({
success: function (friendMatches) {
// This section is always run, no matter the outcome. Which result is shown depends on the input.
if (friendMatches.length === 0)
// console.log('NO MATCH FOUND!!!');
$(".no_user").show();
else // Query executed with success
imageURLs = [];
for (var i = 0; i < friendMatches.length; i++) {
var object = friendMatches[i];
imageURLs.push(object.get('pic'));
}
// If the imageURLs array has items in it, set the src of an IMG element to the first URL in the array
for (var j = 0; j < imageURLs.length; j++) {
$('#imgs').append("<img src='" + imageURLs[j] + "'/>");
}
console.log('MATCH FOUND!!!');
},
// Only if the code breaks and cannot either find or not find a user should this error be returned
error: function (error) {
alert('Opps we have a problem' + error.message);
}
});
}
// This captures the users input and is triggered when the user presses the find
$('#find_button').click(function (e) {
findFriend();
});
You need to remove the old image(s) before adding the new one(s):
$('#imgs').empty();
When to clear is another issue (I think this is your 'unwanted event #2'):
success: function (friendMatches) {
// clear images first, so that previous search results
// don't show when the current search returns 0 results
$('#imgs').empty();
...
}
So I am quite new to JavaScript and unsure how to go about solving this issue. I know that the query is asynchronous, so it cannot simply be executed inside a loop, but I do not know how to work around this.
Here I am calling a query to get a list of all users. There is then a for loop to go through each user; the second query is executed for each user and uses that user's username as part of the class name. Does anyone have any suggestions for what I should do?
var query = new Parse.Query(User);
query.find({
success: function (results) {
$(".success1").show();
progressBar.max = results.length * 2;
for (var i = 0; i < results.length; i++) {
var object = results[i];
var Predictions = Parse.Object.extend("predictions" + object.get("username"));
var query2 = new Parse.Query(Predictions);
query2.equalTo("matchWeekID", weekNum.value);
query2.find({
This is where the code does not run as expected: query2.find only executes once the for loop has finished.