Here is the Parse JavaScript Cloud Code I wrote. I want to find all objects in the subclass "Example" that have one like and reset their like count to four. I have already created the Example class and the Like column in the data browser, but the query doesn't work and I can't figure out why.
function exampleFunction() {
    var Example = Parse.Object.extend("Example");
    var newObject = new Example();
    newObject.save(); // up to here the function works; it keeps creating new objects
    var query = new Parse.Query(Example);
    query.equalTo('Like', 1);
    query.find({
        success: function(result) {
            for (var i = 0; i < result.length; i++) {
                result[i].set('Like', 4);
            }
        },
        error: function(error) {
        }
    });
}

Parse.Cloud.define("nice", function(request, response) {
    exampleFunction();
    response.success();
});
I use this piece of code on an iOS device to trigger the cloud function:
[PFCloud callFunctionInBackground:@"nice"
                   withParameters:@{}
                            block:^(NSString *result, NSError *error) {
    if (!error) {
    }
}];
A couple of possible issues:
You are calling asynchronous methods and not giving them time to complete. That's where Parse promises come in; you have to make sure to use the then function. See http://blog.parse.com/2013/01/29/whats-so-great-about-javascript-promises/
You are correctly setting 'Like' to 4, but you aren't saving the rows by calling save.
You may not have any rows coming back from your query; the way to check that is to pass the number of rows found back through the success callback, which I am doing below.
Try this below, noticing that the .success should return a result if you NSLog(@"result %@", result) from your Objective-C. Also, the error should come through now as well because of response.error(error).
var Example = Parse.Object.extend("Example");

function exampleFunction() {
    var query = new Parse.Query(Example);
    query.equalTo('Like', 1);
    return query.find().then(function(examplesLikedOnce) {
        var promises = [];
        for (var i = 0; i < examplesLikedOnce.length; i++) {
            var example = examplesLikedOnce[i];
            var promise = example.save({Like: 4});
            promises.push(promise);
        }
        return Parse.Promise.when(promises).then(function() {
            return examplesLikedOnce.length;
        });
    });
}

Parse.Cloud.define("nice", function(request, response) {
    exampleFunction().then(function(numExamples) {
        response.success("The number of Example objects with 1 like: " + numExamples);
    }, function(error) {
        response.error(error);
    });
});
Specifically, given a list of data, I want to loop over that list and do a fetch for each element before combining it all afterward. The thing is, as written, the code iterates through the entire list immediately, starting all the operations at once, and then the then call I have after all of that runs before the fetch operations have finished processing the data.
I read something about putting all the Promises in an array, then passing that array to a Promise.all() call, followed by a then that will have access to all that processed data as intended, but I'm not sure how exactly to go about doing it in this case, since I have nested Promises in this for loop.
for (var i in repoData) {
    var repoName = repoData[i].name;
    var repoUrl = repoData[i].url;
    (function(name, url) {
        Promise.all([fetch(`https://api.github.com/repos/${username}/${repoData[i].name}/commits`),
                     fetch(`https://api.github.com/repos/${username}/${repoData[i].name}/pulls`)])
            .then(function(results) {
                Promise.all([results[0].json(), results[1].json()])
                    .then(function(json) {
                        //console.log(json[0]);
                        var commits = json[0];
                        var pulls = json[1];
                        var repo = {};
                        repo.name = name;
                        repo.url = url;
                        repo.commitCount = commits.length;
                        repo.pullRequestCount = pulls.length;
                        console.log(repo);
                        user.repositories.push(repo);
                    });
            });
    })(repoName, repoUrl);
}
}).then(function() {
    var payload = new Object();
    payload.user = user;
    //console.log(payload);
    //console.log(repoData[0]);
    res.send(payload);
});
Generally when you need to run asynchronous operations for all of the items in an array, the answer is to use Promise.all(arr.map(...)) and this case appears to be no exception.
Also remember that you need to return values in your then callbacks in order to pass values on to the next then (or to the Promise.all aggregating everything).
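For instance (an illustrative snippet, not taken from the code above), a value reaches the next then only when the callback returns it:

Promise.resolve(1)
    .then(function (n) {
        return n + 1;      // returned, so the next then receives 2
    })
    .then(function (n) {
        console.log(n);    // 2
        n + 1;             // not returned, so the next then receives undefined
    })
    .then(function (n) {
        console.log(n);    // undefined
    });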
When faced with a complex situation, it helps to break it down into smaller pieces. In this case, you can isolate the code to query data for a single repo into its own function. Once you've done that, the code to query data for all of them boils down to:
Promise.all(repoData.map(function (repoItem) {
    return getDataForRepo(username, repoItem);
}))
Please try the following:
// function to query details for a single repo
function getDataForRepo(username, repoInfo) {
    return Promise
        .all([
            fetch(`https://api.github.com/repos/${username}/${repoInfo.name}/commits`),
            fetch(`https://api.github.com/repos/${username}/${repoInfo.name}/pulls`)
        ])
        .then(function (results) {
            return Promise.all([results[0].json(), results[1].json()]);
        })
        .then(function (json) {
            var commits = json[0];
            var pulls = json[1];
            var repo = {
                name: repoInfo.name,
                url: repoInfo.url,
                commitCount: commits.length,
                pullRequestCount: pulls.length
            };
            console.log(repo);
            return repo;
        });
}

Promise.all(repoData.map(function (repoItem) {
    return getDataForRepo(username, repoItem);
})).then(function (retrievedRepoData) {
    console.log(retrievedRepoData);
    var payload = new Object();
    payload.user = user;
    //console.log(payload);
    //console.log(repoData[0]);
    res.send(payload);
});
In a Chrome extension I'm using the HTML5 FileSystem API.
I'm retrieving a list of entries in a folder.
var entries = [];
var metadata = [];

listFiles(folder);

function listFiles(fs) {
    var dirReader = fs.createReader();
    entries = [];

    // Call the reader.readEntries() until no more results are returned.
    var readEntries = function () {
        dirReader.readEntries(function (results) {
            if (!results.length) {
                addMeta(entries);
            } else {
                console.log(results);
                entries = entries.concat(toArray(results));
                readEntries();
            }
        });
    };

    readEntries(); // Start reading dirs.
}
The FileEntry object does not contain metadata, but I need the last modified date. I am able to retrieve a metadata object:
function addMeta(entries) {
    for (var i = 0; i < entries.length; i++) {
        entries[i].getMetadata(function (metadata) {
            console.log(entries);
            console.log(metadata);
        });
    }
}
The problem is that I get the metadata in a callback.
How can I join the two objects while making sure the right match is made?
The simplified result I'm looking for is:
[
    ["fileName1", "modifyDate1"],
    ["fileName2", "modifyDate2"],
]
To get lastModifiedDate you don't need to use getMetadata; as described in this question, you can read it from the File object that entry.file() hands to its callback (note that file() is itself asynchronous).
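For example, a minimal sketch (assuming entry is one of the FileEntry objects collected above):

// Read the last modified date via the File object instead of getMetadata.
entry.file(function (file) {
    console.log(entry.name, file.lastModifiedDate);
}, function (error) {
    console.error(error);
});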
To "join the two object making sure the right match is made", because of Closures, you could use the following code to get the right results. (Assuming the data structure is [[entry, metadata]] as you mentioned)
var ans = [];

function addMeta(entries) {
    for (var i = 0; i < entries.length; i++) {
        (function (entry) {
            entry.getMetadata(function (metadata) {
                ans.push([entry, metadata]);
            });
        })(entries[i]);
    }
}
If what you want is to wait until all of the asynchronous callbacks have finished, see this answer for more details; basically you can adjust your code to use Promises, or fall back on other approaches such as setInterval or a counter that tracks how many callbacks remain. A sketch of the Promise-based approach follows.
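This is an illustrative sketch only (not part of the original answer), wrapping each getMetadata call in a native Promise and waiting for all of them; the [name, modificationTime] pairs match the simplified result asked for above:

function addMeta(entries) {
    // Wrap each getMetadata call in a Promise so we can wait for all of them.
    var tasks = entries.map(function (entry) {
        return new Promise(function (resolve, reject) {
            entry.getMetadata(function (metadata) {
                resolve([entry.name, metadata.modificationTime]);
            }, reject);
        });
    });
    return Promise.all(tasks);
}

// Usage: resolves to [["fileName1", modifyDate1], ["fileName2", modifyDate2], ...]
addMeta(entries).then(function (result) {
    console.log(result);
});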
I suggest having a look at bro-fs, a Promise-based wrapper of the HTML Filesystem API.
To read all entries with metadata you can do something like this:
fs.readdir('dir')
    .then(entries => {
        const tasks = entries.map(entry => fs.stat(entry.fullPath));
        return Promise.all(tasks);
    })
    .then(results => console.log(results));
I am trying to create posts using a for loop, but when I look at the Parse database only the last object of my array gets stored. This is the code I wrote:
var Reggione = Parse.Object.extend("Reggione");
var creaReggione = new Reggione();
var selectobject = $('#searcharea')[0];

for (var i = 2; i < selectobject.length; i++) {
    creaReggione.set("name", selectobject.options[i].text);
    creaReggione.save();
}
Thanks, Bye.
Do this by creating an array of new objects, then saving them all together...
var newObjects = [];
for (var i = 2; i < selectobject.length; i++) {
    var creaReggione = new Reggione();
    creaReggione.set("name", selectobject.options[i].text);
    newObjects.push(creaReggione);
    // ...
}
Parse.Object.saveAll(newObjects);
Remember, if you want something to happen after saveAll completes (like call response.success() if you're in cloud code), then you should use that promise as follows...
Parse.Object.saveAll(newObjects).then(function(result) {
    response.success(result);
}, function(error) {
    response.error(error);
});
In extension to danh's answer, the reason this does not work is that you reuse a single Reggione instance: each pass through the loop overwrites its "name" field and calls .save() on that same object, so at most one row ever shows up in Parse. The save() calls are also asynchronous, so the loop does not wait for any of them to finish. As Danh pointed out, you should create a new object on every iteration and use Parse's batch operation to save them all in one go, which you can do like this:
var newObjects = [];
for (var i = 2; i < selectobject.length; i++) {
    var creaReggione = new Reggione();
    creaReggione.set("name", selectobject.options[i].text);
    newObjects.push(creaReggione);
    // ...
}
Parse.Object.saveAll(newObjects);
Hope this helps. I'd also recommend taking a look at Parse's callback options on the save method to get more details on what happened (you can hook into the success and error callbacks there to make debugging a little easier).
An example of this would be to extend the previous call with:
Parse.Object.saveAll(newObjects, {
    success: function(messages) {
        console.log("The objects were successfully saved...");
    },
    error: function(error) {
        console.log("An error occurred when saving the messages array: %s", error.message);
    }
});
I hope this is of some help to you
I'm using the Request library. I have an array of URLs (var resources) whose text content I want to collect into another array.
My code looks like this:
var resources = [<URL_1>, <URL_2>, <URL_3>, ...];
var resourcesText = resourcesToText(resources);
function resourcesToText(resources) {
    var text = [];
    for (var i in resources) {
        request(resources[i], fetch);
    }
    function fetch(error, response, body) {
        if (!error) {
            text.push(body);
        } else {
            console.log('Sorry. I couldn\'t parse that resource. ' + error);
        }
    }
    return text;
}
The problem is that resourcesToText() returns the array before fetch() has had time to populate it. Thus, resourcesText ends up being empty. I assume higher-order functions are meant to deal with this kind of problem, but I can't fathom a way to work in the callback.
What's the best approach to this problem?
Make an asynchronous function out of resourcesToText by:
Using an additional callback:
function resourcesToText(resources, callback) {
    var text = [],
        requestCount = resources.length;

    for (var i in resources) {
        request(resources[i], fetch);
    }

    function fetch(error, response, body) {
        if (!error) {
            text.push(body);
        } else {
            console.log('Sorry. I couldn\'t parse that resource. ' + error);
        }
        requestCount--;
        if (requestCount <= 0) {
            callback(text);
        }
    }
}
Call the callback with your result text after you have received all the results of your requests, using a counter (requestCount).
Now the call to resourcesToText looks like:
resourcesToText(resources, function(text){
...
});
Using deferred/promises
Instead of using a callback, you could also implement the concept of deferred/promises. There are many libraries out there, such as q (https://github.com/kriskowal/q)
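As an illustration only (not from the original answer), here is a sketch of the same function using native Promises to wrap request; the same shape works with q's deferreds:

function resourcesToText(resources) {
    // Wrap each request call in a Promise and wait for all of them.
    var requests = resources.map(function (resource) {
        return new Promise(function (resolve, reject) {
            request(resource, function (error, response, body) {
                if (!error) {
                    resolve(body);
                } else {
                    reject(error);
                }
            });
        });
    });
    return Promise.all(requests);
}

// Usage: text is an array of response bodies, in the same order as resources.
resourcesToText(resources).then(function (text) {
    console.log(text.length + ' resources fetched');
}, function (error) {
    console.log('Sorry. I couldn\'t fetch that resource. ' + error);
});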
I use Parse in iOS to run a cloud code method that gets an ID in its request and receives a number in the response.
The purpose of the cloud code function is to take the request ID and add it to a field of 3 different users.
Here is the cloud code method in Javascript:
amount = 3;

// Use Parse.Cloud.define to define as many cloud functions as you want.
// For example:
Parse.Cloud.define("addToIDs", function(request, response) {
    var value = request.params.itemId;
    var query = new Parse.Query(Parse.User);
    query.ascending("createdAt");
    query.limit(100);
    query.find({
        success: function(results) {
            var sent = 0;
            for (var i = 0; i < results.length; i++) {
                var idlst = results[i].get("idString");
                if (idlst != null && idlst.indexOf(value) <= -1) {
                    idlst += value + "|";
                    results[i].set("idString", idlst);
                    results[i].save();
                    sent = sent + 1;
                }
                if (sent >= amount) {
                    break;
                }
            }
            response.success(sent);
        },
        error: function() {
            response.error("Test failed");
        }
    });
});
When running this cloud code method I get a response of '3', meaning it called save for 3 users. The problem is that when I go back and look in the Database viewer on the Parse website, only a single user was actually updated (and it's always the same user). No matter how many times I run this code, it only ever updates that first user.
Anyone know why this is happening?
Both save and saveAll are asynchronous, so you should make sure the saving process has finished before responding.
Also note that a user object can only be updated by its owner or by a request that uses the master key.
The following code should work:
var amount = 3;

Parse.Cloud.define("addToIDs", function(request, response) {
    var value = request.params.itemId;
    var query = new Parse.Query(Parse.User);
    query.ascending("createdAt");
    query.limit(100);
    return query.find()
        .then(function(results) { // success
            var toSave = [];
            var promise = new Parse.Promise();
            for (var i = 0; i < results.length; i++) {
                var idlst = results[i].get("idString");
                if (idlst != null && idlst.indexOf(value) <= -1) {
                    idlst += value + "|";
                    results[i].set("idString", idlst);
                    toSave.push(results[i]);
                }
                if (toSave.length >= amount) {
                    break;
                }
            }
            // use saveAll to save multiple objects without firing multiple requests
            Parse.Object.saveAll(toSave, {
                useMasterKey: true,
                success: function(list) {
                    promise.resolve(list.length);
                },
                error: function() {
                    promise.reject();
                }
            });
            return promise;
        }).then(function(length) { // success
            response.success(length);
        }, function() { // error
            response.error("Test failed");
        });
});
The reason this is happening is two-fold:
save() is an asynchronous method, and
response.success() will immediately end your running code as soon as it's called.
So what's happening is that inside your for loop you call save() several times, but since it's asynchronous, those calls are simply thrown into the processing queue and the loop carries on. It quickly queues up all of your save() calls and then reaches response.success(), but by the time that runs, only one of the saves has had a chance to complete.
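For completeness, here is a sketch (not the poster's original code) of how that ordering can be fixed with individual save() promises rather than saveAll, built on the question's query callback; useMasterKey is assumed to be needed when updating other users:

// Inside the query's success callback: collect the save() promises
// instead of firing them and moving on.
var promises = [];
var sent = 0;
for (var i = 0; i < results.length; i++) {
    var idlst = results[i].get("idString");
    if (idlst != null && idlst.indexOf(value) <= -1) {
        results[i].set("idString", idlst + value + "|");
        promises.push(results[i].save(null, { useMasterKey: true }));
        sent++;
    }
    if (sent >= amount) {
        break;
    }
}

// Only report success once every save has actually finished.
Parse.Promise.when(promises).then(function() {
    response.success(sent);
}, function(error) {
    response.error(error);
});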