I use Parse on iOS to call a Cloud Code function that takes an ID in its request and returns a number in its response.
The purpose of the Cloud Code function is to take the request ID and append it to a field of 3 different users.
Here is the Cloud Code function in JavaScript:
var amount = 3;

// Use Parse.Cloud.define to define as many cloud functions as you want.
// For example:
Parse.Cloud.define("addToIDs", function(request, response) {
    var value = request.params.itemId;
    var query = new Parse.Query(Parse.User);
    query.ascending("createdAt");
    query.limit(100);
    query.find({
        success: function(results) {
            var sent = 0;
            for (var i = 0; i < results.length; i++) {
                var idlst = results[i].get("idString");
                if (idlst != null && idlst.indexOf(value) <= -1) {
                    idlst += value + "|";
                    results[i].set("idString", idlst);
                    results[i].save();
                    sent = sent + 1;
                }
                if (sent >= amount) {
                    break;
                }
            }
            response.success(sent);
        },
        error: function() {
            response.error("Test failed");
        }
    });
});
When running this Cloud Code function I get a response of '3', meaning it called .save for 3 users. The problem is that when I go back to look in the Data Browser on the Parse website, it has actually only updated a single user (it's always the same user). No matter how many times I run this code, it only ever updates that first user.
Does anyone know why this is happening?
Both save and saveAll are asynchronous, so you should make sure the saving process has finished before you return a response.
Also note that a user object can only be updated by its owner, or by a request made with the master key.
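As a side note, the legacy Parse.com Cloud Code SDK also let you enable the master key for an entire function rather than per request. A minimal sketch of that alternative (Parse.Cloud.useMasterKey() is the old API, shown here as an option, not something from the original code):

Parse.Cloud.define("addToIDs", function(request, response) {
    // Legacy Parse SDK: every request made after this call in the
    // function bypasses ACLs by using the master key.
    Parse.Cloud.useMasterKey();
    // ... query and save as in the code below ...
});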
The following code should work:
var amount = 3;

Parse.Cloud.define("addToIDs", function(request, response) {
    var value = request.params.itemId;
    var query = new Parse.Query(Parse.User);
    query.ascending("createdAt");
    query.limit(100);
    return query.find()
        .then(function(results) { // success
            var toSave = [];
            var promise = new Parse.Promise();
            for (var i = 0; i < results.length; i++) {
                var idlst = results[i].get("idString");
                if (idlst != null && idlst.indexOf(value) <= -1) {
                    idlst += value + "|";
                    results[i].set("idString", idlst);
                    toSave.push(results[i]);
                }
                if (toSave.length >= amount) {
                    break;
                }
            }
            // use saveAll to save multiple objects without sending multiple requests
            Parse.Object.saveAll(toSave, {
                useMasterKey: true,
                success: function(list) {
                    promise.resolve(list.length);
                },
                error: function() {
                    promise.reject();
                }
            });
            return promise;
        })
        .then(function(length) { // success
            response.success(length);
        }, function() { // error
            response.error("Test failed");
        });
});
The reason this is happening is two-fold:
save() is an asynchronous method, and
response.success() will immediately kill your running code as soon as it's called.
So what's happening is that inside your for loop you're calling save() several times, but since it's asynchronous, those calls are simply thrown into the processing queue and your for loop continues on through. The loop quickly queues up all of your save() calls and then reaches your response.success() call, but by the time it's reached, only one of the save() calls has had a chance to process successfully.
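A way to fix this while keeping individual save() calls is to collect the promises that save() returns and only call response.success() once all of them have resolved. A minimal sketch against the legacy Parse JS SDK (the useMasterKey option here is an assumption, mirroring the saveAll call in the answer above):

// Sketch: queue every save() promise and wait for all of them
// before reporting success.
var saves = [];
for (var i = 0; i < results.length && saves.length < amount; i++) {
    var idlst = results[i].get("idString");
    if (idlst != null && idlst.indexOf(value) <= -1) {
        results[i].set("idString", idlst + value + "|");
        saves.push(results[i].save(null, { useMasterKey: true }));
    }
}
Parse.Promise.when(saves).then(function() {
    response.success(saves.length); // runs only after every save has finished
}, function() {
    response.error("Test failed");
});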
Related
I am trying to create posts using a for loop, but when I look at the Parse database only the last object of my array gets stored. This is the code I wrote.
var Reggione = Parse.Object.extend("Reggione");
var creaReggione = new Reggione();
var selectobject = $('#searcharea')[0];

for (var i = 2; i < selectobject.length; i++) {
    creaReggione.set("name", selectobject.options[i].text);
    creaReggione.save();
}
Thanks, Bye.
Do this by creating an array of new objects (note the new Reggione inside the loop, so each iteration works on a distinct object), then save them together...

var newObjects = [];
for (var i = 2; i < selectobject.length; i++) {
    var creaReggione = new Reggione();
    creaReggione.set("name", selectobject.options[i].text);
    newObjects.push(creaReggione);
    // ...
}
Parse.Object.saveAll(newObjects);
Remember, if you want something to happen after saveAll completes (like calling response.success() if you're in Cloud Code), then you should use the promise it returns, as follows...
Parse.Object.saveAll(newObjects).then(function(result) {
    response.success(result);
}, function(error) {
    response.error(error);
});
To extend danh's answer: the reason this does not work is that only one transaction at a time goes from the JS client to Parse.
Therefore, in your loop the first call to .save() is made and the object is saved to Parse asynchronously; while that happens, the loop continues to run and skips over your other save calls, so those objects are NOT queued to be saved. As danh pointed out, you should use Parse's batch operation to save multiple objects to the server in one go. To do this you can:
var newObjects = [];
for (var i = 2; i < selectobject.length; i++) {
    var creaReggione = new Reggione();
    creaReggione.set("name", selectobject.options[i].text);
    newObjects.push(creaReggione);
    // ...
}
Parse.Object.saveAll(newObjects);
Hope this helps. I'd also recommend taking a look at the callbacks on Parse's save method to get more details on what happened (you can check the error and success callbacks here to make debugging a little easier).
An example of this would be to extend the previous call with:
Parse.Object.saveAll(newObjects, {
    success: function(messages) {
        console.log("The objects were successfully saved...");
    },
    error: function(error) {
        console.log("An error occurred when saving the messages array: %s", error.message);
    }
});
I hope this is of some help to you.
Here is the Parse JavaScript Cloud Code I wrote. I want to find all objects in the subclass "Example" that have one like, and then reset that value to four. I have already created the Example class and the Like column in the Data Browser, but the query doesn't work and I can't figure out why.
function exampleFunction() {
    var Example = Parse.Object.extend("Example");
    var newObject = new Example();
    newObject.save(); // until here, the function works, it continues creating new objects

    var query = new Parse.Query(Example);
    query.equalTo('Like', 1);
    query.find({
        success: function(result) {
            for (var i = 0; i < result.length; i++) {
                result[i].set('Like', 4);
            }
        },
        error: function(error) {
        }
    });
}

Parse.Cloud.define("nice", function(request, response) {
    exampleFunction();
    response.success();
});
I use this piece of code on the iOS device to trigger the cloud function:

[PFCloud callFunctionInBackground:@"nice"
                   withParameters:@{}
                            block:^(NSString *result, NSError *error) {
    if (!error) {
    }
}];
A couple of possible issues:

1. You are calling asynchronous methods without giving them time to complete. That's where Parse promises come in; you have to make sure to use the then function. See http://blog.parse.com/2013/01/29/whats-so-great-about-javascript-promises/
2. You are correctly setting 'Like' to 4, but you aren't persisting the change by calling save on the rows.
3. You may not have any rows coming back from your query; the way to check is to pass the number of rows found back through the success callback, which I am doing below.

Try the code below, noting that .success should now return a result if you NSLog(@"result %@", result) from your Objective-C. Also, errors should come through now as well because of response.error(error).
var Example = Parse.Object.extend("Example");

function exampleFunction() {
    var query = new Parse.Query(Example);
    query.equalTo('Like', 1);
    return query.find().then(function(examplesLikedOnce) {
        var promises = [];
        for (var i = 0; i < examplesLikedOnce.length; i++) {
            var example = examplesLikedOnce[i];
            var promise = example.save({Like: 4});
            promises.push(promise);
        }
        return Parse.Promise.when(promises).then(function() {
            return examplesLikedOnce.length;
        });
    });
}

Parse.Cloud.define("nice", function(request, response) {
    exampleFunction().then(function(numExamples) {
        response.success("The number of Example objects with 1 like: " + numExamples);
    }, function(error) {
        response.error(error);
    });
});
I wrote the code below to fetch news from an XML export of another website, and I am trying to fill my database with it.
function UpdateLunchTime() {
    var httpRequest = require('request');
    var xml2js = require('xml2js');
    var parser = new xml2js.Parser();
    var url = 'http://www...com/export/xml/actualities';

    httpRequest.get({
        url: url
    }, function(err, response, body) {
        if (err) {
            console.warn(statusCodes.INTERNAL_SERVER_ERROR,
                'Some problem.');
        } else if (response.statusCode !== 200) {
            console.warn(statusCodes.BAD_REQUEST,
                'Another problem');
        } else {
            //console.log(body);
            parser.parseString(body, function (err2, result) {
                //console.log(result.Root.event);
                var count = 0;
                for (var i = 0; i < result.Root.event.length; i++) {
                    //console.log(result.Root.event[i]);
                    InsertActionToDatabase(result.Root.event[i]);
                }
                /*
                result.Root.event.forEach(function(entry) {
                    InsertActionToDatabase(entry);
                });
                */
            });
        }
    });
}

function InsertActionToDatabase(action) {
    var queryString = "INSERT INTO Action (title, description, ...) VALUES (?, ?, ...)";
    mssql.query(queryString, [action.akce[0], action.description[0], ...], {
        success: function(insertResults) {
        },
        error: function(err) {
            console.log("Problem: " + err);
        }
    });
}
For individual actualities it works fine, but when I run it over the whole XML I get this error:
Error: [Microsoft][SQL Server Native Client 10.0][SQL Server]Resource ID : 1. The request limit for the database is 180 and has been reached. See 'http://go.microsoft.com/fwlink/?LinkId=267637' for assistance.
And for the last few objects I get this error:
Error: [Microsoft][SQL Server Native Client 10.0]TCP Provider: Only one usage of each socket address (protocol/network address/port) is normally permitted.
Thanks for the help.
The problem is that you're trying to make too many concurrent (insert) operations in your database. Remember that in node.js (almost) everything is asynchronous, so when you call InsertActionToDatabase for one of the items, this operation will start right away and not wait before it finishes to return. So you're basically trying to insert all of the events at once, and as the error message said there's a limit on the number of concurrent connections which can be made to the SQL server.
What you need to do is to change your loop to run asynchronously, by waiting for one of the operations to complete before starting the next one (you can also "batch" send a smaller number of operations at once, continuing after each batch is complete, but the code is a little more complicated) as shown below.
var count = result.Root.event.length;
var insertAction = function(index) {
    if (index >= count) return;
    InsertActionToDatabase(result.Root.event[index], function() {
        insertAction(index + 1);
    });
};
insertAction(0);
And the InsertActionToDatabase function would take a callback parameter to be called when it's done.
function InsertActionToDatabase(item, done) {
    var table = tables.getTable('event');
    table.insert(item, {
        success: function() {
            console.log('Inserted event: ', item);
            done();
        }
    });
}
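The batching variant mentioned above could look roughly like this. This is a sketch, assuming the same result.Root.event array and the callback-style InsertActionToDatabase from this answer; batchSize is a made-up tuning value, not anything from the original code:

// Sketch: run `batchSize` inserts concurrently, and only start the
// next batch once every insert in the current one has called back.
var batchSize = 20; // hypothetical; keep it well under the 180-request limit
function insertBatch(start) {
    var events = result.Root.event;
    if (start >= events.length) return;
    var batch = events.slice(start, start + batchSize);
    var pending = batch.length;
    batch.forEach(function(item) {
        InsertActionToDatabase(item, function() {
            pending--;
            if (pending === 0) insertBatch(start + batchSize);
        });
    });
}
insertBatch(0);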
My program communicates with a web service that only accepts ~10 requests per second. From time to time, my program sends 100+ concurrent requests to the web service, causing my program to crash.
How do I limit concurrent requests in Node.js to 5 per second? I'm using the request library.
// IF EVENT AND SENDER
if (data.sender[0].events && data.sender[0].events.length > 0) {
    // FIND ALL EVENTS
    for (var i = 0; i < data.sender[0].events.length; i++) {
        // IF TYPE IS "ADDED"
        if (data.sender[0].events[i].type == "added") {
            switch (data.sender[0].events[i].link.rel) {
                case "contact":
                    batch("added", data.sender[0].events[i].link.href);
                    //_initContacts(data.sender[0].events[i].link.href);
                    break;
            }
        // IF TYPE IS "UPDATED"
        } else if (data.sender[0].events[i].type == "updated") {
            switch (data.sender[0].events[i].link.rel) {
                case "contactPresence":
                    batch("updated", data.sender[0].events[i].link.href);
                    //_getContactPresence(data.sender[0].events[i].link.href);
                    break;
                case "contactNote":
                    batch("updated", data.sender[0].events[i].link.href);
                    //_getContactNote(data.sender[0].events[i].link.href);
                    break;
                case "contactLocation":
                    batch("updated", data.sender[0].events[i].link.href);
                    //_getContactLocation(data.sender[0].events[i].link.href);
                    break;
                case "presenceSubscription":
                    batch("updated", data.sender[0].events[i].link.href);
                    //_extendPresenceSubscription(data.sender[0].events[i].link.href);
                    break;
            }
        }
    }
}
And then the homegrown batch method:
var updated = [];
var added = [];

var batch = function(type, url) {
    console.log("batch called");
    if (type === "added") {
        console.log("Added batched");
        added.push(url);
        if (added.length > 5) {
            setTimeout(added.forEach(function(req) {
                _initContacts(req);
            }), 2000);
            added = [];
        }
    } else if (type === "updated") {
        console.log("Updated batched");
        updated.push(url);
        console.log("Updated length is : ", updated.length);
        if (updated.length > 5) {
            console.log("Over 5 updated events");
            updated.forEach(function(req) {
                setTimeout(_getContactLocation(req), 2000);
            });
            updated = [];
        }
    }
};
And an example of the actual request:
var _getContactLocation = function(url) {
    r.get(baseUrl + url,
        { "strictSSL": false, "headers": { "Authorization": "Bearer " + accessToken } },
        function(err, res, body) {
            if (err)
                console.log(err);
            else {
                var data = JSON.parse(body);
                self.emit("data.contact", data);
            }
        }
    );
};
Using the async library, the mapLimit function does exactly what you want. I can't provide an example for your specific use case as you did not provide any code.
From the readme:
mapLimit(arr, limit, iterator, callback)
The same as map only no more than "limit" iterators will be simultaneously
running at any time.
Note that the items are not processed in batches, so there is no guarantee that
the first "limit" iterator functions will complete before any others are
started.
Arguments
arr - An array to iterate over.
limit - The maximum number of iterators to run at any time.
iterator(item, callback) - A function to apply to each item in the array.
The iterator is passed a callback(err, transformed) which must be called once
it has completed with an error (which can be null) and a transformed item.
callback(err, results) - A callback which is called after all the iterator
functions have finished, or an error has occurred. Results is an array of the
transformed items from the original array.
Example
async.mapLimit(['file1', 'file2', 'file3'], 1, fs.stat, function(err, results) {
    // results is now an array of stats for each file
});
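Adapted to throttled HTTP requests, a sketch might look like this (urls is a placeholder array, and this assumes the request library from the question):

var async = require('async');
var request = require('request');

// Fetch the urls with at most 5 requests in flight at any time.
async.mapLimit(urls, 5, function(url, callback) {
    request.get(url, function(err, res, body) {
        callback(err, body);
    });
}, function(err, bodies) {
    // bodies holds the response bodies in the same order as urls
});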
EDIT: Now that you have provided code, I see that your use is a bit different from what I assumed. The async library is more useful when you know all the tasks to run up front. I don't know of a library offhand that will easily solve this for you. The above note is likely still relevant to people searching this topic, so I'll leave it in.
Sorry, I don't have time to restructure your code, but this is an (untested) example of a function that makes asynchronous requests while throttling itself to at most 5 concurrent requests. I would highly recommend working off of this to come up with a more general solution that fits your code base.
var throttledRequest = (function () {
    var queue = [], running = 0;

    function sendPossibleRequests() {
        var url;
        while (queue.length > 0 && running < 5) {
            url = queue.shift();
            running++;
            r.get(url, { /* YOUR OPTIONS HERE */ }, function (err, res, body) {
                running--;
                sendPossibleRequests();
                if (err)
                    console.log(err);
                else {
                    var data = JSON.parse(body);
                    self.emit("data.contact", data);
                }
            });
        }
    }

    return function (url) {
        queue.push(url);
        sendPossibleRequests();
    };
})();
Basically, you keep a queue of all the data to be asynchronously processed (such as urls to be requested) and then after each callback (from a request) you try to launch off as many remaining requests as possible.
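Usage would then be a drop-in replacement for the direct r.get calls; for example (the paths here are placeholders):

// Queue two requests; at most 5 will ever be in flight at once.
throttledRequest(baseUrl + "/contacts/1/location");
throttledRequest(baseUrl + "/contacts/2/location");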
This is precisely what node's Agent class is designed to address. Have you done something silly like require('http').globalAgent.maxSockets = Number.MAX_VALUE or passed agent: false as a request option?
With Node's default behavior, your program will not send more than 5 concurrent requests at a time. Additionally, the Agent provides optimizations that a simple queue cannot (namely HTTP keepalives).
If you try to make many requests (for example, issue 100 requests from a loop), the first 5 will begin and the Agent will queue the remaining 95. As requests complete, it starts the next.
What you probably want to do is create an Agent for your web service requests, and pass it in to every call to request (rather than mixing requests in with the global agent).
var http = require('http'), svcAgent = new http.Agent();
request({ ... , agent: svcAgent });
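If you would rather make the limit explicit than rely on the default, you can cap the agent yourself. A sketch (the url is a placeholder):

var http = require('http');
var request = require('request');

// Dedicated agent for the web service, capped at 5 concurrent sockets.
var svcAgent = new http.Agent();
svcAgent.maxSockets = 5;

request({ url: "http://example.org/api", agent: svcAgent }, function(err, res, body) {
    // handle the response; queued requests start as earlier ones finish
});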
I've noticed that the size of a requested file affects how long the response to an AJAX call takes. So if I fire 3 AJAX GET requests for files of varying size, they may arrive in any order. What I want to do is guarantee the ordering when I append the files to the DOM.
How can I set up a queue system so that when I fire A1->A2->A3, I can guarantee that they are appended as A1->A2->A3, in that order?
For example, suppose A2 arrives before A1. I would want the append action to wait for the arrival and loading of A1.
One idea is to create a status checker using a timed callback, like this:

// pseudo-code
function check(ready, func) {
    // check ready somehow
    if (ready) {
        func();
    } else {
        setTimeout(function () {
            check(ready, func);
        }, 1); // check every msec
    }
}
but this seems like a resource-heavy approach, as I'd be firing the same function every millisecond until the resource is loaded.
Is this the right way to solve this problem?
a status checker using a 1 msec timed callback - but this seems like a resource-heavy way; is this the right path to solve this problem?
No. You should have a look at Promises. That way, you can easily formulate it like this:
var a1 = getPromiseForAjaxResult(resource1url);
var a2 = getPromiseForAjaxResult(resource2url);
var a3 = getPromiseForAjaxResult(resource3url);

a1.then(function(res) {
    append(res);
    return a2;
}).then(function(res) {
    append(res);
    return a3;
}).then(append);
For example, jQuery's .ajax function implements this.
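With jQuery concretely, that chain might look like the sketch below, assuming jQuery 1.8+ (where .then chains returned promises) and placeholder urls; append stands for whatever DOM insertion you use:

// Fire all three requests in parallel; append strictly in A1->A2->A3 order.
var a1 = $.get("a1.html"),
    a2 = $.get("a2.html"),
    a3 = $.get("a3.html");

a1.then(function(res1) {
    append(res1);
    return a2; // waits for A2 only after A1 has been appended
}).then(function(res2) {
    append(res2);
    return a3;
}).then(function(res3) {
    append(res3);
});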
You can try something like this:
var resourceData = {};
var resourcesLoaded = 0;

function loadResource(resource, callback) {
    var xhr = new XMLHttpRequest();
    xhr.onload = function() {
        var state = this.readyState;
        var responseCode = this.status;
        if (state == this.DONE && responseCode == 200) {
            callback(resource, this.responseText);
        }
    };
    xhr.open("get", resource, true);
    xhr.send();
}

//Assuming that resources is an array of path names
function loadResources(resources) {
    for (var i = 0; i < resources.length; i++) {
        loadResource(resources[i], function(resource, responseText) {
            //Store the data of the resource in the resourceData map,
            //using the resource name as the key. Then increment the
            //resource counter.
            resourceData[resource] = responseText;
            resourcesLoaded++;

            //If the number of resources that we have loaded is equal
            //to the total number of resources, it means that we have
            //all our resources.
            if (resourcesLoaded === resources.length) {
                //Manipulate the data in the order that you desire.
                //Everything you need is inside resourceData, keyed
                //by the resource url.
                ...
                ...
            }
        });
    }
}
If certain components must be loaded and executed before (like certain JS files) others, you can queue up your AJAX requests like so:
function loadResource(resource, callback) {
    var xhr = new XMLHttpRequest();
    xhr.onload = function() {
        var state = this.readyState;
        var responseCode = this.status;
        if (state == this.DONE && responseCode == 200) {
            //Do whatever you need to do with this.responseText
            ...
            ...
            callback();
        }
    };
    xhr.open("get", resource, true);
    xhr.send();
}
function run() {
    var resources = [
        "path/to/some/resource.html",
        "path/to/some/other/resource.html",
        ...
        "http://example.org/path/to/remote/resource.html"
    ];

    //Function that sequentially loads the resources, so that the next resource
    //will not be loaded until the previous one has finished loading. I accomplish
    //this by calling the function itself in the callback to the loadResource
    //function. This function is not truly recursive since the callback
    //invocation (even though it is the function itself) is an independent call
    //and therefore will not be part of the original call stack.
    function load(i) {
        if (i < resources.length) {
            loadResource(resources[i], function () {
                load(++i);
            });
        }
    }
    load(0);
}
This way, the next file will not be loaded until the previous one has finished loading.
If you cannot use any third-party libraries, you can use my solution. However, your life will probably be much easier if you do what Bergi suggested and use Promises.
There's no need to call check() every millisecond; just run it in the xhr's onreadystatechange. If you provide a bit more of your code, I can explain further.
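For example, something along these lines; a sketch where resourceReady and func stand in for whatever your real code tracks:

// Sketch: trigger the check from the response handler instead of polling.
xhr.onreadystatechange = function() {
    if (this.readyState === 4 && this.status === 200) {
        resourceReady = true; // hypothetical flag that check() inspects
        check(resourceReady, func);
    }
};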
I would have a queue of functions to execute, where each one checks that the previous result has completed before executing.
var remoteResults = [];

function requestRemoteResource(index, fetchFunction) {
    // The argument fetchFunction is a function that fetches the remote content;
    // once the content is ready it calls the passed-in function with the result.
    fetchFunction(function(result) {
        // add the remote result to the list of results
        remoteResults[index] = result;
        // write as many results as are ready.
        writeResultsWhenReady();
    });
}

function writeResultsWhenReady() {
    var i;
    // Scan from the start, executing results in order and stopping
    // at the first result that has not arrived yet.
    for (i = 0; i < remoteResults.length; i++) {
        if (!remoteResults[i]) {
            return;
        }
        // Call the function that is the ith result.
        // This will modify the dom.
        remoteResults[i]();
        // Blank the result so we don't execute it twice; store a no-op
        // function so the existence check above still passes on later scans.
        remoteResults[i] = function(){};
    }
}

requestRemoteResource(0, [Function to fetch the first resource]);
requestRemoteResource(1, [Function to fetch the second resource]);
requestRemoteResource(2, [Function to fetch the third resource]);
Please note that for simplicity this is currently O(n^2); it would get faster but more complex if you stored an object at every index of remoteResults with a hasRendered property. Then you would only scan back until you found a result that had not yet arrived or one that had already been rendered.
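That faster variant might look roughly like this; a sketch that stores an object with a hasRendered flag at each index and keeps a cursor, so each result is visited only once:

// Sketch: each index holds { render: function, hasRendered: bool } (assumed shape).
var remoteResults = [];
var nextToRender = 0; // cursor: everything before it has been rendered

function writeResultsWhenReady() {
    // Render results in order, stopping at the first gap; the cursor
    // guarantees single execution, hasRendered is kept as bookkeeping.
    while (nextToRender < remoteResults.length && remoteResults[nextToRender]) {
        remoteResults[nextToRender].render(); // modifies the DOM
        remoteResults[nextToRender].hasRendered = true;
        nextToRender++;
    }
}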