This loop queries the Parse.com server and then processes the results, if any. The problem is that when nArray has more than 100 entries, the function exceeds the query/burst limit of Parse.com Cloud Code and fails.
One idea would be to delay the loop for a second after every 100 iterations, but I'm not sure how to do that. Any other solutions would be greatly appreciated.
Thanks in advance,
for (var k = 1; k < nArray.length; k++) {
    (function (k, mArray) { // <-- define an inline function
        query2.equalTo("username", nArray[k]); // BURST LIMIT EXCEEDS
        query2.find({
            success: function (results) {
                if (results.length !== 0) {
                    var object = results[0];
                    var compareUserEmail = object.get('email');
                    if (compareUserEmail !== userEmail) {
                        // alert("The result is equal to" + object.get('Name'));
                        mArray.push({
                            name: object.get('Name'),
                            email: object.get('email'),
                            bloxID: object.get('bloxID')
                        });
                        gameScore.set("filtered", mArray);
                        gameScore.save(null, {
                            success: function (gameScore) {
                                response.success("Success!");
                                alert('New object created with objectId: ' + gameScore.id);
                            },
                            error: function (gameScore, error) {
                                alert('Failed to create new object, with error code: ' + error.description);
                            }
                        });
                    }
                }
            },
            error: function () {}
        });
    })(k, mArray);
    // <-- call it after definition using (k)
}
You've got a couple of issues to deal with.
The reason Parse.com doesn't support setInterval is that it would be inviting disaster. Parse terminates your Cloud Code if it takes too long, so letting you add delays would just increase the chance your code is terminated before completion.
The reason Parse.com has a burst limit is that hitting it usually suggests "you are doing it wrong (tm)". In your case you are looping through an array and running a query for each item in the array. Instead you should use the containedIn method to get all records for the array in one query. If your array has more than 100 items you can increase the query's record limit to 1000, but first consider carefully whether this is really what you need.
Given that you are modifying a lot of objects and saving them all, consider using the saveAll method to save them all in one hit too.
You might want to consider batching these operations, but be aware of the restrictions on overall duration for Cloud Code.
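For illustration, here is a rough sketch of the containedIn approach, reusing query2, nArray, userEmail, mArray, gameScore and response from your code (treat it as an outline rather than drop-in Cloud Code):
// One query for the whole array instead of one query per item.
query2.containedIn("username", nArray);
query2.limit(1000); // raise the default limit of 100 if nArray is large
query2.find({
    success: function (results) {
        for (var i = 0; i < results.length; i++) {
            var object = results[i];
            if (object.get('email') !== userEmail) {
                mArray.push({
                    name: object.get('Name'),
                    email: object.get('email'),
                    bloxID: object.get('bloxID')
                });
            }
        }
        gameScore.set("filtered", mArray);
        // saveAll is useful when many objects change; here a single save is enough
        gameScore.save(null, {
            success: function () { response.success("Success!"); },
            error: function (obj, error) { response.error(error.description); }
        });
    },
    error: function (error) { response.error(error); }
});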
You can use a setInterval:
var i = 0;
var intervalId = setInterval(function() {
    if (i < nArray.length) {
        // ... your code ...
        i++;
    } else {
        clearInterval(intervalId);
    }
}, 100); // every 100ms; change it to what you need
I am trying to create posts using a for loop, but when I look at the Parse database only the last object of my array gets stored. This is the code I wrote:
var Reggione = Parse.Object.extend("Reggione");
var creaReggione = new Reggione();
var selectobject = $('#searcharea')[0];

for (var i = 2; i < selectobject.length; i++) {
    creaReggione.set("name", selectobject.options[i].text);
    creaReggione.save();
}
Thanks, Bye.
Do this by creating an array of new objects, then save them together...
var newObjects = [];
for (var i = 2; i < selectobject.length; i++) {
    var creaReggione = new Reggione(); // create a fresh object each iteration
    creaReggione.set("name", selectobject.options[i].text);
    newObjects.push(creaReggione);
    // ...
}
Parse.Object.saveAll(newObjects);
Remember, if you want something to happen after saveAll completes (like call response.success() if you're in cloud code), then you should use that promise as follows...
Parse.Object.saveAll(newObjects).then(function(result) {
    response.success(result);
}, function(error) {
    response.error(error);
});
To extend danh's answer: the reason this does not work is that only one transaction can happen at a time from the JS client to Parse.
Therefore, in your loop the first call to .save() is made and the object is saved to Parse asynchronously; in that time the loop continues to run and skips over your other save calls, and those objects are NOT queued to be saved. As danh pointed out, you must use Parse's batch operations to save multiple objects to the server in one go. To do this you can:
var newObjects = [];
for (var i = 2; i < selectobject.length; i++) {
    var creaReggione = new Reggione(); // create a fresh object each iteration
    creaReggione.set("name", selectobject.options[i].text);
    newObjects.push(creaReggione);
    // ...
}
Parse.Object.saveAll(newObjects);
Hope this helps. I'd also recommend taking a look at Parse's callback options on the save method to get more details on what happened (you can add error and success callbacks there to make debugging a little easier).
An example of this would be to extend the previous call with:
Parse.Object.saveAll(newObjects, {
    success: function(messages) {
        console.log("The objects were successfully saved...");
    },
    error: function(error) {
        console.log("An error occurred when saving the messages array: %s", error.message);
    }
});
I hope this is of some help to you
I have the following code:
exports.getCommunities = function(user, callback) {
    // I am getting the communities for a user.
    community_users.find({'user': user}).sort({_id: 1}).toArray(function(err, docs) {
        docs.sort
        var clas = [];
        // For each community, I need to find the country of that community and add it to each docs object.
        // To do so, I call a getCommunityByName function that finds the community document in the
        // communities mongodb collection by a given name.
        docs.forEach(function(entry, i) {
            clas.push(entry);
            getCommunityByName(entry.name, function(e, o) {
                if (o) {
                    clas[i].country = o.country;
                    if (docs.length - 1 == i) { callback(null, clas) }
                } else {
                    console.log('community-not-found: ' + entry.name)
                }
            });
        });
    });
};
I am seeing strange behavior. Imagine docs is an array of 7 objects. I get back a 7-position array, but a random number of them have the country key. Sometimes only 3 of them have it, sometimes 5, sometimes 6...
I think the if statement that calls the callback is not waiting for every call to getCommunityByName, and I don't really know why...
I need some light on this...
Regards,
Assuming getCommunityByName performs an asynchronous request, it could be that the request for the final item is returning before some of the previous items, so it's calling the callback too soon. Rather than using i from the loop to decide when to call back, instead count down the returned requests and call the callback when they're all complete:
exports.getCommunities = function(user, callback) {
    // I am getting the communities for a user.
    community_users.find({'user': user}).sort({_id: 1}).toArray(function(err, docs) {
        docs.sort
        var clas = [];
        // For each community, find its country and add it to each docs object,
        // using getCommunityByName to look the community document up by name.
        // Initialise counter to number of items
        var counter = docs.length;
        docs.forEach(function(entry, i) {
            clas.push(entry);
            getCommunityByName(entry.name, function(e, o) {
                // Request returned, decrement counter
                counter--;
                if (o) {
                    clas[i].country = o.country;
                } else {
                    console.log('community-not-found: ' + entry.name)
                }
                if (counter == 0) {
                    // All requests returned, fire callback
                    callback(null, clas);
                }
            });
        });
    });
};
I am taking my first steps with node and mongodb and have recently hit this RangeError wall.
Here's what I am trying to do: I have a file that contains a list of countries that I would like to add to my mongo db. This is part of my "seed" mechanism to get the app running.
I load the json and then I iterate through the collection of objects and add them one by one to the 'Countries' collection.
However, every time I run the code, I get a "RangeError: Maximum call stack size exceeded".
I have googled around but none of the suggested solutions seem to apply for me.
My guess is there is something wrong with my insertCountry function...
Anyways, here's my code:
var mongoose = require('mongoose');
var countries = require('./seed/countries.json');

// mongodb
var Country = mongoose.Schema({
    name: String,
    code: String,
    extra: [Extra]
});

var Extra = mongoose.Schema({
    exampleField: Boolean,
    anotherField: Boolean
});

var mCountry = mongoose.model('Countries', Country);
var mExtra = mongoose.model('Extras', Extra);

// do connection
mongoose.connect('...');
var db = mongoose.connection;
db.on('error', console.error.bind(console, 'connection error'));
db.once('open', function callback() {
});

// async function
var insertCountry = function(document, callback) {
    db.model('Countries').count({code: document.code}, function (err, count) {
        if (count < 1) {
            db.collection('Countries').insert(document, function (err, result) {
                if (!err) {
                    console.log('country ' + document.name + ' added');
                }
                else {
                    console.log('- [' + document.name + '] ' + err);
                }
            });
        }
        callback(null, document);
    });
};

// doing countries
var Country = mongoose.model('Countries');
var Extras = mongoose.model('Extras');

for (i = 0; i < countries.length; i++) {
    nCountry = new Country();
    nCountry.name = countries[i].name;
    nCountry.code = countries[i].code;
    nCountry.benefits = new Extras();
    nCountry.benefits.exampleField = false;
    nCountry.benefits.anotherField = false;
    insertCountry(nCountry, function (err, value) {
        console.log(value.name + ' added to collection (callback)');
    });
}
I have been using some guides I found to build this, so it might not be optimal code. Any best practices, standards, guides or tutorials you can share are most welcome!
Your callback is in the wrong place. As written, your callback fires without waiting for the insert operation to complete. Altering your code:
var insertCountry = function(document, callback) {
    db.model('Countries').count({code: document.code}, function (err, count) {
        if (count < 1) {
            db.collection('Countries').insert(document, function (err, result) {
                if (!err) {
                    console.log('country ' + document.name + ' added');
                }
                else {
                    console.log('- [' + document.name + '] ' + err);
                }
                callback(null, document);
            });
        } else {
            // already present; nothing to insert, but still signal completion
            callback(null, document);
        }
    });
};
That is part of your problem, but it does not completely solve it. The other part is the loop, which also does not wait for the wrapping function to complete before moving on. You want something like async.eachSeries in order to wait for the inserts to complete before performing the next iteration. This is mostly why you are exceeding the call stack:
var async = require('async');

async.eachSeries(
    countries,
    function(current, callback) {
        // make your nCountry object
        insertCountry(nCountry, function(err, value) {
            // do something, then
            callback(err);
        });
    },
    function(err) {
        // called when done; err contains the error where set
        console.log("done");
    }
);
There really is still an issue with the array, which must be reasonably large if you are exceeding the call stack limit. You should probably look at using event streams to process that source rather than loading everything into memory in an array.
Personally, if you were just trying not to insert duplicates for a field and had MongoDB 2.6 available, I would just use the Bulk Operations API with "unordered operations" and allow non-fatal failures on the duplicate keys. Coupled with the fact that bulk operations are sent in "batches" and not one at a time, this is much more efficient than checking for presence on every request:
var CountrySchema = mongoose.Schema({
    name: String,
    code: { type: String, unique: true }, // define a unique index
    extra: [Extra]
});

// compile the model; the bulk API below goes through its native collection
var Country = mongoose.model('Countries', CountrySchema);

var insertCountries = function(countries, callback) {
    var bulk = Country.collection.initializeUnorderedBulkOp();
    var counter = 0;

    async.eachSeries(
        countries,
        function(current, callback) {
            // same object construction
            bulk.insert(nCountry);
            counter++;

            // only send once every 1000
            if ( counter % 1000 == 0 ) {
                bulk.execute(function(err, result) {
                    // err should generally not be set
                    // but result would contain any duplicate errors
                    // along with other insert responses

                    // reset the bulk operation and continue
                    bulk = Country.collection.initializeUnorderedBulkOp();
                    callback();
                });
            } else {
                callback();
            }
        },
        function(err) {
            // send anything still queued
            if ( counter % 1000 != 0 ) {
                bulk.execute(function(err, result) {
                    // same as before but no need to reset
                    callback(err);
                });
            } else {
                // nothing left to flush, just signal completion
                callback(err);
            }
        }
    );
};
mongoose.on("open",function(err,conn) {
insertCountries(countries,function(err) {
console.log("done");
});
});
Keep in mind that, unlike the methods implemented directly on the mongoose models, the native driver methods require that a connection has actually been established before they can be called. Mongoose "queues up" model operations for you, but otherwise you need something to be sure the connection is actually open, hence the use of the "open" event in the example here.
Take a look at event streams as well. If you are constructing an array large enough to cause a problem by missed callback execution, then you probably should not be loading it all into memory from whatever your source is. Stream processing of that source, combined with an approach like the one shown above, should provide efficient loading.
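As a rough illustration of that idea, here is a sketch that streams the seed data instead of loading it all at once. It assumes the data can be provided as a newline-delimited JSON file (one country per line) and reuses the insertCountries function above; the file name and format are assumptions, not part of your setup:
var fs = require('fs');
var readline = require('readline');

// Assumed file: ./seed/countries.ndjson with one object per line, e.g. {"name":"France","code":"FR"}
var rl = readline.createInterface({
    input: fs.createReadStream('./seed/countries.ndjson')
});

var batch = [];

rl.on('line', function (line) {
    batch.push(JSON.parse(line));
    if (batch.length >= 1000) {
        rl.pause(); // stop reading while this batch is written
        insertCountries(batch, function () {
            batch = [];
            rl.resume();
        });
    }
});

rl.on('close', function () {
    // flush whatever is left over
    insertCountries(batch, function () {
        console.log('done');
    });
});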
I have to make multiple API calls in as short a time as possible. The need to make multiple calls arises from me having to populate a wide range of conditional data sets. Say I have n metrics, each to be filtered with m possible filters. I would want to get the results.totalsForAllResults for the n*m queries I generate.
While I faced a lot of hiccups initially, learning about closures solved many of my problems with sending async API calls. I was even able to handle the results well, in the proper order. Now I'm facing a different problem: the maximum number of API requests per second. The Google Core Reporting API v3 allows a maximum of 10 API requests per second, and I'm well past this limit.
Here is how I tried to make the API calls and process the responses. I have little freedom with the structure:
function getMetrics() {
    initResultsArray();
    for (mI = 0; mI < metricsArray.length; mI++) {
        (function (mI) { // Closure. Holds a local value of 'mI' within the scope
            for (fI = 0; fI < filtersArray.length; fI++) {
                (function (fI) { // Closure. Holds a local value of 'fI' within the scope
                    gapi.client.analytics.data.ga.get({
                        'ids': tableID,
                        'start-date': startDate,
                        'end-date': endDate,
                        'metrics': metricsArray[mI],
                        'filters': filtersArray[fI],
                        'samplingLevel': 'HIGHER_PRECISION'
                    }).execute(function putToVar(results) { // this fn is defined inline to get access to fI
                        console.log(results.totalsForAllResults);
                        resultsArray[mI][fI] = parseInt(results.totalsForAllResults[metricsArray[mI]]);
                    });
                })(fI); // This ends the closure for fI
            }
        })(mI); // This ends the closure for mI
    }
}

// Print results to console, called when I believe all results have been populated.
function logResults() {
    console.log(resultsArray);
}
I need to be able to find out when I have made 10 queries in the last second and wait before sending the remaining queries, because as soon as I exceed 10 queries per second I get null objects as the response to my API calls, which ruins my ability to retrieve values into arrays. How can this be done? I do not know how to use wait(), people say the browser becomes unresponsive if you use it, and I don't know how setTimeout() can be applied to my problem.
mI and fI are global iterators for metrics and filters; metricsArray and filtersArray are arrays of strings representing metrics and filters in the way the GA API expects them. I just need to iterate through them to obtain a lot of results.totalsForAllResults values. There is no problem with the execution of the API calls and responses. My only issue is exceeding the 10 queries per second limit and not getting further responses.
You could solve this by first creating a list of the calls you need to make, then making them 10 at a time. This is all off the cuff, so no guarantees that it actually works but hopefully you can apply it to your situation.
The general idea is to create a simple Scheduler constructor function that takes in an array of stuff to process. Naming stuff more descriptively would be better :). The created object has a single function - start.
var Scheduler = function (stuffToProcess) {
    var started,
        getExecuteFunction;

    getExecuteFunction = function (current) {
        return function (results) {
            console.log(results.totalsForAllResults);
            resultsArray[current.mI][current.fI] = parseInt(results.totalsForAllResults[metricsArray[current.mI]], 10);
        };
    };

    var processNext = function () {
        var current = stuffToProcess.shift(),
            counter = 0;

        while (current && ++counter <= 10) {
            gapi.client.analytics.data.ga
                .get(current.gaBit)
                .execute(getExecuteFunction(current));

            if (counter !== 10) {
                current = stuffToProcess.shift(); // <- EDIT: Forgot this in original answer.
            }
        }

        if (stuffToProcess.length > 0) {
            window.setTimeout(function () {
                processNext();
            }, 1000);
        }
    };

    this.start = function () {
        if (!started) {
            started = true;
            processNext();
        }
    };
};
Then in your getMetrics function, instead of calling ga directly, you build an array of the calls you want to make, then create a scheduler instance and start it.
function getMetrics() {
    initResultsArray();
    var listOfCalls = [];
    for (mI = 0; mI < metricsArray.length; mI++) {
        for (fI = 0; fI < filtersArray.length; fI++) {
            listOfCalls.push({
                gaBit: {
                    'ids': tableID,
                    'start-date': startDate,
                    'end-date': endDate,
                    'metrics': metricsArray[mI],
                    'filters': filtersArray[fI],
                    'samplingLevel': 'HIGHER_PRECISION'
                },
                mI: mI,
                fI: fI
            });
        }
    }
    var s = new Scheduler(listOfCalls);
    s.start();
}
EDIT:
Modified code to use getExecuteFunction instead.
getUrls(data, function(urls){
    var data = {
        "data": [
            { "userProfile": userP },
            { "urls": urls }
        ]
    };
    res.send(data);
});

function getUrls(data, done){
    var links = new Array();
    for (var i = 0; i < data.length; i++){
        var user = data[i];
        Url.find({ where: { id: user.id } }).success(function(url){
            links.push({
                "url": url.text,
                "date": user.syncedTime
            });
            if (links.length == data.length){
                done(links);
            }
        });
    }
}
My problem with my code is this:
I'm returning the response through a callback once the data collected in my array equals the length of the parent array. This is obviously a dangerous and not very elegant solution: suppose I get a .failure from the Url database, then my links.length will never equal data.length. So I'm a bit confused how to go about this.
Any help?
It will be easy for you if you use async.js.
I used mapSeries here. It takes 3 parameters:
collection/array
iterator, which will be called for each item in the passed collection/array with 2 arguments: 1. the item in the collection, 2. a callback. After completing the job in the iterator, you should call the callback in node style (err first, results follow).
final callback, which will be called after all the items in the collection have been mapped.
function getUrls(data, done){
    var async = require('async');
    async.mapSeries(data, function(user, cb) { // If you want it to run in parallel, use `async.map`
        Url.find({ where: { id: user.id } }).success(function(url){
            cb(null, {
                "url": url.text,
                "date": user.syncedTime
            });
        });
    }, function(err, results) {
        // results is an array. It's the same as `links` in your old code.
        done(results);
    });
}
getUrls(data, function(urls){
    var data = {
        "data": [
            { "userProfile": userP },
            { "urls": urls }
        ]
    };
    res.send(data);
});
Use recursion:
function getUrls(data, done) {
    var links = new Array();

    function doGetUrl(i) {
        var user = data[i];
        Url.find({ where: { id: user.id } }).
            success(function(url) {
                links.push({
                    "url": url.text,
                    "date": user.syncedTime
                });
                if (links.length == data.length) {
                    done(links);
                } else {
                    doGetUrl(i + 1); // get next url
                }
            }).
            failure(function(err) {
                doGetUrl(i); // on error, try to get current url again
                // other error handling code
            });
    }
    doGetUrl(0);
}
I would probably make use of the complete callback, in jQuery terms. Have a counter that records how many records have been processed and update it in complete, since complete executes on both success and failure. Then, when that counter is >= the length of the data array, you can exit.
As an aside, I would always use >= rather than == for the comparison you are doing there; that way, if for any crazy reason the count is incremented more than it should be, you still exit.
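A rough sketch of that counter-in-complete idea, using jQuery's $.ajax purely for illustration; the /urls/:id endpoint is made up and not something from your code:
var links = [];
var processed = 0;

data.forEach(function (user) {
    $.ajax({
        url: '/urls/' + user.id, // hypothetical endpoint
        success: function (url) {
            links.push({ url: url.text, date: user.syncedTime });
        },
        complete: function () {
            // runs on success and failure, so the counter always advances
            processed++;
            if (processed >= data.length) {
                done(links);
            }
        }
    });
});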
If all you want to do is avoid the problem of checking links.length to determine when you are done, then I think it's just a matter of adding a separate counter that gets incremented even if the Url database call fails. If you do that, you can keep your current style where the async requests run in parallel.
var nreq = 0;
for (var i = 0; i < data.length; i++){
    doTheAsyncOperation(function(){
        // Run this part in both the success and error cases
        nreq = nreq + 1;
        if (nreq >= data.length){ done(links) }
    });
}
On the other hand, if you want to run one query after the other you will need to rewrite the for loop to use recursion. This time you don't need to worry about keeping a separate counter, since you know when the final request runs:
function loop(i){
    if (i >= data.length){
        done(links);
    } else {
        doTheAsyncOperation(function(){
            loop(i + 1);
        });
    }
}
loop(0);
Finally, it's good to know how to code these sorts of patterns yourself, but in the long run I highly recommend using a control flow library to keep things cleaner.
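For example, a minimal sketch with the async library, assuming doTheAsyncOperation takes a completion callback as in the snippets above:
var async = require('async');

// Run the operations one after the other; swap eachSeries for async.each to run them in parallel.
async.eachSeries(data, function (item, cb) {
    doTheAsyncOperation(function () {
        cb(); // always continue, mirroring the counter version above
    });
}, function () {
    done(links); // every item has been processed
});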