Many requests to Parse at once - JavaScript

I have a list of requests to make to Parse.com using the Swift API, for tasks accumulated while the application was offline. Some tests show that downloading everything at once is faster overall than issuing multiple separate requests. However, I couldn't figure out how to request many "random" objectIds from Parse.com (I have a list, of course; by "random" I mean out of order and not a fixed number).
At the moment I am using a loop and calling many:
let pred = NSPredicate(format: "newDataID = %@", dataID[i])
query.findObjectsInBackgroundWithBlock { (result: [AnyObject]?, error: NSError?) in ... }
I was thinking of auto-generating the predicate string, but it could get very long, which I imagine would make the query very slow.
Any ideas?

Initiating many requests in a tight loop is ill-advised under any circumstances. Instead, send the dataID array to a cloud function. Also, if it's really an array of objectIds, then find is the wrong method; use get() instead...
var _ = require('underscore'); // underscore includes many handy functions, including map and toArray

Parse.Cloud.define("getManyObjectsById", function(request, response) {
    var dataID = request.params.dataID;
    var promises = _.map(dataID, function(anID) {
        var query = new Parse.Query("MyCustomClassName");
        return query.get(anID);
    });
    Parse.Promise.when(promises).then(function() {
        response.success(_.toArray(arguments));
    }, function(error) {
        response.error(error);
    });
});
Call it...
PFCloud.callFunctionInBackground("getManyObjectsById", withParameters: ["dataID": dataID]) {
    (objects: AnyObject?, error: NSError?) -> Void in
    // objects should be an array of the objects corresponding to the ids
}


JSON.stringify is very slow for large objects

I have a very big object in JavaScript (about 10 MB).
When I stringify it, it takes a long time; then I send it to the backend and parse it back into an object (actually nested objects with arrays), which also takes a long time, but that is not the problem in this question.
The problem:
How can I make JSON.stringify faster? Any ideas or alternatives? I need a JavaScript solution: libraries I can use, or ideas.
What I've tried
I googled a lot, and it looks like there is nothing with better performance than JSON.stringify, or my googling skills have gotten rusty!
Result
I will accept any suggestion that can solve the long save (sending to the backend) in the request (I know it's a big request).
Code Sample of problem (details about problem)
Request URL:http://localhost:8081/systemName/controllerA/update.html;jsessionid=FB3848B6C0F4AD9873EA12DBE61E6008
Request Method:POST
Status Code:200 OK
I am sending a POST to the backend and then, in Java,
request.getParameter("BigPostParameter")
and I read it and convert it to an object using:
public boolean fromJSON(String string) {
    if (string != null && !string.isEmpty()) {
        ObjectMapper json = new ObjectMapper();
        DateFormat dateFormat = new SimpleDateFormat(YYYY_MM_DD_T_HH_MM_SS_SSS_Z);
        dateFormat.setTimeZone(TimeZone.getDefault());
        json.setDateFormat(dateFormat);
        json.configure(DeserializationFeature.ACCEPT_SINGLE_VALUE_AS_ARRAY, true);
        WebObject object;
        // Logger.getLogger("JSON Tracker").log(Level.SEVERE, "Start");
        try {
            object = json.readValue(string, this.getClass());
        } catch (IOException ex) {
            Logger.getLogger(JSON_ERROR).log(Level.SEVERE, "JSON Error: {0}", ex.getMessage());
            return false;
        }
        // Logger.getLogger("JSON Tracker").log(Level.SEVERE, "END");
        return this.setThis(object);
    }
    return false;
}
Like This
BigObject someObj = new BigObject();
someObj.fromJSON(request.getParameter("BigPostParameter"))
P.S.: FYI, this line: object = json.readValue(string, this.getClass());
is also very, very slow.
Again, to summarize:
The first problem is the posting time (stringify), the JavaScript bottleneck.
The other problem is parsing that stringified data into an object (using Jackson); the stringified object mainly contains SVG tag content in a style column, while the other columns are mostly strings and ints.
As commenters said - there is no way to make parsing faster.
If the concern is that the app is blocked while it's stringifying/parsing, then try to split the data into separate objects, stringify them, and assemble them back into one object before saving on the server.
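One way to read that idea, as a minimal sketch: stringify one top-level key at a time and yield to the event loop between chunks so the page stays responsive (the parts could just as well be sent separately and reassembled on the server; undefined values, functions, and custom toJSON handling are ignored here):
// Sketch: stringify one top-level key at a time, yielding between chunks
// Assumes a plain object whose values are JSON-safe
async function stringifyInChunks(obj) {
    const parts = [];
    for (const [key, value] of Object.entries(obj)) {
        parts.push(JSON.stringify(key) + ':' + JSON.stringify(value));
        await new Promise(resolve => setTimeout(resolve, 0)); // let the UI breathe
    }
    // The parts could also be posted one by one and assembled on the server
    return '{' + parts.join(',') + '}';
}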
If the loading time of the app is not a problem, you could try an ad-hoc incremental-change approach on top of the existing app.
... App loading
Load map data
Make full copy of the data
... End loading
... App working without changes
... When saving changes
diff copy with changed data to get JSON diff
send changes (much smaller then full data)
... On server
apply JSON diff changes on the server to the full data stored on server
save changed data
I used json-diff (https://github.com/andreyvit/json-diff) to calculate the changes, and there are a few analogous libraries.
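A rough sketch of that flow, assuming json-diff's programmatic diff() function and a hypothetical /apply-diff endpoint (all names here are placeholders):
var jsonDiff = require('json-diff'); // npm install json-diff

// originalData is the full copy made at load time, currentData holds the user's edits
var changes = jsonDiff.diff(originalData, currentData); // undefined when nothing changed

if (changes) {
    // Send only the (much smaller) diff instead of the whole 10MB object
    fetch('/apply-diff', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(changes)
    });
}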
Parsing is a slow process. If what you want is to POST a 10MB object, turn it into a file, a blob, or a buffer, and send that file/blob/buffer using form data instead of application/json or application/x-www-form-urlencoded.
Reference
An example using express/multer
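A minimal client-side sketch of that idea (the /upload endpoint and the field name are placeholders; on the server, something like multer would read the uploaded file):
const blob = new Blob([JSON.stringify(bigObject)], { type: 'application/json' });

const formData = new FormData();
formData.append('payload', blob, 'payload.json'); // field name is arbitrary

// The multipart/form-data content type is set automatically for a FormData body
fetch('/upload', { method: 'POST', body: formData })
    .then(res => console.log('saved', res.status));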
Solution
Well, just as with most big "repeatable" problems, you could use async!
But wait, isn't JS still single-threaded even when it does async... yes... but you can use Web Workers to get true parallelism and serialize an object much faster by parallelizing the process.
General Approach
mainPage.js
//= Functions / Classes =============================================================|
// To tell JSON stringify that this is already processed, don't touch
class SerializedChunk {
  constructor(data) { this.data = data }
  toJSON() { return this.data }
}
// Attach all events and props we need on workers to handle this use case
const mapCommonBindings = w => {
  w.addEventListener('message', e => w._res(e.data), false)
  w.addEventListener('error', e => w._rej(e.data), false)
  w.solve = async obj => {
    w._state && await w._state.catch(_ => _) // Wait for any older tasks to complete if there is another queued
    w._state = new Promise((_res, _rej) => {
      // Give this object promise bindings that can be handled by the event bindings
      // (just make sure not to fire 2 errors or 2 messages at the same time)
      Object.assign(w, {_res, _rej})
    })
    w.postMessage(obj)
    return await w._state // Return the final output, when we get the `message` event
  }
}
//= Initialization ===================================================================|
// Let's make our 10 workers
const workers = Array(10).fill(0).map(_ => new Worker('worker.js'))
workers.forEach(mapCommonBindings)
// A helper function that schedules workers in a round-robin
workers.schedule = async task => {
  workers._c = ((workers._c || -1) + 1) % workers.length
  const worker = workers[workers._c]
  return await worker.solve(task)
}
// A helper used below that takes an object key, value pair and uses a worker to solve it
const _asyncHandleValuePair = async ([key, value]) => [key, new SerializedChunk(
  await workers.schedule(value)
)]
//= Final Function ===================================================================|
// The new function (You could improve the runtime by changing how this function schedules tasks)
// Note! This is async now, obviously
const jsonStringifyThreaded = async o => {
  const f_pairs = await Promise.all(Object.entries(o).map(_asyncHandleValuePair))
  // Take all final processed pairs, create a new object, JSON stringify top level
  const final = f_pairs.reduce((o, [key, chunk]) => (
    o[key] = chunk, // Add current key / chunk to object
    o               // Return the object to next reduce
  ), {})            // Seed empty object that will contain all the data
  return JSON.stringify(final)
}

/* lot of other code, till the function that actually uses this code */

async function submitter() {
  // other stuff
  const payload = await jsonStringifyThreaded(input.value)
  await server.send(payload)
  console.log('Done!')
}
worker.js
self.addEventListener('message', function(e) {
  const obj = e.data
  self.postMessage(JSON.stringify(obj))
}, false)
Notes:
This works the following way:
Creates a list of 10 workers, and adds a few methods and props to them
We care about async .solve(Object): String which solves our tasks using promises while masking away callback hell
Use a new method: async jsonStringifyThreaded(Object): String which does the JSON.stringify asynchronously
We break the object into entries and solve each one in parallel (this can be optimized to be recursive to a certain depth, use best judgement :))
Processed chunks are cast into SerializedChunk which the JSON.stringify will use as is, and not try to process (since it has .toJSON())
Internally if the number of keys exceeds the workers, we round-robin back to the first worker and overschedule them (remember, they can handle queued tasks)
Optimizations
You may want to consider a few more things to improve performance:
Use of Transferable Objects, which will significantly decrease the overhead of passing data to the workers (see the sketch after this list)
Redesign jsonStringifyThreaded() to schedule more objects at deeper levels.
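A minimal sketch of the transferable idea on the worker side (only buffers and a few other types are transferable, so the string is encoded into an ArrayBuffer first):
// worker.js - stringify, encode, and hand the buffer back without copying it
self.addEventListener('message', function (e) {
  const json = JSON.stringify(e.data)
  const buf = new TextEncoder().encode(json).buffer // ArrayBuffer is transferable
  self.postMessage(buf, [buf]) // second argument lists the transferables
}, false)
// The main-page message handler would then need to decode the buffer back into a
// string, e.g. new TextDecoder().decode(e.data)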
You can explore libraries like fast-json-stringify, which use a template schema while converting the JSON object to boost performance. Check the article below.
https://developpaper.com/how-to-improve-the-performance-of-json-stringify/
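For reference, a minimal sketch of that schema-based approach (the schema fields here are made up; fast-json-stringify needs one that matches your own object):
const fastJson = require('fast-json-stringify');

// Describing the object's shape up front is what makes the serializer fast
const stringify = fastJson({
    title: 'Row',
    type: 'object',
    properties: {
        style: { type: 'string' },  // e.g. the svg content column
        name: { type: 'string' },
        count: { type: 'integer' }
    }
});

const payload = stringify({ style: '<svg>...</svg>', name: 'row-1', count: 3 });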

Rxjs - Consume API output and re-query when cache is empty

I'm trying to implement a version of this intro to RxJS (fiddle here) that, instead of picking a random object from the returned API array, consumes a backthrottled stream of objects from the returned API array.
Here's a portion of the code that produces a controlled Observable from the API response (full fiddle here):
var responseStream = requestStream.flatMap(function (requestUrl) {
return Rx.Observable.fromPromise(fetch(requestUrl));
}).flatMap(function(response) {
return Rx.Observable.fromPromise(response.json());
}).flatMap(function(json) {
return Rx.Observable.from(json);
}).controlled();
I just dump each emitted user in console.log, and use a click event stream to trigger the request() call in the controlled Observable:
responseStream.subscribe(function(user) {
console.log(user);
});
refreshClickStream.subscribe(function (res) {
responseStream.request(1);
});
There are about 50 user objects returned from the GitHub API, and I'd like to backthrottle-consume them one per click (as seen above). However, after I'm fresh out of user objects, I'd like to send another call to requestStream to make another API call, replenish the responseStream, and continue providing user objects to console.log upon each click. What would be the RxJS-friendly way to do so?
I'd do it similarly to the article example with combineLatest() although I wonder if there's an easier way than mine.
I'm making a request for only 3 items. Working with 3 items is hardcoded, so you'll want to modify this. I was thinking about making it universal, but that would require using a Subject and would make it much more complicated, so I stayed with this simple example.
Also, I'm using concatMap() to trigger fetching more data. Otherwise, clicking the link triggers just the combineLatest(), which emits another item from the array.
See live demo: https://jsfiddle.net/h3bwwjaz/12/
var refreshButton = document.querySelector('#main');
var refreshClickStream = Rx.Observable.fromEvent(refreshButton, 'click')
    .startWith(0)
    .scan(function(acc, val, index) {
        return index;
    });

var usersStream = refreshClickStream
    .filter(function(index) {
        return index % 3 === 0;
    })
    .concatMap(function() {
        var randomOffset = Math.floor(Math.random() * 500);
        var url = 'https://api.github.com/users?since=' + randomOffset + '&per_page=3';
        return Rx.Observable.fromPromise(fetch(url))
            .flatMap(function(response) {
                return Rx.Observable.fromPromise(response.json());
            });
    })
    .combineLatest(refreshClickStream, function(responseArray, index) {
        return responseArray[index % 3];
    })
    .distinct();

usersStream.subscribe(function(user) {
    console.log(user);
});
I use refreshClickStream twice:
to emit next item in the array in combineLatest()
to check whether this is the end of the array and we need to make another request (that's the filter() operator).
At the end, distinct() is required because clicking at an index where index % 3 === 0 in fact triggers two emissions. The first comes from downloading the data, and the second comes directly from combineLatest(); we want to ignore that one because we don't want to iterate the same data again. Thanks to distinct(), it's ignored and only the new value is passed.
I was trying to figure out a method without using distinct() but I couldn't find any.

Copy a Parse.com class to new class with transformation of values

There is an existing Parse.com class that needs to be copied to a new Parse.com class with some new columns and the transformation of one of the columns. The code currently works and uses the Parse.Query.each method to iterate over all records as detailed in the Parse.com documentation but it stops processing at 831 records although there are 12k+ records in the class. This is odd given each should not have a limit and other default limits are 100 or 1000 for find. Should another method be used to iterate over all records or is there something wrong with the code?
var SourceObject = Parse.Object.extend("Log_Old_Class");
var source_query = new Parse.Query(SourceObject);
var TargetObject = Parse.Object.extend("Log_New_Class");
source_query.each(function(record) {
    // Save the record to the new class; this code works fine
    var target_query = new TargetObject();
    target_query.set("col1_new", record.get("col1"));
    target_query.set("col2_new", record.get("col2"));
    //etc...
    target_query.save(null, {
        success: function(obj) {
            //SAVED
        },
        error: function(obj, error) {
            //ERROR
        }
    });
}).then(function() {
    //DONE
}, function(error) {
    //error
});
One thing that comes to my mind immediately is that the function is getting timed out. Parse has time limitations on each function. If I were you, I'd first load all the objects in the source class and then add them separately, with a delay between API calls (server-overload issues can also be present).
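A rough sketch of that idea with the classic Parse JavaScript SDK, paging on createdAt instead of relying on each() (the batch size, ordering, and the col1/col2 names from the question are illustrative; sequential batches also keep the API calls spaced out):
var SourceObject = Parse.Object.extend("Log_Old_Class");
var TargetObject = Parse.Object.extend("Log_New_Class");
var BATCH = 500;

function copyFrom(lastCreatedAt) {
    var query = new Parse.Query(SourceObject);
    query.ascending("createdAt");
    query.limit(BATCH);
    if (lastCreatedAt) query.greaterThan("createdAt", lastCreatedAt);
    return query.find().then(function(records) {
        if (records.length === 0) return null; // done
        var targets = records.map(function(record) {
            var target = new TargetObject();
            target.set("col1_new", record.get("col1"));
            target.set("col2_new", record.get("col2"));
            return target;
        });
        return Parse.Object.saveAll(targets).then(function() {
            // Note: records sharing an identical createdAt at a page boundary could be skipped
            return copyFrom(records[records.length - 1].createdAt);
        });
    });
}

copyFrom(null);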

JavaScript - using a Parse.com query with Angular ng-repeat

I make a query to Parse.com and get an array of 2 objects. Now I want to use ng-repeat ('phone in phones'), so I need to convert the result to JSON. I didn't succeed in doing that; for some reason it doesn't see the result as JSON.
var Project = Parse.Object.extend("Project");
var query = new Parse.Query(Project);
query.find({
    success: function (results) {
        var allProjects = [];
        for (var i = 0; i < results.length; i++) {
            allProjects.push(results[i].toJSON());
        }
        $scope.phones = allProjects;
        // I also tried this: $scope.phones = JSON.stringify(allProjects);
    },
    error: function (error) {
        alert("Error: " + error.code + " " + error.message);
    }
});
Thanks
Not sure if you already figured this out, but I was having a similar issue and found your post. I'm not sure what your HTML looks like, but I ended up using the Parse.Object's get method in my repeater like so:
<ul ng-repeat="list in lists">
    <li>
        <a ng-href="#/lists/{{list.id}}">{{list.get('title')}}</a>
    </li>
</ul>
I also looked into using promises so that the Parse query success callback actually updates the view when you set $scope.phones to the query result. My code is similar to yours but my object is List instead of Project. Here is what mine looks like:
// Define your Parse object
var List = Parse.Object.extend('List');
// Define a function that runs your Parse query. Use an angular promise so that
// the view updates when you set your $scope var equal to the query result
function getList() {
    var deferred = $q.defer();
    var query = new Parse.Query(List);
    query.find({
        success: function(results) {
            deferred.resolve(results);
        },
        error: function(error) {
            deferred.reject(error.message);
        }
    });
    return deferred.promise;
}
// Call the getList function on load
var promise = getList();
promise.then(function(lists) {
    $scope.lists = lists;
});
So basically, it isn't that Angular doesn't see the response right. You shouldn't have to modify the result from Parse in any way. It's just that you need to use the Parse.Object get method like you would if you were accessing properties of the object, and make sure that you are using a promise so that Angular accesses your query result as it should in the view.
Do not use the .get function of Parse in your Angular code; it no longer works, plus it's not a good idea to change your Angular code just because your object is three levels nested and needs a get method.
The proper way is to extend the object and then map the values back to whatever items you need in that class.
Then you can bind it normally to ng-repeat without changing your HTML code specifically for Parse.
var Game = Parse.Object.extend("Game");
var query = new Parse.Query(Game);
query.find({
    success: function(results) {
        $scope.$apply(function() {
            $scope.games = results.map(function(obj) {
                return {points: obj.get("points"), gameDate: obj.get("gameDate"), parseObject: obj};
            });
        });
    },
    error: function(error) {
        console.log(error);
    }
});
There may be better tools and frameworks to use.
Line 188 is the fetch. It automatically loads the JSON for the model into the collection defined at line 47.
Looping over entries in the result from the query on Parse is all automated in the framework, so you can save yourself tons of time by learning a relevant framework (i.e. Backbone). With Backbone/Parse you focus on business logic, not on manipulating network IO and query structures.
'phone in phones' from your question may just be a nested model or nested collection, which IMO can be handled by more advanced manipulation of the basic Backbone framework.

How to make a clean Asynchronous loop?

Following typical REST standards, I broke up my resources into separate endpoints and calls. The main two objects in question here are List and Item (and of course, a list has a list of items, as well as some other data associated with it).
So if a user wants to retrieve his lists, he might make a Get request to api/Lists
Then the user might want to get the items in one of those lists and make a Get to api/ListItems/4 where 4 was found from List.listId retrieved in the previous call.
This is all well and good: the options.complete attribute of $.ajax lets me point to a callback method, so I can streamline these two events.
But things get very messy if I want to get the elements for all the lists in question. For example, let's assume I have a library function called makeGetRequest that takes in the end point and callback function, to make this code cleaner. Simply retrieving 3 elements the naive way results in this:
var success1 = function(elements) {
    var success2 = function(elements) {
        makeGetRequest("api/ListItems/3", finalSuccess);
    }
    makeGetRequest("api/ListItems/2", success2);
}
makeGetRequest("api/ListItems/1", success1);
Disgusting! This is the kind of thing in programming 101 we're smacked across the wrists for and pointed to loops. But how can you do this with a loop, without having to rely on external storage?
for (var i of values) {
    makeGetRequest("api/ListItems/" + i, successFunction);
}
function successFunction(items) {
    // I am called i-many times, each time only having ONE list's worth of items!
}
And even with storage, I would have to know when all have finished and retrieved their data, and call some master function that retrieves all the collected data and does something with it.
Is there a practice for handling this? This must have been solved many times before...
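For reference, one common jQuery practice (independent of the answers below) is to keep the promise that each request returns and wait for all of them with $.when(); a minimal sketch using plain $.get() rather than the question's makeGetRequest helper, with a hypothetical ids array:
var ids = [1, 2, 3, 4];
var requests = ids.map(function (id) {
    return $.get("api/ListItems/" + id);
});

$.when.apply($, requests).done(function () {
    // One argument per request, each of the form [data, statusText, jqXHR]
    var allItems = Array.prototype.map.call(arguments, function (args) {
        return args[0];
    });
    // allItems now holds every list's items, in the same order as `ids`
});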
Try using a stack of endpoint parameters:
var params = [];
var results = [];
params.push({endpoint: "api/ListItems/1"});
params.push({endpoint: "api/ListItems/2"});
params.push({endpoint: "api/ListItems/3"});
params.push({endpoint: "api/ListItems/4"});
Then you can make it recursive in your success handler:
function getResources(endPoint) {
    var options = {}; // Ajax options
    options.success = function (data) {
        results.push({endpoint: endPoint, data: data});
        if (params.length > 0) {
            getResources(params.shift().endpoint);
        } else {
            theMasterFunction(results);
        }
    };
    // $.ajax(url, settings) so the success option is actually picked up
    $.ajax(endPoint, options);
}
And you can start it with a single call like this:
getResources(params.shift().endpoint);
Edit:
To keep everything self contained and out of global scope you can use a function and provide a callback:
function downloadResources(callback) {
    var endpoints = [];
    var results = [];
    endpoints.push({endpoint: "api/ListItems/1"});
    endpoints.push({endpoint: "api/ListItems/2"});
    endpoints.push({endpoint: "api/ListItems/3"});
    endpoints.push({endpoint: "api/ListItems/4"});

    function getResources(endPoint) {
        var options = {}; // Ajax options
        options.success = function (data) {
            results.push({endpoint: endPoint, data: data});
            if (endpoints.length > 0) {
                getResources(endpoints.shift().endpoint);
            } else {
                callback(results);
            }
        };
        $.ajax(endPoint, options);
    }

    getResources(endpoints.shift().endpoint);
}
In use:
downloadResources(function(data) {
// Do stuff with your data set
});
dmck's answer is probably your best bet. However, another option is to add a bulk list endpoint, so that your API supports requests like api/ListItems/?id=1&id=2&id=3.
You could also add an API search endpoint, if that fits your personal aesthetic more.
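If the API did support that kind of bulk request, the client side collapses to a single call; a small sketch (jQuery's traditional parameter encoding produces id=1&id=2&id=3 rather than id[]=1...):
var url = "api/ListItems/?" + $.param({ id: [1, 2, 3] }, true); // "api/ListItems/?id=1&id=2&id=3"
$.getJSON(url, function (items) {
    // items: everything the server matched for those ids
});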
