I'm working on an auto-updating table of information using AJAX, but I've run into a bump in the road. I'm using PHP to return a JSON object on each request, which contains data in the following format:
{
    "table": {
        "544532": {
            "field1": "data",
            "field2": "data2",
            "field3": "data3",
            .....
        },
        "544525": {
            "field1": "data",
            "field2": "data2",
            "field3": "data3",
            .....
        },
        ......
    }
}
I use Prototype.js to get the list of IDs into an array:
var ids = Object.keys(data.table).sort();
However, random rows of the table could disappear from the list at any time, and new rows could be added to the end at any time. I assume I would store the array of IDs from the previous request and compare it with the new array, but since random rows can disappear, shifting the positions of all IDs after them, how do I compare the two arrays so that I only add new rows to the page or remove deleted ones?
Unfortunately Prototype doesn't include a Set type, which would have made things a whole lot simpler. So we'll have to make do with this:
Array.prototype.subtract = function (a) {
    // keep only the elements of `this` that are not in `a`
    return this.reject(this.include.bind(a));
};
The above adds a much-needed subtract function. We can use it like this:
added_ids = new_ids.subtract(old_ids);
removed_ids = old_ids.subtract(new_ids);
It's not too slow either, since some browsers support indexOf, which Prototype's include checks for and uses.
PS. Array already has an intersect function; if you'd like a complement too, here it is:
Array.prototype.complement = function (a) {
    // keep only the elements of `a` that are not in `this`
    return a.reject(this.include.bind(this));
};
Essentially a.subtract(b) is the same as b.complement(a).
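To close the loop on the original question, here is a sketch of how the two diffs could drive the page on each poll; renderRow, the 'mytable' container, and the 'row-<id>' element ids are all assumptions, not part of the original code:
var added_ids = new_ids.subtract(old_ids);
var removed_ids = old_ids.subtract(new_ids);

added_ids.each(function (id) {
    // build a row for the new entry and append it (Prototype's Element#insert)
    $('mytable').insert(renderRow(id, data.table[id]));
});

removed_ids.each(function (id) {
    // assumes each row element was given id="row-<id>" when rendered
    $('row-' + id).remove();
});

old_ids = new_ids; // remember this snapshot for the next request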
I'm sure there are better ways to do this, but you will need to store the rows that are shown in the table somewhere, perhaps an array, then loop over the JSON object comparing the rows against the array.
You should update your JSON data structure when the table is updated. That would be the data model for the page. Then you can just call Object.keys(data.table) every time you need it.
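For illustration, a minimal sketch of that approach; "544525" comes from the sample data above, while "544600" is a made-up id:
// keep data.table as the single source of truth, mutating it as rows change
delete data.table["544525"];               // a row the server removed
data.table["544600"] = { field1: "data" }; // a row the server added
var ids = Object.keys(data.table).sort();  // recompute whenever needed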
Related
I need to use a Zapier webhook to take some incoming JSON data, which contains an array of items, loop that array and do an action for each element.
Here's a sample of incoming JSON data:
{
    "first_name": "Bryan",
    "last_name": "Helmig",
    "age": 27,
    "data": [
        {
            "title": "Two Down, One to Go",
            "type": "Left"
        },
        {
            "title": "Talk the Talk",
            "type": "Right"
        },
        {
            "title": "Know the Ropes",
            "type": "Top"
        }
    ]
}
The size of the array will be dynamic.
The problem is that when I import this data in the hook, it gives me
data
title: Two Down, One to Go
type: Left
title: Talk the Talk
type: Right
title: Know the Ropes
type: Top
So, basically it says that data is just a big string of all this stuff together.
Can anyone help me figure out if it's possible to have a Zap loop over this and do something, e.g. insert data into a sheet, for every item in the array? I'm aware of the "Code" actions; I've chosen JavaScript, with which I could parse the string myself, but that doesn't seem efficient. Plus, in reality, there will be a lot of data in the objects inside the JSON array.
EDIT: SOLVED! ANSWER BELOW
So, the first part is to use Catch Raw Hook as the trigger. It's the normal "Webhooks" trigger, but you have to click to show the less common variations. With Catch Raw Hook, your data will not be automatically turned into variables by the Zapier app; you'll have the raw JSON data.
Once you have the raw JSON, in my case the next step is an action, and this will be the "Code" action. I'm using JavaScript. In my template, I'm grabbing the entire JSON string (your whole imported JSON is a string right now, not an object, so we can't use "." (dot) notation to access parts of it).
You'll need to JSON.parse() the string in the code. But first, let me explain that Zapier has a pre-defined variable called inputData that you'll use in your code. At the top of the "Edit Template" section of your "Code" action, you can name the variable that holds the JSON string you imported.
Now the fun part! In the code, you'll type:
// of course, you can change the variables to what you want
// but 'inputData' is unique, can't change that
const myData = JSON.parse(inputData.rawJsonData);
So, my raw data is a string, not an object yet; this line of code parses it into a JavaScript object. And now, as an object, we can loop over it, .map it, access 'this.that', or whatever you want.
The next important thing to mention about "Code" in Zapier is that to get your stuff out, you return. So, in the next few lines, I'm going to return a .map call that returns each item of an array. It's tough to grasp how Zapier treats this, but it actually runs the next "Action" you create (e.g. adding a row to a sheet) once for each item you return from that .map. So, let's take a look below:
return myData.data.map(item => {
    return item;
});
If you remember, I had an array called "data" in the raw JSON I listed in the original question. The code will loop over that array, and since I'm returning, it will perform an "Add Row to Sheet" action (in my case) for each loop, thus inserting all of my data as multiple rows in my spreadsheet.
So the finished code:
const myData = JSON.parse(inputData.rawJsonData);
return myData.data.map(item => {
    return item;
});
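As a side note, if the objects carry more fields than the next action needs, the same .map can reshape them. A sketch, assuming you only want the title and type fields from the sample data:
const myData = JSON.parse(inputData.rawJsonData);
// return one small object per item; the next action still runs once per object
return myData.data.map(item => ({
    title: item.title,
    type: item.type
}));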
Prefacing this: a solution only needs to work in the latest versions of Chrome, Firefox, and Safari; anything more is a bonus.
-
I am trying to use an associative array for a large data set with knockout. My first try made it a true associative array:
[1: {Object}, 3: {Object},...,n:{Object}]
but knockout was not happy with looping over that. So I tried a cheating way, hoping that:
[undefined, {Object}, undefined, {Object},...,{Object}]
where the location in the array is the PK ID from the database table. This array is about 3.2k items large, and would be iterated over around every 10 seconds, hence the need for speed. I tried doing this with a splice, e.g.
$.each(data, function (index, item) {
    self.myArray.splice(item.PKID, 0, new Object(item));
});
but splice does not create indices, so since my first PKID is 1, it is still inserted at myArray[0] regardless. If my first PK was 500, it would start at 0 still.
My second thought is to initialize the array with var myArray = new Array(maxSize), but that seems heavy-handed. I would love to be able to use some sort of map function to do this, but I'm not really sure how to make the key translate into an index value in JavaScript.
My third thought was to keep two arrays, one for easy look-up and the other to store the actual values. So it almost combines the first two solutions: find the index of the object as in the first example, then do a look-up with it as in the second. This seems to be how many people manage associative arrays in knockout, but with the array size, and the fact that it's a live-updating app with a growing data set, it seems memory intensive and not easily manageable when new information is added.
Also, maybe I'm hitting the mark wrong here? We're putting these into the DOM via knockout and managing with a library called isotope, and as I mentioned it updates about every 10 seconds. That's why I need the fast look up but knockout doesn't want to play with my hash table attempts.
--
clarity edits:
So on initial load the whole array is loaded up (which is where the new Array(maxLength) would go), then every 10 seconds anything that has changed is loaded back. That is the information I'm trying to quickly update.
--
knockout code:
<!-- ko foreach: {data: myArray(), afterRender: setInitialTileColor } -->
<div class="tile" data-bind="attr: {id: 'tileID' + $data.PKID()}">
    <div class="content">
    </div>
</div>
<!-- /ko -->
Then on updates the hope is:
$.each(data.Updated, function (index, item) {
    var obj = myModel.myArray()[item.PKID];
    // do updates here - need to check what kind of change,
    // how long it's been since a change, etc.
});
Here is a solution for populating array items at their correct indexes, so the array doesn't start from the first one (index 0, I mean). Just assign by index inside the loop:
arr[obj.PKID] = obj;
If your framework is smart (using forEach rather than a plain for loop), iteration will start at your first index (500 in the example below) and skip the holes.
http://jsfiddle.net/0axo9Lgp/
var data = [], new_data = [];

// Generate sample array of objects with index field
for (var i = 500; i < 3700; i++) {
    data.push({
        PKID: i,
        value: '1'
    });
}

data.forEach(function (item) {
    new_data[item.PKID] = item;
});

console.log(new_data);
console.log(new_data.length); // 3700, but only 3200 items are defined; the rest are holes
It's not an easy problem to solve. I'm assuming you've tried (or can't try) the obvious stuff like reducing the number of items per page and possibly using a different framework like React or Mithril.
There are a couple of basic optimizations I can suggest.
Don't use the framework's each. It's either slower than or the same as the native Array method forEach, and either way it's slower than a basic for loop.
Don't loop over the array again and again looking for every item whose data has been updated. When you send your response of data updates, send along an array of the PKIDs of the updated items. Then, do a single loop:
var indexes = [];
var updated = JSON.parse(response).updated; // example array of updated PKIDs
for (var i = 0; i < allElements.length; i++) {
    if (updated.indexOf(allElements[i].pkid) > -1)
        indexes.push(i);
}
So, basically, the above assumes you have a simple array of objects, where each object has a property called pkid that stores its ID. When you get a response, you loop over this array once, storing the indexes of all items whose pkid matches one of the updated PKIDs.
Then you only have to loop over the indexes array and use its elements as indexes on the allElements array to apply the direct updates.
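That second pass could look like this (applyUpdate is a hypothetical stand-in for your actual update logic):
// touch only the matched rows
for (var j = 0; j < indexes.length; j++) {
    applyUpdate(allElements[indexes[j]]);
}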
If your indexes are integers in a reasonable range, you can just use an array. It does not have to be completely populated, you can use the if binding to filter out unused entries.
Applying updates is just a matter of indexing the array.
http://jsfiddle.net/0axo9Lgp/2/
You may want to consider using the publish-subscribe pattern. Have each item subscribe to its unique ID. When an item needs updating, it will get the event and update itself. This library may be helpful for this. It doesn't depend upon browser events, just arrays, so it should be fairly fast.
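For illustration, a minimal sketch of the pattern in plain JavaScript; this is not the unnamed library, and data / updated here stand for the initial load and the 10-second delta:
var subscribers = {}; // PKID -> callbacks interested in that item

function subscribe(pkid, callback) {
    (subscribers[pkid] = subscribers[pkid] || []).push(callback);
}

function publish(pkid, update) {
    (subscribers[pkid] || []).forEach(function (cb) { cb(update); });
}

// on initial load, each tile subscribes to its own ID...
data.forEach(function (item) {
    subscribe(item.PKID, function (update) {
        // update just this one tile; no scan over 3.2k items
    });
});

// ...and the 10-second poll only publishes what changed
updated.forEach(function (item) {
    publish(item.PKID, item);
});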
I am quite new to these languages and am trying to make sense of what's going on. I have managed to get data from an external JSON file and create a list from it.
This is the contents from the JSON file:
{
    "player": [
        {
            "name": "John",
            "country": "USA",
            "score": 102400
        },
        {
            "name": "Mary",
            "country": "Australia",
            "score": 80001
        },
        {
            "name": "Jane",
            "country": "England",
            "score": 103900
        }
    ]
}
Now here is the fiddle with the HTML and js.
http://jsfiddle.net/tusika_/ut3NZ/
As you can see, every ul is wrapped in a div with class "player". What I would like to achieve is to be able to sort those divs of class "player", by sorting alphabetically the name (default) or country or descending score of the players.
After two days of research and finding answers to similar questions, I managed to put the data into an array. When I use the sort method and the comparison function in the JS, I can see in the console that the objects do get sorted differently; however, only the first three objects get sorted alphabetically and the last two are left unsorted (in the original file I have many more players than three).
Also, I do not understand how to re-render that new order on screen (it should replace the current output each time).
I would appreciate a response that indicates where the error in the logic is, and that doesn't just provide the code but helps me understand why the code is the way it is.
Thank you very much!!!
The issue is (as you have noticed) the disconnection between the array order and the DOM order. You only use the array to create the DOM elements; they are not linked in a way that makes what happens to one affect the other.
You will have to manually redraw the DOM, either by emptying the container and redrawing the elements, or by re-arranging the existing DOM elements. For example, you could have a function that clears the #list element and then appends the sorted nodes.
function displayData(array) {
    var list = $("#list").empty();
    $.each(array, function () {
        list.append("<div class='player'><ul><li>" + this['name'] + "</li><li>" +
            this['country'] + "</li><li>" + this['score'] + "</li></ul></div>");
    });
}
Also, you do not need to sort the array while adding each element; just sort the whole array once.
So you can use the above code like this
var sorted = data.player.sort(byCountry);
displayData(sorted);
You can see a simple demo at http://jsfiddle.net/gaby/YhvTt/
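The byCountry comparator comes from the fiddle; for reference, comparators for the three sorts the question asks about might look like this (the names are illustrative, not taken from the fiddle):
function byName(a, b) { return a.name.localeCompare(b.name); }          // A to Z
function byCountry(a, b) { return a.country.localeCompare(b.country); } // A to Z
function byScoreDesc(a, b) { return b.score - a.score; }                // highest first

// usage: re-render with whichever order the user picked
displayData(data.player.sort(byScoreDesc));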
When you do array.push(key, value);, you push both the key and the value into the array (at positions i and i+1).
Oh, and BTW, you can simply do: data.player.sort(compare);
I am working on a JavaScript web application (SPA) with a RESTful API on the back-end. In my client-side datacontext I want to add objects to my model graph and then send the whole graph to the server at once.
Suppose the following example:
I have a Person object in my model which itself has an array of, say, PhoneNumbers as a property. Now I load a Person from the API for editing and map it to my model. Suppose I want to add some phone number objects to PhoneNumbers. For this I add each number, e.g. {"id": 0, "number": 6536652226}, with an id of zero to my client model and send the whole graph to the server when the user clicks save. On the server I add the objects with an id of zero (new objects) to the database with auto-incremented ids.
I am doing my project based on a tutorial. They do something like this to add objects to context:
var items = {},
    // returns the model item produced by merging json obj into context
    mapJsonToContext = function (json) {
        var id = mapper.getJsonId(json);
        var existingItem = items[id];
        items[id] = mapper.fromDto(json, existingItem); // returns the mapped obj
        return items[id];
    },
    add = function (newObj) {
        items[newObj.id()] = newObj;
    };
The problem is that if I use this method, I won't be able to remove the newly-added-but-not-yet-saved items by id on the client side, because all their ids are zero!
Any suggestions to fix this, or do I need a totally different approach?
First of all, two little misconceptions I've spotted:
1) Forget about "associative arrays". Numeric arrays are the only kind of arrays you have; the other constructs are just "objects" (this is not PHP).
2) If it's JSON, it's a string, not an object.
Other than that, you can of course use an arbitrary value to represent "new" (though I'd probably use null rather than 0), as long as you don't use that value to uniquely identify the yet-to-be-added item. E.g., this is just fine:
[
    {"id": 0, "number": "6536652226"},
    {"id": 0, "number": "9876543210"},
    {"id": 0, "number": "0123456789"}
]
This is not:
// WRONG!!!!
{
    0: "6536652226",
    0: "9876543210",
    0: "0123456789"
}
And of course you cannot find numbers by ID if they don't have an ID yet. You need to choose one of the following (a sketch of the third option follows the list):
Retrieve the generated ID from the DB and update your local data
Delete by number
Create a localId property on newly created client-side objects, and use that as your key when reconciling server-returned data. Obviously the server would have to return this localId to you.
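A sketch of the third option; localId comes from the answer above, while nextLocalId and the helper functions are made-up names:
var nextLocalId = -1; // negative, so it can never collide with a real DB id

function addPhoneNumber(person, number) {
    person.PhoneNumbers.push({
        id: 0,                  // zero still tells the server "new object"
        localId: nextLocalId--, // unique on the client until the server replies
        number: number
    });
}

function removePhoneNumber(person, localId) {
    person.PhoneNumbers = person.PhoneNumbers.filter(function (p) {
        return p.localId !== localId;
    });
}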
Can you suggest an algorithm for filtering data?
I am using JavaScript and trying to write a filter function which filters an array of data. I have an array of data and an array of filters, so in order to apply each filter to every item, I have written two nested loops:
foreach (data) {
    foreach (filter) {
        check data with filter
    }
}
This is not the actual code, but in short that's what my function does. The problem is that this takes a huge amount of time; can someone suggest a better method?
I am using the MooTools library and the array of data is a JSON array.
Details of the data and filters
The data is a JSON array of, let's say, users, so it will be:
data = [{"name" : "first", "email" : "first#first", "age" : "20"}.
{"name" : "second", "email" : "second#second", "age" : "21"}
{"name" : "third", "email" : "third#third", "age" : "22"}]
The array of filters is basically self-defined classes for the different fields of the data:
alFilter[0] = filterName;
alFilter[1] = filterEmail;
alFilter[2] = filterAge;
So when I enter the first loop, I get a single JSON object (the first row in the above case).
When I enter the second loop (the filters loop), I have a filter class which extracts the exact field the current filter works on, and checks the filter against the appropriate field of the data.
So in my example
foreach (data) {
    foreach (filter) {
        // loop one   - filter name
        // loop two   - filter email
        // loop three - filter age
    }
}
When the second loop ends, I set a flag denoting whether the data has been filtered out or not, and depending on it the data is displayed.
You're going to have to give us some more detail about the exact structure of your data and filters to really be able to help you out. Are the filters being used to select a subset of data, or to modify the data? What are the filters doing?
That said, there are a few general suggestions:
Do less work. Is there some way you can limit the amount of data you're working on? Some pre-filter that can run quickly and cut it down before you do your main loop?
Break out of the inner loop as soon as possible. If one of the filters rejects a datum, then break out of the inner loop and move on to the next datum. If this is possible, then you should also try to make the most selective filters come first. (This is assuming that your filters are being used to reject items out of the list, rather than modify them)
Check for redundancy in the computation the filters perform. If each of them performs some complicated calculations that share some subroutines, then perhaps memoization or dynamic programming may be used to avoid redundant computation.
Really, it all boils down to the first point, do less work, at all three levels of your code. Can you do less work by limiting the items in the outer loop? Do less work by stopping after a particular filter and doing the most selective filters first? Do less work by not doing any redundant computation inside of each filter?
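To make the early-break and selectivity points concrete, here is a sketch assuming ES5 array methods and filters that are plain predicate functions (the question's filter classes would need a small adapter): every() short-circuits on the first filter that rejects, which is exactly the early break described above.
// order the filters so the cheapest / most selective come first
var filters = [filterAge, filterName, filterEmail];

var visible = data.filter(function (item) {
    return filters.every(function (filter) {
        return filter(item); // a false here skips the remaining filters
    });
});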
That's pretty much how you should do it. The trick is to optimize the "check data with filter" part. You need to traverse all your data and check it against all your filters; you're not going to get any faster than that.
Avoid string comparisons, use data models that are as native as possible, try to reduce the data set with each filter pass, etc.
Without further knowledge, it's hard to optimize this for you.
You should sort the application of your filters, so that two things are optimized: expensive checks should come last, and checks that eliminate a lot of data should come first. Then, you should make sure that checking is cut short as soon as an "out" result occurs.
If your filters are looking for specific values, a range, or the start of a text, then jOrder (http://github.com/danstocker/jorder) will fit your problem.
All you need to do is create a jOrder table like this:
var table = jOrder(data)
    .index('name', ['name'], { grouped: true, ordered: true })
    .index('email', ['email'])
    .index('age', ['age'], { grouped: true, ordered: true, type: jOrder.number });
And then call table.where() to filter the table.
When you're looking for exact matches:
filtered = table.where([{name: 'first'}, {name: 'second'}]);
When you're looking for a certain range of one field:
filtered = table.where([{age: {lower: 20, upper: 21}}], {mode: jOrder.range});
Or, when you're looking for values starting with a given string:
filtered = table.where([{name: 'fir'}], {mode: jOrder.startof});
Filtering will be orders of magnitude faster this way than with nested loops.
Supposing that a filter removes an item if it doesn't match, I suggest that you switch the two loops, like so:
foreach (filter) {
    foreach (data) {
        check data with filter
    }
}
By doing so, the second filter doesn't have to process all the data, only the data that passed the first filter, and so on. Of course the tips above (like doing expensive checks last) still apply and should additionally be considered.
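In code, that shrinking working set could look like this; a sketch assuming the filters are plain predicates rather than the asker's filter classes:
var remaining = data;
filters.forEach(function (filter) {
    // each pass keeps only the items that survived the previous filters
    remaining = remaining.filter(function (item) {
        return filter(item);
    });
});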