Save the order in Object while inserting numerical keys - javascript

I'm parsing an XML document and building an object from its values.
I have a parseXmlByTag() function which parses by a specific tag - it works fine, so assume it returns what it should.
this.mytempect = {};
for (var i = 0; i < xml.length; i++) {
    var temp = {};
    temp.ID = parseXmlByTag(xml[i], "ID");
    temp.name = parseXmlByTag(xml[i], "name");
    temp.phone = parseXmlByTag(xml[i], "phone");
    if (this.mytempect[temp.ID] == null)
        this.mytempect[temp.ID] = [];
    this.mytempect[temp.ID].push(temp);
}
Before I save each object, I check whether I need to create a new key for it or add it to an existing one. In the end I get something like this:
Object {56: Array[1], 70: Array[1], 78: Array[3]}
But the first one had ID 78 and the second one had ID 70. When I use .push(), the entries end up ordered by the numeric value of the ID rather than by the order in which I inserted them.
I need to preserve the order in which I receive them, so that they are stored in the order I created the keys, like this:
Object {78: Array[1], 70: Array[1], 56: Array[3]}
Any ideas how to fix it?

Objects in JavaScript are a lot like HashMaps in other languages, if you're familiar with those - in other words, the data is kept in key-value pairs, but the insertion order is not what you get back. Keys that look like array indices (such as your numeric IDs) are enumerated in ascending numeric order, so you cannot rely on the object's properties staying in the order you added them, because they won't.
You'll need to use a data structure that does preserve ordering, like an array.
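For example, a minimal sketch of that idea, keeping an array for order plus a plain object for lookup (the mytempList/mytempIndex names are made up here, and parseXmlByTag() is assumed to work as described):

this.mytempList = [];   // preserves the order the IDs arrive in
this.mytempIndex = {};  // fast lookup by ID

for (var i = 0; i < xml.length; i++) {
    var temp = {};
    temp.ID = parseXmlByTag(xml[i], "ID");
    temp.name = parseXmlByTag(xml[i], "name");
    temp.phone = parseXmlByTag(xml[i], "phone");

    if (this.mytempIndex[temp.ID] == null) {
        this.mytempIndex[temp.ID] = [];
        // record the ID the first time it appears, in arrival order
        this.mytempList.push({ ID: temp.ID, items: this.mytempIndex[temp.ID] });
    }
    this.mytempIndex[temp.ID].push(temp);
}
// this.mytempList is now e.g. [{ID: "78", items: [...]}, {ID: "70", items: [...]}, {ID: "56", items: [...]}]

Iterating this.mytempList gives you the groups in insertion order, while this.mytempIndex still lets you jump straight to a group by its ID.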

Related

Checking for a key in an object containing array of objects saved in chrome.storage.local

I currently save a bunch of objects (thousands) into chrome.storage.local and then, when on a specific web page, check whether specific IDs on the page are in fact saved in local storage.
Here's some pseudo code.
Background script:
var storage = chrome.storage.local;
var json = '[{"kek1": {"aaa": "aaaValue", "bbb": "bbbValue", "ccc": "cccValue"}},{"kek2": {"ddd": "dddValue", "eee": "eeeValue", "fff": "fffValue"}}]';
var jsonParsed = JSON.parse(json);
jsonParsed.forEach(function(object) {
    storage.set(object);
});
Content script (when on a specific page):
ids.forEach(function(id) {
    storage.get(id, function(result) {
        if (!isEmpty(result)) {
            // we found it, nice, now flag it as found
        }
    });
});

function isEmpty(obj) {
    for (var key in obj) {
        if (obj.hasOwnProperty(key))
            return false;
    }
    return true;
}
Which is easy and nice, since I only have to do storage.get(id, ...
Unfortunately, I save a lot of stuff in storage, some of which I need to remove periodically. That becomes a hassle, since I have to loop through all the objects and determine whether each one needs to be removed or should remain.
So I decided I would create "parent objects": one object for settings, containing an array of objects with the different settings the user would save; one object for the stuff that needs to be removed, containing an array of objects; and so on.
Like so - all relevant info that I want to remove periodically will be under one key "test" (temp name):
var json = '{"test":[{"kek1": {"aaa": "aaaValue", "bbb": "bbbValue", "ccc": "cccValue"}},{"kek2": {"ddd": "dddValue", "eee": "eeeValue", "fff": "fffValue"}}]}';
I know how to access the nested objects and their values:
var jsonParsed = JSON.parse(json);
jsonParsed.test[0].kek1.aaa
But I don't know how I would easily check for the keys saved in the storage, since I would have to specify the "element number" ([i]).
Do I just do a for loop iterating over the array like so?
for (var i = 0; i < jsonParsed.test.length; i++) {
    var getKey = Object.keys(jsonParsed.test[i]);
    if (getKey[0] == 'theKeyImLookingFor') {
        // do stuff
    }
}
To me that feels like a non-ideal solution, since the for loop would have to run for each of the IDs on the page, and there can sometimes be close to 4000 of them (4000 for loops back to back).
Is it a good idea to save a single object holding an array of thousands of other objects?
Am I doing it wrong or is this the way to go?
But I don't know how I would easily check for the keys saved in the storage
Use the standard Array methods like find or findIndex:
const i = arrayOfObjects.findIndex(o => 'someKey' in o);
Is it a good idea to save a single object holding an array of thousands of other objects?
It's a bad idea performance-wise.
What you probably need here is an additional value in the storage that would contain an array with ids of other values in the storage that need to be processed in some fashion e.g. expired/removed. It's basically like a database index so you would update it every time when writing an individual object. Since it contains only the ids, updating it is cheaper than rewriting the entire data.
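A rough sketch of that index idea, assuming a made-up index key called expiringIds that you maintain yourself (it is not part of the chrome.storage API):

// writing one object and registering it in the index
function saveWithIndex(id, value, done) {
    chrome.storage.local.get({ expiringIds: [] }, function (data) {
        var ids = data.expiringIds;
        if (ids.indexOf(id) === -1) ids.push(id);
        var update = { expiringIds: ids };
        update[id] = value;
        chrome.storage.local.set(update, done);
    });
}

// periodic cleanup: remove everything listed in the index, then reset it
function cleanup(done) {
    chrome.storage.local.get({ expiringIds: [] }, function (data) {
        chrome.storage.local.remove(data.expiringIds, function () {
            chrome.storage.local.set({ expiringIds: [] }, done);
        });
    });
}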
Also, instead of performing lots of calls to the API, do just a single call:
// writing
chrome.storage.local.set(Object.assign({}, ...arrayOfObjects));
// reading
chrome.storage.local.get(arrayOfIds, data => {
    for (const id of arrayOfIds) {
        const value = data[id];
        if (value !== undefined) {
            // ok
        }
    }
});

Organise array push order

I've got an array. I push items into this array multiple times using a function. Below is a simplified version of the code.
var arr = [];

function pushItems(i) {
    // do something with i
    var abc = "string";
    arr.push(abc);
    // do something with i
    var xyz = "string";
    arr.push(xyz);
}
Sometimes the abc value is pushed before xyz; sometimes xyz gets pushed before the abc value. My question is: how do I always keep the abc value ahead of the xyz value?
So basically I need the array values to be [abc1, xyz1, abc2, xyz2, abc3, xyz3, ...] and so on. How do I order the pushes accordingly?
That behaviour cannot come from push() itself. According to the specification of this method:
The push() method adds one or more elements to the end of an array and
returns the new length of the array.
Please have a look here.
For a more formal approach please see the ECMAScript specification here.
The arguments are appended to the end of the array, in the order in
which they appear. The new length of the array is returned as the
result of the call.
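A quick way to convince yourself of this:

var arr = [];
arr.push("abc");
arr.push("xyz");
console.log(arr); // always ["abc", "xyz"] - push appends in call order

So if abc sometimes ends up after xyz, it is because the pushes themselves run in a different order (for example, asynchronous callbacks finishing at different times), not because push() reorders the array.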
Update
But even if the elements are added at the end of the array, I'm
looking a way of ordering my array.
You can use the sort function for this, passing it an appropriate compare function. For instance, suppose we have the following array
var array = [4, 1, 2, 5, 3];
and we want to order it in descending order; we could do it like this:
array.sort(function(a, b) { return b - a; }); // sorts the array in place
Since you need your base64 strings to be in a specific order in the array, place them by an identifier you define.
var firstObj = {id: 0, base64: 'asdf'};
var secondObj = {id: 1, base64: 'qwer'};
var arr = [];

// do stuff
// callback needs to have something along these lines:
function base64isLoaded(obj) {
    arr[obj.id] = obj.base64;
}
Now the 'front' image (as you gave this as example) can be given id: 0, so it ends up in the '0' spot of the array. I can't really help more without more information about how your code is structured.
EDIT: From your comment ("passing multiple items into pushItems"), I am going to assume that i (the argument) is an array and you iterate this array to transform each element into a base64 encoded string. You then want these encoded strings added to arr in the same order, correct?
Easily done - simply make i an array of objects:
var i = [{source: 'abc'}, {source: 'xyz'}];

function pushItems(i) {
    for (var c = 0; c < i.length; c++) {
        makeIntoBase64(i[c]);
    }
}

function makeIntoBase64(obj) {
    // this is whatever function transforms it and takes a callback when it is done
    transform(obj.source, function(result) { // pass the source to be encoded
        // result should be a base64 encoded string
        obj.encoded = result;
    });
}
After all this, the array i has objects with both .source and .encoded. If you need to know when ALL encoding is done, create a counter, add one to it in the transform callback, and check whether counter === i.length each time. When it is, you know you have loaded all the base64 strings and can run another function, adding these images to your catalogue or whatever else you need them for :)
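A minimal sketch of that counter idea (transform is still assumed to be whatever asynchronous encoder you use, and allDone is a made-up callback name):

function pushItems(items, allDone) {
    var finished = 0;
    items.forEach(function(obj) {
        transform(obj.source, function(result) {
            obj.encoded = result;
            finished++;
            // only fire once every item has been encoded
            if (finished === items.length) allDone(items);
        });
    });
}

pushItems([{source: 'abc'}, {source: 'xyz'}], function(items) {
    // items are still in the original order, no matter which callback finished first
    var arr = items.map(function(obj) { return obj.encoded; });
});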

JS: name/val pairs to array

I'm using a one-off language similar to javascript in syntax, so an answer in that more common language should suffice.
I have a list of name/val pairs that i built from a big GET string that looks something like
"n1=v1,n2=v2..."
I'm not sure that my initial approach is correct. I used a primitive in this language
tolist(GETstring,"=")
to split the name value pairs into the above list. Perhaps, this is the wrong approach from the gate.
This gives me
data = [["n1","v1"],["n2","v2"],...]
I'm trying to change this into a named array, such as
data["n1"]="v1";
data["n2"]="v2";
...
so that I can access items by name, not by list index (as it is highly volatile).
What is the best approach to getting the data into this format? I've tried a few things, including evals, but nothing seems to work.
You'll have to split the string up then iterate through it.
var obj = {};
var originalString = "n1=v1,n2=v2";
var splitOriginalString = originalString.split(",");

for (var i = 0; i < splitOriginalString.length; i++) {
    var tmpObj = splitOriginalString[i].split("=");
    obj[tmpObj[0]] = tmpObj[1];
}
There is no option to do it directly. You've got two ways to work around it.
Create two arrays, one for keys and one for values.
var indexes = ["test", "test2"];
var values = ["val", "val2"];
var value = values[indexes.indexOf("test2")]; // will get "val2"
Create a nested array, with key 0 for your string key and key 1 for its value.
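A sketch of that second workaround, assuming the same "n1=v1,n2=v2" input:

var pairs = "n1=v1,n2=v2".split(",").map(function(pair) {
    var parts = pair.split("=");
    return [parts[0], parts[1]]; // [0] is the key, [1] is the value
});

// look a value up by its key
function getValue(pairs, key) {
    for (var i = 0; i < pairs.length; i++) {
        if (pairs[i][0] === key) return pairs[i][1];
    }
    return undefined;
}

getValue(pairs, "n2"); // "v2"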

jQuery: $.getJSON sorting the data on Chrome / IE?

I'm passing an associative array (id => val) using Ajax and receiving it with jQuery's $.getJSON, which reads the data properly and prepares the object. There is, however, a very annoying sorting issue.
It appears that on Chrome and IE the data gets sorted by the id part of the associative array. So if the array should be (5 => 'xxx', 3 => 'fff'), it actually becomes (3 => 'fff', 5 => 'xxx'). On Firefox it works as expected, i.e. not sorted.
Any ideas?
You can add a leading 0 to all integer indexes, so the keys are no longer treated as array indices:
var json = { '05': 'xxx', '03': 'fff' };
It seems the best way is to avoid associative arrays altogether. When you want to send an associative array, simply send it as two separate arrays - one of keys and one of values. Here's the PHP code to do that:
$arWrapper = array();
$arWrapper['k'] = array_keys($arChoices);
$arWrapper['v'] = array_values($arChoices);
$json = json_encode($arWrapper);
and the simple JavaScript code to do whatever you'd like with it:
for (var i = 0; i < data['k'].length; i++) {
    console.log('key:' + data['k'][i] + ' val:' + data['v'][i]);
}
Another option is to return the data as an array of objects. That will ensure that the objects stay in the order that you return them.
Edit:
Basically, for each key => value pair, push it to a new array and json_encode that array.
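For illustration, if the server returned a hypothetical structure like the one below (an array of {id, val} objects; the endpoint name is made up), the order would survive the round trip and could be consumed directly:

// response body: [{"id": 5, "val": "xxx"}, {"id": 3, "val": "fff"}]
$.getJSON('/your-endpoint', function(data) {
    data.forEach(function(item) {
        console.log('id: ' + item.id + ' val: ' + item.val); // logged in server order
    });
});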

JavaScript: memory/efficiency of associative arrays?

I am building a tree-like data structure out of associative arrays. Each key is 1-2 characters. Keys are unique to their respective level. There will be no more than 40 keys on the root level and no more than 5 keys on each subsequent level of the tree. It might look something like this:
{a:{b:null,c:null},de:{f:{g:null}},h:null,i:null,j:null,k:null}
Initially, I thought that creating so many objects with so few keys (on average, < 3) would be inefficient and memory intensive. In that case, I would implement my own hash table like so:
// Suppose keys is a multi-dimensional array [[key, data], ...]
var hash = function(keys) {
    var max = keys.length * 3, tbl = [];

    // Get key hash value
    var code = function(key) {
        return (key.charCodeAt(0) * 31) % max;
    };

    // Get key values
    this.get = function(key) {
        // 2 character keys actually have a separate hash generation algorithm...
        // we'll ignore them for now
        var map = code(key), i = map;
        // Find the key value
        while (true) {
            if (typeof tbl[i] == 'undefined') return false;
            if (code(tbl[i][0]) == map && tbl[i][0] == key) return tbl[i][1];
            else i = (i + 1) % max;
        }
    };

    // Instantiate the class
    for (var i = 0; i < keys.length; i++) {
        var index = code(keys[i][0]);
        while (typeof tbl[index] != 'undefined')
            index = (index + 1) % max;
        tbl[index] = keys[i];
    }
};
Then, I read somewhere that JavaScript's arrays are sometimes implemented as associative arrays when sparsely filled, which could defeat the purpose of making my own hash structure. But I'm not sure. So, which would be more efficient, in terms of memory and speed?
Read this article: http://mrale.ph/blog/2011/11/05/the-trap-of-the-performance-sweet-spot.html
Basically due to the dynamic nature of JavaScript, your data structures will not be very efficient. If you do need very efficient data structures, you should try using the new Typed Arrays introduced recently.
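For illustration only (this layout is not from the article), a typed array stores raw numbers in one contiguous block instead of as per-object properties:

// 40 slots of 32-bit hash codes instead of 40 object keys
var codes = new Int32Array(40);
codes[0] = 'a'.charCodeAt(0) * 31;
// the associated values would live in a parallel plain array
var values = new Array(40);
values[0] = null;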
If you aren't into theoretical results, Resig has done real word performance testing on different types of trees looking at data size and performance parsing and processing: http://ejohn.org/blog/javascript-trie-performance-analysis/
Your solution, if I understand it correctly, will definitely perform worse. You express a concern with this:
[...] creating so many objects with so few keys (on average, < 3) [...]
but your solution is doing the same thing. Every one of your nested hashes will still be an object with a small number of keys, only now one of its properties is a closure named get (which will have higher memory requirements, since it implicitly closes over variables such as tbl and code, where code is another closure...).
