I have an array of approximately 19,000 items.
I'll have to access them by an arbitrary id at random (that is, there's no need to traverse the array).
I was just wondering if JS can optimize the code if I use the id as the index of the array, or if there's any kind of trick or library to speed up this kind of thing.
To be more precise, I'll have the results of an election in approximately 20k schools, and I'd like to know your advice about which one would be faster:
[
  {
    school_id: xx,
    results: [
      {
        party_id: xx,
        votes: xx
      }, [...]
    ]
  }, [...]
]

[ // use school_id as index to the array
  [
    {
      party_id: xx,
      votes: xx
    }, [...]
  ], [...]
]
The question is whether JS is smart enough to optimize random array access.
Any tool you could advise me to use to test the performance would also be most welcome.
These questions are always engine-dependent. In V8 (Google Chrome, Node.js):
Objects and Arrays are not radically different. For implementation simplicity, all objects have an external elements array where properties that are non-negative integers are stored.
So when you do obj[5], it doesn't matter whether obj is a JavaScript Array or any other JavaScript object: the lookup goes to the object's external elements array.
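For instance (a quick sketch; the values are illustrative):

```javascript
// Integer keys work on any object, not just Arrays.
var obj = { 5: "from object" };
var arr = [];
arr[5] = "from array";

// Both lookups read from the object's elements store.
console.log(obj[5]); // "from object"
console.log(arr[5]); // "from array"

// One visible difference: Arrays track a length property.
console.log(arr.length); // 6
console.log(obj.length); // undefined
```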
So if you created an object like this:
var a = {
  a: 3,
  b: 4,
  c: {},
  5: 5,
  6: 6
};
The object layout will be:
[HiddenClassPointer, PropertiesArrayPointer, ElementsArrayPointer, TaggedSmallInteger(3), TaggedSmallInteger(4), JSObjectPointer]
Note how the named fields are stored side by side with the internal fields. If you now add any property after the fact, it will go into the external properties array pointed to by the second field instead of being stored on the object directly.
The "fields" with the integer key would be in the external elements array pointed to by ElementsArrayPointer like this:
[HiddenClassPointer, TaggedSmallInteger(25), TheHolePointer, TheHolePointer, TheHolePointer, TheHolePointer, TheHolePointer, TaggedSmallInteger(5), TaggedSmallInteger(6), ...more hole pointers until 25 elements]
The 25 is the length of the backing array; I will come back to that soon.
The hole pointer is needed to disambiguate between explicit undefined values stored by the user and actual holes in the array. When you try to retrieve a[3], it returns undefined because there is a hole there; the actual hole object is never exposed to user code. So there are actually 3 different types of null :P
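You can observe the hole/undefined distinction from user code (a small sketch):

```javascript
var a = [];
a[7] = 5;          // indexes 0..6 become holes
a[3] = undefined;  // index 3 now holds an explicit undefined

// Plain reads look identical...
console.log(a[3]); // undefined
console.log(a[4]); // undefined

// ...but `in` (or hasOwnProperty) can tell a hole from a stored undefined:
console.log(3 in a); // true  (explicit undefined)
console.log(4 in a); // false (hole)
```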
The initial length of 25 comes from the formula initial_index + ((initial_index + 1) / 2) + 16, using integer division: 6 + 3 + 16 = 25. You can see it in a heap snapshot:
( 108 - 8 ) / 4 === 25 (a 108-byte backing store minus an 8-byte header, at 4 bytes per slot)
Write a test using JSPerf. You can test a number of different scenarios using it.
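Even without JSPerf, a rough local sanity check is easy to sketch in Node (the sizes and the timing helper below are illustrative, not a rigorous benchmark):

```javascript
// Quick-and-dirty timing sketch; for publishable numbers prefer a
// proper harness (JSPerf, Benchmark.js) that handles warm-up and noise.
function time(label, fn) {
  var start = Date.now();
  fn();
  console.log(label + ": " + (Date.now() - start) + "ms");
}

var N = 20000, LOOKUPS = 1e6;
var arr = [], obj = {};
for (var i = 0; i < N; i++) { arr[i] = i; obj[i] = i; }

time("array index", function () {
  var sum = 0;
  for (var j = 0; j < LOOKUPS; j++) sum += arr[j % N];
});
time("object key", function () {
  var sum = 0;
  for (var j = 0; j < LOOKUPS; j++) sum += obj[j % N];
});
```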
Related
I wanted to use a set in JavaScript, and I found in the docs that we have Set in JS.
But when I tried to use it, it doesn't return the elements in sorted order.
let mySet1 = new Set()
mySet1.add(1) // Set [ 1 ]
mySet1.add(5) // Set [ 1, 5 ]
mySet1.add(5) // Set [ 1, 5 ]
mySet1.add(15)
mySet1.add(150)
mySet1.add(23)
mySet1.add(45)
console.log(mySet1)
Set(6) {1, 5, 15, 150, 23, …}
[[Entries]]
0: 1
1: 5
2: 15
3: 150
4: 23
5: 45
size: (...)
Isn't the Set implemented using a Binary Search Tree-like structure in JavaScript, as in other languages like C++ and Java?
What will be the time complexity for insertion and deletion here?
Or is it a HashSet?
Isn't the Set implemented using a Binary Search Tree-like structure in JavaScript, as in other languages like C++ and Java?
No, because that wouldn't make sense here: sets very often hold non-numeric values, and may even hold values with no meaningful ordering, like functions and class instances. A binary search tree requires a total order on its elements, and JavaScript defines no such ordering for arbitrary values (numbers and strings are comparable; most other values are not).
But when I tried to use it, it doesn't return the elements in sorted order.
The order in which values are iterated over is their insertion order, not their sorted order.
What will be the time complexity for insertion and deletion here?
See here. Access is required to be, on average, sublinear in the number of elements. Past that, it is up to each individual implementation; in practice, engines use hash tables, so expect roughly constant-time insertion and deletion.
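If you do need the values in sorted order, copy the Set into an array and sort it yourself; for example:

```javascript
// A Set iterates in insertion order; to read values sorted,
// spread it into an array and sort that.
let mySet = new Set([1, 5, 15, 150, 23, 45]);

let sorted = [...mySet].sort((a, b) => a - b);
console.log(sorted); // [1, 5, 15, 23, 45, 150]
```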
I'm dealing with the library qs in Node.js, which lets you stringify and parse query strings.
For example, if I want to send a query with an array of items, I would do qs.stringify({ items: [1,2,3] }), which would send this as my query string:
http://example.com/route?items[0]=1&items[1]=2&items[2]=3
(Encoded URI would be items%5B0%5D%3D1%26items%5B1%5D%3D2%26items%5B2%5D%3D3)
When I do qs.parse(url) on the server, I'd get the original object back:
let query = qs.parse(url) // => { items: [1,2,3] }
However, the default size of the array for qs is limited to 20, according to the docs:
qs will also limit specifying indices in an array to a maximum index of 20. Any array members with an index of greater than 20 will instead be converted to an object with the index as the key
This means that if I have more than 20 items in the array, qs.parse will give me an object like this (instead of the array that I expected):
{ items: { '0': 1, '1': 2 ...plus 19 more items } }
I can override this behavior by setting a param, like this: qs.parse(url, { arrayLimit: 1000 }), and this would allow a max array size of 1,000 for example. This would, thus, turn an array of 1,001 items into a plain old JavaScript object.
According to this github issue, the limit might be for "security considerations" (same in this other github issue).
My questions:
If the default limit of 20 is meant to help mitigate a DoS attack, how is turning an array of over 20 items into a plain old JavaScript object supposed to help anything? (Does the object take less memory or something?)
If the above is true, even if there is an array limit of, say, 20, couldn't the attacker just send more requests and still get the same DoS effect? (The number of requests necessary to be sent would decrease linearly with the size limit of the array, I suppose... so I guess the "impact" or load of a single request would be lower)
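For illustration only, here is a rough sketch of the conversion the qs docs describe; collectIndexed is a made-up stand-in for this answer, not the real qs parser:

```javascript
// Illustrative stand-in for qs's index handling (NOT the real library).
// Indexed keys up to arrayLimit produce an array; past it, a plain
// object keyed by the index strings.
function collectIndexed(pairs, arrayLimit) {
  var maxIndex = Math.max.apply(null, Object.keys(pairs).map(Number));
  if (maxIndex <= arrayLimit) {
    var out = [];
    for (var k in pairs) out[Number(k)] = pairs[k];
    return out;
  }
  return Object.assign({}, pairs); // fall back to an object
}

console.log(collectIndexed({ 0: 1, 1: 2, 2: 3 }, 20)); // [1, 2, 3]
console.log(collectIndexed({ 0: 1, 21: 2 }, 20));      // { '0': 1, '21': 2 }
```

Note the point of the limit: `items[999999999]=1` would otherwise force allocation of a huge, mostly empty array, while the object form stores only the two keys actually sent.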
Initializing a map having string keys can be performed as follows in JavaScript:
var xxx = {
  "aaa" : ["a1","a2","a3"],
  "bbb" : ["b1","b2","b3"],
  "ccc" : ["c1","c2","c3"],
  ...
};
I need to initialize some map with integer keys, something like:
var xxx = {
  0 : ["a1","a2","a3"],
  1 : ["b1","b2","b3"],
  2 : ["c1","c2","c3"],
  ...
};
Of course, I could proceed with an array like this:
xxx[0]=["a1","a2","a3"];
xxx[1]=["b1","b2","b3"];
xxx[2]=["c1","c2","c3"];
...
but the issue is that it makes the initialization code long. I need to squeeze out every byte I can, because I need to push this object to the user side, and every saved byte counts.
The xxx object needs to be initialized with n arrays, and each entry has a unique associated id between 0 and n-1. So there is a one-to-one mapping between the ids and the arrays, and the ids are consecutive from 0 to n-1, which makes me think I could use an array instead of a JavaScript 'map'.
I have noticed that one can push objects into an array. Maybe I could use something like this:
var xxx = [];
xxx.push(["a1","a2","a3"],["b1","b2","b3"],["c1","c2","c3"],...);
Is this a proper way to achieve my objective, or is there something smarter in JavaScript? Thanks.
P.S.: Later, xxx will be referenced with something like xxx[2], xxx[10], etc...
Unless you need to add additional properties on xxx, this strikes me as cleaner than using string or integer keys:
var xxx = [
["a1","a2","a3"],
["b1","b2","b3"],
["c1","c2","c3"]
//...
];
Just make xxx into an array containing arrays. You can get at, say, "b3" via xxx[1][2] (because xxx[1] == ["b1", "b2", "b3"]).
I have a set of variables representing prices and numbers of items sold in a 2D array. I have sorted it in order to find the lowest price.
I'd like to set the second variable (number sold) of the first item (player A) to a value (200) by referring to the array.
For example:
var playerASold;
var arr=[
[playerAPrice,playerASold],
[playerBPrice,playerBSold],
[playerCPrice,playerCSold]];
arr[0][1]=200;
this doesn't work, probably because playerASold currently has a value of 0, so the assignment just replaces that stored 0 with 200 rather than touching the variable.
How do I refer to the variable and not the value of the variable?
JavaScript has no notion of C's pointers or C++'s references, so you'll have to do it in a different way. Rather than trying to store references in an array, try making the array the sole holder of the data.
That might look like this:
var players = [
{ price: 5, sold: 1 },
{ price: 3, sold: 6 },
{ price: 9, sold: 2 }
];
Then rather than, say, playerBSold, you can use players[1].sold. Now you can use a variable in place of that 1 if you wanted.
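Putting it together with the sorting use case from the question (a sketch; the prices are made up):

```javascript
var players = [
  { price: 5, sold: 1 },
  { price: 3, sold: 6 },
  { price: 9, sold: 2 }
];

// Sort by price to find the cheapest, then update its sold count in place.
players.sort(function (a, b) { return a.price - b.price; });
players[0].sold = 200;

console.log(players[0]); // { price: 3, sold: 200 }
```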
Just take a close look at your code: you are setting a value in the array, replacing the arr[0][1] item. Previously it held the value of playerASold (i.e. 0) and now it holds 200, so nothing is ever assigned to playerASold. Store named properties in objects instead:
var arr = [
  { playerAPrice: 0, playerASold: 0 },
  { playerBPrice: 0, playerBSold: 0 },
  { playerCPrice: 0, playerCSold: 0 }
];
and use this:
arr[0].playerASold = 200;
JavaScript primitives (in this case Number) are immutable; that is, you can't change their values. Operations on numbers create new numbers. Objects, however, are mutable.
As icktoofay suggested, refactoring as mutable Player objects with price and sold properties is probably a good idea here.
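A quick sketch of the value-vs-reference distinction (the variable names mirror the question):

```javascript
// Primitives are copied by value: the array stores the value 0,
// not a reference to the playerASold variable.
var playerASold = 0;
var arr = [[10, playerASold]];
arr[0][1] = 200;
console.log(playerASold); // still 0

// Objects are mutable, so a shared reference sees the update.
var player = { sold: 0 };
var arr2 = [player];
arr2[0].sold = 200;
console.log(player.sold); // 200
```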
I am coding a lot of annual data in JavaScript, and I was considering adding it to arrays, using the year as the array index and putting the data into the array. However, Firebug seems to indicate that JavaScript handles this by populating the two thousand-odd preceding entries in the array with "undefined". With hundreds of such arrays kicking around in active memory, I'm worried that the overhead of hundreds of thousands of useless array items could start to slow the program down. Will it?
When you set the value of a numeric index higher than the current length of your array, the length property is affected.
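You can verify that only length changes and no filler entries are created (a quick sketch):

```javascript
var data = [];
data[2009] = "2009 data"; // only one element is actually stored...

console.log(data.length);       // 2010 -- length jumped, but
console.log(Object.keys(data)); // ["2009"] -- no filler entries exist
```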
In brief, you should use an Object:
var data = {};
data[year] = "some data";
// or
var data = {
  2009: "2009 data",
  2010: "2010 data"
};
Now I answer the question title: "Does JavaScript populate empty array items?"
No. As I said before, only the length property is changed: if the added index is larger than the current length, length is set to one more than the numeric value of that index.
The Array.prototype methods work on the assumption that the array's indexes start from zero.
The skipped indexes don't really exist in the Array object; you can test it:
var array = [];
array[10] = undefined;
array.hasOwnProperty(10); // true
array.hasOwnProperty(9); // false
In conclusion, arrays are meant to contain sequential indexes, starting from zero, if your properties don't meet those requirements, you should simply use an object.
Yes, most likely. You should consider using a JavaScript object instead:
var years = {2009: 'Good', 2010: 'Better'};
Well, if you iterate over many thousands of undefined values, it will affect overall program speed, though you may not notice it.
On the other hand, sometimes a sparse array is simpler to use than a custom object, and arrays have such handy methods available.
In a calendar application I begin with an object for each year in use, but each year contains a twelve-member months array, and each 'month' is a sparse array of significant dates whose length depends on the highest date in that month that has any data.