I am coding a lot of annual data in JavaScript, and I was considering adding it to arrays, using the year as the array index and putting the data into the array. However, Firebug seems to be indicating that JavaScript handles this by populating two thousand odd entries in the array with "undefined." With hundreds of such arrays kicking around in active memory, I'm worried the overhead of hundreds of thousands of useless array items could start to slow the program down. Will it?
When you set the value of a numeric index higher than the current length of your array, the length property is affected.
In brief, you should use an Object:
var data = {};
data[year] = "some data";
// or
var data = {
    2009: "2009 data",
    2010: "2010 data"
};
Now, to answer the question in the title: "Does JavaScript populate empty array items?"
No. As noted above, only the length property is changed: if the index you assign is larger than the current length, length is set to one more than the numeric value of that index.
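For example, a quick check you can run in a console (the variable names here are just for illustration):
var arr = [];
arr[2009] = 'some data';
console.log(arr.length);        // 2010 -- one more than the highest index
console.log(Object.keys(arr));  // ['2009'] -- only one own property actually exists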
The Array.prototype methods work assuming that the array object will have its indexes starting from zero.
The lower indexes don't really exist in the Array object; you can test this yourself:
var array = [];
array[10] = undefined;
array.hasOwnProperty(10); // true
array.hasOwnProperty(9); // false
In conclusion, arrays are meant to hold sequential indexes starting from zero. If your properties don't meet that requirement, you should simply use an object.
Yes, most likely. You should consider using a JavaScript object instead:
var years = {2009: 'Good', 2010: 'Better'};
Well, if you iterate over many thousands of undefined entries it will affect overall program speed, though I'm not sure you would notice it.
On the other hand, sometimes a sparse array is simpler to use than a custom object,
and arrays have such handy methods available.
In a calendar application I begin with an object for each year in use, but each year contains a twelve-member months array, and each 'month' is a sparse array of significant dates whose length depends on the highest date in that month that has any data.
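A rough sketch of that kind of structure, with purely illustrative names (this is not code from any particular application):
// A year holds a 12-slot months array; each 'month' is a sparse array keyed by date
var year = { months: new Array(12) };

function addEvent(monthIndex, date, info) {
    if (!year.months[monthIndex]) {
        year.months[monthIndex] = [];     // create the month's sparse array on demand
    }
    year.months[monthIndex][date] = info; // length becomes the highest date with data, plus one
}

addEvent(2, 17, 'Some event');            // March 17th
console.log(year.months[2].length);       // 18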
const arrPassword = []
const passarrayLength = 5
function addPassword(passwd) {
    if (arrPassword.length === passarrayLength) {
        arrPassword.shift();
    }
    arrPassword.push(passwd);
    console.log(arrPassword.length);
    console.log(arrPassword);
}
addPassword('Pass')
addPassword('Pass2')
addPassword('Pass3')
addPassword('Pass4')
addPassword('Pass5')
addPassword('Pass6')
addPassword('Pass7')
addPassword('Pass8')
addPassword('Pass9')
addPassword('Pass10')
I have a few cases where I want to store objects, such as a user's password history, in an array to ensure, for example, that they have not reused any of their last 5 passwords. My question is: can I specify an array of objects with a size of 5, then just push new passwords onto it, and have any item beyond that size be discarded automatically? Or do I have to do this myself, counting the objects in the array and, if the count equals the maximum size, removing the oldest one before I push the new object?
Based on the research I did, TypeScript/JavaScript does not have fixed-size arrays. I can declare an array of, say, 5 objects, but I would need to assign all 5, and even then a push would make it 6 objects, since there is no limit.
So what would be the best approach to handle this ?
I included the basic concept I came up with above.
Can I specify an array of objects with a size of 5 and then just push new passwords to array and any object in the array above size set would be discarded?
Nope. Arrays in javascript do not have a maximum size, so there is no "built in" way to do this.
Or do I have to do this my self where I count the objects in my Array and if it is = max size I pop the oldest one before I push the new object to array?
Yep, that's exactly right. It shouldn't be too hard to make a little class that handles this logic.
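A minimal sketch of such a class, assuming it only needs to cap the history length and answer membership questions (the names are illustrative, not from the question):
class PasswordHistory {
    constructor(maxSize) {
        this.maxSize = maxSize;
        this.items = [];
    }

    add(passwd) {
        if (this.items.length === this.maxSize) {
            this.items.shift(); // drop the oldest entry
        }
        this.items.push(passwd);
    }

    contains(passwd) {
        return this.items.includes(passwd);
    }
}

const history = new PasswordHistory(5);
history.add('Pass1');
console.log(history.contains('Pass1')); // true
In a real application you would store and compare password hashes rather than plain strings, but the size-capping logic stays the same.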
When I need some functionality and it doesn't seem to exist, the first thing I ask myself is "what am I missing?".
In this particular case all you need to do is take the last passarrayLength items from your arrPassword array and reassign the result to arrPassword (note that arrPassword would have to be declared with let rather than const for the reassignment to work), like so:
arrPassword = arrPassword.slice(-passarrayLength);
For example:
[1,2,3].slice(-5); // <- [1,2,3]
[1,2,3,4,5,6,7,8,9].slice(-5); // <- [5,6,7,8,9]
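Applied to the addPassword function from the question, a sketch could look like this (arrPassword is declared with let here so it can be reassigned):
let arrPassword = [];
const passarrayLength = 5;

function addPassword(passwd) {
    arrPassword.push(passwd);
    arrPassword = arrPassword.slice(-passarrayLength); // keep only the most recent entries
}

addPassword('Pass');
// ...after more than five calls, only the last five passwords remain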
To preface this: a solution only needs to work in the latest versions of Chrome and Firefox, with Safari as a bonus.
-
I am trying to use an associative array for a large data set with knockout. My first try made it a true associative array:
[1: {Object}, 3: {Object},...,n:{Object}]
but knockout was not happy looping over that. So I tried a cheat, hoping that this would work:
[undefined, {Object}, undefined, {Object},...,{Object}]
where the location in the array is the PK ID from the database table. This array is about 3.2k items large, and would be iterated over around every 10 seconds, hence the need for speed. I tried doing this with a splice, e.g.
$.each(data, function (index, item) {
    self.myArray.splice(item.PKID, 0, new Object(item));
});
but splice does not create indices, so since my first PKID is 1, it is still inserted at myArray[0] regardless. If my first PK was 500, it would start at 0 still.
My second thought is to initialize the array with var myArray = new Array(maxSize) but that seems heavy handed. I would love to be able to use some sort of map function to do this, but I'm not really sure how to make the key value translate into an index value in javascript.
My third thought was to keep two arrays: one for easy lookup and the other to store the actual values. This more or less combines the first two solutions, by finding the index of the object as in the first example and using it for a lookup as in the second. This seems to be how many people manage associative arrays in knockout, but with the array size, and the fact that it's a live-updating app with a growing data set, it seems memory intensive and not easily manageable when new information is added.
Also, maybe I'm aiming at the wrong target here? We're putting these into the DOM via knockout and managing them with a library called Isotope, and as I mentioned it updates about every 10 seconds. That's why I need the fast lookup, but knockout doesn't want to play with my hash table attempts.
--
clarity edits:
On initial load the whole array is loaded up (which is where the new Array(maxLength) would go); then every 10 seconds anything that has changed is loaded back. That is the information I'm trying to update quickly.
--
knockout code:
<!-- ko foreach: {data: myArray(), afterRender: setInitialTileColor } -->
<div class="tile" data-bind="attr: {id: 'tileID' + $data.PKID()}">
    <div class="content">
    </div>
</div>
<!-- /ko -->
Then on updates the hope is:
$.each(data.Updated, function (index, item) {
    var obj = myModel.myArray()[item.PKID];
    // do updates here - need to check what kind of change, how long it's been since a change, etc.
});
Here is a solution for populating array items at their correct indexes, so the array doesn't have to start from the first index (0, zero).
Just use this in your loop:
arr[obj.PKID] = obj;
and if your framework is smart enough to use forEach rather than a plain for loop, it will start from your first index (500 in the example below):
http://jsfiddle.net/0axo9Lgp/
var data = [], new_data = [];

// Generate a sample array of objects with an index field
for (var i = 500; i < 3700; i++) {
    data.push({
        PKID: i,
        value: '1'
    });
}

data.forEach(function(item) {
    new_data[item.PKID] = item;
});

console.log(new_data);
console.log(new_data.length); // 3700, but only 3200 items are actually populated; indexes 0-499 are holes
It's not an easy problem to solve. I'm assuming you've tried (or can't try) the obvious stuff like reducing the number of items per page and possibly using a different framework like React or Mithril.
There are a couple of basic optimizations I can suggest.
Don't use the framework's each. It's either slower than or the same as the native Array method forEach, and either way it's slower than a basic for loop.
Don't loop over the array again and again looking for every item whose data has been updated. When you send your response of data updates, send along an array of the PK IDs of the updated items. Then do a single loop:
var indexes = [];
var updated = JSON.parse(response).updated; // example array of updated PK IDs
for (var i = 0; i < allElements.length; i++) {
    if (updated.indexOf(allElements[i].pkid) > -1) {
        indexes.push(i);
    }
}
So, basically the above assumes you have a simple array of objects, where each object has a property called pkid that stores its ID. When you get a response, you loop over this array once, storing the indexes of all items that match a pk-id in the array of updated pk-ids.
Then you only have to loop over the indexes array and use its elements as indexes on the allElements array to apply the direct updates.
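For example, a possible continuation of the loop above (the actual update body depends on your data):
// Apply the changes using the collected indexes
for (var j = 0; j < indexes.length; j++) {
    var element = allElements[indexes[j]];
    // ...apply the changed fields to element here
}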
If your indexes are integers in a reasonable range, you can just use an array. It does not have to be completely populated; you can use the if binding to filter out unused entries.
Applying updates is just a matter of indexing the array.
http://jsfiddle.net/0axo9Lgp/2/
You may want to consider using the publish-subscribe pattern. Have each item subscribe to its unique ID. When an item needs updating it will get the event and update itself. This library may be helpful for this. It doesn't depend upon browser events, just arrays so it should be fairly fast.
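A minimal, library-agnostic sketch of the pattern (all names here are illustrative):
var subscribers = {};   // PKID -> update callback

function subscribe(pkid, callback) {
    subscribers[pkid] = callback;
}

function publish(updatedItem) {
    var callback = subscribers[updatedItem.PKID];
    if (callback) {
        callback(updatedItem); // only the affected item reacts
    }
}

// each tile subscribes once with its own update handler
subscribe(500, function (item) { /* update this tile's observables */ });

// every 10 seconds, publish only what changed
// data.Updated.forEach(publish);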
My array:
var tempListArray = [{"id":"12","value":false},{"id":"10","value":false},{"id":"9","value":false},{"id":"8","value":false}];
To check if an element exists I would do this:
for (var i in tempListArray) {
    // check flag
    if (tempListArray[i].id == Id) {
        flagExistsLoop = 1;
        break;
    }
}
Is there any way I can check whether an ID exists without looping through the whole array? Basically I am worried about performance if, say, I have 100 elements.
Thanks
No, without using a custom dictionary object (which you seriously don't want for this) there's no faster way than doing a 'full scan' of all contained objects.
As a general rule of thumb, don't worry about performance in any language or any situation until the total number of iterations hits 5 digits, most often 6 or 7. Scanning a table of 100 elements should be a few milliseconds at worst. Worrying about performance impact before you have noticed performance impact is one of the worst kinds of premature optimization.
No, you can't know that without iterating the array.
However, note that for...in loops are a bad way of iterating arrays:
There is no guarantee that it will iterate the array in order
It will also iterate over (enumerable) non-numeric own properties
It will also iterate over (enumerable) properties inherited from the prototype, i.e., ones defined on Array.prototype and Object.prototype
I would use one of these:
for loop with a numeric index:
for (var i = 0; i < tempListArray.length; ++i) {
    if (tempListArray[i].id == Id) {
        flagExistsLoop = 1;
        break;
    }
}
Array.prototype.some (ECMAScript 5):
var flagExistsLoop = tempListArray.some(function(item) {
    return item.id == Id;
});
Note it may be slower than the other ones because it calls a function at each step.
for...of loop (ECMAScript 6):
for (var item of tempListArray) {
    if (item.id == Id) {
        flagExistsLoop = 1;
        break;
    }
}
Depending on your scenario, you may be able to use Array.indexOf() which will return -1 if the item is not present.
Granted, it is probably iterating behind the scenes, but the code is much cleaner. Also note how object comparisons work in JavaScript: two distinct objects are not equal even though their values may be equal. See below:
var tempListArray = [{"id":"12","value":false},{"id":"10","value":false},{"id":"9","value":false},{"id":"8","value":false}];
var check1 = tempListArray[2];
var check2 = {"id":"9","value":false};

doCheck(tempListArray, check1);
doCheck(tempListArray, check2);

function doCheck(array, item) {
    var index = array.indexOf(item);
    if (index === -1)
        document.write("not in array<br/>");
    else
        document.write("exists at index " + index + "<br/>");
}
You could try php.js. It may help, since you can use the same PHP function names, and it has some useful functionality.
There is no way without iterating through the elements (that would be magic).
But, you could consider using an object instead of an array. The object would use the (presumably unique) id value as the key, and the value could have the same structure you have now (or without the redundant id property). This way, you can efficiently determine if the id already exists.
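A sketch of that idea, reusing tempListArray and Id from the question:
// Build the id-keyed object once...
var byId = {};
for (var i = 0; i < tempListArray.length; i++) {
    byId[tempListArray[i].id] = tempListArray[i];
}

// ...then existence checks no longer need to scan the array
var flagExists = byId.hasOwnProperty(Id);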
There is a possible cheat for limited cases :) and it is magic...cough cough (math)
Imagine you have 3 elements:
1
2
3
and you want to know if one of these is in an array without iterating it...
We could build a number that acts as a numerical "flavor" of the array. We do this by assigning a prime number to each possible element:
element 1 -> prime 2
element 2 -> prime 3
element 3 -> prime 5
Then, when we add item 2 to the array, we first check that the flavor doesn't already contain the prime associated with that item, by checking if (Flavor != 0 && Flavor % 3 != 0), and only then multiply the prime in: Flavor *= 3;.
Now we can tell that the second element is in the array just by looking at the number:
if (Flavor != 0 && Flavor % 3 == 0) // it's there!
Of course this is limited by the numerical range the computer can handle, and for small arrays (1-3 elements) it might still be faster to just scan; it's only one idea.
The basis is sound, though. However, this method becomes unusable if you cannot correlate elements one-to-one with a set of primes. You'll want to have the primes calculated in advance, and to verify that their product stays below the maximum representable number. (Also be careful with floating point, because at higher values it may not be able to represent the number exactly due to the gaps between representable values.) You'll probably have the best luck with an unsigned integer type.
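A hedged sketch of the idea; note it starts the flavor at 1 (rather than 0) so the products accumulate correctly:
var primeFor = { 1: 2, 2: 3, 3: 5 }; // element -> pre-assigned prime

var flavor = 1;                      // represents an empty array

function addToFlavor(element) {
    var p = primeFor[element];
    if (flavor % p !== 0) {          // skip if the element is already represented
        flavor *= p;
    }
}

function flavorContains(element) {
    return flavor % primeFor[element] === 0;
}

addToFlavor(2);
console.log(flavorContains(2)); // true
console.log(flavorContains(3)); // false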
This method will probably be too limiting. And there is something else you can do to possibly speed up your system if you don't want to iterate the entire array.
Use different structures:
dictionaries/maps/trees etc.
If you're attached to the array, another option is a Bloom filter. This will tell you when an element is definitely not in your set, which can be just as useful.
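If you want to experiment with that, here is a very small Bloom filter sketch; the bit count and hash functions are arbitrary choices for illustration, not a library API:
function BloomFilter(size) {
    this.size = size;
    this.bits = [];
    for (var i = 0; i < size; i++) this.bits[i] = false;
}

BloomFilter.prototype.hashes = function (str) {
    var h1 = 0, h2 = 0;
    for (var i = 0; i < str.length; i++) {
        h1 = (h1 * 31 + str.charCodeAt(i)) % this.size;
        h2 = (h2 * 17 + str.charCodeAt(i)) % this.size;
    }
    return [h1, h2];
};

BloomFilter.prototype.add = function (str) {
    var h = this.hashes(str);
    this.bits[h[0]] = true;
    this.bits[h[1]] = true;
};

BloomFilter.prototype.mightContain = function (str) {
    var h = this.hashes(str);
    return this.bits[h[0]] && this.bits[h[1]];
};

var filter = new BloomFilter(256);
filter.add('12');
console.log(filter.mightContain('12')); // true
console.log(filter.mightContain('99')); // false here (false positives are possible, false negatives are not)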
What is the performance difference between retrieving the value by key in a JavaScript object vs iterating over an array of individual JavaScript objects?
In my case, I have a JavaScript object containing user information where the keys are the user's IDs and the values are each user's information.
The reason I ask this is because I would like to use the angular-ui-select module to select users, but I can't use that module with a Javascript object - it requires an array.
How much, if anything, am I sacrificing by switching from a lookup by key, to a lookup by iteration?
By key:
var user = users[id];
By iteration:
var user;
for (var i = 0; i < users.length; i++) {
    if (users[i].id == id) {
        user = users[i];
        break;
    }
}
The answer to this is browser dependent, however, there are a few performance tests on jsperf.com on this matter. It also comes down to the size of your data. Generally it is faster to use object key value pairs when you have large amounts of data. For small datasets, arrays can be faster.
Array search will have different performance depending on where in the array the target item is. Object search will have more consistent performance, since keys don't have a specific order.
Also, looping through an array is faster than looping through an object's keys, so if you plan on doing operations on all items, it can be wise to put them in an array. In some of my projects I do both, since I need bulk operations as well as fast lookup by identifier.
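A sketch of what keeping both structures in sync can look like (the names are illustrative):
var userList = [];   // array for bulk operations and for modules that require an array
var usersById = {};  // object for O(1) lookup by id

function addUser(user) {
    userList.push(user);
    usersById[user.id] = user;
}

addUser({ id: 42, name: 'Ada' });
var user = usersById[42];            // fast lookup by key
userList.forEach(function (u) {      // iteration for bulk work or UI binding
    // ...
});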
A test:
http://jsben.ch/#/Y9jDP
This problem touches all programming languages. It depends on many factors:
size of your collection - array search gets slower when you are searching for the last key and the array is quite long
can elements repeat themselves - if yes, then you need an array; if no, you need either a dictionary (map), or you need to write an add method that iterates your array on every add looking for duplicates, which can be troublesome with large lists
average key usage - you will lose performance if the most-requested userId is at the end of the list
In your example map would be a better solution.
Secondly, you need to add a break to your code :)
var user;
for (var i = 0; i < users.length; i++) {
    if (users[i].id == id) {
        user = users[i];
        break;
    }
}
Otherwise you will lose performance :)
Associative arrays are much slower than arrays with numbered indexes, because associative lookups work with string keys, which are much slower to compare than numeric indexes.
Is it possible to turn a column of a multidimensional array into a row using JavaScript (maybe jQuery), without looping through it?
so in the example below:
var data = new Array();
//data is a 2D array
data.push([name1,id1,major1]);
data.push([name2,id2,major2]);
data.push([name3,id3,major3]);
//etc..
Is it possible to get a list of IDs from data without looping? Thanks.
No, it is not possible to construct an array of IDs without looping.
In case you were wondering, you'd do it like this:
var ids = [];
for (var i = 0; i < data.length; i++)
    ids.push(data[i][1]);
For better structural integrity, I'd suggest using an array of objects, like so:
data.push({"name": name1, "id": id1, "major":major1});
data.push({"name": name2, "id": id2, "major":major2});
data.push({"name": name3, "id": id3, "major":major3});
Then iterate through it like so:
var ids = [];
for (var i = 0; i < data.length; i++)
    ids.push(data[i].id);
JavaScript doesn't really have multidimensional arrays. What JavaScript allows you to have is an array of arrays, with which you can interact as if it was a multidimensional array.
As for your main question, no, you would have to loop through the array to get the list of IDs. It means that such an operation cannot be done faster than in linear time O(n), where n is the height of the "2D array".
Also keep in mind that arrays in JavaScript are not necessarily represented in memory as contiguous blocks. Therefore any fast operations that you might be familiar with in other low level languages will not apply. The JavaScript programmer should treat arrays as Hash Tables, where the elements are simply key/value pairs, and the keys are the indices (0, 1, 2...). You can still access/write elements in constant time O(1) (at least in modern JavaScript engines), but copying of elements will often be done in O(n).
You could use the Array map function which does the looping for you:
var ids = data.map(function(x) { return x[1] });
Unfortunately, like everything else on the web that would be really nice to use, INTERNET EXPLORER (before version 9) DOESN'T SUPPORT IT.
See this page for details on how the map function works:
https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/Array/map
The good news is that the link above provides some nice code in the "Compatibility" section which will check for the existence of Array.prototype.map and define it if it's missing.
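A stripped-down fallback in the same spirit (the MDN page has a fuller, spec-compliant polyfill; this is only a sketch):
if (!Array.prototype.map) {
    Array.prototype.map = function (callback, thisArg) {
        var result = new Array(this.length);
        for (var i = 0; i < this.length; i++) {
            if (i in this) { // skip holes in sparse arrays
                result[i] = callback.call(thisArg, this[i], i, this);
            }
        }
        return result;
    };
}

var ids = data.map(function (x) { return x[1]; }); // now works even where map was missing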
You don't need anything special: make one string by joining with newlines, and match the middle field of each line.
var data1 = [['Tom Swift','gf102387','Electronic Arts'],
             ['Bob White','ea3784567','Culinary Arts'],
             ['Frank Open','bc87987','Janitorial Arts'],
             ['Sam Sneer','qw10214','Some Other Arts']];

// a trailing newline is appended so the last row's ID is matched too
(data1.join('\n') + '\n').match(/([^,]+)(?=,[^,]+\n)/g);
/* returned value: (Array)
["gf102387", "ea3784567", "bc87987", "qw10214"]
*/