$watchCollection() with Nested Arrays

I have a nested array of the form:
$scope.itinerary = [
  [
    {name: 'x'},
    {name: 'y'},
    {name: 'z'}
  ],
  [
    {name: 'a'},
    {name: 'b'},
    {name: 'c'}
  ]
];
And I am doing a $watchCollection using the following:
$scope.$watchCollection(
  function () {
    return $scope.itinerary;
  },
  function () {
    console.log("Changed");
  }
);
But the console.log() only executes if one of the sub-arrays is deleted or a new sub-array is inserted. If I move an element from one sub-array to another, nothing happens (e.g. when I move {name:'a'} from one sub-array to the other). How do I put a watch on the nested array?

Use deep watch
The $watch() function takes a third, optional argument for "object equality." If you pass in true for this argument, AngularJS will perform a deep object-tree comparison. This means that within each $digest, AngularJS checks whether the new and old values have the same structure (not just the same physical reference). This allows you to monitor a larger landscape; however, the deep object-tree comparison is far more computationally expensive.
$scope.$watch('itinerary', function (newVal, oldVal) {
  console.log(newVal);
}, true);

Rather than using $watchCollection, use $watch with the third argument set to true.
This works, but it is also a bad idea for performance if the array is large, so use it with caution.
Comparison is done using angular.equals, against a copy of the previous value saved with angular.copy.
More details at https://docs.angularjs.org/api/ng/type/$rootScope.Scope#$watch
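For example (a sketch of the question's own scenario; moveFirst is a hypothetical helper), with the deep watch in place, moving an element between sub-arrays now fires the listener:
// Hypothetical helper: move the first element of the second
// sub-array into the first sub-array.
$scope.moveFirst = function () {
  var moved = $scope.itinerary[1].shift();
  $scope.itinerary[0].push(moved);
  // The deep watch fires on the next $digest because the object
  // tree changed, even though the array references did not.
};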

Related

Difference between returning a copy or manipulating original objects in array.prototype.map (In RxJS pipe)

I am working on an Angular 9, RxJS 6 app and have a question regarding the different outcomes of piping subject values and doing unit conversion in that pipe.
Please have a look at this stackblitz. There, inside the backend.service.ts file, an observable is created that does some "unit conversion" and returns everything that is emitted to the _commodities Subject. If you look at the convertCommodityUnits function, please note that I commented out the working example and left in the way I solved it initially.
My question: when you use the unsubscribe buttons on the screen and then subscribe again, the "conversion solution" that just overwrites the object without making a copy converts the values in the HTML multiple times, so the pipe does not operate on the original data that the subject provides. If you use the other code, creating a clone of the commodity object inside convertCommodityUnits, it works as expected.
Now, I don't understand why the two ways of converting the data behave so differently. I get that one manipulates the data directly, because JS does call by sharing, while the other returns a new object. But the object passed to convertCommodityUnits is produced by Array.prototype.map, so it should not overwrite anything, right? I expected RxJS to pass the original, last data emitted to the subject into the pipe/map operators, but that does not seem to be the case in the example, right?
How/Why are the values converted multiple times here?
This is more or less a follow-up question on this: Angular/RxJS update piped subject manually (even if no data changed), "unit conversion in rxjs pipe", so it's the same setup.
When you use map you get a new reference for the array, but you don't get new objects inside the newly generated array (a shallow copy of the array), so you're still mutating the data inside each element.
In the destructuring solution, because each object in the array contains only primitive values, you generate completely brand-new elements each time the conversion method is called (this is important: not only a new array but also new elements in the array, i.e. you have performed a deep copy of the array). So you don't accumulate the converted values across successive subscriptions.
That doesn't mean a 1-level destructuring like the one in the provided stackblitz demo will work in all cases. I've seen this mistake made a lot, particularly in redux-pattern frameworks that require you not to mutate the stored data, like ngrx, ngxs, etc. If you had complex objects in your array, the 1-level destructuring would have left all the embedded objects in each element untouched. It's easier to describe this behavior with examples:
const obj1 = {a: 1};
const array = [{b: 2, obj: obj1}];
// after every newArray assignment in the below code,
// console.log(newArray === array) prints false to the console
let newArray = [...array];
console.log(array[0] === newArray[0]); // true
newArray = array.map(item => item);
console.log(array[0] === newArray[0]); // true
newArray = array.map(item => ({...item}));
console.log(array[0] === newArray[0]); // false
console.log(array[0].obj === newArray[0].obj); // true
newArray = array.map(item => ({
  ...item,
  obj: {...item.obj}
}));
console.log(array[0] === newArray[0]); // false
console.log(array[0].obj === newArray[0].obj); // false
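Applied to the question's setup, here is a minimal sketch (assuming RxJS 6 and a BehaviorSubject named _commodities, as in the question; the values are made up) of why mutating inside the pipe accumulates conversions across re-subscriptions:
import { BehaviorSubject } from 'rxjs';
import { map } from 'rxjs/operators';

const _commodities = new BehaviorSubject([{ name: 'gold', value: 100 }]);

// Mutating variant: every new subscription re-runs the map on the
// SAME objects held by the subject, so value is doubled again each time.
const mutating$ = _commodities.pipe(
  map(list => list.map(c => { c.value = c.value * 2; return c; }))
);

// Copying variant: each element is cloned, so the subject's objects
// stay untouched and every subscription starts from the original data.
const copying$ = _commodities.pipe(
  map(list => list.map(c => ({ ...c, value: c.value * 2 })))
);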

VueJS data initialization for non existent objects

I am using Ajax to populate data properties for several objects. As such, the properties I want to bind to do not exist at the time of binding.
eg:
<template>
  <my-list v-bind:dataid="myobject ? myobject.data_id : 0"></my-list>
</template>
<script>
export default {
  data () {
    return {
      myobject: {}
    }
  }
}
</script>
In the Vue docs (https://012.vuejs.org/guide/best-practices.html) it's recommended to initialize data instead of using an empty object.
However, I am using multiple Ajax-created objects with tens of parameters and sub-parameters. Initializing every sub-parameter on all objects like this:
myobject: { subp1: [], subp2: [] ...}
(where myobject may be an object containing an array of objects, or an array of objects containing sub-arrays of objects) would take quite a bit of work. Is there a better alternative when binding to 'not-yet-existing' objects?
First of all, an empty object is still "truthy", so your check here
v-bind:dataid="myobject ? myobject.data_id : 0"
always takes the truthy branch. Check instead for a property that only exists once the data has loaded (for an array, myobject.length); with that change your code should work.
Also, you really don't need to define dummy objects for an array. Vue detects whenever you mutate an array.
https://v2.vuejs.org/v2/guide/list.html#Array-Change-Detection
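As an alternative, here is a hedged sketch (fetchMyObject is a hypothetical Ajax helper): start the property as null and guard the binding in a computed property, so no dummy sub-parameters are needed:
<template>
  <my-list v-bind:dataid="dataid"></my-list>
</template>
<script>
export default {
  data () {
    return {
      myobject: null // not loaded yet
    }
  },
  computed: {
    // Falls back to 0 until the Ajax response arrives.
    dataid () {
      return this.myobject ? this.myobject.data_id : 0
    }
  },
  created () {
    // Hypothetical Ajax call that resolves with the loaded object.
    fetchMyObject().then(obj => { this.myobject = obj })
  }
}
</script>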

Angular rendering performance and optimization

Can anyone confirm or deny the below scenario, and explain your reasoning? I contend that this would cause two UI renders and is therefore less performant.
Suppose in Angular you have a data model that is hooked up to a dropdown in the UI. You start with a data model that is an array of objects, you clear the array, you re-fill the array with exactly equivalent objects that are different only in that a property has been changed:
[obj1, obj2, obj3, obj4]
// clear the array
[] // the first UI render event occurs
// you fill the array with new objects that are the same except the value
// of one property has changed from true to false
[obj1, obj2, obj3, obj4] // a second UI render event occurs
I contend that this is more performant:
[obj1, obj2, obj3, obj4]
// change a property on each object from true to false
[obj1, obj2, obj3, obj4] // a single render event occurs
Thank you for looking at this.
If the steps in your first example are supposed to run synchronously, the assumption is false. Since JavaScript is single-threaded, Angular won't get a chance to even notice that you emptied the array before re-filling it.
For example:
// $scope.model === [obj1, obj2, obj3, obj4];
$scope.model.length = 0; // clear the array
// $scope.model === [] but no UI render occurs here
$scope.model = [obj5, obj6, obj7, obj8]; //re-fill with new objects
//UI render will happen later, and angular will only see the change
//from [obj1, obj2, obj3, obj4] to [obj5, obj6, obj7, obj8]
If the changes are supposed to involve asynchronicity, the delay in these asynchronous operations is likely to take much more time than the empty array render in between, so I wouldn't be concerned about that either.
The possible performance differences come from other things, like from creating new objects or angular needing to do deep equality checks when references haven't changed.
I doubt that this would be the bottleneck of any angular app, though, so I suggest you go with whatever suits your code style better. (Especially as mutable vs immutable objects is quite an important design decision to make).
Assuming that the process is in steps which require user interaction, the steps will be as follows. Note that the numbers in the indented lists represent the high-level process that Angular uses.
1. Angular renders the view with the default array [obj1, obj2, obj3]
   1. A watcher is created which watches the array reference
   2. Angular sets up watchers on the total array of objects
   3. The watcher also watches properties of objects within the array
2. User interacts with the view, causing the array to be set to []
   1. Watcher #1 above fires on the new array reference, builds watchers for any new objects and deconstructs watchers for old references
3. User interacts again to build a new array of items with one property change: [obj1, obj2′, obj3]
   1. Watcher #1 fires, noticing the new array reference
   2. Angular builds watchers for each object and the properties of those objects within the array
In terms of speed, step #2 is essentially a no-op. What you're probably running into is the time it takes Angular to construct and deconstruct the watchers when a new array of objects is created.
When Angular sets up watching over an array of objects (e.g. for a repeater), it adds a $$hashKey property to the objects in the array. Now imagine that a new array is created which looks the same as the old one, but all of the object references in memory are new and the $$hashKey values are gone or changed. This causes Angular to fire all of the $watch statements for that scope variable.
You shouldn't be creating a new array or assigning the original scope array to a new value; this is a bad pattern. The code below shows two methods: one which causes the watchers to fire more than desirable, and a better method which only fires the watchers that need to be fired.
scope.myArr = [{n: 1, a: true}, {n: 2, a: true}, {n: 3, a: true}];

//BAD PATTERN
scope.doSomething = function (n) {
  scope.myArr = [];
  //somehow it gets a new array
  getNewArrayFromAPI(n)
    .then(function (response) {
      scope.myArr = response.data;
    });
  //[{n:1, a:true}, {n:2, a:false}, {n:3, a:true}]
};

//user interaction
scope.doSomething(2);
The following good pattern updates the array in place, never changing the references to the original objects unless it needs to add a new object to the array.
//GOOD PATTERN
scope.myArr = [{n: 1, a: true}, {n: 2, a: true}, {n: 3, a: true}];

scope.doSomething = function (n) {
  //this method shows an in-place non-HTTP change
  scope.myArr.forEach(function (curr) {
    if (curr.n === n) {
      curr.a = false;
    }
  });
  scope.getChangedArray(n);
};

//represents an HTTP call, POST or PUT maybe, that updates the array
scope.getChangedArray = function (n) {
  $http.post("/api/changeArray", n)
    .then(function (response) {
      //response.data represents the original array with n:2 changed
      response.data.forEach(function (curr) {
        var match = scope.myArr.filter(function (currMyArr) {
          return currMyArr.n === curr.n;
        });
        if (match.length) {
          //update the existing properties on the matched object
          angular.extend(match[0], curr);
        } else {
          //new object
          scope.myArr.push(curr);
        }
      });
    });
};

//user interaction
scope.doSomething(2);

UnderscoreJS find-and-replace

In my application, I have a very large array of objects on the front-end, and these objects all have some kind of long ID under the key "object_id". I'm using UnderscoreJS for all my manipulations of this list. This is a prototype for an application that will eventually handle most of this effort on the backend.
Merging the list is a big part of the application's requirement. See, the list I work with initially will have many distinct objects with identical object_ids. Initially I was merging them all in one go with a groupBy and a map-reduce, but now the requirements have changed and I'm merging them one at a time (about a second apart, to simulate a stream of input) into an initially empty array.
My naive implementation was something like this:
function (newObject, objects) {
  var obj_id = newObject["object_id"]; //id of object to merge, let's say
  var tempObject = null;
  var objectToMerge = _.find(objects, function (obj) {
    return obj_id == obj["object_id"];
  });
  if (objectToMerge) {
    tempObject = merge(objectToMerge, newObject);
    objects = _.reject(objects, /*same function as findWhere*/);
  } else {
    tempObject = newObject;
  }
  objects.push(tempObject);
  return objects;
}
This is ridiculously more efficient than before, when I re-merged from the mock data "source" array every time a new object was pushed, so it's down from what I think was at least O(N^2) to O(N). But N here is so large (for JavaScript, anyway!) that I'd like to optimize further. Currently the worst case, where the object_id is not yet present, traverses the entire list twice. What I'd like is a find-and-replace: an operation that returns a new version of the list, but with the merged object in place of the old one.
I could do a map where the iterator returns a new, merged object iff the object_id matches, but map lacks the short-circuit evaluation that _.find has, which makes the worst-case runtime the default runtime, and it doesn't easily account for pushing the object when there is no match.
I'd also like to avoid mutating the original array in place. I know objects.push(tempObject) does that very thing, but for data-binding reasons I'm ignoring that and returning the mutated list as though it were new.
It's also unavoidable that I'll have to check the array to see whether the new object was merged or appended. Using closures I could keep a flag to see if the merge happened, but I'm trying to be as idiomatically LISPy as possible for my own sanity. Also, past a certain point most objects will be merged, so extra runtime overhead for adding new items isn't a huge problem, as long as it is only incurred when it has to happen.
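One possible shape for this, sketched only under the question's own assumptions (mergeIntoList is a hypothetical name; merge() is the question's helper), is a single-pass _.map with a local flag:
function mergeIntoList(newObject, objects) {
  var merged = false;
  var result = _.map(objects, function (obj) {
    if (!merged && obj["object_id"] === newObject["object_id"]) {
      merged = true;
      return merge(obj, newObject); // merged object replaces the old one
    }
    return obj; // untouched elements pass through by reference
  });
  if (!merged) {
    result = result.concat([newObject]); // append without mutating
  }
  return result; // a new array either way
}
The flag doesn't restore _.find's short-circuiting, but it keeps the traversal to a single pass, never mutates the input array, and the merged-or-appended check stays local instead of leaking out through a closure.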

Memory assignment to arrays

I was wondering if anyone knows how memory is handled for JS arrays when the array starts at a high index.
For example, if you have:
array[5000] = 1;
As the first value in the array, everything before index 5000 simply does not exist. Will the memory assigned to the array cater for the 4999 unassigned positions before it, or will memory only be assigned for the value at [5000]?
I'm trying to cut down on the amount of memory used for my script so this led to me wondering about this question :)
When assigning a value to the 5000th key, the rest of the array is not populated:
var array = [];     // Create array
array[5000] = 1;
'1' in array;       // false: the key does not exist
Object.keys(array); // ['5000'] (it's the only key)
If you want to blow up your browser with arrays, allocate a huge ArrayBuffer instead:
var array = new ArrayBuffer(6e9); // 6 Gigs
Both can easily be verified in Chrome: open the console and the task manager (Shift+Esc), then paste the code. window.a = new Array(6e9); or window.a = []; window.a[6e9] = 1; doesn't result in a significant memory increase,
while window.a = new ArrayBuffer(6e9); crashes the page.
PS. 6e9 === 6000000000
JavaScript is interpreted and run by the browser, so it depends on how the browser implements this behavior. In theory, once you do array[5000] = 1, you have an array of 5001 elements, all except the 5001st being undefined.
Though if I were the one implementing the logic for running such a script, undefined would be the default value for anything not assigned, meaning I could probably get away with a map holding a single entry mapping key 5000 to value 1. Any access to another index in the array would automatically return undefined, without doing unnecessary work.
A quick test bears this out: assign the value and then alert, and the alert is seen immediately.
JS arrays are actually not arrays as you know them from other programming languages like C or C++. They are instead objects with an array-like way of accessing them. This means that when you write array[5000] = 1; you actually define the property '5000' on the array object.
If you had used a string as the array key, you would have been able to access the index as a property as well, which demonstrates this behavior; but since identifiers can't start with a number, array.5000 would be invalid:
array['key'] = 1;
alert(array.key); // Gives you 1
This means that arrays will probably be implemented much like objects, although each implementation is free to optimize, thus giving you the behavior you expect from objects, where you can define object.a and object.z without defining the whole alphabet.
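To see the sparse behavior directly, here is a small runnable sketch (safe sizes, any modern engine):
var arr = [];
arr[5000] = 1;
console.log(arr.length);       // 5001 (length is highest index + 1)
console.log(Object.keys(arr)); // ['5000'] (only one real key is stored)
console.log(4999 in arr);      // false (the "holes" are not stored)
console.log(arr[4999]);        // undefined (reads of holes fall through)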
