Angular 2 Rendering Performance in a Select

We have 2 lists, a short one and a large one. The large one loads its data based on the selection in the short one.
In the example, most of the elements in the large list are selected (380 out of 400) initially. After a new selection is made in the short list, data in the large list should be cleared and loaded again.
Now the difference lies in the
// await this.delayExecution(1);
line in the parent component. Uncommenting the await (even with a 1 ms delay) changes the execution flow in a way that the second list reacts immediately.
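For context, the (assumed) TypeScript source behind the transpiled output below would look roughly like this sketch; delayExecution is presumed to be a small helper that wraps setTimeout in a Promise:

export class ParentComponent {
    dataSelectedLarge: any[] = [];
    dataToSetLarge: any[] = [];

    async selectionChanged(data: any) {
        console.log('waiting');
        this.dataSelectedLarge = [];
        // await this.delayExecution(1);
        this.dataToSetLarge = [];
        console.log('changed');
    }

    // presumed helper: resolves after the given number of milliseconds
    private delayExecution(ms: number): Promise<void> {
        return new Promise(resolve => setTimeout(resolve, ms));
    }
}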
The transpiled JavaScript code:
ParentComponent.prototype.selectionChanged = function (data) {
    return __awaiter(this, void 0, void 0, function () {
        return __generator(this, function (_a) {
            console.log('waiting');
            this.dataSelectedLarge = [];
            // await this.delayExecution(1);
            this.dataToSetLarge = [];
            console.log('changed');
            return [2 /*return*/];
        });
    });
};
And with the await uncommented:
ParentComponent.prototype.selectionChanged = function (data) {
    return __awaiter(this, void 0, void 0, function () {
        return __generator(this, function (_a) {
            switch (_a.label) {
                case 0:
                    console.log('waiting');
                    this.dataSelectedLarge = [];
                    return [4 /*yield*/, this.delayExecution(1)];
                case 1:
                    _a.sent();
                    this.dataToSetLarge = [];
                    console.log('changed');
                    return [2 /*return*/];
            }
        });
    });
};
So clearing the selection
this.dataSelectedLarge = [];
and the data
this.dataToSetLarge = [];
without some kind of delay forces a re-render of the list that takes a long time, while with the await the rendering happens almost instantly.
The example is here: Angular 2 Performance Select
The questions are:
what causes this behavior
what would be the proper implementation
In the real app, we use ChangeDetectionStrategy.OnPush in the child component.

The performance problem seems to be the removal of all data AND the selection at the same time, meaning in the same Change Detection Cycle.
After running some experiments with zone.js and NgZone, I figured out that the only difference is an extra Change Detection cycle between this.dataSelectedLarge = []; and this.dataToSetLarge = [];.
By removing the selection, forcing Change Detection with an injected ChangeDetectorRef's detectChanges(), and removing the data after that, the view gets updated immediately and there is no need for async/await.
Finally, the method is as simple as this:
selectionChanged(data: any) {
    this.dataSelectedLarge = [];
    this.chdRef.detectChanges();
    this.dataToSetLarge = [];
}
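For completeness, a sketch of how this.chdRef ends up on the component (component name and decorator metadata are assumed here, not from the original):

import { ChangeDetectorRef, Component } from '@angular/core';

@Component({
    selector: 'app-parent',                    // assumed metadata
    templateUrl: './parent.component.html'     // assumed metadata
})
export class ParentComponent {
    dataSelectedLarge: any[] = [];
    dataToSetLarge: any[] = [];

    constructor(private chdRef: ChangeDetectorRef) { }

    selectionChanged(data: any) {
        this.dataSelectedLarge = [];
        this.chdRef.detectChanges();   // flush the cleared selection first
        this.dataToSetLarge = [];      // then clear the data in a separate cycle
    }
}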

Related

High frequency updates to PouchDB (Document update conflict)

I have a method for getting/updating state that's stored within PouchDB. This method gets called by a constructor of an element to assign a user-friendly unique tag to the element. The simplified version of the code looks like this:
var tagList = [ /* set of dictionary words to cycle through */ ];

function generateTag(id) {
    return db.get('tags').then(function (tagData) {
        var tag = '', remainder = tagData.tagCount, quotient;
        while (remainder >= tagList.length) {
            quotient = Math.floor(remainder / tagList.length);
            tag += tagList[quotient - 1];
            remainder -= tagList.length * quotient;
        }
        tag += tagList[remainder];
        tag = tag.charAt(0).toLowerCase() + tag.slice(1);
        tagData.tagCount++;
        tagData.tags[tag] = id;
        db.put(tagData);
        return tag;
    }).catch(function (err) {
        console.error(err);
    });
}

class Element {
    constructor() {
        var self = this;
        generateTag('element' + Date.now()).then(function (tag) {
            self.tag = tag;
        });
    }
}
This logic works as expected when there is a delay between creating elements. But when elements are created in a rapid burst (e.g. in a for loop), the db.get call for the 2nd, 3rd, and subsequent elements gets called before the db.put operation for the first element finishes, resulting in "Document update conflict" messages for the subsequent elements. At first I thought PouchDB's conflict resolution would automatically handle this for me, but I was wrong.
Maybe I'm not understanding the proper way of handling such cases, or is there a better way of writing this? What I need is for consecutive db.get calls to effectively block until the db.put from the previous operation finishes. I was thinking of keeping a static reference to the promise of the last PouchDB operation on the 'tags' document, so that instead of db.get('tags') I'd run tagsPromise.then(function () { return db.get('tags'); }). But I'm still a rookie with promises, and I don't know whether that is a desirable way of addressing this issue, or whether the issue is even real or just something I imposed on myself by not sticking with a better approach.
UPDATE:
It looks like modifying the logic to always return a promise and to always start from a "singleton" promise instead of db.get('tags') in the generateTag function, as I mentioned, does fix the issue. I still want to understand whether there is a better approach.
To others interested, this is how I rewrote the above logic using the tagPromise approach I mentioned in the update (if there is a better answer from PouchDB experts, I'll accept that instead):
var tagList = [ /* set of dictionary words to cycle through */ ];
var tagPromise = db.get('tags');

function generateTag(id, callback) {
    tagPromise = tagPromise.then(function () {
        return db.get('tags');
    }).then(function (tagData) {
        var tag = '', remainder = tagData.tagCount, quotient;
        while (remainder >= tagList.length) {
            quotient = Math.floor(remainder / tagList.length);
            tag += tagList[quotient - 1];
            remainder -= tagList.length * quotient;
        }
        tag += tagList[remainder];
        tag = tag.charAt(0).toLowerCase() + tag.slice(1);
        tagData.tagCount++;
        tagData.tags[tag] = id;
        callback(tag);
        return db.put(tagData);
    }).catch(function (err) {
        console.error(err);
    });
}

class Element {
    constructor() {
        var self = this;
        generateTag('element' + Date.now(), function (tag) {
            self.tag = tag;
        });
    }
}
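As an aside, here is a sketch (untested) of the same serialization that keeps the promise-based API, so the caller still receives the tag via .then instead of a callback; it assumes the same db and tagList as above:

// Serialize all tag operations on one promise chain and return the tag
// to the caller through the chain itself.
var tagPromise = Promise.resolve();

function generateTag(id) {
    var result = tagPromise.then(function () {
        return db.get('tags');
    }).then(function (tagData) {
        var tag = '', remainder = tagData.tagCount, quotient;
        while (remainder >= tagList.length) {
            quotient = Math.floor(remainder / tagList.length);
            tag += tagList[quotient - 1];
            remainder -= tagList.length * quotient;
        }
        tag += tagList[remainder];
        tag = tag.charAt(0).toLowerCase() + tag.slice(1);
        tagData.tagCount++;
        tagData.tags[tag] = id;
        // wait for the write before the next queued get runs
        return db.put(tagData).then(function () { return tag; });
    });
    // the next call queues behind this one, even if this one fails
    tagPromise = result.catch(function (err) { console.error(err); });
    return result;
}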
I find it hard to follow what you are trying to achieve. Could you post the tags document and examples of the elements and tags?
At first glance, getting and updating a single 'tags' document looks quite smelly, I think it would make much more sense to use a view. But for an informed response, I'd need more details please! Thanks!

RxJS, how to poll an API to continuously check for updated records using a dynamic timestamp

I am new to RxJS and I am trying to write an app that will accomplish the following things:
On load, make an AJAX request (faked as fetchItems() for simplicity) to fetch a list of items.
Every second after that, make an AJAX request to get the items.
When checking for new items, ONLY items changed after the most recent timestamp should be returned.
There shouldn't be any state external to the observables.
My first attempt was very straightforward and met goals 1, 2 and 4.
var data$ = Rx.Observable.interval(1000)
    .startWith('run right away')
    .map(function() {
        // `fetchItems(modifiedSince)` returns an array of items modified after `modifiedSince`, but
        // I am not yet tracking the `modifiedSince` timestamp, so all items will always be returned
        return fetchItems();
    });
Now I'm excited; that was easy, so it can't be that much harder to meet goal 3... Several hours later, this is where I am:
var modifiedSince = null;

var data$ = Rx.Observable.interval(1000)
    .startWith('run right away')
    .flatMap(function() {
        // `fetchItems(modifiedSince)` returns an array of items modified after `modifiedSince`
        return fetchItems(modifiedSince);
    })
    .do(function(item) {
        if (item.updatedAt > modifiedSince) {
            modifiedSince = item.updatedAt;
        }
    })
    .scan(function(previous, current) {
        previous.push(current);
        return previous;
    }, []);
This solves goal 3, but regresses on goal 4. I am now storing state outside of the observable.
I'm assuming that global modifiedSince and the .do() block aren't the best way of accomplishing this. Any guidance would be greatly appreciated.
EDIT: hopefully clarified what I am looking for with this question.
Here is another solution which does not use a closure or 'external state'.
I made the following assumption:
fetchItems returns an Rx.Observable of items, i.e. not an array of items.
The solution makes use of the expand operator, which lets you emit values that follow a recursive relationship of the type x_(n+1) = f(x_n). You pass x_(n+1) by returning an observable which emits that value, for instance Rx.Observable.return(x_(n+1)), and you can finish the recursion by returning Rx.Observable.empty(). Here it seems that you don't have an ending condition, so this will run forever.
scan also lets you emit values following a recursive relationship (x_(n+1) = f(x_n, y_n)). The difference is that scan forces you to use a synchronous function (so x_(n+1) is synchronized with y_n), while with expand you can use an asynchronous function in the form of an observable.
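As an aside (not part of the answer's code below), a toy illustration of the expand recursion, assuming RxJS 4 syntax:

Rx.Observable.return(1)
    .expand(function (x) {
        // each emitted value is fed back into the selector;
        // returning empty() stops the recursion
        return x < 8 ? Rx.Observable.return(x * 2) : Rx.Observable.empty();
    })
    .subscribe(function (x) { console.log(x); }); // logs 1, 2, 4, 8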
Code is not tested, so keep me updated if this works or not.
Relevant documentation: expand, combineLatest
var modifiedSinceInitValue = /* put your date here */;
var polling_frequency = /* put your value here */;
var initial_state = { modifiedSince: modifiedSinceInitValue, itemArray: [] };

function max(property) {
    return function (acc, current) {
        return current[property] > acc ? current[property] : acc;
    };
}

var data$ = Rx.Observable.return(initial_state)
    .expand(function (state) {
        return fetchItems(state.modifiedSince)
            .toArray()
            .combineLatest(Rx.Observable.interval(polling_frequency).take(1),
                function (itemArray, _) {
                    return {
                        modifiedSince: itemArray.reduce(max('updatedAt'), state.modifiedSince),
                        itemArray: itemArray
                    };
                });
    });
You seem to mean that modifiedSince is part of the state you carry, so it should appear in the scan. Why don't you move the action in do into the scan too? Your seed would then be {modifiedSince: null, itemArray: []}.
Errr, I just thought that this might not work, as you need to feed modifiedSince back to the fetchItems function, which is upstream. Don't you have a cycle here? That means you would have to use a subject to break that cycle. Alternatively, you can try to keep modifiedSince encapsulated in a closure. Something like:
function pollItems(fetchItems, polling_frequency) {
    var modifiedSince = null;
    var data$ = Rx.Observable.interval(polling_frequency)
        .startWith('run right away')
        .flatMap(function() {
            // `fetchItems(modifiedSince)` returns an array of items modified after `modifiedSince`
            return fetchItems(modifiedSince);
        })
        .do(function(item) {
            if (item.updatedAt > modifiedSince) {
                modifiedSince = item.updatedAt;
            }
        })
        .scan(function(previous, current) {
            previous.push(current);
            return previous;
        }, []);
    return data$;
}
I have to run out to celebrate the new year; if that does not work, I can give it another try later (maybe using the expand operator, the other version of scan).
How about this:
var interval = 1000;

function fetchItems() {
    return items;
}

var data$ = Rx.Observable.interval(interval)
    .map(function() { return fetchItems(); })
    .filter(function(x) { return x.lastModified > Date.now() - interval; })
    .skip(1)
    .startWith(fetchItems());
That should filter the source only for new items, plus start you off with the full collection. Just write the filter function to be appropriate for your data source.
Or by passing an argument to fetchItems:
var interval = 1000;

function fetchItems(modifiedSince) {
    var retVal = modifiedSince ? items.filter(function(x) { return x.lastModified > modifiedSince; }) : items;
    return retVal;
}

var data$ = Rx.Observable.interval(interval)
    .map(function() { return fetchItems(Date.now() - interval); })
    .skip(1)
    .startWith(fetchItems());

How to (and should we) test UI element visibility in jasmine?

I have a function that hides and shows items on my page based on what a factory provides me:
function toggleMenuItems(config) {
    // hide all our elements first
    $(".js-quickMenuElement").hide();
    config.data = config.data || [];
    config.data.forEach(function (d) {
        if (d === config.viewConfigCatalog.CalendarLink) {
            $("#CalendarLink.js-quickMenuElement").show();
        }
        if (d === config.viewConfigCatalog.ProductCreation) {
            $("#ProductCreation.js-quickMenuElement").show();
        }
        // etc etc etc
    });
}
We've been using Jasmine for our javascript unit tests and we're discussing whether we should test this function.
Some say that we don't need to, because testing this couples the view to the JavaScript test; but at the same time, if those jQuery .show and .hide calls were wrappers or other functions of our own, we would test them.
Following on from this, what would be the best way to test this?
Making a wrapper function that takes in a string and injects that string into the jQuery selector seems wrong.
Another option we thought of is spying with spyOn($.fn, "show"), but that would only let us test how many times show was called, and not what was shown or hidden...
Thanks,
You can use jQuery to test the visibility of an element.
$(element).is(":visible");
code taken from a related question
Of course, in doing this, as you say, you're coupling the view with the test. You could move the logic which determines the outcome of this function into a separate function and then test that function's result instead.
** Edit **
The code below illustrates what I meant regarding simplification with a KVP list; you could write a test for the function which gets the value from the KVP.
var config = {
    data: [],
    viewConfigCatalog: {
        CalendarLink: "CalendarLink",
        ProductCreation: "ProductCreation"
    }
};

var kvp = [{
    name: config.viewConfigCatalog.CalendarLink,
    value: "#CalendarLink.js-quickMenuElement"
}, {
    name: config.viewConfigCatalog.ProductCreation,
    value: "#ProductCreation.js-quickMenuElement"
}];

function getSelectorString(name) {
    var i = kvp.length;
    while (i--) {
        var pair = kvp[i];
        if (pair.name === name)
            return pair.value;
    }
    return null;
}

function toggleMenuItems(config) {
    // hide all our elements first
    $(".js-quickMenuElement").hide();
    config.data = config.data || [];
    config.data.forEach(function(d) {
        $(getSelectorString(d)).show();
    });
}

document.writeln(getSelectorString(config.viewConfigCatalog.CalendarLink) + '<br/>');
document.writeln(getSelectorString(config.viewConfigCatalog.ProductCreation) + '<br/>');
document.writeln(getSelectorString("hi"));
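For reference, a minimal Jasmine spec for the extracted lookup function might look like this (a sketch, not from the original); it covers the mapping logic without touching the DOM or jQuery at all:

describe("getSelectorString", function () {
    it("maps a known config name to its selector", function () {
        expect(getSelectorString("CalendarLink"))
            .toBe("#CalendarLink.js-quickMenuElement");
    });

    it("returns null for an unknown name", function () {
        expect(getSelectorString("hi")).toBeNull();
    });
});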

knockout validation unique in list

I'm trying to validate an entry in a list to be unique from all other entries in the list using ko.validation, but I'm having issues with validation running when it shouldn't.
I have an editable list (a ko.observableArray), and each item in that array is a view model with a ko.observable on it:
var vm = function (data) {
    var self = this;
    self.items = ko.observableArray();
    _.each(data.words, function (word) {
        self.items.push(new listItemVm({ parent: self, word: word.word }));
    });
};

var listItemVm = function (data) {
    var self = this;
    self.parent = data.parent;
    self.word = ko.observable(data.word);
};
Then I add some validation to the listItemVm.word observable. I want each one to be unique:
var listItemVm = function (data) {
    var self = this;
    self.parent = data.parent;
    self.word = ko.observable(data.word).extend({
        validation: {
            validator: function (name, params) {
                console.log("validating " + name);
                // word we are editing must be different from all other words
                // uncommenting this next line causes the behaviour
                // I would expect, because params.parent.items()
                // is not called
                //return true;
                var allWords = params.parent.items();
                // exclude current view model we are editing
                var otherWordViewModels = _.filter(allWords, function (row) {
                    return row !== params.currentRow;
                });
                var otherWords = _.map(otherWordViewModels, function (item) {
                    return item.word();
                });
                return !_.contains(otherWords, name);
            },
            message: 'Must be unique',
            params: {
                currentRow: self,
                parent: self.parent
            }
        }
    });
};
I give it some data, and wrap it in some HTML: http://jsfiddle.net/9kw75/3/
Now, this does work - the validation runs correctly and shows invalid when the values of the two inputs are equal - but have a look in the console on that fiddle. Why does the validation routine run five times on load, and why do both fields validate when just one value updates?
On page load
Expected: validation runs once for each input field.
Actual: validation runs three times for one input, and twice for the other.
On value update (either input field)
Expected: validation runs for altered input field only
Actual: validation runs for both input fields
It's worth noting that this strange behaviour is only observed after reading params.parent.items() in the validator. If the return true; is uncommented (so params.parent.items() is never read), the behaviour I would expect is observed.
I believe the way this works is that the "validator" function is used in a computed observable. Thus, any observables that are read as it executes are now dependencies for the computed. Since you are reading each item's word observable in this function, each one triggers validation for all of the others.
It makes sense that it works this way in general, though in the case of your particular application it isn't what you want. You could use peek to read the observables without triggering the dependency detection:
var allWords = params.parent.items.peek();
// ...
var otherWords = _.map(otherWordViewModels, function (item) {
    return item.word.peek();
});
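Putting it together, the extend block from the question would then look like this (a sketch, untested); only the two reads change:

self.word = ko.observable(data.word).extend({
    validation: {
        validator: function (name, params) {
            // peek() reads the observables without registering dependencies,
            // so editing one word no longer re-validates every other word
            var allWords = params.parent.items.peek();
            var otherWordViewModels = _.filter(allWords, function (row) {
                return row !== params.currentRow;
            });
            var otherWords = _.map(otherWordViewModels, function (item) {
                return item.word.peek();
            });
            return !_.contains(otherWords, name);
        },
        message: 'Must be unique',
        params: { currentRow: self, parent: self.parent }
    }
});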

AngularJS: Asynchronously initialize filter

I'm having trouble trying to initialize a filter with asynchronous data.
The filter is very simple, it needs to translate paths to name, but to do so it needs a correspondance array, which I need to fetch from the server.
I could do things in the filter definition, before returning the function, but the asynchronous aspect prevents that:
angular.module('angularApp').
    filter('pathToName', function(Service) {
        // Do some things here
        return function(input) {
            return input + '!';
        };
    });
Using a promise may be viable, but I don't have any clear understanding of how Angular loads filters.
This post explains how to achieve such magic with services, but is it possible to do the same for filters?
And if anyone has a better idea on how to translate those paths, I'm all ears.
EDIT:
I tried the promise approach, but something isn't right, and I fail to see what:
angular.module('angularApp').filter('pathToName', function($q, Service) {
    var deferred = $q.defer();
    var promise = deferred.promise;
    Service.getCorresp().then(function(success) {
        deferred.resolve(success.data);
    }, function(error) {
        deferred.reject();
    });
    return function(input) {
        return promise.then(
            function(corresp) {
                if (corresp.hasOwnProperty(input))
                    return corresp[input];
                else
                    return input;
            }
        );
    };
});
I'm not really familiar with promises; is this the right way to use them?
Here is an example:
app.filter("testf", function($timeout) {
var data = null, // DATA RECEIVED ASYNCHRONOUSLY AND CACHED HERE
serviceInvoked = false;
function realFilter(value) { // REAL FILTER LOGIC
return ...;
}
return function(value) { // FILTER WRAPPER TO COPE WITH ASYNCHRONICITY
if( data === null ) {
if( !serviceInvoked ) {
serviceInvoked = true;
// CALL THE SERVICE THAT FETCHES THE DATA HERE
callService.then(function(result) {
data = result;
});
}
return "-"; // PLACEHOLDER WHILE LOADING, COULD BE EMPTY
}
else return realFilter(value);
}
});
This fiddle is a demonstration using timeouts instead of services.
EDIT: As per the comment of sgimeno, extra care must be taken not to call the service more than once. See the serviceInvoked changes in the code above and the fiddles. See also a forked fiddle with Angular 1.2.1 and a button to change the value and trigger digest cycles: forked fiddle
EDIT 2: As per the comment of Miha Eržen, this solution no longer works for Angular 1.3. The solution is almost trivial though, using the $stateful filter flag, documented here under "Stateful filters", and the necessary forked fiddle.
Do note that this solution would hurt performance, as the filter is called each digest cycle. The performance degradation could be negligible or not, depending on the specific case.
Let's start with understanding why the original code doesn't work. I've simplified the original question a bit to make it more clear:
angular.module('angularApp').filter('pathToName', function(Service) {
    return function(input) {
        return Service.getCorresp().then(function(response) {
            return response;
        });
    };
});
Basically, the filter calls an async function that returns a promise and then returns its value. A filter in Angular expects you to return a value that can easily be printed, e.g. a string or a number. However, in this case, even though it seems like we're returning the response of getCorresp, we are actually returning a new promise - the return value of any then() or catch() function is a promise.
Angular tries to convert the promise object to a string, gets nothing sensible in return, and displays an empty string.
So what we need to do is return a temporary string value and change it asynchronously, like so:
JSFiddle
HTML:
<div ng-app="app" ng-controller="TestCtrl">
<div>{{'WelcomeTo' | translate}}</div>
<div>{{'GoodBye' | translate}}</div>
</div>
Javascript:
app.filter("translate", function($timeout, translationService) {
var isWaiting = false;
var translations = null;
function myFilter(input) {
var translationValue = "Loading...";
if(translations)
{
translationValue = translations[input];
} else {
if(isWaiting === false) {
isWaiting = true;
translationService.getTranslation(input).then(function(translationData) {
console.log("GetTranslation done");
translations = translationData;
isWaiting = false;
});
}
}
return translationValue;
};
return myFilter;
});
Every time Angular tries to execute the filter, it checks whether the translations were fetched already; if they weren't, it returns the "Loading..." value. We also use the isWaiting flag to prevent calling the service more than once.
The example above works fine for Angular 1.2. However, among the changes in Angular 1.3 there is a performance improvement that changes the behavior of filters. Previously the filter function was called every digest cycle; since 1.3 it is only called if the input value has changed. In our last sample, the filter would never be called again - 'WelcomeTo' never changes.
Luckily the fix is very simple; you just need to add the following to the filter:
JSFiddle
myFilter.$stateful = true;
Finally, while dealing with this issue, I had another problem: I needed to use a filter to get async values that could change. Specifically, I needed to fetch translations for a single language, but once the user changed the language, I needed to fetch a new language set. Doing that proved a bit trickier, though the concept is the same. This is that code:
JSFiddle
var app = angular.module("app", []);

app.controller("TestCtrl", function($scope, translationService) {
    $scope.changeLanguage = function() {
        translationService.currentLanguage = "ru";
    };
});

app.service("translationService", function($timeout) {
    var self = this;
    var translations = {
        "en": { "WelcomeTo": "Welcome!!", "GoodBye": "BYE" },
        "ru": { "WelcomeTo": "POZHALUSTA!!", "GoodBye": "DOSVIDANYA" }
    };

    this.currentLanguage = "en";
    this.getTranslation = function(placeholder) {
        return $timeout(function() {
            return translations[self.currentLanguage][placeholder];
        }, 2000);
    };
});

app.filter("translate", function($timeout, translationService) {
    // Sample object: {"en": {"WelcomeTo": {translation: "Welcome!!", processing: false } } }
    var translated = {};
    var isWaiting = false;

    myFilter.$stateful = true;

    function myFilter(input) {
        if (!translated[translationService.currentLanguage]) {
            translated[translationService.currentLanguage] = {};
        }
        var currentLanguageData = translated[translationService.currentLanguage];
        if (!currentLanguageData[input]) {
            currentLanguageData[input] = { translation: "", processing: false };
        }
        var translationData = currentLanguageData[input];
        if (!translationData.translation && translationData.processing === false) {
            translationData.processing = true;
            translationService.getTranslation(input).then(function(translation) {
                console.log("GetTranslation done");
                translationData.translation = translation;
                translationData.processing = false;
            });
        }
        var translation = translationData.translation;
        console.log("Translation for language: '" + translationService.currentLanguage + "'. translation = " + translation);
        return translation;
    }

    return myFilter;
});
