Big performance issues/crashing using backbone on mobile phones - javascript

we have built our service dailymus.es to be mobile friendly, but we are hitting a range of performance issues when accessing it on a mobile phone.
Specifically, it crashes after a few "pages" and when we have a lot of content on the page.
I suspect we have too many event handlers and/or memory leaks. What methods do you use to eliminate these problems with Backbone?

I suggest you test your site using Google Chrome's Developer Tools. Use the Profiles tab to examine the state of the heap.
Most leaks of Backbone models/views come from not detaching DOM events from views and not unbinding the event handlers attached to models with on. Override the remove method of your Backbone view and make sure you call .off() for everything you bound with .on(). Don't forget to call remove on sub-views as well.
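The cleanup discipline above can be sketched framework-free. This is a minimal stand-in for Backbone-style events, not Backbone's actual API (Emitter, View, and the method names are illustrative); in real Backbone, prefer listenTo/stopListening so the built-in remove() does the unbinding for you.

```javascript
// Minimal stand-in for a Backbone-style event source (illustrative only).
function Emitter() { this.handlers = {}; }
Emitter.prototype.on = function (name, fn) {
  (this.handlers[name] = this.handlers[name] || []).push(fn);
};
Emitter.prototype.off = function (name, fn) {
  var list = this.handlers[name] || [];
  this.handlers[name] = list.filter(function (h) { return h !== fn; });
};

// A view that records every binding it makes, so remove() can undo them all.
function View(model) {
  this.model = model;
  this.bindings = [];
  this.subViews = [];
  this.bind('change', this.onChange.bind(this));
}
View.prototype.bind = function (name, fn) {
  this.model.on(name, fn);
  this.bindings.push([name, fn]); // remember it for later cleanup
};
View.prototype.onChange = function () {};
View.prototype.remove = function () {
  this.subViews.forEach(function (v) { v.remove(); }); // recurse into children
  this.bindings.forEach(function (b) { this.model.off(b[0], b[1]); }, this);
  this.bindings = [];
};
```

If a view skips the off() pass in remove(), the model keeps a reference to the handler (and, through the bound this, to the view and its DOM), which is exactly the retained-object pattern the heap snapshots will show.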
To find leaks:
Take a snapshot
Run your code to create a view and then remove it
Take another snapshot
Compare the snapshots to find newly created objects that weren't released.
More about the Google Chrome Heap Profiler

Backbone wastes a lot of memory, which is the hardest thing on mobile. There are many techniques: object pooling for DOM elements, updating elements in place instead of re-rendering templates, deferring image loading until the last minute, and holding any updates until right before the paint cycle.
The mobile web can be performant if memory is managed properly. PerfView is a good example and can reach 50 FPS on a long scrolling list on an iPad mini. https://github.com/puppybits/BackboneJS-PerfView
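One of the techniques mentioned above, object pooling, can be sketched in a few lines. The Pool class and its callbacks here are illustrative, not taken from PerfView: the idea is just to reuse row objects (or detached DOM nodes) instead of allocating new ones, which reduces garbage-collector pressure on mobile.

```javascript
// A tiny object pool: acquire() hands back a recycled object when one is
// available, otherwise creates a new one; release() resets and stores it.
function Pool(create, reset) {
  this.free = [];
  this.create = create;
  this.reset = reset;
}
Pool.prototype.acquire = function () {
  return this.free.length ? this.free.pop() : this.create();
};
Pool.prototype.release = function (obj) {
  this.reset(obj);       // wipe per-item state before reuse
  this.free.push(obj);
};
```

For a scrolling list, the same handful of row objects (and their DOM nodes) get released as they scroll out of view and re-acquired for the rows scrolling in, so memory use stays flat no matter how long the list is.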

Related

Google Tag Manager in Single Page Application, Memory Leak?

Things are getting interesting...
In a multi-page application, tags are injected by Tag Manager and cleared on every page change/reload.
In a single page application, tags are injected and keep living in the DOM until a manual page reload.
I have just over 70 tags firing on different dataLayer events. Tag Manager injects them all at the bottom of the DOM and they are never removed. If a user views many products, they will end up with hundreds of script tags and iframes in the DOM. This seems to be causing serious memory leaks.
How can I prevent this? Is there something I don't understand about using GTM in SPAs? I've searched but haven't found much information; either people don't care about memory leaks or I'm doing something wrong.
Is there a way to clear all old script tags and iFrames so the garbage collector can do its job and free up some memory?
Thanks for your help, hopefully this thread can help more people facing the same problem as SPAs are getting more popular.
To anyone getting a memory leak using GTM and SPAs, here are the pitfalls:
Do not use GTM click-on-element triggers. They bind click events to everything in the DOM, so the garbage collector can't do its job, causing memory leaks.
Do not use Custom JavaScript variables in GTM macros/variables: they are anonymous functions, and a new copy is created in memory on every event you trigger.
And of course, use History Change instead of Page View to trigger page views, as there are no page refreshes in a single-page application.
Hope this helps some people facing the same problem.
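The SPA page-view pattern described above usually comes down to pushing a virtual page view into the dataLayer on every route change, so a History Change or Custom Event trigger in GTM can fire the page-view tag. A sketch, with the dataLayer array injected so it's testable; the event name 'virtualPageView' and the field names are a convention you configure yourself in GTM, not built-ins (in a browser you'd pass window.dataLayer):

```javascript
// Push a virtual page view on each SPA route change. In the browser, call
// trackVirtualPageView(window.dataLayer = window.dataLayer || [], path, title)
// from your router's navigation hook.
function trackVirtualPageView(dataLayer, path, title) {
  dataLayer.push({
    event: 'virtualPageView', // must match the Custom Event trigger in GTM
    pagePath: path,
    pageTitle: title
  });
}
```
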
Out of the box Google Tag Manager is configured for traditional server rendered web pages.
You need to do some additional configuration for Google Tag Manager to work properly with an SPA. Here is a guide for doing this:
https://www.analyticsmania.com/post/single-page-web-app-with-google-tag-manager/

Ways to identify the origin of memory leaks in an Angular app

I'm fairly new to browser memory management and memory leaks.
I am making a web app using Angular.js which uses web services to fetch its model.
All controllers are linked to templates with ngRoute except one: the controller of my app's main menu, which also provides additional logic to the router. I inject this MenuController into each controller.
One part of my app needs to check often whether there is new information, to build a report. I implemented this as a function based on a $timeout which re-schedules itself inside the function.
I'm having trouble because I found out that my app has memory leaks and crashes the browser after 5-6 hours without a refresh. I can see it in Chrome and Firefox, but I don't know how to solve the issue.
What steps should I take?
Thanks.
For memory leaks in Angular, I would recommend reading this article.
Also check the way you are using ng-repeat (if you do) in your app… it is a common source of leaks when you don't use the 'track by' syntax.
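Another usual culprit, given the self-rescheduling $timeout described in the question, is forgetting to cancel the pending timer when the view goes away. In Angular, keep the promise returned by $timeout and call $timeout.cancel(promise) from a $scope.$on('$destroy', …) handler. A framework-free sketch of that shape (names and the injectable fake-timer hook are illustrative):

```javascript
// A self-rescheduling poll that keeps its timer handle and cancels it on
// teardown. `timers` defaults to the real setTimeout/clearTimeout and is
// injectable only so the logic can be tested deterministically.
function startPolling(check, intervalMs, timers) {
  timers = timers || { set: setTimeout, clear: clearTimeout };
  var stopped = false;
  var handle;
  function tick() {
    check();
    if (!stopped) handle = timers.set(tick, intervalMs); // re-schedule
  }
  handle = timers.set(tick, intervalMs);
  return function stop() {
    stopped = true;
    timers.clear(handle); // without this, the loop outlives the view forever
  };
}
```

The returned stop function is what you would call in the $destroy handler; if it is never called, the timer chain keeps the controller's closure (and everything it references) alive indefinitely, which matches the slow multi-hour leak described.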

Synchronizing Website (DOM) on multiple devices

I need to invent a method to synchronize multiple devices that are connected to a Website.
Let's say I have a tablet and a laptop and change some elements of the DOM on the laptop. Then the website on the tablet should change accordingly.
First I need a way to monitor DOM changes. I read a lot in forums and have also tried browser-sync (https://github.com/shakyShane/browser-sync), but it does not monitor all events, so the websites can end up out of sync.
I have also tried the new MutationObserver API. But I am not sure exactly how to update the websites on the other devices:
In case a new node has been added to the DOM, I first have to determine its position inside the tree, then send all the information to the other clients (I guess via Node.js and socket.io). The clients then have to put the newly created node in the right position in their tree.
In case a node has been edited or removed, I have to react to that as well...
So my question is: Do you have any ideas or literature to solve my problem?
I just need some good hints to start, because I am not really sure which method leads to the "best" solution; I also want the solution to be efficient.
Sure, I could monitor the DOM by polling several times a second for changes, but that method is not very efficient, especially on mobile devices.
Hope you guys can help me and point me in the right direction.
Best regards.
Note: there is no ready-made solution for this, and it is not simple to do.
To capture the whole DOM, maybe this will help you: AcID DOM Inspector.
To capture event listeners, use Visual Event 2.
Avoid capturing frames (or be careful using them), because security issues (cross-origin) may occur.
In my view it is not necessary to capture the entire document; it would be enough to capture insertions into BODY and HEAD.
And avoid complex structures. Perhaps the event listeners need not be captured at all: you could use a library that re-attaches already-known events on each client. Unless the events are created by users at runtime, in which case you will have to capture them.
To transmit the changes you can use Ajax, sharing state via a database (SQL Server, MySQL, MongoDB, etc.).
From my point of view:
If what you want to build is an application just to share data, for example when someone posts a "new photo" or a "new comment", it would be better to transmit what was inserted into the database as JSON and let the clients read the JSON; your main JavaScript library would then generate the contents dynamically.
That way you transmit much less, making the process faster and easier to implement.
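As a concrete starting point for the "determine its position inside the tree" step from the question, here is a small sketch. Plain objects stand in for DOM nodes so it runs anywhere; in a real implementation you would derive the node from a MutationObserver record and ship the resulting path to the other clients over socket.io, where the receiver walks childNodes with the same indices.

```javascript
// Compute a node's index path from a root, so another client can locate the
// same position in its own tree (assumes both trees are identical before the
// change). Works on anything with parentNode/childNodes.
function nodePath(node, root) {
  var path = [];
  while (node !== root) {
    var parent = node.parentNode;
    path.unshift(Array.prototype.indexOf.call(parent.childNodes, node));
    node = parent;
  }
  return path; // e.g. [1, 0] = root's second child, then its first child
}
```
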

Performance and memory optimization hints for a Knockout-based application

We have made a web client where you can pushpin markers on a map and multiple users can comment on each of these markers. For the map we use leaflet and (what matters more) Knockout for the ViewModel of these Pushpins and the comments on it.
So the data model isn't too complicated: Each Pushpin has a Lat/Lon, Title, some Metadata (who and when created it) and an array of comments (each with Username, Timestamp, Text).
There are a couple of computeds in the view model (firstComment, lastComment, etc.) that Knockout has to keep up to date, so I think these are slowing it down a lot.
Every time the app starts, we download the whole set of pushpins (over 600 right now) as JSON and initialize the Knockout view model with it. The JSON is already about 1.2 MB, which takes about 6 seconds to download. The initialization of the Knockout view model then needs over 20 seconds. I created a splash screen with an animated GIF so the user doesn't think the app is broken, but as the pushpins grow in number this behaviour gets worse.
Javascript also needs a lot of memory to build up the model. I think memory is needed for the Knockout model, and also for the markers for the leaflet layer, which is its own model associated with my knockout objects.
In Firefox, memory rises to about 700 MB when I open my web app. In IE9 it's over 1 GB (!). Also, while the Knockout model is being built (mainly creating Knockout observables and pushing them to an observable array), the browsers stop responding. In IE the website (and my splash screen) isn't rendered at all until Knockout has done its job. In Firefox the website renders first, then freezes until the model is built, then comes back again. Chrome behaves the same as IE.
Another issue is that memory does not get freed except when I close the browser. With every site reload another GB is allocated, so I can easily fill up every byte of my 8 GB laptop by refreshing my site eight times... :(
I already thought about extending our REST Server API with some kind of pagination, and to lazy load much of the data as late as possible. But first I would like to know, what Knockout offers to get my model with a lot of data (in fact it isn't much at the moment, but it could get much) up and running without freezing the browser for a couple of seconds.
How do you handle a large number of Knockout objects in the browser's memory, and how do you lazy-bind them?
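Before reaching for server-side pagination, one client-side mitigation for the freeze described above is to build the view model in chunks so the browser can paint between batches. A framework-free sketch under stated assumptions: all names are placeholders; with Knockout you would push into a ko.observableArray (possibly extended with the rateLimit extender to batch change notifications), and schedule would be setTimeout(fn, 0) or requestAnimationFrame rather than the synchronous callback used in the test.

```javascript
// Build view models from `items` in batches of `chunkSize`, yielding to the
// scheduler between batches so the UI thread is never blocked for long.
function initInChunks(items, makeViewModel, out, chunkSize, schedule, done) {
  var i = 0;
  function step() {
    var end = Math.min(i + chunkSize, items.length);
    for (; i < end; i++) out.push(makeViewModel(items[i]));
    if (i < items.length) schedule(step); // yield so the browser can paint
    else if (done) done();
  }
  step();
}
```

With 600+ pushpins and a chunk size of, say, 25, each batch stays well under a frame budget, so the splash screen (or even the partially filled map) keeps rendering while the rest of the model loads.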

Do any web browsers garbage collect removed dom elements? (as opposed to Javascript objects)

If one made a web application that never refreshed a page but was built completely from the first page plus Javascript requests, thereby creating and destroying elements as required, would any of the browsers reuse the memory used by the obsolete dom elements?
Is this planned in any browsers yet?
I'm thinking that full blown extJS apps would be very sensitive to this kind of memory leakage.
Is there any truly effective re-use strategy to mitigate this problem?
I'm not referring to Javascript object garbage collection here, only removed DOM elements, but I'm not sure if that is essentially the same thing in the end.
It looks like Chrome does this: http://jsfiddle.net/GaPLT/1/.
Memory usage:
Start: 45K
After adding: 60K
After removing: 49K
The short answer is that it depends on your JavaScript engine.
This is how Chrome's V8 does it http://code.google.com/apis/v8/design.html#garb_coll
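In practice, whether a removed element can be reclaimed depends less on the engine and more on whether your JavaScript still references it: a detached node that is still reachable from a long-lived cache can never be collected. A DOM-free sketch of that pitfall and its fix (the cache map and the fake node shape are illustrative):

```javascript
// Detach an element from its parent AND drop the long-lived reference to it.
// If the `delete` line is omitted, the node is removed from the page but stays
// reachable from `cache`, so the engine must keep it in memory -- the classic
// "removed DOM element" leak in long-running single-page apps.
function detachAndForget(cache, key) {
  var el = cache[key];
  if (!el) return false;
  if (el.parentNode) el.parentNode.removeChild(el);
  delete cache[key]; // make the node unreachable so GC can reclaim it
  return true;
}
```
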
