Backbone.js + Heap snapshot Google Chrome - JavaScript

I want to know if this is normal in a single-page app.
If you go to http://todomvc.com/examples/backbone/ and take some heap snapshots while adding and removing todos, the memory in the heap snapshot increases every time, even if I remove all previously added todos.
Is this normal?
Should this go back to the initial value if I remove all todos?
Thank you

Should this go back to the initial value if I remove all todos?
Yes and no.
It should go back to its initial value (or close to it), but that won't happen until a garbage collection is actually triggered, which doesn't appear to have happened in your case. You can trigger it manually under the "Timeline" tab by clicking the little trashcan icon.
Do that while recording a timeline (and check the memory checkbox) to see the heap usage drop down again.
You'll notice that the number of nodes doesn't drop all the way down to where it was when the page initially loaded, and it keeps rising if you keep adding/removing todos and triggering garbage collection several times. That can be an indication of a small leak and could be something worth investigating further.
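A common source of that kind of slow growth in Backbone apps is "zombie" views: views whose element has left the DOM but whose event bindings keep the view (and its model) reachable. This is only a minimal sketch of the pattern, not code taken from the TodoMVC example:

var TodoView = Backbone.View.extend({
  initialize: function () {
    // listenTo() is tracked, so Backbone can undo the binding later
    this.listenTo(this.model, 'change', this.render);
  },
  render: function () { /* ... */ return this; }
});

// Leak: just detaching the element leaves the model -> view reference alive.
// view.$el.detach();

// Proper teardown: remove() detaches the element AND calls stopListening(),
// so nothing retains the view and it can be garbage collected.
function closeView(view) {
  view.remove();
}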

It can be a leak, or it can be that the application just caches some data structures. Heap usage may also grow because V8 produces more code, e.g. as a result of function optimization. Such code may survive several GC phases before it gets collected, even if the function is not called any more. This is complicated, and in general you shouldn't worry about it, as the VM should take care of it. Just keep in mind that the VM may allocate some internal data structures for its own needs; they are usually collected over time.
To see whether the JavaScript heap growth is valid, you can record a heap allocations timeline ("Profiles" panel > "Record Heap Allocations"). This will allow you to filter objects by allocation time and then decide whether these objects should have survived or not.

Related

Whether to worry about a potential memory-leak or not

I'm building a library for AngularJS that aims at transforming an HTML table into an Excel-like sheet. I've been concerned about performance and have been running some tests, and I'm curious whether the following profile is something I need to worry about. See the memory profile below.
To get this profile, I'm "scrolling" through the table rows (see the basic example, click on the top row, push and hold arrow down key). The example contains 5 rows, whereas the graph was produced with a much longer list: 300 rows. After scrolling to the middle of the list I would pause, force a garbage collection, and continue scrolling towards the end.
In a nutshell, every time you move down to the next row, a bunch of input elements are created and some AngularJS machinery (a new scope, some watchers) is connected to them. The opposite is done for the row that is transformed back into an ordinary table row element (or at least it should be; of course, if something is leaking, it's probably in here).
With respect to memory: intuitively, it should make sense that, given the library is well implemented, memory related to transforming a tr element into the editable row state should be freed once that row is transformed back into its original state (that is, the static tr state).
Analysing the graph, there are two things that grab my attention.
You can clearly see a climbing saw-tooth pattern, which is a strong indicator of a memory leak. A bunch of stuff is being freed, but not all of it => stuff is leaking when transitioning through the table rows.
However, when forcefully doing a GC, everything is gone. So it's not a memory leak after all, because if it were, it wouldn't be gone at this point. Note: afaik I can't easily analyse what is accumulating during the saw-tooth phase because doing a snapshot forces a GC, which apparently clears everything as we can see here.
Thus my questions:
Since a forced GC cleans it up, it is not a leak(?). But intuitively it should be freed after each row transition: is it Chrome doing complex stuff under the hood, or does it hint at bad code (can I rewrite things to promote a more complete GC after each transition? See the cleanup sketch after the file list below.)
Should I even worry about everything that is rising during saw-tooth phase as it is cleaned after a forced GC anyway?
[edit]
If you want to check out the code, you can do so here. Most of the meat is in
table-editor-cell.directive.js
table-editor-row.directive.js
table-editor.directive.js
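As a rough sketch of the cleanup pattern that usually lets per-row state be collected (the module and directive names here are made up, not the library's actual code): an AngularJS directive can release its DOM handlers and other references when its scope is destroyed.

// Hypothetical directive, loosely modelled on table-editor-cell.directive.js
angular.module('tableEditorSketch', [])
  .directive('editorCell', function () {
    return {
      link: function (scope, element) {
        function onKeydown(e) {
          scope.$apply(function () { /* handle arrow keys, editing, ... */ });
        }
        element.on('keydown', onKeydown);

        // Undo everything the link function set up; without this, the handler
        // keeps the element (and the scope it closes over) reachable.
        scope.$on('$destroy', function () {
          element.off('keydown', onKeydown);
        });
      }
    };
  });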

How to clear browser memory with Javascript

I have a simple single-page app that scrolls a bunch of photos indefinitely, meant to be run on displays for days at a time.
Because of the large number of scrolling pics, the memory use in Chrome keeps growing. I want a way to programmatically reduce the memory consumption at intervals (every few hours).
When I stop the animation programmatically, the memory footprint still doesn't go down. Even when I reload the page with location.reload();, it doesn't go down.
Is there a way to do this? How can I "clear everything" programmatically, with the same effect as closing the tab?
FYI, in Firefox there isn't a memory issue. Just in Chrome. The code is super simple, uses requestAnimationFrame to animate two divs across the screen constantly. I'm not accumulating references anywhere. My question isn't about my code specifically, but rather about general ways to reset the tab's memory if it can be done.
Please find out with the Chrome or Firefox memory debugger where the memory leak is. Then, when you find it, think about how you can clean up those objects.
Reasonable causes of memory leaks are:
You are loading big images; you need to resize them on the server, or simply draw them onto smaller canvases (a small sketch follows this answer).
You have too many DOM elements (for example, more than 10000).
You have some JS objects that keep growing and that you don't clean up.
In the task manager you will see that memory usage is very high. You can see what is going on with memory in Firefox if you enter about:memory in the address bar and then press the "Measure" button.
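For the first point above, a hedged sketch of downscaling on the client when server-side resizing isn't an option (the maximum width is an arbitrary example value):

// Draw a large image onto a smaller canvas and use the canvas instead,
// so the full-resolution bitmap can be released.
function downscale(img, maxWidth) {
  var scale = Math.min(1, maxWidth / img.naturalWidth);
  var canvas = document.createElement('canvas');
  canvas.width = Math.round(img.naturalWidth * scale);
  canvas.height = Math.round(img.naturalHeight * scale);
  canvas.getContext('2d').drawImage(img, 0, 0, canvas.width, canvas.height);
  return canvas;
}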
You can use location.reload(true); this clears all the memory (caches, etc.). Moreover, if you want to further clear the browser's storage, you can use localStorage.clear() in your JavaScript code. This clears the items stored in localStorage if you have saved something like localStorage.myItem = "something".
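Put together, a minimal sketch of the interval-based reset described in the question; note that the boolean argument to location.reload() is non-standard and ignored by most current browsers:

// Every 6 hours: drop anything stashed in localStorage, then reload the tab.
setInterval(function () {
  localStorage.clear();
  location.reload(true); // forced-reload flag is non-standard (historically Firefox-only)
}, 6 * 60 * 60 * 1000);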

Locating detached DOM tree memory leak

I'm having trouble diagnosing a detached DOM tree memory leak in a very large single-page web app built primarily with Knockout.
I've tweaked the app to attach a dummy FooBar object to a particular HTML button element which should be garbage collected as the user moves to a different "page" of the app. Using Chrome's heap snapshot function, I can see that an old FooBar instance (which should have been GC'ed) is still reachable from its HTMLButtonElement in a (large) detached DOM tree.
Tracing the references via the retaining tree panel, I follow the chain of decreasing distance from the GC root. However, at some point my search reaches a dead end at a node at distance 4 from the root (in this case)! The retaining tree reports no references to this node at all, yet somehow knows it is four steps from the GC root.
Here is the part of the retaining tree which has me puzzled (the numbers on the right are distances from the root):
v foobar in HTMLButtonElement 10
v [4928] in Detached DOM tree / 5643 entries 9
v native in HTMLOptionElement 8
v [0] in Array 7
v mappedNodes 6
v [870] in Array 5
v itemsToProcess in system / Context 4
context in function itemMovedOrRetained()
context in function callCallback()
The retaining tree doesn't show the references here at distance 3 or above.
Can anyone explain this to me? I was hoping I'd be able to follow the reference chain back up to the offending part of the JavaScript app code, but this has me stymied!
First of all - do not use delete as one of the comments suggested. Setting a reference to null is the right way to dispose of things. delete breaks the "hidden class". To see it yourself, run my examples from https://github.com/naugtur/js-memory-demo
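A minimal illustration of the point (not taken from the linked repo):

function Point(x, y) { this.x = x; this.y = y; }

var a = new Point(1, 2);
var b = new Point(3, 4);

delete a.x;   // changes a's shape, so V8 drops the shared hidden class for a
b.x = null;   // keeps the hidden class and just releases the reference for GC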
Rafe, the content you see in the profiler is often hard to understand. The bit you posted here does seem odd and might be a bug or a memory leak outside of your application (browsers leak too), but without running your app it's hard to tell. Your retaining tree ends in the context of a function, and it can be retained by a reference to that function or some other function sharing the context. It might be too complicated for the profiler to visualize correctly.
I can help you pinpoint the problem though.
First, go to the Timeline tab in devtools and use it to observe the moment your leak happens. Select only memory allocation and start recording, then go through a scenario that you expect to leak. The bars that remain blue are the leaks. You can select their surroundings in the timeline and focus on their retaining tree. The most interesting elements in detached DOM trees are the red ones - they're referenced from the outside. The rest is retained because whichever element in a tree is referenced has references to everything else (e.g. via x.parentNode).
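For reference, a detached DOM tree usually comes from code shaped like this; a made-up example, not from the Knockout app in question:

var row = document.querySelector('#grid tr');
var savedCell = row.querySelector('button'); // outside reference (shows up red)

row.remove(); // the <tr> leaves the document...

// ...but savedCell.parentNode still points back into it, so the whole detached
// subtree stays reachable until the outside reference is released:
savedCell = null;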
If you need more details, you can take multiple snapshots in the profiler, so that you have a snapshot before and after the cause of the leak (which you found with the timeline - you now know the exact action that causes it). You can then compare those in the profiler - there's a "compare" view, which is more comprehensible than the others.
You can also save your heap snapshots from the profiler and post them online, so we could take a look. There's a save link on each of them in the list to the left.
Profiling memory is hard and actually requires some practice and understanding of the tools.
You can practice on some examples from my talk:
http://naugtur.pl/pres/mem.html#/5/2
but the real complete guide to using memory profiler is this doc:
https://developer.chrome.com/devtools/docs/javascript-memory-profiling#looking_up_color_coding
Updated link: https://developers.google.com/web/tools/profile-performance/memory-problems/memory-diagnosis

Measuring memory usage of a web page

I'm trying to measure the memory usage of my site with the memory section of the Timeline tab in Chrome developer tools.
At various points I hit the garbage can button to force a garbage collection. The problem is the graph suddenly goes limp, and stops all measurements. Eventually, after I start doing other things it starts measuring again, but I never see the exact spot / value on the graph where I hit the GC button.
The first two downward slopes start immediately after I hit the garbage collect button, and they later just sort of connect to a new current value after I've been working.
Question is:
Is there a way to force this graph to keep or start measuring? Alternately, is there a simple way in JavaScript to console.log the current memory usage value?
As a related question, is there a way to point to a spot on the graph and see the exact memory usage at that point?
Timeline records events that happen on the renderer side. Each event record also has a "memory usage" field, and Timeline uses these numbers for the memory graph. So if there are no events for a time interval, the memory graph shows nothing.
On the other hand, if the renderer does nothing, the memory size doesn't change.
If you are absolutely sure that you need memory data, you can set up a timer that does nothing.
For example, you can execute in the console: setInterval(function() {}, 1000);
In that case the Timeline will get a timer event with memory usage data and will draw the memory graph.
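If you just want to log a number from JavaScript, Chrome also exposes a non-standard performance.memory object; the values are coarse-grained unless Chrome is started with --enable-precise-memory-info, but they are enough to watch a trend:

// Chrome-only, non-standard; logs the used JS heap size once per second.
setInterval(function () {
  if (performance.memory) {
    console.log('used JS heap (bytes):', performance.memory.usedJSHeapSize);
  }
}, 1000);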

How to measure memory usage and efficiency?

I have a web app that uses a lot of JavaScript and is intended to run non-stop (for days/weeks/months) without a page reload.
However, Chrome is crashing after a few hours. Safari doesn't crash as often, but it does slow down considerably.
How can I check whether or not the issues are with my code, or with the browser itself? And what can I do to resolve these issues?
Using the Chrome Developer Tools profiler you can get a snapshot of what's using your CPU, and you can take memory snapshots.
Take two snapshots. Select the first one and switch to the comparison view as shown below.
The triangle column is the mathematical symbol delta, or change. If your deltas are positive, you are creating more objects in memory. I would then take another snapshot after a given period of time, say 5 minutes, and compare the results again, looking at the deltas.
If your deltas are constant, you are doing a good job at memory management. If they are negative, your code is clean and your used objects are being properly collected; again, a great job.
If your deltas keep increasing, you probably have a memory leak.
Also,
document.getElementsByTagName('*').length; // a count of all DOM elements
would be useful to see whether you are steadily increasing your DOM element count.
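For instance, a small sketch that logs that count on an interval so steady growth stands out:

// Log the total number of DOM elements every 30 seconds.
setInterval(function () {
  console.log('DOM elements:', document.getElementsByTagName('*').length);
}, 30 * 1000);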
Chrome also has the "about:memory" page, but I agree with IAbstractDownVoteFactory - developer tools are the way to go!
