Locating detached DOM tree memory leak - javascript

I'm having trouble diagnosing a detached DOM tree memory leak in a very large single-page web app built primarily with Knockout.
I've tweaked the app to attach a dummy FooBar object to a particular HTML button element which should be garbage collected as the user moves to a different "page" of the app. Using Chrome's heap snapshot function, I can see that an old FooBar instance (which should have been GC'ed) is still reachable from its HTMLButtonElement in a (large) detached DOM tree.
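The tagging trick described above can be sketched in a few lines (a plain object stands in for the real button element, and the names are the question's own dummy names): attaching an instance of a uniquely named constructor makes the element easy to find in a heap snapshot by filtering on the class name.

```javascript
// Minimal sketch of the heap-snapshot tagging trick: attach a dummy
// object with a unique, searchable class name to the suspect element.
function FooBar(label) {
  this.label = label; // anything identifying where/when it was attached
}

// A plain object stands in for the real HTMLButtonElement:
const button = { tagName: 'BUTTON' };
button.foobar = new FooBar('page-2-submit');

// In the profiler, filtering a snapshot for "FooBar" now reveals
// whether this instance (and its button) survived the page change.
console.log(button.foobar.label); // 'page-2-submit'
```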
Tracing the references via the retaining tree panel, I follow the chain taking decreasing distance from the GC root. However, at some point my search reaches a dead end at a node distance 4 from the root (in this case)! The retaining tree reports no references to this node at all, yet somehow knows it is four steps from the GC root.
Here is the part of the retaining tree which has me puzzled (the numbers on the right are distances from the root):
v foobar in HTMLButtonElement 10
v [4928] in Detached DOM tree / 5643 entries 9
v native in HTMLOptionElement 8
v [0] in Array 7
v mappedNodes 6
v [870] in Array 5
v itemsToProcess in system / Context 4
context in function itemMovedOrRetained()
context in function callCallback()
The retaining tree doesn't show the references here at distance 3 or above.
Can anyone explain this to me? I was hoping I'd be able to follow the reference chain back up to the offending part of the JavaScript app code -- but this has me stymied!

First of all - do not use delete, as one of the comments suggested. Setting the reference to null is the right way to dispose of things; delete breaks the object's "hidden class". To see this yourself, run my examples from https://github.com/naugtur/js-memory-demo
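A minimal sketch of the difference (the Holder constructor is a hypothetical example, not code from the linked repo): assigning null drops the reference while leaving the object's shape intact, whereas delete removes the property entirely and changes the shape (the "hidden class" in V8), which deoptimizes property access.

```javascript
// Hedged sketch: null the reference instead of deleting the property.
function Holder(node) {
  this.node = node; // e.g. a DOM element we might otherwise leak
}

const a = new Holder({ big: 'payload' });
const b = new Holder({ big: 'payload' });

// Recommended: drop the reference but keep the property.
// The object's shape (hidden class) is unchanged.
a.node = null;

// Discouraged: delete removes the property and mutates the shape,
// so a and b no longer share a hidden class in V8.
delete b.node;

console.log('node' in a); // true  - property still exists, holds null
console.log('node' in b); // false - property removed entirely
```

Either way the payload becomes collectable; the difference is purely in what it does to the engine's optimizations.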
Rafe, the content you see in the profiler is often hard to understand. The bit you posted here does seem odd and might be a bug or a memory leak outside of your application (browsers leak too), but without running your app it's hard to tell. Your retaining tree ends in the context of a function, and that context can be retained by a reference to that function or to some other function sharing the same context. It might be too complicated for the profiler to visualize correctly.
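The shared-context retention can be sketched like this (the names mirror the question's retaining tree, but the code itself is a hypothetical illustration): two functions created in the same scope share one Context object, so keeping a reference to either function can keep everything in that scope alive, even variables only the other function uses.

```javascript
// Hedged sketch of shared-context retention: in V8, closures created
// in the same scope share a single Context object.
function makeHandlers() {
  const itemsToProcess = new Array(1000).fill({ node: 'detached' });
  function itemMovedOrRetained() { return itemsToProcess.length; }
  function callCallback() { return 'done'; } // never touches the array
  return { itemMovedOrRetained, callCallback };
}

// Only callCallback is kept alive here...
const { callCallback } = makeHandlers();

// ...yet itemsToProcess can remain reachable through the shared
// context, showing up in the profiler only as "system / Context".
console.log(callCallback()); // 'done'
```

This matches the "itemsToProcess in system / Context" entry in the pasted retaining tree: the array is held by the context, and the context by whichever function closed over that scope.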
I can help you pinpoint the problem though.
First, go to the Timeline tab in devtools and use it to observe the moment your leak happens. Select only memory allocation and start recording, then go through a scenario that you expect to leak. The bars that remain blue are the leaks. You can select their surroundings in the timeline and focus on their retaining tree. The most interesting elements in detached DOM trees are the red ones - they're referenced from the outside. The rest is retained because whichever element in a tree is referenced holds references to everything else (x.parentNode).
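The last point is worth a sketch. Plain objects stand in for DOM nodes here, but the linking is the same: a single outside reference to any node in a detached tree retains the whole tree, because every node points up via parentNode and down via its children.

```javascript
// Hedged sketch (plain objects standing in for DOM nodes): one leaf
// reference is enough to retain an entire detached tree.
function makeTree(depth, parent = null) {
  const node = { parentNode: parent, children: [] };
  if (depth > 0) node.children.push(makeTree(depth - 1, node));
  return node;
}

const root = makeTree(3);
// Imagine the app removed this tree from the document but kept a
// reference to one leaf (the "red" node in a heap snapshot):
const leaf = root.children[0].children[0].children[0];

// From that single leaf, the whole tree is still reachable:
let top = leaf;
while (top.parentNode) top = top.parentNode;
console.log(top === root); // true - the root is retained via the leaf
```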
If you need more details, you can take multiple snapshots in the profiler, so that you have a snapshot before and after the cause of the leak (which you found with the timeline - you now know the exact action that causes it). You can then compare those in the profiler - there's a "compare" view, which is more comprehensible than the others.
You can also save your heap snapshots from the profiler and post them online, so we could take a look. There's a save link on each of them in the list to the left.
Profiling memory is hard and actually requires some practice and understanding of the tools.
You can practice on some examples from my talk:
http://naugtur.pl/pres/mem.html#/5/2
but the real complete guide to using memory profiler is this doc:
https://developer.chrome.com/devtools/docs/javascript-memory-profiling#looking_up_color_coding
Updated link: https://developers.google.com/web/tools/profile-performance/memory-problems/memory-diagnosis


Whether to worry about a potential memory-leak or not

I'm building a library for AngularJS that aims at transforming an HTML table into an Excel-like sheet. I've been worrying about performance, I've been running some tests, and I'm curious whether the following profile is something I need to worry about or not. See the memory profile below.
To get this profile, I'm "scrolling" through the table rows (see the basic example, click on the top row, push and hold arrow down key). The example contains 5 rows, whereas the graph was produced with a much longer list: 300 rows. After scrolling to the middle of the list I would pause, force a garbage collection, and continue scrolling towards the end.
In a nutshell, every time you move down to the next row, a bunch of input elements are created and some AngularJS stuff (a new scope, some watchers) is connected to them. The opposite is done for the row that is being transformed back into an ordinary table row element (it should be, at least; of course, if something is leaking, it's probably in here).
With respect to memory: intuitively, it should make sense that, if this is well implemented, memory related to transforming a tr element into the editable row state is freed once that row is transformed back into its original state (that is, the static tr state).
Analysing the graph, there are two things that grab my attention.
You can clearly see a climbing saw-tooth pattern, which is a strong indicator of a memory leak. A bunch of stuff is being freed, but not all of it. => stuff is leaking when transitioning through the table rows
However, when forcefully doing a GC, everything is gone. So it's not a memory leak after all, because if it were, it wouldn't be gone at this point. Note: afaik I can't easily analyse what is accumulating during the saw-tooth phase because doing a snapshot forces a GC, which apparently clears everything as we can see here.
Thus my questions:
Since a forced GC cleans it up, it is not a leak(?). But intuitively it should be freed after each row transition: is it Chrome doing complex stuff under the hood, or does it hint at bad code (can I rewrite things to promote a more complete GC after each transition)?
Should I even worry about everything that rises during the saw-tooth phase, as it is cleaned up after a forced GC anyway?
[edit]
If you want to check out the code, you can do so here. Most of the meat is in
table-editor-cell.directive.js
table-editor-row.directive.js
table-editor.directive.js

Understanding Node count using Timeline Chrome Developer Tools

I am developing a web application, in which a div is created on every mousemove event.
When profiling using the Chrome Developer Tools Timeline, I see an increase in the node count (green line), but a really small number of detached DOM trees when moving the mouse.
When the mouse is not being moved, the node count is steady and never decreases/increases.
I would like to know:
How exactly does the node count (green line) work? Does it provide memory information cumulatively with respect to the start of the recording?
I suspected a DOM memory leak, but taking a heap snapshot I see few detached DOM trees. What could be the cause of a steady increase in the node count?
Does the node count affect the total memory of the JS application?
What is the difference between Document DOM tree / xxx entries and Object Count?
EDIT:
After some research, I suspect that the node count going up does not necessarily represent a memory leak in this case (also, running Chrome's Task Manager I see the JS memory stable and not continuously increasing).
It most probably represents memory used internally by the browser; in fact, when I do not move the mouse for 30 seconds or open another tab/window, the garbage collector kicks in and memory is cleared, as in the next image.
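The distinction being drawn here can be sketched with plain objects standing in for the divs (a hypothetical illustration, not the app's code): nodes that are merely awaiting GC make the green line climb and then drop, while nodes that stay referenced survive every GC and are a genuine leak.

```javascript
// Hedged sketch: collectable garbage vs. a real leak.
const retained = []; // a real leak: references are kept forever

function onMouseMove(leak) {
  const div = { tag: 'div' }; // stand-in for createElement('div')
  if (leak) retained.push(div); // leak: still reachable after any GC
  // else: div goes out of scope here and is collectable - the node
  // count still rises until the next GC, which is what the green
  // line's rise-then-drop shape shows.
}

for (let i = 0; i < 1000; i++) onMouseMove(false); // transient garbage
for (let i = 0; i < 1000; i++) onMouseMove(true);  // genuine leak
console.log(retained.length); // 1000 - only these survive a GC
```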
By the way any expert advice on this is very very welcome :)
Interesting:
Javascript memory and leak problems
https://developers.google.com/web/tools/chrome-devtools/profile/memory-problems/memory-diagnosis

Backbone.js + Heap snapshot google chrome

I want to know if this is normal in a single page app
If you go to http://todomvc.com/examples/backbone/ and take some heap snapshots while adding and removing todos -- even if I remove all previously added todos, the memory in the heap snapshot increases every time.
Is this normal?
Should this go back to the initial value if I remove all todos?
Thank you
Should this go back to the initial value if I remove all todos?
Yes and no.
It should go back to its initial value (or close to it), but that won't happen until a garbage collection is actually triggered, which doesn't appear to have happened in your case. You can trigger one manually under the "Timeline" tab by clicking the little trashcan icon.
Do that while recording a timeline (and check the memory checkbox) to see the heap usage drop down again.
You'll notice that the number of nodes doesn't drop all the way down to where it was when the page initially loaded, and it keeps rising if you keep adding/removing todos and triggering garbage collection several times. That can be an indication of a small leak and could be something to investigate further.
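A classic cause of exactly this pattern in Backbone apps is a removed view that a long-lived collection still references through an event callback. The sketch below uses a tiny hypothetical emitter rather than Backbone itself, but the shape of the leak (and the fix, analogous to Backbone's stopListening) is the same:

```javascript
// Hedged sketch: a long-lived collection retains "removed" views
// through their event callbacks unless they unsubscribe.
function Emitter() { this.handlers = []; }
Emitter.prototype.on = function (fn) { this.handlers.push(fn); };
Emitter.prototype.off = function (fn) {
  this.handlers = this.handlers.filter(h => h !== fn);
};

const todos = new Emitter(); // long-lived collection

function makeView() {
  const view = { el: { tag: 'li' } };
  view.render = () => view.el;
  todos.on(view.render); // collection now references the view
  view.remove = () => todos.off(view.render); // like stopListening()
  return view;
}

const v = makeView();
console.log(todos.handlers.length); // 1 - view retained by collection
v.remove();
console.log(todos.handlers.length); // 0 - view is now collectable
```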
It can be a leak, or it can be that the application just caches some data structures. Heap usage may also grow as V8 produces more code, e.g. as a result of function optimizations. Such code may survive several GC phases before it gets collected, even if the function is not called any more. This is complicated, and in general you shouldn't worry about it, as the VM should take care of it. Just keep in mind that the VM may allocate some internal data structures for its own needs; they are usually collected over time.
To see whether the JavaScript heap growth is valid, you can record a heap allocations timeline ("Profiles panel" > "Record Heap Allocations"). This will allow you to filter objects by allocation time and then decide whether these objects should have survived or not:

DOM nodes not garbage collected

I have a question about using Chrome's developer tools to debug memory leaks in a single-page web application.
According to Google's documentation, after taking a heap snapshot you'll see red and yellow detached DOM nodes. The yellow nodes are those still being referenced by JavaScript, and effectively represent the cause of the leak. The red nodes are not directly referenced from JavaScript, but they're still "alive", likely because they're part of a yellow node's DOM tree.
I've been able to fix several memory leaks by drilling down on all the yellow nodes in my heap snapshots and finding where in our code there was still a reference to them. However, now I've got a situation I'm not sure how to handle: Only red nodes are showing up in my heap snapshot!
If there is no JavaScript reference to these nodes, what are some other reasons that they wouldn't be garbage collected? Also, why does it say there are 155 entries but only show 60? I'm wondering if Chrome simply isn't showing one or more yellow nodes:
Per your request, adding this as an answer. Have you looked at more details on any of these DOM elements to see which elements they are? Perhaps that gives you a clue as to what code would have referenced them. One source of references that trips some people up is closures that you are done with but that are still alive for some reason.
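The closure trap mentioned above can be sketched as follows (plain objects stand in for DOM nodes; detachWithTimer is a hypothetical name): a callback that closed over a node keeps it reachable long after the app dropped every obvious reference, which is exactly how a node can look "unreferenced" yet refuse to be collected.

```javascript
// Hedged sketch: a leftover closure retains a detached node.
function detachWithTimer(node) {
  node.parentNode = null; // node "removed from the document"...
  // ...but this closure (think: a pending setTimeout or event
  // handler) still references it:
  return function laterCallback() { return node; };
}

let button = { tag: 'button' };
const pending = detachWithTimer(button); // callback kept somewhere
button = null; // the app drops its own reference

// The node is still reachable through the closure - this is the
// detached node with no obvious JavaScript reference in a snapshot:
console.log(pending().tag); // 'button'
```

Dropping the callback itself (e.g. clearing the timer or unbinding the handler) is what finally frees the node.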

Finding Memory leaks in Javascript code

I am using the Chrome developer tools to figure out the cause of memory leaks on my webpage. I watched the following YouTube video to learn how to use the Profiles option in the Chrome developer tools to narrow down the cause of memory leaks.
https://www.youtube.com/watch?v=L3ugr9BJqIs
I used the 3-snapshot approach to pinpoint an object that is retained between snapshot 1 and snapshot 3. When I clicked on a node that is not highlighted in yellow in the comparison window, I got the following object tree. As per the video, I can ignore any objects in grey (the system objects) and the ones on a yellow background, as they are reachable from JavaScript and will be garbage collected. Based on that information, I cannot figure out why the object is still retained in memory between snapshot 1 and snapshot 3. The row highlighted in blue is also highlighted in yellow, which means it can be ignored as well.
Any help or pointer greatly appreciated.
