I have a web app that uses a lot of JavaScript and is intended to run non-stop (for days/weeks/months) without a page reload.
However, Chrome is crashing after a few hours. Safari doesn't crash as often, but it does slow down considerably.
How can I determine whether the issues are with my code or with the browser itself? And what can I do to resolve them?
Using the Chrome Developer Tools profiler you can get a snapshot of what's using your CPU and take a memory snapshot.
Take two snapshots. Select the first one and switch to comparison view, as shown below.
The triangle column is the mathematical symbol delta, or change. So if your deltas are positive, you are creating more objects in memory. I would then take another snapshot after a given period of time, say 5 minutes, and compare the results again, looking at the deltas.
If your deltas are constant, you are doing a good job at memory management. If they are negative, your code is clean and your used objects are being properly collected: again, a great job.
If your deltas keep increasing, you probably have a memory leak.
Also,
document.getElementsByTagName('*').length; // a count of all DOM elements
would be useful to see whether you are steadily increasing your DOM elements.
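If you want to watch that number over time, here is a minimal sketch (the one-minute interval is an arbitrary example):

    // Log the total DOM element count every minute; a steadily rising
    // number suggests nodes are accumulating somewhere.
    setInterval(function () {
        var nodeCount = document.getElementsByTagName('*').length;
        console.log(new Date().toISOString() + ' DOM elements: ' + nodeCount);
    }, 60 * 1000);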
Chrome also has the "about:memory" page, but I agree with IAbstractDownVoteFactory - developer tools are the way to go!
I have a simple single-page app that scrolls a bunch of photos indefinitely, meant to be run on displays for days at a time.
Because of the large number of scrolling pics, the memory use in Chrome keeps growing. I want a way to programmatically reduce the memory consumption at intervals (every few hours).
When I stop the animation programmatically, the memory footprint still doesn't go down. Even when I reload the page with location.reload();, it doesn't go down.
Is there a way to do this? How can I programmatically "clear everything" with the same effect as closing the tab?
FYI, in Firefox there isn't a memory issue; it's just Chrome. The code is super simple: it uses requestAnimationFrame to animate two divs across the screen constantly. I'm not accumulating references anywhere. My question isn't about my code specifically, but rather about general ways to reset a tab's memory, if that can be done.
Use the Chrome or Firefox memory debugger to find out where the memory leak is. Then, once you find it, work out how you can clean up those objects.
Reasonable causes of memory leaks are:
1. You are loading big images; you need to resize them on the server, or simply draw them onto smaller canvases (as sketched below).
2. You have too many DOM elements (for example, more than 10000).
3. You have some JS objects that keep growing and that you never clean up.
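For the image case, a minimal sketch of the canvas approach (assuming the images are same-origin, so the canvas isn't tainted; the maxWidth parameter is illustrative):

    // Draw a large image into a smaller canvas and use that instead,
    // so the full-size bitmap can be garbage collected.
    function downscale(img, maxWidth) {
        var scale = Math.min(1, maxWidth / img.naturalWidth);
        var canvas = document.createElement('canvas');
        canvas.width = Math.round(img.naturalWidth * scale);
        canvas.height = Math.round(img.naturalHeight * scale);
        canvas.getContext('2d').drawImage(img, 0, 0, canvas.width, canvas.height);
        return canvas; // insert this canvas into the DOM in place of the big image
    }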
When memory usage is very high, you will see it in the task manager. In Firefox you can see what is going on with memory by typing about:memory into the address bar and then pressing the "Measure" button.
You can use location.reload(true); this clears all the memory (caches etc.). Moreover, if you want to further clear the browser's storage, you can use localStorage.clear() in your JavaScript code. This clears the items stored in localStorage if you have saved something like localStorage.myItem = "something".
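A minimal sketch of that combination (the three-hour interval is an arbitrary example value):

    // Periodically clear persisted items and force a full reload.
    setInterval(function () {
        localStorage.clear();  // drops anything saved via localStorage.myItem = "..."
        location.reload(true); // the "true" argument bypassed the cache in older browsers
    }, 3 * 60 * 60 * 1000);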
I am developing a web application, in which a div is created on every mousemove event.
When profiling with the Chrome Developer Tools Timeline, I see an increase in the Node count (green line), but a really small number of detached DOM trees while moving the mouse.
When the mouse is not being moved, the Node count is steady and never decreases or increases.
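For reference, a minimal sketch of the setup described (a hypothetical reconstruction; the actual application code is not shown):

    // Hypothetical repro: append one div per mousemove event.
    document.addEventListener('mousemove', function (e) {
        var dot = document.createElement('div');
        dot.style.cssText = 'position:absolute;left:' + e.pageX + 'px;top:' + e.pageY + 'px;';
        document.body.appendChild(dot);
    });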
I would like to know:
How exactly does the Node count (green line) work? Does it report memory information cumulatively, relative to the start of the recording?
I suspected a DOM memory leak, but taking a heap snapshot I see few detached DOM trees. What could cause a steady increase in the Node count?
Does the Node count affect the total memory of the JS application?
What is the difference between Document DOM tree / xxx entries and Object Count?
EDIT:
After some research, I suspect that the Node count going up does not necessarily represent a memory leak in this case (also, running Chrome's Task Manager I see the JS memory stable, not continuously increasing).
It most probably represents in-memory usage by the browser: when I do not move the mouse for 30 seconds, or I open another tab/window, the garbage collector kicks in and the memory is cleared, as in the next image.
By the way, any expert advice on this is very welcome :)
Interesting:
Javascript memory and leak problems
https://developers.google.com/web/tools/chrome-devtools/profile/memory-problems/memory-diagnosis
I want to know if this is normal in a single page app
If you go to http://todomvc.com/examples/backbone/ and take some heap snapshots while adding and removing todos: even if I remove all previously added todos, the memory in the heap snapshot increases every time.
Is this normal?
Should this go back to the initial value if I remove all todos?
Thank you
Should this go back to the initial value if I remove all todos?
Yes and no.
It should go back to its initial value (or close to it), but that won't happen until a garbage collection is actually triggered, which doesn't look like it has happened in your case. You can trigger one manually under the "Timeline" tab by clicking the little trashcan icon.
Do that while recording a timeline (and check the memory checkbox) to see the heap usage drop down again.
You'll notice that the number of nodes doesn't drop all the way down to where it was when the page initially loaded, and it keeps rising if you keep adding/removing todos and triggering garbage collection several times. That can be an indication of a small leak and could be something to investigate further.
It can be a leak, or it can be that the application just caches some data structures. Heap usage may also grow as V8 produces more code, e.g. as a result of function optimizations. Such code may survive several GC phases before it gets collected, even if the function is not called any more. This is complicated, and in general you shouldn't worry about it, as the VM should take care of it. Just keep in mind that the VM may allocate some internal data structures for its own needs; they are usually collected over time. To see whether the JavaScript heap growth is valid, you can record a heap allocations timeline ("Profiles" panel > "Record Heap Allocations"). This will allow you to filter objects by allocation time and then decide whether these objects should have survived or not.
This is the fiddle: http://jsfiddle.net/36mdt/
After about 10-20 seconds, the display starts to freeze randomly and shortly after crashes. I cannot reproduce this in Firefox.
Profiling reveals nothing unusual.
http://jsfiddle.net/3pbdQ/ shows there is definitely a memory leak. Even at 1 FPS, the memory usage climbs 5 megabytes a frame.
On a side note, this example really shows how Math.random() is really not so random.
I've made a few performance improvements, and it doesn't crash after 5 minutes (it also seems not to be leaking memory). Check out http://jsfiddle.net/3pbdQ/3/
1. Don't calculate the size inside each iteration.
2. Use timeouts instead of a freezing interval.
3. Use a bitwise operator for flooring a number (all three changes are sketched below).
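A rough sketch of those changes (illustrative only, assuming a canvas element on the page; this is not the fiddle's actual code):

    var TIME = 16; // ~60 FPS frame budget
    var canvas = document.querySelector('canvas');
    var ctx = canvas.getContext('2d');
    var size = (canvas.width / 10) | 0; // computed once, not per iteration; "| 0" floors

    function frame() {
        ctx.fillStyle = 'rgb(' + ((Math.random() * 256) | 0) + ',0,0)';
        ctx.fillRect((Math.random() * canvas.width) | 0,
                     (Math.random() * canvas.height) | 0, size, size);
        setTimeout(frame, TIME); // reschedules only after this frame's work is done
    }
    frame();

Unlike setInterval, the setTimeout chain can never queue up frames faster than they can be drawn.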
Profiling reveals nothing unusual.
The Chrome profiler doesn't work with Web Workers, AFAIK. As per a conversation with Paul Irish:
"Check about:inspect for shared workers, also you can do console.profile() within the worker code (I THINK) and capture those bits. The "cleans up" is the garbage collector: if after the cleanup there is still a growing line of excess memory, then thats a leak."
And
On a side note, this example really shows how Math.random() is really not so random.
It is well known that there are no perfect random algorithms, but the bunches of grouped colors you see appear because you're not setting canvas.height and canvas.width, so they differ from the CSS values.
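A minimal sketch of the fix (assuming the canvas is sized via CSS):

    // Make the canvas backing store match its CSS display size, so the
    // drawing coordinate system lines up with what is shown on screen.
    var canvas = document.querySelector('canvas');
    canvas.width = canvas.clientWidth;
    canvas.height = canvas.clientHeight;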
EDIT: It's still leaking memory; I don't know why, but after about 10 secs it 'cleans up'. This exceeds my knowledge, but it works smoothly at 60 FPS (var TIME = 16).
Depending on the system and browser version you use, some steps may vary, although I have tried my best to provide common steps that are compatible with most systems.
Disable Sandbox:
1. Right click Google Chrome desktop icon.
2. Select Properties.
3. Click Shortcut > Target.
4. Add "--no-sandbox"
5. Click Apply | OK.
6. Download and install ZombieSoftFix.
7. Check and resolve conflicts detected.
Disable Plug-Ins:
1. Type "about:plugins" in Address Bar.
2. Press ENTER.
3. Disable all plug-ins displayed in the list page.
Clear Temporary Files:
1. Click Wrench.
2. Select More Tools | Clear browsing data.
3. Check all the boxes, then click the "Clear browsing data" button to confirm the process.
This is an unfortunate, known Chrome bug.
I've been building Conway's Game of Life with JavaScript/jQuery in order to run it in a browser, here. Chrome, Firefox, Opera, and Safari do this pretty fast, so preferably don't use IE for it; IE9 is OK, though.
While generating the new generations of Life, I store the previous generations in order to be able to walk back through the history. This works fine until a certain point, when memory fills up and makes the browser (tab) crash.
So my question is: how can I detect when memory is filling up? I am storing an array for each generation in an array which forms the history of generations. This takes massive amounts of memory which crashes the browser after a few thousands of generations, depending on available memory.
I am aware of the fact that JavaScript can't check the amount of available memory, but there must be a way...
I doubt that there is a way to do it. Even if there is, it would probably be browser-specific. I can suggest a different way, though.
Instead of storing all the data for each generation, store snapshots taken every once in a while. Since Conway's Game of Life is deterministic, you can easily re-generate future frames from a given snapshot. You'll probably want to keep a buffer of a few frames so that you can make rewinding nice and smooth.
In reality, this doesn't actually solve the problem, since you'll run out of space eventually. However, if you store only every nth frame, your application will last n times longer, which might just be long enough. I would recommend imposing a hard limit on how far into the past you can rewind so that you have a cap on how much you have to store. Determine how many frames that would be (10 minutes at 30 FPS = 18,000 frames). Then divide that by how many frames you can store (profile various web browsers to figure this out), and that is the interval between snapshots you should use.
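A sketch of the snapshot idea; step() stands in for the app's actual next-generation function, and grids are assumed to be copied rather than mutated in place:

    var SNAPSHOT_EVERY = 100;   // keep every 100th generation (tune as described above)
    var snapshots = [];         // [{ gen: number, grid: array }, ...]

    function record(gen, grid) {
        if (gen % SNAPSHOT_EVERY === 0) {
            snapshots.push({ gen: gen, grid: grid });
        }
    }

    function rewindTo(gen) {
        // Find the nearest snapshot at or before the requested generation...
        var snap = snapshots.filter(function (s) { return s.gen <= gen; }).pop();
        // ...then re-simulate forward to the exact generation.
        var grid = snap.grid;
        for (var g = snap.gen; g < gen; g++) {
            grid = step(grid); // step() = next-generation function, not shown here
        }
        return grid;
    }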
Dogbert pretty much nailed it. You can't know exactly how much available memory there is but you can know how potentially large your dataset will be.
So, take the size of each object stored in the array, multiply by the array dimensions, and that's the size of one generation. Multiply that by the desired number of generations to see how much space it will take in total, and adjust accordingly.
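For example (rough numbers, assuming about 8 bytes per element of a plain numeric array; the grid size is made up):

    // Back-of-the-envelope estimate for a 200 x 200 grid.
    var cellsPerGeneration = 200 * 200;              // 40,000 cells
    var bytesPerGeneration = cellsPerGeneration * 8; // ~320 KB per generation
    var generations = 5000;
    var totalMB = (bytesPerGeneration * generations) / (1024 * 1024); // ~1526 MB

At those dimensions, a few thousand generations already approach 1.5 GB, which is why the history needs a cap.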
Or, inspired by Travis, simply run the pattern in reverse from the last known array. It is deterministic after all.