I'm new to JavaScript and come from C++ background. This will sound silly but I can't find how to delete objects created with new in JavaScript.
Here's an example:
function Article (id) {
this.content = db.get('article', "id:" + id);
...
}
var article = new Article(5);
Every instance of Article allocates memory as it gets data from the database (in my case, the content of the article). This quickly causes my application's memory usage to grow to gigabytes.
How do I release memory in JavaScript? I found delete but it appears to delete array and hash elements rather than Objects.
Just remove all references to the object, and it will be garbage collected (when the JS engine does some garbage collection).
article = undefined; // or some other value
It happens automatically through garbage collection at some point in the future, or never. It is non-deterministic, unlike RAII in C++.
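For example, here is a minimal sketch of the mindset shift from C++ (render is a hypothetical function standing in for whatever uses the data): there is no destructor to call, you just let the last reference disappear.
function showArticle(id) {
    var article = new Article(id); // Article from the question above
    render(article);               // hypothetical use of the fetched data
}
// `article` goes out of scope when showArticle returns; the object becomes
// eligible for collection whenever the garbage collector next runs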
Related
If I have the following code:
function MyClass() {
    this.data = {
        // lots of data
    };
}
var myClassInstance = new MyClass();
var myobj = {
    num: 123,
    str: "hello",
    theClass: myClassInstance
};
I know it's absolutely necessary to do:
myobj.theClass = null;
To free up myClassInstance and its data property for GC. However, what should I do with myobj.num and myobj.str? Do I have to give them a value of null too? Does the fact that they're primitive change anything regarding GC?
The JavaScript runtime that implements garbage collection will be able to collect items as soon as values are no longer reachable from code. This is true for object references as well as primitives. The details of the exact moment the item is collected varies by implementation, but it is not even necessary to set your object references to null (as you state) unless you need the object cleaned up sooner than the natural termination of the current function.
This all ties into the fundamental concept of "scope" and the scope chain. When an item is no longer in any other object's scope chain, it can be collected. Understanding this clearly will answer this question and also helps in understanding closures, which are scenarios where items stay in memory longer than you might expect.
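To make that concrete, here is a small sketch (buildHugeReport, send and runLongUnrelatedWork are hypothetical placeholders): normally you do nothing, because a local becomes unreachable when the function returns; nulling it early only matters if the function keeps running long after you are done with the value.
function processReport() {
    var huge = buildHugeReport();   // hypothetical: allocates a large object
    send(huge);                     // hypothetical: last real use of the data
    huge = null;                    // optional: makes the object unreachable now,
                                    // rather than when processReport() returns
    runLongUnrelatedWork();         // hypothetical long tail of work in the same function
}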
There is a lot of "it depends" here, ranging from what your code is doing to which browser you're running in. However, if your object is JIT-compiled so that it does not use a map for its attributes, then the number should be an 8-byte double stored inline inside the object. Nulling it will do nothing.
The string and the MyClass instance will be pointers to memory allocated outside the object (since a string can be arbitrarily many bytes, it can't be stored inside the object; a compiler could conceivably store one instance of the string in memory and never free it, however). Nulling them can allow the garbage collector to free them before the main object goes out of scope.
However, the real question is why you're worried about this. Unless you have profiled your code and identified garbage collection or memory leaks as a problem, you should not be trying to optimize GC behavior. In particular, unless your myobj object is itself going to be live for a long time, you should not worry about nulling fields. The GC will collect it when it goes out of scope.
Setting the property to undefined (not null) will work, but delete is better, for example: delete myobj.theClass
Just to avoid misunderstanding, I will say that there is no way to really delete an object from memory in JavaScript. You delete its references, or set them to undefined, so that the GC can do its work and actually free it.
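For what it's worth, the difference between the two, using the question's myobj, is only whether the property survives; either way the MyClass instance becomes collectable once nothing else references it:
// Alternative 1: the property remains on myobj, its value is undefined
myobj.theClass = undefined;

// Alternative 2: the property itself is removed from myobj
delete myobj.theClass;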
I'm a novice to this kind of JavaScript, so I'll give a brief explanation:
I have a web scraper built in Node.js that gathers (quite a bit of) data, processes it with Cheerio (basically jQuery for Node), creates an object, then uploads it to MongoDB.
It works just fine, except on larger sites. What appears to be happening is:
I give the scraper an online store's URL to scrape
Node goes to that URL and retrieves anywhere from 5,000 - 40,000 product urls to scrape
For each of these new URLs, Node's request module gets the page source, then the data is loaded into Cheerio.
Using Cheerio I create a JS object which represents the product.
I ship the object off to MongoDB where it's saved to my database.
As I say, this happens for thousands of URLs and once I get to, say, 10,000 urls loaded I get errors in node. The most common is:
Node: Fatal JS Error: Process out of memory
Ok, here's the actual question(s):
I think this is happening because Node's garbage collection isn't working properly. It's possible that, for example, the request data scraped from all 40,000 URLs is still in memory, or at the very least the 40,000 created JavaScript objects may be. Perhaps it's also because the MongoDB connection is made at the start of the session and is never closed (I just close the script manually once all the products are done). This is to avoid opening/closing the connection every single time I log a new product.
To really ensure they're cleaned up properly (once the product goes to MongoDB I don't use it anymore and it can be deleted from memory), can/should I simply delete it from memory, using delete product?
More so (I'm clearly not across how JS handles objects), if I delete one reference to the object, is it totally wiped from memory, or do I have to delete all of them?
For instance:
var request = require('request');               // fetches the page source
var cheerio = require('cheerio');               // jQuery-like HTML parsing
var saveToDB = require('./mongoDBFunction.js');

function getData(link) {
    request(link, function (error, response, body) {
        var $ = cheerio.load(body);
        createProduct($);
    });
}

function createProduct($) {
    var product = {
        a: 'asadf',
        b: 'asdfsd'
        // there's about 50 lines of data in here in the real products but this is for brevity
    };
    product.name = $('.selector').dostuffwithitinjquery('etc');
    saveToDB(product);
}

// In mongoDBFunction.js
module.exports = function saveToDB(item) {
    db.products.save(item, function (err) {
        console.log("Item was successfully saved!");
        delete item; // Will this completely delete the item from memory?
    });
};
delete in javascript is NOT used to delete variables or free memory. It is ONLY used to remove a property from an object. You may find this article on the delete operator a good read.
You can remove a reference to the data held in a variable by setting the variable to something like null. If there are no other references to that data, then that will make it eligible for garbage collection. If there are other references to that object, then it will not be cleared from memory until there are no more references to it (e.g. no way for your code to get to it).
As for what is causing the memory accumulation, there are a number of possibilities and we can't really see enough of your code to know what references could be held onto that would keep the GC from freeing up things.
If this is a single, long running process with no breaks in execution, you might also need to manually run the garbage collector to make sure it gets a chance to clean up things you have released.
Here are a couple of articles on tracking down your memory usage in node.js: http://dtrace.org/blogs/bmc/2012/05/05/debugging-node-js-memory-leaks/ and https://hacks.mozilla.org/2012/11/tracking-down-memory-leaks-in-node-js-a-node-js-holiday-season/.
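To make that concrete with the question's createProduct (purely illustrative, reusing names from the question's snippet):
function createProduct($) {
    var product = { /* ... the ~50 fields ... */ };
    product.name = $('.selector').dostuffwithitinjquery('etc');
    saveToDB(product);
    // `delete product` would free nothing: delete only removes object properties.
    // The variable simply becomes unreachable when createProduct() returns, as long
    // as nothing else (a closure, a cache, a global array) still references the object.
    product = null; // only worthwhile if this scope stays alive long after the save
}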
JavaScript has a garbage collector that automatically tracks which variables are "reachable". If a variable is "reachable", then its value won't be released.
For example, if you have a global variable var g_hugeArray and you assign it a huge array, you actually have two JavaScript objects here: one is the huge block that holds the array data; the other is a property on the window object named "g_hugeArray" that points to that data. So the reference chain is: window -> g_hugeArray -> the actual array.
In order to release the actual array, you make the actual array "unreachable". You can break either link in the above chain to achieve this. If you set g_hugeArray to null, then you break the link between g_hugeArray and the actual array. This makes the array data unreachable, so it will be released when the garbage collector runs. Alternatively, you can use "delete window.g_hugeArray" to remove the property "g_hugeArray" from the window object. This breaks the link between window and g_hugeArray and also makes the actual array unreachable.
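In code, the chain and the two ways of breaking it look roughly like this (an implicit global is used here so that delete actually succeeds; delete fails on global properties created with var):
window.g_hugeArray = new Array(1000000); // chain: window -> g_hugeArray -> the actual array

g_hugeArray = null;          // breaks g_hugeArray -> array; the array data becomes unreachable
// or
delete window.g_hugeArray;   // removes the property, breaking window -> g_hugeArray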
The situation gets more complicated when you have "closures". A closure is created when you have a local function that references a local variable. For example:
function a()
{
    var x = 10;
    var y = 20;
    setTimeout(function ()
    {
        alert(x);
    }, 100);
}
In this case, the local variable x is still reachable from the anonymous timeout function even after function a has returned. Without the timeout function, both local variables x and y would become unreachable as soon as function a returns, but the existence of the anonymous function changes this. Depending on how the JavaScript engine is implemented, it may choose to keep both x and y (because it doesn't know whether the function will need y until the function actually runs, which occurs after function a returns), or, if it is smart enough, it may keep only x. Imagine that both x and y point to big things; this can be a problem. So closures are very convenient, but at times they are more likely to cause memory issues and can make memory issues harder to track down.
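One hedged way to avoid retaining more than necessary (createHugeObject is a hypothetical placeholder) is to copy just the value the callback needs and drop the rest before the closure outlives the function:
function a() {
    var x = 10;
    var y = createHugeObject();  // hypothetical large value the callback never uses

    var needed = x;              // capture only what the timeout actually needs
    y = null;                    // drop the big reference before a() returns

    setTimeout(function () {
        alert(needed);           // this closure no longer keeps y's old value alive
    }, 100);
}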
I faced the same problem in an application with similar functionality. I had been looking for memory leaks or something like that. The memory consumed by my process reached 1.4 GB, depending on the number of links that had to be downloaded.
The first thing I noticed was that after manually running the Garbage Collector, almost all memory was freed. Each page that I downloaded took about 1 MB, was processed and stored in the database.
Then I installed heapdump and looked at a snapshot of the application. You can find more information about memory profiling on the WebStorm blog.
My guess is that while the application is running, the GC never gets a chance to start. To work around this, I launched the application with the --expose-gc flag and began running the GC manually while the program executed.
const runGCIfNeeded = (() => {
    let i = 0;
    return function runGCIfNeeded() {
        if (i++ > 200) { // force a collection roughly every 200 calls
            i = 0;
            if (global.gc) {
                global.gc();
            } else {
                logger.warn('Garbage collection unavailable. Pass --expose-gc when launching node to enable forced garbage collection.');
            }
        }
    };
})();
// run GC check after each iteration
checkProduct(product._id)
.then(/* ... */)
.finally(runGCIfNeeded)
Interestingly, if you do not use const, let, var, etc. when you define something in the global scope, it becomes a property of the global object, and delete on it returns true, which can then allow it to be garbage collected. I tested it like this and it seems to have the intended impact on my memory usage; please let me know if this is incorrect or if you get drastically different results:
x = [];
process.memoryUsage();
i = 0;
while (i++ < 1000000) {
    x.push(10.5);
}
process.memoryUsage();
delete x;
process.memoryUsage();
This is my code. I do not know if it is good for preventing memory leaks, and how can I test for memory leaks?
var Test = function () {
    this.ar = [];
    this.List = function () {
        return this.ar;
    };
    this.Add = function (str) {
        this.ar.push(str);
    };
};
use:
var t = new Test();
t.Add("One");
t.Add("Two");
t.Add("Three");
alert(JSON.stringify(t.List()));
t = undefined;
alert(JSON.stringify(t.List())); // throws a TypeError, because t is now undefined
Setting t to undefined will clear that reference to the object. If there are no other references to that object in your code, then the garbage collector will indeed free up that Test() object. That's how things work in javascript. You don't delete an object, you just clear any references to it. When all references are gone, the object is available for garbage collection.
The actual delete keyword in JavaScript is used only to remove a property from an object, as in delete t.List.
Different browsers have different tools available for keeping track of memory usage. The most universal, black-box way I know of to test is to run a cycle over and over again where you assign very large objects (I often use big strings) in your test (to consume noticeable amounts of memory), with some sort of setTimeout() between some number of runs (to let the garbage collector have some cycles), and then just watch the overall memory usage of the browser. As long as the overall memory usage doesn't keep going up and up as you do more and more runs, you must not have a noticeable leak.
Individual browsers may have more comprehensive measuring tools available. Info here for Chrome.
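Here is a rough sketch of that black-box test (doSomethingUnderTest stands in for whatever code you suspect of leaking):
var runs = 0;

function oneRun() {
    var big = new Array(100000).join("x");  // a large throw-away string, to make leaks visible
    doSomethingUnderTest(big);              // hypothetical: the code being checked for leaks
    big = null;

    if (++runs < 1000) {
        setTimeout(oneRun, 50);             // pause so the garbage collector gets idle time
    }
}

oneRun();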
I want to have literally a Dictionary<Node, Object>
This is basically an ES6 WeakMap but I need to work with IE8.
The main features I want are:
minimize memory leaks
O(1) lookup on Object given Node.
My implementation:
var uuid = 0,
    domShimString = "__domShim__";

var dataManager = {
    _stores: {},
    getStore: function _getStore(el) {
        var id = el[domShimString];
        if (id === undefined) {
            return this._createStore(el);
        }
        return this._stores[domShimString + id];
    },
    _createStore: function _createStore(el) {
        var store = {};
        this._stores[domShimString + uuid] = store;
        el[domShimString] = uuid;
        uuid++;
        return store;
    }
};
My implementation is O(1) but has memory leaks.
What's the correct way to implement this to minimize memory leaks?
In an article I wrote recently, ES 6 - A quick look at weak maps, I explained how jQuery is able to make data() leak free. It basically generates an expando property name, jQuery.expando. When you attach data to an element, the data is pushed to an internal cache array, and the element is given the expando property with a value of the index of the data in the cache. Something similar to this:
element[jQuery.expando] = elementId;
The way to prevent circular references is to not attach objects directly to elements as expandos. If a reference to the element remains in code, then that element cannot be garbage collected even if it is removed from the DOM. However, preventing circular references doesn't plug the leak entirely - there's still data left in the array if the element is removed from the DOM and garbage collected. So jQuery clears the array on page unload, as well as removing data from the array if elements are removed from the DOM using its own methods like remove(). It keeps the data alive for detach().
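A minimal sketch of that cleanup, reusing the question's dataManager and domShimString (this is not jQuery's actual code): remove the cached store whenever the element is removed, and clear everything on unload.
dataManager.removeStore = function (el) {
    var id = el[domShimString];
    if (id !== undefined) {
        delete this._stores[domShimString + id];  // let the cached data be collected
        el[domShimString] = undefined;            // avoid delete on host objects (old IE can throw)
    }
};

window.onunload = function () {
    dataManager._stores = {};                     // mirror the page-unload cleanup described above
};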
The reason jQuery does this, is because there is no weak map equivalent, it's kind of shimmable in ES 5 but not in ES 3. As explained in my article, WeakMap is made for exactly this kind of situation, but the only available implementation is in Firefox 6 and above and, with the spec not being finalized, even that shouldn't be used in production environments.
Another thing to take from my article is that certain elements will not allow you to attach expando properties: <object> and <embed> are the two culprits named and shamed in the jQuery source code. For these elements, you're pretty much screwed, and jQuery just will not let you use data on them.
Basic circular reference memory leaks occur in reference counted implementations when two object's properties hold direct references to each other. So DOMObject holds a reference to JSObject and vice versa. Assuming there are no other references to either object, they'd both have a permanent reference count of 1 and the GC would not mark them for collection.
Older browsers (IE6) wouldn't break these circular references, even on page unload, whilst newer browsers are able to break many of them by recognizing the patterns that cause them. jQuery.cache and similar patterns partially avoid memory leaks because DOMObject never holds a reference to JSObject, so even when JSObject holds a reference to DOMObject, the GC can still mark JSObject for collection when there are no more references to it. Once the GC has collected JSObject, the reference count for DOMObject will be reduced, freeing it up for collection also.
Although IE 8+ and other reference counting browsers may be able to break many circular reference patterns (around 400 were fixed for IE 8), the likelihood of leaks is only reduced. For instance, I've seen a huge leak in one of my own apps in IE 8, when working with script elements and JSONP. The best solution is to plan for the worst and, without WeakMap(), the best you can do is use the jQuery data pattern. Sure, you might be risking having orphaned objects, but this is the lesser of two evils.
I'm currently writing a node.js/socket.io application but the question is general to javascript.
I have an associative array that stores a color for each client connection. Consider the following:
var clientColors = new Array();

// This executes on each new connection
socket.on('connection', function (client) {
    clientColors[client.sessionId] = "red";

    // This executes each time a client disconnects
    client.on('disconnect', function () {
        delete clientColors[client.sessionId];
    });
});
If I use the delete statement, I fear that it will create a memory leak, since the property named after the client.sessionId value (associative arrays are objects) won't be deleted: the reference to its value will be gone, but the property itself will still exist on the object.
Am I right?
delete clientColors[client.sessionId];
This will remove the reference to the object on object clientColors. The v8 garbage collector will pick up any objects with zero references for garbage collection.
Now, if you asked whether this created a memory leak in IE4, that would be a different question. V8, on the other hand, can handle this easily.
Seeing as you deleted the only reference, the property will also be gone. Also note that objects are not "associative arrays" in JavaScript, since ordering is implementation-specific and not guaranteed by the ES specification.
Since clientColors[client.sessionId] is a primitive value (a string) in this case, it will be cleared immediately.
Let's consider the more complicated case of foo[bar] = o with o being a non-primitive (some object). It's important to understand that o is not stored "inside" foo, but somewhere in an arbitrary memory location. foo merely holds a pointer to that location. When you call delete foo[bar], that pointer is cleared, and it's up to the garbage collector to free the memory taken by o.
BTW: You shouldn't use Array when you want an associative array. The latter is called an Object in JavaScript and is usually instantiated using the object-literal shorthand {}.
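For example, a sketch of the same bookkeeping with a plain object (client comes from the question's connection handler):
var clientColors = {};                  // plain object instead of new Array()

clientColors[client.sessionId] = "red"; // on connect
delete clientColors[client.sessionId];  // on disconnect; the string value becomes collectable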
If you are using the V8 engine (Node.js or io.js), it may not lead to a leak, but it is always advisable to prevent leaks.
Just delete it
delete clientColors[client.sessionId];
Or set it to null
clientColors[client.sessionId] = null;
This will also cascade to any prototypically inherited objects.
This way there is almost no probability of starting a leak.