Can setting a JavaScript variable to "undefined" free system memory? - javascript

I am working on an internet speed test application. The site calculates internet speed using JavaScript. When downloading larger files (250 MB+), the browser crashes (Chrome, Opera, Firefox). It saves all test files to the DOM, and I saw the browser's memory usage go up like crazy while testing connections of 100 Mbps and above.
My questions: is there any way to limit the memory usage of browsers? And is there any limit on using the browser DOM?
After using a file, will setting it to "undefined" actually delete it from system memory?
I see this browser crash after downloading 250 MB+ of data (the next file is another 250 MB+, so 512 MB ends up saved in the DOM).

You do not need specific data to test network speed. What matters is the size of the data (here, 250 MB). Also, to be sure you are testing real speed, there is the additional requirement of not using trivial data (that is, not all zeros or a quickly repeating pattern, which could be compressed away along the path).
You may generate 1 MB of random data (or whatever amount does not crash the application) and send it 250 times: the server still sees 250 MB of data, and you do not need to store anything except that 1 MB.
This works because you are testing for speed: you do not need to check that the received data matches the sent data, because the integrity of your data is already ensured by the underlying TCP/IP protocol.
Concerning freeing JavaScript memory: what you can do is assign the variable another value (myVar = null;) and make sure you do not hold any other reference to the old value. The object is then eligible for garbage collection: the interpreter may or may not free it, depending on its implementation -- you have no control beyond that point.

Related

Possible to check 'available memory' within a browser?

I'm just making up a scenario, but let's say I have a 500 MB file and I want to provide an HTML table for the client to view the data. Let's say there are two scenarios:
They are viewing it on a desktop where they have 1.2 GB of available memory. They can download the whole file.
Later, they try to view this same table on their phone. We detect that they only have 27 MB of available memory, and so give them a warning that says "We have detected that your device does not have enough memory to view the entire table. Would you like to download a sample instead?"
Ignoring things like pagination or virtual tables, I'm just concerned with whether the full dataset can fit in the user's available memory. Is this possible to detect in a browser (even with the user's confirmation)? If so, how could this be done?
Update
This question was answered about 6 years ago, and it points to an answer from 10 years ago. I'm wondering what the current state is, as browsers have changed quite a bit since then, and there's also WebAssembly and such.
Use performance.memory.usedJSHeapSize. Though it is non-standard and in development, it will be enough for testing the memory used. You can try it out in Edge/Chrome/Opera, but unfortunately not in Firefox or Safari (as of this writing).
Attributes (performance.memory)
jsHeapSizeLimit: The maximum size of the heap, in bytes, that is available to the context.
totalJSHeapSize: The total allocated heap size, in bytes.
usedJSHeapSize: The currently active segment of JS heap, in bytes.
Read more about performance.memory: https://developer.mozilla.org/en-US/docs/Web/API/Performance/memory.
CanIUse.com: https://caniuse.com/mdn-api_performance_memory
(CanIUse.com support table as of 2020/01/22)
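As a sketch of how this could be used defensively (the helper name readHeapStats is mine, not part of any API):

```javascript
// Read the non-standard heap counters when present, falling back
// gracefully elsewhere (Firefox/Safari expose no performance.memory).
function readHeapStats(perf) {
  const m = perf && perf.memory;
  if (!m) return null; // unsupported browser
  return {
    usedMiB: m.usedJSHeapSize / (1024 * 1024),
    totalMiB: m.totalJSHeapSize / (1024 * 1024),
    limitMiB: m.jsHeapSizeLimit / (1024 * 1024),
  };
}

// In Chrome: readHeapStats(performance) -> { usedMiB: ..., ... }
// In Firefox: readHeapStats(performance) -> null
```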
I ran into exactly this problem some time ago (a non-paged render of a JSON table, because we couldn't use paging, because :-( ), but the problem was even worse than what you describe:
the client having 8 GB of memory does not mean that the memory is available to the browser.
any report of "free memory" on a generic device will ultimately be bogus (how much of it is used as cache and buffers?).
even knowing exactly "how much memory is available to JavaScript" leads to a maintenance nightmare, because the formula translating available memory into displayable rows involves a "memory size for a single row" that is unknown and varies between platforms, browsers, and versions.
After some heated discussions, my colleagues and I agreed that this was an XY problem. We did not want to know how much memory the client had; we wanted to know how many table rows it could reasonably and safely display.
Some tests we ran - but this was a couple of months or three before the pandemic, so September 2019, and things might have changed - showed an interesting effect: if we rendered a table off-screen, client-side, with the same row repeated with random data, and timed how long it took to add each row, that time roughly correlated with the device's performance and limits, and allowed a reasonable estimate of the number of rows we could actually display.
I have tried to reimplement a very crude version of that test from memory. It ran along these lines, and transmitted the results through an AJAX call upon logon:
var tr = $('<tr><td>...several columns...</td></tr>');
$('body').empty();
$('body').html('<table id="foo"></table>');
var foo = $('#foo');
var s = Date.now();
var i, j, dt;
for (i = 0; i < 500; i++) {
    var t = Date.now();
    // Limit total runtime to, say, 3 seconds
    if ((t - s) > 3000) {
        break;
    }
    for (j = 0; j < 100; j++) {
        foo.append(tr.clone());
    }
    dt = Date.now() - t;
    // We are interested in when dt exceeds a given guard time
    if (0 === (i % 50)) { console.debug(i, dt); }
}
// We are also interested in the maximum attained value
console.debug(i * j);
The above is a re-creation of what became a more complex testing rig (it was assigned to a friend of mine, so I don't know the details past the initial discussions). In my Firefox on Windows 10, I noticed a linear growth of dt that markedly steepens around i=450 (I had to increase the runtime to arrive at that value; the laptop I'm using is a fat Precision M6800). About a second after that, Firefox warned me that a script was slowing down the machine (that was, indeed, one of the problems we encountered when sending the JSON data to the client). I do remember that the "elbow" of the curve was the parameter we ended up using.
In practice, if the overall i*j was high enough (the test terminated with all the rows), we knew we need not worry; if it was lower (the test terminated by timeout), but there was no "elbow", we showed a warning with the option to continue; below a certain threshold or if "dt" exceeded a guard limit, the diagnostic stopped even before the timeout, and we just told the client that it couldn't be done, and to download the synthetic report in PDF format.
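For illustration, the three-way decision described above could be sketched like this (the thresholds are invented placeholders, not the values we actually tuned):

```javascript
// Classify the rendering test outcome: rowsRendered is the i*j total,
// maxDt the worst per-batch time observed. Thresholds are made up.
function classify(rowsRendered, maxDt, opts) {
  const { targetRows = 50000, warnRows = 20000, guardDt = 500 } = opts || {};
  if (maxDt > guardDt || rowsRendered < warnRows) return 'refuse'; // offer PDF instead
  if (rowsRendered < targetRows) return 'warn';                    // warn, allow continue
  return 'ok';                                                     // render everything
}
```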
You may want to use the IndexedDB API together with the Storage API:
Using navigator.storage.estimate().then((storage) => console.log(storage)) you can estimate the storage the browser allows the site to use. Then you can decide whether to store the data in an IndexedDB, or to prompt the user that there is not enough storage and offer to download a sample.
void async function() {
    try {
        let storage = await navigator.storage.estimate();
        print(`Available: ${storage.quota/(1024*1024)}MiB`);
    } catch(e) {
        print(`Error: ${e}`);
    }
}();

function print(t) {
    document.body.appendChild(document.createTextNode(t));
}
(This snippet might not work in this embedded context. You may need to run it on a local test server.)
Wide Browser Support
IndexedDB: available in all browsers except Opera.
Storage API: available in all browsers except Safari (Apple) and IE.
Sort of.
As of this writing, there is a Device Memory specification under development. It specifies the navigator.deviceMemory property to contain a rough order-of-magnitude estimate of total device memory in GiB; this API is only available to sites served over HTTPS. Both constraints are meant to mitigate the possibility of fingerprinting the client, especially by third parties. (The specification also defines a ‘client hint’ HTTP header, which allows checking available memory directly on the server side.)
However, the W3C Working Draft is dated September 2018, and while the Editor’s Draft is dated November 2020, the changes in that time span are limited to housekeeping and editorial fixups. So development on that front seems lukewarm at best. Also, it is currently implemented only in Chromium derivatives.
And remember: just because a client does have a certain amount of memory, it doesn’t mean that memory is available to you. Perhaps there are other purposes for which they want to use it. Knowing that a large amount of memory is present is not permission to use it all up to the exclusion of everyone else. The best uses for this API are probably the ones outlined in the question: detecting whether the data we want to send might be too large for the client to handle.
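As a hedged illustration of that last point, one could treat navigator.deviceMemory as a coarse hint only, with a conservative fallback where the API is absent; the quarter-of-RAM heuristic below is an invented placeholder, not a standard rule:

```javascript
// Decide whether a dataset of a given size is plausibly viewable.
// deviceMemory is a rounded GiB figure (Chromium only, HTTPS only).
function canProbablyHandle(datasetGiB, nav) {
  const deviceGiB = (nav && nav.deviceMemory) || 1; // assume a low-end device
  // Heuristic assumption: never plan to occupy more than a quarter of RAM
  return datasetGiB <= deviceGiB / 4;
}

// e.g. in the browser: canProbablyHandle(0.5, navigator)
```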

How can I retrieve lots of data in chrome.storage.local?

I'm creating a Chrome extension that needs to store a lot of data. I've set the unlimitedStorage permission, and I'm successfully storing information using calls to chrome.storage.local.set. However, I'm unable to make calls to chrome.storage.local.get. When I make a call as simple as chrome.storage.local.get(null, () => {}), Chrome crashes.
My issue seems similar to chrome.storage.local.get limit, except that that issue was about a much smaller amount of data, and turned out to be a fluke. When I call chrome.storage.local.getBytesInUse(null, i => console.log(i)), I get a result of 176031461. (Admittedly, 180 MB is a lot more than Chrome extensions should typically use, but this extension will be running on my own machine.)
I'd like to be able to save all of this data into a JSON file, but to do that, it seems I need to bring it into memory first, and the only way to do that is through chrome.storage.local.get. I'm trying to use the method described at Chrome Extension: Local Storage, how to export to download the data, but it doesn't even get to the callback function. I'm not sure what's causing it to crash, and I don't think it's a memory limit, given that I've followed the instructions at Max memory usage of a chrome process (tab) & how do I increase it? to increase Chrome's max memory to 4 GB.
One potential solution would be to set the first parameter to something other than null and download the data in chunks, but I'd rather not do that if I don't have to.
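The chunked fallback I have in mind would look roughly like this (just a sketch, assuming the extension can list its own key names, since asking the API for all of them at once is exactly the call that crashes):

```javascript
// Split a known key list into groups, then fetch each group separately
// so no single chrome.storage.local.get call has to return 180 MB.
function chunkKeys(keys, chunkSize) {
  const chunks = [];
  for (let i = 0; i < keys.length; i += chunkSize) {
    chunks.push(keys.slice(i, i + chunkSize));
  }
  return chunks;
}

async function getAllChunked(keys, chunkSize = 100) {
  const result = {};
  for (const group of chunkKeys(keys, chunkSize)) {
    // Promise form of chrome.storage.local.get (Chrome 95+ / MV3)
    Object.assign(result, await chrome.storage.local.get(group));
  }
  return result;
}
```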
My questions are:
What is the limit to the amount of storage that can be retrieved this way?
Is there any way for me to get all of that data at once without crashing?
Is there a better way for me to be saving files from a Chrome extension?

JS Heap recommended memory size

Is there any limit to the heap size in the chrome memory profile ?
Note: This is a Chrome only answer, see below why.
You should take a look at window.performance.memory in Chrome Dev Tools, there is a jsHeapSizeLimit attribute.
However, I'm not sure this will be the maximum value on any memory-profiling y-axis.
You can find more information on MDN: https://developer.mozilla.org/en-US/docs/Web/API/Window/performance
performance.memory :
A non-standard extension added in Chrome. Note: This property is read-only.
Previously linked from MDN (no longer): https://webplatform.github.io/docs/apis/timing/properties/memory/
console.log(performance.memory);
// Would show, for example:
// {
//   jsHeapSizeLimit: 767557632,
//   totalJSHeapSize: 58054528,
//   usedJSHeapSize: 42930044
// }
Notes
usedJsHeapSize is the total amount of memory being used by JS objects, including V8 internal objects; totalJsHeapSize is the current size of the JS heap, including free space not occupied by any JS objects. This means that usedJsHeapSize cannot be greater than totalJsHeapSize. Note that the heap has not necessarily ever held totalJsHeapSize worth of live JS objects.
See the WebKit Patch for how the quantized values are exposed. The tests in particular help explain how it works.
Be careful: the values are exposed without units, because WebKit does not want to expose system information such as the available memory size. The API only provides a way to compare memory usage (for instance, between two different versions of a website).
In theory the memory limit (not LocalStorage, but actual memory) is unlimited, and only bounded by the amount of RAM on the system. In practice, most web browsers will place a limit on each window (for example, 200MB). Sometimes the limit is customizable by the user. Additionally, an operating system can put a limit on the amount of memory used by an application.

Unlimited size of local storage under IE 9?

I have just rewritten a test for HTML5 persistent storage (localStorage) capacity (the previous one built a single key in memory, so it was failing with a memory exception). I've also created a jsFiddle for it: http://jsfiddle.net/rxTkZ/4/
The testing code is a loop:
var value = new Array(10001).join("a");
var i = 1;
var task = function() {
    localStorage['key_' + i] = value;
    $("#stored").text(i * 10);
    i++;
    setTimeout(task);
};
task();
The local storage capacity under IE9, as opposed to other browsers, seems to be practically unlimited - I've managed to store over 400 million characters, and the test was still running.
Is it a feature I can rely on? I'm writing application for intranet usage, where the browser that will be used is IE 9.
Simple answer to this: no :)
Don't paint yourself into a corner. Web Storage is not meant to store vast amounts of data. The standard only recommends 5 MB; some browsers implement that, others less (and considering that each character takes up 2 bytes, you effectively only get half of that).
Opera lets users adjust the size (in the 12 branch; dunno about the new WebKit-based version), but that is fully a user-initiated action.
It's not a reliable storage when it comes to storage space. As for IE9, this must be considered a temporary flaw.
If you need a lot of space, consider the File API (where you can request a user-approved quota of tons of megabytes) or IndexedDB instead.

Local storage read/write performance in Mozilla Firefox

I am testing different browsers on how they read/write large amounts of data to/from local storage. The sample data is 1500 customer records, each with some set of fields (first name, last name, some IDs for their location, type, etc.). The test application is built on the GWT platform.
What I noticed is that IE8, IE9, and Chrome improved their performance by at least 30% after moving to loading data from local storage (rather than from the web server). Only Firefox (5.0) got worse (around 30% slower). A remote web server was used to bring some sort of reality into the experiment.
The difference between browsers is almost invisible on small data chunks (100-200 records), where the resulting times are about the same. But large amounts reveal the problem.
I found mentioning of this issue on mozilla support site - https://support.mozilla.com/en-US/questions/750266
But still no solution or workaround there how to fix it.
JavaScript profiling shows that calls to a function implemented in GWT's StorageImpl.java class
function $key(storage, index) {
    return index >= 0 && index < $wnd[storage].length ?
        $wnd[storage].key(index) : null;
}
take the lion's share of execution time. This is actually the storage.getItem(key) call in GWT.
To avoid these frequent calls, I would prefer a single call that translates the storage contents into a map, for example; that might save the time spent on Firefox's cache I/O operations (if any). But the Storage interface ( http://dev.w3.org/html5/webstorage/#storage-0 ) only offers getItem() for reading contents from the storage.
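What I have in mind is a one-pass snapshot along these lines (just a sketch; snapshotStorage is my own name, not a GWT or Web Storage API):

```javascript
// Copy window.localStorage into a plain object once, so code can read
// from the in-memory map instead of calling storage.getItem() per key.
function snapshotStorage(storage) {
  const map = {};
  for (let i = 0; i < storage.length; i++) {
    const key = storage.key(i);
    map[key] = storage.getItem(key);
  }
  return map;
}

// e.g. in the browser: var cache = snapshotStorage(window.localStorage);
```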
Any thoughts about how to force Firefox work faster?
P.S. Maybe will be useful for someone: I found FF local storage contents using addon SQLite manager, and loading webappstore.sqlite database from the drop-down list of default built-in databases.
Which version of Firefox are you testing? Your post on support.mozilla.org mentions Firefox 3.6.8, and you mention IE, so are presumably on Windows, in which case you're probably hitting https://bugzilla.mozilla.org/show_bug.cgi?id=536544 which was fixed in Firefox 4. Or are you seeing the problem in a recent Firefox?
