How to check how much offline storage has been used - javascript

I am using offline storage and want to clear it when the storage is full. All I want to know is how to find out how much storage has been used. I am using the following code to clear offline storage:
localStorage.clear();

You could use localStorage.length to get the count of keys in the storage. But it is unlikely to directly give away the size (in bytes) of how much storage is used - unless you are storing uniformly sized data that lets you predict the size based on the count of keys.
But you do get a QUOTA_EXCEEDED_ERR exception when setting values if you exceed the available limit. So just wrap your code in a try..catch, and if the error you catch is the quota exceeded error, you can clear some space.
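For example, a minimal sketch of that pattern (the helper name safeSetItem is mine, and the exact error name varies by browser - older ones report QUOTA_EXCEEDED_ERR, modern ones throw a DOMException named QuotaExceededError):
function safeSetItem(key, value) {
    try {
        localStorage.setItem(key, value);
        return true;
    } catch (e) {
        // Error naming differs across browsers, so check both variants.
        if (e.name === 'QuotaExceededError' || e.name === 'QUOTA_EXCEEDED_ERR') {
            // Storage is full: clear space here (e.g. localStorage.clear())
            // and let the caller retry the write.
            return false;
        }
        throw e; // some unrelated failure
    }
}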
You could also iterate over all items in the storage to compute its full size. But I would not do that, given that it will take longer and longer as the storage grows. In case you are interested, something like this would work:
var size = 0;
// Note: this counts only the values (not the key names), and String.length
// measures UTF-16 code units rather than bytes.
for (var i = 0, len = localStorage.length; i < len; ++i) {
    size += localStorage.getItem(localStorage.key(i)).length;
}
console.log("Size: " + size);
Ref: What happens when localStorage is full?
PS: I tried to get the size by iterating over the localStorage keys and adding up the sizes, but the browser tab crashed. So I'd say relying on QUOTA_EXCEEDED_ERR is the better approach.
UPDATE
You could also get a rough estimate of the size using JSON.stringify(localStorage).length, which seems faster than iterating over the keys and doesn't crash the browser. But keep in mind that it includes the extra JSON characters (the {, }, commas, and quotes the browser uses to wrap the keys and values), so the result will be slightly higher than the actual value.
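For instance:
var approxChars = JSON.stringify(localStorage).length;
// Slight overcount: includes the braces, commas, colons and quotes JSON adds.
console.log("Approximate size: " + approxChars + " characters");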

Related

A better way to implement in-memory-cache in javascript / nodejs

I had a question. The following code has a memory leak:
let inMemoryCache = {};
app.get("/hello", (req, resp) => {
    const unixTimeStamp = Date.now(); // a per-request timestamp key (not defined in the original snippet)
    inMemoryCache[unixTimeStamp] = { "foo": "bar" };
    resp.json({});
});
Isn't it? The size of the object inMemoryCache will keep on increasing with each request until it hits the ceiling and the heap size explodes.
What, then, is the best way to implement in-memory caches?
Manage the size of your cache somehow. For example, you could keep a fixed number of entries, and once the cache has reached its maximum size and a new entry comes in, delete the oldest (or least frequently used, or least recently used, or some other selection criterion).
Caching is harder than it seems :-)
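For illustration, here is a minimal sketch of a fixed-size cache that evicts its oldest entry; the makeCache name and the 1000-entry cap are arbitrary choices for the example:
function makeCache(maxEntries) {
    const map = new Map(); // a Map iterates its keys in insertion order
    return {
        set(key, value) {
            if (map.has(key)) map.delete(key); // re-insert so the key counts as newest
            map.set(key, value);
            if (map.size > maxEntries) {
                // evict the oldest entry: the first key in insertion order
                map.delete(map.keys().next().value);
            }
        },
        get(key) {
            return map.get(key);
        }
    };
}

const inMemoryCache = makeCache(1000);
inMemoryCache.set(Date.now(), { foo: "bar" }); // stays bounded, unlike the leaky version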

Javascript - Chrome: childnodes maximum length [duplicate]

I want to use AJAX to load an HTML file into a <div>. I will then need to run jsMath on this. Everything I have done so far with innerHTML has been a paragraph or two, maybe a table and/or image. Nothing too fancy.
What potential problems may occur when I set innerHTML to an external 25k file with all sorts of complex CSS formatting (thanks to jsMath)? I can't think of any other method of doing this, but need to know if there are any limitations.
Thanks in advance.
--Dave
I don't know about any browser-specific size limits, but if you assign a string longer than 65536 characters, Chrome splits it into many elem.childNodes, so you might have to loop over these nodes and concatenate them.
Run the snippet below in Chrome Dev Tools. It constructs a 160k string, but elem.childNodes[0] gets clipped to 65536 chars.
var longString = '1234567890';
for (var i = 0; i < 14; ++i) {
    longString = longString + longString; // doubles each pass: 10 * 2^14 = 163840 chars
}
console.log('The length of our long string: ' + longString.length);
var elem = document.createElement('div');
elem.innerHTML = longString;
var innerHtmlValue = elem.childNodes[0].nodeValue;
console.log('The length as innerHTML-childNodes[0]: ' + innerHtmlValue.length);
console.log('Num child nodes: ' + elem.childNodes.length);
Result: (Chrome version 39.0.2171.95 (64-bit), Linux Mint 17)
The length of our long string: 163840
The length as innerHTML-childNodes[0]: 65536
Num child nodes: 3
But in Firefox, innerHTML doesn't split the contents into many nodes: (Firefox version 34.0, Linux Mint 17)
"The length of our long string: 163840"
"The length as innerHTML-childNodes[0]: 163840"
"Num child nodes: 1"
So you'd need to take into account that different browsers handle childNodes differently, and perhaps iterate over all child nodes and concatenate. (I noticed this, because I tried to use innerHTML to unescape a > 100k HTML encoded string.)
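A minimal sketch of that concatenation (assuming all the children are text nodes, as they are when a plain string is assigned):
function concatChildText(elem) {
    var text = '';
    for (var i = 0; i < elem.childNodes.length; i++) {
        text += elem.childNodes[i].nodeValue;
    }
    return text;
}
In this text-only case, elem.textContent gives the same concatenated result without an explicit loop.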
In fact, in Firefox I can create an innerHTML-childNodes[0] of length 167,772,160 by looping to i < 24 above. But somewhere above this length, there is an InternalError: allocation size overflow.
There's nothing to prevent you from doing this technically. The biggest issue will be page load time. Be sure to include some sort of indication that the data is loading or it will look like nothing's happening.
In the application I am currently working on, I have not had any problems in any browser setting innerHTML to a string of 30k or more. (Don't know what the limit is)
The only kinds of limits on this type of thing are bandwidth- and processor-related. Make sure you don't have a low timeout set on your AJAX request, and test on some lower-speed computers to see if there is a memory issue. Some old browsers can be pretty unforgiving of large objects in memory.
You'll probably want to profile this with a tool like dynatrace ajax or speed tracer to understand how setting innerHTML to a really huge value affects performance. You might want to compare it with another approach like putting the new content in an iframe, or paginating the content.
Your limit will most likely be the download limit set by your web server, usually a couple of MBs. Several web frameworks allow increasing this size, but you can't just do that casually, because it means increasing the buffer size, which is not a good thing.

Maximum size of an Array in Javascript

Context: I'm building a little site that reads an RSS feed and updates/checks the feed in the background. I have one array to store data to display, and another which stores IDs of records that have been shown.
Question: How many items can an array hold in Javascript before things start getting slow or sluggish? I'm not sorting the array, but am using jQuery's inArray function to do a comparison.
The website will be left running and updating, and it's unlikely that the browser will be restarted/refreshed that often.
If I should think about clearing some records from the array, what is the best way to remove some records after a limit, like 100 items?
The maximum length until "it gets sluggish" is totally dependent on your target machine and your actual code, so you'll need to test on that (those) platform(s) to see what is acceptable.
However, the maximum length of an array according to the ECMA-262 5th Edition specification is bound by an unsigned 32-bit integer due to the ToUint32 abstract operation, so the longest possible array could have 2^32 - 1 = 4,294,967,295 = 4.29 billion elements.
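You can see the limit directly in the console (the exact error message wording varies by engine):
// The length must fit in an unsigned 32-bit integer.
var biggest = new Array(Math.pow(2, 32) - 1); // largest legal length
console.log(biggest.length); // 4294967295
try {
    var tooBig = new Array(Math.pow(2, 32)); // one past the limit
} catch (e) {
    console.log(e.name); // RangeError, e.g. "Invalid array length"
}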
No need to trim the array, simply address it as a circular buffer (index % maxlen). This will ensure it never goes over the limit (implementing a circular buffer means that once you get to the end you wrap around to the beginning again - not possible to overrun the end of the array).
For example:
var container = [];
var maxlen = 100;
var index = 0;
// 'store' 1538 items (only the last 'maxlen' items are kept)
for (var i = 0; i < 1538; i++) {
    container[index++ % maxlen] = "storing" + i;
}
// get the element at offset 11 (the 11th-oldest item still in the buffer)
eleventh = container[(index + 11) % maxlen];
// get the element at offset 35 (the 35th-oldest item still in the buffer)
thirtyfifth = container[(index + 35) % maxlen];
// print out all 100 elements that we have left in the array; note
// that it doesn't matter if we address past 100 - circular buffer,
// so we'll simply wrap back to the beginning if we do that.
for (i = 0; i < 200; i++) {
    document.write(container[(index + i) % maxlen] + "<br>\n");
}
Like #maerics said, your target machine and browser will determine performance.
But for some real world numbers, on my 2017 enterprise Chromebook, running the operation:
console.time();
Array(x).fill(0).filter(x => x < 6).length
console.timeEnd();
x=5e4 takes 16ms, good enough for 60fps
x=4e6 takes 250ms, which is noticeable but not a big deal
x=3e7 takes 1300ms, which is pretty bad
x=4e7 takes 11000ms and allocates an extra 2.5GB of memory
So around 30 million elements is a practical upper limit, because the JavaScript VM falls off a cliff at 40 million elements and will probably crash the process.
EDIT: In the code above, I'm actually filling the array with elements and looping over them, simulating the minimum of what an app might want to do with an array. If you just run Array(2**32-1) you're creating a sparse array that's closer to an empty JavaScript object with a length, like {length: 4294967295}. If you actually tried to use all those 4 billion elements, you'll definitely crash the javascript process.
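A quick way to see that sparseness:
var sparse = Array(Math.pow(2, 32) - 1);
console.log(sparse.length); // 4294967295
console.log(0 in sparse);   // false - the length is set, but no elements exist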
You could try something like this to test and trim the length:
http://jsfiddle.net/orolo/wJDXL/
var longArray = [1, 2, 3, 4, 5, 6, 7, 8];
if (longArray.length >= 6) {
    longArray.length = 3; // assigning to length truncates the array in place
}
alert(longArray); // 1, 2, 3
I have built a performance framework that manipulates and graphs millions of datasets, and even then the javascript calculation latency was on the order of tens of milliseconds. Unless you're worried about going over the array size limit, I don't think you have much to worry about.
It will be very browser dependent. 100 items doesn't sound like a large number - I expect you could go a lot higher than that. Thousands shouldn't be a problem. What may be a problem is the total memory consumption.
I have shamelessly pulled some pretty big datasets into memory, and although it did get sluggish, it took maybe 15 MB of data upwards with pretty intense calculations on the dataset. I doubt you will run into problems with memory unless you have intense calculations on the data and many, many rows. Profiling and benchmarking with different mock result sets will be your best bet to evaluate performance.

How to prevent "A script on this page is causing Internet Explorer to run slowly" without changing MaxScriptStatements in the registry?

We are using Bing and/or Google javascript map controls, sometimes with large numbers of dynamically alterable overlays.
I have read http://support.microsoft.com/kb/175500/en-us and know how to set the MaxScriptStatments registry key.
Problem is we do not want to programmatically set this or any other registry key on users' computers but would rather achieve the same effect some other way.
Is there another way?
There's hardly anything you can do besides making your script "lighter". Try to profile it and figure out where the heaviest crunching takes place, then try to optimize those parts, break them down into smaller components, call the next component with a timeout after the previous one has finished, and so on. Basically, give control back to the browser every once in a while; don't crunch everything in one function call.
Generally, a long-running script is encountered in code that is looping.
If you have to loop over a large collection of data and it can be done asynchronously - akin to another thread - then move the processing to a web worker (http://www.w3schools.com/HTML/html5_webworkers.asp).
If you cannot or do not want to use a web worker, then find the main loop that is causing the long-running script, give it a maximum number of iterations, and have it yield back to the browser using setTimeout.
Bad: (thingToProcess may be too large, resulting in a long running script)
function Process(thingToProcess) {
    var i;
    for (i = 0; i < thingToProcess.length; i++) {
        // process here
    }
}
Good: (only allows 100 iterations before yielding back)
function Process(thingToProcess, start) {
    var i;
    if (!start) start = 0;
    for (i = start; i < thingToProcess.length && i - start < 100; i++) {
        // process here
    }
    if (i < thingToProcess.length) // still more to process
        setTimeout(function () { Process(thingToProcess, i); }, 0);
}
Both can be called in the same way:
Process(myCollectionToProcess);
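If the work can move off the main thread entirely, a minimal web worker version looks roughly like this (the worker.js file name is an assumption, and the data passed must be structured-cloneable):
// worker.js (hypothetical file name)
self.onmessage = function (e) {
    var result = [];
    for (var i = 0; i < e.data.length; i++) {
        // process here - off the main thread, so no "slow script" dialog
        result.push(e.data[i]);
    }
    self.postMessage(result);
};

// main page
var worker = new Worker('worker.js');
worker.onmessage = function (e) {
    console.log('Processed ' + e.data.length + ' items');
};
worker.postMessage(myCollectionToProcess);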

