I'm creating a Chrome extension that needs to store a lot of data. I've set the unlimitedStorage permission, and I'm successfully storing information using calls to chrome.storage.local.set. However, I'm unable to make calls to chrome.storage.local.get. When I make a call as simple as chrome.storage.local.get(null, () => {}), Chrome crashes.
My issue seems similar to chrome.storage.local.get limit, except that that issue was about a much smaller amount of data and turned out to be a fluke. When I call chrome.storage.local.getBytesInUse(null, i => console.log(i)), I get a result of 176031461. (Admittedly, roughly 176 MB is a lot more than Chrome extensions should typically use, but this extension will only be running on my own machine.)
I'd like to be able to save all of this data into a JSON file, but to do that, it seems I need to bring it into memory first, and the only way to do that is through chrome.storage.local.get. I'm trying to use the method described at Chrome Extension: Local Storage, how to export to download the data, but it doesn't even get to the callback function. I'm not sure what's causing it to crash, and I don't think it's a memory limit, given that I've followed the instructions at Max memory usage of a chrome process (tab) & how do I increase it? to increase Chrome's max memory to 4 GB.
One potential solution would be to set the first parameter to something other than null and download the data in chunks, but I'd rather not do that if I don't have to.
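If I do end up chunking, I imagine it would look roughly like the sketch below. This is only a rough sketch: it assumes the extension keeps an index of its own keys, has the "downloads" permission, and runs this from an extension page; the batch size and filename are made up.

function getChunk(keys) {
  return new Promise(function (resolve) {
    chrome.storage.local.get(keys, resolve);
  });
}

async function exportAll(allKeys, batchSize) {
  var result = {};
  for (var i = 0; i < allKeys.length; i += batchSize) {
    // Fetch a manageable slice of keys at a time instead of the whole store at once
    var chunk = await getChunk(allKeys.slice(i, i + batchSize));
    Object.assign(result, chunk);
  }
  // Serialize everything and hand it to the downloads API as a JSON file
  var blob = new Blob([JSON.stringify(result)], { type: 'application/json' });
  chrome.downloads.download({
    url: URL.createObjectURL(blob),
    filename: 'storage-export.json'
  });
}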
My questions are:
What is the limit to the amount of storage that can be retrieved this way?
Is there any way for me to get all of that data at once without crashing?
Is there a better way for me to be saving files from a Chrome extension?
Related
I am having a strange problem with IndexedDB in Google Chrome. I am saving large amounts of data to IndexedDB, but the Application tab's dashboard in DevTools shows that I am using more space than my data holds. I am going to explain via two screenshots:
In this image, as you can see, my data holds only 1.7 megabytes. There is nothing else stored in IndexedDB other than these two entries. However, when I switch to the "Clear Storage" section to see the overall storage usage for this domain, I see something quite strange.
Here, it shows that there are 59.3 megabytes of data stored in IndexedDB. I honestly don't understand what the issue is. I cleared the site data and saved the same data again; the result is the same. What is the problem here?
Chrome's implementation of IndexedDB compacts space lazily, so it's unsurprising that it often shows more data use than expected. That said, this is roughly 35x what you have stored, which seems unusual.
You should create a minimal standalone repro and report it at https://new.crbug.com
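A starting point for such a repro could be to write a known amount of data and then compare it with what the browser reports for the origin; navigator.storage.estimate() gives the overall usage and quota, which should be in the same ballpark as the DevTools figure:

navigator.storage.estimate().then(function (estimate) {
  // usage and quota are reported in bytes; log them in megabytes for easier comparison
  console.log('Reported usage: ' + (estimate.usage / (1024 * 1024)).toFixed(1) + ' MB');
  console.log('Quota: ' + (estimate.quota / (1024 * 1024)).toFixed(1) + ' MB');
});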
Let's say I'm writing an extension that needs to use a few collections, say lists/arrays of strings. localStorage only allows saving data as strings, so currently I have to:
fetch the string in a tab
parse it into a collection
work with it
save it back
This has to be done EVERY TIME, for each tab/instance and for EVERY operation, because I can't be sure of the collection's integrity since another tab may have written something else into that localStorage item (which means I also have to implement a locking mechanism and respect it).
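In code, that round trip looks roughly like this (the key name is just an example):

var raw = localStorage.getItem('myList') || '[]';         // 1. fetch the string
var list = JSON.parse(raw);                                // 2. parse it into a collection
list.push('new entry');                                    // 3. work with it
localStorage.setItem('myList', JSON.stringify(list));      // 4. save it back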
What I want is some way of having a shared/global collection ("variable") that I can initialize on extension load (either at browser start or when the first tab matching by URL activates the extension code) and then re-use. That way locking is somewhat more sensible and the overall usage makes sense. I assume I can save/write it back on browser exit.
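What I imagine is something like the sketch below, which is one common pattern I've seen rather than anything specific to my setup: keep the collection in the background page and talk to it via messaging. The key and message names are made up, and it persists on every change rather than on exit, since there is no reliable "browser exit" hook.

// background.js: load the collection once and keep it in memory
var sharedList = JSON.parse(localStorage.getItem('myList') || '[]');

chrome.runtime.onMessage.addListener(function (msg, sender, sendResponse) {
  if (msg.type === 'getList') {
    sendResponse(sharedList);
  } else if (msg.type === 'addItem') {
    sharedList.push(msg.item);
    localStorage.setItem('myList', JSON.stringify(sharedList)); // persist on change
    sendResponse(sharedList);
  }
});

// content script: ask the background page instead of touching localStorage directly
chrome.runtime.sendMessage({ type: 'getList' }, function (list) {
  console.log('shared list has ' + list.length + ' items');
});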
I tried to Google this but could not find any relevant information. If my question does not make much sense, then maybe someone can point me to a write-up on how to handle data in such cases for browser extensions?
I am working on an internet speed test application. The site calculates internet speed using JavaScript. When downloading larger files (250 MB+), the browser crashes (Chrome, Opera, Firefox). It is saving all of the test files in the DOM, and I saw the browser's memory usage go up like crazy while testing connections of 100 Mbps and above.
My question: is there any way to limit the memory usage of browsers? Or is there a limit on what can be kept in the browser DOM?
After using a file, will setting it to undefined actually delete the item from system memory?
I saw this browser crash after downloading 250 MB of data plus the next 250 MB file, so about 500 MB held in the DOM.
You do not need specific data to test network speed. What matters is the size of the data (here, 250 MB). Also, to be sure you are testing the real speed, there is the additional requirement of not using trivial data (that is, not all zeros or a quickly repeating pattern).
You may generate 1 MB of random data (or whatever amount does not crash the application) and send it 250 times: the server still sees 250 MB of data, and you do not need to store anything except that 1 MB.
This works because you are testing for speed: you do not need to check that the data sent matches the data received, because the integrity of your data is already ensured by the underlying TCP/IP protocol.
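For example, a rough sketch of that approach (the /speedtest-upload endpoint is just a placeholder):

// Generate 1 MB of random data once; crypto.getRandomValues fills at most
// 65536 bytes per call, so fill the buffer in slices.
function makeRandomChunk(sizeBytes) {
  var chunk = new Uint8Array(sizeBytes);
  for (var offset = 0; offset < sizeBytes; offset += 65536) {
    crypto.getRandomValues(chunk.subarray(offset, Math.min(offset + 65536, sizeBytes)));
  }
  return chunk;
}

var oneMegabyte = makeRandomChunk(1024 * 1024);

async function runUploadTest(repetitions) {
  var start = performance.now();
  for (var i = 0; i < repetitions; i++) {
    // The server receives repetitions x 1 MB in total, but the page only ever holds 1 MB
    await fetch('/speedtest-upload', { method: 'POST', body: oneMegabyte });
  }
  var seconds = (performance.now() - start) / 1000;
  console.log('Approximate upload speed: ' + ((repetitions * 8) / seconds).toFixed(1) + ' Mbps');
}

runUploadTest(250);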
Concerning freeing JavaScript memory: what you can do is assign the variable another value (e.g. myData = null;) and make sure you do not hold any other reference to that data. It is then eligible to be garbage collected: the interpreter may or may not free it, depending on its implementation, and you have no control over that beyond this point.
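For instance (the variable name is illustrative):

var testPayload = new Uint8Array(250 * 1024 * 1024); // ~250 MB held in memory
// ... use testPayload for the test ...
testPayload = null; // drop the reference; the GC may now reclaim it, but you cannot force it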
I am testing how different browsers read/write large amounts of data to/from local storage. The sample data is 1500 customer records, each with a set of fields (first name, last name, some IDs for their location, type, etc.). The test application is built on the GWT platform.
What I noticed is that IE8, IE9, and Chrome improved their performance by at least 30% after moving to loading data from local storage (rather than from the web server). Only Firefox (5.0) got worse (around 30% slower). A remote web server was used to bring some realism into the experiment.
The difference between browsers is almost invisible with small data chunks (100-200 records), where the resulting times are roughly the same. But large amounts reveal the problem.
I found a mention of this issue on the Mozilla support site - https://support.mozilla.com/en-US/questions/750266 - but there is still no solution or workaround there.
JavaScript profiling shows that calls to a function implemented in GWT's StorageImpl.java class
function $key(storage, index) {
  return index >= 0 && index < $wnd[storage].length ?
      $wnd[storage].key(index) : null;
}
take the lion's share of execution time; this is what the storage.getItem(key) call in GWT comes down to.
To avoid these frequent calls, I would prefer a single call that translates the storage contents into a map, for example; that might save the time spent on Firefox's cache I/O operations (if any). But the Storage interface ( http://dev.w3.org/html5/webstorage/#storage-0 ) only provides getItem() for reading contents from the storage.
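Something like the sketch below is what I have in mind (plain JavaScript; in GWT it could presumably be wrapped in a JSNI method). It still calls key()/getItem() once per entry, but only once per page load, and later reads hit the in-memory map:

// Read every key/value pair once and build an ordinary object to work with in memory
function storageToMap(storage) {
  var map = {};
  for (var i = 0; i < storage.length; i++) {
    var key = storage.key(i);
    map[key] = storage.getItem(key);
  }
  return map;
}

var cached = storageToMap(window.localStorage);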
Any thoughts about how to force Firefox to work faster?
P.S. Maybe this will be useful for someone: I was able to view Firefox's local storage contents using the SQLite Manager add-on, loading the webappstore.sqlite database from the drop-down list of default built-in databases.
Which version of Firefox are you testing? Your post on support.mozilla.org mentions Firefox 3.6.8, and you mention IE, so you are presumably on Windows, in which case you're probably hitting https://bugzilla.mozilla.org/show_bug.cgi?id=536544, which was fixed in Firefox 4. Or are you seeing the problem in a recent Firefox?
So I don't actually mean browser caching of an Ajax request using the GET method, but storing the results of a large number of queries (likely double digits, each 40 - 300 kB) in the browser's memory.
What are the unseen benefits, risks associated with this?
// Store the parsed JSON response in a global cache, keyed by the query
var response = JSON.parse(xhr.responseText);
Cache[query] = response;

// Time passes, stuff is done ...

if (Cache[query])
    load(Cache[query]);
else
    Ajax(query, cache_results);
Is there an actual need? Or is it just optimization for its own sake? I'd suggest doing some profiling first and seeing where the bottlenecks lie. Remember that a web page session typically doesn't last that long, so unless you're using some kind of offline storage, the cache won't last that long either.
Not having a full view of your system, it is hard to tell, but I would think that working with stale data could be a concern.
Of course, if you have a protocol in place to resolve "cache freshness", you are on the right track... but then, why not rely on the HTTP protocol to do this? (HTTP GET with ETag/Last-Modified headers.)
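For example, a rough sketch of doing the validation by hand with XHR (the browser's own HTTP cache often handles this transparently for plain GETs, so treat this as illustration rather than a drop-in implementation):

var etagCache = {};   // url -> last ETag seen
var bodyCache = {};   // url -> last parsed body

function getWithValidation(url, onLoaded) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', url);
  if (etagCache[url]) {
    xhr.setRequestHeader('If-None-Match', etagCache[url]);
  }
  xhr.onload = function () {
    if (xhr.status === 304) {
      onLoaded(bodyCache[url]);   // not modified: reuse the cached copy
      return;
    }
    var body = JSON.parse(xhr.responseText);
    etagCache[url] = xhr.getResponseHeader('ETag');
    bodyCache[url] = body;
    onLoaded(body);
  };
  xhr.send();
}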
You'll probably want to stress-test the memory usage and general performance of various browsers when storing many 300kb strings. You can monitor them in task manager, and also use performance tools like Speed Tracer and dynatrace ajax edition.
If it turns out that caching is a performance win but it starts to get bogged down when you have too many strings in memory, you might think of trying HTML5 storage or Flash storage to store the strings--that way you can cache things across sessions as well. Dojo storage is a good library for this.
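A rough sketch of persisting the cache with HTML5 localStorage (the key prefix is arbitrary; localStorage only holds strings, so responses have to be re-serialized):

function saveToLocalCache(query, response) {
  try {
    localStorage.setItem('cache:' + query, JSON.stringify(response));
  } catch (e) {
    // Quota exceeded: keep the entry in memory only
  }
}

function loadFromLocalCache(query) {
  var raw = localStorage.getItem('cache:' + query);
  return raw ? JSON.parse(raw) : null;
}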