Setting the browser memory on the client side - JavaScript

I have an app in which the user uploads images from their computer and then draws them on a canvas.
In Chrome and Firefox I'm using FileReader, but if the user uploads a very large image it loads incompletely or not at all, and can't be drawn onto the canvas.
When I access the same file directly from the computer it works fine. So is there any way to increase the browser's memory so that large images load properly?
Or is the problem somewhere else?

In my experience you can reliably bet on a 5MB minimum for the platforms you mention above. Keep your data below that level and you should be pretty safe.
Read this article: http://diveintohtml5.ep.io/storage.html - it has some nice nuggets of info, but it's not all accurate, especially the part that says you can't raise the limit.
I know for a fact that on iPhone, once you reach the limit, the phone asks the user whether they want to allow more space (so the article is sort of accurate, but not entirely).
On Android the heap memory limit is set at 12MB; I'm not sure about the other platforms. Since you are going to be running in some kind of web container (WebKit or other), I wouldn't worry too much about it. The containers themselves are pretty good at managing memory and implementing file caches to minimize their footprint.
I recommend you leave memory optimizations and such for last. Who knows, you might not even need them. Don't optimize prematurely.
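One memory-related detail worth knowing: FileReader's readAsDataURL produces a base64 string roughly a third larger than the raw file, and that string sits in JS memory. An object URL avoids that copy entirely. A minimal sketch, assuming a file input and a canvas already exist on the page (drawFileToCanvas is an illustrative name, not an existing API):

```javascript
// Sketch: draw a user-selected file onto a canvas via an object URL
// instead of a base64 data: string, which saves memory on big images.
function drawFileToCanvas(file, canvas) {
  var ctx = canvas.getContext('2d');
  var url = URL.createObjectURL(file); // cheap reference, no base64 copy
  var img = new Image();
  img.onload = function () {
    ctx.drawImage(img, 0, 0, canvas.width, canvas.height);
    URL.revokeObjectURL(url); // release the reference once drawn
  };
  img.src = url;
}
```

Wire it to a file input with something like drawFileToCanvas(input.files[0], canvas) in the input's change handler.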
PS:
Look at PhoneGap: http://phonegap.com/

Related

The canvas has been tainted by cross-origin data (local image)

This question has been asked a lot, but I just don't understand why this is happening to me.
Basically, I have a canvas, and an image, and when I try to do this:
var canvas = document.getElementById('somecanvas');
var ctx = canvas.getContext('2d');
var someimage = document.createElement('img');
someimage.setAttribute('src', 'img/someimage.png');
someimage.onload = function(){
    ctx.drawImage(someimage, 0, 0, canvas.width, canvas.height);
    var data = ctx.getImageData(0, 0, canvas.width, canvas.height);
};
I get the unsightly:
"Uncaught DOMException: Failed to execute 'getImageData' on 'CanvasRenderingContext2D': The canvas has been tainted by cross-origin data.
at HTMLImageElement.someimage.onload"
I should mention that I'm fairly new to programming, and even more so to JavaScript. Should this be happening when I'm running it from file://?
I haven't found anyone having the exact same problem as me, and the explanations people got for the other questions had to do with the server the images were hosted on. But in this case it isn't hosted on a server, so I'm confused as to how it all works. Or rather, doesn't work.
For security reasons, many browsers will complain if you try to do certain things (canvas image drawing among them), if you use a file:// URL.
You really should serve both the page and the images from a local HTTP server in order to avoid those restrictions.
Ah, you've hit the CORS restriction, and I'm guessing here that you're encountering this in Google Chrome, which is notorious for being the most aggressive implementor of this. I've seen this a LOT.
CORS is a protocol that prevents cross-origin content from being inserted into a web page. It not only affects script files (as you might expect, because you don't want anyone to be able to inject malicious scripts into your web page), but also resources such as images and fonts.
The reason it affects images is that malicious individuals discovered they could use the HTML5 canvas object to copy the contents of your web page to a PNG file and hoover up personal data from it at will. You can imagine what would happen if this occurred while you were engaging in Internet banking transactions!
But, and this is the annoying part you're encountering, stopping such malicious activity also impinges on legitimate uses of cross-origin resources (e.g., keeping all your images in a separate repository).
So how do you get around this?
On Firefox, you shouldn't have a problem. Firefox applies some intelligence to the matter and recognises that images coming from the same file:// location as your web page are not actually cross-origin. It lets these through, as long as they're in the same directory on your hard drive as your web page.
Chrome, on the other hand, is much less permissive. It treats all such accesses as cross-origin, and implements security shutdowns the moment you try using getImageData() and putImageData() on a canvas.
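As an aside: when the image genuinely lives on another origin and that server sends an Access-Control-Allow-Origin header, you can opt in from the client and the canvas stays clean. A sketch of that case (loadImageForPixels is my name for it, not a standard API), which doesn't help with file:// but is the fix once you're on a real server:

```javascript
// Sketch: request an image with CORS so getImageData() won't throw.
// Only works if the image's server sends Access-Control-Allow-Origin.
function loadImageForPixels(src, ctx, canvas) {
  var img = new Image();
  img.crossOrigin = 'anonymous'; // ask for a CORS-enabled fetch
  img.onload = function () {
    ctx.drawImage(img, 0, 0, canvas.width, canvas.height);
    var data = ctx.getImageData(0, 0, canvas.width, canvas.height);
    // ...use data.data (a Uint8ClampedArray of RGBA bytes)...
  };
  img.src = src;
}
```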
There is a workaround if you don't want to go to the trouble of installing and configuring your own local web server, but still want to use Chrome and its nice, friendly debugger. Create a shortcut that points to your Chrome executable, but which starts Chrome with a special command-line flag:
--allow-file-access-from-files
Save this shortcut, labelling it something like "Chrome Debug Version" to remind you to use it ONLY for debugging your own code (you should never browse the Internet proper with weakened security!), and you should be able to debug your code without issues from then on.
But, if you're going to be doing a lot of debugging of this sort, the better long-term solution, is to install a web server, and configure it to serve up code from your desired directories every time you use the "localhost" URL. This is, I know, tedious and time consuming, and distracts from your desire to get coding, but once it's done, it's done and dusted, and solves your woes permanently.
If you really want to put your programming skills to the test, you could even write your own web server to do the job, using something like the node.js server side framework, but if you're new to JavaScript, that's a task you're better leaving until you have a lot more experience! But once your skills reach that point, doing that is a nice educational exercise that will also solve some of your other woes, once you've worked out how a web server works.
If you run with an established web server, you then, of course, have the fun of deciding which one involves the least headaches. Apache is powerful, but big. Hiawatha is both lightweight and secure, and would be my first choice if it wasn't for the fact that a 64-bit version is still not available (sigh), because the 32-bit version that ran on my old XP box was a joy to use. Nginx I know little about, but some people like it. Caveat emptor and all that.

Detect memory exhaustion in the browser before it crashes

When a JavaScript client application uses too much memory, the browser will either crash, throw an exception that can't be recovered from, or swap like it's the 80s.
Do browsers signal that they almost reached the available memory limit for a tab?
Ideally, I'd love to be able to catch an event that I can intercept in JavaScript when the browser is running low on memory in order to automatically fall back to a light version of the application or tell my users to go buy a new computer / phone.
I know Chrome Performance Tools allow imprecise querying of the used memory, which is a first step, but probably not enough to detect memory limitations.
No, there's no cross-browser way to detect this unfortunately. This is discussed a little bit in this answer.
There is window.performance.memory but that is only available in Chrome.
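A hedged sketch of using it where it does exist; the API is non-standard, absent outside Chrome, and the numbers it returns are deliberately coarse (switchToLightVersion here is a hypothetical function you would supply):

```javascript
// Chrome-only heap check; returns null wherever performance.memory
// is unavailable (Firefox, Safari, Node, ...).
function heapUsageRatio() {
  var mem = (typeof performance !== 'undefined') && performance.memory;
  if (!mem) return null;
  return mem.usedJSHeapSize / mem.jsHeapSizeLimit; // fraction in (0, 1]
}

// Possible usage: poll and fall back to the light version near the limit.
// setInterval(function () {
//   var r = heapUsageRatio();
//   if (r !== null && r > 0.9) switchToLightVersion(); // hypothetical
// }, 5000);
```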
I'm not aware of any really good workarounds for this either. You could perhaps check for old browsers or browsers that don't have particular features ("feature detection") and suggest that users with older browsers use your "light" version, since those are the people most likely to have low-powered devices.
Another possibility would be to see how long some particular operations take, and if they take too long then recommend the light version. Again a very blunt solution.
The answer is in the speed of the browser.
Browsers deliberately don't report memory exactly, to hinder fingerprinting.
So, as a workaround, http://thisbeautiful.w3spaces.com/notbad.htm contains code that times a loop, run from an interval, like this:
JavaScript:
momentum = Date.now();
for (itr = 1; itr < 770; itr++) {}
momentumTwo = Date.now() - momentum;
if (momentumTwo > 3) {
    // take action
}
// wrap it in an interval to re-check every second
Summary: the code measures how long a fixed loop takes and, if the browser has slowed down enough to suggest an imminent crash, takes action.
Using an external program that monitors the browser's memory would be a better solution, as browsers themselves aren't fully able to do such a thing.

Fetching HTML data to load pages faster - Opera Turbo

I've discovered a pretty cool feature in Opera that changes how websites are fetched: it shows lower-resolution images and the like, so pages load fast. This is great for slow connections.
I'm interested in the background of this little feature; with my basic knowledge of CSS, HTML and JavaScript I don't understand how it can be done. Can anyone explain how it works?
Take images, for instance: the image still needs to be downloaded first and then converted to a lower-resolution one, so where do we "win" time? The image is still being downloaded, right?
Sad to say, it's non-trivial to achieve what you are trying to do yourself. Take a look at how Opera Turbo describes it:
How we squeeze out all that speed
When Opera Turbo is enabled, webpages are compressed via Opera's servers so that they use much less data than the originals. This means that there is less to download, so you can see your webpages more quickly.
Enabling Opera Turbo is as simple as clicking the Opera Turbo icon at the bottom-left of the Opera browser window. When you are on a fast connection again and Opera Turbo is not needed, the Opera browser will automatically disable it.
Your best bet is to follow up on How do I check connection type (WiFi/LAN/WWAN) using HTML5/JavaScript? and load your images according to the connection type. Be aware, though, that the connection type cannot accurately tell you the network speed: a device can be on 3G or LTE and still get terrible speeds from its provider.
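A sketch along those lines using the Network Information API; support is patchy, navigator.connection is missing in many browsers, and the thresholds here are my guesses, not established values:

```javascript
// Returns true when the connection looks slow enough to justify
// serving low-resolution images. Defaults to full quality when the
// non-standard navigator.connection API is unavailable.
function preferLowResImages() {
  var conn = (typeof navigator !== 'undefined') && navigator.connection;
  if (!conn || !conn.effectiveType) return false;
  // effectiveType is one of 'slow-2g', '2g', '3g', '4g'
  return conn.effectiveType === 'slow-2g' || conn.effectiveType === '2g';
}
```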
If you really want to implement this feature and have it work across browsers and devices, I can suggest lazy-load plugins like Unveil, which will reduce the amount of data fetched on load. Or include a button within your page that lets the user select a low-bandwidth option, something like what Gmail does.
Turbo mode is a really great feature of Opera.
In short, Opera's servers fetch the complete code and images, compress them, and send the compressed version to the user; the browser then decompresses the data on the fly. Turbo mode is a big traffic-saver (up to 80%). Images in Turbo mode are of almost unusable quality, but the mode really comes in handy on an extremely slow connection.
You can check out the official documentation for more info. Also, check out my old post where I wrote about Turbo mode; there you can find more info and useful links on this topic.
Additionally, look at the info on the opera-turbo tag here on Stack Overflow.

HTML5 LocalStorage limitations when combined with offline cache & JS memory on smartphones

I've already asked a question about JavaScript memory and HTML5 LocalStorage limitations on smartphones, but the problem has since become more specific.
I need to store a lot more data for offline usage, a big part of it dictionaries. My idea was to ship the dictionaries as JavaScript files (which simply load array data into JS variables) that get cached for offline usage. Business data for offline use would be stored in LocalStorage. Additionally, JS memory would hold some cache for online usage, to avoid loading the same entity from the server more than once.
So my question: does using a big offline cache (say 4MB) and storing a lot in memory reduce the storage available to LocalStorage? Could it, say, become limited to 3MB because of heavy offline-cache usage? Does anyone have experience with such applications, and with problems in particular browsers on mobile devices?
The answer to the similar question Application Cache Manifest + Local Storage Size Limit does not provide the information I need, since, as far as I understood, the author tested the offline-cache limit and the LocalStorage limit separately.
I'm even more worried about JS memory: I fear the browser can be closed without any warning, and testing on one device would not mean the application won't crash on another, less powerful one.
So please write if you have tested the limits of mobile browsers. Posts which only give clues where to search further, or which describe test scenarios that haven't yet produced results, will also be appreciated. The topic is quite new, so I'm aware most research is still in draft form.
update
I've updated my referenced question about LocalStorage limitations with a test on Opera Mobile 11, in which I was able to store well above the 5MB limit.
Also, according to the post Increase iPad cache over 50 MB?, at least on the iPad it is possible to store 50MB of data; I hope to test on the iPhone soon.
In almost all cases, mobile devices have a hard 5MB limit on all storage for a domain. That includes local/session storage, IndexedDB, WebSQL, the application cache and everything else.
The best way to store data in this case is to use local storage or WebSQL for your dictionary data, but instead of storing all of the data, store the 80%-use-case content and provide an easy way to load the additional information as necessary.
HTH,
Pete
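If you would rather measure the quota than guess at it, one rough approach is to keep writing until setItem throws the quota error. A sketch (the key names and 1MB granularity are my choices; numbers are approximate because most engines store strings as UTF-16, i.e. two bytes per character):

```javascript
// Rough probe of how many megabytes localStorage will accept.
// Returns null where localStorage is unavailable; cleans up after itself.
function probeLocalStorageMB(maxMB) {
  if (typeof localStorage === 'undefined') return null;
  // ~512K characters at 2 bytes each is roughly 1MB of storage.
  var chunk = new Array(512 * 1024 + 1).join('x');
  var stored = 0;
  try {
    for (; stored < (maxMB || 10); stored++) {
      localStorage.setItem('__probe' + stored, chunk);
    }
  } catch (e) {
    // QuotaExceededError: we've hit the ceiling
  }
  for (var i = 0; i <= stored; i++) localStorage.removeItem('__probe' + i);
  return stored; // whole MB successfully written before the quota hit
}
```

Run it once at startup and pick your storage strategy (full dictionaries vs. the 80% subset) based on the result.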
After experimenting a bit, I've found that Opera does not limit the resources: it asks the user to increase the LocalStorage limit, and that limit is not affected by storing files in the application cache.
Firefox has a configurable LocalStorage limit, so in theory it is possible to store a large amount of data there, also unaffected by the application cache. However, Mobile Firefox is as badly written as its desktop brother and manages resources so ineffectively that it gets killed by Android when storage is in intensive use. So I would discourage using Firefox, at least on Android devices.
The Android Browser, on the other hand, seems to have a 2.5MB limit, unaffected by the application cache; that is lower than it should be, given the suspicion that you get at least 5MB to use. This browser also fails to provide an accurate GPS location (which is, AFAIK, a feature, not a bug - security reasons), so for me it is not an option to support.

Script concatenation performance characteristics on memory constrained devices

We are currently trying to optimize page load times on a memory- and CPU-constrained Windows Mobile device (the MC9090: 624MHz CPU, 64MB RAM, running WinMo 6 and Opera Mobile 9.5). As one of the potential optimizations, I wanted to combine external JS/CSS files, but now we are observing that page load times actually INCREASE after concatenation.
That flies in the face of everything I have seen and read about client-side performance. Has anyone seen this before? Also, are there different rules for embedded-device web optimization than for desktop browsers?
Browsers can download across multiple connections (4, sometimes 6 - not sure what this version of Opera allows). Putting everything in one file forces the browser to download all the JavaScript in a single stream instead of several, which could be slowing it down.
Images and other items also compete for these connections. You really need a tool similar to Firebug's "net" tab to view the traffic and see how the bandwidth is allocated.
Also, see: http://loadimpact.com/pageanalyzer.php
out of date but still relevant: http://www.stevesouders.com/blog/2008/03/20/roundup-on-parallel-connections/
