How do browsers handle large numbers of image files? - javascript

Is the ability of a browser (Chrome, for example) to handle a large number of images limited only by the hardware of the computer it runs on, or also by the software itself?
I'm trying to develop an image viewer that must hold a lot of files and make them instantly accessible on demand, and sometimes when I go over 350 files of about 300 KB each the page freezes.
Thank you all for your help!!

It is probably limited by both.
There are limits to how large a struct is allowed to be, for instance, because it has to fit in a fixed number of bytes (see this question for an example).
(Also, you're not yet running into the maximum size of an int, so that is probably not what is happening right now.)
Besides these, there are several more constraints.
That said, loading a gigantic number of images every time you open a page is probably not a good idea.
Take some inspiration from how others (like Google in their image search) have solved this problem: simply don't load the images until they are needed.
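
One common way to do that (not necessarily Google's, and the data-src attribute is an assumption) is to leave each image's src unset until it scrolls near the viewport, using the standard IntersectionObserver API:

    // Lazy-load images whose real URL lives in a data-src attribute,
    // so nothing is downloaded until the image nears the viewport.
    const observer = new IntersectionObserver((entries) => {
      for (const entry of entries) {
        if (entry.isIntersecting) {
          const img = entry.target;
          img.src = img.dataset.src;   // start the real download
          observer.unobserve(img);     // each image only needs this once
        }
      }
    }, { rootMargin: '200px' });       // begin loading shortly before visible

    document.querySelectorAll('img[data-src]').forEach((img) => observer.observe(img));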

I think the best approach would be to have a small thumbnail of each image and, on the user's request (a click), load the bigger one, like Timothy said.
If you need it to be faster you can preload images. For example, if you have a list of images and the user scrolls through it, you just load the next n images. To free space you can "delete" the ones the user has already passed.
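
A minimal sketch of that sliding-window preloading (the urls array and window size are assumptions):

    // Keep only a window of Image objects alive around the current index.
    const urls = [/* ... image URLs, supplied elsewhere ... */];
    const windowSize = 10;
    const cache = new Map();   // index -> Image

    function updateWindow(current) {
      // Preload the next images in the window.
      for (let i = current; i < Math.min(current + windowSize, urls.length); i++) {
        if (!cache.has(i)) {
          const img = new Image();
          img.src = urls[i];   // the browser starts fetching immediately
          cache.set(i, img);
        }
      }
      // "Delete" images the user has already passed so they can be GC'd.
      for (const [i, img] of cache) {
        if (i < current - windowSize || i > current + windowSize) {
          img.src = '';        // drop the reference to the decoded data
          cache.delete(i);
        }
      }
    }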

Related

Is it possible to interlace every image on my website?

I am creating a website which contains posts. Every post has an image. The feed page of the website is slow because every image takes a long time to download, and while it is downloading it is a blank space, which does not look nice. Images are uploaded by users and come in different formats. Is it possible to somehow make images first appear blurry and then become less and less blurry as they load (I've read this is called interlacing)?
I have tried searching on the internet but didn't find anything on the topic except the use of the .jpeg format, which does not seem to work: even the .jpegs I have as post images are not loading interlaced, and I cannot exactly force people to use only that format.
Images take time to load because they are relatively large files and the data in them has to be transferred over the network.
Interlacing is a feature of the JPEG format that gives visible progress as the image loads by reordering the image data so it doesn't run sequentially (Wikipedia has diagrams and animations which might be helpful in understanding it).
If you want interlacing, it has to be encoded in the image itself; you can't really change that from outside the image.
While it is impractical to ask users to upload images encoded that way, you can write server-side code to convert images to meet your standards. When converting them you could also perform other optimisations (such as reducing the dimensions or increasing the compression level) that would speed up the load time. Of course, this will have an impact on image quality, which might not be desirable. (Consider Flickr, for example, which provides a degraded image (with a smaller file size) by default but allows switching to higher-quality or original images on demand.)
You could also consider displaying placeholder images while the full version loads. These could also be generated server-side from the uploaded images, with extremely high levels of compression and resizing, so you display a small number of large pixels that give a vague impression of the average colour of each chunk of the image until the real image is available to the browser.
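
As a sketch of that server-side conversion, here is how it might look in Node.js with the sharp library (file paths, quality settings, and placeholder width are illustrative assumptions):

    // Re-encode an uploaded image as a progressive (interlaced) JPEG,
    // and emit a tiny, heavily compressed placeholder alongside it.
    const sharp = require('sharp');

    async function processUpload(inputPath) {
      // Progressive JPEG: the browser renders it coarse-to-fine as it loads.
      await sharp(inputPath)
        .jpeg({ progressive: true, quality: 80 })
        .toFile(inputPath + '.progressive.jpg');

      // Placeholder: a few large "pixels" giving the average colours.
      await sharp(inputPath)
        .resize(16)                    // 16px wide; height keeps aspect ratio
        .jpeg({ quality: 40 })
        .toFile(inputPath + '.placeholder.jpg');
    }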

Optimizing online photo gallery for retina devices

I have an existing website (a photo blog) that loads the majority of the photos from Flickr. I'd like to enhance the experience for users with high resolution screens and load higher res versions of photos, but I'm not sure what would be the best way to go.
Since the images in question are not UI elements, there is no sensible way to solve this problem with CSS. That leaves either client-side JavaScript, or a server-side find-and-replace for specific image patterns (since they come from Flickr, it's easy to detect them and easy enough to figure out the URL for a double-sized image).
For client side, my concern is that even the regular-sized images are 500-800 KB in size, and there can be 10-30 images per gallery, causing lots of excess bandwidth use for retina users.
For server side, it's obviously tricky to determine whether the request comes from a retina device or not. One idea I had (which I have yet to test out) was to run a JavaScript function that checks window.devicePixelRatio and sets a cookie accordingly, so that on each successive page request the server knows if the device is high-res or not. That leaves the entry page with non-retina images, but at least all the following ones will have high-res images loaded right away.
Are there any pitfalls to this proposed solution? Are there better ways to handle it?
You can generate CSS on the server that contains links to both regular-size and double-size images in the background-image property. This CSS can easily differ for every page by including it in a <style> tag and referencing classes/IDs that only exist on that page. This deals with the traffic issue, since the majority of modern browsers don't load background images intended for other DPIs.
Another solution (though a worse one) is to not just set a cookie but load the images with JavaScript. This solves the initial first-page issue, but slows down page rendering a little.
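
A minimal sketch of the cookie idea from the question (the cookie name dpr and the one-year lifetime are assumptions):

    // Record the device pixel ratio in a cookie on first load, so the
    // server can pick retina assets on every subsequent request.
    (function () {
      var dpr = window.devicePixelRatio || 1;   // 1 on non-retina devices
      document.cookie = 'dpr=' + dpr + '; path=/; max-age=31536000';
    })();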

JavaScript video with images

I know that it's possible to take a series of images and use JavaScript to set each one's opacity from 1 to 0 very quickly in rapid succession. I have actually done it before very successfully, although I only used 41 images at approx. 720p.
My question, though, is whether or not it would be practical to make an entire video (4-10 minutes long) using only HTML, CSS, and JavaScript. Obviously that would be too many images to leave in the cache, so you'd have to empty the cache of particular images every so often, and this would also require a pretty good internet connection, but do you think it would be worthwhile to attempt?
Are there any obvious pros and cons you can think of for trying this?
(To be clear, I'm not asking for the code to achieve it, as I have already developed most of it; just what you think of its practicality as opposed to YouTube or Vimeo etc...)
Plainly put, I don't think it's practical.
Here's a few reasons why:
Number of images. Videos tend to run at around 30 frames per second. For a 4-minute video, that's roughly 7,200 images.
Download time. Downloading 7,200 images at 720 pixels high would require a significant amount of bandwidth, not only for the user, but also for the server.
DOM load. 7,200 images would require just as many DOM elements positioned correctly behind each other.
Rendering. Each time we do an animation (fade-out), the browser has to calculate what element is visible and what that means to the user (pixel colour, etc.).
Of course, we can optimize this:
Download the images that are needed at that specific time. If the connection can download 30 images/second, then we load what we need and then initiate the play sequence.
Remove elements when they're done. As the sequence goes on, we can assume that earlier images are no longer needed; we can destroy their elements and thus free up a few resources.
Put images on multiple subdomains/locations. Browsers limit how many assets they download from a single domain at once; modern browsers may allow six or more simultaneous connections. If we split the assets across multiple subdomains, we can increase the number of simultaneous downloads and thus improve response time.
Create large sprites. Have the sequence on one or several large sprites, and use JavaScript to correctly position the element (see the sketch below). This gives us a single download (or a few) and removes the overhead of many separate requests.
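
A minimal sketch of that sprite technique (frame dimensions, frame count, element ID, and sprite URL are illustrative assumptions): one tall sprite sheet sits behind a fixed-size element, and the background offset is stepped one frame at a time.

    // Play a film strip: one tall sprite image, stepped one frame at a time.
    var FRAME_W = 1280, FRAME_H = 720, FRAME_COUNT = 300, FPS = 30;
    var el = document.getElementById('screen');        // hypothetical viewport div
    el.style.width = FRAME_W + 'px';
    el.style.height = FRAME_H + 'px';
    el.style.background = 'url(sprite.jpg) no-repeat'; // hypothetical sprite sheet

    var frame = 0, last = 0;
    function tick(now) {
      if (now - last >= 1000 / FPS) {
        last = now;
        // Shift the sprite up by one frame height to show the next frame.
        el.style.backgroundPosition = '0 ' + (-frame * FRAME_H) + 'px';
        frame = (frame + 1) % FRAME_COUNT;
      }
      requestAnimationFrame(tick);
    }
    requestAnimationFrame(tick);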
Videos are encoded so that the browser doesn't have to do all those pixel calculations and animations. Each frame isn't a full image, but rather a delta between the current and next frame. (I'm oversimplifying this.)
That said, this method is commonly used for pieces that need interactivity, or that want to get past iOS auto-play restrictions. Again, not practical for long sequences, but definitely doable for shorter ones.

Javascript runs slowly in Safari / iPad2 after loading 200mb worth of new Images(). Why might this be?

Does anyone know why JavaScript performance would be affected by loading lots of external JPG/PNG images into HTML5 Image() objects, totalling approx. 200-250 MB? Performance also seems to be affected by the cache, i.e. if the cache is full(-ish) from previous browsing, performance on the current site is greatly reduced.
There are two ways I can crudely solve it:
Clear the cache manually.
Minimise the browser, wait about 20 seconds, and re-open it, by which time iOS/the browser has reclaimed the memory and runs the JS as it should.
I would have expected iOS to reclaim the memory required to run the current task, but it seems not to. Another workaround is to load 200 MB of 'cache clearing' images into Image() objects, then remove them by setting src = "". This does seem to help, but it's not an elegant solution...
please help?
First and foremost, read the excellent post on the LinkedIn Engineering blog. Read it carefully and check if there are some optimizations you can also try in your application. If you've tried all of them and they still haven't solved your performance issues, read on.
I assume that you have some image gallery or magazine-style content area on your page.
How about having this image area in a separate iframe? What you could do then is this:
Have two iframes. Only one should be visible and active at a time.
Load images into the first iframe. Track the size of the loaded images; if exact size tracking is hard, numberOfLoadedImages * averageImageSize might be a pretty good approximation.
As that number approaches some threshold, start preloading the currently visible content into the second iframe.
Flip the visibility of the iframes so the second one becomes active.
Clear the inner content of the first frame.
Repeat the whole procedure as necessary.
I don't know for sure if this would work for you, but I hope that the WebKit engine on the iPad clears the memory of frames independently.
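
A rough sketch of that double-buffered iframe idea (the byte threshold, iframe IDs, and the renderCurrentContent helper are all assumptions, not a known working recipe):

    // Double-buffered iframes: fill one, and once its estimated image
    // payload nears a threshold, flip to the other and wipe the first.
    var THRESHOLD_BYTES = 150 * 1024 * 1024;  // assumed ~150 MB budget
    var AVG_IMAGE_BYTES = 300 * 1024;         // assumed average image size
    var bufs = [
      document.getElementById('buf-a'),       // hypothetical iframe IDs
      document.getElementById('buf-b')
    ];
    var active = 0;
    var loadedImages = 0;

    function noteImageLoaded() {              // call from each image's onload
      loadedImages++;
      if (loadedImages * AVG_IMAGE_BYTES >= THRESHOLD_BYTES) flip();
    }

    function flip() {
      var next = 1 - active;
      renderCurrentContent(bufs[next]);       // hypothetical: rebuild the view there
      bufs[next].style.display = 'block';
      bufs[active].style.display = 'none';
      bufs[active].src = 'about:blank';       // hope WebKit frees its images
      active = next;
      loadedImages = 0;
    }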
EDIT: It turned out you're writing a game.
If it's a game I assume that you want to have many game objects on the screen at the same time and you won't be able to simply unload some parts of them. Here are some suggestions for that case:
Don't use DOM for games: it's too memory-heavy. Fortunately, you're using canvas already.
Sprite your images. Image sprites not only help reduce the number of requests; they also let you reduce the number of Image objects and keep the per-file overhead lower. Read about using sprites for canvas animations on the IE blog.
Optimize your images. There are several file size optimizers for images. SmushIt is one of them. Try it for your images. Pay attention to other techniques discussed in this great series by Stoyan Stefanov at YUI blog.
Try vector graphics. SVG is awesome and canvg can draw it on top of canvas.
Try simplifying your game world. Maybe some background objects don't need to be that detailed. Or maybe you can get away with fewer sprites for them. Or you can use image filters and masks for different objects of the same group. As Dave Newton said, the iPad is a very constrained device, and chances are you can get away with relatively low-quality sprites.
These were all suggestions related to reduction of data you have to load. Some other suggestions that might work for you.
Preload images that you will need and unload images that you no longer need. If your game has "levels" or "missions" load sprites needed only for current one.
Try loading "popular" images first and download the remaining once in background. You can use separate <iframe> for that so your main game loop won't be interrupted by downloads. You can also use cross-frame messaging in order to coordinate your downloader frame.
You can store the very most popular images in localStorage, Application Cache and WebSQL. They can provide you with 5 mb of storage each. That's 15 megs of persistent cache for you. Note that you can use typed arrays for localStorage and WebSQL. Also keep in mind that Application Cache is quite hard to work with.
Try to package your game as a PhoneGap application. This way you can save your users from downloading a huge amount of data before playing the game. 200 megs as a single download just to open a page is way too much. Most people won't even bother to wait for that.
Other than that, your initial suggestion to override the cache with your images is actually valid. Just don't do it straight away; explore the possibilities for reducing the download size of your game first.
I managed to reduce the impact by setting all the images that aren't currently in the viewport to display:none. This was with background images, though, and I haven't tested with over 100 MB of images, so I can't say whether this truly helps. But it's definitely worth trying.

Can I resize images using JavaScript (not scale, real resize)

I need to dynamically load and put on screen a huge number of images — it can be something like 1000–3000. Usually these pictures are of different sizes, and I'm getting their URLs from the user. So some of these pictures can be 1024x800 or 10x40 pixels.
I wrote a good JS script showing them nicely on one page (a la Google Image Search style), but they are still very heavy on RAM (a hundred 500 KB images on one page is not good), so I thought about the option of really resizing the images: making an image that's 1000x800 pixels into something like 100x80, and then forgetting (freeing the RAM of) the original one.
Can this be done using JavaScript (without server side processing)?
I would suggest a different approach: Use pagination.
Display, say, 15 images. Then the user clicks on 'next page' and the next page is shown.
Or, even better, you can script it so that when the user reaches the end of the page, the next page is automatically loaded.
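
A minimal sketch of that auto-loading idea (loadNextPage is a hypothetical function returning a promise):

    // Load the next page of images when the user scrolls near the bottom.
    var loading = false;
    window.addEventListener('scroll', function () {
      var nearBottom = window.innerHeight + window.scrollY
                       >= document.body.offsetHeight - 200;
      if (nearBottom && !loading) {
        loading = true;
        loadNextPage().then(function () { loading = false; });
      }
    });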
If that is not what you want and you'd rather make a collage of images, you could look at CSS3 transforms; they should be fast.
What you want to do is take some pressure off the client so that it can handle all the images. Letting it resize all the images (JavaScript is client-side) will do exactly the opposite, because actually resizing an image is usually far more expensive than just displaying it (and not possible with browser JS anyway).
Usually there is a better solution than displaying that many items at once. One would be dynamic loading, e.g. when a user scrolls down the page (like the new Facebook profiles do), or pagination. I can't imagine that all 1k-3k images will be visible at once.
There is no native JS way of doing this. You may be able to hack something together using Flash, but you really should resize the images on the server because:
You will save on bandwidth by not transferring those large 500 KB images to the client.
The client will be able to cache those images.
You'll get a faster-loading page.
You'll be able to fit a lot more thumbnail images in memory and therefore will require less pagination.
I'm (pretty) sure it can be done in browsers that support canvas. If this is a path you would like to take, you should start here.
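
A minimal sketch of the canvas path (the target width and image URL are assumptions): draw the big image scaled down onto a canvas, keep only the small result, and drop the reference to the original.

    // Downscale an image with canvas and discard the full-size original.
    function shrink(img, targetW) {
      var scale = targetW / img.naturalWidth;
      var canvas = document.createElement('canvas');
      canvas.width = targetW;
      canvas.height = Math.round(img.naturalHeight * scale);
      canvas.getContext('2d').drawImage(img, 0, 0, canvas.width, canvas.height);
      return canvas.toDataURL('image/jpeg', 0.8);  // small thumbnail data URL
    }

    var big = new Image();
    big.onload = function () {
      var thumb = new Image();
      thumb.src = shrink(big, 100);  // assumed 100px-wide thumbnail
      document.body.appendChild(thumb);
      big = null;                    // drop the reference; GC *may* reclaim it
    };
    big.src = 'huge-photo.jpg';      // hypothetical URL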
I see two possible problems with the canvas approach:
It will probably take a really long time (relatively speaking) to resize many images. Because of this, you're probably going to have to look into using web workers.
Will the browser actually free up any memory if you remove the image from the DOM and/or delete/null all references to it? I don't know.
