Many, many sites use this technique (Facebook and Google included).
For example, open facebook.com (the login page).
Save the page, not as *.MHTML but as "HTML with images".
It saves:
facebook_files (directory)
facebook.html (file)
Then, inside the folder, you can see one GIF file which contains all the primary images for the page.
The question is: how do you read many image chunks inside one file?
And what is this approach called?
Those images are called "sprites". Take a look at this article on them.
The basic idea is that whenever you want to use an image from the sprite, you have an element which just shows part of the big sprite image. So each "image" in your page is actually a div with this image as the background, just offset so the right part shows through.
The main advantage is that your page needs fewer requests and so loads faster.
There are some tools online that make using sprites easier. I haven't used any of them so can't recommend one, but using a tool would save you a lot of work.
This is what you call "spriting", like the spriting used in arcade games (one image of a character with all its different poses). Basically, it's one huge image containing many smaller images.
The advantage of this approach is that instead of 100 different HTTP requests for 100 tiny GIFs (which causes overhead), you only need to request one big image containing those 100 GIFs. Then, instead of using an <img> per image, you use a CSS background and set background-position so the right part of the sprite shows through the container.
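As a rough sketch of the technique (the sprite file name, icon size, and class names below are all made up for illustration):

    /* Hypothetical sprite sheet: icons.png is 60x20 and holds three 20x20 icons in a row. */
    .icon {
        width: 20px;
        height: 20px;
        background-image: url("icons.png");
        background-repeat: no-repeat;
    }

    /* Shift the sheet left so the wanted 20x20 region shows through. */
    .icon-home   { background-position: 0 0; }
    .icon-search { background-position: -20px 0; }
    .icon-user   { background-position: -40px 0; }

Then <div class="icon icon-search"></div> displays only the middle icon, and all three icons come down in a single request.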
I'm looking to do something along the lines of how niice.co loads its images via lazy loading.
As you can see on their website when scrolling, before an image is lazy-loaded, the background of the div where the image will be placed is the dominant colour of that image.
I've been looking around and found some plugins such as http://briangonzalez.github.io/jquery.adaptive-backgrounds.js/, but I don't know how they're merging the two.
That plugin alone will not work: the image isn't available while it's still loading. You can compute the colour in your backend instead. Then save the colour in a database, output the hex as a data attribute, and apply it as the background in JavaScript or directly in the element's style. Another solution, if the colour isn't saved yet: use the adaptive-backgrounds library once, call a backend API to save the hex, and on the next request use the stored hex in JavaScript or in the element's style.
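A minimal sketch of the client side of that flow, assuming the backend has already rendered the stored hex into the markup (the data-bg-color and data-src attribute names are made up for illustration):

    // Assumed markup, rendered by the backend with the hex from the database:
    //   <div class="lazy-image" data-bg-color="#8c6d4f" data-src="photo.jpg"></div>
    document.querySelectorAll('.lazy-image').forEach(function (el) {
        // Paint the dominant colour immediately, before any image bytes arrive.
        el.style.backgroundColor = el.dataset.bgColor;

        // Lazy-load the real image, then swap it in over the colour.
        var img = new Image();
        img.onload = function () {
            el.style.backgroundImage = 'url(' + el.dataset.src + ')';
        };
        img.src = el.dataset.src;
    });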
Edit:
Thanks to Empiric for the how-to link: https://manu.ninja/dominant-colors-for-lazy-loading-images
One more performance tip: don't load all images at once. It's better UX to split the requests. If you have 8 images, 4 per row, load the first 4 images; once they have finished, load the second row, and so on (see the sketch below). One more important thing: if you have small profile images and the like (as Stack Overflow does), don't load them while the content images are loading. The visitor's focal point is the most important thing.
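A rough sketch of that row-by-row loading, assuming a batch size of 4 and a hypothetical makeImageEl callback that creates and appends each <img>:

    // Load images in batches, starting the next batch only when
    // every image in the current one has finished (or failed).
    function loadInBatches(urls, batchSize, makeImageEl) {
        var batch = urls.slice(0, batchSize);
        if (batch.length === 0) return;

        var remaining = batch.length;
        batch.forEach(function (url) {
            var img = makeImageEl(url);        // create the element, no src yet
            img.onload = img.onerror = function () {
                if (--remaining === 0) {
                    loadInBatches(urls.slice(batchSize), batchSize, makeImageEl);
                }
            };
            img.src = url;                     // start this download
        });
    }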
I have a website which contains some images in index.php
The problem I am facing is that the whole page doesn't load at once; I think the images are taking some time to load.
So what I have done is show a loading image first and then reveal the page after some time, which works around the problem. But I am curious to know: is there a better way to do this?
I prefer to optimise the hell out of my images.
PNG images
You can use pngcrush to optimise your PNG files for you, though personally I find that once I'm done optimising by hand, pngcrush only succeeds in making them bigger.
Use Indexed-PNG wherever possible. This will limit you to 256 colours, and most graphics editors won't allow partial transparency in Indexed-PNG (it is possible, you just need the right editor; I use a custom PHP script with the GD image library), but you can expect the file size to drop to a tiny fraction of what it was.
Reduce the number of colours overall. PNG compression works best with blocks of the same colour, so fewer colours means better compression.
GIF images
Especially for animations, there are a lot of things you can do.
Reduce the number of frames. Avoid duplicate frames at all costs, and just set the previous frame to have a longer display time.
Use combine rather than replace if possible. You will again have problems with transparent areas, but by using combine you can have each subsequent frame only change the stuff that... changes. This avoids the redundancy of re-writing the entire image if only a small part changes. GIMP has a useful filter "Animation > Optimize for GIF" which will do this for you.
Reduce colours as much as possible. GIF is limited to 256 colours, but if you can limit yourself to 32 or so, you'll get a much smaller file.
Using the above techniques, I once managed to shove 8MB of raw image data into a 125kb animated GIF.
JPG images
JPG is great for photos, but cameras have a tendency to write MASSIVE files.
Play around with the compression factor. Start at around 40%, and slowly bring it up until it looks acceptable. GIMP will show you a preview and the resulting filesize, so make use of that to find an acceptable compromise.
Scale the image down. You don't need 9 megapixels, or whatever massive resolutions cameras capture these days...
The above should help you cut down the size taken up by your images. Obviously, you should also cache images appropriately so they only need to be retrieved once. Also make sure that you specify width and height on image elements so that the browser can reserve space for them and avoid the page jumping around as they load...
And you should be pretty good.
It's hard to say what other options are available without knowing what the page looks like, but one option is to reserve space for the images so that the page text renders quickly in the correct position, and the images then load later.
I have several thousand images on disk, and I need to display a subset of them based on user search criteria. This means I could be showing hundreds at one time. Each image is large: 3510x1131, to be exact. And I need to do some light image processing before they are displayed (adjusting brightness and contrast).
My application for this is a Web.API app using jQuery.
My first pass was to just pass the URLs to the browser and then size the images and make those adjustments using Pixastic. It works, but it's pretty slow.
I was thinking it would be much faster if I could resize the images before doing the adjustments, but I don't want to create copies and then link to those. Maybe there is a way to generate the images on the fly and serve them up, via REST perhaps? Bottom line: I need a lossy image resize, not just setting width and height with CSS.
Are there tools out there that I am missing, or has anyone done something similar? I think what I'm looking for is exactly like Google Image Search results, but I don't know how they generate those thumbnails, and if I were to adjust brightness/contrast, should I do it on the thumbnail (saving time) or on the full-size image?
I'm creating something I hope is easy using Easel.js, but I'm still a novice. I simply want to create a 10x10 image grid where each image cross-dissolves to another, paired image at random intervals.
Can anyone point me to any similar examples, or will I need to hand-code this using plain JavaScript instead?
Eventually I will load these images from a server, but first I need a working demo without any server Ajax calls, just loading images locally.
If by cross dissolve you mean fading into each other, then you might want to take a look at createjs.AlphaMaskFilter or createjs.AlphaMapFilter. To load an image, take a look here: The Image doesn't show up on the Canvas
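For a plain cross-dissolve you may not even need a filter: stacking two Bitmaps and tweening the top one's alpha with TweenJS does the job. A minimal sketch for one grid cell, assuming a <canvas id="demo"> and two image files a.png and b.png:

    // Requires EaselJS and TweenJS. Two stacked bitmaps; the top one fades.
    var stage = new createjs.Stage("demo");
    var bottom = new createjs.Bitmap("a.png");
    var top = new createjs.Bitmap("b.png");
    stage.addChild(bottom, top);
    createjs.Ticker.addEventListener("tick", stage); // redraw every tick

    function dissolve() {
        createjs.Tween.get(top)
            .wait(500 + Math.random() * 3000) // random interval
            .to({ alpha: 0 }, 1000)           // dissolve to the bottom image
            .wait(500 + Math.random() * 3000)
            .to({ alpha: 1 }, 1000)           // and back again
            .call(dissolve);                  // loop forever
    }
    dissolve();

Repeat this per cell for the 10x10 grid, each with its own random waits.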
There really is no difference between an image on a server and a local image; it might even be easier from a server.
I need to dynamically load and display a huge number of images on screen, something like 1000-3000 of them. These pictures are usually of different sizes, and I'm getting their URLs from the user, so some can be 1024x800 pixels and others 10x40.
I wrote a good JS script that shows them nicely on one page (Google Image Search style), but they are still very heavy on RAM (a hundred 500 KB images on one page is not good), so I thought about actually resizing the images: turning a 1000x800-pixel image into something like 100x80, and then freeing the RAM used by the original.
Can this be done using JavaScript (without server-side processing)?
I would suggest a different approach: Use pagination.
Display, say, 15 images; then the user clicks "next page" and the next page is shown.
Or, even better, you can script it so that when the user reaches the end of the page, the next page loads automatically.
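A bare-bones sketch of that scroll trigger; loadNextPage here is a stand-in for your own fetch-and-append logic:

    var loading = false;

    window.addEventListener('scroll', function () {
        // Are we within 200px of the bottom of the document?
        var nearBottom = window.innerHeight + window.scrollY
                         >= document.body.offsetHeight - 200;

        if (nearBottom && !loading) {
            loading = true;                // guard against duplicate triggers
            loadNextPage(function () {     // hypothetical fetch-and-append
                loading = false;           // re-arm once the batch is in
            });
        }
    });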
If that is not what you want to do, and you'd rather build a collage of images, you could look at CSS3 transforms; I think they should be fast.
What you want to do is take some pressure off the client so that it can handle all the images. Having it resize all the images itself (JavaScript runs client-side) does exactly the opposite, because resizing an image is usually far more expensive than just displaying it, and the full-size file still has to be downloaded first.
There is usually a better solution than displaying that many items at once. One is dynamic loading, e.g. when the user scrolls down the page (like the new Facebook profiles do); another is pagination. I can't imagine that all 1000-3000 images need to be visible at once.
There is no straightforward native JS way of doing this. You may be able to hack something together with Flash, but you really should resize the images on the server because:
You will save bandwidth by not transferring those large 500 KB images to the client.
The client will be able to cache those images.
You'll get a faster loading page.
You'll be able to fit a lot more thumbnail images in memory and therefore will require less pagination.
I'm (pretty) sure it can be done in browsers that support canvas. If this is a path you would like to take, you should start here.
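A minimal sketch of the idea: draw the full image onto a small canvas, export a compressed copy, and drop the reference to the original. Note that toDataURL only works for same-origin (or CORS-enabled) images.

    // Downscale an image in the browser via a throwaway canvas.
    function makeThumbnail(url, maxWidth, callback) {
        var img = new Image();
        img.onload = function () {
            var scale = maxWidth / img.width;
            var canvas = document.createElement('canvas');
            canvas.width = maxWidth;
            canvas.height = Math.round(img.height * scale);
            canvas.getContext('2d').drawImage(img, 0, 0, canvas.width, canvas.height);

            // Hand back a small JPEG data URL; the full-size Image can be
            // garbage-collected once nothing references it.
            callback(canvas.toDataURL('image/jpeg', 0.7));
            img = null;
        };
        img.src = url;
    }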
I see two possible problems with the canvas approach:
It will probably take a really long time (relatively speaking) to resize many images. Because of this, you'll probably have to look into using web workers.
Will the browser actually free up any memory if you remove the image from the DOM and/or delete/null all references to it? I don't know.