I am creating a website which contains posts, and every post has an image. The feed page of the website is slow because every image takes a long time to download, and while it is downloading there is just a blank space, which does not look nice. Images are uploaded by users and come in different formats. Is it possible to somehow make images first appear blurry and then become less and less blurry as they load (I've read this is called interlacing)?
I have tried searching the internet but didn't find anything on the topic except the use of the .jpeg format, which does not seem to work: even the .jpegs I have as post images are not loading interlaced, and I cannot exactly force people to use only that format.
Images take time to load because they are relatively large files and the data in them has to be transferred over the network.
Interlacing is a feature of the JPEG format (where it is usually called progressive encoding) that gives visible progress as the image loads, by reordering the image data so it doesn't run sequentially. (Wikipedia has diagrams and animations which might be helpful in understanding it.)
If you want interlacing, it has to be encoded into the image file itself; you can't really switch it on from outside the image.
While it is impractical to ask users to upload images encoded that way, you can write server-side code to convert images to meet your standards. When converting them you could also perform other optimisations (such as reducing the dimensions or increasing the compression level) that would speed up the load time. Of course, this will have an impact on image quality, which might not be desirable. (Consider Flickr, for example, which serves a degraded image (with a smaller file size) by default but allows switching to higher-quality or original images on demand.)
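For example, here is a minimal sketch of such a conversion step using the Node.js sharp library, assuming a Node back end (the paths, size cap, and quality setting are illustrative, not prescribed):

```javascript
// Sketch: normalise an uploaded image into a progressive (interlaced) JPEG.
// Assumes a Node.js back end with the "sharp" package installed;
// paths and settings are placeholders.
const sharp = require('sharp');

async function normaliseUpload(inputPath, outputPath) {
  await sharp(inputPath)
    .resize({ width: 1200, withoutEnlargement: true }) // cap the dimensions
    .jpeg({ quality: 75, progressive: true })          // progressive = loads in passes
    .toFile(outputPath);
}

normaliseUpload('uploads/original.png', 'public/img/post-123.jpg')
  .catch(console.error);
```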
You could also consider displaying placeholder images while the full version loads. These could also be generated server-side from the uploaded images, with extremely high levels of compression and resizing, so that you display just a small number of large pixels giving a vague impression of the average colour over large chunks of the image until the real image is available to the browser.
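A sketch of that placeholder idea, again assuming sharp on the server (the 16-pixel width and the quality figure are arbitrary choices):

```javascript
// Sketch: build a tiny inline placeholder from the uploaded image.
const sharp = require('sharp');

async function makePlaceholder(inputPath) {
  const buffer = await sharp(inputPath)
    .resize(16)               // 16px wide; height keeps the aspect ratio
    .jpeg({ quality: 40 })
    .toBuffer();
  // Store this string with the post and use it as the <img> src
  // until the full image has loaded; the browser scales it up,
  // and a CSS blur filter can smooth the big pixels.
  return `data:image/jpeg;base64,${buffer.toString('base64')}`;
}
```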
Related
I have a website which contains some images in index.php
The problem I am facing is that the whole page does not load at once; I think the images are taking some time to load.
So what I have done is show a loading image at first and then display the page after some time, which resolves the problem. But I am curious to know: is there any better way to do this?
I prefer to optimise the hell out of my images.
PNG images
You can use pngcrush to optimise your PNG files for you, but personally I find that once I'm done optimising an image myself, pngcrush only succeeds in making it bigger.
Use Indexed-PNG wherever possible. This will limit you to 256 colours, and most graphics editors won't allow partial transparency in Indexed-PNG (it is possible - you just need the right editor; I use a custom PHP script with the GD image library), but you can expect the file size to drop to a tiny fraction of what it was.
Reduce the amount of colours overall. PNG compression works best with blocks of the same colour, so reducing the number of colours improves compression. (A scripted sketch of both of these points follows.)
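If you would rather automate this than use a graphics editor, here is a sketch using sharp's palette support in Node.js (tools like pngquant or GD can do the same job; the 64-colour figure is just an illustration):

```javascript
// Sketch: convert a truecolour PNG to an indexed (palette) PNG
// with a reduced colour count. "input.png"/"output.png" are placeholders.
const sharp = require('sharp');

sharp('input.png')
  .png({ palette: true, colors: 64 }) // indexed PNG, 64-colour palette
  .toFile('output.png')
  .then(info => console.log('indexed PNG written:', info.size, 'bytes'))
  .catch(console.error);
```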
GIF images
Especially for animations, there's a lot of things you can do.
Reduce the number of frames. Avoid duplicate frames at all costs, and just set the previous frame to have a longer display time.
Use combine rather than replace if possible. You will again have problems with transparent areas, but by using combine you can have each subsequent frame only change the stuff that... changes. This avoids the redundancy of re-writing the entire image if only a small part changes. GIMP has a useful filter "Animation > Optimize for GIF" which will do this for you.
Reduce colours as much as possible. GIF is limited to 256 colours, but if you can limit yourself to 32 or so, you'll get a much smaller file.
Using the above techniques, I once managed to shove 8MB of raw image data into a 125kb animated GIF.
JPG images
JPG is great for photos, but cameras have a tendency to write MASSIVE files.
Play around with the compression factor. Start at around 40%, and slowly bring it up until it looks acceptable. GIMP will show you a preview and the resulting filesize, so make use of that to find an acceptable compromise.
Scale the image down. You don't need 9 megapixels, or whatever massive resolution cameras capture these days...
The above should help you reduce the amount of size taken by your images. Obviously, you should also cache images appropriately, so they only need to be retrieved once. Also make sure that you specify width and height on image elements so that the browser can reserve the space for them and avoid jumping around as they load...
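For the width and height point, a minimal markup sketch (the dimensions and filename here are placeholders):

```html
<!-- Reserving space: the browser lays out a 640x480 box immediately,
     so the text doesn't jump around when the image finally arrives. -->
<img src="/img/photo.jpg" width="640" height="480" alt="A photo">
```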
And you should be pretty good.
It's hard to say what other options are available without knowing what the page looks like, but one option is to reserve space for the images so that the page text renders quickly in the correct position, and the images then load later.
Is the ability of a browser (Chrome, for example) to handle a number of images limited only by the hardware of the computer on which it is displayed, or also by the software itself?
I'm trying to develop an image viewer that must contain a lot of files that need to be accessible instantly depending on the demand of the user, and sometimes when I go over 350 files of 300 KB the page freezes.
Thank you all for your help!!
It is probably limited by both.
There are limits to how large a struct is allowed to be for instance, because it has to fit in a fixed amount of bytes.
(see this question for instance)
(also, you're not yet running into the max size of an int so that is probably not what is happening right now.)
Besides this there are several more constraints.
That said, loading a gigantic amount of images every time you open a page is probably not a good idea.
Take some inspiration from how others (like Google in their image search) have solved this problem: simply don't load the images until they are needed.
I think the best approach would be to have small thumbnails of each image and load the bigger one on user request (click), like Timothy said.
If you need it to be faster you can preload images: for example, if you have a list of images and the user scrolls through it, you just load the next n images. To free space you can "delete" the ones the user has already passed.
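A minimal sketch of that idea using the IntersectionObserver API available in modern browsers (the data-src convention and the 200px margin are assumptions for illustration):

```javascript
// Sketch: load each image only when it scrolls near the viewport.
// Markup assumption: <img data-src="big.jpg" src="tiny-placeholder.jpg">
const observer = new IntersectionObserver((entries, obs) => {
  for (const entry of entries) {
    if (entry.isIntersecting) {
      const img = entry.target;
      img.src = img.dataset.src; // start the real download
      obs.unobserve(img);        // each image only needs this once
    }
  }
}, { rootMargin: '200px' });     // begin loading a little before it's visible

document.querySelectorAll('img[data-src]').forEach(img => observer.observe(img));
```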
I know converting any image format to SVG is not an easy task, and it is not something I am pursuing for complex images. Let me explain why I am asking this question. As a web designer, there are elements in a site that I will commonly use an image for. These include simple geometric shapes, a fleur-de-lis, some simple logos, etc. The cost of these HTTP requests is relatively small (we are talking KBs here).
However, I am interested in whether these requests could be even smaller by using SVG or canvas elements instead of an image format for simple images. Has there been any research or testing comparing SVG with raster images? Is it possible that I can make HTTP requests even smaller by using canvas for simple elements on my page? If so, it would be great news. I could even start creating libraries of canvas images to reuse and share with others.
For simple shapes, gzipped SVG will be pretty small, and in modern browsers it's quite usable.
However, for page-loading performance the number of requests is a very big factor, so you'll get a significant performance boost if you use CSS sprite sheets regardless of the image format.
If by canvas you mean storing shapes as JavaScript drawing commands (or a library that issues them based on some JSON) then it's unlikely to be big bandwidth saving compared to gzipped SVG (SVG has quite efficient format for defining paths, and gzip removes overhead of XML quite well).
You will have to wait for JS to load and either insert several canvases all over the document or burn CPU on compressing generated images and building data: URLs, so I don't think it'd be faster overall than using SVG or optimized PNG.
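To make the size argument concrete, here is an illustrative, complete SVG file for a simple shape; it is well under 200 bytes before gzip even touches it:

```xml
<!-- A complete, valid SVG file: a 500x500 disc.
     The repetitive XML compresses very well under gzip. -->
<svg xmlns="http://www.w3.org/2000/svg" width="500" height="500">
  <circle cx="250" cy="250" r="200" fill="#369"/>
</svg>
```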
I think this will depend on the case.
Sprites can help... but what if your sprite contains only some simple lines yet must be 500px x 500px? Then a canvas (or even SVG) version will indeed be a lot smaller. On top of that, external JavaScript files are usually cached (even better than images). In that regard, they do indeed help slim down the weight.
However, if you expect IE to also 'work', then you might need to provide some HTML5/canvas shims, and yes, that would increase the weight.
Edit:
Look at this jsfiddle example (which I whipped up for this question), where a full-size 'granite' background is created with a radial gradient.
As you can see (in the source), this is a LOT fewer bytes than a separate image (and will always be pixel-perfect).
On the other hand, it also (depending on the complexity) takes some calculation time that regular images wouldn't take (after the necessary bytes have been downloaded to the client).
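In case the fiddle link doesn't survive, here is a rough sketch of the same idea (the colours, sizes, and the canvas id are made up; the real fiddle may differ):

```javascript
// Sketch: paint a full-size radial-gradient background on a canvas
// instead of downloading a large background image.
const canvas = document.getElementById('bg'); // assumes <canvas id="bg">
canvas.width = window.innerWidth;
canvas.height = window.innerHeight;

const ctx = canvas.getContext('2d');
const gradient = ctx.createRadialGradient(
  canvas.width / 2, canvas.height / 2, 50,  // inner circle (centre, radius)
  canvas.width / 2, canvas.height / 2,      // outer circle centre...
  Math.max(canvas.width, canvas.height)     // ...and radius
);
gradient.addColorStop(0, '#8a8a8a');
gradient.addColorStop(1, '#3c3c3c');
ctx.fillStyle = gradient;
ctx.fillRect(0, 0, canvas.width, canvas.height);
```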
Edit2:
I found a link where they did a little test:

> In our sprite example, the raw SVG file was 2445 bytes. The PNG version was only 1064 bytes, and the double-sized PNG for double-pixel-ratio devices was 1932 bytes. On first appearance, the vector file loses on all accounts, but for larger images, the raster version more quickly escalates in size.
>
> SVG files are also human-readable due to being in XML format. They generally comprise a very limited range of characters, which means they can be heavily Gzip-compressed when sent over HTTP. This means that the actual download size is many times smaller than the raw file — easily beyond 30%, probably a lot more. Raster image formats such as PNG and JPG are already compressed to their fullest extent.
However, note that all this still depends on how complex the image is. As an extreme counter-example, try describing a full-colour photo of a forest (all those leaves...) in SVG versus simply using a JPG...
The size always depends on the specific images; sorry, but you have to test it on your own. There is no way to compare them in general.
I was wondering if there are any other ways to compress my images, or any script that would load the page faster or load the images behind the scenes?
The site is very interactive and uses very high-quality layers of images for the main layout. I have already used Save for Web & Devices in Photoshop and re-compressed using ImageOptim; some are JPEGs, but the majority are PNG-24 to maintain transparency, and they are all set in CSS.
I have used JPEGs and CSS sprites where I can, but there is one image in particular, a tree illustration stretching the full length of the site, that is really slowing down the loading time. Is there any way I could compress these images further, or code them differently, that I missed?
Any help would be great thanks!
You said you are spriting. That is good.
You can also use tools such as PNGcrush which attempt to make files smaller by dropping things such as meta data.
You should also send far-future expiry headers and use a cache breaker on your images, to ensure the images won't be downloaded again unnecessarily.
In Photoshop, choose File -> Save for Web; you will be able to find the best compromise between size and quality.
Do you really need the transparency there? PNG transparency is unsupported in some browsers and makes page processing intensive and slow even on high-end computers, depending on image size and the quantity of layers. If you can show something of your site, maybe someone can give more hints about how to optimise it.
You can compress them on the fly with Apache if that's your web server. One of many available articles on the subject: http://www.samaxes.com/2009/01/more-on-compressing-and-caching-your-site-with-htaccess/
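A sketch of the kind of .htaccess rules that article describes, assuming Apache with mod_deflate and mod_expires enabled (note that JPG/PNG files are already internally compressed, so gzip mainly helps the text assets):

```apache
# Sketch for Apache with mod_deflate and mod_expires enabled.
# Gzip the text assets; JPG/PNG gain little from another compression pass.
AddOutputFilterByType DEFLATE text/html text/css application/javascript

# Far-future expiry for images; pair this with a cache-busting
# filename or query string when an image actually changes.
ExpiresActive On
ExpiresByType image/png  "access plus 1 year"
ExpiresByType image/jpeg "access plus 1 year"
```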
I need to dynamically load and put on screen huge number of images — it can be something like 1000–3000. Usually these pictures are of different size, and I'm getting their URLs from user. So, some of these pictures can be 1024x800 or 10x40 pixels.
I wrote a good JS script showing them nicely on one page (à la Google Image Search), but they are still very heavy on RAM (a hundred 500K images on one page is not good), so I thought about the option of really resizing the images: making an image that's 1000x800 pixels into something like 100x80, and then forgetting (freeing the RAM of) the original one.
Can this be done using JavaScript (without server side processing)?
I would suggest a different approach: Use pagination.
Display, say, 15 images. Then the user clicks on 'next page' and the next page is shown.
Or, even better, you can script it so that when the user reaches the end of the page, the next page is loaded automatically.
If that is not what you want to do, and you'd rather make a collage of the images, then maybe you can check out CSS3 transforms. I think they should be fast.
What you want to do is take some pressure off the client so that it can handle all the images. Letting it resize all the images (JavaScript is client-side) will do exactly the opposite, because actually resizing an image is usually far more expensive than just displaying it (and, without canvas support, not really possible in browser JS anyway).
Usually there is always a better solution than displaying that many items at once. One would be dynamic loading, e.g. when the user scrolls down the page (like the new Facebook profiles do), or pagination. I can't imagine that all 1k-3k images need to be visible at once.
There is no native JS way of doing this. You may be able to hack something together using Flash, but you really should resize the images on the server, because:
You will save on bandwidth transferring those large 500K images to the client.
The client will be able to cache those images.
You'll get a faster loading page.
You'll be able to fit a lot more thumbnail images in memory and therefore will require less pagination.
more...
I'm (pretty) sure it can be done in browsers that support canvas. If this is a path you would like to take you should start here.
I see two possible problems with the canvas approach:
It will probably take a really long time (relatively speaking) to resize many images. Because of this, you're probably going to have to look into utilizing web workers.
Will the browser actually free up any memory if you remove the image from the DOM and/or delete/null all references to those images? I don't know.
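For reference, a minimal sketch of the canvas resize itself, with the above caveats in mind (quality and memory behaviour vary by browser):

```javascript
// Sketch: downscale an already-loaded <img> into a small JPEG data URL,
// so the references to the full-size bitmap can be dropped.
function shrinkImage(img, targetWidth) {
  const scale = targetWidth / img.naturalWidth;
  const canvas = document.createElement('canvas');
  canvas.width = targetWidth;
  canvas.height = Math.round(img.naturalHeight * scale);

  canvas.getContext('2d').drawImage(img, 0, 0, canvas.width, canvas.height);
  return canvas.toDataURL('image/jpeg', 0.7); // compact thumbnail
}

// Usage: img.src = shrinkImage(img, 100);
// then drop all other references to the original and hope the
// browser's garbage collector frees the memory (see problem 2 above).
```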