I would like to have my background images ready before the HTML loads.
But what is happening is that some of my HTML elements, like input elements, are rendered before the background image has loaded.
I have looked at a lot of questions that show how to preload the image, but that somehow doesn't solve my issue.
This is not what you want to do.
For the user this will result in decreased usability and a lack of responsiveness. You'll lose visitors very quickly if you make them wait for something. Take it in the opposite direction and fill in the empty spot with a color while the image loads in the background.
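For example, a minimal CSS sketch (the .hero selector and background.jpg are just stand-ins for your own names):

.hero {
    background-color: #336699;              /* painted immediately */
    background-image: url(background.jpg);  /* drawn over the color once it arrives */
}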
However, if you really want to do this, there is a way to line them all up and load at the same time. You can convert your images to Base64 encoding and paste the code directly into the HTML or CSS.
Here is a very good article on Base64 encoding images and including them inline. Essentially what you'll be doing is turning a picture into text, pasting that text directly in your HTML, JS and/or CSS and then the browser turns it back into an image.
Use your favorite image editor to get your images down to under 32KB, then run them through a Base64 converter to turn each one into a long string of characters. If they are larger than 32KB you'll notice browser performance issues as well as incompatibility with Internet Explorer 8, which caps data URIs at 32KB.
You'll take that string of characters and paste it directly where you would normally put the image URL. So if it's in HTML as a standalone image, you would say:
<img src="data:image/png;base64,iVBORw0KGgoAAAA... etc" />
For a CSS image (background-image, for example) it would follow this format:
background-image: url(data:image/png;base64,iVBORw0KGgoAAAA... etc);
Now, depending on how many images you inline in each file and how large they are, your site will be slower to download at first, but once the files are cached it won't be a noticeable issue.
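If you'd rather script the conversion than use an online tool, here's a minimal PHP sketch (logo.png is a hypothetical file name):

<?php
// Read the binary image and emit it as an inline data URI.
$data = base64_encode(file_get_contents('logo.png'));
echo '<img src="data:image/png;base64,' . $data . '" />';
?>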
You have to delay loading the HTML content. Use any JavaScript image-preloading technique, like this one:
var x = new Image();    // off-screen image object
x.src = "whatever.jpg"; // the browser fetches and caches it right away
...and then use jQuery's .load() on an initially empty div (e.g. $('#content').load(...)) to pull in any further HTML.
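A minimal sketch tying the two steps together (background.jpg, #content and content.html are hypothetical names):

var img = new Image();
img.onload = function () {
    // The background image is now cached, so it's safe to pull in the rest of the page.
    $('#content').load('content.html');
};
img.src = 'background.jpg';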
I'm using a lazyload mechanism that only loads the relevant images once they're in the user's viewport.
For this I've defined a data-src attribute that links to the original image, plus a base64-encoded placeholder image as the src attribute to keep the HTML valid.
<img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsQAAA7EAZUrDhsAAAANSURBVBhXYzh8+PB/AAffA0nNPuCLAAAAAElFTkSuQmCC" data-src="/path/to/image.png" alt="some text">
I noticed that Chrome caches the base64 string, but the string is quite long and bloats my HTML (I have a lot of images on a page).
So my question is: is it better to use a small base64-encoded image, or a real 1px x 1px placeholder image?
Note:
For SEO purposes the element must be an img. Also my HTML must be valid, so a src attribute is required.
You can use this shorter (but still valid!) image in the src attribute (a 1x1-pixel GIF):
data:image/gif;base64,R0lGODlhAQABAAD/ACwAAAAAAQABAAACADs=
Note that if you gzip your HTML (which you should), the length of the string won't be that important because repetitive strings compress well.
Depending on your needs you might want to use a color for the 1x1 pixel (a solid color results in a shorter GIF file). One way to do this is to use Photoshop or a similar tool to create the 1x1-pixel GIF in the right color, then use a tool like ImageOptim to find the best compression. There are various online tools to convert the resulting file to a data URL.
I'd use the placeholder in your situation.
Using the base64 encoded image kind of defeats the purpose of lazy loading since you're still having to send some image data to the browser. If anything this could be detrimental to performance since the image is downloaded as part of the original HTTP request, rather than via a separate request as a browser might make with an image tag and URL.
Ideally if it's just a 'loading' placeholder or something similar I'd create this in CSS and then replace it with the loaded image when the user scrolls down sufficiently as to invoke the loading of that particular image.
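A rough sketch of that swap, assuming the data-src markup from the question and a hypothetical .placeholder class for the CSS styling:

$(window).on('scroll', function () {
    $('img[data-src]').each(function () {
        // Swap in the real image once the element enters the viewport.
        if (this.getBoundingClientRect().top < window.innerHeight) {
            this.src = $(this).attr('data-src');
            $(this).removeAttr('data-src').removeClass('placeholder');
        }
    });
});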
I noticed that Chrome caches the base64 string, but the string is quite long and bloats my HTML (I have a lot of images on a page).
If that is the case, consider using a 'real' src attribute always pointing to the same placeholder. You do need an extra HTTP request, but:
it will almost certainly be pipelined and take little time.
it will trigger the image caching mechanism, which base64 does not, so the image will only be decoded once. Not a great issue given today's CPUs and GPUs, but anyway.
it will also be cached as a resource, and with the correct headers it will stay cached for a long time, giving zero load time on all subsequent page hits from the same client.
If the number of images on a page is significant, you might easily be better off with a "real" image.
I'd go as far as to venture that it will be more compatible with browsers, spiders and what not -- base64 encoding is widely supported, but plain images are even more so.
Even with the smallest image you can get, the 26 bytes of a 1x1 GIF become this in base64:
src="data:image/gif;base64,R0lGODlhAQABAAD/ACwAAAAAAQABAAACADs="
while you can go from
src="/img/p.png"
all the way to
src="p.png"
which looks quite unbloaty - if such a word even exists.
Test
I ran a very basic test
<html>
<body>
<?php
// Emit N copies of the same image: either an inline base64 data URI or a plain GIF file.
switch ($_GET['t']) {
    case 'base64':
        $src = 'data:image/gif;base64,R0lGODlhAQABAAD/ACwAAAAAAQABAAACADs=';
        break;
    case 'gif':
        $src = 'p.gif';
        break;
}
print str_repeat("<img src=\"{$src}\"/>", (int)$_GET['n']);
?>
</body>
</html>
and I got:
images  mode    DOMContentLoaded  Load     Result
200     base64  202ms             310ms    base64 is best
200     gif     348ms             437ms
1000    base64  559ms             622ms    base64 is best
1000    gif     513ms             632ms
2000    base64  986ms             1033ms   gif is best
2000    gif     811ms             947ms
So, at least on my machine, it would seem I'm giving you bad advice, since you see no advantage in page load time until you have almost two thousand images.
However:
this heavily depends on server and network setup, and even more on actual DOM layout.
I only ran one test for each set, which is bad statistics, and I used Firebug, which is bad methodology. If you want solid data, run several dozen page loads in either mode using a Web performance monitoring tool and a clone of your real page.
(what about using PNG instead of gif?)
I've experienced good results with inline SVG for responsive image placeholders as described here.
Basically, you put something like
data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 500 500'%3e%3c/svg%3e
in your <img>'s src attribute. Be careful to keep the viewBox aspect ratio in line with your real image's dimensions. This way your layout won't jump around, causing unnecessary repaints.
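For instance, for a real 800x600 image the markup might look like this (the paths are placeholders):

<img src="data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 800 600'%3e%3c/svg%3e" data-src="/path/to/real-800x600.jpg" alt="some text">

The empty SVG reserves the same 4:3 box as the final image, so swapping in the real file doesn't move the layout.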
I have a website which contains some images in index.php
The problem I am facing is that the whole page does not load at once; I think the images are taking some time to load.
So what I have done is show a loading image at first, and after some time show the page; that resolves the problem. But I am curious to know: is there a better way to do this?
I prefer to optimise the hell out of my images.
PNG images
You can use pngcrush to optimise your PNG files for you, but personally I find that once I'm done optimising an image myself, pngcrush only succeeds in making it bigger.
Use Indexed-PNG wherever possible. This will limit you to 256 colours, and most graphics editors won't allow partial transparency in Indexed-PNG (it is possible; you just need the right editor. I use a custom PHP script with the GD image library), but you can expect the file size to drop to a tiny fraction of what it was.
Reduce the amount of colours overall. PNG compression works best with blocks of the same colour, so reducing the number of colours improves compression.
GIF images
Especially for animations, there's a lot of things you can do.
Reduce the number of frames. Avoid duplicate frames at all costs, and just set the previous frame to have a longer display time.
Use combine rather than replace if possible. You will again have problems with transparent areas, but by using combine you can have each subsequent frame only change the stuff that... changes. This avoids the redundancy of re-writing the entire image if only a small part changes. GIMP has a useful filter "Animation > Optimize for GIF" which will do this for you.
Reduce colours as much as possible. GIF is limited to 256 colours, but if you can limit yourself to 32 or so, you'll get a much smaller file.
Using the above techniques, I once managed to shove 8MB of raw image data into a 125kb animated GIF.
JPG images
JPG is great for photos, but cameras have a tendency to write MASSIVE files.
Play around with the compression factor. Start at around 40%, and slowly bring it up until it looks acceptable. GIMP will show you a preview and the resulting filesize, so make use of that to find an acceptable compromise.
Scale the image down. You don't need 9 megapixels, or whatever massive resolutions cameras capture these days...
The above should help you reduce the amount of size taken by your images. Obviously, you should also cache images appropriately, so they only need to be retrieved once. Also make sure that you specify width and height on image elements so that the browser can reserve the space for them and avoid jumping around as they load...
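For example (the file name and dimensions here are illustrative):

<img src="photo.jpg" width="640" height="480" alt="A holiday photo">

The browser can reserve the 640x480 box before a single byte of the image has arrived.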
And you should be pretty good.
It's hard to say what other options are available without knowing what the page looks like, but one option is to reserve space for the images so that the page text renders quickly in the correct position, and the images then load later.
Many, many sites use this technique (Facebook and Google as well).
For example, open facebook.com
Save this page (not as *.MHT but as HTML with images); I mean the login page.
It saves:
facebook_files(dir)
facebook.html(file)
Then inside the folder you can see one GIF file which contains all the primary images for the page.
The question is: how do you read many chunks inside one file?
And what is this approach called?
Those images are called "sprites". Take a look at this article on them.
The basic idea is that whenever you want to use an image from the sprite, you have an element which just shows part of the big sprite image. So each "image" in your page is actually a div with this image as the background, just offset so the right part shows through.
The main advantage is that your page needs fewer requests and so loads faster.
There are some tools online that make using sprites easier. I haven't used any of them so can't recommend one, but using a tool would save you a lot of work.
This is what you call "spriting", like the spriting used in arcade games (one image of a character with its different positions). Basically it's one huge image containing smaller images.
The advantage of this approach is that instead of 100 different HTTP requests for 100 tiny GIFs (which causes overhead), you only need to request one huge image containing those 100 GIFs. Then instead of using <img> per image, you use a CSS background and background-position to align the container over the right part of the sprite so the right image shows through.
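A minimal sketch of the CSS side, assuming a hypothetical sprite.png laid out as a horizontal strip of 16x16 icons:

.icon {
    width: 16px;
    height: 16px;
    background-image: url(sprite.png);
}
.icon-home { background-position: 0 0; }      /* first tile */
.icon-mail { background-position: -16px 0; }  /* second tile, shifted into view */

Each element shows only its own 16x16 window onto the one big image.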
How can I change the way an image loads on a web page? I presume using JavaScript to do this. I'm looking for a way to have the picture load at a lower resolution and then get "sharper", as opposed to loading downward, if that makes sense. Facebook does this with their "theater" picture pop-up window.
This is actually due to the way that the image is encoded, namely images that are interlaced will have this effect.
http://en.wikipedia.org/wiki/Interlacing_(bitmaps)
Check to see if your image-editing utility has this feature; applications such as Photoshop definitely will, but something as simple as Paint won't.
Sounds to me a lot like progressive loading in JPG images. That's something you have to adjust while creating the image. I'm only familiar with GIMP; there you have to check a checkbox while exporting to JPG. Check out this screenshot.
Another way to achieve this is to initially point the images on the webpage to smaller images and then do some stuff with a jQuery plugin. I'm not sure right now, but I think there was one called jQuery.lazyload or something like that.
Hope it helps you!
To do this you don't need JavaScript. It is actually part of how you saved your image: you should save it with the progressive option. It adds a little weight to the image, but it will show parts of the image as it is loading.
To do this on Photoshop:
Open your image file
Choose File
Choose the Save for Web... option
In the dialog that opens, select JPEG
In the upper right corner there are a few options; check the 'Progressive' option
You are done!
Now replace your image, and all browsers will show that image progressively as they load it.
So I was browsing through this page: http://360langstrasse.sf.tv/
It basically is a JavaScript street view, but only allowing one direction. Therefore it's kind of like playing a movie.
When moving fast I noticed that the images are grainy/pixelated, the same way as when browsing through Facebook.
I was wondering how to implement this?
I tried sending small base64-encoded images in the markup and drawing them on a canvas until the 'real' image was loaded.
This worked fine, but left me wondering whether this would indeed increase performance, or whether sites like Facebook do it differently.
Thanks in advance for any help.
Regards Jens
Edit: Or do they only display the images differently? Do they have a different rendering process than usual? I don't see any small images being loaded.
Edit 2: The below-mentioned option to load small images first is described nicely here: http://www.phpied.com/picassa-progressive-image-rendering/
But basically it is pretty simple.
I suppose embedding low-resolution images in the page and then fetching better ones is a real way to accomplish this.
The other way would be linking to small images in the normal way and fetching bigger ones with JS; small images should load really fast, or you can subscribe to their load event (tricky in IE) and show the page (remove some overlay) once they are loaded.
BTW, instead of using canvas you can put base64 directly into src
<img src="data:image/png;base64,..." />
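...and then swap in the full image from JavaScript once it has loaded; a rough sketch, assuming data-src holds the hi-res URL:

var img = document.querySelector('img[data-src]');
var hiRes = new Image();
hiRes.onload = function () {
    img.src = hiRes.src; // replace the blurry inline preview
};
hiRes.src = img.getAttribute('data-src');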
The answer is progressive JPEGs!
You can create them with ImageMagick, for example. This way the browser renders the image progressively until loading is aborted or complete. Such images may be bigger than normal images, but not always.
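For example, with ImageMagick's convert (file names are placeholders; -interlace Plane is what makes the JPEG progressive):

convert input.jpg -interlace Plane output.jpg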
Furthermore, they can be viewed before they are completely downloaded.
Thanks for the help!