jquery - replace broken image icon while image is loading - javascript

I have a page with a long list of <img> tags, so it takes a long time to load all the images. Before an image has loaded, I see the broken-image icon. I want to replace that broken-image icon while the images are loading. I tested this answer, but it only works when an error happens. Is there any way to do this with JavaScript or jQuery?

I found a good solution on GitHub. Just use the CSS code below:
img[src=""],
img:not([src]) {
visibility: hidden;
}
Link: https://github.com/wp-media/rocket-lazy-load/issues/60

You can load a placeholder image, but then you must load that image (when you're already loading another image). If you load something like a spinner via a GET request, that should be ok since you can set cache headers from the server so the browser does not actually make any additional requests for that loading image. A way that Pinterest gets around this is by loading a solid color and the title of each of their posts in the post boxes while the images are loading, but now we're getting into a design discussion. There are multiple ways to skin a cat.
Regarding loading several images, there are a few considerations to understand:
The time it takes to fetch and download an image.
The time it takes to decode this image.
The maximum number of concurrent sockets you may have open on a page.
If you don't have a ton of images that need to be loaded up front, consideration 3 is typically not a problem, since you can progressively load the images below the fold. But if you have hundreds of images on the page that need to be loaded quickly for a good user experience, you may need a better solution. Why? Because you're incurring hundreds of additional round trips to your server just to load images, each of which is only a small slice of the total payload (the payload being hundreds of images). On top of that, you're throttled by the browser's limit on the number of concurrent requests while fetching them.
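One common way to keep those round trips off the critical path (not something the paragraph above prescribes, just a widely used pattern) is to lazy-load images as they approach the viewport. A minimal sketch using IntersectionObserver and a data-src attribute:
// Assumes each deferred image is written as <img data-src="real.jpg"> (no src yet).
var lazyObserver = new IntersectionObserver(function (entries) {
  entries.forEach(function (entry) {
    if (entry.isIntersecting) {
      var img = entry.target;
      img.src = img.dataset.src;   // start the real download
      lazyObserver.unobserve(img); // only load each image once
    }
  });
}, { rootMargin: '200px' });       // begin loading a little before it scrolls into view

document.querySelectorAll('img[data-src]').forEach(function (img) {
  lazyObserver.observe(img);
});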
If you have many small images, you may want to go with an approach similar to what Dropbox describes here. The basic gist is that you make one giant request for multiple thumbnails and then get a chunked encoding response back. That means that each packet on the response will contain the payload of each thumbnail. Usually this means that you're getting back the base64-encoded version of the payload, which means that, although you are reducing the number of round trips to your server to potentially just one, you will have a greater amount of data to transfer to the client (browser) since the string representation of the payload will be larger than the binary representation. Another issue is that you can no longer safely cache this request on the browser without using something like IndexedDB. You also incur a decode cost when you set the background image of each img tag to a base64 string since the browser now must convert the string to binary and then have the img tag decode that as whatever file format it is (instead of skipping the base64->binary step altogether when you request an image and get a binary response back).
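To make the batching idea concrete, here is a minimal sketch; the /api/thumbnails endpoint, its query format and its JSON shape are all assumptions (the Dropbox article streams a chunked response, which a plain JSON body only approximates):
// Sketch only: endpoint, query format and response shape are made up.
var ids = ['101', '102', '103'];

fetch('/api/thumbnails?ids=' + ids.join(','))
  .then(function (res) { return res.json(); })
  .then(function (thumbs) {
    // Assumed shape: { "101": "<base64 jpeg>", "102": "...", ... }
    Object.keys(thumbs).forEach(function (id) {
      var img = document.querySelector('img[data-thumb-id="' + id + '"]');
      if (img) {
        img.src = 'data:image/jpeg;base64,' + thumbs[id];
      }
    });
  });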

You can use a placeholder image, which is very lightweight, in place of each image.
At the same time, while the page is loading, you can load all of the original images in a hidden div.
Then, on document ready, you can replace all of the placeholders with jQuery.
e.g.
HTML
<img src="tiny_placeholder_image" alt="" data-src="original_image_src"/>
<!-- up to N images -->
<!-- images are loading in background -->
<div style="display:none">
<img src="original_image_src" alt=""/>
<!-- up to N images -->
</div>
JavaScript
jQuery(function ($) {
  // On document ready, replace each placeholder's src with its data-src.
  $('img[data-src]').each(function () {
    $(this).attr('src', $(this).data('src'));
  });
});

Related

Get an image with HTML fallback with caching?

I have a user image control, an identicon: if the user has an image set, it displays it; if not, it displays their initials on a coloured background (and some other HTML).
There are large numbers of these images, with lots shown at once, and the same user might appear multiple times on a page. Images can contain alpha transparency and are displayed against multiple different background colours. Images can be high-DPI, and the fallback (and most of the other images they display next to) uses SVG.
Images can change, but not often. Typically they can be cached for a few days or weeks.
I think there are two approaches to this:
JavaScript
In this model the server returns JSON that I cache in IndexedDB, and the JSON contains a data-URI for the image if set. Additional JS is required to check whether a request is already being made for the same user and avoid repeating it.
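The request-deduplication part could be a small promise cache keyed by username; a rough sketch (the .json endpoint name is an assumption, the path mirrors the photo URL used below):
// Keeps one in-flight (or already resolved) request per user.
var photoRequests = {};

function getUserPhoto(username) {
  if (!photoRequests[username]) {
    photoRequests[username] = fetch('api/users/' + username + '/photo.json')
      .then(function (res) { return res.ok ? res.json() : null; });
  }
  return photoRequests[username];
}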
HTML <object> fallback
For this the server returns the actual image bytes, or a 404 if not found. Then if the image isn't found the content of the <object> tag renders:
#identicon,
#identicon > div { display: inline-block; width: 50px; height: 50px; line-height: 50px; text-align: center; background-color: red; overflow: hidden; }
<object id="identicon" data="api/users/joebloggs/photo" type="image/png">
<div>JB</div>
<!-- some more HTML and SVG content, snipped here -->
</object>
<object id="identicon" data="http://placehold.it/50" type="image/png">
<div>PH</div>
<!-- some more HTML and SVG content, snipped here -->
</object>
There are significant disadvantages to the JS option: even if that JS is heavily optimised, IndexedDB means callbacks (and the synchronous options, like localStorage, tend to be slow). It involves quite a lot of JS for what should be a fairly simple task. It has one big advantage, though: it only asks for each user's photo once.
I like the simplicity of the <object> fallback; it has some minor drawbacks (lots of weird rules apply to how the HTML inside an object gets rendered), but one major problem: if the image isn't found, it round-trips to the server again every time. A lot of users don't have images, so every page load gets a long stack of 404s for user images it should already know aren't there.
The options I can see:
1. Stick with the JS approach as the best way. Is there any way to make it less JS/overhead?
2. Leave the <object> with the 404s.
3. Add an event to the <object> that fires some JS when it fails to load the image, but I'm not sure how to stop it trying again without reverting to (1).
4. Change the server response to return a user image even when it fails. This loses the elegance of the fallback and means lots of server-generated images. This is what SO/gravatar does, but it means some issues for us with high-DPI screens.
5. Change the server response to return a transparent image when it fails. This means the same image being cached lots of times, and means layering content even for users that do have a photo. It also causes issues for images with alpha transparency.
6. Adapt the service worker to handle requests for user images as a special case, caching 404 responses and skipping the network round trip if they are attempted again (a rough sketch of this follows below).
7. Abuse some other HTTP status response that <object> still sees as a failure to find the image (so it falls back to the content) but that the browser caches.
8. Some other way?
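A rough sketch of what option 6 could look like, assuming the photo URLs match the api/users/{name}/photo pattern above (cache naming and expiry are left out):
// sw.js sketch: serve cached user-photo responses, including cached 404s.
self.addEventListener('fetch', function (event) {
  var url = new URL(event.request.url);
  if (!/^\/api\/users\/[^/]+\/photo$/.test(url.pathname)) return;

  event.respondWith(
    caches.open('user-photos').then(function (cache) {
      return cache.match(event.request).then(function (cached) {
        if (cached) return cached; // hit: 200 or 404, no network round trip
        return fetch(event.request).then(function (response) {
          cache.put(event.request, response.clone()); // put() stores 404s too
          return response;
        });
      });
    })
  );
});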
All of these involve compromises, but I'm also fairly sure that I'm re-inventing the wheel here - chances are someone (in all the identicon implementations that are already out there) has already solved this. Can anyone point me in the right direction or suggest the best way to solve the problem?
You can use a cache-control header with a 404 and the browser will honour it, and this check is much quicker than a call to localStorage or IndexedDB.
The solution is just to add the caching header to the 404 response. In .NET Core this is:
context.Response.StatusCode = (int)HttpStatusCode.NotFound;
context.Response.Headers.Add("Cache-Control", $"public,max-age={duration}");
Then the browser won't go back to the server until the duration has expired. Even though no image was found, <object> still falls back to its content and <img> still fires an error event.

Set background-image for button to element already contained within page

Basically I want to specify the background-image for a button using something other than an image url. Being able to set the background-image to an already loaded element contained within the DOM would be ideal. This is so that I can cache a loading gif (displayed on the button) within the DOM and don't have to fetch it when the button is clicked.
I didn't think code was necessary to illustrate the problem but here is some anyway
Not ideal due to image loading on click:
that.submitButtonSelector.css('background-image', 'url(/Content/_activity/ajax-loader.gif)');
Ideal (but no obvious way to achieve)
that.submitButtonSelector.css('background-image', '#precachedImage');
If you load the loading gif via the URL, it will be cached in most cases. You only need to download it once; after that it will be served from cache. The only other thing I can think of is to use a base64 source. This has the benefit of not generating an HTTP request, but it is larger in actual bytes (I don't think the larger size is slower than another HTTP request, but you can always benchmark them).
In my experience I believe base64 images are great if you need to immediately show the loading icon and if the icon is small enough, but if you don't need to show it right away, I suggest preloading the image via url with javascript and just relying on the cached version.
So in your case, if you went with base64, you could use
that.submitButtonSelector.css('background-image', 'url(data:image/gif;base64,R0lGODlhEAAQAPIAAP///wAAAMLCwkJCQgAAAGJiYoKCgpKSkiH+GkNyZWF0ZWQgd2l0aCBhamF4bG9hZC5pbmZvACH5BAAKAAAAIf8LTkVUU0NBUEUyLjADAQAAACwAAAAAEAAQAAADMwi63P4wyklrE2MIOggZnAdOmGYJRbExwroUmcG2LmDEwnHQLVsYOd2mBzkYDAdKa+dIAAAh+QQACgABACwAAAAAEAAQAAADNAi63P5OjCEgG4QMu7DmikRxQlFUYDEZIGBMRVsaqHwctXXf7WEYB4Ag1xjihkMZsiUkKhIAIfkEAAoAAgAsAAAAABAAEAAAAzYIujIjK8pByJDMlFYvBoVjHA70GU7xSUJhmKtwHPAKzLO9HMaoKwJZ7Rf8AYPDDzKpZBqfvwQAIfkEAAoAAwAsAAAAABAAEAAAAzMIumIlK8oyhpHsnFZfhYumCYUhDAQxRIdhHBGqRoKw0R8DYlJd8z0fMDgsGo/IpHI5TAAAIfkEAAoABAAsAAAAABAAEAAAAzIIunInK0rnZBTwGPNMgQwmdsNgXGJUlIWEuR5oWUIpz8pAEAMe6TwfwyYsGo/IpFKSAAAh+QQACgAFACwAAAAAEAAQAAADMwi6IMKQORfjdOe82p4wGccc4CEuQradylesojEMBgsUc2G7sDX3lQGBMLAJibufbSlKAAAh+QQACgAGACwAAAAAEAAQAAADMgi63P7wCRHZnFVdmgHu2nFwlWCI3WGc3TSWhUFGxTAUkGCbtgENBMJAEJsxgMLWzpEAACH5BAAKAAcALAAAAAAQABAAAAMyCLrc/jDKSatlQtScKdceCAjDII7HcQ4EMTCpyrCuUBjCYRgHVtqlAiB1YhiCnlsRkAAAOwAAAAAAAAAAAA==)');
Or, reference the same data URI from CSS:
body:after {
content: url(data:image/gif;base64,R0lGODlhEAAQAPIAAP///wAAAMLCwkJCQgAAAGJiYoKCgpKSkiH+GkNyZWF0ZWQgd2l0aCBhamF4bG9hZC5pbmZvACH5BAAKAAAAIf8LTkVUU0NBUEUyLjADAQAAACwAAAAAEAAQAAADMwi63P4wyklrE2MIOggZnAdOmGYJRbExwroUmcG2LmDEwnHQLVsYOd2mBzkYDAdKa+dIAAAh+QQACgABACwAAAAAEAAQAAADNAi63P5OjCEgG4QMu7DmikRxQlFUYDEZIGBMRVsaqHwctXXf7WEYB4Ag1xjihkMZsiUkKhIAIfkEAAoAAgAsAAAAABAAEAAAAzYIujIjK8pByJDMlFYvBoVjHA70GU7xSUJhmKtwHPAKzLO9HMaoKwJZ7Rf8AYPDDzKpZBqfvwQAIfkEAAoAAwAsAAAAABAAEAAAAzMIumIlK8oyhpHsnFZfhYumCYUhDAQxRIdhHBGqRoKw0R8DYlJd8z0fMDgsGo/IpHI5TAAAIfkEAAoABAAsAAAAABAAEAAAAzIIunInK0rnZBTwGPNMgQwmdsNgXGJUlIWEuR5oWUIpz8pAEAMe6TwfwyYsGo/IpFKSAAAh+QQACgAFACwAAAAAEAAQAAADMwi6IMKQORfjdOe82p4wGccc4CEuQradylesojEMBgsUc2G7sDX3lQGBMLAJibufbSlKAAAh+QQACgAGACwAAAAAEAAQAAADMgi63P7wCRHZnFVdmgHu2nFwlWCI3WGc3TSWhUFGxTAUkGCbtgENBMJAEJsxgMLWzpEAACH5BAAKAAcALAAAAAAQABAAAAMyCLrc/jDKSatlQtScKdceCAjDII7HcQ4EMTCpyrCuUBjCYRgHVtqlAiB1YhiCnlsRkAAAOwAAAAAAAAAAAA==)
}
You can have a hidden image that preloads the file into the cache, and then once it's loaded you can add the background.
HTML:
<img id="img" src="/Content/_activity/ajax-loader.gif" />
CSS:
#img {
display: none;
}
jQuery:
$("#img").load(function() {
$(submitButtonSelector).click(function() {
that.submitButtonSelector.css("background-image", "/Content/_activity/ajax-loader.gif");
});
});

Get HTML elements from a document on the server and show them dynamically on the client

Context
I am making an application that shows a synchronized HTML5 slideshow to about 50 spectators on a wireless LAN with no internet access.
I run a Node.js server on one of the computers and connect the 50 clients via Socket.IO (by the way, only one of them controls the presentation).
The hardware is a domestic wireless 802.11b/g router and 50 mobile devices (tablets, netbooks, smartphones).
Problem
When the slideshow starts, it takes too long (about 10 minutes or more for a 5 MB slideshow) for the clients to see it, since the router has to send the complete slideshow to all the clients at the same time.
How my slideshow looks
<html>
<head>
<title>My Slideshow</title>
<script src="javascripts/slidesplayer.js"></script>
<link rel="stylesheet" href="/stylesheets/style.css">
</head>
<body>
<div id="slides-containter">
<div class="slide" id="slide_1">
<!--Contents such as images, text, video and audio sources -->
</div>
<div class="slide" id="slide_2">
<!--Contents -->
</div>
<!--A bunch of slides here-->
</div>
<script>
// Here I load the slides
</script>
</body>
</html>
What I would like to do
At the beginning, I would like to load the slides-container element completely empty.
Then, as I advance through the slideshow, I'd like to GET from the server the div representing the next slide and append it to the DOM, so that only then does the client start to download the pictures and other assets for that slide (thus significantly reducing my network load).
Another relevant fact is that the slideshow (including slidesplayer.js) is automatically generated by an external tool that converts PowerPoint presentations to this HTML5 format, and we will use a lot of presentations that already exist in PowerPoint.
My first impression is that I should accomplish this with jQuery/Ajax, but I don't know exactly how to do it the right way, since my only idea so far is copying the div.slide elements into separate files.
Update: This answer suggests using jQuery for DOM manipulation before displaying. It seems that jQuery requests the resources every time you manipulate a DOM object, even if it is not inserted into your current DOM. So, one possible solution would be to work only with strings. You can see more about this issue in this and this question.
One option is to treat this as a front-end problem: the front-end should arguably only consume as much as it can handle at any one time.
I'm assuming it's external resources (imagery etc.), as opposed to the slideshow markup itself, that make up most of those 5 MB, in which case the DOM should not attempt to request those resources until they are necessary.
I would suggest serving the whole slide document to an ajax call but only introducing the markup to each slide as it is called. Something like this:
$.ajax('path/to/slides', {
  async: false,
  success: function ajaxCallback(slidesDOM) {
    // Pull out the individual slides from your slideshow HTML
    var $slides = $(slidesDOM).find('.slide');
    // For each of these...
    $slides.each(function prepareSlide() {
      // Store a reference to the slide's contents
      var $slideContent = $($(this).html());
      // Empty the contents and keep only the slide element itself
      var $slideWrapper = $(this).empty();
      $slideWrapper
        // Put the slide where you want it
        .appendTo('#slides-container')
        // And attach some kind of event to it
        // (depending on how your slideware works, you might want to bind this elsewhere)
        .on('focus', function injectContent() {
          // Put the content in; NOW external resources will load
          $slideWrapper.append($slideContent);
          // Unbind this function trigger
          $slideWrapper.off('focus', injectContent);
        });
    });
  }
});
1) You shouldn't be streaming payloads with Socket.IO. Socket.IO is made for low-payload messaging. If you need to transmit en masse, I'd recommend using a standard HTTP AJAX request. Then you can use Socket.IO to control which slide you are on.
2) Try AngularJS. They've basically done all the thinking for you regarding view switching (which is essentially what you are doing). They have a great tutorial, which helps a lot.
3) To simplify your Socket.IO calls, I'd recommend using ConversationJS on both the client and server side.
As I said in the question, manipulating DOM elements will cause the browser to download the resources, even if you don't insert the elements that use those resources into your DOM.
In my case, the best solution I could come up with was to use some sort of lazy loading, at least for the img tags (but it could easily be extended to other tags, such as audio and video).
What I did was rename the src attribute to something else (xsrc in this case) and add a placeholder src attribute to all img tags.
<img id="someImg" src="#" xsrc="foo.png"/>
Then, with jQuery, I changed the src attribute value to that of xsrc whenever I needed to download the image.
// When I want the image to be downloaded from the server
$('#someImg').attr( 'src' , $('#someImg').attr('xsrc') )
You can see more about the idea behind this in the questions I already mentioned (this and this).
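Since the same trick extends to audio and video, here is a small generalisation of the snippet above (the helper name and the per-slide call are made up for illustration):
// Load every deferred element inside a container by promoting xsrc to src.
function loadDeferred(containerSelector) {
  $(containerSelector).find('[xsrc]').each(function () {
    $(this).attr('src', $(this).attr('xsrc'));
  });
}

// e.g. when slide 3 becomes the current slide:
// loadDeferred('#slide_3');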

How to scrape relative images

If I look at Amazon's button for adding items to lists on their site - you can see it here:
http://www.amazon.co.uk/wishlist/get-button
How does it work? I'm pretty sure it scrapes the page somehow, but it seems to get every image, whether it's a Flash asset, a JPG or anything else, even when the site in question uses relative img src values rather than absolute full-site URLs.
Example page below: all the images shown are JPGs, which is cool, but all the img src values are relative, meaning there is no "http://blah.com" before them:
http://gadgets.guardianoffers.co.uk/p-788-Casio-Solar-Powered-Edifice-Watch.html
Is there a better way to get images other than parsing the html source?
Or are they just doing a million ifs if they don't get a hit straight away?
It looks like it parses the HTML of the page and looks for what is semantically identified as the primary image, name and price. For instance, on a page that doesn't have any e-commerce products, such as http://www.theglobeandmail.com/, it takes the page's h1 element as the product name and the primary image (the front-page story image) as the product image.
So behind the scenes they are doing a lot of guessing. Using HTML 5 semantic markup, you could establish a standard for this kind of thing, but unless everyone is using it, you are just making educated guesses.
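On the relative-src point from the question: a scraper can turn those into absolute URLs using the page's own address as the base, e.g. with the standard URL constructor (the image path below is hypothetical):
// Resolve a scraped relative src against the page it came from.
var pageUrl = 'http://gadgets.guardianoffers.co.uk/p-788-Casio-Solar-Powered-Edifice-Watch.html';
var relativeSrc = 'images/casio-edifice.jpg'; // hypothetical value scraped from the page
var absoluteSrc = new URL(relativeSrc, pageUrl).href;
// -> "http://gadgets.guardianoffers.co.uk/images/casio-edifice.jpg"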

Thumbnail Generation

Is there a "good" way to generate a thumbnail for a JPEG file that doesn't have on in the EXIF information?
If a user uploads an image, I'm trying to display the thumbnail, but if there's no thumbnail in the image the display is blank (src=""). Currently we replace blank images with a loading image that gets swapped out with the thumbnail generated on the server by an AJAX call back.
Is there anything I can do client-side (jQuery & jQuery UI are already integrated, but open to new libraries if they'll help) other than displaying a loading image that gets changed by the AJAX call back?
For clarification I don't just mean taking the Base64 raw data and changing the height and width attributes most of our users upload 5MB+ files, and the text alone crashes the browser if we use the raw data. I mean actually appending a thumbnail to the EXIF, or creating it and using the created base64 data as the image source.
Apologies in advance for breaches in etiquette or lack of clarity, this is my first question on Stack Overflow.
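For the client-side part, one possible sketch uses standard canvas APIs; an object URL avoids ever turning the full 5 MB file into a Base64 string, and only the small result becomes a data URI (function and element names here are made up for illustration):
// Draw the uploaded file onto a small canvas and use the result as the thumbnail.
function showThumbnail(file, imgElement, maxSize) {
  var objectUrl = URL.createObjectURL(file); // no Base64 of the full file
  var source = new Image();
  source.onload = function () {
    var scale = Math.min(maxSize / source.width, maxSize / source.height, 1);
    var canvas = document.createElement('canvas');
    canvas.width = Math.round(source.width * scale);
    canvas.height = Math.round(source.height * scale);
    canvas.getContext('2d').drawImage(source, 0, 0, canvas.width, canvas.height);
    imgElement.src = canvas.toDataURL('image/jpeg', 0.8); // small enough to be safe as a data URI
    URL.revokeObjectURL(objectUrl);
  };
  source.src = objectUrl;
}

// e.g. showThumbnail(fileInput.files[0], document.getElementById('thumb'), 160);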
