I have a slider and am loading low grade images to facilitate a quick load. On window load I am swapping in the properly sized images:
function loadImages() {
    var images = document.getElementById('dhadimages').getElementsByClassName('dhadsecondimg');
    var index;
    for (index = 0; index < images.length; ++index) {
        images[index].src = images[index].dataset.img;
    }
}
window.onload = function () { loadImages(); }
But PageSpeed counts this as part of the render time. For the moment I am loading when the user clicks on something, and obviously that works, but why does PageSpeed not detect window.onload and stop measuring there? Are there any techniques besides delaying, or any later events to bind to?
Add the async attribute to your script tag (you may have to move this code into a separate script file to enable this). The browser executes all of your JS before "saying" it's finished with the DOM. Adding async to this script lets the browser continue building the DOM without waiting for the image loads.
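A minimal sketch of that idea, assuming the loader above has been moved into a separate file (here hypothetically named loadImages.js): inject the script element with async set so the parser keeps building the DOM while it downloads. The injectAsyncScript helper and its parameters are illustrative, not part of the original answer; the document object is passed in only so the logic is easy to exercise.

```javascript
// Hypothetical helper: append a <script> with the async flag set.
// `doc` is the document object, injected for testability.
function injectAsyncScript(doc, src) {
  var s = doc.createElement('script');
  s.src = src;
  s.async = true;   // parser won't block on this script
  doc.head.appendChild(s);
  return s;
}

// In a page you would call:
// injectAsyncScript(document, 'loadImages.js');
```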
I am playing with html2canvas and I found that it renders my DOM incorrectly. I tried to Google but only found suggestions about putting html2canvas inside a setTimeout callback, like this:
var delay = 1000;
$('#image').css('width', 64);
// if I remove the following code, the image is rendered correctly.
setTimeout(function () {
    html2canvas($("#h2cwrap")[0]).then(function (canvas) {
        showImage64(canvas.toDataURL());
    });
}, delay);
I noticed that changing the var delay = 1000 value leads to different results. For example, if it's set to var delay = 1, the image is resized correctly but aligned wrongly. If it's set to var delay = 1000, the image is both resized and aligned correctly.
How do I make sure html2canvas renders exactly what I see without using such weird hacks?
The setTimeout is likely there to wait for the browser to load all resources, including your image and styles.
You could alternatively listen for the load event which will fire after all resources are loaded.
window.addEventListener('load', (event) => {
    // All resources in document loaded and parsed
    console.log('page is fully loaded');
    html2canvas($("#h2cwrap")[0]).then(function (canvas) {
        showImage64(canvas.toDataURL());
    });
});
I want to download an image asynchronously, so the user first sees a low resolution image while the higher resolution version downloads in the background. I have tried the following.
<html>
<head>
    <script>
        window.addEventListener('load', function () {
            var kuvaEl = document.getElementById('kuva');
            var r_src = kuvaEl.getAttribute('r-src');
            var a_src = kuvaEl.getAttribute('a-src');
            kuvaEl.setAttribute('src', r_src);
            kuvaEl.setAttribute('src', a_src);
        });
    </script>
</head>
<body>
    <img id="kuva" src="http://www.viikonloppu.com/wp-content/uploads/2014/04/lotoflaughters.com_-619x428.jpg?c3bc1b"
         a-src="https://www.manitowoccranes.com/~/media/Images/news/2014/Potain-China-hi-res.jpg"
         r-src="http://fuzyll.com/images/2016/angel_oak_panorama.jpg" />
</body>
</html>
But the problem is that the r_src download is aborted when src is changed the second time. I want to download both images in parallel and show r_src first (only if it downloads faster than a_src), then show a_src as soon as it is ready.
Also, is it possible to download the a_src and r_src images into the browser cache before src is actually changed? Ideally the src change would either retrieve the image from the cache or join the pending download for that URL.
I can also use jQuery. The implementation must support IE7.
You just need to use JavaScript or jQuery and load two versions of the same image. The first will be your low res, but you will download the high res inside a hidden img tag.
When the download is complete, you just hide/delete the low res image and show the high res.
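A rough sketch of that swap, with the image factory injected so the logic can be tried outside a browser. The swapWhenLoaded name and its parameters are my own illustration, not from the linked answer; a detached Image object plays the role of the hidden img tag.

```javascript
// Swap a low-res <img> to a hi-res URL once the hi-res file has
// finished downloading into a detached Image (the "hidden img").
function swapWhenLoaded(imgEl, hiResUrl, createImage) {
  var loader = createImage();      // e.g. function () { return new Image(); }
  loader.onload = function () {
    imgEl.src = hiResUrl;          // hi-res is cached now, so the swap is instant
  };
  loader.src = hiResUrl;           // kicks off the background download
  return loader;
}

// Browser usage (assumed markup: <img id="photo" src="low.jpg">):
// swapWhenLoaded(document.getElementById('photo'), 'high.jpg',
//                function () { return new Image(); });
```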
This link shows some tests and a few ways to do it, and it should support IE7: Load a low-res background image first, then a high-res one
You can use the interlaced (progressive) JPEG format.
This is the preferred method for handling high quality images and has been adopted by many websites. The idea is that the image is compressed in such a way that the receiver gets it in finer and finer detail as the transfer progresses.
If you don't want to use the above technique:
Put the low quality image in the src of the image. Once the whole page has loaded successfully, replace the low quality image with the high quality one:
<img id="target-image" src="low-quality.jpg" data-src="high-quality.jpg" />
$(window).load(function () {
    var imgSrc = $('#target-image').data('src');
    $('#target-image').attr('src', imgSrc);
});
You should put your low res as the default src, then use JS to download the high res version and change the image src on download completion.
Also, good practice is to use data-* for custom attributes.
If you really want a parallel download, you could replace the "load" event with the "DOMContentLoaded" event; however, this will extend the time your user has to wait until the page is ready. You should keep the load event to prioritize loading of critical assets (scripts and stylesheets).
window.addEventListener('load', function () {
    // get all images
    let images = document.getElementsByClassName("toHighRes");
    // for each image, start the background loading
    for (let i = 0; i < images.length; i++) {
        InitHighResLoading(images[i]);
    }
});

function InitHighResLoading(image) {
    let hrSrc = image.dataset["hr"];
    let img = new Image();
    img.onload = () => {
        // callback when image is loaded
        image.src = hrSrc;
    };
    // launch download
    img.src = hrSrc;
}
img {
    /* only for code snippet */
    max-height: 300px;
}
<img class="toHighRes"
     data-hr="https://www.manitowoccranes.com/~/media/Images/news/2014/Potain-China-hi-res.jpg"
     src="http://fuzyll.com/images/2016/angel_oak_panorama.jpg" />
I'm using this JavaScript to preload a few images on my website.
var images = new Array();
function preload() {
    for (var i = 0; i < preload.arguments.length; i++) {
        images[i] = new Image();
        images[i].src = preload.arguments[i];
    }
}
preload(
    "img/1.png",
    "img/hover.png",
    "img/image.png",
    "img/work1.png"
);
This code is linked in the HEAD of the site.
But when someone visits my website, they wait a few seconds while the images load, and during that time they see a blank (white) page until the JS files are loaded. I want visitors to see a loading progress bar or a message such as "Wait until the page is loaded", rather than a blank index page while the JavaScript runs.
Unless an accurate progress bar is actually helpful to your users, you are probably better off simply using an animated GIF that gets hidden after your loading functions finish. Something like this:
Put the gif on the top of your index.html file. Something like
<div id="loading-gif"><img src="/path/to/gif"></div>
Then when your content loads, simply execute something like
document.getElementById("loading-gif").style.display = 'none';
I have some tracking pixels on our site, that I'd like to protect against them impacting our user experience if their servers are down or slow. What's the easiest way to specify a maximum time the browser should attempt to load a given img - i.e. try for 100ms and then give up? (I'd rather not track a given customer than have a server hang on the third-party server impact our site).
You could insert the <img> with JavaScript and use setTimeout() to remove it after 100ms.
Example with jQuery:
var tracker = $("<img>", { src: trackingUrl }).appendTo(document.body);
setTimeout(function() { tracker.remove(); }, 100);
You should load them when the document is ready, or on the last line of the HTML; that way it won't hurt the user experience.
Document ready can be used with jQuery, but you can also use window.load.
As a rule (though not always), all scripts should go at the end of the page.
If you want to force a timeout kill:
Create an img tag.
Attach a load event to the img (the handler sets a flag: downloaded = 1).
Set the src.
Then kill the img with the setTimeout function.
How? If after X ms downloaded == 0, you kill it.
So the img's load event sets the flag (downloaded = 1), and the timeout function runs unconditionally after X ms but removes the img only if downloaded == 0.
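The steps above can be sketched as follows. The trackWithTimeout name is hypothetical, and document and setTimeout are passed in as parameters (doc and timer) purely so the flag/kill logic can be exercised without a real DOM; in a browser you would pass the real ones.

```javascript
// Sketch of the flag/kill approach: insert the tracking pixel, and
// remove it after `ms` milliseconds unless its load event fired first.
function trackWithTimeout(doc, timer, src, ms) {
  var img = doc.createElement('img');
  var downloaded = 0;
  img.onload = function () { downloaded = 1; };  // flag: pixel arrived
  img.src = src;
  doc.body.appendChild(img);
  timer(function () {
    if (downloaded === 0 && img.parentNode) {
      img.parentNode.removeChild(img);           // give up on the pixel
    }
  }, ms);
  return img;
}

// Browser usage: trackWithTimeout(document, setTimeout, trackingUrl, 100);
```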
You would have to use JavaScript to do this; there's nothing native to HTML/HTTP that would do this on a per-page basis. Google around for "HTML IMG timeout".
Since not all img elements have a container, img.parentNode.removeChild(img) won't always work; removing the src attribute does:
var img = new Image();
img.src = '...third-party server...';
setTimeout(function () {
    img.removeAttribute('src');
}, 100);
You could call a server process in the IMG tag. Let it worry about timing out the load.
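A minimal sketch of that server-side idea, under stated assumptions: the pixelWithTimeout helper, its injected fetchPixel function, and the 'blank-pixel' fallback string are all illustrative placeholders (in practice the fallback would be the bytes of a 1x1 transparent GIF). The endpoint races the third-party fetch against a timer and serves the fallback if the timer wins, so a slow tracker never delays your page.

```javascript
// Hypothetical server-side helper: give the third-party pixel `ms`
// milliseconds to answer, then fall back to a placeholder response.
var FALLBACK = 'blank-pixel'; // stand-in for 1x1 transparent GIF bytes

function pixelWithTimeout(fetchPixel, ms) {
  // fetchPixel is an injected function returning a Promise of the
  // pixel bytes; Promise.race settles with whichever finishes first.
  var timeout = new Promise(function (resolve) {
    setTimeout(function () { resolve(FALLBACK); }, ms);
  });
  return Promise.race([fetchPixel(), timeout]);
}
```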
I would like to stop images from loading, as in not even get a chance to download, using greasemonkey. Right now I have
var images = document.getElementsByTagName('img');
for (var i=0; i<images.length; i++){
images[i].src = "";
}
but I don't think this actually stops the images from downloading. Anyone know how to stop the images from loading?
Thanks for your time and help :)
If you want to disable images downloading for all websites (which I guess you might not be doing) and are using firefox, why not just disable them in preferences? Go to the content tab and switch off "Load images automatically".
Almost all of the images are not downloaded, so your script is almost working as is.
I've tested the following script:
// ==UserScript==
// @name stop downloading images
// @namespace http://stackoverflow.com/questions/387388
// @include http://flickr.com/*
// ==/UserScript==
var images = document.getElementsByTagName('img');
for (var n = images.length; n-- > 0;) {
    var img = images[n];
    img.setAttribute("src", "");
}
Use a dedicated extension to manage images (something like ImgLikeOpera).
If you'd like to filter images on all browser then a proxy with filtering capabilities might help e.g., Privoxy.
I believe Greasemonkey scripts are executed after the page has loaded, so I guess the images are loaded too.
Not entirely related, but I use this bit of code to toggle displaying of images in Firefox in the EasyGestures plugin. I am not sure if this can be translated to greasemonkey, but it might be a starting point.
var prefs = Components.classes["@mozilla.org/preferences-service;1"]
    .getService(Components.interfaces.nsIPrefBranch);
var nImgPref = prefs.getIntPref("permissions.default.image");
if (nImgPref == 1) {
    prefs.setIntPref("permissions.default.image", 2);
    alert('Images off.');
} else {
    prefs.setIntPref("permissions.default.image", 1);
    alert('Images on.');
}
I know it's not greasemonkey, but you could try the "IMG Like Opera" extension. It definitely keeps the files from downloading, and has more flexibility than just on/off.
Do you know for sure that the images still load? Maybe you should verify it using Firebug or some such tool.