How to cache an image array? - javascript

I'm trying to make an image-sequence scrolling website, something like Apple's AirPods Pro page.
I'm facing an issue with image loading: each image only loads the first time it is displayed.
It appears that I have to preload all images into the DOM to cache them so scrolling looks smooth.
So I'm looking for a better way to cache my images.
Here is my image array:
const MAX = 4875; // last image number
const PREFIX = "images/image-scroll/Test_"; // image location
const EXT = ".jpg"; // image format
var images = [];
for (let i = 0; i <= MAX; i += 1) {
  images.push(PREFIX + ("00000" + i).slice(-5) + EXT);
}
Currently, I'm preloading images this way:
// Preload images
jQuery(document).ready(function () {
  jQuery.each(images, function () {
    jQuery('<img />').attr('src', this).appendTo('body').hide();
  });
});
This method works, but it makes the site heavy, and it creates new img elements every time the page is reloaded.
Is it possible to load all the images into the cache without adding elements to the page?

One possible solution is to issue GET requests to the image URLs directly instead of waiting for the browser to request them from DOM elements. This way the DOM is not modified, and the images may still be cached by the browser.
The network and caching layers should behave the same as they would if you had added the images to the DOM.
// Preload images
jQuery(document).ready(function () {
  jQuery.each(images, function () {
    fetch(this); // uses the Fetch API to make a GET request; jQuery's AJAX works too
  });
});
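If you want to keep the decoded images around without relying only on HTTP caching, a common alternative (a sketch, not part of the original answer) is to preload with Image objects held in an array; no DOM insertion is required, and holding the references keeps the browser from discarding the loaded images:

```javascript
// Preload every URL via an in-memory Image object and resolve
// once all of them have finished loading.
function preloadImages(urls) {
  return Promise.all(urls.map(function (url) {
    return new Promise(function (resolve, reject) {
      var img = new Image();
      img.onload = function () { resolve(img); };
      img.onerror = function () { reject(new Error('Failed to load ' + url)); };
      img.src = url;
    });
  }));
}

// Usage (assumes the `images` array from the question):
// preloadImages(images).then(function (loaded) {
//   console.log(loaded.length + ' frames cached');
// });
```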

Related

Can not load images from cache

I'm trying to build a gallery with thumbnails.
When you click a thumbnail, the main image container shows it.
It seems that whenever I change the URL of the container, it ALWAYS reloads the image from the network instead of using the cache.
I tried two approaches; both give the same result.
First:
// save to cache right when the script is loaded
var cacheImages = [];
preloadimgs();
function preloadimgs() {
  for (var k = 0; k < 5; k++) {
    var img = new Image();
    img.src = currentStore.product.media.photos[k];
    cacheImages.push(img);
  }
}
// load
function thumb(arg) {
  var prvimgShow = cacheImages[arg].src;
  var finalurl = "url(" + prvimgShow + ")";
  $("#preview").stop().animate({ opacity: 0.7 }, 100, function () {
    $(this).css({ 'background-image': finalurl })
      .animate({ opacity: 1 }, { duration: 200 });
  });
}
This results in a VERY slow reload of that photo whenever I click a thumbnail.
Second (which I already asked about here):
Here I take the photo FROM the thumbnail's background as the source:
// get a thumb's background photo
var prvimgShow = document.getElementById("thumb0").style.backgroundImage;
// show it
$('#thumb0').on('click', () => {
  $("#preview").stop().animate({ opacity: 0.7 }, 100, function () {
    $(this).css({ 'background-image': prvimgShow })
      .animate({ opacity: 1 }, { duration: 200 });
  });
});
BOTH result in a reload of the image that takes about 2 seconds.
The images are small (~300 KB), and for a gallery this is bad UX.
While I don't know the browser internals or why it isn't caching, one solution I can think of is:
Don't use the direct URL; instead, download the file once and turn it into a blob URL. Keep the reference to the blob URL in cacheImages, just as you are doing now. Then use the blob URL everywhere else and the resource will not be downloaded again.
You can refer to How to get a File() or Blob() from an URL in javascript? for how to fetch an image and convert it to a blob URL.
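A minimal sketch of that blob-URL idea (assuming the standard fetch and URL.createObjectURL APIs are available, and reusing the cacheImages name from the question):

```javascript
// Download each image once and keep a blob: URL for it.
// Using the blob URL afterwards never hits the network again.
function cacheAsBlobUrls(urls) {
  return Promise.all(urls.map(function (url) {
    return fetch(url)
      .then(function (res) { return res.blob(); })
      .then(function (blob) { return URL.createObjectURL(blob); });
  }));
}

// Usage (assumes `currentStore` from the question):
// cacheAsBlobUrls(currentStore.product.media.photos.slice(0, 5))
//   .then(function (blobUrls) {
//     cacheImages = blobUrls; // pass these to thumb() instead of network URLs
//   });
```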
If you want the browser to load the images from its cache, you have to configure this on the server, e.g. with an ETag or Cache-Control header. That lets the browser know it may load these assets from the cache.

Replace Relative linked images with absolute path images in a string using JavaScript

I have a Google Chrome extension I am building for adding new bookmarks to my bookmarks app.
One of its features is saving a screenshot image of the web page plus up to 3 additional images.
In the Chrome extension, the 3 additional images show as text inputs for image URLs.
Under each input I have scraped the page HTML for all images, and I show them in a slider with previous and next arrow buttons to rotate through every image on the page. If the user likes one of the images, they can select it in this slider, which converts the image to a Base64-encoded string and uploads it to my remote bookmark app server.
My problem is that the image selector shows a broken image for any image that was linked in the page with a relative path instead of a full path including a domain name.
(The last of the 4 images in the animated GIF below is the broken one.)
If I view the page source and see a relative linked image like this...
Then this image shows as broken in my image selector/slider, because the relative URL ends up with the extension URL prepended to it...
Below is my JavaScript function, which scrapes the HTML and collects the images found in the page.
I need to detect when an image URL is relative and prepend the page URL to make it an absolute path.
Any ideas how to achieve this?
Relative image URLs currently end up linking to the image with this as the "domain"... chrome-extension://pcfibleldhbmpjaaebaplofnlodfldfj.
I need to prepend the URL of the web page to all relative linked images instead.
In my JS function below, where the image URL is saved to an array, img.src looks like this for relative URLs...
So if I could simply replace chrome-extension://pcfibleldhbmpjaaebaplofnlodfldfj with the web page URL, that would fix my problem.
The Chrome extension URL differs per install, though, so I'd need to match that pattern.
JavaScript function to get all images in an HTML string:
/**
 * Scrape webpage and get all images found in HTML
 * @param string $htmlSource - HTML string of the webpage HTML
 * @return array - array of HTML strings with list items and images inside each list item
 */
scrapeWebpageForImages: function($htmlSource) {
    // HTML source code of the webpage passed into jQuery so we can work on it as an object
    var $html = $($htmlSource);
    // All images
    var images = $('img', $html),
        scanned = 0,
        filtered = [],
        ogtmp = '',
        srcs = {};
    // Grab the Open Graph image (its URL lives in the meta tag's
    // content attribute, not its text)
    var ogimage = $('meta[property="og:image"]', $html);
    if (ogimage.length > 0) {
        ogtmp = $('<img>').prop({
            'src': $(ogimage).attr('content'),
            'class': 'opengraph',
            'width': 1000, // high priority
            'height': 1000
        });
        images.push(ogtmp.get(0));
    }
    var i = 0,
        l = images.length,
        result = '',
        img;
    // Cycle through all images
    for (; i < l; i++) {
        scanned += 1;
        img = images[i];
        // Have we seen this image already?
        if (!!srcs[$(img).attr('src')]) {
            // Yep, skip it
            continue;
        } else {
            //////////////////////////////////////
            ///
            /// NEED TO DETECT A RELATIVE LINKED IMAGE AND REPLACE WITH ABSOLUTE LINKED IMAGE URL
            /// USING THE WEBPAGE URL
            ///
            //////////////////////////////////////
            // Nope, remember it
            srcs[$(img).attr('src')] = true;
            result = '<li><img src="' + img.src + '" title="' + img.alt + '"></li>';
            filtered.push(result);
        }
    } // end for loop
    return filtered;
},
var url = "chrome-extension://pcfibleldhbmpjaaebaplofnlodfldfj/assets/xyz";
var myRe = /chrome-extension:\/\/[\w]*/g;
var match = myRe.exec(url);
if (match) { // exec returns null when there is no match
  // Pattern matched
  var path = url.substring(match[0].length);
  url = 'whatever your base url is' + path;
} else {
  console.log('Did not find a url.');
}
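Alternatively, rather than matching the extension origin with a regex, the standard URL constructor can resolve a raw src attribute against the page's URL. A sketch: pageUrl is assumed to come from somewhere like the tabs API, and you'd feed in the raw .attr('src') value rather than img.src, since the latter has already been resolved against the extension origin:

```javascript
// Resolve a possibly-relative src against the scraped page's URL.
// Absolute URLs pass through unchanged; relative ones get the
// page's origin and path prepended.
function toAbsoluteUrl(src, pageUrl) {
  return new URL(src, pageUrl).href;
}

// toAbsoluteUrl('/img/logo.png', 'https://example.com/articles/post')
//   → 'https://example.com/img/logo.png'
```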

Preload images asynchronously into the browser cache when the image has to be generated on the server first

I've been reading Stack Overflow for a couple of years now, but never posted until today: I ran into an issue I could not solve by myself and did not find a solution for.
The scenario: I have a dynamic web page which shows screenshots of websites. These screenshots are generated on the fly for every new user, and their URLs change. I want to preload these images into the browser cache so they're available in 0 ms once the user clicks a link. I don't want to increase the subjective load time of the page, so they have to load invisibly in the background.
My approach:
I used jelastic as my infrastructure to be able to scale later, then installed CentOS with nginx, PHP and PhantomJS. I use PHP to have PhantomJS take the screenshots:
exec("phantomjs engine.js " . $source . " " . $filez . " > /dev/null &");
The > /dev/null & is there so the user's load time isn't increased.
I output the links to the browser. So far it works. Now I want to preload these images:
for (var i = 0; i < document.links.length; i++) {
  imgArray[i] = new Image(1, 1);
  imgArray[i].visibility = 'hidden';
  imgArray[i].src = (document.links[i].href.substr(7) + ".png");
  document.links[i].href = 'javascript: showtouser("' + imgArray[i].src.substr(7) + '");';
}
Two things I probably did wrong here:
I start the image preloading before the images have been generated on the server. I haven't found a way to start caching only once the image has been generated by PhantomJS; the onload event obviously does not help here.
I also think my approach is not really async, so it would increase the subjective loading time felt by the user.
What am I doing wrong? I'm an ISP guy; I suck at JavaScript. :/
Your approach was async. What you needed was a mechanism to identify images that didn't load and to retry them.
This script preloads the images, retries on failure, and hides the links of images that don't exist even after the retries (demo):
var links = Array.prototype.slice.call(document.links, 0); // convert document.links to a normal array
links.forEach(prepareImageLink);

function prepareImageLink(link) {
  var retries = 3; // remaining retries for the current image
  var image = new Image();

  /** option: hide all links, then reveal those with a preloaded image
  link.style.visibility = 'hidden';
  image.onload = function() {
    link.style.visibility = '';
  };
  **/

  image.onerror = function() {
    if (retries === 0) {
      link.style.visibility = 'hidden'; // hide the link if the image doesn't exist
      // link.remove(); // option - remove the link instead
      return;
    }
    retries--;
    setTimeout(function() {
      image.src = image.src + '?' + Date.now(); // force a reload by appending the current dateTime to the URL
    }, 1000); // how long to wait before retrying; tune this to fit your system
  };

  image.src = (link.href + ".png"); // add .substr(7) in your case

  /** This replaces 'javascript: showtouser(...)', which won't work in some browsers **/
  link.addEventListener('mouseover', function() {
    document.getElementById('image').src = image.src; // change to your showtouser func
  });
}

javascript: parsing JSON to create new classes with background-image preloaded

I am developing a browser game using MVC4 and deployed it to my website for the first time tonight. I'm sloshing through the differences between running the site off localhost and running it from the website.
Right now the first thing I do is load all of the static content on the page: navigation bars, icons, the world map, and some basic JavaScript to interact with it. Then I call four AJAX functions that return information about various elements on my 2D map, such as cities with their x,y coordinates, names, and a small flag icon.
When an AJAX call completes, I parse the returned JSON. For each item, I add a new 20x20-pixel element using code like this:
function loadSuccessCityTiles(data) {
  var cityTiles = jQuery.parseJSON(data);
  var cityTileCount = cityTiles.length;
  $(".CityTile").remove();
  for (var i = 0; i < cityTiles.length; i++) {
    var imgName = "../Images/City_Tiles/Small/" + cityTiles[i].Type;
    $("#scrollWindow").append("<div class='CityTile' style='left: " + Math.floor(cityTiles[i].X) * tileSize + "px; top: " + Math.floor(cityTiles[i].Y) * tileSize + "px; background-image: url(" + imgName + ");'></div>");
  }
}
As you can see, I append a new element whose background-image is set to the appropriate image URL (that is what Type means). This works great on localhost, but on the website the images don't load.
I'm not quite sure how to proceed. I have a lot of small image files and no guarantee that all of them will be used, so loading them all would be wasteful. What would be a good solution here? One thought I've had: return JSON data listing which image files will be used, then preload those images via JavaScript before creating the elements that use them.
Why does this happen on the server but not on localhost? Clearing my cache before reloading on localhost does not reproduce the problem. Is it because no download is needed when all the files are already on the hard drive?
Does anyone have a suggestion for a better way to do this altogether?
A better option could be to use a single image sprite as the background image for every tile: use the Type as a class name and set the background x,y position in CSS. The sprite will have a larger file size, but it is a single request that can easily be cached.
As for it not working: what errors are you getting, and what do you see in the Network tab of Chrome dev tools for the image requests?
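As a sketch of that class-based variant (the tile-<Type> class name is made up here, and Type is assumed to be usable as a CSS class token; the CSS would map each class to a background-position in the sprite):

```javascript
// Build the tile markup with a per-Type class instead of an
// inline background-image, so one cached sprite covers all tiles.
function renderCityTiles(cityTiles, tileSize) {
  return cityTiles.map(function (tile) {
    return "<div class='CityTile tile-" + tile.Type + "' style='left: " +
      Math.floor(tile.X) * tileSize + "px; top: " +
      Math.floor(tile.Y) * tileSize + "px;'></div>";
  }).join('');
}

// Usage: $("#scrollWindow").append(renderCityTiles(cityTiles, tileSize));
```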

Changing image extension with javascript/jquery only?

I'm making a simple slider to show off artwork for a friend of mine. I'm really only familiar with javascript/jquery, so I'm not 100% comfortable using anything else right now.
Since my friend doesn't have any programming knowledge, I'm trying to keep updates really simple for her (i.e., automating the creation of new slides whenever she adds an image to the folder). She will upload images to a folder and number them (1.jpg, 2.jpg, ...). My JavaScript uses a for loop to iterate over the numbers (she will have to update the loop bound whenever she adds an image) and insert them into the file name. HOWEVER, this limits her to uploading only one type of file. Is there some way to switch the extension using only JavaScript?
This is what I have so far:
function callImages() {
  // create the image div
  $('.artslider').append('<div class="image"></div>');
  // create the files array
  var files = [];
  // start the loop; the starting position will have to be updated as images are added
  for (i = 8; i >= 0; i--) {
    // create the img src for a jpg img
    var imgJPG = 'arts/' + i + '.jpg';
    // find the natural width of the image after it loads to see if it actually exists
    var imgWidth = $('imgJPG').load().naturalWidth;
    // if the width is undefined, replace the jpg extension with gif
    if (imgWidth === undefined) {
      var imgGIF = imgJPG.replace('jpg', 'gif');
      files[i] = '<img src="' + imgGIF + '" class="artsliderimg"/>';
    }
    // otherwise keep the jpg extension
    else {
      files[i] = '<img src="' + imgJPG + '" class="artsliderimg"/>';
    }
    // then add the images to the img div
    $('.image').append(files[i]);
  }
}
The problem with this if/else is that it only ever creates gif images. If you switch the order, it only creates jpg images.
Edit: here's what this code produces: https://googledrive.com/host/0B1lNgklCWTGwV1N5cWNlNUJqMzg/index.html
The problem is with this bit of code:
var imgJPG = 'arts/' + i + '.jpg';
var imgWidth = $('imgJPG').load().naturalWidth;
imgWidth will always be undefined.
Firstly, you are passing the string 'imgJPG' instead of the variable imgJPG. Secondly, I think you have misunderstood jQuery selectors: they are for selecting HTML elements, so passing a file path in will not achieve anything. Thirdly, I think you have misunderstood the load function: it is for loading data from the server into an HTML element.
I would suggest using a function like below to check if the image exists:
function urlExists(url) {
var http = jQuery.ajax({
type:"HEAD",
url: url,
async: false
});
return http.status == 200;
}
Then in your code:
if (!urlExists(imgJPG)) {
  var imgGIF = imgJPG.replace('jpg', 'gif');
  files[i] = '<img src="' + imgGIF + '" class="artsliderimg"/>';
} else {
  files[i] = '<img src="' + imgJPG + '" class="artsliderimg"/>';
}