Preloading/prefetching large sets of remote images via JavaScript - javascript

I am using the JavaScript (shown below) to preload/prefetch images. The use case is that I have client-side searchable lists of up to 30,000 records, and when a user uses the search I want the avatar/image associated with the search result to render immediately.
I'm running into browser performance issues (the browser and other JS code become less responsive) when using the code below with large arrays of images (e.g. 5,000 images).
Does anyone have suggestions for how to optimize this code so it performs better? Or is there a way to do this in the background using a Service Worker?
function prefetchImages(arr) {
  var cache = document.createElement("CACHE");
  cache.style = "position:absolute;z-index:-1000;opacity:0;";
  document.body.appendChild(cache);
  for (var i = 0; i < arr.length; i++) {
    var img = new Image();
    img.src = arr[i];
    img.style = "position:absolute";
    cache.appendChild(img);
  }
}
prefetchImages(['https://avatars.githubusercontent.com/u/9919?s=200&v=4']);
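For reference, one direction I've been considering is batching the work so the main thread isn't flooded all at once. This is only a sketch of that idea (the chunking with requestIdleCallback is my assumption, not a tested solution):
function prefetchImagesChunked(arr) {
  var index = 0;
  var requested = [];  // keep references so the requests aren't dropped early
  function loadSome(deadline) {
    // Create a small batch per idle period instead of all 5,000 at once
    while (index < arr.length && deadline.timeRemaining() > 0) {
      var img = new Image();
      img.src = arr[index++];
      requested.push(img);
    }
    if (index < arr.length) {
      requestIdleCallback(loadSome);
    }
  }
  requestIdleCallback(loadSome);
}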

Related

Javascript variable VS Object storage for large data script execution

I'm actually running a JS script (ES6 and Guzzle) in the browser (it has to run in a browser, no NodeJS).
This script calls some XML files and stores the results for later use (I convert and output the data, then process it so it can be imported into a database).
So this script will generate an array containing thousands and thousands of small JS objects (from the XML parsing).
As the script takes really long to run, I'm looping over my URL array (I have a list of all the file URLs) and storing each query result in a regular JS variable, as well as in localStorage after JSON-encoding. Because it's all JSON-encoded into one string, the localStorage value is erased every time and a new bigger string is saved for the same key.
My questions:
Is it better to use only a classic variable? Or only localStorage?
Is there any other way to store a large amount of data for a script? (temporary blob, text file, DOM append...)
From my tests, after 3-4k files have been queried and their results stored, the browser starts to slow down a lot and drastically reduces the number of HTTP requests per minute.
Thanks!
Notes:
It has to run in a browser (I need some dynamic DOM data; it's an internal dashboard that displays stats, with user inputs for live settings).
It only needs to run on the latest Chrome or Firefox.
the localStorage value is erased every time and a new bigger string is saved for the same key.
This deserialization-append-serialization process is what slows down the page. Instead, you could store each entry under its own key; that way appending is much more performant:
class PersistentArray {
  constructor(name) {
    this.name = name;
    this.length = +localStorage.getItem(name) || 0;
  }
  push(value) {
    this.set(this.length, value);
  }
  set(index, value) {
    if (index >= this.length)
      localStorage.setItem(this.name, this.length = index + 1);
    localStorage.setItem(this.name + index, JSON.stringify(value));
  }
  get(index) {
    return JSON.parse(localStorage.getItem(this.name + index));
  }
  *[Symbol.iterator]() {
    for (let i = 0; i < this.length; i++)
      yield this.get(i);
  }
}
That way you can easily push values as:
const pages = new PersistentArray("pages");
// ... later on
pages.push({ value: "whatever" });
When all the data is there:
// Turn it into a real in-memory array:
const result = [...pages];
// Or iterate lazily:
for (const page of pages)
  console.log(page);

Iterate through multiple web pages in JS with sleep in between

I was checking some simple solutions for showing multiple web pages on a dashboard, and I'm currently fighting with a simple HTML page with JavaScript inside to achieve what I want to see there.
var urls = new Array();
urls[0] = "https://stackoverflow.com/";
urls[1] = "https://www.google.com";

var arrayLength = urls.length;
for (var i = 0; i < arrayLength; i++) {
  window.location.assign(urls[i]);
  sleep(3000);
}

function sleep(milliseconds) {
  var start = new Date().getTime();
  for (var i = 0; i < 1e7; i++) {
    if ((new Date().getTime() - start) > milliseconds) {
      break;
    }
  }
}
Currently this page opens only the first page (after some time) and doesn't seem to iterate through the other pages. Maybe you could help me make it work? I want to rotate those pages on screen forever (I will add an infinite loop once this part is working).
Currently this page opens only the first page (after some time) and
doesn't seem to iterate through the other pages.
Once you change window.location and navigate to the first URL from the array, you lose all of your JS code (it is no longer present on the newly opened page).
You can work around this by installing a Chrome extension that injects your JS (so it is not lost after the window.location change).
The extension will run the added JS at DOMContentLoaded (no need to attach any event listener yourself).
I also needed to do this: check things on a page, store some information, and move on to the next page. I know this can be done with Python and other tools, but this way it can be done on the front-end side as well.
I used localStorage to store my information.
I pasted this into the browser console to set everything up and clear the localStorage:
// clear the localStorage
localStorage.clear();
// set an array that will keep all our pages to iterate into the localStorage
localStorage.setItem(
  "pages",
  JSON.stringify([
    "https://my-page-1.html",
    "https://my-page-2.html",
    "https://my-page-3.html",
    "https://my-page-4.html",
  ])
);
// set an array that will keep our findings
localStorage.setItem("resultArray", JSON.stringify([]));
// move to the first page of the iteration
window.location.href = "https://my-page-1.html";
After doing this, I opened the plugin interface and added the following code:
(function check() {
  // array saved in localStorage that contains all the pages to iterate
  const pagesArray = JSON.parse(localStorage.getItem("pages"));
  // array to store your stuff
  const resultArray = JSON.parse(localStorage.getItem("resultArray"));
  // whatever you want to check on that page
  const myFancyCondition = true;

  if (myFancyCondition) {
    // push any data to the array so that you can check it later
    resultArray.push({
      page: pagesArray[0],
      message: "I found what I was looking for!",
    });
  }

  // remove the current page from the array
  pagesArray.shift();
  // reset the array value now that the current page has been checked
  localStorage.setItem("pages", JSON.stringify(pagesArray));
  // store the array data
  localStorage.setItem("resultArray", JSON.stringify(resultArray));

  // quit if the iteration is over and there are no more pages to check
  if (!pagesArray.length) return;

  // go to the next page
  window.location.href = pagesArray[0];
})();
Then, to check the results you just need to read the data from the localStorage like:
JSON.parse(localStorage.getItem('resultArray'))
I hope this helps :)!

Javascript giving a TypeError variable is undefined for code (photo gallery) run on server. Same code works locally

I'm putting together a personal website as a portfolio and I'm having trouble getting a photo gallery to work. I found a free JavaScript gallery (http://ettrics.com/lab/demo/material-photo-gallery/) that I decided to implement. When putting the page together locally, the JavaScript runs with no problem; however, when I upload the page to the site (which already has plenty of other JavaScript running), I get the following error when scrolling on the page, or when trying to 'fullscreen' one of the images by clicking on it:
TypeError: this._fullImgs is undefined
I tried to isolate the issue and found that a line of code executes differently on the server than locally; the excerpt is below:
Gallery.prototype._loadFullImgsDone = function() {
  var imgLoad = imagesLoaded(this._fullBox);
  imgLoad.on('done', function(instance) {
    var imgArr = instance.images;
    this._fullImgs = [];
    this._fullImgDimensions = [];
    this._fullImgsTransforms = [];
    for (var i = 0, ii = imgArr.length; i < ii; i++) {
      var rect = imgArr[i].img.getBoundingClientRect();
      this._fullImgs.push(imgArr[i].img);
      this._positionFullImgs.call(this, imgArr[i].img, i);
      this._fullImgDimensions.push(rect);
    }
    this._fullImgsLoaded = true;
  }.bind(this));
};
I have found that the images are being loaded from their source location; however,
imgLoad.on('done', function(instance) {
...
executes differently. The site is located at http://samueller.tech/photo-best.html if anybody would like to see the error I am getting for themselves.
Thanks in advance, I'm at a complete loss as to how to fix this.
I'm seeing (on that site) that the resize/scroll handlers are getting called before the images are loaded:
Gallery.prototype._handleScroll = debounce(function() {
  this._resetFullImg.call(this);
}, 25);

Gallery.prototype._handleResize = function() {
  this._resetFullImg.call(this);
};
Then this._resetFullImg fails because no images have been loaded yet, which is why this._fullImgs is undefined. The code has another variable called _fullImgsLoaded, and the _resetFullImg method should probably do nothing if the images haven't been loaded.
You could try adding that like this:
// in material-photo-gallery.js line 1379
Gallery.prototype._resetFullImg = function() {
  if (!this._fullImgsLoaded) {
    return;
  }
  this._fullImgsTransforms = [];
  for (var i = 0, ii = this._fullImgs.length; i < ii; i++) {
    ...
I don't know how this will affect the rest of the gallery code, but it might work. It makes some sense that on your production system the page load time (with the extra JS and other resources) is such that these events get called before things are ready, which is something you don't see locally.
Good luck.

Huge node leak in web app under Chrome - developer tools not reporting cause

I have a rather complicated web application that is leaking nodes at an incredibly fast rate. Using Windows Task Manager, I can see the amount of memory used by the Chrome process increasing by around 3MB a second.
My initial approach was to disable parts of the application to isolate the area causing the problem. After a little trial and error, I discovered the cause was some table updates in the document. The app requests data from the server every second to update several tables of data. Each table is updated in the same way: all rows are removed from the table and new rows containing the updated data are inserted. I could not see a problem when inspecting the common code that does this.
To solve this I obviously turned to the Chrome Developer Tools. The problem I have is that the tools are giving me conflicting information.
The Timeline tool shows that the number of nodes is increasing by around 28000 each second. Forced garbage collection does not reduce this back to its original level.
The Timeline tool shows the JS Heap size fluctuating over time. A forced garbage collection returns the heap to its original size (plus or minus a few 100K)
Using the "three-snapshot" technique the Profiler - Heap Snapshot tool shows no HTML or Text nodes created between snapshot 1 and snapshot 2 that are present in snapshot 3.
Comparing the snapshots shows the creation and deletion of many HTMLTableRowElement, HTMLTableCellElement, HTMLInputElement and Text nodes. No increase in node count is reported.
The Profiler - Heap Allocation tool verifies the results of the Heap Snapshot tool. No leaks of any Node type are reported at any point.
The Heap tools show small increases in the (compiled code), (array) and (system) types.
The Heap Profile tools report heap sizes of around 12MB for my "raw" javascript version of the app, and around 7MB for my closure compiler - compiled version of the application. These values do not grow much over time.
This leaves me a little confused. There is obviously a memory leak. It is reported by Windows Task Manager, and by the Timeline tool as a node leak, but the Heap profiling tools and JS Heap Timeline do not show the issue.
As far as I can tell, the HTMLTableRowElements are referenced only in two places, in the document and in an object used for lookup by value. The object is always cleared when the table is cleared. I can "cure" the issue by changing my code to create all the nodes, but never inserting them into the document, just referencing them in the object. Obviously this is not a fix because the users cannot see the data.
After 2 days of testing and investigation I am now at a loss on how to proceed. The plot thickens if you throw in IE and Firefox: these browsers do not appear to have the same memory/node leak. I also believe the problem did not previously exist in Chrome. Unfortunately, there does not seem to be a way to go back to previous versions of Chrome to see if it is a bug in Chrome.
Does anyone have any advice on this? Am I missing something, or misinterpreting the output of the developer tools? Is there a way to go back to previous Chrome versions? Does this sound like a bug in Chrome? All comments welcome.
This is the code used to insert items using Google's Closure API:
/**
 * Inserts and/or updates a row in the table.
 *
 * @param {string|number} rowId The identifier of the row in the rows_ map
 * @param {boolean} insertTop If true the row is inserted at the top of the table
 * @param {...goog.dom.Appendable} var_args The items to add to each cell in the row
 */
sm.ui.DataTable.prototype.updateRow = function(rowId, insertTop, var_args) {
  var dom = this.getDomHelper();
  // Insert the new session data
  if (!this.rows_[rowId]) {
    // There is no row present (simple case of creating one)
    // Create the table row
    var row = this.rows_[rowId] = dom.createDom('tr', {'style' : 'display: none;'});
    var colView = this.colView_;
    var colNames = this.columnNames_;
    for (var i = 0; i < colNames.length; i++) {
      var cell = dom.createDom('td');
      if (!colView[i]) {
        goog.style.showElement(cell, false);
      }
      row.appendChild(cell);
    }
    // Add to the table if the element exists
    var element = this.getElement();
    if (element) {
      var tBody = element.tBodies[0];
      this.showPage_(this.currentPage_, false);
      if (insertTop) {
        // Insert a row at the top of the table
        tBody.insertBefore(row, tBody.rows[0] || null);
      } else {
        // Append to the end if insertTop is not set to true
        tBody.appendChild(row);
      }
      this.showPage_(this.currentPage_, true);
      // Update the footer as it may need displaying or changing
      this.updateFooter_();
    }
  }
  // Loop over the var args and set the content of each cell
  // arguments will be strings, Nodes, or arrays of strings and Nodes
  for (var i = 0; i < arguments.length - 2; i++) {
    var row = this.rows_[rowId];
    dom.removeChildren(row.cells[i]);
    dom.append(row.cells[i], arguments[i + 2]); // Removing this line "cures" the problem.
  }
};
/**
 * Removes all rows from the table.
 * @param {string=} opt_message Message to display instead of the data (reset the table again to clear)
 */
sm.ui.DataTable.prototype.reset = function (opt_message) {
  var element = this.getElement();
  var tBody = element.tBodies[0];
  while (tBody.rows.length > 0) {
    tBody.deleteRow(0);
  }
  // Reset the rows and pages
  this.rows_ = {};
  this.currentPage_ = 1;
  this.updateFooter_();
  if (opt_message) {
    // Create the row to insert in the table
    // Ensure it spans all the columns
    var dom = this.getDomHelper();
    var messageCell = dom.createDom('tr', null,
        dom.createDom('td', {'colspan' : this.columnNames_.length}, opt_message));
    goog.dom.append(tBody, messageCell);
  }
};
This very much looks like the classic event purge bug.
If an event is attached to a node, the node is not disposed of until the event handlers are detached.
jQuery, for example, intrinsically detaches all events from removed nodes.
The code to correctly remove a node (along with all of its descendants) is:
function walkTheDOM(node, func) {
  func(node);
  node = node.firstChild;
  while (node) {
    walkTheDOM(node, func);
    node = node.nextSibling;
  }
}

function purgeEventHandlers(node) {
  walkTheDOM(node, function (n) {
    var f;
    for (f in n) {
      if (typeof n[f] === "function") {
        n[f] = null;
      }
    }
  });
}
// now you can remove the node from the DOM
This technique is taken directly from Crockford's JavaScript: The Good Parts.
Alternatively, use jQuery which will take care of it in the very same way.
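For instance, if the table rows and their handlers were managed through jQuery, clearing the body with .empty() (or removing individual rows with .remove()) also detaches the jQuery-bound handlers and data, so the nodes can actually be collected. A minimal sketch (the selector here is just a placeholder, not from your code):
// Assumes the rows and their handlers were created through jQuery
var $tbody = $('#data-table tbody'); // placeholder selector

// .empty() removes all child rows and also cleans up any
// jQuery-bound event handlers and data on those descendants
$tbody.empty();

// Equivalent for a single row:
// $(row).remove();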

Advice on a good method to dynamically load images in JavaScript

I'm writing a web app that needs to dynamically load a lot of images.
I wrote a utility function, loadMultipleImages, that takes care of loading them and calling (possibly optional) callbacks whenever:
a single image is loaded
an error is encountered (a single image can't load)
all the images are loaded without errors
I invoke it like this (you can find a complete example here):
var imageResources = [];
var mulArgs = {
  multipleImages: [],
  onComplete: this.afterLoading.bind(MYAPP),
  onError: this.logError.bind(MYAPP)
};

imageResources = ["imageA_1.png", "imageA_2.png"];
mulArgs.multipleImages.push({ID: "imageA_loader", imageNames: imageResources});

imageResources = ["imageB_1.png", "imageB_2.png"];
mulArgs.multipleImages.push({ID: "imageB_loader", imageNames: imageResources});

// Lots more images
var mImageLoader = new loadMultipleImages(mulArgs);
As soon as I create my loadMultipleImages object, it loads the images and calls the afterLoading() function once they are all loaded (or logError() if there's a problem). Then I can access the images with:
MYAPP.afterLoading = function (loaders) {
  // Just an example
  var imageA_array = loaders["imageA_loader"].images;
  var imageB_first = loaders["imageB_loader"].images[0];
};
By the way, I'm thinking that I'm reinventing the (possibly square) wheel. Is there a lightweight, simple library that does that better than I'm doing? Or simply a better method that spares me the burden of maintaining the loadMultipleImages code?
http://jsfromhell.com/classes/preloader
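If you'd rather not add a dependency, the same callback structure can also be sketched with plain Promises (this loadMultipleImages is a placeholder rewrite of your idea, not an existing library):
function loadMultipleImages(urls, onEach, onError) {
  return Promise.all(urls.map(function (url) {
    return new Promise(function (resolve, reject) {
      var img = new Image();
      img.onload = function () {
        if (onEach) onEach(img); // a single image has loaded
        resolve(img);
      };
      img.onerror = function () {
        if (onError) onError(url); // a single image failed to load
        reject(new Error("Failed to load " + url));
      };
      img.src = url;
    });
  }));
}

// Resolves only when every image has loaded without errors
loadMultipleImages(["imageA_1.png", "imageA_2.png"], console.log, console.error)
  .then(function (images) {
    // all the images are available here
  });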
