Improve load time of a very long scrolling page of images - JavaScript

I've got 5 very long images (each 7 scrolled pages long) and I'm wondering how to get the best load time.
Would the best way be to cut each image up into 7 parts and then fade them in with JavaScript as they come into view? Or do you think I could get away with just fading in the 5 very long images?
I'm trying to keep this responsive as well.
Cheers
KE

Image tiling/slicing seems like a reasonable way to go, albeit at the cost of slightly more effort and of storing/delivering assets in a format that doesn't lend itself to reuse elsewhere - if that's even a consideration here.
If you have control of the images, could you deliver them in a format/setting that provides for progressive rendering (interlacing)?
See http://nuwen.net/png.html for a very good PNG-focused introduction to the topic.
Of course, if you're loading multiple images, you might also get some improvement by deferring each load until the viewport is approaching the image, until the image above it has finished loading, or some combination of the two.
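For what it's worth, here is a minimal vanilla-JS sketch of that deferred-loading idea, assuming each slice is an <img> with its real URL in a data-src attribute and a CSS class that handles the fade-in:

// Minimal lazy-load sketch; data-src and the 'loaded' class are assumptions.
const observer = new IntersectionObserver((entries) => {
  entries.forEach((entry) => {
    if (entry.isIntersecting) {
      const img = entry.target;
      img.src = img.dataset.src;                      // start the real download
      img.onload = () => img.classList.add('loaded'); // hook for a CSS fade-in
      observer.unobserve(img);                        // each slice loads only once
    }
  });
}, { rootMargin: '200px' });                          // begin shortly before the slice scrolls in

document.querySelectorAll('img[data-src]').forEach((img) => observer.observe(img));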

Related

Preload/Lazyload Images using JavaScript as user scrolls down a page

Edit: Solved it, and what I wanted had nothing to do with lazy loading. I'm leaving the original question and post as they are, but I removed the excerpt (since it is pretty much pointless and I'm deleting it on codesandbox.io). The answer I came up with is below the question.
A user visits a page with 20 images. However, I do not want all the images to load instantly, as the website will feel slow if the images are big or if there are many more of them.
If the user is on a certain image, load that image and preload the next 5 images (for example), but not the rest. As the user scrolls, the website will continuously load the next 5 images. I already have an intersection observer set up, as well as a custom image-loading component.
I know of a solution where I keep calling my server as the user scrolls. However, I would not like that, as it might be too excessive. (And it's an image-focused/heavy website.)
Preferably solved using vanilla JavaScript or CSS. If the solution is too complicated, I wouldn't mind using a lightweight package. I have no clue where to start with this.
Note: The project is running on Nuxt.js
Well, I coded out what I needed - thanks to Kissu for the article. Although what I wanted had nothing to do with lazy images, the article gave me a hint on how it can be done. It was a simple combination of an intersection observer, conditionals, and "v-if" for rendering that component:
Set up an intersection observer to track which image index the user is on.
Add a variable to track the furthest image the user has scrolled to, so the website renders and keeps all previous images in the browser.
Set up a "v-if" to render the next few images, i.e. render the image at the index the user is on plus the next few. E.g. to preload the next 5 images:
<imageComponent v-if="lastImgViewed+5 > CurrentImage"></imageComponent>
That's it! Simple! Although I don't know if it's lean or the best way to do this, performance-wise.
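For illustration, here is a rough sketch of how those steps could fit together in a component (the observer wiring and names below are one guess at an implementation, not the original code):

// Illustrative sketch only; assumes each rendered image element carries a
// data-index attribute so the observer can report which image is in view.
export default {
  data() {
    return { lastImgViewed: 0 };
  },
  mounted() {
    const observer = new IntersectionObserver((entries) => {
      entries.forEach((entry) => {
        if (entry.isIntersecting) {
          const index = Number(entry.target.dataset.index);
          // Track the furthest image reached so earlier ones stay rendered.
          this.lastImgViewed = Math.max(this.lastImgViewed, index);
        }
      });
    });
    this.$el.querySelectorAll('[data-index]').forEach((el) => observer.observe(el));
  },
};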

How browsers handle a large number of image files

Is the ability of a browser (Chrome, for example) to handle a number of images limited only by the hardware of the computer on which it runs, or also by the software itself?
I'm trying to develop an image viewer that must hold a lot of files that have to be instantly accessible on user demand, and sometimes when I go over 350 files of 300 KB each, the page freezes.
Thank you all for your help!!
It is probably limited by both.
There are limits to how large a struct is allowed to be, for instance, because it has to fit in a fixed number of bytes.
(see this question for instance)
(Also, you're not yet running into the max size of an int, so that is probably not what is happening right now.)
Besides this, there are several more constraints.
That said, loading a gigantic amount of images every time you open a page is probably not a good idea.
Take some inspiration from how others (like Google in their image search) have solved this problem: simply by not loading the images until they are needed.
I think the best approach would be to have small thumbnails of each image and, on user request (a click), load the bigger one, like Timothy said.
If you need it to be faster you can preload images: for example, if you have a list of images the user scrolls through, just load the next n images ahead of the current position. To free space you can "delete" the ones the user has already passed.
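A rough sketch of that sliding-window idea (the urls array, window size, and update trigger are assumptions for illustration):

// Keep a small window of preloaded images around the user's position.
const urls = [/* image URLs */];
const WINDOW = 5;
const cache = new Map();

function updateWindow(current) {
  // Preload the next few images so they are ready before the user reaches them.
  for (let i = current; i < Math.min(current + WINDOW, urls.length); i++) {
    if (!cache.has(i)) {
      const img = new Image();
      img.src = urls[i];
      cache.set(i, img);
    }
  }
  // Drop references to images the user has passed; actual freeing is up to the GC.
  for (const i of cache.keys()) {
    if (i < current - 1) cache.delete(i);
  }
}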

JavaScript image loading, Facebook style

So I was browsing through this page: http://360langstrasse.sf.tv/
It is basically a JavaScript street view, but one that only allows a single direction, so it's kind of like playing a movie.
When moving fast I noticed that the images are grainy/pixelated, the same way as when browsing through Facebook.
I was wondering how to implement this?
I tried sending small base64-encoded images in the markup and then drawing them on a canvas until the 'real' image was loaded.
This worked fine, but left me wondering whether it would indeed increase performance, or whether sites like Facebook do it differently.
Thanks in advance for any help.
Regards Jens
Edit: Or do they only display the images differently? Do they have a different render process than usual? I don't see any small images being loaded.
Edit 2: The below-mentioned option of loading small images first is described nicely here: http://www.phpied.com/picassa-progressive-image-rendering/
But basically it is pretty simple.
I suppose caching (having in-page) low-resolution images and then fetching better ones is a real way to accomplish this.
The other way would be linking to small images in the normal way and fetching bigger ones with JS - small images should load really fast, or you can subscribe to their load event (tricky in IE) and show the page (remove some overlay) once they are loaded.
BTW, instead of using canvas you can put the base64 directly into src:
<img src="data:image/png;base64,...
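As a rough sketch of the small-image-first approach above (file names and the data-full attribute are placeholders), link the small image normally and swap in the big one once its load event fires:

// Swap each small image for its full-size version once that version has loaded.
// Assumed markup: <img src="small.jpg" data-full="big.jpg">
document.querySelectorAll('img[data-full]').forEach((img) => {
  const full = new Image();
  full.onload = () => { img.src = full.src; }; // swap only when fully loaded
  full.src = img.dataset.full;
});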
The answer is progressive JPEGs!
You can create them with ImageMagick, for example. This way the browser renders the image progressively until the download is aborted or complete. Such images may be bigger than normal images, but not always.
Furthermore, they can be viewed before they are completely downloaded.
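For example, with ImageMagick the conversion is a one-liner (file names here are placeholders):

convert original.jpg -interlace Plane progressive.jpg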
Thanks for the help!

Can I resize images using JavaScript (not scale, real resize)

I need to dynamically load and put on screen a huge number of images - it can be something like 1000-3000. Usually these pictures are of different sizes, and I'm getting their URLs from the user, so some of them can be 1024x800 and others 10x40 pixels.
I wrote a good JS script showing them nicely on one page (à la Google Image Search style), but they are still very heavy on RAM (a hundred 500 KB images on one page is not good), so I thought about the option of really resizing the images - like turning an image that's 1000x800 pixels into something like 100x80, and then forgetting (freeing the RAM of) the original one.
Can this be done using JavaScript (without server side processing)?
I would suggest a different approach: Use pagination.
Display, say, 15 images. Then the user clicks on 'next page' and the next page is shown.
Or, even better, you can script it so that when the user reaches the end of the page, the next page is loaded automatically.
If that is not what you want to do, and you'd rather build a collage of images, then you could look at CSS3 transforms. I think they should be fast.
What you want is to take some pressure off the client so that it can handle all the images. Letting it resize all the images (JavaScript is client-side) will do exactly the opposite, because actually resizing an image is usually far more expensive than just displaying it (and not possible with browser JS anyway).
Usually there is a better solution than displaying that many items at once. One would be dynamic loading, e.g. as the user scrolls down the page (like the new Facebook profiles do), or pagination. I can't imagine that all 1k-3k images would be visible at once.
There is no native JS way of doing this. You may be able to hack something together using Flash, but you really should resize the images on the server, because:
You will save on bandwidth transferring those large 500K images to the client.
The client will be able to cache those images.
You'll get a faster loading page.
You'll be able to fit a lot more thumbnail images in memory and therefore will require less pagination.
I'm (pretty) sure it can be done in browsers that support canvas. If this is a path you would like to take, you should start here.
I see two possible problems with the canvas approach:
It will probably take a really long time (relatively speaking) to resize many images. Because of this, you're probably going to have to look into utilizing web workers.
Will the browser actually free up any memory if you remove the image from the DOM and/or delete/null all references to those images? I don't know.
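If you do go the canvas route, a minimal downscale sketch looks something like this (it assumes the image is already loaded and same-origin):

// Draw the full image scaled down onto a small canvas and display that instead.
function makeThumbnail(img, width, height) {
  const canvas = document.createElement('canvas');
  canvas.width = width;
  canvas.height = height;
  canvas.getContext('2d').drawImage(img, 0, 0, width, height);
  return canvas;
}

// Swap a big <img> for a 100x80 canvas and drop the large element;
// whether the browser then frees the original decode is up to the engine.
const big = document.querySelector('#big');
big.replaceWith(makeThumbnail(big, 100, 80));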

What is the best way to make a side-scrolling website load quickly?

I'm new to Stack Overflow and programming, so forgive me for any awkward phrasing!
I am building a side-scrolling website which is graphics-rich and 680x9400 px in size. I will be using some JavaScript and/or MooTools to create a cool side-scrolling effect, similar to http://sursly.com.
I am web-optimizing all the images used, but would like to know if anyone has any other ideas for speeding up page loading. Is there any way to pre-load the site in horizontal sections, for example?
Thanks in advance.
Using something like the jQuery Lazy Load plugin, you can get a perceived speedup, since only the visible images will be loaded and they won't compete with off-screen images.
I know you'll probably be using MooTools, but it'd surprise me if they didn't have something similar.
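For reference, the classic jQuery plugin is wired up roughly like this (older releases read the real URL from a data-original attribute; details vary between versions):

<img class="lazy" data-original="photo.jpg" width="640" height="480">
<script>
  $(function () {
    $("img.lazy").lazyload();
  });
</script>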
Notice that most of the graphics on the Sursly site are pure black and white. This makes the site load dramatically faster, since the files can be optimized way down.
I agree with Past One's answer, but would modify it slightly: instead of loading it as you need it, load it when you can. That is, initially load nothing but the first page. Once that has loaded, load the second, then the third, and so on.
Keep track of which parts have and have not loaded yet, and if a "page" which hasn't loaded is requested, display a "please wait" sign and bump that page up the priority queue.
Remember to be careful with these techniques if you're interested in getting indexed by search engines.
Most websites that do this work the way Google Maps does: they divide the world (or, in your case, the virtual side-scrolling page) into tiles. As the user scrolls, AJAX is used to load the next tile, which is displayed when the user reaches the edge of the currently visible one.
You can load more than one not-yet-visible tile on each side if you want; it costs more client-side memory but gives a better user experience.
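A sketch of that tiling idea (the tile URLs, tile width, and #strip container are assumptions for illustration):

// Append the next tile whenever the viewport's right edge gets close to
// the end of the content loaded so far.
const tileUrls = ['tiles/0.jpg', 'tiles/1.jpg', 'tiles/2.jpg' /* ... */];
const tileWidth = 1360; // width of each tile in px, for example
let loaded = 0;

function loadTilesNearViewport() {
  // Load until we are one full tile ahead of the viewport's right edge.
  while (loaded < tileUrls.length &&
         loaded * tileWidth < window.scrollX + window.innerWidth + tileWidth) {
    const img = new Image();
    img.src = tileUrls[loaded++];
    document.querySelector('#strip').appendChild(img);
  }
}

window.addEventListener('scroll', loadTilesNearViewport);
loadTilesNearViewport(); // fill the initial viewport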
