Fastest way to load many images - javascript

So I have a single-page web app that needs to load many images. They're all relatively small (around 30 KB each), but I have about a thousand of them. Right now I'm using a hidden div to load all of them when the user hits my website so they are all cached. What is the best way to do this? Currently it's about a thousand GET requests. Would it be possible to somehow zip these and unzip them with JS? Would it even make a difference to send them all in one GET?
EDIT:
Not sure if I was clear or not, but this is how I'm currently doing it:
<div id="hidden-div">
<img src="assets/blah.jpg">
<a ton more img tags>
</div>
Just wondering if there is a better/faster way to do this?

If you are looking to just "preload" the images so that you can use them later (in your app/page), then you can use the following HTML5 markup:
<link rel="prefetch" href="http://davidwalsh.name/wp-content/themes/walshbook3/images/sprite.png" />
See: Link prefetching is a browser mechanism, which utilizes browser idle time to download or prefetch documents that the user might visit in the near future. A web page provides a set of prefetching hints to the browser, and after the browser is finished loading the page, it begins silently prefetching specified documents and stores them in its cache. When the user visits one of the prefetched documents, it can be served up quickly out of the browser's cache.
If it's for a gallery, then @Santosh's answer is correct. Or see http://code.msdn.microsoft.com/Infinite-Scroll-Like-Bing-bc05262b
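If you go the prefetch route for a thousand files, you can generate the hints from script instead of hand-writing the tags. A minimal sketch, assuming the images live under assets/ with a predictable naming scheme (the pattern here is made up for illustration):

// Generate a prefetch hint per image; the path pattern is hypothetical.
var paths = [];
for (var i = 0; i < 1000; i++) {
    paths.push('assets/img-' + i + '.jpg');
}
paths.forEach(function (path) {
    var link = document.createElement('link');
    link.rel = 'prefetch';
    link.href = path;
    document.head.appendChild(link); // fetched during browser idle time
});

That said, it's still a thousand requests; combining small images into sprites (like the sprite.png in the URL above) reduces the request count far more than any preloading trick.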

Related

Manual file caching in JavaScript

I have an HTML-based project that works with media from other websites, embedding images / songs / videos using their direct links. The system works perfectly so far, but I wish to make a change: since a lot of assets are accessed repeatedly by viewers, it would seem more optimal to cache them in a controlled way, so that whenever certain media pops up you don't need to fetch it from the origin server each time. I've never done this before, so I don't know if and how it can be done.
To use an oversimplification: I have an embedded photo called "image.png" inside an image element, which will show up whenever I open the site. Currently it's simply defined as:
<img src="https://foo.bar/image.png">
Works perfectly! However I want to make sure that when my site is accessed, you don't need to fetch that image from foo.bar each time: You will keep it in a local directory after downloading it once, from which the script can fetch and work with the file independently. For Firefox for instance, this subdirectory would be inside your ~/.mozilla/firefox/my_profile directory. Ideally it can be defined using a fixed name, so no matter which URL the website is opened from it uses the same cache path instead of each mirror of the project generating its own.
First, my script must tell the browser to download https://foo.bar/image.png and store it into this cache subdirectory. After that, it would need to generate a link to embed it directly from that subdirectory, so the URL I use would now be something of the following form:
<img src="file://path_to_cache/image.png">
How do I do those two things, in a way that's compatible across popular web browsers? As a bonus, it would be useful to know if I can limit the size of this cache directory, so once it reaches say 100 MB the oldest items will be removed to stay under that size.
You could alternatively add caching rules to your server's .htaccess file.
This site explains how: https://www.siteground.com/kb/leverage-browser-caching/
Note, however, that this only sets HTTP cache headers: the browser will cache the image, but you get no programmatic control over what is stored, where, or for how long.
You could use service workers to cache images on the user's machine.
https://developers.google.com/web/ilt/pwa/lab-caching-files-with-service-worker
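As a rough sketch of what that looks like: a minimal cache-first service worker for image requests. The cache name and file names are illustrative, not anything the API requires.

// sw.js — serve images from a local cache, falling back to the network
self.addEventListener('fetch', function (event) {
    if (event.request.destination !== 'image') return; // only intercept images
    event.respondWith(
        caches.open('image-cache-v1').then(function (cache) {
            return cache.match(event.request).then(function (cached) {
                if (cached) return cached; // already stored locally
                return fetch(event.request).then(function (response) {
                    cache.put(event.request, response.clone()); // store for next time
                    return response;
                });
            });
        })
    );
});

// In the page: register the worker once.
if ('serviceWorker' in navigator) {
    navigator.serviceWorker.register('/sw.js');
}

Note that the Cache Storage API has no built-in size cap, so the 100 MB limit you mention would need your own eviction logic (e.g. track entries and delete the oldest once a threshold is passed).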
Hope this helps.

Multiple concurrent downloads from a webpage

Background
We have a website that delivers dynamic content via download to our customers. Currently this is done by simply making a request to another page which dynamically sets the response ContentType and streams out the file data.
The Problem
We have now been tasked with delivering multiple pieces of content at once at the click of a button (or as a page loads). We have tried various approaches:
1) Multiple iFrames on the page with a different download URL in each. This did not work in all browsers, and since our platform is targeted at mobile phones, many of the native phone browsers did not handle the iFrames at all.
2) Multiple AJAX requests for the content. This is flawed because the AJAX requests were simply returning the binary data, and the page was trying to output all of it onto the page rather than deliver it as a download.
3) Multiple JavaScript timeouts. This worked for up to 3 downloads, but was very unreliable: if the second timeout function began before the first one had started its download, the whole thing would simply break and not continue.
At this point I'm fresh out of ideas. I tried Googling for similar solutions to the problem but didn't come up with anything and I'm starting to think that it's actually just not possible.
Note that since the content is targeted at mobile devices, zipping the files up and delivering them all at once is not an option, since the devices are often unable to decompress the content.
The question, then, is: Is there a way to reliably trigger a web browser to download multiple pieces of content at once?
It turns out that this really isn't possible on mobile devices. Most mobile browsers support the methods involved, but due to the way each browser tab is threaded and paused, it was only possible to fire one download at a time: any redirect action would stop the remaining JavaScript from running.
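For reference, the timeout approach from point 3 looks roughly like the sketch below (URLs and delay are illustrative; each endpoint is assumed to respond with a Content-Disposition: attachment header). On desktop it mostly works; on the mobile browsers described above, everything after the first download tends to get cancelled.

// Stagger the downloads so each one has started before the next fires.
var urls = ['/download/1', '/download/2', '/download/3']; // hypothetical endpoints
urls.forEach(function (url, i) {
    setTimeout(function () {
        var frame = document.createElement('iframe');
        frame.style.display = 'none';
        frame.src = url; // navigating the hidden frame triggers the download
        document.body.appendChild(frame);
    }, i * 2000);
});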

How to download video in background HTML5 / JS

Is there a way to make a video download in the background (possibly on a different thread?) from the one where I fetch my images and make web requests?
Situation:
I show a video, and the video plays fine.
But once I start fetching two images at the same time from that server, the page won't respond: it waits for my video to finish loading, then loads the images within a few seconds (about 100 KB per image).
If I try to open any other page from that same server in the same browser, it won't load until the stalled page is done loading; any other site (for example Google) loads just fine.
So is there a way to make the browser not want to download full video, or maybe just give priority to images?
This sounds like the server throttling your requests, as everything apart from scripts loads asynchronously in a browser.
It might be that you are only allowed so much bandwidth per second from the server - or so many connections - and that could be the reason why the server won't respond until your first request has finished.
Check your server configuration and perhaps have a play with it to exclude this possibility.
Maybe you can try a Web Worker: it provides a way to execute background scripts (you will have to split your video and images into separate files). As you can see in the Use Cases, it lists "Prefetching and/or caching data for later use" as a possible scenario. Hope it helps.
More info here:
http://www.html5rocks.com/en/tutorials/workers/basics/#toc-examples
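A minimal sketch of that idea, with made-up file names: the worker fetches assets off the main thread so they are sitting in the HTTP cache by the time the page asks for them (whether they are reused depends on the server's cache headers).

// prefetch-worker.js
self.onmessage = function (e) {
    e.data.urls.forEach(function (url) {
        fetch(url) // warms the HTTP cache; the body itself is discarded
            .then(function () { self.postMessage({ url: url, ok: true }); })
            .catch(function () { self.postMessage({ url: url, ok: false }); });
    });
};

// In the page:
var worker = new Worker('prefetch-worker.js');
worker.postMessage({ urls: ['image1.jpg', 'image2.jpg'] });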

How can I use progressive enhancement techniques to progressively load a slow page?

I've got a portal page in ASP.NET MVC with a few different sections (partial views), some of which will be unavoidably slow the first time a user accesses them, because they have to pull in up-to-date data from an external internet source.
Some of this data can be loaded almost immediately, but the old "Web 1.0" design just goes to a "loading" page until all of the data is available. I'm trying to improve the user experience by immediately displaying the local data, and then using a couple of ajax updates to display the remote data a few seconds later.
Of course I want to do this using progressive enhancement in case the JavaScript breaks, gets blocked, or isn't supported for whatever reason. My first thought was to use a meta refresh and disable it with JavaScript, but apparently that's impossible. I'm also violently opposed to the idea of a window.location redirect, because (a) it's highly perceptible, unlike a server redirect, and (b) it's well within the realm of possibility for the JS redirect to work but the ajax to still break (think IE6, ill-behaved mobile devices, etc.)
Is there some way I can build up a page that loads in stages, but still works with plain HTML?
I've solved this in the past with a couple of approaches: one is to create separate pages for the slow content, so that users who don't have or don't get JS for some reason can click a link to get the content. The experience is different, but works.
Another way to do it is to have a "show" link in the areas that are deferred. JS removes the link and inserts the content. If the user wants to see the content they click the link, triggering a refresh that will not defer the content.
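A minimal sketch of that pattern using jQuery (the markup, class name, and URL are illustrative): without JavaScript the user gets a working link to a standalone page; with it, the link is swapped for the content loaded inline.

<!-- Plain HTML fallback: a real link to a page that renders the slow content -->
<div class="deferred"><a href="/reports/slow-section">Show report</a></div>

<script type="text/javascript">
$('.deferred a').each(function () {
    var href = this.href;
    // Replace each link with a container and load the same URL via ajax.
    $(this).replaceWith($('<div>').load(href));
});
</script>

In practice you would probably load only a fragment of the target page (e.g. .load(href + ' #content')) rather than the whole document.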
As Ali points out in his comment, using iframes for the slow-loading content seems like the way to go.
If your ajax routines do more than just load the content (for example, if you are reformatting the data in some way), you can go one step further and use JavaScript to remove the iframe on load, then use your normal ajax routines to load the content as you wish.
So users with JavaScript disabled still see the content, but your local content loads quickly, while users with JavaScript enabled will get a nice ajax experience.
To remove the iframe using jQuery you can do something like this:
<html>
<head>
<script src="http://code.jquery.com/jquery-latest.js"></script>
</head>
<body>
<div>Fast Loading Content...</div>
<iframe id="slowContent" src="http://example.com/slowLoadingContent.htm"></iframe>
<script type="text/javascript">
$('#slowContent').remove(); // removes both the iframe and any bound events
// execute your ajax routines to pull in the slowLoadingContent and modify as appropriate
</script>
</body>
</html>
I suppose this depends on the nature of the data that you're pulling in, but "progressive enhancement" in this case might be serving the page without the external content to clients who don't have JavaScript available, then pulling in and inserting content using JavaScript where you can. Personally I'd be concerned (slightly) with the "no JavaScript available" case, but not with the "JavaScript breaks" case, because error handling in your JavaScript can take care of the "ajax didn't work properly" scenario.
If you absolutely must all-or-nothing render the page, you can have a "reload this page in a minute to see the rest of the content" message/link.
Otherwise "frames" (using ) or separate pages are a pretty good plan.

Preload external website content, then redirect

When leaving my website, users get a "you are now leaving" message, which redirects after 4 seconds. Can that time be used to somehow preload the target site's content, so that after the redirect the site appears faster?
If the site to be fetched is on your domain, you can fetch the next page with JavaScript, parse it, and request its assets.
If not, you can't discover its assets (not via XHR anyway, most of the time, because of the same-origin policy), so you can't preload them. You could hack around it by placing the site in a hidden iframe. You could also use your server as a proxy to get a list of assets and pass it to your HTML to start preloading them.
You could also try this prefetch link tag...
<link rel="prefetch" href="/images/big.jpeg">
Source.
It is a lot of effort for arguably not much gain though.
You could start loading the site into an invisible <iframe>. If it's cached properly, this will reduce load time when the user actually enters the page.
This would however have all kinds of potential side effects, like autoplay music and videos starting in the background.
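The hidden-iframe version is only a few lines; a sketch, with the redirect target as a placeholder:

// Warm the target site's cache during the 4-second countdown, then redirect.
var frame = document.createElement('iframe');
frame.style.display = 'none';
frame.src = 'https://target.example/landing'; // the page you are redirecting to
document.body.appendChild(frame);

setTimeout(function () {
    window.location.href = frame.src;
}, 4000);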
I would tend to leave this kind of preloading to the web browser (and/or the prefetch tag that @alex shows!)
