When leaving my website, users get a "you are now leaving" message that redirects after 4 seconds. Can that time be used to somehow preload the target site's content, so that the site appears faster after the redirect?
If the site to be fetched is on your own domain, you can fetch the next page with JavaScript, parse it, and request its assets.
If not, you can't figure out its assets (not via XHR anyway, most of the time) so you can't preload them. You could hack it by placing the site in a hidden iframe. You could also use your server as a proxy, to get a list of assets and pass it to your HTML to start preloading them.
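A minimal sketch of the same-domain approach, assuming the next page lives at a hypothetical /next-page.html:

<script>
// Fetch the next page (same origin), parse it, and warm the
// browser cache for its images. All URLs here are hypothetical.
fetch('/next-page.html')
  .then(function (response) { return response.text(); })
  .then(function (html) {
    var doc = new DOMParser().parseFromString(html, 'text/html');
    doc.querySelectorAll('img[src]').forEach(function (img) {
      // Setting src on an Image triggers the download; relative
      // URLs resolve against the current (same-domain) page.
      new Image().src = img.getAttribute('src');
    });
  });
</script>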
You could also try using this link tag...
<link rel="prefetch" href="/images/big.jpeg">
Source.
It is a lot of effort for arguably not much gain though.
You could start loading the site into an invisible <iframe>. If it's cached properly, this will reduce load time when the user actually enters the page.
This would however have all kinds of potential side effects, like autoplay music and videos starting in the background.
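A minimal sketch of the hidden-iframe approach (the target URL is hypothetical, and whether the assets stick around depends entirely on the target's cache headers):

<script>
// On the "you are now leaving" page: load the target invisibly
// so its cacheable assets are already warm after the redirect.
var frame = document.createElement('iframe');
frame.src = 'https://example.com/target'; // hypothetical target
frame.style.display = 'none';
document.body.appendChild(frame);

setTimeout(function () {
  window.location.href = frame.src; // redirect after 4 seconds
}, 4000);
</script>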
I would tend to leave this kind of preloading to the web browser (and/or the prefetch tag that @alex shows!)
I would like to understand how AirBnb is able to load a 20 MB background video file so fast on their homepage. After inspecting their homepage on WebPageTest, I noticed that the video did not show up in any of the downloaded resources, which made it score so high. When I tried this tactic, loading the video asynchronously via AJAX, the video still showed up on WebPageTest as a downloaded resource, just after the DOM loaded. So I'm really not sure how AirBnb makes this work. Does anyone have an idea?
AirBnb isn't doing anything special here. They're simply starting playback of the media using progressive download, which means playback starts while the video is still downloading.
On their CDN, they have uploaded some fairly large MP4 files with two important characteristics:
The indexing information (MOOV atom) has been moved to the beginning of the MP4 file
The video is encoded in a format and codec that your browser supports
Because of these characteristics, all the site has to do is tell your browser to begin playing the source URL, and it will do the right thing: it makes a web request to the CDN and begins downloading the file. As soon as enough data has been transferred to start playback, it does so.
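A minimal sketch of that setup (the CDN URL is hypothetical; moving the moov atom to the front is what, for example, ffmpeg's -movflags +faststart does at encode time):

<video autoplay muted loop playsinline>
  <!-- A faststart MP4 (moov atom at the front) starts playing
       while it is still downloading. The URL is hypothetical. -->
  <source src="https://cdn.example.com/background.mp4" type="video/mp4">
</video>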
Finally, I can't say for sure why WebPageTest doesn't show you the MP4 files driving the video, but they are certainly there, and the URLs look like https://a0.muscache.com/airbnb/static/Xxxxx-X1-1.mp4. I suspect they're looking at your User-Agent to decide which file to send you, and are not sending any video at all to bots like Google and WebPageTest.
You're not getting the real story through WebPageTest. Instead of relying on a third party to evaluate the page in their environment, you should watch the traffic you are actually being sent, using Fiddler or the Network tab in Chrome Developer Tools.
So I have a single-page web app that needs to load many images. They're all relatively small (around 30 KB each), but I have about a thousand of them. Right now I'm using a hidden div to load all of them when the user hits my website so they are all cached. What is the best way to do this? Currently it's about a thousand GET requests; would it be possible to somehow zip these and unzip them with JS? Would it even make a difference to send them all in one GET?
EDIT:
Not sure if I was clear, but this is how I'm currently doing it:
<div id="hidden-div">
<img src="assets/blah.jpg">
<!-- ...and a ton more img tags... -->
</div>
Just wondering if there is a better/faster way to do this?
If you are looking to just "preload" the images so that you can use them later (in your app/page), then you can use the following HTML5 code:
<link rel="prefetch" href="http://davidwalsh.name/wp-content/themes/walshbook3/images/sprite.png" />
See: Link prefetching is a browser mechanism, which utilizes browser idle time to download or prefetch documents that the user might visit in the near future. A web page provides a set of prefetching hints to the browser, and after the browser is finished loading the page, it begins silently prefetching specified documents and stores them in its cache. When the user visits one of the prefetched documents, it can be served up quickly out of the browser's cache.
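With around a thousand images, you could generate those hints from a list rather than hand-writing the tags. A minimal sketch, assuming a hypothetical assets/img-N.jpg naming scheme:

<script>
// Inject <link rel="prefetch"> hints so the browser fetches the
// images during idle time. The filename pattern is hypothetical.
for (var i = 0; i < 1000; i++) {
  var link = document.createElement('link');
  link.rel = 'prefetch';
  link.href = 'assets/img-' + i + '.jpg';
  document.head.appendChild(link);
}
</script>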
If it's for a gallery, then @Santosh is correct. Or see http://code.msdn.microsoft.com/Infinite-Scroll-Like-Bing-bc05262b
Is there a way to make a video download in the background (possibly on a different thread?) while I fetch my images and do web requests?
Situation:
I show a video, and the video plays fine.
But once I start requesting 2 images at the same time from that server, the page won't respond; it waits for my video to finish loading, and then loads the images within a few seconds (about 100 KB per image).
If I try to open any other page from the same server in the same browser, it won't load until the stalled page is done loading; however, any other site (for example Google) loads just fine.
So is there a way to make the browser not download the full video up front, or at least give priority to the images?
This sounds like the server throttling your requests, as everything apart from scripts loads asynchronously in a browser.
It might be that you are only allowed so much bandwidth per second from the server - or so many connections - and that could be the reason why the server won't respond until your first request has finished.
Check your server configuration and perhaps have a play with it to exclude this possibility.
Maybe you can try a Web Worker; it provides a way to execute scripts in the background (you will have to split your video and images into separate files). As you can see in the Use Cases section of the article below, it lists "Prefetching and/or caching data for later use" as a possible scenario. Hope it helps.
More info here:
http://www.html5rocks.com/en/tutorials/workers/basics/#toc-examples
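A minimal sketch of that prefetching scenario, with a hypothetical prefetch-worker.js and image list (the fetched responses land in the shared HTTP cache, assuming the server sends cacheable headers):

// prefetch-worker.js -- fetch URLs off the main thread so they
// end up in the HTTP cache for later use.
self.onmessage = function (e) {
  e.data.forEach(function (url) {
    fetch(url).then(function () {
      self.postMessage(url); // report each completed prefetch
    });
  });
};

<script>
// Main page: hand the worker a (hypothetical) list of image URLs.
var worker = new Worker('prefetch-worker.js');
worker.onmessage = function (e) { console.log('prefetched', e.data); };
worker.postMessage(['img/a.jpg', 'img/b.jpg']);
</script>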
I've seen a couple of pages load their JavaScript files into a page via new Image().src = '...';
<script type="text/javascript">
new Image().src = "XXXX/js/jquery-1.7.2.min.js";
</script>
I was just wondering about the benefits or purpose of this.
It's a quick and dirty way of initiating an HTTP request (as the comments on the question suggest).
There may be a minor advantage gained by initiating the download at the top of the page and then including <script src='the-same-file.js'></script> at the bottom of the page so that the file can be loaded from the browser cache.
This might allow the latency of the download to be parallelized with a parsing task. For example, the download initiated in the head might run while the body is still being parsed.
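A sketch of that pattern (the path is hypothetical):

<head>
  <script>
    // Start the download early without blocking parsing.
    new Image().src = "/js/app.js"; // hypothetical path
  </script>
</head>
<body>
  ...
  <!-- Later: ideally served from cache, if the headers allow it. -->
  <script src="/js/app.js"></script>
</body>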
Why not just reference the file in the head using the src attribute?
If neither [the defer or async] attribute is present, then the script is fetched and executed immediately, before the user agent continues parsing the page.
Source (suggested reading)
In other words, this method attempts to allow the browser to download the file without incurring the blocking behavior until later in the process.
However
If this is really necessary, I would consider the defer attribute, which is intended for such purposes, rather than the new Image() hack.
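For example (the filename is hypothetical):

<!-- Fetched in parallel with parsing, executed only after the
     document has been parsed. -->
<script defer src="/js/app.js"></script>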
This "optimization" could backfire depending on cache headers. You could end up making two HTTP requests or even downloading the file twice.
"In the Wild"
A quick investigation of several major sites (Google search, Gmail, Twitter, Facebook, Netflix) shows that this technique is not used to fetch JavaScript files and used very sparingly overall.
For example, Facebook appears to use it not for caching/performance but for tracking purposes when the site is (potentially maliciously) loaded into a frameset. By creating an Image instance and setting the source, they initiate an HTTP request to a page which tracks the clickjacking attempt.
This is an isolated case; under normal circumstances this script will never be run.
I want to display the content of a web page (say, Wikipedia) on my web page, which has my custom JavaScript. How shall I do that?
I tried to use an iframe for this, but the JavaScript that I have on my page doesn't work on the iframe content, though it does work on the rest of the body.
How should I use the content of a different web page on my web page so that I can use my JavaScript on that page?
I want a page like Google Translate, which has my header on top and the content of a web page at the bottom.
Is it done through an iframe, or a content placeholder, or... what?
You'll have to fetch the content from your server, build up a page around it (possibly using an <iframe>; that'd certainly be the simplest thing) and then serve it up. There might be all sorts of problems as the page tries to fetch its auxiliary files (CSS, scripts, images) because it may use relative URLs. Depending on what you know about the remote page, you'd have to do some surgery on the fetched content before sending it out to the client.
You cannot mess with content fetched from a different domain. That's why it doesn't work when you just include a frame that directly fetches the other content from the client. When you fetch the content from your server, however, the browser will be happy.
Oh, and note that forms or AJAX code in the fetched content may also have problems when running inside your site, because again they may use relative URLs. Even if they don't, you may have security problems, because there's no way for a user to really log in (unless you proxy that too from your server).
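A minimal sketch of the server-side proxy idea, using Node.js with Express (all names are hypothetical, and global fetch assumes Node 18+); injecting a <base> tag is the simplest "surgery" to keep the page's relative CSS, script, and image URLs working:

// server.js -- fetch a remote page, patch relative URLs via a
// <base> tag, and serve the result from our own domain.
const express = require('express');
const app = express();

app.get('/proxy', async (req, res) => {
  const target = 'https://en.wikipedia.org/wiki/Main_Page'; // hypothetical
  const html = await (await fetch(target)).text();
  // Naive patch: assumes a bare <head> tag; real pages need more
  // careful rewriting than this.
  const patched = html.replace('<head>', '<head><base href="' + target + '">');
  res.send(patched);
});

app.listen(3000);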