Optimize load of exactly identical iframes - javascript

I need to display several exactly identical iframes. These iframes contain JavaScript that reads the # fragment of the URI and then performs a different request, but the source code of the iframe and the fetched resources are exactly identical.
One iframe, including JavaScript, stylesheets and images, is approximately 10 MB. Now imagine that I need to display up to 10 iframes on the same page. That's a page of up to 100 MB!
The problem is that neither the iframes themselves nor the content they are loading is being directly cached (not on Google Chrome at least, which is my primary target).
Here is an example of what my code looks like:
<iframe src="myiframe#1">
<iframe src="myiframe#2">
<iframe src="myiframe#3">
...
Each iframe loads the same huge JavaScript file, and the same request is fired for as many iframes as I have:
We can clearly see that when the first JS file finishes downloading, the others (even for connections that weren't initiated yet!) are not using the cache but rather downloading it all over again. And it's the same with every file.
What have I tried?
Sending caching headers. I'm currently replying with cache-control: public, max-age=31536000, but I've even tried stronger directives such as Cache-Control: immutable.
Loading only one iframe first and then, on its load event, loading all the other iframes. It still doesn't use the cache, even though the first iframe has fully loaded and should have saved the resources in the cache!
Cloning iframes, but, as described by the RFC, that fully reloads the iframe and so fires the requests again.
On the second load of my page, however, the browser does fetch the resources from the cache, but that's not enough for me.
The desperate (and only?) solution I can think of:
Load all the resources and the iframe source code, then dynamically create an iframe element and inject everything into it. It would be possible but very hard, and probably not very performance friendly.
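A rough sketch of that idea (purely illustrative: the URL is a placeholder, and it assumes the iframe script could be adjusted to receive its number via postMessage instead of the location hash, since srcdoc documents don't carry one):
// Fetch the iframe document once, then reuse the same HTML for every frame.
fetch('https://example.com/myiframe')            // placeholder for the (proxied) iframe URL
  .then(res => res.text())
  .then(html => {
    for (let i = 1; i <= 10; i++) {
      const frame = document.createElement('iframe');
      frame.srcdoc = html;                       // identical markup, downloaded a single time
      frame.addEventListener('load', () => {
        // No usable #fragment in a srcdoc document, so hand the id over another way;
        // the script inside the iframe would need to listen for this message.
        frame.contentWindow.postMessage({ frameId: i }, '*');
      });
      document.body.appendChild(frame);
    }
  });
Subresources referenced by that HTML might still be fetched per frame, so they may also need to be inlined or loaded up front for this to pay off.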
Note: the iframes are not on my domain, but I'm doing some reverse-proxy thing, so I have control over them. That means I could possibly modify the source code of the iframe, but since everything is minified it's not an easy thing to do.

Could you try loading the iframes without the #? If the cache problem is caused by the fragment, you could set it after the iframe is loaded:
<iframe src="myiframe" onload="this.src+='#1'"></iframe>
<iframe src="myiframe" onload="this.src+='#2'"></iframe>
<iframe src="myiframe" onload="this.src+='#3'"></iframe>

Related

Wistia E-v1.js script being loaded twice

So I am calling the Wistia script with a script tag in my head like this:
<script charSet='ISO-8859-1' src='//fast.wistia.com/assets/external/E-v1.js' async defer data-script='wistia' />
However, when I check out the network tab on Chrome, I notice that the E-v1.js script from Wistia is being loaded twice, which is rather significant as it is a 273kb script.
The first load of the script is from https://fast.wistia.com/assets/external/E-v1.js, the location to which I have called it.
However, the second load of the script comes from an iframe, despite me not having put any iframes on the page. This iframe calls the script even on webpages which do not contain any wistia videos. The referrer is: https://fast.wistia.com/embed/iframe_shim?domain=com.
What's going on here? I assume this is some trying-to-be-helpful behaviour from Wistia to lazy-load their script via an iframe, but it's already loaded...
So I contacted Wistia and got an answer. Their development practices are not exactly intuitive.
Here's what the guy said:
The iframe_shim is a way of tracking the visitor_key for stats tracking, and storing that information on the fast.wistia domain rather than your domain. For a more lightweight method of doing that, you can set window.wistiaIframeShim = false in script tags on your page, and that will stop E-v1.js from loading again. Visitors will then be tracked via a cookie and localstorage directly on your domain instead of the fast.wistia.com domain. As far as I know this shouldn't be problematic, and we'll eventually be changing how that works to make it more efficient, it just hasn't been prioritized yet.
So they seem to load it twice from two different origins just to store a tiny amount of information on their own domain rather than on the client. Seems ridiculous to me, but I can confirm that, as of right now, all you have to do is change that window variable.
THE FIX: window.wistiaIframeShim = false
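In practice (a minimal sketch; per Wistia's support reply above, the flag just needs to be set before E-v1.js runs):
<script>window.wistiaIframeShim = false;</script>
<script charSet='ISO-8859-1' src='//fast.wistia.com/assets/external/E-v1.js' async defer data-script='wistia'></script>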

Determine if object/embed is forbidden?

You might know HN, but maybe you also dislike clicking around in many tabs and going back and forward. I thought about making a page that shows both the webpage and the links from HN. So I made this: http://goodfrontpage.com/direct
But there are two problems:
First: How can I determine whether a page's HTTP headers forbid opening it in something like this:
<iframe class="webpage" src="{{post.url}}" ></iframe>
or this:
<object class="webpage" data="http://asfasfasfa.com" >
<embed class="webpage" src="http://asfasffvasf.com" > </embed>
Error: Webpage not accessible!
</object>
This is true for pages like github.com, eff.org or youtube.com.
Second: Is there any way to fetch the sites differently that would allow me to display all pages?
If you want to embed a webpage within another one then you should use the iframe element:
<iframe src="http://asfasffvasf.com"/>
You can style this like any other block element, and set an explicit width and height.
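For example (the size values here are just illustrative):
<iframe src="http://asfasffvasf.com" style="width: 100%; height: 600px; border: 0;"></iframe>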
Some pages ask browsers not to include them within iframes (using the X-Frame-Options header). I don't think there's an easy way to solve this on the client side, but you could create a simple backend or proxy to request the page you want and return the content. This gets round the iframe restriction because you're now including content from your own domain.
This does have a couple of security issues to be aware of:
You've now made a backend which can be used to download any page on the Internet. There's a denial of service vulnerability if someone makes lots of requests to download huge pages.
The pages you're including will no longer be restricted by the same origin policy. Scripts on those pages will be able to interact with everything on the parent page. This may be a problem if you plan on creating login functionality in the future.
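A minimal sketch of such a proxy (Node.js assumed; the /proxy route name and port are illustrative, not from the original answer). It fetches the requested page server-side and returns it from your own origin:
const http = require('http');
const https = require('https');

http.createServer((req, res) => {
  // expects requests like /proxy?url=https://example.com/some/page
  const target = new URL(req.url, 'http://localhost').searchParams.get('url');
  if (!target) { res.statusCode = 400; return res.end('missing url'); }

  const client = target.startsWith('https') ? https : http;
  client.get(target, upstream => {
    res.setHeader('Content-Type', upstream.headers['content-type'] || 'text/html');
    upstream.pipe(res);                       // serve the remote page from your own domain
  }).on('error', () => { res.statusCode = 502; res.end('upstream error'); });
}).listen(3000);
The iframe would then point at your own domain, e.g. <iframe class="webpage" src="/proxy?url={{post.url}}"></iframe>, with the two issues above still in mind.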
It seems like you are looking for the iframe element. It specifies an inline frame.
An inline frame is used to embed another document within the current HTML document.
<iframe src="http://asfasfasfa.com"></iframe>

A good method for client-side iframe caching

I'm trying to cache some documents client-side in order to switch between them faster.
The documents have been loaded in an iframe, so it's a question on how to cache it locally within the browser.
My method was to have a variable, item, and then do
if (item.cache) {
    $('.holder', someElem).html(item.cache);
    return;
}
item.cache = $('<iframe....');
$('.holder', someElem).html(item.cache);
However, this method keeps reloading the iframe src when it is injected into the holder.
Any good methods for client-side iframe caching?
The iframe doesn't actually trigger a page load until it has been added to the DOM. I am guessing you keep an instance of the iframe but don't add it to the DOM until it's time to show it. That approach doesn't work well. I would suggest adding it with CSS "display:none" so it loads while hidden, and then showing it when you need it.
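A rough sketch of that suggestion, reusing the item and someElem names from the question (item.url and how you switch between items are assumptions on my part):
if (!item.frame) {
  // create the iframe once and append it right away so it actually loads
  item.frame = $('<iframe>', { src: item.url }).hide();   // hidden via display:none
  $('.holder', someElem).append(item.frame);
}
// when switching documents: hide the others, show this one; no reload happens
$('.holder iframe', someElem).hide();
item.frame.show();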
HTTP has caching built in. Mark Nottingham has written a decent overview. Setting the Cache-Control and Expires headers should be enough for what you describe.
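For completeness, a minimal sketch of the server side (Node.js assumed; the answer only names the headers, not any particular server or values):
const http = require('http');
const fs = require('fs');

http.createServer((req, res) => {
  // serve the document shown in the iframe with explicit caching headers
  res.setHeader('Cache-Control', 'public, max-age=86400');                 // one day
  res.setHeader('Expires', new Date(Date.now() + 86400000).toUTCString());
  res.setHeader('Content-Type', 'text/html');
  fs.createReadStream('document.html').pipe(res);                          // placeholder file
}).listen(3000);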

Jquery/javascript detection if iframe can be loaded

I have iframes with OpenX ads that I serve from another server/domain.
When something happens to this ad server, the main page doesn't load its whole content, because the domain that OpenX loads in the iframe is not responding. I always thought that an iframe works independently from the main site, but it doesn't if the domain doesn't answer at all...
Anyway, can the main site somehow detect that the URL in an iframe is not responding, skip loading that iframe, and show the rest of the site?
How about loading the iframe once the website is loaded? It's pretty easy to do this using jQuery (or even plain JavaScript using the window.load event).
So rather than trying to 'detect' whether the iframe has loaded, you can load it AFTER the rest of the website has finished loading (sorry for the excessive use of the word 'load').
In jQuery, you can simply attach the url to the iFrame on the document.ready event.
A blank iFrame
<iframe id="iframe-ad" width="200"></iframe>
Simple jQuery to load the URL on document.ready
<script type="text/javascript">
$(document).ready(function() {
    $("#iframe-ad").attr("src", "http://www.google.com");
});
</script>
Unfortunately there is no easy way to do this unless they are served from the same domain. I know there is a way to let the JavaScript inside the iframe perform some actions on the parent document it is contained in, but I am not really sure how...
It is not possible because of the same-origin policy. There are some "gaps" in some browsers, but relying on them is not recommended!
It might not make for the best experience, but you can make a local redirect file, something like:
<iframe src="http://www.mydomain.com/redir?url=http://www.theirdomain.com/ads"/>
then the redir page just returns
<script>
location.href = "${url}";
</script>
That way, as long as your server is responding, everything else will continue as normal while the iframe redirects.
What if I don't have an iframe, just JavaScript originating from a different domain? If that domain is not responding, the script holds the page back from loading. Is there a way to prevent that?

Is there a way to mitigate downloading of resources (images/css and js files) with Javascript?

I have a html page on my localhost - get_description.html.
The snippet below is part of the code:
<input type="text" id="url"/>
<button id="get_description_button">Get description</button>
<iframe id="description_container" src="#"/>
When the button is clicked, the src of the iframe is set to the URL entered in the textbox. The pages fetched this way are very big, with lots of linked files. What I am interested in on each page is a block of text contained in a <div id="description"> element.
Is there a way to mitigate downloading of resources linked in the page that loads into the iframe?
I don't want to use curl because the data is only available to logged-in users, and the steps to get the content with curl are too complicated. The iframe is simple because I use this on a box which sends the right cookies to identify the request as coming from a logged-in user, but the problem is that it is very wasteful to download nearly 1 MB of data to keep 1 KB of it and throw out the rest.
Edit
If the proposed method just works in Firefox it is fine, so I added Firefox tag. Also, it is possible that the answer actually is from the realm of Firefox add-on techniques, so I added that tag as well.
The problem is not that I cannot get at what I'm looking for; rather, the problem is that the easy iframe method is wasteful.
I know that Firefox allows loading only the text of a page. If you open a page and press Ctrl+U you are taken to the 'view page source' window. There, links behave as normal and are clickable; if you click on a link in source view, the source of the new page is loaded into the view-source window without the linked resources being downloaded, which is exactly what I'm trying to get. But I don't know how to access this behaviour.
Another example is the Adblock add-on. It somehow kills elements before they get loaded. With plain JavaScript this is not possible, because it is only triggered too late to intervene in time.
The same-origin policy forbids any web page from accessing the contents of any other web page on a different domain, so basically you cannot do that.
However, it seems that some browsers do allow access to another page's content if you are accessing it from a local web page, which seems to be your case.
Safari and IE 6/7/8 are browsers that allow a local web page to do so via XMLHttpRequest (source: Google Browser Security Handbook), so you may want to use one of those browsers for what you need (note that future versions may no longer allow it).
Apart from this solution, I only see two possibilities:
If the web pages you need to fetch content from are somehow controlled by you, you can create a simpler interface to let other web pages get the content they need (for example by allowing JSONP requests; a small sketch follows this answer).
If the web pages you need to fetch content from are not controlled by you, the only solution I see is to fetch the content server side, logging in from the server directly (I know you don't want to do that, but I don't see any other possibility if the previous ones are not practicable).
Hope it helps.
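A small sketch of the first possibility (JSONP), assuming you control the remote site and expose an endpoint that wraps the content in a callback; the URL and response shape are made up for illustration, and #description_container is assumed to be a plain element rather than the iframe from the question:
$.ajax({
  url: 'http://example.com/description',      // hypothetical JSONP endpoint you would add
  dataType: 'jsonp',                          // jQuery appends the callback=? parameter
  success: function(data) {
    $('#description_container').text(data.description);
  }
});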
Actually I've seen Cross Domain jQuery .load request before, here: http://james.padolsey.com/javascript/cross-domain-requests-with-jquery/
The author claims that code like this, found on that page,
$('#container').load('http://google.com'); // SERIOUSLY!
$.ajax({
    url: 'http://news.bbc.co.uk',
    type: 'GET',
    success: function(res) {
        var headline = $(res.responseText).find('a.tsh').text();
        alert(headline);
    }
});
// Works with $.get too!
would work. (The BBC code might not work because of the recent redesign, but you get the idea)
Apparently it is using YQL wrapped into a jQuery plugin to do the trick. Now I cannot say I fully understand what he is doing there but it appears to work, and fits the bill. Once you load the data I suppose it is a simple matter of filtering out the data that you need.
If you prefer something that works at the browser level, may I suggest Mozilla's Jetpack framework for lightweight extensions. I've not yet read the documentation in its entirety, but it should contain the APIs needed for this to work.
There are various ways to go about this in AJAX, I'm going to show the jQuery way for brevity as one option, though you could do this in vanilla JavaScript as well.
Instead of an <iframe> you can just use a container, let's say a <div> like this:
<div id="description_container"></div>
Then to load it:
$(function() {
    $("#get_description_button").click(function() {
        $("#description_container").load($("input").val() + " #description");
    });
});
This uses the .load() method, which takes a string in the format .load("url selector"); it fetches the URL, takes the matching element from that page, and places its content inside the container you call it on, in this case #description_container.
This is just the jQuery route, mainly to illustrate that yes, you can do what you want, but you don't have to do it exactly like this; the point is that you can get what you need from an AJAX request rather than an <iframe>.
Your description sounds like you are fetching pages from the same domain (you said that you need to be logged in and have session credentials), so have you tried an async request via XMLHttpRequest? It might complain if the HTML on a page is particularly messed up, but you should still be able to get the raw text via .responseText and extract what you need with a regex.
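A bare-bones sketch of that suggestion (same-origin assumed, and a plain container element rather than the iframe from the question; the regex is deliberately crude):
var xhr = new XMLHttpRequest();
xhr.open('GET', document.getElementById('url').value, true);
xhr.onload = function() {
  // crude extraction of <div id="description">...</div> from the raw HTML
  var match = xhr.responseText.match(/<div id="description">([\s\S]*?)<\/div>/);
  if (match) {
    document.getElementById('description_container').textContent = match[1];
  }
};
xhr.send();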
