Downloading same JS file from 2 frames within parent page - javascript

I am working with a page that contains two frames. Each frame loads a page that in turn calls the same JavaScript file in a script tag. It appears that sometimes the browser has already cached the JS file by the time the other frame makes its call, so it grabs it from the cache. But it also appears that sometimes it downloads two copies, one for each frame. I'm trying to figure out whether it would be worth calling the script once from the parent page and having each frame's page access it that way. So is whether the other frame grabs it from the cache just a matter of how fast the browser happens to download the JS file? What's the normal behavior of the major browsers on this?
Thanks for the help!

You can have the script check whether there are any child iframes on the page; if there are, dynamically add a script block to each child document (with the same SRC). This way the main one will ALWAYS load first and the children will always use the cache.
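A minimal sketch of that approach, assuming the frames are same-origin (otherwise the parent can't touch their documents); the file name and the onload timing here are placeholder choices:

    // Sketch only: from the parent page, inject the same script URL into each
    // same-origin child frame so it is requested once and then served from cache.
    // "shared.js" is a placeholder for the file both frames need.
    window.onload = function () {
        var frames = document.getElementsByTagName('iframe');
        for (var i = 0; i < frames.length; i++) {
            var childDoc = frames[i].contentDocument || frames[i].contentWindow.document;
            var script = childDoc.createElement('script');
            script.src = 'shared.js';   // same SRC the parent already loaded
            childDoc.getElementsByTagName('head')[0].appendChild(script);
        }
    };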

I wouldn't worry too much about it. If the file is in the cache by the time the second frame needs it, the cached copy will be used; if not, the second frame will load it too. Each browser, and each version of each browser, handles caching of files differently, so just forget about it: code each frame as a page of its own with its own includes and let the browser worry about caching them.

Related

How to block a specific line of code from an iFrame?

Is there a way to block a particular line of code from being executed within a third-party website, appearing within an iFrame?
I have an embedded widget which is loading jQuery within the iFrame. I don't need the extra network request, because my site is already loading jQuery.
If I can block only one line of code (in this case, line 77) then I can prevent jQuery from being loaded again.
I imagine this action would take place prior to the iFrame being rendered.
The same-origin policy prevents you from touching any part of an iframe for a third-party website, so there's nothing you can directly do to prevent that request from being sent out. Even if you could, the iframe and your website have no shared state, so the other website will most likely break because it has no way to access your instance of jQuery. Think of what would happen if you loaded the third-party website in a new tab but blocked the request.
There are, however, a few things you can do to ensure the browser uses a cached copy of the library, which doesn't actually send a request anywhere:
If the external library is being loaded from a CDN, there's a good chance some other website has requested that same URL, so the user's browser already has a cached copy of it.
Since you yourself use jQuery, you could load the same version of jQuery, from the same CDN URL, as the other website. That way, the user's browser will already have a cached copy of the file and no second request will be made.
Otherwise, if the website is using an old version of jQuery that you cannot yourself use or if it is being self-hosted without a CDN, there's nothing else you can do.
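For example (the CDN host and version number below are only placeholders), if the widget's iframe loads jQuery from a public CDN, pointing your own page at the exact same URL means the iframe's request can be served from the browser cache:

    <!-- Sketch: match the iframe's jQuery URL exactly; host, path and version are placeholders -->
    <script src="https://code.jquery.com/jquery-1.11.1.min.js"></script>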

How to apply JavaScript changes in Firebug?

I am playing around with some JavaScript code in Firebug and I would like the changes to take effect on that page, especially when there is code inside jQuery's $.ready() function.
Some kind of refreshing the page without losing what has been edited. Is there any way to do that?
Page changes made via Firebug or via JavaScript do not persist from one page load to another. Each time a page is loaded, the original HTML, CSS and JavaScript are parsed and loaded (from cache or from the server), and any prior changes will not be there.
The only way for a dynamic page change to still be present after a refresh is for you to save the changed state to a persistent location and then rebuild the appropriate page content from that state each time the page is loaded.
So, if you make a change to the page and store some state in a cookie, in local storage or on your server, then you can have JavaScript that runs each time the page loads, gets that state from wherever you stored it, and applies the appropriate change to the page. If you're saving the state on the server (on behalf of this particular user), then you could even have the server modify the page contents before they are served to the browser.
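A minimal sketch of that idea using localStorage (the storage key and the background-color tweak are hypothetical placeholders):

    // Sketch only: re-apply a saved tweak on every page load.
    // 'myPageTweak' and the background-color change are hypothetical placeholders.
    $(document).ready(function () {
        var saved = localStorage.getItem('myPageTweak');
        if (saved) {
            $('body').css('background-color', saved);
        }
    });

    // Call this (e.g. from the Firebug command line) while experimenting;
    // the change survives the next reload because it is stored first.
    function applyTweak(color) {
        localStorage.setItem('myPageTweak', color);
        $('body').css('background-color', color);
    }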
You can type JavaScript code in the Firebug command line and see changes take effect on the page. You can do the same in the Firefox, Chrome, Opera and Safari DevTools.
Changes to pages done via Firebug do not persist. After a page reload the original sources will be loaded again (from the server or the browser cache).
Currently Firebug doesn't allow you to edit the code of the loaded scripts directly.
You can, though, execute JavaScript code within the context of the page by using the Command Line or, for longer scripts, the Command Editor.
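For instance, a throwaway line like the following (a purely hypothetical tweak) runs immediately against the current page:

    // Hypothetical example: executes in the page's context, but is lost on reload.
    document.body.style.backgroundColor = 'lightyellow';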
But again, code you executed there will be gone as soon as the page is reloaded.
To make permanent changes to the JavaScript code of a page you need to have access to the server and make them there.

Stop loading external files if they're unresponsive

My web app loads an external JS file that sometimes hangs for 30+ seconds, making my page hang in turn.
I know I can take it out of the head, or load the file from my own server, or switch services. However, I was wondering if there's a way to stop loading external files if they're unresponsive for some amount of time.
Loading JavaScript files asynchronously can be a tricky thing if you have dependencies between the files. I'd prefer placing the <script> tag at the bottom of the <body>. This way you can put dependencies underneath it, and in the dependencies you check for a variable only available in your slow JS file before executing functions dependent on it. You also write that you could place the JS on your own server, so I guess it's on an external domain. If this is the case you can't use an ordinary XMLHttpRequest/AJAX call to load the JS file asynchronously anyway.
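A rough sketch of that variable check (SlowLib and its start() method are hypothetical names for whatever the slow file defines):

    // Sketch: dependent code, placed below the slow <script> tag, only calls into
    // the external file if its global is actually present. 'SlowLib' is hypothetical.
    function initDependentCode() {
        if (typeof window.SlowLib === 'undefined') {
            return;                 // external file never arrived; degrade gracefully
        }
        window.SlowLib.start();     // hypothetical API of the slow file
    }
    initDependentCode();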
If you absolutely want to load the file asynchronously and are ready to deal with the issues this might give you, then take a look at http://headjs.com/.
I don't think it is possible to link an external file in the regular way and specify a custom timeout for its loading.
You can try making an AJAX request to the file's address and at the same time set a timeout of 30 seconds. If the request is still running after that, you can cancel it. Otherwise, you can create a script tag inside the HTML page with the loaded content.
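A rough sketch of that approach with jQuery (the URL and the 30-second timeout are placeholders; as noted above, this only works if the script's host is same-origin or allows cross-origin requests):

    // Sketch: fetch the script's source with a timeout, and only run it if it arrived in time.
    $.ajax({
        url: '/scripts/slow-external.js',   // placeholder URL
        dataType: 'text',
        timeout: 30000,                     // give up after 30 seconds
        success: function (source) {
            var script = document.createElement('script');
            script.text = source;           // execute the downloaded code
            document.getElementsByTagName('head')[0].appendChild(script);
        },
        error: function (xhr, status) {
            // status is 'timeout' if the request took too long; the page keeps working without it
        }
    });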

AJAX App JavaScript Loading Issue

I am creating a complete ajax application where there is one base page and any pages the user navigates to within the application are loaded via ajax into a content div on the page. On the base page I include the various scripts that are needed for every page within the application (jQuery, jQuery-UI, other custom javascript files). Then on the various pages with the application I include a script or two for each page that contains the logic needed for just that page. Each of those script files have something that executes on the page ready event. The problem is that every time the user navigates to page1, the page1.js file is loaded. So, if they visit that page 10 times, that script is then loaded ten times into their browser. Looking at the Chrome script developer tools after running around the site I see tons of duplicated scripts.
I read somewhere about checking to see if the script has already been loaded using a boolean value or storing the loaded scripts in an array. But, the problem with that is that if I see the script is already loaded and I don't load it, the page ready function doesn't get fired for the page's javascript file and everything fails.
Is there an issue having the javascript file loaded over and over when the user visit the same page multiple times?
I did notice looking at the network traffic that every time we visit the page, the script is requested with a random number parameter (/Scripts/Page1.js?_=298384892398), which forces a request for the script file every time. I set cache: true via the jQuery ajaxSetup method and that removed the parameter from the request, and thus the cached version of the JavaScript file was loaded instead of a separate HTTP request being made for it. But the problem is that I don't want all the AJAX requests to be cached, as the content changes all the time. Is there a way to force just the JavaScript files to be cached but allow all the other AJAX requests to not be cached?
Even when I forced caching on all requests, the javascript file still showed up multiple times in the developer tools. Maybe that isn't a big deal but it doesn't seem quite right.
Any advice on how to handle this situation?
About your first question:
Every time you load a JavaScript file, the entire content gets evaluated by the browser. Whether you can load and execute it multiple times in a row depends solely on its content. I'd not consider it a best practice to do so. ;)
Still, I'd recommend that you find a way to check whether it was already loaded and fire the "page loaded" event manually within the already present code.
For the second question: I'd assume that the script is intended to show up multiple times when you include it multiple times. To give advice on how not to cache the loaded JS, I'd need to know how you load the code, how you do AJAX, and the general jQuery setup.
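If the scripts come in through jQuery's AJAX machinery, one possibility (a sketch, assuming jQuery 1.5+) is a prefilter that enables caching only for script requests, leaving the content requests uncached:

    // Sketch: cache script requests only; other AJAX requests keep their default behaviour.
    $.ajaxPrefilter('script', function (options) {
        options.cache = true;   // drops the ?_=1234567890 cache-busting parameter for scripts
    });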
After doing some more research it looks like it is actually just a Chrome issue. When you load a script via AJAX, you can include the following in your code to get it to show up in the Chrome developer tools:
//# sourceURL=some-script-name
The problem is that when you navigate away from the page, the developer tools keep the script around, but it is actually no longer referenced by the page.
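As a brief illustration (the file name is a placeholder), one way to end up with the directive in the evaluated code is to append it on the loading side:

    // Sketch: label AJAX-loaded code so it shows up by name in Chrome's developer tools.
    $.get('/Scripts/Page1.js', function (source) {
        $.globalEval(source + '\n//# sourceURL=Page1.js');
    }, 'text');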

How does caching work when a javascript file is loaded?

I have some tabs that are AJAX powered. So every time a tab is clicked, all data is loaded, including the JavaScript files. So if they click on, say, Tab A, then click on Tab B, and finally Tab A again, all of Tab A's scripts will be loaded twice.
Now I am wondering how the caching works. The second time they click on Tab A, how much faster will these scripts download? Or will it be as slow as the first time?
Thanks
Assuming a fairly regular load, the script will load the first time, and be pulled from the cache from then on.
Unless you're doing something tricky.
It's just like loading a huge script on the first page request of a more traditional site and including that script on subsequent pages: after the first page load, the browser will (typically) just pull it from cache.
Use Firebug and observe the behavior.
If you are loading the same URL, the browser will use the cached version. If you want to circumvent caching, add "?" followed by a random string to the URL every time you call it.
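For example (the URLs are placeholders), the two behaviours look like this:

    // Same URL every time: after the first load, the browser can serve it from cache.
    var cached = document.createElement('script');
    cached.src = '/scripts/tab-a.js';
    document.getElementsByTagName('head')[0].appendChild(cached);

    // Unique URL every time: the random query string forces a fresh download.
    var fresh = document.createElement('script');
    fresh.src = '/scripts/tab-a.js?nocache=' + Math.random();
    document.getElementsByTagName('head')[0].appendChild(fresh);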
