Ensure JavaScript is cached in the browser in an ASP.NET MVC app - javascript

I have a web app written in ASP.NET MVC 3.0. There are some largish scripts (jQuery, jQuery UI) which I want to ensure are cached for best performance. In the Chrome Developer Tools Network tab the scripts always take around 1.5 seconds to be received when a page is loaded. I would assume that if they were cached this would be near instant.
Is there any way to ensure the JavaScript is being cached, and how can I tell whether it is or isn't?

For jQuery in particular it is better to use someone else's CDN - you will not have to serve this content from your own server, AND caching is handled properly by someone else. See http://docs.jquery.com/Downloading_jQuery for recommended CDNs.
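For example, a common pattern (the jQuery version and local path below are only placeholders) is to reference the CDN copy and fall back to a copy hosted by your own app if the CDN is unreachable:

```javascript
// Placed in a <script> block immediately after the CDN <script src="..."> tag:
// if the CDN copy failed to load, window.jQuery is undefined, so fall back
// to a copy served from your own application. Version and path are placeholders.
if (!window.jQuery) {
    document.write('<script src="/Scripts/jquery-1.7.2.min.js"><\/script>');
}
```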
For files that you have to host yourself, make sure you set the correct caching headers.
For static content you need to rely on the server (likely IIS in the ASP.NET case) to set the correct headers - see http://support.microsoft.com/kb/247404 for some details, and search for "iis cache control" for more links.
For dynamic content, choose appropriate values for the OutputCache attribute, or set the headers yourself (e.g. see http://www.asp.net/mvc/tutorials/improving-performance-with-output-caching-cs).

Related

Javascript used to include html, is it cached?

I'm using a method of creating a .js file on server #1 which contains document.write calls to write HTML, then a simple JS include inside the HTML on server #2 to load that HTML (there are multiple server #2's). This basically replaces an iframe method, with the advantage that each server #2 owner controls their own CSS.
The method works perfectly as is. My question is about caching. Each time the page is loaded on server #2 I want the .js reloaded, as it will change frequently on server #1. This appears to be the case in every browser I tested, but can I rely on it as the default behaviour, or is it dependent on browser settings? Despite all I've read on caching, I can't figure out what triggers the load in a case like this.
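For reference, the generated .js file on server #1 looks roughly like this (the markup is purely illustrative), and each server #2 page pulls it in with a plain script tag pointing at server #1:

```javascript
// Simplified sketch of the .js file served by server #1; the including page on
// server #2 references it with a tag such as
// <script src="http://server1.example.com/widget.js"></script> (URL is a placeholder).
document.write('<div class="shared-widget">');
document.write('  <p>Content maintained centrally on server #1</p>');
document.write('</div>');
```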
You can control browser caching using HTTP headers on the server side, such as Cache-Control and Expires. More here: http://www.w3.org/Protocols/rfc2616/rfc2616-sec13.html
In a case like this, the caching is governed by the cache policy of the .js file, not the HTML file.
The browser doesn't cache the rendered page (well, it does for the back button, but that's not what we're talking about); it caches the source files. Therefore, even if the HTML page is configured to be cached for a long time, the JavaScript-injected content will only be cached for as long as the .js file has been configured to be.
To configure the caching policy you need to set specific headers on the server side. Sometimes you can do this in a CGI script; sometimes you can do it in the server configuration files.
Google "http caching" and read up on how to configure a page to be cached or not cached (also Google "json disable caching" or "ajax disable caching", because this issue crops up a lot with AJAX).

Override to read local JS-file in web app wrapper

I'm looking into creating a web wrapper for an existing web app, and I obviously want to make it as quick as possible.
Is it possible to host the JS-files locally, instead of having to download the file, without altering the existing web app?
Using a WebViewClient you can prevent the JavaScript from being loaded from the web server (edit: only in API level 11 and higher, unfortunately). Alternatively, you can disable JavaScript, load the page, then enable JavaScript again. After the page is loaded you can modify the DOM using javascript: URLs to load the scripts from a local URL (like file:///android_asset, off the top of my head).
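That DOM modification might look roughly like the following, run via a javascript: URL once the page has finished loading; the asset path and file name are assumptions:

```javascript
// Runs in the page context after load; swaps the network copy of the script
// for one bundled in the app's assets. Path and file name are assumptions.
var script = document.createElement('script');
script.src = 'file:///android_asset/js/app.js';
document.getElementsByTagName('head')[0].appendChild(script);
```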
You can also change the cache strategy of the WebView so that it never fetches anything that has already been fetched once before, which might also be what you want in this case. These are set via WebSettings (http://developer.android.com/reference/android/webkit/WebSettings.html); you could use LOAD_CACHE_ELSE_NETWORK in this case.

Options for communicating between Chrome Extension and Embedding Page's Javascript

I am monitoring browser events such as when a new tab is created. My extension needs to display these browser events in the new tab page.
To make versioning easier I would like the extension to be as dumb as possible. That is, all it needs to do is tell me that a tab has been created, and I need to be able to tell the extension to switch to a tab. That way I don't have to worry about which extension versions people have installed.
The new tab page so far is a redirect to my single-page app hosted on my server.
My options seem to be:
Using custom events to send messages between the content script and the embedding page (see the sketch below): http://code.google.com/chrome/extensions/content_scripts.html#host-page-communication
This seems like a security risk, as the page's JavaScript will also have access to the DOM and hence to the messages I am exchanging.
Loading the HTML from the server into an iframe, pulling the application JS from the server and injecting it into the iframe as a content script. This allows the app's JS to have full access to the Chrome extension API, which is what I need.
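For reference, option 1 typically looks something like the sketch below; the message names, payloads, and the extension-side call are illustrative assumptions about how the relay might be wired up:

```javascript
// In the embedding page's JavaScript: ask the extension to switch tabs.
window.postMessage({ type: 'SWITCH_TAB', tabId: 42 }, '*');

// In the content script: relay messages from the page to the extension.
// The message shape and the background-page handler are illustrative.
window.addEventListener('message', function (event) {
    if (event.source !== window || !event.data || event.data.type !== 'SWITCH_TAB') {
        return;
    }
    chrome.extension.sendRequest({ action: 'switchTab', tabId: event.data.tabId });
}, false);
```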
Another consideration is that my project is currently using RequireJS. For option 2, it seems I won't be able to use this.
Can anyone recommend the preferred option keeping in mind the security risks of option 1?
Will I be able to use RequireJS with option 2?
Is there another way to achieve this?

Versioning Javascript Files to Prevent Unnecessary Cache Clearing

I version all of my client-side JS files like "/js/myfile.js?v=3903948" so that my clients don't need to clear their browser cache to ensure they get the updated files. But every time I push an update, without fail, at least one person runs into a problem where they are running the old version and get some kind of error. I used to think this was just them having already been on the page during the release and needing to reload the browser, but this happened to me today when I was definitely not previously on the page. I browsed to the live site and was running the old code; I needed to do a browser refresh on that page to get the new file.
What can cause this?
PS I was using Chrome on Win7, but I have seen clients report this before on all different browsers.
If your main web page can also be cached, then the old version of that page can be requesting the old version of the JS file. JS file versioning works best if the page that actually refers to the JS file cannot be cached or has very short caching time.
I agree with jfriend00 that the web page itself is being cached and thus requesting the old JavaScript version.
To prevent this, you can have the JavaScript file loaded by an AJAX (POST) request, either asking the server for the latest version number to download, or requesting the JavaScript itself and inserting it, e.g. into the head of the page.
Edit: see for example here
I make a quick AJAX request to the server for the version it expects them to have, then force them to refresh the page if the client's script is old.
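A rough sketch of that check, where the endpoint URL and the version constant baked into the deployed script are assumptions:

```javascript
// Version number compiled into the currently deployed JS file (placeholder value).
var CLIENT_SCRIPT_VERSION = 3903948;

// Ask the server which version it expects; POST responses are not served from
// the browser cache, so the answer is always current. Endpoint is a placeholder.
$.post('/current-script-version', function (serverVersion) {
    if (parseInt(serverVersion, 10) > CLIENT_SCRIPT_VERSION) {
        // The argument (honoured by some browsers) requests a reload that
        // bypasses the cache.
        window.location.reload(true);
    }
});
```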
It seems that a proxy or a load balancer is serving old content instead of the new. Also check the IIS/web server settings for how these files are cached and expired.
You can check what is going on on the wire with tools like Fiddler.

ActiveX control not accessing filesystem when page is generated programmatically

We are working on using a 3rd party's ActiveX control within a web page. Our page includes JavaScript to access and manipulate the control. Part of the control's functionality requires it to access files on the local filesystem.
If we generate the page programmatically, this functionality fails - the ActiveX control appears unable to access the filesystem. If we take the generated page source, copy it into a static file, and serve that file from the same web server, everything works as expected - the ActiveX control gets the info it needs from the filesystem, and we go merrily on our way.
I have used a JavaScript debugger to walk through the two different pages, and verified that the calls to the ActiveX control have identical parameters. I have verified that both the static page and the dynamic page are listed in the "Local Intranet Zone" in IE so they should have the same security constraints.
I have used SysInternals' ProcessMonitor to see what the ActiveX control is doing in the system, and what differs. Interestingly, when the calls to the control succeed, there are ProcessMonitor traces showing where the control is querying the registry for filenames, and accessing the filesystem. When the process fails, it's not the case that there are failures accessing the filesystem, but rather, the control never queries the registry to find the filename, and never tries to hit the filesystem.
The vendor of this control is mystified, and I've run out of ideas of what to try. Is there something that I ought to be checking? Some difference between dynamically-generated pages and static pages that IE or an ActiveX control might be able to detect, that would cause behaviors to change? The URI is different (the static page has a ".html" extension), but there's not much else that differs, as far as I can tell.
Any ideas would be welcome....
We figured out what was wrong, and effectively uncovered a bug in the 3rd party's ActiveX control.
They have a feature where they can optionally validate the URL of the page on which the control is loaded, or they can configure the control with a wildcard that is supposed to match any URL. The vendor had worked with us and configured the control with the wildcard, and assured us that this could not be the problem.
When we replaced the wildcard URL in the control's configuration with our actual URL, the control started working. As far as we can tell, we were fighting all day yesterday against a bug in the control's wildcard handling.
