Stop caching views in Angular.js

For some reason, all of my HTML seems to be 100% cached in Chrome.
I am using Angular 1.2 with a .NET Web API 2 project, and my content is served from index.html.
I have not made any cache policy changes yet, but everything seems to be cached very heavily.
None of the changes I make (to the views) are reflected until I clear the browser cache.
I don't see new changes after pressing F5, or after publishing my site to the server and pressing F5 there. I have to either explicitly clear the browser cache, or keep the console open with the "no caching while dev tools are open" setting enabled.
I want to avoid having to ask users to clear their browser cache when new versions are deployed.

Since no one is chiming in, I'll post the solution I implemented. It works, but could be better.
IIS:
HTTP Features > HTTP Response Headers > Set Custom Headers > Expire Web Content: Immediately
(if you don't see HTTP Features, you'll have to add the feature to your IIS server)
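If you'd rather configure this in web.config than through the IIS manager, the equivalent setting looks roughly like this (a sketch, assuming IIS 7+; it disables caching for all static content, so scope it down if that's too broad):
<configuration>
  <system.webServer>
    <staticContent>
      <!-- Tell browsers not to cache static files at all -->
      <clientCache cacheControlMode="DisableCache" />
    </staticContent>
  </system.webServer>
</configuration>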
index.html:
<meta http-equiv="Cache-Control" content="no-cache">
<meta http-equiv="Cache-Control" content="no-store">
(these may cause issues in older versions of IE)
This works, but I'd prefer to strike a better balance of caching between releases.
An alternative approach would be to use a build tool such as Grunt to generate unique filenames for your scripts in production and update the links in index.html. I believe this would be the best approach: full caching stays enabled, yet the browser always requests the new version of a file, because its name is unique. (I've also seen people append ?v=666 to files, but from what I've read this is not a reliable approach.)
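As a rough illustration, using the grunt-filerev and grunt-usemin plugins (a sketch only; the paths are placeholders for your own layout):
// Gruntfile.js
module.exports = function (grunt) {
  grunt.initConfig({
    // Rename built scripts to content-hashed filenames,
    // e.g. app.js -> app.5c3fab21.js
    filerev: {
      scripts: { src: 'dist/scripts/*.js' }
    },
    // Rewrite the <script> references in index.html to the revved names
    usemin: {
      html: 'dist/index.html'
    }
  });
  grunt.loadNpmTasks('grunt-filerev');
  grunt.loadNpmTasks('grunt-usemin');
  grunt.registerTask('rev', ['filerev', 'usemin']);
};
Because each release produces new filenames, you can then serve the files with far-future cache headers without ever shipping a stale script.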
Also, if you're serving content from .NET pages (instead of basic .html), you have the option of using the Bundler, which manages the tags between releases.
bundles.Add(new ScriptBundle("~/bundles/angular")
    .Include("~/Scripts/angular/angular.js")
    .Include("~/Scripts/angular/angular-route.js"));
This will result in:
<script src="/bundles/angular?v=20LMOimhsS0zmHdsbxeBlJ05XNyVXwPaS4nt3KuzemE1"></script>
Update
Additionally, I've been adding a version parameter to template includes. This prevents stale templates from being served, but it may prevent caching of them completely, so do some testing if you go that route:
<div data-ng-include=" 'app/import/import-summary.html?v=' + appVersion "></div>
appVersion can be a global variable set in app.js
app.run(['$rootScope', function ($rootScope) {
    // getBuildVersion() can read the assembly file version, or any other
    // place where versions are generated by your build server
    $rootScope.appVersion = getBuildVersion();
}]);

I have found that I have the same issue intermittently with my production server as well.
In most instances, 80% of the code that I push to the production server has no problems. However, a new module reference, a new Angular binding, or even the smallest template edit can take several refreshes to actually clear the cache and show the latest updates.
I actually wrote a seven-paragraph question-slash-statement to air my grievances, hoping someone had the same issue as myself. I ended up deleting the question, as I feared it was just my experience.
I have found a weird workaround: if I host the site from the terminal by running either 'npm start' or 'http-server' on my machine, the constant switching between my development and production servers forces the browser to acknowledge both environments, and the change appears in one environment or both.
In my opinion, there are two types of developers: modular and line-by-line asynchronous.
Modular programmers code in large chunks and then upload the entire code effort, which minimizes heavy caching.
Asynchronous programmers like to view every single change they make by coding a couple of lines, uploading the small changes and then refreshing the browser to see their latest nuance of progression.
I am definitely an asynchronous programmer. I find it allows me to catch smaller mistakes, as well as understand the inner workings of whichever language I am using.
Is it safe to say that asynchronous programming leads to heavy browser caching, which is leading to the problem we are all having on this particular question?
Glad to share in your blind frustration, as this has been a huge problem of mine ever since I've been dabbling in Angular.
It may at first glance appear to be a small issue, but while you are learning Angular you may spend hours or even a couple of days recoding the same 30 lines because there was no change in your browser. This is a very dangerous, invisible issue that can chip away large amounts of development time.

Related

Routes being cached by service worker in React

We are in the process of building a new website for the company I work for, written in React using CRA. Browser push notifications and a PWA are both required, so a service worker is essential; however, I believe the service worker is responsible for some fairly major caching issues.
The website is not yet in production, but I added a new page yesterday (and this has happened a few times before) and once it was deployed to our dev environment nobody in the office was able to access it - they were simply redirected to the homepage (as if the route didn't exist) until they cleared their cache, after which the route loaded with no issues.
I've read a little about semantic versioning, but the articles all seem to use npm rather than Yarn, and version locally (which isn't great with a team of 8 working on this project) using npm version patch etc.
We are using MS Azure as the build and release pipeline, which I assume would be the best place to set versions if this is required.
My question is: what are the steps to avoid this problem, and am I on the right lines in thinking versioning will mitigate it?
Semantic versioning in this context doesn't make any sense; you've most likely been reading about packages (libraries, frameworks) that are published to npm for the world to use. In CRA projects, and most other web projects too, versioning of your app is handled by the build tools, which name the files based on their contents. The filenames include a hash of the contents and are versioned automatically whenever the contents change, e.g. app.iue9234980s.js becomes app.92384oujiuoisdf.js.
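CRA does this out of the box. For reference, in a hand-rolled webpack setup the same idea looks roughly like this (a sketch, not CRA's actual config):
// webpack.config.js
module.exports = {
  output: {
    // [contenthash] changes whenever a file's contents change, so browsers
    // can cache aggressively yet always fetch newly deployed versions
    filename: '[name].[contenthash].js'
  }
};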
--
If you're using the default Service Worker setup provided by CRA, then you should look at src/serviceWorker.js. In the comments of that file it says:
// This lets the app load faster on subsequent visits in production, and gives
// it offline capabilities. However, it also means that developers (and users)
// will only see deployed updates on subsequent visits to a page, after all the
// existing tabs open on the page have been closed, since previously cached
// resources are updated in the background.
What happens here is that the SW and the build process use the Workbox SW library, configured with a precache policy. Under this policy, users get the last version that was previously cached by the browser even if a newer version is available; the SW then updates the caches in the background, and on a subsequent visit users get the newer version. This of course means that users might always be one version "late".
If this behaviour is not what you want, then you need to change src/serviceWorker.js and probably some configuration somewhere in the CRA files. You should google something like "custom service worker with CRA" for examples.
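For instance, a custom service worker that activates updates immediately, instead of waiting for every tab on the old version to close, might include something like this (a minimal sketch using the standard SW lifecycle APIs, not CRA's shipped code):
self.addEventListener('install', function () {
  // Activate the newly installed worker right away
  self.skipWaiting();
});

self.addEventListener('activate', function (event) {
  // Take control of all open pages without waiting for a reload
  event.waitUntil(self.clients.claim());
});
Note that this trades the "one version late" behaviour for the risk of a page running old code against freshly updated caches, which is exactly the kind of tradeoff the primer below explains.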
To better grasp what is happening – and especially what correct and intended behaviour looks like in differently configured cases – I really recommend (everyone) reading Google's primer on SWs themselves, here: https://developers.google.com/web/fundamentals/primers/service-workers
With an understanding of the SW's basic principles, it is then probably useful to check out the Workbox library https://developers.google.com/web/tools/workbox to see what it can offer for your app.
Reading and understanding the different aspects of SWs is key here – it is excruciatingly easy to shoot yourself in the foot with SWs :)

How to execute external JS file blocked by users' adblockers

We use an external service (Monetate) to serve JS on our site so that we can perform ad-hoc presentation-layer updates without going through the process of a site re-deploy - which in our case is a time-consuming, monolithic process we can only afford to do about once per month.
However, users with adblockers in the browser do not see some of these presentation-layer updates. This can negatively affect their experience of the site, as we sometimes include time-sensitive promotions that those users may not be aware of.
To work around this, I was thinking of duplicating the JavaScript file that Monetate serves and hosting it on infrastructure separate from the site. That way, if we needed to make updates to it, we could do so as needed without doing a full site re-deploy.
However, I'm wondering if there is some way to work around the blocking of the Monetate JS file and execute it from our own JS code in such a way that adblockers would not be able to block it? This would avoid the need to duplicate the file.
If that file is blocked by adblockers, chances are that it is used to serve ads. In fact, your description of time-sensitive promotions sounds an awful lot like ads, just not for an external provider, but for your own site.
Since adblockers usually match the URL, the easiest solution would indeed be to rehost this file, if possible under a different name. Instead of hosting a static copy, you can also implement a simple proxy with the equivalent of <?php readfile('http://monetate.com/file.js'); ?> or Apache's mod_rewrite. While this will increase load times and can fail if the remote host goes down, it means the client will always get the newest version of the file.
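In a Node stack the same proxy idea might look like this (a sketch only - the route name and upstream URL are placeholders, and it assumes Express and node-fetch are installed):
// proxy.js
const express = require('express');
const fetch = require('node-fetch');

const app = express();

// Serve the third-party script from our own origin under an innocuous
// name, so URL-based adblock filter lists won't match it
app.get('/assets/site-enhancements.js', async (req, res) => {
  try {
    const upstream = await fetch('https://example.com/vendor/tag.js'); // placeholder
    res.type('application/javascript').send(await upstream.text());
  } catch (err) {
    // If the upstream host is down, return an empty script, not an error page
    res.type('application/javascript').send('');
  }
});

app.listen(3000);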
Apart from using a different URL, there is no client-side solution - adblockers are part of the browser (or an extension of it), and you cannot modify that code, for good reasons.
Beware that adblockers may decide to block your URL too, if the script is indeed used to serve ads.
Monetate is probably blacklisted in Adblock, so there's nothing you can do about that.
I think that self-hosting the Monetate script would require keeping it updated by checking for new versions from time to time (maintaining it could become a pain in the ass).
A good solution, in my opinion, is to inform your users about that limitation with a clear message.
Or you can get in touch with Monetate and ask for a solution.

How to speed up Drupal in China

My target is to use Drupal in China (or any other country that throttles internet access).
In particular, Drupal is not China-friendly because it attempts to load some resources from servers that are not allowed, or that run extremely slowly because of the filtering policy.
In the most common Drupal cases: googleapis.com and bootstrapcdn.com.
As a consequence, sites are really slow: the main content loads quickly enough, but then (depending on the browser configuration) the screen remains blank until the requests time out, which is usually tens of seconds.
I did some research on the topic, but solutions are often localized to a specific module or theme. I would like to find a general solution I can apply without needing to patch jQuery or Bootstrap or whatever every time.
Finding a general solution is not easy because many calls are made at runtime from JavaScript or CSS imports, so solving the problem from JavaScript doesn't seem to be an option (or is it?).
The best solution I have thought of so far is to edit the hosts file on the server, redirecting some of the calls to localhost, e.g.,
127.0.0.1 fonts.googleapis.com
::1 fonts.googleapis.com
127.0.0.1 maxcdn.bootstrapcdn.com
::1 maxcdn.bootstrapcdn.com
[...]
But this doesn't seem to work; I'm still seeing them among the downloaded resources.
Has anybody managed to solve this problem for the Drupal CMS?
The jQuery Update module has means to serve a local jQuery version.
For the font library, it depends on how the font is added: either by the theme (then you will have to modify your theme's code or hook_library_alter() the library, which would probably be quite hard to maintain), or by another module, which usually comes with an option to serve a local version (like Font Awesome).
After all, are you trying to change the server-side output, or have a local browser extension modify it? Keep in mind that a hosts entry on the server only affects requests the server itself makes; the font and CDN requests are made by each visitor's browser, which is why your hosts-file edit has no visible effect.

How to optimally serve and load JavaScript files?

I'm hoping someone with more experience with global-scale web applications could clarify some questions, assumptions and possible misunderstandings I have.
Let's take a hypothetical site (heavy amount of client-side / dynamic components) which has hundreds of thousands of users globally and the sources are being served from one location (let's say central Europe).
If the application depends on popular JavaScript libraries, would it be better to compile them into one single minified JS file (along with all application-specific JavaScript), or to load them separately from the Google CDN?
Assetic vs. headjs: does it make more sense to load one single JS file, or to load all the scripts in parallel (executing them in order of dependencies)?
My assumptions (please correct me):
Compiling all application-specific/local JS code into one file, using CDNs like Google's for popular libraries, etc., but loading all of these via headjs in parallel seems optimal - but I'm not sure. Server-side compiling of third-party JS and application-specific JS into one file seems to almost defeat the purpose of using the CDN, since the library is probably cached somewhere along the line for the user anyway.
Besides caching, it's probably faster to download a third-party library from Google's CDN than from the central server hosting the application anyway.
If a new version of a popular JS library is released with a big performance boost, is tested with the application and then implemented:
If all JS is compiled into one file, then every user will have to re-download this file even though the application code hasn't changed.
If third-party scripts are loaded from CDNs, then the user only has to download the new version from the CDN (or from a cache somewhere).
Are any of the following legitimate worries in a situation like the one described?
Some users (or browsers) can only have a certain number of connections to one hostname at once, so retrieving some scripts from a third-party CDN would result in overall faster loading times.
Some users may be using the application in a restricted environment, so the domain of the application may be white-listed but not the CDNs' domains. (If this is a realistic concern, is it at all possible to try to load from the CDN and fall back to the central server on failure?)
Compiling all application-specific/local JS code into one file
Since some of our key goals are to reduce the number of HTTP requests and minimize request overhead, this is a very widely adopted best practice.
The main case where we might consider not doing this is in situations where there is a high chance of frequent cache invalidation, i.e. when we make changes to our code. There will always be tradeoffs here: serving a single file is very likely to increase the rate of cache invalidation, while serving many separate files will probably cause a slower start for users with an empty cache.
For this reason, inlining the occasional bit of page-specific JavaScript isn't as evil as some say. In general though, concatenating and minifying your JS into one file is a great first step.
using CDNs like Google's for popular libraries, etc.
If we're talking about libraries where the code we're using is fairly immutable, i.e. unlikely to be subject to cache invalidation, I might be slightly more in favour of saving HTTP requests by wrapping them into your monolithic local JS file. This would be particularly true for a large code base heavily based on, for example, a particular jQuery version. In cases like this bumping the library version is almost certain to involve significant changes to your client app code too, negating the advantage of keeping them separate.
Still, mixing request domains is an important win, since we don't want to be throttled excessively by the maximum connections per domain cap. Of course, a subdomain can serve just as well for this, but Google's domain has the advantage of being cookieless, and is probably already in the client's DNS cache.
but loading all of these via headjs in parallel seems optimal
While there are advantages to the emerging host of JavaScript "loaders", we should keep in mind that using them does negatively impact page start, since the browser needs to go and fetch our loader before the loader can request the rest of our assets. Put another way, for a user with an empty cache a full round-trip to the server is required before any real loading can begin. Again, a "compile" step can come to the rescue - see require.js for a great hybrid implementation.
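For illustration, the require.js "hybrid" amounts to loading modules individually during development, then running the r.js optimizer at build time to concatenate and minify them into one file. A build config can be as small as this (a sketch; the file names are illustrative):
// build.js - config for the r.js optimizer, run with: node r.js -o build.js
({
  baseUrl: 'js',          // where the individual modules live
  name: 'main',           // the entry module
  out: 'dist/main.min.js' // single concatenated, minified output file
})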
The best way of ensuring that your scripts do not block UI painting remains to place them at the end of your HTML. If you'd rather place them elsewhere, the async or defer attributes now offer you that flexibility. All modern browsers request assets in parallel, so unless you need to support particular flavours of legacy client this shouldn't be a major consideration. The Browserscope network table is a great reference for this kind of thing. IE8 is predictably the main offender, still blocking image and iFrame requests until scripts are loaded. Even back at 3.6 Firefox was fully parallelising everything but iFrames.
Some users may be using the application in a restricted environment, therefore the domain of the application may be white-listed but not the CDNs's domains. (If it's possible this is realistic concern, is it at all possible to try to load from the CDN and load from the central server on failure?)
Working out if the client machine can access a remote host is always going to incur serious performance penalties, since we have to wait for it to fail to connect before we can load our reserve copy. I would be much more inclined to host these assets locally.
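That said, if you do want the CDN-with-local-fallback behaviour the question asks about, the usual pattern is a synchronous check right after the CDN script tag (a sketch using jQuery as the example; the paths are illustrative):
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js"></script>
<script>
  // If the CDN was blocked or unreachable, window.jQuery is undefined;
  // document.write a locally hosted copy so the page still works
  window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');
</script>
The penalty described above still applies: in the failure case the page waits for the CDN request to fail before the local copy even starts loading.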
Many small JS files are better than a few large ones, for many reasons, including changes/dependencies/requirements.
JavaScript/CSS/HTML and any other static content are handled very efficiently by any current web server (Apache/IIS and many others); most of the time a single web server is more than capable of serving hundreds or thousands of requests per second, and in any case this static content is likely to be cached somewhere between the client and your server(s).
Using any external repository (not controlled by you) for code that you want to use in a production environment is a no-no (for me and many others): you don't want a sudden, catastrophic and irrecoverable failure of your whole site's JavaScript functionality just because somebody somewhere pressed commit without thinking or checking.
Compiling all application-specific/local JS code into one file, using
CDNs like Google's for popular libraries, etc. but loading all of
these via headjs in parallel seems optimal...
I'd say this is basically right. Do not combine multiple external libraries into one file, since—as it seems you're aware—this will negate the majority case of users' browsers having cached the (individual) resources already.
For your own application-specific JS code, one consideration you might want to make is how often this will be updated. For instance if there is a core of functionality that will change infrequently but some smaller components that might change regularly, it might make sense to only compile (by which I assume you mean minify/compress) the core into one file while continuing to serve the smaller parts piecemeal.
Your decision should also account for the size of your JS assets. If—and this is unlikely, but possible—you are serving a very large amount of JavaScript, concatenating it all into one file could be counterproductive as some clients (such as mobile devices) have very tight restrictions on what they will cache. In which case you would be better off serving a handful of smaller assets.
These are just random tidbits for you to be aware of. The main point I wanted to make was that your first instinct (quoted above) is likely the right approach.

Link to external source or store locally

When using third-party libraries such as jquery, yui-reset, swfobject and so on, do you link to the hosted versions, or download and host your own versions?
Pros and cons either way?
Hosted versions are apparently the way to go. For three main reasons (edit: I've added a fourth reason, but it's sort of a moot point):
Google/jQuery/etc servers are likely to be faster than your own
A lot of these servers use content distribution so it will be served from a server geographically close to the requester
If every site used the hosted versions, users are more likely to have the files cached in their browsers, so a trip to the server might not even be necessary
They are possibly more reliable than your own server (however if your own server goes down, this point is moot, as you probably won't be able to serve the main page, so there won't be a request to the js file anyway)
Cons would be
You have no control over the uptime/reliability of the servers (though they are likely more reliable than your own)
Cannot make any custom mods/patches to these files (though most good frameworks allow you to extend them without needing to modify the original code)
If the hosted file does not allow you to specify the version as part of the filename (e.g. "jquery-1.3.2.js" rather than "jquery.js"), you probably don't want to use the hosted version, as any updates could break your code
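For example, Google's CDN exposes version-pinned URLs, so your page never silently picks up a new release (shown here for jQuery 1.3.2; substitute the version you tested against):
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>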
I would say the pros generally outweigh the cons.
These are all JavaScript libraries - you want to put a copy of them on your own server.
If you somehow end up with a different version, you would not have tested against the newer version and it might break your code.
I always download and host them locally, just because I'm worried about their server downtime; there's no real guarantee that their servers will be up for the rest of time. There is usually a note in the script about who it belongs to anyway.
I guess the only con would be if the person who made the script didn't actually want it downloaded... but I don't see that ever happening.
Plus, the request times are much faster: instead of requesting a script hosted at Google, just request it from your own server.
For production, use hosted.
For development, use local, because if you're offline then your dev site is broken.
