In my project I load JSON files asynchronously via import().then.
Occasionally, when new content is available but isn't applied yet, the old cached script bundle tries to load the JSON files by their old names (because the bundler generates new hashed names on every build). But those files are no longer available.
I saw that many apps use a toast message to inform their users about the update so they can refresh, but is there another way to solve this issue?
No. There's no automatic way to handle this.
Based on your question I take it your SW configuration uses a cache-first strategy. Since it's cache-first, the situation you described can happen and will happen sometimes.
You have two options:
Precache all the JSON files too. This way, when the old JS code tries to fetch the JSON files asynchronously, they will be served from the SW's cache instead of going to the network. The client gets an old version of the whole app, including the JSON files.
Implement some sort of complex(ish) custom logic that tries to fetch the files and, on error, asks your server for the correct filenames, then retries with the new names. You could easily generate a file at build time that lists all the current JSON filenames.
Both options have different gotchas and they might or might not work depending on the application.
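A minimal sketch of option 2, assuming the server publishes a small manifest mapping logical names to the current hashed filenames (the /asset-manifest.json endpoint and its shape are assumptions for illustration, not something a bundler provides out of the box):

```javascript
// Sketch: try the hashed URL the old bundle knows about; on failure,
// look up the current filename in a server-published manifest and retry.
// Manifest shape assumed: { "translations": "/assets/translations.abc123.json" }
async function loadJson(logicalName, hashedUrl) {
  try {
    const res = await fetch(hashedUrl);
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    return await res.json();
  } catch (err) {
    // The hashed file from the old bundle is gone; ask for the current name.
    const manifest = await (await fetch('/asset-manifest.json')).json();
    const freshUrl = manifest[logicalName];
    if (!freshUrl) throw err;
    const retry = await fetch(freshUrl);
    if (!retry.ok) throw err;
    return await retry.json();
  }
}
```

The retry path adds one extra round trip, but only in the rare window between a deploy and the client's next full reload.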
Related
I am using react-markdown to load markdown from a JSON file.
I serve a static site from CloudFront to cut costs and remove the need to operate servers.
Currently all posts get compiled into a posts.json file that is read by react-markdown in a React-based static site.
Beyond maybe chunking this into multiple files so a huge JSON file doesn't need to be downloaded, are there any issues with this method?
Is this a bad idea?
EDIT: react-snap is being used to "prebuild" (or whatever the term may be) the site. I am unsure, however, whether this does anything about the JSON that gets loaded on the page, for example whether it is output in the build as plain HTML. I will have to confirm.
Your approach does take on some overhead and latency, since several dependencies must be satisfied before your content reaches the user:
The browser must load and parse the script
The script must fetch the JSON from the server
react-markdown has to parse the markdown strings
React has to render the components
None of these are likely to be particularly slow, but we can't do any of it concurrently, and it adds up.
But since your markdown is static and doesn't require re-rendering, you can get some efficiency from moving the render to the server, possibly even to the build step itself. If the markdown string is compiled to HTML via react-markdown at build time, then the client receives the markup that much more quickly.
Frameworks like Next.js do this by design - it allows you to specify async functions that fetch the data needed to render the page at build time. The data can of course be anything that is representable as React props, including JSON.
It may be neither your responsibility nor your preference to change your project to use a new framework, but I hope the general ideas are useful on their own.
I think server-side rendering will help in your case, since most of the resources are currently processed on the client machine. You can also use Puppeteer, which is headless Chrome, to render resources on the server and then send them to the client to reduce latency. Refer to https://developers.google.com/web/tools/puppeteer
It looks like you have everything you need to use a static site generator like Gatsby. It will let you keep your big JSON file, which will only be read at build time, and will generate a standalone static HTML document for each of your blog posts.
You can also host your static site for free on the free tiers of popular CDNs like Netlify and Surge.
I'm using Meteor 1.5, FlowRouter and Blaze.
I've been working on an app which requires offline support. Unfortunately, it also has several (large) areas that are only available to a small subset of users. To avoid bloating the initial JS download with content most users won't require, I'm using dynamic imports at the FlowRouter.route({action}) level.
For offline mode (in addition to handling the data, etc.) I'm using a service worker to cache the JS, CSS and HTML. Unfortunately, because dynamic imports work over the websocket, it isn't possible to cache them as they are loaded.
Fortunately, the user has to notify the server of their intent to work offline (so relevant data and files, videos, etc can be downloaded), this gives the opportunity to load these dynamic imports prior to the client going offline.
What are my options for caching dynamic imports? What I've considered so far:
writing a simple package with all the dynamic imports loaded statically, and with {lazy: true} defined in the package.js.
requires quite a lot of restructuring
the lazy: true means the package isn't actually available via a URL, it seems it is only available as a dynamic import itself.
writing a server side "fetcher" that will take a package name as an argument and serve the content from the file system
I don't know how the client side packages are stored on the server, and if the server has access to those files at all.
using browserify (or something similar) to manually generate a static js bundle in /public which can be downloaded when the client states their intent to go offline
It's very manual, and a change to the dynamic imports could easily be missed.
Has anyone attempted this before? I know Meteor doesn't officially support service workers, but as far as I can tell, with the exception of dynamic imports, it plays nicely with them.
By default, Angular fetches the HTML templates from the server when the user navigates to a route. With that in mind, imagine this scenario:
User loads the Angular app. The main view has a subpage called "Order".
While the user is studying the main view, a new version of the app is rolled out in production. The new version has a complete rewrite of the Order page with new JavaScript and HTML.
The user navigates to the Order page. The JavaScript was already loaded by the browser in step 1, so the user is on the old version until the app is reloaded. But the new template gets fetched from the server on navigation. So now the JavaScript and template are out of sync!
Is my assumption that the Javascript/HTML is out of sync, correct?
If so, are there any best practices related to this issue?
I guess one solution is to make Angular fetch all the templates on app initialization. But this could be a performance penalty if the app has hundreds of HTML views.
I've never wondered about that issue myself. One possible idea would be to reuse the pattern known as asset versioning, where upon each new release you rename all your assets.
For instance, instead of login.html you'd use login-xyz.html as the name of a template. xyz could be a random value or a checksum of the file. A checksum might be a slightly better option: if the new release is small (i.e. you fixed just a small bug in one file), then as long as the user loads any page but the fixed one, he/she will not be bothered with a reload - all other files will have the same checksums and will work without interruption.
This way, when an outdated Angular app tries to fetch a template, it'd get an HTTP 404 error. As an addition to that, you could write a simple $http interceptor, which would detect a 404 response and reload the page automatically (or offer the user the option of doing so).
There are modules which are capable of renaming assets, such as gulp-rev - but I never heard of using that for Angular templates. You might implement something like that on your own, though.
Of course you might want to keep both the new and old versions of files to allow users to work without interrupting them with a refresh. Depends on what your requirements are. I assume you're trying to avoid that, though.
Sample 404 interceptor (CoffeeScript, as I have it handy now):
m.factory 'notFoundInterceptor', ($q) ->
  return {
    responseError: (response) ->
      if response?.status == 404
        # Reload, or warn user, then swallow the response
        # by returning a promise that never resolves
        return $q.defer().promise
      # Not a 404, so handle it elsewhere
      return $q.reject response
  }

m.config ($httpProvider) ->
  $httpProvider.interceptors.push 'notFoundInterceptor'
Thanks for good answers.
It turned out that this problem solved itself for us. Every time we roll out a new release, all user sessions get deleted and users are sent to the login page. This triggers a page load, and fresh JavaScript/HTML gets loaded.
I read about this issue a long time ago, and one option is to version changed pages and the application.js file.
For example on your version 1 of your application you can on your html file use something like:
<script src="js/angular_app_v1.js"></script>
Inside your routes, also version the templateUrl:
templateUrl: 'templates/view_product_v1.html'
So when you roll out a new version you won't be overwriting templates, and users already working will keep the old version until they reload the browser, but won't see version inconsistencies.
Versioning the assets using the file names would become unmaintainable for even a medium-sized app.
Although it is a heavyweight approach for web assets, you could look into content negotiation. This is where the request for a resource (generally a REST API) returns the version of the resource, e.g. Content-Type: application/vnd.contentful.delivery.v1+json. On the client you can check that the version matches what it expects. So if the client only knows how to handle v1.1 and the resource responds with v1.2, the UI knows it cannot process that and should reload the page.
Another possibility is to load all templates up front in the UI. There are build processes in Grunt you can run such as https://github.com/ericclemmons/grunt-angular-templates that will combine all of your templates into a single file for delivery and then load them into $templateCache so no requests ever get to the server.
If you have some sort of server-side language you can build a filter in (.NET, Rails, Java or whatever), and pass along a version number with your template requests. If the version requested by the client is older than what's deployed, you'd send an error to the client. Your client would watch for that error ($http interceptor) and force a page-refresh to pull down the newer javascript code. (Maybe show an alert to the user first so they know what's going on).
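The server-side filter plus client-side check described above could be sketched framework-free like this (the X-App-Version header name and the version strings are assumptions for illustration, not part of any particular framework):

```javascript
// Sketch: compare the version the server reports on each response with the
// version baked into this bundle; on mismatch, warn and/or force a refresh.
const CLIENT_VERSION = 'v1.1'; // stamped into the bundle at build time

function checkVersion(response, reload = () => location.reload()) {
  const serverVersion = response.headers.get('X-App-Version');
  if (serverVersion && serverVersion !== CLIENT_VERSION) {
    // Server has been redeployed since this bundle was loaded:
    // this is the point to alert the user and reload the page.
    reload();
    return false;
  }
  return true;
}
```

In AngularJS this logic would live inside a `responseError`/`response` interceptor, as in the CoffeeScript sample earlier in the thread.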
You can preload all your templates into a $templateCache and serve them as one templates.js file. There is a gulp task for this https://www.npmjs.com/package/gulp-angular-templatecache. Then your application will load all templates in a single request together with application scripts on start up, thus they will be in sync. Read http://www.johnpapa.net/angular-and-gulp/ for more info.
It always makes sense to have a version number and use it when syncing resources. It's not only a good practice for the use case you described, but also for other situations, such as rolling back to a specific version or having two versions live and usable (for example, to let some users preview the next version).
I was thinking about creating script that would do the following:
Get all JavaScript files from the JS directory used on the server
Combine all scripts into one - that would make only one request instead of multiple
Minify combined script
Cache the file
Let's say that the order in which the files need to be loaded is written in config file somewhere.
Now when I load myexamplepage.com, I actually use jQuery, Backbone, MooTools, Prototype and a few other libraries, but instead of asking the server for these multiple files, I call myexamplepage.com/js/getjs and get a combined and minified JS file. That way I eliminate those additional requests to the server. And as I read about speeding up websites, I found out that the more requests you make to the server, the slower your site becomes.
Since I'm pretty new to the programming world, I know that many things I think of already exist, and I don't think this is an exception.
So please list anything you know that does exactly or roughly what I described. (Please note that you shouldn't need to run any minifiers or third-party software by hand every time your scripts change; you keep the original file structure and only use a helper class.)
P.S. I think same method could be used for CSS files also.
I'm using PHP and Apache.
Rather than having the server do this on-the-fly, I'd recommend doing it in advance: Just concatenate the scripts and run them through a non-destructive minifier, like jsmin or Google Closure Compiler in "simple" mode.
This also gives you the opportunity to put a version number on that file, and to give it a long cache life, so that users don't have to re-download it each time they come to the page. For example: Suppose the content of your page changes frequently enough that you set the cache headers on the page to say it expires every day. Naturally, your JavaScript doesn't change every day. So your page.html can include a file called all-my-js-v4.js which has a long cache life (like, a year). If you update your JavaScript, create a new all-in-one file called all-my-js-v5.js and update page.html to include that instead. The next time the user sees page.html, they'll request the updated file; but until then, they can use their cached copy.
If you really want to do this on-the-fly, if you're using apache, you could use mod_pagespeed.
If you're using .NET, I can recommend Combres. It does combination and minification of JavaScript and CSS files.
I know this is an old question, but you may be interested in this project: https://github.com/OpenNTF/JavascriptAggregator
Assuming you use AMD modules for your javascript, this project will create highly cacheable layers on demand. It has other features you may be interested in as well.
Let's say I have an application that updates its javascript/css files (and keeps the same file names).
How can you force a client not to serve files from cache and to use the new javascript/css? I have a feeling a bug is being caused by this. I still want the files to cache, so appending a random string to the css/js include will NOT be good.
Append the date the file was changed to the CSS/JS file request. Many frameworks can do this for you, depending on your technology stack.
Even better is to put a version number in the filename and each time you revise the file, you modify the version number in the filename and modify the source page that loads it to point to the new filename. Then you get maximum caching possible when the file hasn't changed, but when you do change the file you immediately get the new version. This technique is used by many sites on the web for exactly this reason.