Cache invalidation and synchronisation Angular/back-end - javascript

Intro:
I've got a complex, long-running query on the back-end that feeds the Angular app on the front-end.
Currently the Angular app uses cached data on the back-end rather than reading directly from the complex query, which would take a few minutes. The cache is warmed every morning and every night.
When users make changes in the UI and save, the data is passed to the server side and saved to the database. At that point the UI is up to date until the user refreshes the page, and the database is up to date, but the cache is stale.
So when the user refreshes the page, the stale cache values are displayed.
More info:
I'm now thinking of ways to refresh the cache, and any advice from more experienced folks would be most welcome.
My idea is to refresh the cache with a cache job (one at a time), queued as soon as the user saves something. The job will carry the relevant info about what changed, so the whole cache won't have to be recalculated, just the bit that changed.
Question part:
What technique can I use to keep the user up to date with the data even if the user refreshes the page? Should I save the 'deltas' on the client side, in IndexedDB or localStorage, at the same time the data is sent to the server, so that when the page refreshes the user reads the data from localStorage or IndexedDB?
I'm still thinking this through and obviously don't have much experience with this; any comments on the direction I've taken so far?
Basically I can change anything, including the back-end/front-end/caching; it's still in the POC phase, and I'm just trying to be as informed as possible about what has worked for other people.
Update
A little more background: I'm working on an index-like page, so there is more than one record that can be edited inline.
I'm also doing some transformation of the flat DB records on the back-end before dumping them into a map-like structure and passing it to the front-end as JSON.

I would think the simplest way would be to make sure you know the time the cache was created. When you make changes, save the current state of the page in localStorage along with the time of the cache. When you load the page, get the cached data and check its timestamp to see whether it is more recent than your localStorage version. If it is, use the cache; if not, reload your data from localStorage, since it has the cached data PLUS your changes already.
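A minimal sketch of that idea in plain JavaScript; the endpoint URL, the cacheCreatedAt field and the helper names are placeholders, not anything your stack provides:

// Hypothetical helper: decide whether to trust the server-side cache or the
// locally saved copy that already contains the user's edits.
async function loadPageData() {
  // assume the back-end returns { cacheCreatedAt: '<ISO timestamp>', records: [...] }
  const serverCache = await fetch('/api/cached-records').then(r => r.json());
  const local = JSON.parse(localStorage.getItem('pageState') || 'null');

  // use the server cache if there is no local copy, or if the cache was
  // rebuilt after the local copy was saved
  if (!local || new Date(serverCache.cacheCreatedAt) > new Date(local.cacheCreatedAt)) {
    return serverCache.records;
  }
  return local.records; // cached data plus the user's own changes
}

// Hypothetical save path: persist the edited page state together with the
// timestamp of the cache snapshot it was based on.
function savePageStateLocally(records, cacheCreatedAt) {
  localStorage.setItem('pageState', JSON.stringify({ records, cacheCreatedAt }));
}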

Your question is quite long, so let me summarize the facts.
You have a lot of information in the database.
A direct query takes several minutes.
To provide fast reads, you use a cache which is updated twice a day.
When a user changes the data, the database is updated but the cache is not, so the web page shows outdated information from the cache.
This looks like a typical caching scenario, and the solution is straightforward: update the cache with deltas as soon as the database changes. The actual implementation will depend on your application architecture and cache structure.
The typical workflow for your problem would be:
def updateRequest(Request req) {
    def tx = db.startTransaction();
    tx.execute(createUpdate(req.getData()));
    tx.commit(); // if transaction fails, cache is not updated
    cache.update(req.getData()); // can be done in background, if you return delta
}
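To connect this with the queued cache job from the question, here is a rough JavaScript sketch of applying deltas one at a time; the queue and applyDeltaToCache are hypothetical, not part of any library:

// Deltas are applied to the cache in the order the saves arrived, one job at
// a time, so the whole cache never has to be recalculated.
const deltaQueue = [];
let draining = false;

function enqueueCacheUpdate(delta) {
  deltaQueue.push(delta);
  drainQueue();
}

async function drainQueue() {
  if (draining) return; // only one cache job runs at a time
  draining = true;
  while (deltaQueue.length > 0) {
    const delta = deltaQueue.shift();
    await applyDeltaToCache(delta); // hypothetical: patch only the changed records
  }
  draining = false;
}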

It seems that you are storing your data in tables and using a complex query over those tables to build a JSON configuration that renders your index.html file. I avoided this problem by avoiding tables and using a NoSQL solution. I build the JSON configuration object on the client side, store it in a NoSQL collection, and then do a simple query using the URL to grab it and render index.html.
I have a little experience storing the JSON configuration object in AWS DynamoDB, and if I need it to be faster I will probably switch to AWS ElastiCache.
The key is that you need to cache your JSON configuration object under a useful key, like the site hostname or some other base URL, and use that as your source of truth for index.html rendering.
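A minimal sketch of that lookup, assuming the AWS SDK v3 document client; the table name, key attribute and stored shape are made up for illustration:

// Fetch the cached JSON configuration object for a site by its hostname.
import { DynamoDBClient } from '@aws-sdk/client-dynamodb';
import { DynamoDBDocumentClient, GetCommand } from '@aws-sdk/lib-dynamodb';

const ddb = DynamoDBDocumentClient.from(new DynamoDBClient({}));

async function getSiteConfig(hostname) {
  const { Item } = await ddb.send(new GetCommand({
    TableName: 'site-config',       // made-up table name
    Key: { hostname }               // made-up partition key
  }));
  return Item ? Item.config : null; // the stored JSON configuration object
}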

Related

Caching Firebase data for multiple collections - how can I know when anything has changed?

I'm using Firebase to allow users to save certain information about their logged-in profiles. In total I have 4 different collections, all of which contain UID-matching documents.
When my site loads up, I need to immediately access data from all 4 collections for a given user's UID. This means that right now, whenever a user reloads my application they generate 4 get requests. However users of my application tend to not change their stored information very often (perhaps once every 20 or so times they load the application), so there's really no need for this data to be re-requested every time.
I've already taken steps to cache all of this data within the user's browser local storage, but here is where my problem lies: how can I know when the data I have cached is out of date?
In the past I've created similar cache mechanisms which ultimately rely on a 'version' flag. Both the data source and the user's local storage contain a 'version'. When both match, the application knows that it has the latest data, however when they are out of sync, the application knows it needs to re-request the data. The 'version' flag on the data source is ultimately changed every time an update occurs which impacts the user in question.
Ideally I want to end up with this flow:
User loads application.
Application compares local data version with version contained in Firebase.
If versions match, use local data.
If versions do not match, request all data.
Is it possible to do something like this with Firebase? Right now I can't see how to do it without generating an extra set call every time the user's data is modified, just to change this version flag.
If this isn't possible, how can I know when a user's data needs updating without using up multiple requests?
If it helps, I'm using Firestore and this is a web application with no backend.
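A minimal sketch of the version-flag flow described above, assuming the Firestore v9 modular web SDK and a made-up meta/{uid} document whose version field is bumped on every write (which is the extra set call the question mentions):

import { getFirestore, doc, getDoc } from 'firebase/firestore';

async function loadProfileData(uid) {
  const db = getFirestore();

  // one small read to learn the current version
  const metaSnap = await getDoc(doc(db, 'meta', uid));
  const remoteVersion = metaSnap.exists() ? metaSnap.data().version : 0;

  const cached = JSON.parse(localStorage.getItem('profile:' + uid) || 'null');
  if (cached && cached.version === remoteVersion) {
    return cached.data; // 1 read instead of 4
  }

  // hypothetical: your existing 4 get requests, one per collection
  const data = await fetchAllFourCollections(uid);
  localStorage.setItem('profile:' + uid, JSON.stringify({ version: remoteVersion, data }));
  return data;
}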

Persist large amounts of data in AngularJS

I have a really big AngularJS web application which loads large files when you first enter the site. The site is composed of a few SPAs, and every time the user changes from one to the other this data is lost and has to be loaded again. The same happens if the user presses refresh.
This data is very unlikely to change during the user's session, so it would be great if we could persist it and reload it only when there's an update. I know about local and session storage, but they have roughly a 5 MB or 10 MB limit, which won't be enough. I would need around 20 MB of storage.
Is there any way to do that?
We had the same issue a couple of months ago and found no proper solution. We used two workarounds:
Use one SPA (put the various SPAs in a super SPA). We had the chance to do so; I don't know if that is an option for you. The ui-router module, which routes using states instead of URLs, might be helpful.
We used paginated data to reduce the size of the initial loads.
If you need this huge amount of data to compute statistics or render graphics in the browser, consider moving that computation to the backend.
You can also use the $http service to load the data and cache it, but this is only an option if the data does not change while the session is active.
Yes and no.
You can use server-side code to inject JavaScript into your page in many different ways, so you will not have to make an AJAX request to the server to get the data.
For example, if you use PHP you can do something like
<?php
$script = "<script>var Data = " . json_encode($data) . ";</script>";
echo $script;
?>
Then inside your controller do something like
$scope.data = Data;
But the browser is not built to handle that much data at once. If your data is bigger than about 5 MB you are going to have a very bad user experience and your app may well crash. Instead, you should organize your data so that all of the aggregation is handled on the server side and you send small packages to the client.
Maybe you can create a service that uses $http to load the data asynchronously and set the cache parameter to true, as explained here: How to cache an http get service in angularjs
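For reference, a small sketch of that option in AngularJS 1.x; the URL and service name are placeholders:

// Passing cache: true makes $http keep the response in its default
// $cacheFactory cache, so repeated calls for the same URL skip the network.
angular.module('app').factory('bigDataService', function ($http) {
  return {
    load: function () {
      return $http.get('/api/big-data', { cache: true }) // placeholder URL
        .then(function (response) { return response.data; });
    }
  };
});

Note that this cache lives in memory, so it helps while navigating within a single SPA but does not survive a full page refresh.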

When to call the backend and when to store locally (angularjs)

I have an Ionic app and a Parse.com backend. My users can perform CRUD functions on exercise programmes, changing every aspect of the programme, including adding, deleting and editing the exercises within it.
I am confused about when to save, when to call the server, and how much data can be held in services / $rootScope.
Typical user flow is as below:
Create a Programme and a Client (create both on the server and store the data in $localStorage).
The user goes to the edit screen where they can perform CRUD functions on all exercises within the programme. Currently I perform a server call on each function so it is synced to the back-end.
The user may go back and select a different programme, downloading the data and storing it in localStorage again.
My question is: how can I ensure that my users' data is always saved to the server while still offering them a fast, responsive user experience?
Would it be normal to have a timeout function that triggers a save periodically? On mobile, the number of calls to the server is quite painful over a poor connection.
Any ideas on full local / remote sync with Ionic and Parse.com would be welcome.
From my experience, the best way to think of this is as follows:
localStorage is essentially a cache layer, which is great when it is up to date because it can reduce network calls. However, it can be cleared or go stale at any time, so it should be treated as volatile storage.
Your server is your source of truth, and as such, should always be updated.
What this means is: for reads, localStorage is great, because you don't need to fetch your data a million times if it hasn't changed. For writes, always trust your server for long-term storage.
The pattern I suggest is: on load, fetch any relevant data and save it to localStorage. Any further reads should come from localStorage. Edits should go directly to the server, and on success you can write those changes to localStorage. This way, if you have an error on save, the user can be informed, and/or you can use localStorage as a queue to keep trying to post the data to the server until it fully succeeds (see the sketch below).
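A rough sketch of that pattern in plain JavaScript; saveToParse and notifyUser stand in for your actual Parse SDK call and UI code:

// Writes go to the server first, then mirror into localStorage on success,
// or into a retry queue on failure.
async function saveProgramme(programme) {
  try {
    await saveToParse(programme); // hypothetical server call
    localStorage.setItem('programme:' + programme.id, JSON.stringify(programme));
  } catch (err) {
    const queue = JSON.parse(localStorage.getItem('pendingSaves') || '[]');
    queue.push(programme);
    localStorage.setItem('pendingSaves', JSON.stringify(queue));
    notifyUser('Save failed, will retry when the connection is back'); // hypothetical UI hook
  }
}

// Call this periodically, or on a "connection resumed" event.
async function flushPendingSaves() {
  const queue = JSON.parse(localStorage.getItem('pendingSaves') || '[]');
  while (queue.length > 0) {
    await saveToParse(queue[0]);
    queue.shift();
    localStorage.setItem('pendingSaves', JSON.stringify(queue));
  }
}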
This is called "offline sync" or sometimes "4-way data binding". The point is to cache data locally and sync it with a remote backend. This is a very common need, but the solutions are unfortunately not that common... The ideal flow would follow this philosophy:
save data locally
try to sync it with the server (performing automatic merges)
And
periodically sync, driven by a timer and maybe some "connection resumed" event
This is very hard to achieve manually. I've been searching for modules for a long time, and the only ones that come to mind don't really fit your needs (because they are often backend providers that give you front-end connectors, and you already have an opinionated backend), but here they are anyway:
Strongloop's Loopback.io
Meteor
PouchDB
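As an illustration of that sync pattern with the last library above, a minimal PouchDB sketch; note it assumes a CouchDB-compatible remote endpoint (the URL below is a placeholder), which a Parse.com backend is not:

// PouchDB keeps a local IndexedDB database and syncs it with a remote
// CouchDB-compatible endpoint, retrying automatically when offline.
const localDb = new PouchDB('programmes');

const sync = localDb.sync('https://example.com/db/programmes', {
  live: true,  // keep syncing continuously
  retry: true  // back off and retry when the connection drops
});

sync.on('change', function (info) { console.log('synced', info); })
    .on('error', function (err) { console.error('sync error', err); });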

How to create a table in javascript and hold in memory for use on another page within a web application

I have two html files which are the two pages in my application
Page 1 = Home.html
Page 2 = Stats.html
When Page 1 is loaded, I am making AJAX calls to the Facebook API which returns some data and then I want to build a table using that data.
I then want to keep the table in memory and append it to a div on Page 2 when the user navigates to Page 2.
In this way, the wait time for the user is drastically reduced because in the time it takes them to navigate from Page 1 to Page 2, the majority of the work by the browser has been done.
The Stats.html page should look at cached data that was fetched on Home.html. This is technically perfectly possible.
Before I give some details, make sure that the code that fetches data from the Facebook API is shared between your two pages. The idea is to create a function that:
Looks for cached table data and retrieves it if found
If not found, fetches data from the Facebook API and stores it in the cache
Returns the table data
Remember to clear or refresh the cache from time to time (see the sketch below).
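A small sketch of such a function, shared between Home.html and Stats.html; the key name, the expiry time and fetchFromFacebookApi are placeholders:

// Return the table data from localStorage if it is fresh enough,
// otherwise fetch it again and refresh the cache.
const CACHE_KEY = 'fbTableData';    // made-up key name
const MAX_AGE_MS = 10 * 60 * 1000;  // example: expire after 10 minutes

async function getTableData() {
  const cached = JSON.parse(localStorage.getItem(CACHE_KEY) || 'null');
  if (cached && Date.now() - cached.savedAt < MAX_AGE_MS) {
    return cached.rows;
  }
  const rows = await fetchFromFacebookApi(); // hypothetical: your existing AJAX call
  localStorage.setItem(CACHE_KEY, JSON.stringify({ savedAt: Date.now(), rows: rows }));
  return rows;
}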
In order to properly store data, there are two main ways:
Cookies
This is the traditional way to store data across a website. Cookies are sent to your server on each request, which is not actually necessary here but can still help client-side. Cookies are controversial because they are subject to many security flaws, so I wouldn't recommend using them.
Local storage
This is a more modern browser feature, though less compatible with old browsers. It allows you to store data across a website. This method is getting more and more common, and I think you should use it. On top of that, I recommend using a helper library such as store to ease the storage.
More info about web storage can be found in Mozilla's documentation.
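For instance, with the store.js helper mentioned above (the key name is just an example):

// store.js wraps localStorage and degrades gracefully where it is unavailable.
store.set('fbTableData', { savedAt: Date.now(), rows: rows });
const cached = store.get('fbTableData'); // undefined if nothing was stored
store.remove('fbTableData');             // drop the cached copy when it goes stale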

Caching client side code results across pages

In our application, we paint the navigation component using JavaScript/jQuery, and because of authorization this involves complex logic.
The navigation component is required on almost all authenticated pages, so whenever the user navigates from one page to another, the complex logic is repeated on every page.
I am sure that under particular conditions the results of these complex calculations will not change for a certain period, so I feel recalculation is unnecessary under those conditions.
So I want to store/cache the results on the browser/client side. One solution, I feel, would be to create a cookie with the results.
I need suggestions if it is a good approach. If not, what else can I do here?
If you can rely on modern browsers, HTML5 web storage options are a good bet.
http://www.html5rocks.com/en/features/storage
Quote from above
There are several reasons to use client-side storage. First, you can make your app work when the user is offline, possibly sync'ing data back once the network is connected again. Second, it's a performance booster; you can show a large corpus of data as soon as the user clicks on to your site, instead of waiting for it to download again. Third, it's an easier programming model, with no server infrastructure required. Of course, the data is more vulnerable and the user can't access it from multiple clients, so you should only use it for non-critical data, in particular cached versions of data that's also "in the cloud". See "Offline": What does it mean and why should I care? for a general discussion of offline technologies, of which client-side storage is one component.
if (typeof(Storage) !== "undefined")
{
    // this will store and retrieve key / value for the browser session
    sessionStorage.setItem('your_key', 'your_value');
    sessionStorage.getItem('your_key');

    // this will store and retrieve key / value permanently for the domain
    localStorage.setItem('your_key', 'your_value');
    localStorage.getItem('your_key');
}
Alternatively, you can try HTML5 Local Storage or Web SQL, which give you more options. Web SQL support is much more limited compared to Local Storage. Have a look at this: http://diveintohtml5.info/storage.html
