Finding a better way of handling user-session-based data in Angular/Node.js - javascript

I have a stored procedure that takes too much time to run, as it internally does multiple joins on different tables, a group by and an order by, and then returns the response to the Node.js server. The server then passes the data received from the MySQL DB on to the Angular client.
Because of this, the client has to wait too long for the HTTP request to complete, so I was thinking of better approaches that would shorten the request completion time.
I need suggestions on how I can prepare in advance the data a user will need later, like adding a small cache database on the server that stores user-based data and clears it out when the user session is destroyed. If I go with this approach, which DB would be best?
Or, instead of having a cache DB, could I have a JSON file on the server to store users' data and use that?

Having a JSON file on the server would be too cumbersome to handle for things like updates and queries, in my opinion. Have a look at in-memory databases for caching purposes if the data doesn't change too often. My recommendation is Redis, simply because it's what I have the most hands-on experience with, but there are many other options you can find with a quick search, such as Memcached.
A caching strategy only works if the data you are caching doesn't change too quickly. You can explore different schema strategies for it.
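For concreteness, here is a minimal cache-aside sketch with the node-redis (v4) client. The runStoredProcedure helper, the key scheme and the TTL are assumptions for illustration, not prescriptions:

const { createClient } = require('redis');

const redis = createClient();  // defaults to localhost:6379
const ready = redis.connect(); // await this once before first use

async function getReportForSession(sessionId, params) {
  await ready;
  const key = `report:${sessionId}`;

  // 1. Serve from cache when possible.
  const cached = await redis.get(key);
  if (cached) return JSON.parse(cached);

  // 2. Fall back to the slow stored procedure (hypothetical MySQL call).
  const rows = await runStoredProcedure(params);

  // 3. Cache with a TTL roughly matching the session lifetime; also call
  //    redis.del(key) explicitly when the session is destroyed.
  await redis.set(key, JSON.stringify(rows), { EX: 60 * 30 });
  return rows;
}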

Related

Storing logged user data from the front-end

I'm using AngularJS on the front-end and the Laravel framework on the back-end. My database is MySQL.
I need to store all users' logs. Since I don't want to store the logs inside MySQL, I chose MongoDB.
I also don't want to use Laravel to store the logs; instead, I want to use Node.js.
At a glance:
Laravel - MySQL: to store data
Node.js - MongoDB: to store users' logs
My question:
I want to send users' logs from AngularJS to Node.js and store them inside MongoDB.
I think most systems store user logs from the server side (here, Laravel to MongoDB or MySQL), but I would send the logs from the front-end to Node.js for storage. Of course, the connection between AngularJS and Node.js uses a hashing method.
What are the advantages and disadvantages of this method?
I have built a framework that logs all kinds of user actions from the front-end JS app via an AJAX call. For performance reasons it is based on asynchronous events and buffers actions until some number are queued (generally 10) or a window close event fires. This data is stored in a back-end database (ours is a fixed SQL schema, so we normalize all logs to a specific format).
We have found this a very good debugging and auditing tool. It does consume a modest amount of bandwidth and we only have a few hundred simultaneous users, but I would think with a good server design it could scale well (we use two sets of servers in two data centers).
It does help to classify the "actions" so we have some nav actions for focus changes, some api actions for other ajax calls, error logs for anything unusual, some input actions for form data edits, etc. We always log a session id, the user id, timestamp, etc. Best part is the framework does it all and the app writer does not need to think about it unless they explicitly want to call a logging function.
Your mileage will vary based on your requirements/environment, but it works very well for us.
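As a rough sketch of what such a buffer can look like (the /api/logs endpoint, the event fields, and the user/session variables are all illustrative, not taken from the original framework):

const buffer = [];
const FLUSH_AT = 10; // flush once this many actions are queued

function logAction(type, detail) {
  // currentUserId / currentSessionId are assumed to be available in scope
  buffer.push({ type, detail, userId: currentUserId, sessionId: currentSessionId, ts: Date.now() });
  if (buffer.length >= FLUSH_AT) flush();
}

function flush() {
  if (!buffer.length) return;
  const batch = JSON.stringify(buffer.splice(0)); // drain the buffer
  fetch('/api/logs', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: batch,
  });
}

// On window close, sendBeacon survives page unload where fetch may not.
window.addEventListener('pagehide', () => {
  if (buffer.length) navigator.sendBeacon('/api/logs', JSON.stringify(buffer.splice(0)));
});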

Better performance between frontend JS and backend JS

I am a front-end guy, but I am working on a project in which I need to process lots of data in my Node.js backend (my front-end is React).
Once the backend has processed the data, I have the choice of doing any further processing either in Node or in React (knowing that, in the end, I need this data on the front-end).
Example: an array of links has been created in my backend, but I need to extract a single link from this array in order to display it in React. I have the choice: pass the array to React and process the data there, or do it directly in Node.
Is there a common way to resolve this dilemma? What should I take into account to make a decision?
It's not good to send excessive information from your backend to your frontend. If a lot of the data you send isn't going to be used, it's probably best to adjust your backend so that it only returns information your frontend will actually use.
Alternatively, if your frontend isn't going to use all the information right away but might use it later (based on user input), then it's better to send all the data and process it on the frontend as needed, to avoid making future requests to your backend.
Taking an array of links as an example:
If the user requests to see a link based on certain criteria, and that's the only link they are going to see (based on the design of your application), then your backend should process the request and return only that link, to be displayed on the front-end.
If the user can request to see a link, but could potentially request to see another link later, then your backend should send a full array of links that might need to be displayed at some point. Then your frontend can display the links at the appropriate time without having to make a request to your backend each time the user wants to see a new link.
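To make the two options concrete, here is a small Express sketch; the routes, the buildLinks helper, and the visibleTo filter are all assumptions for illustration:

const express = require('express');
const app = express();

// Option 1: the user only ever sees one link, so filter on the backend.
app.get('/api/link', async (req, res) => {
  const links = await buildLinks(); // hypothetical helper that builds the array
  res.json(links.find(l => l.id === req.query.id) || null);
});

// Option 2: the user may browse other links later, so send the whole
// (possibly user-filtered) array once and let the frontend pick from it.
app.get('/api/links', async (req, res) => {
  const links = await buildLinks();
  res.json(links.filter(l => l.visibleTo(req.user))); // hypothetical per-user filter
});

app.listen(3000);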
In my opinion, if the logic doesn’t need to be done by the browser, then do it on the server. It will help you with reducing the size of your app in the long run. You want your final, bundled .js file to be as small as possible. That’s just one small step you can take to contribute to that.
The short answer is that it all depends on your business logic. Regarding how best to handle an array of items to be sent from backend to front-end, if a user will only ever need to see this one item, for example, then by all means, have the backend parse the array of data on its end and send that single item to the client front-end. If, on the other hand, you anticipate that you'll need to work with an array of items to be presented to the user at some point in the app, it would be reasonable to simply have the backend send the array of items. Furthermore, that array of items could be, for instance, a filtered version of the items that would be relevant to this particular user.

Cache invalidation and synchronisation Angular/back-end

Intro:
I've got a complex, long-running query on the back-end feeding the Angular app on the front-end.
Currently the Angular app uses cached data on the back-end rather than reading directly from the complex query, which would take a few minutes. The cache is warmed every morning and every night.
Users make changes in the UI and save them; the data is passed to the server side and saved to the database. At that point the UI is up to date (until the user refreshes the page) and the database is up to date, but the cache is stale.
So when the user refreshes the page, the stale cache values are displayed.
More info:
I'm now thinking of ways to refresh the cache, and any advice from more experienced folks would be most welcome.
My idea is to refresh the cache with a cache job (one at a time), queued as soon as a user saves something. The job will carry the relevant info about what changed, so the whole cache won't have to be recalculated, just the bit that changed.
Question part:
What technique can I use to keep the user up to date with the data even if the user refreshes the page? Should I save the 'deltas' on the client side, in IndexedDB or localStorage, at the same time the data is sent to the server? Then, when the page refreshes, the user reads the data from localStorage or IndexedDB.
I'm still thinking this through; I obviously don't have much experience with this. Any comments on the directions I've taken so far?
Basically I can change anything, including back-end/front-end/caching; it's still in the POC phase. I'm just trying to be as informed as possible about what has worked for other people.
Update
A little more background: I'm working on an index-like page, so there is more than one record that can be edited inline.
I'm also doing some transformation of the flat DB records on the back-end before dumping them into a map-like structure and passing it to the front-end as JSON.
I would think the simplest way would be to make sure you know the time the cache was created. When you make changes, save the current state of the page in localStorage along with the time of the cache. When you load the page, you get the cached data and check its time to see whether it is more recent than your localStorage version. If it is, use the cache; if not, reload your data from localStorage, since it has the cached data plus your changes already.
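A minimal sketch of that comparison, assuming the server response carries a cacheCreatedAt timestamp alongside the data:

function saveLocalEdits(data, cacheCreatedAt) {
  // Store the edited state together with the cache timestamp it was based on.
  localStorage.setItem('edits', JSON.stringify({ cacheCreatedAt, data }));
}

function pickFreshest(serverResponse) {
  const local = JSON.parse(localStorage.getItem('edits') || 'null');
  // If the cache was rebuilt after our edits were saved, it already includes
  // them; otherwise the local copy (cache + deltas) is the newer one.
  if (!local || serverResponse.cacheCreatedAt > local.cacheCreatedAt) {
    return serverResponse.data;
  }
  return local.data;
}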
Your question is long, so let me summarize the facts:
You have a lot of information in the database.
A direct search query takes several minutes.
To provide fast search, you use a cache that is updated twice a day.
When the user changes the data, the database is updated but the cache is not, so the web page shows outdated information from the cache.
This looks like a typical caching scenario, and the solution is obvious: update the cache with deltas as soon as the database changes. The actual implementation will depend on your application architecture and cache structure.
The typical workflow for your problem would be (pseudocode, here in Node-style JavaScript):
async function updateRequest(req) {
  const tx = await db.startTransaction();
  try {
    await tx.execute(createUpdate(req.data));
    await tx.commit();
  } catch (err) {
    await tx.rollback(); // if the transaction fails, the cache is never touched
    throw err;
  }
  cache.update(req.data); // can be done in the background, if you return the delta
}
It seems that you are storing your data in tables and using those tables, with a complex query, to build a JSON configuration for rendering your index.html file. I avoided this problem by avoiding tables and using a NoSQL solution: I build the JSON configuration object on the client side and store it in a NoSQL collection, then do a simple query using the URL to grab the object and render index.html.
I have a little experience storing the JSON configuration object with AWS DynamoDB, and if I need it to be faster I will probably switch to AWS ElastiCache.
The key is that you need to cache your JSON configuration object with a useful key like the site hostname or some other base URL and use that as your source of truth for index.html rendering.
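For example, a lookup like that with the AWS SDK v3 could look roughly like this; the table name, key, and attribute names are made up for illustration:

const { DynamoDBClient, GetItemCommand } = require('@aws-sdk/client-dynamodb');

const client = new DynamoDBClient({ region: 'us-east-1' });

async function getSiteConfig(hostname) {
  const { Item } = await client.send(new GetItemCommand({
    TableName: 'SiteConfigs',           // assumed table name
    Key: { hostname: { S: hostname } }, // hostname as the partition key
  }));
  // `config` is assumed to hold the JSON configuration object as a string.
  return Item ? JSON.parse(Item.config.S) : null;
}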

When to call the backend and when to store locally (angularjs)

I have an Ionic app and a Parse.com backend. My users can perform CRUD functions on exercise programmes, changing every aspect of the programme, including adding, deleting, and editing the exercises within it.
I am confused about when to save, when to call the server, and how much data can be held in services / $rootScope.
Typical user flow is as below:
Create a Programme and a Client (create both on the server and store the data in $localStorage).
The user goes to the edit screen, where they can perform CRUD operations on all exercises within the programme. Currently I make a server call for each operation so it is synced to the backend.
The user may go back and select a different programme - downloading the data and storing it in localStorage again.
My question is how I can ensure that my users' data is always saved to the server while offering them a responsive, fast user experience.
Would it be normal to have a timeout function that triggers a save periodically? On mobile, the number of calls to the server is quite painful over a poor connection.
Any ideas on full local / remote sync with Ionic and Parse.com would be welcome.
From my experience, the best way to think of this is as follows:
localStorage is essentially a cache layer, which, if up to date, is great because it can reduce network calls. However, it can be cleared at any time and should be treated as volatile storage.
Your server is your source of truth, and as such, should always be updated.
What this means is: for reads, localStorage is great; you don't need to fetch your data a million times if it hasn't changed. For writes, always trust your server for long-term storage.
The pattern I suggest is: on load, fetch any relevant data and save it to localStorage. Any further reads come from localStorage. Edits go directly to the server and, on success, you write those changes to localStorage too. This way, if a save fails, the user can be informed, and/or you can use localStorage as a queue and keep trying to post the data to the server until it fully succeeds.
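A minimal sketch of that pattern (the /api/programmes endpoint and payload shape are assumptions):

async function loadProgrammes() {
  const cached = localStorage.getItem('programmes');
  if (cached) return JSON.parse(cached); // reads come from the cache
  const fresh = await (await fetch('/api/programmes')).json();
  localStorage.setItem('programmes', JSON.stringify(fresh));
  return fresh;
}

async function saveProgrammes(programmes) {
  try {
    // Writes go to the server first: it is the source of truth.
    await fetch('/api/programmes', {
      method: 'PUT',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(programmes),
    });
    localStorage.setItem('programmes', JSON.stringify(programmes)); // cache only on success
  } catch (err) {
    // On failure, queue the write and retry later (e.g. on the 'online' event).
    const queue = JSON.parse(localStorage.getItem('pendingWrites') || '[]');
    queue.push(programmes);
    localStorage.setItem('pendingWrites', JSON.stringify(queue));
  }
}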
This is called "offline sync" or sometimes "4-way data binding". The point is to cache data locally and sync it with a remote backend. This is a very common need, but the solutions are unfortunately not that common... The ideal flow would follow this philosophy:
save data locally
try to sync it with the server (performing auto-merges)
and
periodically sync, driven by a timer and maybe a "connection resumed" event
This is very hard to achieve manually. I've been searching for modules for a long time, and the only ones that come to mind don't really fit your needs (because they are often backend providers that give you front-end connectors, and you already have an opinionated backend), but here they are anyway:
Strongloop's Loopback.io
Meteor
PouchDB
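For flavour, the PouchDB route boils down to a few lines: writes go to a local database and replicate to a remote CouchDB-compatible server whenever a connection is available. Note that Parse.com is not CouchDB-compatible, so this only illustrates the flow; the remote URL is a placeholder:

const PouchDB = require('pouchdb');

const local = new PouchDB('programmes');
local.sync('http://localhost:5984/programmes', {
  live: true,  // keep syncing as changes happen
  retry: true, // back off and retry when the connection drops
}).on('change', info => console.log('synced', info))
  .on('error', err => console.error('sync error', err));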

Backbone.js Security

I am learning Backbone.js at the moment, so sorry if my question is noobish :-P
In my program I check on the server side that my data is correct, etc., but I was wondering what will happen if users change the data stored in models using the console (in Firebug, for example) and try .save() or .fetch().
Is there any way to stop such actions?
Considering that all my data is going to be stored in models and can easily be retrieved by users, I am not really comfortable using Backbone.js. Is it just me, or is there something wrong here?!
A simple and safe way is to include the user credentials (username and password) in your model and check them on the server side for each AJAX call.
To avoid so many database requests, you can also generate an associative array of id => serial key for each logged-in user on the server side, return it via fetch() during the auth process, and then check that the id and the serial key you generated match on each AJAX call.
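An illustrative Express-style sketch of that id => serial-key scheme (the in-memory sessions map and request shape are assumptions; use a real session store in practice):

const crypto = require('crypto');

const sessions = {}; // userId -> serial key, populated at login

function login(userId) {
  const key = crypto.randomBytes(32).toString('hex');
  sessions[userId] = key;
  return key; // sent back to the client during auth, included with each call
}

// Middleware: reject any AJAX call whose id/serial-key pair doesn't match.
function requireSerialKey(req, res, next) {
  const { userId, serialKey } = req.body || {};
  if (sessions[userId] && sessions[userId] === serialKey) return next();
  res.status(401).json({ error: 'invalid credentials' });
}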
but I was wondering what will happen if users change the data stored in models using the console (in Firebug, for example) and try .save() or .fetch().
Then the edited data would be submitted to the server
Is there any way to stop such actions?
No, you just have to deal with them in the same way that you deal with any request: Perform authentication/authorization to make sure that the user making the request is allowed to do so.
considering that all my data is going to be stored in models and can easily be retrieved by users, I am not really comfortable using Backbone.js
Then don't use it.
But don't be paranoid about keeping data secret if it is stuff you would display to the user anyway if you weren't using a client-side framework like Backbone.
considering that all my data is going to be stored in models and can easily be retrieved by users, I am not really comfortable using Backbone.js. Is it just me, or is there something wrong here?!
You aren't doing anything wrong, but not using Backbone won't make your site any more secure. Even if you are not using Backbone, I can fire up the console while on your site and make any ajax request I want to your server. If I wanted to take it further, I could build an application that makes any request I want.
No real security can be implemented client-side. That is the server's responsibility regardless of whether or not you are using something like Backbone.
