I'm using AngularJS for the front end and the Laravel framework for the back end. My database is MySQL.
I need to store all user logs. Since I don't want to store the logs inside MySQL, I chose MongoDB.
I also don't want to use Laravel to store the logs; instead, I want to use Node.js.
At a glance:
Laravel + MySQL: to store data
Node.js + MongoDB: to store user logs
My question:
I want to send user logs from AngularJS to Node.js and store them in MongoDB.
I think most systems store user logs from the server side (here, Laravel to MongoDB or MySQL), but I would send the logs from the front end to Node.js for storage. Of course, the connection between AngularJS and Node.js uses a hashing method.
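For reference, the flow I have in mind looks roughly like this (a minimal sketch; the /logs endpoint, database names and field names are placeholders I made up):

// AngularJS side: post each log entry to the Node.js service
$http.post('http://node-host:3000/logs', { userId: currentUserId, action: 'login', ts: Date.now() });

// Node.js side (sketch, assuming Express and the official mongodb driver)
const express = require('express');
const { MongoClient } = require('mongodb');

const app = express();
app.use(express.json());

MongoClient.connect('mongodb://localhost:27017').then((client) => {
  const logs = client.db('app').collection('user_logs');
  app.post('/logs', (req, res) => {
    logs.insertOne({ ...req.body, receivedAt: new Date() })
      .then(() => res.sendStatus(204))
      .catch(() => res.sendStatus(500));
  });
  app.listen(3000);
});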
What are the advantages and disadvantages of this method?
I have built a framework that logs all kinds of user actions from the front-end JS app via an ajax call. For performance reasons it is based on asynchronous events and buffers actions until some number are queued (generally 10) or a window close event fires. This data is stored in a back-end database (ours is a fixed SQL schema, so we normalize all logs to a specific format).
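A minimal sketch of that buffering idea (the endpoint, threshold and field names here are placeholders, not our actual framework):

var logBuffer = [];
var FLUSH_AT = 10; // flush once this many actions are queued

function logAction(type, payload) {
  logBuffer.push({ type: type, payload: payload, ts: Date.now() });
  if (logBuffer.length >= FLUSH_AT) flush();
}

function flush() {
  if (!logBuffer.length) return;
  var batch = logBuffer.splice(0, logBuffer.length);
  // hypothetical endpoint; sendBeacon is used so the final flush survives page unload
  navigator.sendBeacon('/logs/batch', JSON.stringify(batch));
}

window.addEventListener('beforeunload', flush);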
We have found this a very good debugging and auditing tool. It does consume a modest amount of bandwidth and we only have a few hundred simultaneous users, but I would think with a good server design it could scale well (we use two sets of servers in two data centers).
It does help to classify the "actions" so we have some nav actions for focus changes, some api actions for other ajax calls, error logs for anything unusual, some input actions for form data edits, etc. We always log a session id, the user id, timestamp, etc. Best part is the framework does it all and the app writer does not need to think about it unless they explicitly want to call a logging function.
Your mileage will vary based on your requirements/environment, but it works very well for us.
Introduction
I'm setting up a Node.js server (with socket.io) which will handle user achievements and store them in my MySQL database.
The Problem
Socket.io generates a new unique "socket id" each time a user reloads the page, which means I either have to store the achievement data on the Node server or fetch it again on every page reload.
My question
Should I store the Achievement data on my Server - or should I fetch it every time a user needs it?
Performance wise - thoughts
My head tells me to store the achievement data on the Node server, as a simple page reload would otherwise cause my server to fetch the information again.
Solution to your problem:
Generate a uuid client side, store it as a cookie, and store it in localStorage as a backup.
On page load, check for the uuid. If it doesn't exist, generate it.
Send the uuid to socket.io when connecting.
Server side, use the uuid for storing and retrieving the achievements. You can choose any way of storing this data you like: flat files, relational DBs, whatever you like. You could even store it in memory as an object keyed by uuid, and periodically serialize and store that object as a JSON file; that way, after a crash you can reload the JSON file as your server's "state".
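A rough sketch of that flow (storage keys, the query option shape and the event wiring are assumptions that depend on your socket.io version):

// client side: get or create a persistent id
function getClientId() {
  var id = localStorage.getItem('clientId');
  if (!id) {
    id = (window.crypto && crypto.randomUUID)
      ? crypto.randomUUID()
      : Date.now() + '-' + Math.random().toString(16).slice(2);
    localStorage.setItem('clientId', id);
    document.cookie = 'clientId=' + id + ';path=/;max-age=31536000'; // cookie as backup
  }
  return id;
}

// pass it along when connecting
var socket = io({ query: { clientId: getClientId() } });

// server side: key achievements by the uuid instead of socket.id
io.on('connection', function (socket) {
  var clientId = socket.handshake.query.clientId;
  // ...load and store achievements for clientId here
});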
Hopefully there is enough here to give you an idea of a good way to address your problem.
I need to design an offline-first application with sync capabilities, so I decided to go with CouchDB. Since I will deploy this application on users' workstations, they have the ability to tamper with the data (in their local database, e.g. PouchDB). AFAIK, CouchDB only offers validation functions (which only have access to the incoming document, its previous version and userCtx) to prevent this, but most of the time this validation depends on the business logic. Is there any way to manage this scenario?
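For context, such a validation function in a design document looks roughly like this (the owner and score fields are made-up examples of business rules, not part of CouchDB):

function (newDoc, oldDoc, userCtx) {
  // only the document owner may change an existing document
  if (oldDoc && oldDoc.owner !== userCtx.name) {
    throw({ forbidden: 'only the owner can modify this document' });
  }
  // an example business rule: a score may never decrease
  if (oldDoc && newDoc.score < oldDoc.score) {
    throw({ forbidden: 'score cannot decrease' });
  }
}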
Anyway, if the user tampers with the "local" db, their modifications will be deleted when the sync occurs (if the remote db is the master).
Or, if you choose two-way sync, the master db is modified by the user...
In the PouchDB documentation I found that sync is between a local database and a remote CouchDB database. I'm trying to build a native application that has a local database for every user (many databases) and syncs to one remote database:
Let's say user 01 syncs to the remote database, then user 02 syncs to it too. I think it'll overwrite the first user's data. This is what I want:
// Default
user1 = ['data']
user2 = ['data']
remote = [user1 or user2]
// what i want
user1 = ['data']
user2 = ['data']
remote = [user1, user2, ....etc]
Replication relies on the sequence of the databases being synced. Thus, you only "overwrite" data when documents' _id values clobber one another. However, I suspect the next issue you'll need to contend with is keeping user data separate.
If you replicate all your users into a single database, every user will also receive every other user's data when they replicate back out of that single database. I'm not sure what your use case is, but that isn't generally how apps are structured. If you want to use a single database like this, you'll need to include some sort of tagging on your documents and use Filtered Replication.
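As a sketch, that tagging plus filtering could look something like this (the owner field and filter logic are assumptions, not PouchDB built-ins):

var local = new PouchDB('user_01_local');
var remote = new PouchDB('https://example.com/shared_db');

// pull only this user's documents out of the shared database
local.replicate.from(remote, {
  filter: function (doc) { return doc.owner === 'user_01'; }
});

// pushes go up unchanged; every doc written locally should carry its owner tag
local.replicate.to(remote);

// note: for pulls from a remote CouchDB, a filter defined in a design document
// (filter: 'app/by_owner' plus query_params) avoids transferring unwanted docs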
To keep your data segmented by user, you'll need to be diligent about using this parameter every time you sync. However, your database is probably exposed to the public internet if you're syncing to it, and the lack of document-level access controls means your savvy users will be able to see everyone else's data anyway.
My recommendation here is to give each user their own database, rather than replicating everything into a single hub database. With CouchDB 2.x, you have access to couch_peruser, which gives each registered user their own database automatically. (This requires registering your users with CouchDB as well, but honestly that's the best option for security on a publicly exposed server anyway.)
I have an Ionic app and a Parse.com backend. My users can perform CRUD functions on exercise programmes, changing every aspect of the programme, including adding, deleting and editing the exercises within it.
I am confused about when to save, when to call the server and how much data can be held in services / $rootScope?
Typical user flow is as below:
Create a Programme and Client (create both on the server and store the data in $localStorage).
The user goes to an edit screen where they can perform CRUD functions on all exercises within the programme. Currently I perform a server call on each function so it is synced to the backend.
The user may go back and select a different programme, downloading the data and storing it in localStorage again.
My question is: how can I ensure that my users' data is always saved to the server while offering them a responsive, fast user experience?
Would it be normal to have a timeout function that triggers a save periodically? On mobile, the number of calls to the server is quite painful over a poor connection.
Any ideas on full local / remote sync with Ionic and Parse.com would be welcome.
From my experience, the best way to think of this is as follows:
localStorage is essentially a cache layer, which, if up to date, is great because it can reduce network calls. However, it can be cleared at any time and should be treated as volatile storage.
Your server is your source of truth, and as such, should always be updated.
What this means is: for reads, localStorage is great; you don't need to fetch your data a million times if it hasn't changed. For writes, always trust your server for long-term storage.
The pattern I suggest is: on load, fetch any relevant data and save it to local storage. Any further reads should come from local storage. Edits should go directly to the server, and on success you can write those changes to localStorage. This way, if you have an error on save, the user can be informed, and/or you can use localStorage as a queue and keep retrying the post until it fully succeeds.
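A minimal sketch of that read-from-cache / write-through pattern (api.save, fetchAndCache and the storage keys are placeholders for your Parse calls):

function readProgramme(id) {
  var cached = localStorage.getItem('programme:' + id);
  return cached ? Promise.resolve(JSON.parse(cached)) : fetchAndCache(id); // fetchAndCache hits the server, then caches
}

function saveProgramme(programme) {
  return api.save(programme) // the server stays the source of truth
    .then(function () {
      localStorage.setItem('programme:' + programme.id, JSON.stringify(programme));
    })
    .catch(function () {
      // queue the failed write so it can be retried later
      var queue = JSON.parse(localStorage.getItem('pendingWrites') || '[]');
      queue.push(programme);
      localStorage.setItem('pendingWrites', JSON.stringify(queue));
    });
}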
This is called "offline sync" or sometimes "4-way data binding". The point is to cache data locally and sync it with a remote backend. This is a very common need, but the solutions are unfortunately not that common... The ideal flow would follow this philosophy:
save data locally
try to sync it with the server (performing auto merges)
and periodically sync, using a timer and maybe a "connection resumed" event
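A rough sketch of the periodic-sync step (syncPending is a placeholder for "push locally saved changes to the server, merging on conflict"):

setInterval(syncPending, 60 * 1000);             // timer-based retry
window.addEventListener('online', syncPending);  // "connection resumed" event

function syncPending() {
  if (!navigator.onLine) return;
  // read the locally queued changes and try to push them to the server
}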
This is very hard to achieve manually. I've been searching for modules for a long time, and the only ones that come to mind don't really fit your needs (because they often are backend providers that give you frontend connectors, and you already have an opinionated backend), but here they are anyway:
Strongloop's Loopback.io
Meteor
PouchDB
My app is built in PHP (served by nginx/php-fpm) and I use Node.js with socket.io for user push notifications. These are pushed using Redis pub/sub to interlink PHP and Node.js.
The Node.js app maintains an array of online user ids. They get added when a user connects to socket.io and removed from the array when they disconnect.
MySQL is used as the main database and I have a standard relationship table which denotes who is following whom. A list of follower user ids is retrieved when a user logs in and displayed to them.
I now wish to intersect these two sets of data to provide a live online status for these relationships, similar to Facebook (a green light for online and grey for offline).
What would be the most performant and most scalable way of managing this? My current thought process is along these lines:
On the client side we have a JavaScript array of followers' user ids. Set up a client-side timer which pushes this array to the Node.js app every 60 seconds or so. Node.js handles intersecting the follower ids with its current array of online users and returns an object depicting which users are online.
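Roughly what I have in mind on the Node.js side (event names are placeholders; keeping the online users in a Set would make each lookup constant-time):

const onlineUsers = new Set(); // maintained on connect/disconnect

io.on('connection', (socket) => {
  socket.on('whoIsOnline', (followerIds, ack) => {
    // followerIds comes from the client's 60-second timer
    const online = followerIds.filter((id) => onlineUsers.has(id));
    ack({ online }); // e.g. { online: [12, 57, 903] }
  });
});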
Now this would work, but it feels like it might be a heavy load on Node.js to be constantly looping through users' follower lists for every online user. Or perhaps I am wrong and this would be relatively trivial, considering the main application itself is served by PHP and not by Node, which currently only handles notification pushing?
Regardless, is there a better way? It's worth noting that I also use Redis to build users' activity streams (the data is stored in MySQL and Redis maintains lists of activity ids).
So seeing as I already have a Redis server active, would there be a better method leveraging Redis itself instead?
Thanks for any advice
If I remember right, when socket.io is connected to the client it periodically pings the client to check that the connection is still active and returns the result. In the success callback you can put code that updates the user's last-active time in the DB. Then you can fetch from the DB the users whose last activity is within the last 5 minutes and treat the rest as offline.
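A sketch of that heartbeat idea (the query, the column names and the db handle are assumptions, e.g. a pool from the mysql module):

io.on('connection', (socket) => {
  const userId = socket.handshake.query.userId;
  // refresh the user's last-active timestamp while the socket stays connected
  const timer = setInterval(() => {
    db.query('UPDATE users SET last_active = NOW() WHERE id = ?', [userId]);
  }, 60 * 1000);
  socket.on('disconnect', () => clearInterval(timer));
});

// a follower counts as online if they were active within the last 5 minutes:
// SELECT id FROM users WHERE last_active > NOW() - INTERVAL 5 MINUTE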