I have an Express server that handles all of my routing and sessions. I want the system to work so that when a user is logged in, they can connect to a "hub"-like entity that is unique to the hub's location. I thought about modeling each "hub" as a collection in a database, but the flow is that a user connects to a hub, disconnects when they are done, and can connect to different hubs based on location. How should I go about creating a unique group of these "hub"-like things that all act as objects with storable data?
Instead of connecting to a "hub", why not just present users with different information from a database based on their location? The user will never really connect to anything other than your single backend. Unless, of course, you set up different servers all over the world (known as a CDN, and probably a bit too much effort).
If you're using Express, you could use something like MongoDB for data storage.
With MongoDB you can use the mongoose npm package, which lets you create Schemas. You could treat each Schema (backed by its own collection) as a "hub" and load the correct one based on location data. That would let the user see different information for different locations.
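As a minimal sketch of that idea with Express and mongoose, one collection per hub location (all names here are illustrative, not from the question):

const express = require('express');
const mongoose = require('mongoose');

// One shared schema; each location gets its own collection ("hub")
const hubSchema = new mongoose.Schema({
  user: String,
  payload: mongoose.Schema.Types.Mixed,
}, { timestamps: true });

const hubModels = {};
function hubFor(location) {
  // Compile (and cache) a model whose backing collection is named after the location
  if (!hubModels[location]) {
    hubModels[location] = mongoose.model(`Hub_${location}`, hubSchema, `hub_${location}`);
  }
  return hubModels[location];
}

const app = express();
app.get('/hub/:location', async (req, res) => {
  const docs = await hubFor(req.params.location).find().limit(50);
  res.json(docs);
});

mongoose.connect('mongodb://localhost/hubs').then(() => app.listen(3000));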
I'm using AngularJS on the front end and the Laravel framework on the back end. My database is MySQL.
I need to store all users' logs. Since I don't want to store the logs in MySQL, I chose MongoDB.
I don't want to use Laravel to store the logs; instead, I want to use Node.js.
At a glance:
Laravel + MySQL: to store data
Node.js + MongoDB: to store users' logs
My question:
I want to send users' logs from AngularJS to Node.js and store them in MongoDB.
I think most systems store user logs from the server side (here, Laravel to MongoDB or MySQL), but I want to send the logs from the front end to Node.js instead. Of course, the connection between AngularJS and Node.js uses a hashing method.
What are the advantages and disadvantages of this method?
I have built a framework that logs all kinds of user actions from the front-end JS app via an AJAX call. For performance reasons it is based on asynchronous events and buffers actions until some number are queued (generally 10) or a window close event fires. The data is stored in a back-end database (ours is a fixed SQL schema, so we normalize all logs to a specific format).
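A minimal sketch of that buffering pattern in the browser (the /api/logs endpoint and the batch size are assumptions, not the poster's actual framework):

// Buffer log events client-side; flush every 10 events or when the page closes
const buffer = [];
const FLUSH_AT = 10;

function logAction(type, details) {
  buffer.push({ type, details, ts: Date.now() });
  if (buffer.length >= FLUSH_AT) flush();
}

function flush() {
  if (buffer.length === 0) return;
  const body = JSON.stringify(buffer.splice(0, buffer.length));
  // sendBeacon survives page unload; fall back to fetch otherwise
  if (!navigator.sendBeacon('/api/logs', body)) {
    fetch('/api/logs', { method: 'POST', body, keepalive: true });
  }
}

window.addEventListener('pagehide', flush);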
We have found this a very good debugging and auditing tool. It does consume a modest amount of bandwidth and we only have a few hundred simultaneous users, but I would think with a good server design it could scale well (we use two sets of servers in two data centers).
It does help to classify the "actions": we have nav actions for focus changes, api actions for other AJAX calls, error logs for anything unusual, input actions for form data edits, etc. We always log a session id, the user id, a timestamp, and so on. The best part is that the framework does it all, and the app writer does not need to think about it unless they explicitly want to call a logging function.
Your mileage will vary based on your requirements/environment, but it works very well for us.
In the PouchDB documentation I found that sync happens between a local database and a remote CouchDB database. I'm trying to build a native application that has a local database for every user (many databases), all syncing to one remote database:
Let's say user 01 syncs to the remote database, and then user 02 syncs to it too. I think it'll overwrite the first user's data. This is what I want:
// Default
user1 = ['data']
user2 = ['data']
remote = [user1 or user2]
// What I want
user1 = ['data']
user2 = ['data']
remote = [user1, user2, ....etc]
Replication relies on the sequence of the databases being synced, so data is only "overwritten" when two documents' _id values clobber one another. However, I suspect the next issue you'll need to contend with is keeping user data separate.
If you replicate all your users into a single database, every user will also receive every other user's data when they replicate back out of that single database. I'm not sure what your use case is, but that isn't generally how apps are structured. If you want to use a single database like this, you'll need to add some sort of tag to your documents and use filtered replication.
To keep your data segmented by user, you'll need to be diligent about using this filter every time you sync. However, your database is probably exposed to the public internet if you're syncing to it, and the lack of document-level access controls means your savvy users will be able to see everyone else's data anyway.
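A minimal sketch of that filtered replication with PouchDB, assuming each document is tagged with a user field (the field name and URL are illustrative):

const PouchDB = require('pouchdb');

const local = new PouchDB('user1_local');
const remote = new PouchDB('https://db.example.com/shared_hub'); // hypothetical URL

// The filter applies to both directions of the sync, so only
// documents tagged with this user's id pass through.
local.sync(remote, {
  live: true,
  retry: true,
  filter: (doc) => doc.user === 'user1',
});

Note that this keeps honest clients apart but is not access control: anyone who can reach the database can simply sync without the filter.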
My recommendation here is to give each user their own database rather than replicating everything into a single hub database. With CouchDB 2.x you have access to couch_peruser, which automatically gives each registered user their own database. (This requires registering your users with CouchDB as well, but honestly it's the best option for security on a publicly exposed server anyway.)
I am trying to implement client-to-client messaging in my app using socket.io, Node.js, and Android.
I searched a little and found a lot of tutorials explaining how to target a specific client when sending messages through a socket.io socket:
Send message to specific client with socket.io and node.js
The solution is almost always the same: create a hashmap object linking user info, such as a username or email address (anything unique that identifies the user), to its socket id.
Then call io.clients[sessionID].send().
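In current socket.io versions, the same idea looks roughly like this (a sketch only; the auth payload and event names are assumptions, not taken from the linked answer):

const { Server } = require('socket.io');
const io = new Server(3000);

// userId -> socket.id, held in memory on this single instance
const users = new Map();

io.on('connection', (socket) => {
  // Assumes the client sends its user id in the handshake auth payload
  const userId = socket.handshake.auth.userId;
  users.set(userId, socket.id);

  socket.on('private message', ({ to, text }) => {
    const targetId = users.get(to);
    if (targetId) io.to(targetId).emit('private message', { from: userId, text });
  });

  socket.on('disconnect', () => users.delete(userId));
});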
Now I have two questions:
This would work if only one instance of the app is running, but imagine my app is split across multiple instances (for a large app).
What if client A, connected to instance X, wants to send a message to user B, connected to instance Z? If, as in the example, socket ids are stored in a plain object inside the script, some instances won't know about users connected to other instances.
If I am totally wrong (and I might be), is it good practice to store all users' socket ids in a single variable? If yes, would it still be okay in a 50,000+ user environment? If no, should I find another solution, like storing users' socket ids in a database?
You could use a Redis instance shared between all your app instances, killing two birds with one stone.
Redis would store all your socket ids in a centralized place.
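A minimal sketch of that setup with the official Redis adapter for socket.io (the key naming and auth field are my assumptions):

const { createClient } = require('redis');
const { createAdapter } = require('@socket.io/redis-adapter');
const { Server } = require('socket.io');

const io = new Server(3000);
const pub = createClient({ url: 'redis://localhost:6379' });
const sub = pub.duplicate();

Promise.all([pub.connect(), sub.connect()]).then(() => {
  // The adapter broadcasts emits across instances via Redis pub/sub
  io.adapter(createAdapter(pub, sub));

  io.on('connection', async (socket) => {
    const userId = socket.handshake.auth.userId; // assumption
    await pub.set(`socket:${userId}`, socket.id);

    socket.on('private message', async ({ to, text }) => {
      const targetId = await pub.get(`socket:${to}`);
      // io.to() reaches sockets on any instance thanks to the adapter
      if (targetId) io.to(targetId).emit('private message', { from: userId, text });
    });

    socket.on('disconnect', () => pub.del(`socket:${userId}`));
  });
});

With this, the in-memory map disappears entirely: the user-to-socket mapping lives in Redis, and the adapter routes emits to whichever instance holds the target socket.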
I'm already familiar with the fact that MongoDB stores documents in a JSON-like format. I'm creating my first web app using the MEAN stack, where users can register and then sign in to a back-end dashboard where they can manage products, profile information, etc. How would I set that up in MongoDB? Would each user be stored as a document? And as far as security goes, how can I prevent a GET request from returning a different user's information?
Currently, I just have a collection of users and a collection of products (with a unique id for each user), etc. To me that doesn't seem like the proper way to store data.
If anyone can help me with how to set up the database for Mongo, that would be fantastic! Thanks in advance!
would each user be stored as a document?
Yes, each user is an object, thus it's stored as a separate document.
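For illustration, a document in the users collection might look something like this (all field names are assumptions):

{
  _id: ObjectId("..."),        // generated by MongoDB
  email: "jane@example.com",
  passwordHash: "$2b$10$...",  // store a hash, never the plain-text password
  profile: { name: "Jane", company: "Acme" }
}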
how can I go about not allowing a GET request to be able to get a different users information?
This has nothing to do with Mongo or any other data storage mechanism. You'll need to work on a web service layer that authorizes each data request based on user roles, claims, or whatever authorization approach you find useful in your scenario.
Maybe you should look at implementing OAuth2. There's a package that integrates with Express to implement your own OAuth2 authorization server: node-oauth2-server.
Currently, I just have a collection of users, and a collection of products (with the unique id number for each user), etc. to me that doesn't seem the proper way to store data.
You are actually on the right track. When you show the products page, retrieve only the documents that belong to the currently authenticated user; this implies that you have authentication set up and that each product has a userId field.
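A minimal sketch of such a route in Express with mongoose (requireAuth, the model path, and req.user are assumptions about your setup):

const express = require('express');
const Product = require('../models/product'); // hypothetical model with a userId field
const router = express.Router();

// requireAuth is a placeholder for whatever middleware populates req.user
router.get('/products', requireAuth, async (req, res) => {
  // Filter by the authenticated user's id so nobody can fetch another user's products
  const products = await Product.find({ userId: req.user.id });
  res.json(products);
});

module.exports = router;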
This is how most applications work. If you want to read about other ways of solving this, read about multi-tenancy.
My app is built in PHP (served by nginx/php-fpm), and I use Node.js with socket.io for user push notifications. These are pushed using Redis pub/sub to link PHP and Node.js.
The Node.js app maintains an array of online user ids. Ids are added when a user connects to socket.io and removed from the array when they disconnect.
MySQL is used as the main database, and I have a standard relationship table which denotes who is following whom. A list of follower user ids is retrieved when a user logs in and is displayed to them.
I now wish to intersect these two sets of data to provide a live online status for these relationships, in a similar manner to Facebook (a green light for online, grey for offline).
What would be the most performant and scalable way of managing this? My current thought process is along these lines:
On the client side we have a JavaScript array of follower user ids. Set up a client-side timer which pushes this array to the Node.js app every 60 seconds or so. Node.js handles intersecting the follower ids with its current array of online users and returns an object describing which users are online.
Now this would work, but it feels like it might put a heavy load on Node.js to constantly loop through the follower lists of every online user. Or perhaps I am wrong and this would be relatively trivial, considering the main application itself is served by PHP, and Node currently only handles notification pushing?
Regardless, is there a better way? It's worth noting that I also use Redis to build users' activity streams (the data is stored in MySQL and Redis maintains lists of activity ids).
So, seeing as I already have a Redis server active, would there be a better method leveraging Redis itself?
Thanks for any advice
If I remember right, socket.io periodically pings each connected client to check that the connection is still alive. In the success callback you can update that user's last-active timestamp in the DB, and then treat users whose timestamp is older than about 5 minutes as offline.
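Since the question asks about leveraging the existing Redis instance, here is a minimal sketch of an alternative: keep a Redis set of online user ids, and answer "which of these followers are online?" with set-membership checks (the key and event names are assumptions):

const { createClient } = require('redis');
const { Server } = require('socket.io');

const redis = createClient();
const io = new Server(3000);

redis.connect().then(() => {
  io.on('connection', async (socket) => {
    const userId = socket.handshake.auth.userId; // assumption
    await redis.sAdd('online_users', userId);

    // The client sends its follower id array (e.g. on that 60-second timer);
    // we reply with the subset that is currently online.
    socket.on('who is online', async (followerIds, reply) => {
      const flags = await Promise.all(
        followerIds.map((id) => redis.sIsMember('online_users', id))
      );
      reply(followerIds.filter((_, i) => flags[i]));
    });

    socket.on('disconnect', () => redis.sRem('online_users', userId));
  });
});

This keeps the presence data out of Node's process memory, so it survives restarts and works across multiple Node instances.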