I have a web application that mirrors the content of a list held on my server. To do that I use websockets (socket.io) to listen for update messages from the server.
After getting a good first snapshot of the list, it receives update events like {'action': 'changed', 'type': 'typeA', 'id': 1}; the page can then make a request to http://server.com/api/typeA/1 and insert, delete, or replace the updated item in the model.
The problem is that if an update event occurs while my websocket connection is still being established, the system misses it and falls behind. Or, if the first snapshot is requested after the connection event fires, that request may complete after some update has already been signaled, so the new value may be overwritten by a stale one.
Is there a library that does for a server written in Java what Meteor's DDP does for publishing a generic DB?
We came across many distributed-data mechanisms and ended up choosing a data-sync strategy based on deepstream.io, which implements the features we wanted for cloning a collection from the server and pushing updates on demand, and which has good, well-supported client libraries for both JavaScript and Java.
It's worth giving it a try.
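For illustration, here is roughly what the client side can look like with the deepstream.io JavaScript client (the record and list names are made up, and the exact API may differ slightly between versions):

    var deepstream = require('deepstream.io-client-js');
    var client = deepstream('localhost:6020');
    client.login();

    // A list record holds the ids of the items in the mirrored collection.
    var list = client.record.getList('typeA-items');

    list.subscribe(function (entryIds) {
        // Called with the full set of ids on load and whenever entries are
        // added or removed.
        entryIds.forEach(function (id) {
            var record = client.record.getRecord(id);
            record.subscribe(function (data) {
                // Update the local model with the current state of this item.
            }, true); // true = deliver the current value immediately
        });
    });

Because each subscription delivers the current state first and every later change, the snapshot-versus-update race described in the question largely goes away.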
Please take a look at https://github.com/Reactive-Extensions/RxJS
I think this is what you're looking for.
I've read a few StackOverflow posts related to this subject, but I can't find anything that specifically helps with my scenario.
We have multiple monitoring instances within our network, monitoring different environments (Nagios, Icinga, and more). Currently I have a poller script written in PHP which runs every minute via cron; it asks each instance to return all of its problems in JSON, and the script then interprets this and pushes it into a MySQL database.
There is then an 'overview' page which simply reads the database and does some formatting. There's a bit of AJAX involved: every X seconds (currently 30) it checks for changes (a PHP script call), and if there are changes it requests them via AJAX and updates the page.
There are a few other little bits too (click a problem and another AJAX request goes off to fetch the problem details to display in a modal, etc.).
I've always been a PHP/MySQL dev, so the above methodology seemed logical to me and was quick and easy to write, and it works 'ok'. However, the problems are: the database is constantly being polled by many users, and there's a mesh of JavaScript on the front end doing half the logic with PHP on the back end doing the other half.
Would this use case benefit from switching to Node.js? I've done a bit of Node.js before, but nothing like this. Can I subscribe to MySQL updates? Or trigger them when a 'data fetcher' pushes data into the database? I've always been a bit confused, because I use PHP to create data and JavaScript to 'draw' the page: is there still a split, with Node.js doing the logic and front-end JavaScript creating all the elements, or does Node.js do all of this now? Sorry for the lack of knowledge in this area...
This is definitely an area where Node could offer improvements.
The short version: with websockets on the front end and regular sockets or an API on the back end, you can eliminate polling for new data across the board.
The long version:
Front-end:
You can remove all need for polling scripts by implementing websockets. That way, as soon as new data arrives on the server, you can broadcast it to all connected clients. I would advise Socket.io or the Primus websocket wrapper. Both are very easy to implement and incredibly powerful for what you want to achieve.
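A minimal sketch of the broadcasting side with Socket.io (the port and event name are just examples):

    // server.js
    var http = require('http').createServer();
    var io = require('socket.io')(http);
    http.listen(3000);

    io.on('connection', function (socket) {
        console.log('client connected');
    });

    // Call this whenever fresh monitoring data arrives on the server.
    function broadcastProblems(problems) {
        io.emit('problems', problems); // every connected browser gets it instantly
    }

    // In the browser:
    //   var socket = io('http://yourserver:3000');
    //   socket.on('problems', function (problems) { /* update the page */ });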
All data processing logic should happen on the server. The data is then sent to the client and should be rendered on the existing page, and that is basically the only logic the client should contain. There are some frameworks that do all of this for you (e.g. Sails) but I don't have experience with any of those frameworks, since they require you to write your entire app according to their rules, which I personally don't like (but I know a lot of developers do).
If you want to render the data in the client without a huge framework, I highly recommend the lightweight but incredibly useful Transparency rendering library. Using this, you can format a JavaScript object on the server using Node, JSONify it, send it to the client, and then all the client has to do is de-JSONify it and call Transparency's .render.
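As a rough example of that flow, assuming the markup contains elements whose ids/classes match the keys of the objects you send (Socket.io already parses JSON for you, so the de-JSONify step is implicit here):

    // In the browser, building on the Socket.io sketch above:
    socket.on('problems', function (problems) {
        // Transparency matches object keys to element ids/classes inside #problem-list
        Transparency.render(document.getElementById('problem-list'), problems);
    });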
Back-end:
This one depends on how much control you have over the behaviour of the instances you need to check. I assume you have some control, since you can get all their data in a nice JSON format. So, there are multiple options.
You can keep polling every so often. This is the easiest solution, since it requires no change to the external services. The JavaScript setInterval function is very useful here. Depending on how you connect with the instances, you might be able to use a module like Request to do the actual requests, which takes a bunch more of the heavy lifting out of your hands.
The benefit of implementing the polling in your Node app as well is that you receive the data in your Node app and can immediately broadcast it to the clients, even before inserting it into the database. This will greatly reduce the number of queries on your database.
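A sketch of that polling loop, using setInterval and the Request module and reusing the io object from the Socket.io sketch above (the instance URLs are placeholders):

    var request = require('request');

    var instances = [
        'http://nagios.internal/api/problems',
        'http://icinga.internal/api/problems'
    ];

    setInterval(function () {
        instances.forEach(function (url) {
            request({ url: url, json: true }, function (err, res, problems) {
                if (err) { return console.error('poll failed for ' + url, err); }
                io.emit('problems', problems); // broadcast before touching the DB
                // ...then insert/update rows in MySQL as before
            });
        });
    }, 60 * 1000); // once a minute, like the current cron job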
An alternative to polling would be to set up a simple Express-based API where the applications can post their 'problems', as you call them. This way your application will get notified the moment a problem occurs, and combined with the websockets connection to the client this would result in practically real-time updates.
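That API could be as small as this, again assuming the io object from the Socket.io sketch earlier (the route and port are placeholders):

    var express = require('express');
    var bodyParser = require('body-parser');

    var api = express();
    api.use(bodyParser.json());

    // Monitoring instances POST a problem here the moment it occurs.
    api.post('/problems', function (req, res) {
        io.emit('problems', req.body); // forward to the browsers in real time
        // optionally persist the problem to MySQL here as well
        res.sendStatus(200);
    });

    api.listen(4000);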
For extra redundancy, you would keep a polling timer alongside the API, so that you can still check the instances in case something goes wrong and they stop sending data.
An alternative to the more high-level API would be to just use direct socket communication, which is basically the same approach only using a different set of functions.
Lastly, you could also keep the PHP-based polling script. This would be the least disruptive solution, since you wouldn't have to go and replace everything. Then, from the Node app that's connected to the clients with websockets, you could set an interval to query the database every so often and broadcast the updates. This still greatly reduces the number of queries: no matter how many clients are connected, there is only one query, the response of which then gets sent to all connected clients.
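A sketch of that single shared query, using the mysql module and the same io broadcaster as before (credentials, table, and interval are placeholders):

    var mysql = require('mysql');

    var db = mysql.createConnection({
        host: 'localhost',
        user: 'monitor',
        password: 'secret',
        database: 'monitoring'
    });
    db.connect();

    // One query every 30 seconds in total, no matter how many clients are connected.
    setInterval(function () {
        db.query('SELECT * FROM problems WHERE updated_at > NOW() - INTERVAL 30 SECOND',
            function (err, rows) {
                if (err) { return console.error(err); }
                if (rows.length) { io.emit('problems', rows); }
            });
    }, 30 * 1000);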
I hope my post has given you some ideas of how you could implement your application using Node. Keep in mind, though, that I am just one developer and this is how I would approach building your application in Node; there will definitely be others who have different opinions.
I am developing an application based on Mongo and Sails, and I am testing how realtime updates in Sails work.
I am using Sails 0.9.16 now, but I am also interested in answers about Sails 0.10.
I want a list to be updated when new documents are created in the corresponding collection. This works when I add documents via Sails sockets by sending a POST message: in that case the other clients receive a message and the list on the client side is updated.
There is an external service writing to the Mongo database though, so the collection is growing all the time. The new elements created directly in the database by the external service are not notified to listening clients, so I have to refresh the web page to show those elements.
Questions:
are notifications about database creations supposed to work, when those creations do not come from sails itself?
if yes, does this require some special configuration?
if not, what would be a recommended way to keep a client-side listing of a collection updated when the database changes?
Very interesting question, though not an unusual one: the folks at Meteor ran into the same problem. Basically, without watching the DB you can't even scale your app horizontally, since one server process has no idea what data changes were made by another one.
So, at first they sort of patched it by polling the DB every 10 seconds. :) Obviously that wasn't the best solution, so they ended up with another one (which can also work for Sails): they now tail the MongoDB oplog and fire an update whenever there's a change in the corresponding collection.
That said, to answer your questions:
AFAIK, a Sails process has no clue about any external changes made to the DB;
so, nothing to configure;
a way to track external DB (MongoDB) updates would be to use one of the oplog watchers you can find on npm (e.g. this or one of these, etc.) to listen for changes and trigger updates whenever there's a need; see the sketch below.
Unfortunately, there's no ready-to-use solution here, but I hope that at least you now have an idea of how to make it work.
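As a rough illustration of the oplog-watcher approach, here's a sketch with the mongo-oplog package (the database/collection names are made up, the exact API varies a bit between versions, and the oplog only exists when MongoDB runs as a replica set):

    var MongoOplog = require('mongo-oplog');

    // Tail the oplog of the local MongoDB instance, filtered to one collection.
    var oplog = MongoOplog('mongodb://127.0.0.1:27017/local', { ns: 'mydb.items' });
    oplog.tail();

    oplog.on('insert', function (op) {
        var doc = op.o; // the document that was inserted externally
        // Notify connected clients, e.g. via Item.publishCreate(doc) in Sails
        // or a plain socket broadcast.
    });

    oplog.on('update', function (op) { /* e.g. publishUpdate(...) */ });
    oplog.on('delete', function (op) { /* e.g. publishDestroy(...) */ });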
In a web application the user can perform some tasks that I need to send to the server asynchronously. Basically, this is really easy, but now I would like it to also work properly in offline mode.
My idea is to use a client-side queue, and transfer elements from that queue to the server if the network connection is available.
I could use PouchDB, but I don't need all the tasks on the client side, so I don't want a full client-side database containing all the elements the server has. I only need some kind of queue: put an item in, try to send it to the server, and if that works, dequeue it; otherwise try again after a short pause.
How could I implement this? Is there something like RabbitMQ (conceptually!) available for browsers? A queue on top of the browser's built-in database? Something like that?
Or can this issue be solved using PouchDB?
PouchDB does support one-way replication (just do clientDb.replicate.to("http://server/")), so if you are already running CouchDB on your server, it might be a quick and easy way to implement a task-queue type of system.
You will probably want to use a filter on your replication, because when you 'dequeue' (delete) a task from the client-side db, you probably don't want to replicate that delete to the server :) This answer is specific to CouchDB, but it should work with PouchDB too, since PouchDB supports filtered replication: CouchDB replicate without deleting documents.
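Roughly, the queue-via-replication idea could look like this (the server URL is a placeholder, and PouchDB is assumed to be loaded in the page):

    var tasks = new PouchDB('tasks'); // local queue of pending tasks

    // One-way, continuous replication to the server; 'retry' keeps retrying
    // automatically once the network comes back.
    tasks.replicate.to('http://server.example.com/tasks', {
        live: true,
        retry: true,
        filter: function (doc) {
            // don't push local deletions ("dequeues") back to the server
            return !doc._deleted;
        }
    });

    // Queueing a task is just a write; replication handles the rest.
    tasks.post({ type: 'task', action: 'rename', payload: { id: 42 } });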
That said, using PouchDB like this seems a little awkward, and the full replication system might be more overhead than is necessary for simple task queueing. It depends on the needs of your app, though, and the exact nature of the tasks you are queueing! It could be as simple as an array that you push tasks into and periodically check, popping or shifting tasks off the array and sending them to the server.
There's also async.queue, which is commonly used in node.js but also works in the browser (this queue is not backed by any type of storage, but you could add persistent storage using PouchDB or another client-side db).
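A small sketch of that simpler approach with async.queue (the endpoint URL is made up, the retry delay is arbitrary, and the browser's fetch is used here just as an example HTTP call):

    var async = require('async'); // also usable in the browser via a script tag

    // One worker, so tasks are sent to the server strictly in order.
    var queue = async.queue(function (task, done) {
        fetch('/api/tasks', {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify(task)
        }).then(function (res) {
            if (!res.ok) { throw new Error('server rejected task'); }
            done();
        }).catch(function () {
            // offline or server error: put the task back and retry later
            setTimeout(function () { queue.push(task); }, 5000);
            done();
        });
    }, 1); // concurrency 1 keeps tasks in order

    queue.push({ action: 'rename', id: 42 });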
I'm programming a browser application (html5+websockets+css3+js preferred) that enables users to concurrently access (read, write) attributes of the same object. To create a real-time experience I'd like to use optimistic synchronization. I read about Timewarp and Trailing State algorithms and I wonder if there is a javascript library, which already implements these or similar algorithms.
I found this question, but unfortunately it has not been answered yet. XSTM seems to support only pessimistic synchronization.
Do you have any idea for me?
I am working on a realtime HTML5 web browser application now too. Maybe my choice of weaponry can inspire you... who knows. So I am using:
Frontend:
KnockoutJS - it takes care of displaying the data which I send to every connected client as JSON (view models). You can easily subscribe to changes in the client data and push the changes back to the server (see the sketch after this list), though I am having problems displaying pages with KnockoutJS on mobile browsers
on the server side I run a custom-made server based on Fleck
Since JSON is my favourite data format, I ditched SQL databases in favour of RavenDB, which stores data almost exactly as they are sent via the websocket protocol, and it is also pretty quick
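For the KnockoutJS part mentioned above, the subscribe-and-push-back pattern looks roughly like this (the websocket URL and the view-model fields are examples):

    // View model that Knockout binds to the page.
    var viewModel = {
        items: ko.observableArray([]),
        status: ko.observable('')
    };
    ko.applyBindings(viewModel);

    var ws = new WebSocket('ws://myserver:8181'); // the Fleck server endpoint

    // The server pushes JSON view-model updates; Knockout re-renders automatically.
    ws.onmessage = function (msg) {
        var data = JSON.parse(msg.data);
        viewModel.items(data.items);
        viewModel.status(data.status);
    };

    // Subscribe to client-side changes and push them back to the server.
    viewModel.status.subscribe(function (newValue) {
        ws.send(JSON.stringify({ status: newValue }));
    });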
Instead of creating every websocket connection and defining the entire message structure by hand, is there a library that will let me run a function on the Node.js server that can call a related function on all clients connected to the server simultaneously? Likewise, can I securely call a server function FROM the client browser? I feel like every time I have to construct a command to send over the websocket, I'm working at the transmission layer instead of the application layer, and I want to be thinking at the higher layer the entire time.
I wouldn't mind building something like this myself if it doesn't already exist, but I have a hard time believing this isn't solved on node already.
What you are really looking for is a Node.js RPC solution. Here are a couple of Node.js RPC options:
DNode - shows some good examples.
BERT-RPC
nowjs
I have not personally used them, but they look like they have good potential.
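To give an idea of the style, a DNode sketch looks roughly like this (node-to-node shown here; the exposed function name is made up, and for the browser you'd pair DNode with something like shoe/browserify):

    // server.js - expose functions that connected clients can call directly
    var dnode = require('dnode');

    var server = dnode({
        getStats: function (name, cb) {
            cb(null, { name: name, uptime: process.uptime() });
        }
    });
    server.listen(5004);

    // client.js - call the remote function as if it were local
    dnode.connect(5004, function (remote) {
        remote.getStats('web-01', function (err, stats) {
            console.log(stats);
        });
    });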
Take a look at now.js.
Try msg-rpc; it provides just the RPC support you need, and it has no particular requirements on the websocket library you already use on the server or client. You just tell it how to send a message out and forward it the messages you receive, and that's all.