I have a fully working Jersey/REST application server on an embedded device and am in the process of converting it to websockets with atmosphere-jersey to make it available through a firewall. I've just run into some design questions.
I have around 125 different REST endpoints. I set up a websocket to a few of them and transferred data back and forth, similar to REST but with live pushes. Since I built a socket with a subscriber for each endpoint, does this mean I'm actually maintaining a websocket on the browser side for each endpoint I connect to? Or is the browser smart enough to hold a single socket open to the same domain and multiplex requests to each endpoint over it? If I am maintaining a lot of websockets, is there a particular strategy for doing all communication with multiple endpoints over a single websocket?
My project is also going to require an intermediary service to match up a login to a device of registered socket listeners. Is there a container that takes care of matching logins to a websocket broker, which I could host alongside my own web services (it must be free)? Since all of my backend services look like REST, I don't want to have to subscribe each endpoint to the intermediary; so I'm wondering whether I need to set up a single websocket broker to handle the traffic and push it out to the endpoints, or whether the atmosphere-jersey service is smart enough to handle this.
Edit: added a design question:
In order to communicate between a web browser and a backend server over a single websocket interface: is there a clean and easy way to generate a POJO for each receiving broker, or will I have to do a JSON conversion as the first step in each class that receives an object? If I build a JavaScript message with some sort of key to identify the broker, then I could map the key to a class and generate a POJO to pass the object back to the handler, but this seems a bit clunky and tightly coupled.
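One common pattern for the single-socket question above is to wrap every message in a small JSON envelope carrying a routing key, and to keep a key-to-handler map on the receiving side. Here's a minimal sketch of that idea in plain JavaScript; the envelope shape, the `broker` field and all names are my own illustration, not part of Jersey or Atmosphere:

```javascript
// Sketch: multiplex many logical endpoints over ONE websocket by wrapping
// every message in a JSON envelope with a routing key ("broker" here is a
// hypothetical field name, not a framework convention).

const handlers = {}; // routing key -> handler function

function register(brokerKey, handler) {
  handlers[brokerKey] = handler;
}

// Build the envelope the browser would send over the single socket.
function envelope(brokerKey, payload) {
  return JSON.stringify({ broker: brokerKey, payload: payload });
}

// On the receiving side: parse once, then dispatch to the mapped handler,
// so each handler gets a plain object instead of raw JSON.
function dispatch(rawMessage) {
  const msg = JSON.parse(rawMessage);
  const handler = handlers[msg.broker];
  if (!handler) {
    throw new Error('No handler registered for broker: ' + msg.broker);
  }
  return handler(msg.payload);
}
```

On the Java side, the same key could select the target class for the JSON-to-POJO conversion (e.g. with a mapper like Jackson), so the parsing happens once in the dispatcher rather than as a first step in every handler class.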
There are millions of tweets and millions of active users on Twitter. When a tweet gets a like or a retweet, how do they send live updates (over websockets) for every tweet to its clients?
I think they wouldn't send live updates for every tweet to every active user; that would result in (number of active tweets) × (number of active users) = millions × millions > 10^12 live updates per minute, and each user would receive millions of updates (for all tweets) every minute.
I think the live updates for a particular tweet are only received by the users who are watching that particular tweet. If this assumption is correct, then please tell me: how do they filter the clients who are watching a particular tweet and send live updates of that tweet only to those filtered clients?
I was just watching a tweet on Twitter and was surprised to see its likes and retweets update live. I haven't seen any other social media (like Instagram) give live updates for every single post. I want to implement this on my own social media website. What I've concluded might or might not be correct, but I would ask you to explain how Twitter sends live updates of every single tweet only to the particular users who are watching it.
To be clear, ONE device has ONE socket connection, to Twitter's cloud.
That ONE socket connection receives ALL information from Twitter's cloud:
new tweets
new likes
new retweets
everything else
all information comes on the ONE socket.
The cloud "figures out" what to send to who.
Is this what you were asking? Hope it clears it up.
The amazing thing is that Twitter's cloud can hold connections to perhaps 100 million devices at the same time. (This is a major engineering achievement which requires an incredible amount of hardware, money and engineers.)
BTW, if you're trying to implement something like this for an experiment or a client: these days it is inconceivable that you'd write the server side for this from scratch. Services exist which do exactly this - for example pusher.com, pubnub.com and so on.
(Indeed, these realtime infrastructure services are the basic technology of our era - everything runs on them.)
Here's a glance at the mind-boggling effort involved in Twitter's cloud: https://blog.twitter.com/engineering/en_us/topics/infrastructure/2017/the-infrastructure-behind-twitter-scale.html
Realtime communication, or what you refer to as 'live updates', comes down to various low-level networking protocols. Here's a bit of background on the protocols in general, just so you know what you are working with:
A regular REST API uses HTTP as the underlying communication protocol, which follows the request-response paradigm: the client requests some data or resource from a server, and the server responds to that client. This is what you usually see on a regular website that isn't really live but shows or does something following a button click or a similar trigger from the user.
However, HTTP is a stateless protocol, so every request-response cycle ends up repeating the header and metadata information. This incurs additional latency when request-response cycles are frequently repeated.
With WebSockets, although the communication still starts off as an initial HTTP handshake, the connection is then upgraded to follow the WebSockets protocol (provided both the server and the client are compliant, as not all entities support WebSockets).
Now with WebSockets, it is possible to establish a full-duplex and persistent connection between the client and a server. This means that unlike a request and a response, the connection stays open for as long as the application is running (i.e. it’s persistent), and since it is full-duplex, two-way simultaneous communication is possible. Now the server is capable of initiating communication and 'push' some data to the client when new data (that the client is interested in) becomes available.
The WebSockets protocol is stateful and allows you to implement the Publish-Subscribe (or Pub/Sub) messaging pattern, which is the primary concept used in real-time technologies: you get new updates as a server push, without the client having to request them (refresh the page) repeatedly. Examples of such applications other than Twitter are Uber-like vehicle location tracking, push notifications, stock market prices updating in real time, chat, multiplayer games, live online collaboration tools, etc.
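To make the Pub/Sub idea concrete, here is a minimal in-memory sketch in JavaScript of how a server could route a tweet's updates only to the clients currently subscribed to it. The channel naming (`tweet:42`) and all function names are illustrative, not any particular library's API:

```javascript
// Minimal in-memory publish/subscribe sketch: updates for a tweet only reach
// the clients that subscribed to that tweet's channel.

const channels = new Map(); // channel name -> Set of subscriber callbacks

function subscribe(channel, callback) {
  if (!channels.has(channel)) channels.set(channel, new Set());
  channels.get(channel).add(callback);
  // Return an unsubscribe function, e.g. for when the viewer leaves the page.
  return function unsubscribe() {
    channels.get(channel).delete(callback);
  };
}

function publish(channel, update) {
  const subs = channels.get(channel);
  if (!subs) return 0; // nobody watching: nothing is sent
  subs.forEach(function (cb) { cb(update); });
  return subs.size; // how many clients received the push
}
```

In a real deployment each callback would be a send over that client's websocket connection, and the subscription map would be backed by something distributed, but the filtering principle is the same: an update is only fanned out to that channel's subscribers, never to every active user.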
You can check out a deep dive article on WebSockets which explains the history of this protocol, how it came into being, what it’s used for and how you can implement it yourself.
Another interesting one is SSE, or Server-Sent Events, which is effectively a subscribe-only counterpart to WebSockets, restricted to the web platform. You can use SSE to receive real-time push updates from a server, but it is unidirectional: you can only receive updates, not publish anything. Here's a video where I explain this in much more detail: https://www.youtube.com/watch?v=Z4ni7GsiIbs
You can implement these various protocols from scratch as required, or use a distributed messaging service like Ably, which not only provides the messaging infrastructure for these protocols but also offers add-ons such as scalability, reliability, message ordering, protocol interoperability, etc., out of the box, which is essential for a production-level app.
Full disclosure: I'm a Dev Advocate for Ably, but I hope the info in my answer is useful to you nevertheless.
Background:
I am building a React application using AWS Cognito, DynamoDB and S3. The application is based in the recruitment sector, where employers and employees can post and view jobs respectively. When an employee applies for a job, the employer can view the employee's profile and decide whether or not to message them. Employees and employers converse via an on-site messaging service.
The Question:
What is the best method to facilitate user chat?
i.e. what is a nice and efficient way to store messages and notify users when they have a new message?
Our current approach is to have a setTimeout() on the site and check for new messages, but this will be very inefficient, so I'm looking for some guidance.
I would like to stay inside the amazon infrastructure as much as possible but I am open to all suggestions.
I'm currently building something similar for a startup I'm working at. Our React app is served by a node.js server, while the API backend is provided by a Django API with DRF (Django REST Framework). As in your user-chat case, we need to handle some real-time data arriving in the frontend.
Our approach
The solution can be split into inter-server and server-to-browser realtime communication:
We use redis (AWS ElastiCache, to be exact) as a publish/subscribe message queue to push incoming data from the API backend to the node.js server. Specifically, whenever an instance of the model in question is created by an HTTP POST call (in your case a message, which is sent to the server), we publish JSON-serialized information on a channel specific to the actors concerned.
On the node.js servers, we subscribe to the channels of interest and receive information from the backend in real time. We then use socket.io to provide a websocket connection to the frontend, which is easily integrated with React.
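As a rough sketch of the publish side described above, the channel naming and message shape might look like the following. Both are assumptions of mine, not a fixed convention, and the actual redis publish call is left as a comment since it depends on your client library:

```javascript
// Sketch of the publish side: derive a deterministic channel name for a pair
// of chat participants, so the employer's and the employee's sessions end up
// subscribed to the same channel. Names and message fields are hypothetical.

function channelFor(userA, userB) {
  // Sort so channelFor('alice', 'bob') === channelFor('bob', 'alice').
  return 'chat:' + [userA, userB].sort().join(':');
}

// JSON-serialize the message exactly as it would be published on the channel.
function serializeMessage(from, to, body, sentAt) {
  return JSON.stringify({ from: from, to: to, body: body, sentAt: sentAt });
}

// In the HTTP POST handler that creates a message, the API backend would
// then do roughly:
//   redisClient.publish(channelFor(from, to),
//                       serializeMessage(from, to, body, Date.now()));
// and the node.js server, subscribed to that channel, parses the payload and
// forwards it to the matching socket.io room.
```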
Limitations of this approach
You cannot simply serve your React app as a static website from S3; you have to rely on a node-plus-React approach. react-boilerplate (by Max Stoiber, I think) is a great way to start.
What's more, you can also use websockets end to end. We use this approach as our data source isn't a browser but a constrained device.
Hope that helps!
Case scenario:
I have around 2k concurrent users accessing the website from various devices, but all through their browsers. When one of them creates a new topic, all others currently connected should receive a notification (basically I simply update a little icon number in the app's upper right corner).
One way to accomplish this is to have the web app keep requesting updates via ajax calls, but that would overload my slow server with numerous requests.
I use Azure to host my web app (written in PHP). There are some services included in my hosting package, such as Event Hub, Service Bus, etc. Which service could I use to have my backend talk to a "service" whenever there is a new post, and then have that "service" talk to my clients (their browsers), informing them about the new notification or any other data updates?
You're probably looking for websockets. A websocket sets up a connection between the page in the client's browser and your web server. Through this connection you can push new topics to all connected clients.
It is advisable to decouple the websocket sending process from the request handling of the topic creation. For this you need a background worker that sends websocket notifications when triggered by a processing event.
You can implement this in PHP using Ratchet.
I'm creating an app where the server and the clients will run on the same local network. Is it possible to use websockets, or more specifically socket.io, to have one central server and many clients running native apps? The way I understand socket.io, the clients read web pages served from the server, but what happens when your clients become tablet devices running native apps instead of web pages in a browser?
The scenario I'm working with at the minute has one central server running a MEAN app, and the clients (iPads) making GET requests for the data available on the server. However, I'd also like there to be real-time functionality, so that if someone triggers a POST request on their iPad, the server acknowledges it and displays it on the server's client side. The iPad apps will (ideally) be running native PhoneGap applications rather than accessing 192.168.1.1:9000 from their browser.
Is it technically possible to connect to the socket server from the native apps, or would the devices have to send POST requests to a central server that's constantly listening for new 'messages'? I'm totally new to the whole real-time stuff, so I'm just trying to wrap my head around it all.
Apologies if this isn't totally clear, it's a bit hard to describe with just text but I think you get the idea?
Correct me if I am wrong.
You have multiple iPads running a native app. They send a POST request to your node.js server, which is running on a computer on the same local network. Whenever the server receives a request from an app, you want your computer screen to display that a request has been received.
If my assumptions about the scenario are correct, then it is fairly easy to do. Here are the steps:
Create a small webpage (front end). Load socket.io in the front-end page like this:
<script type="text/javascript" src="YOUR_SERVER_IP/socket.io/socket.io.js"></script>
Then connect to the server using var socket = io(). This should trigger the connection event in your backend.
Handle all POST requests from the apps normally. Nothing special. Just add a small snippet in between: socket.emit('new_request', request_data). This sends a new_request event to the front end.
Handle the new_request event in your front end using socket.on('new_request', function(request_data) { ... });. That's it. No need to add anything to your native app for realtime updates.
The second step is a little complicated, as it is necessary to make the socket variable available inside all POST request handlers. Since you chose node.js, I don't think you need any help with that.
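One simple way to handle that last point is to build each POST handler from a factory that closes over an emit function, so the handler itself never touches the socket directly. A sketch, with hypothetical handler and event names matching the steps above:

```javascript
// Sketch: make the socket's emit available inside POST handlers by closing
// over it, instead of reaching for a global. Handler name is hypothetical.

function makePostHandler(emit) {
  return function handlePost(requestData) {
    // ... normal request processing would happen here ...
    emit('new_request', requestData); // push the event to the front-end page
    return { status: 'ok' };
  };
}
```

With socket.io you might wire it up as `const handlePost = makePostHandler((ev, data) => io.emit(ev, data));` so every connected front-end page receives the event.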
Not totally clear on your project, but I'll try to give you some pointers.
An effective way to send data between native apps and a server is a REST API. REST is based on HTTP requests and allows you to modify data on the server, which can connect to your database. The data returned is typically either JSON or XML formatted. See here for a brief intro: http://www.infoq.com/articles/rest-introduction
Android/iOS/etc have built in APIs for making HTTP requests. Your native app would send a request to the server, parse the response, and update your native UI accordingly. The same server can be used from a website using jQuery ajax HTTP requests.
Express.js is more suited to serving web pages and includes things like templating. Look into "restify" (see here: mcavage.me/node-restify/) if you just want to have a REST server that handles requests. Both run on top of node.js (nodejs.org).
As far as real-time communication, if you're developing for iOS look into APNS (Apple Push Notification Service). Apple maintains a persistent connection, and by going through their servers you can easily send messages to your app. The equivalent of this on Android is GCM (Google Cloud Messaging).
You can also do sockets directly if that's easier for you. Be careful with maintaining an open socket on a mobile device though, it can be a huge battery drain. Here's a library for connecting ObjC to Socket.IO using websockets, it may be useful for you: https://github.com/pkyeck/socket.IO-objc
Hope that helps!
To answer your question, it is definitely possible. Socket.io would serve as the central server that can essentially emit messages to all of the clients. You can also make socket.io listen for messages from any of the clients and forward the emitted message to the rest of the clients.
Here's an example of how socket.io can be used. Simply clone, npm install, and run using 'node app.js'
All you have to do is to provide a valid server address when you connect your socket from the iPad clients:
var socket = io.connect( 'http://my.external.nodejs.server' );
Let us know if you need help with actual sending/receiving of socket events.
It is possible to connect to websockets from your apps.
If you are using PhoneGap, then you need a plugin that adds websocket support to your app (the client); you can then use websockets in the normal way from JavaScript, see this.
If your app is native iOS, look into this; it could help you.
The primary use of sockets in your case is as a bidirectional "pipe" between the app and the server. There is no need for the server to send a whole web page to the native app. All you need is to send some data from the server to the client (app) in response to a POST (or GET) request, and then use this data on the client side to update the client's UI in real time. If you are using a moderate number of devices (say tens of them), you can keep all of them connected to the server permanently, holding an individual socket connection open for every server-to-app link. That way you can deliver data and update client state in real time.
In fact, web browsers also employ sockets to communicate with web servers. However, since in the general case there is no bound on the number of concurrent clients on the Internet, servers conserve limited networking resources by not keeping sockets open for long: they close the connection just after the web page has been sent to the client (or a timeout has expired). That's how the HTTP protocol works at the low level: the server waits for HTTP clients (browsers) by listening on port 80, responds by sending the whole page content, then closes the connection and keeps waiting for further requests on the same port.
In your case it's a good idea to use socket.io, as it's a uniform implementation of sockets (WebSockets, to be precise) on both the client and server side. A good starting point is here
Are there any advantages to having two distinct websocket connections to the same server from the same client? To me this seems like a bad design choice, but is there any reason why/where it would work out better?
There are several reasons why you might want to do that but they probably aren't too common (at least not yet):
You have both encrypted and unencrypted data that you are sending/receiving (e.g. some of the data is bulky but not sensitive).
You have both streaming data and latency sensitive data: imagine an interactive game that occasionally has streamed video inside the game. You don't want large media streams to delay receipt of latency sensitive normal game messages.
You have both textual (e.g. JSON control messages) and binary data (typed arrays or blobs) and don't want to bother with adding your own protocol layer to distinguish since WebSockets already does this for you.
You have multiple WebSocket sub-protocols (the optional setting after the URI) that you support and the page wants to access more than one (each WebSocket connection is limited to a single sub-protocol).
You have several different WebSocket services sitting behind the same web server and port. The way the client chooses per connection might depend on URI path, URI scheme (ws or wss), sub-protocol, or perhaps even the first message from client to server.
I'm sure there are other reasons but that's all I can think of off the top of my head.
I found that it can make client logic much simpler when you only subscribe to updates of certain objects managed by the server. Instead of devising a custom subscription protocol on top of a single channel, you can just open a socket for each element.
Let's say you obtained a collection of elements via a REST API at
http://myserver/api/some-elements
You could subscribe to updates of a single element using a socket url like this:
ws://myserver/api/some-elements/42/updates
Of course one can argue that this doesn't scale for complex pages. However, for small and simple applications it might make your life a lot easier.
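As a small illustration of this convention, deriving the per-element subscription URL from the REST URL can be a one-liner on the client. The helper name and path scheme simply mirror the example above:

```javascript
// Sketch: derive the websocket subscription URL for one element from the
// REST collection URL, swapping http/https for ws/wss.

function updatesUrlFor(restBase, elementId) {
  const wsBase = restBase.replace(/^http/, 'ws'); // http -> ws, https -> wss
  return wsBase + '/' + elementId + '/updates';
}
```

For example, `updatesUrlFor('http://myserver/api/some-elements', 42)` yields the `ws://myserver/api/some-elements/42/updates` URL from the example above.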