I am working on a Node.js + socket.io application.
The entire application works fine, but after about 5 minutes the server stops receiving the events that the client triggers.
Even when the events stop arriving, I can see that the server is still successfully receiving the heartbeat:
debug - got heartbeat packet
debug - cleared heartbeat timeout for client 4cKMC4Iqje-7dDfibJZm
debug - set heartbeat interval for client 4cKMC4Iqje-7dDfibJZm
debug - emitting heartbeat for client 4cKMC4Iqje-7dDfibJZm
debug - websocket writing 2::
debug - set heartbeat timeout for client 4cKMC4Iqje-7dDfibJZm
I am also sure that the client is emitting the messages, because I can see the data being sent in the Chrome Developer Tools.
Here is a sample of the data being sent:
5:::{"name":"ev_SendChatMessage","args":[{"chatMsg":"dgdfsgfs","aID":"10010001835364"}]}
I have also checked the tcpdump output on the server machine; it is successfully receiving the data packets.
Node version is v0.10.21
socket.io version is 0.9.16
Client Code
var socket;
$(function()
{
    // Connect to the Live Proctoring Server.
    socket = io.connect('http://autoproc.am.in:8899');
});

function SendChatMsg()
{
    // This gets called on the click of a button.
    socket.emit( "ev_SendChatMessage", { chatMsg : "textToSend", aID : "123" } );
}
Server Code
var options = {};
var io = require( 'socket.io' ).listen( 8899, options );

// Called when a connection is made with a new client.
function OnConnection ( socket )
{
    console.log( "Connection has been made with " + socket.id );
    socket.on('ev_SendChatMessage', SendChatMessageFromModerator );
    socket.on('disconnect', OnDisconnect );
}

// Register the connection handler (socket.io 0.9 API).
io.sockets.on( 'connection', OnConnection );

// This stops getting called after some time. In the beginning it is called successfully.
function SendChatMessageFromModerator( data )
{
    console.log( data );
}
Edit: To be more precise, this happens only after receiving 7 or 8 messages and emitting 7 or 8 messages.
Edit: I tried changing the transport mechanism from WebSocket to "xhr-polling". I still face the same problem, except that now I can see something useful in the debug output:
debug - xhr-polling received data packet 5:::{"name":"ev_SendChatMessage","args":[{"chatMsg":"sfsdfdsfs","aID":"10010001167896"}]}
debug - clearing poll timeout
debug - xhr-polling writing 8::
debug - set close timeout for client JfaWyiP3YqTRmqyzz4z6
debug - xhr-polling closed due to exceeded duration
debug - setting request GET /socket.io/1/xhr-polling/JfaWyiP3YqTRmqyzz4z6?t=1389965419417
debug - setting poll timeout
debug - discarding transport
debug - cleared close timeout for client JfaWyiP3YqTRmqyzz4z6
This clearly shows that the data has reached the Node.js application.
I found the solution to my problem.
The problem was that I was creating a database connection pool, but I was not releasing the connections with dbConn.release();
Once the connections in the pool were exhausted, the application kept waiting for a database connection to be fetched from the pool.
In short, the devil was in the details: the details I had not mentioned in my question.
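For anyone hitting the same symptom: here is a minimal sketch of the failure mode, using a toy in-memory pool rather than a real database driver. createPool, acquire, and release are hypothetical stand-ins for whatever acquire/release API your driver exposes:

```javascript
// Toy connection pool: `size` connections, callers wait when none are free.
function createPool(size) {
  const free = [];
  const waiters = [];
  for (let i = 0; i < size; i++) free.push({ id: i });
  return {
    acquire(cb) {
      if (free.length > 0) return cb(free.pop());
      waiters.push(cb); // no free connection: the caller just waits...
    },
    release(conn) {
      const waiter = waiters.shift();
      if (waiter) return waiter(conn);
      free.push(conn);
    }
  };
}

// With release(): every request eventually gets a connection.
const pool = createPool(2);
let handled = 0;
for (let i = 0; i < 5; i++) {
  pool.acquire(function (conn) {
    handled++;
    pool.release(conn); // the fix: always give the connection back
  });
}
console.log(handled); // 5

// Without release(): only the first `size` requests are ever served;
// the rest wait forever -- exactly the "server stops responding" symptom.
const leaky = createPool(2);
let handledLeaky = 0;
for (let i = 0; i < 5; i++) {
  leaky.acquire(function (conn) { handledLeaky++; });
}
console.log(handledLeaky); // 2
```

The socket.io handler itself never errors in this scenario; it simply blocks on acquire, which is why the heartbeats keep flowing while the application events appear to vanish.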
Overview
There are two separate Azure instances: on the first, Node.js servers are running; they connect to a single Redis server running on the second instance, and the Node.js side tries to keep an active connection with Redis. The node_redis module is used to store and retrieve data in Redis, and the socket.io-emitter module is used to allow the different servers to send messages based on the collective connectivity of the clients.
Problem
After the initial connection, at some point (sporadically) the connection freezes and finally crashes, with an ETIMEDOUT error being thrown from all servers.
What I have tried:
i. Added socket_keepalive first, and then together with socket_initialdelay:
const redis = require('redis');
let options = {socket_keepalive : true, socket_initialdelay : 200000};
global.client = redis.createClient(port, host, options);
ii. With the socket.io-emitter module, I tried initialising it with a new Redis client created via node_redis itself, but the notifications stopped working after that, so I went back to the original approach.
This stopped the notifications to individual devices:
let options = {socket_keepalive : true, socket_initialdelay : 200000};
let redis_socket = require('redis');
let pub = redis_socket.createClient(port, host, options);
let ioz = require('socket.io-emitter')(pub);
(The timeout issue obviously still exists with the working method.)
iii. On the Redis server, the timeout config is set to 0, and tcp-keepalive was 300 seconds; we tried changing it to 0 as well, but the problem persists.
It definitely breaks my head, because the same piece of code works in another environment, and what is causing this is still a mystery. After investigating a bit more, I realised that the connected clients (visible with the MONITOR command) drop after some time, and hence the ETIMEDOUT error is thrown.
Same redis machine is also used for caching and it is working without any issue.
Looks like you might be hitting the TCP idle timeout of 4 minutes.
According to the self-documented config for Redis 3.2, the value of tcp-keepalive has to be non-zero for it to work. So you might want to set a value like 120 (240 / 2) instead and try again.
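For reference, that change would look like this; 120 is the suggested value from above, not something taken from your configuration:

```shell
# In redis.conf (must be non-zero for keepalive probes to be sent):
#   tcp-keepalive 120

# Or set it at runtime, without a restart:
redis-cli CONFIG SET tcp-keepalive 120
```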
I have an HTML page served by a Node.js server, and a socket.io component that connects to that server like this:
var socket = io();
Also several socket events:
socket.on('server-message', function(type, content) {
    ...
});

socket.on('server-update', function(type, content) {
    ...
});
The problem is that the moment the server is stopped, I get client-side errors:
https://example.com/socket.io/?EIO=3&transport=polling&t=LptmQyC net::ERR_CONNECTION_REFUSED
Once the server is started again, it crashes about 30 seconds later.
It looks like I could detect that the server is no longer available and destroy all socket-related events, then reconnect via a page refresh or some button.
Maybe someone could help me with this.
I don't believe much can be done about the error itself; it's an internal socket.io error that is the result of unsuccessful server polling.
For better understanding: once the connection is severed, socket.io falls back to AJAX polling and will try to reconnect as soon as possible (meaning as soon as the server is up again).
The server crash after reconnect, on the other hand, can be addressed very easily. The code you presented is only functional in terms of how to connect; you are missing handlers for disconnect and, optionally, reconnect.
I will show you how you can add to this code to manage disconnects (and prevent server crashes after bringing it back up).
First of all, connect to the server:
var socket = io();
Create a disconnect event handler right after (by the way, this is also the code for the server-side disconnect event):
socket.on('disconnect', function() {
    destroy_socket_events();
});
Create a function that will destroy all your listeners:
var destroy_socket_events = function() {
    socket.off('disconnect');
    socket.off('server-message');
    socket.off('server-update');
    // add all other socket events that you have in here ...
};
Now what is going to happen is this:
connection is made
server is stopped
client triggers disconnect event
function destroys all of your socket listeners
server is back up
client reconnects (socket.io will do it because of the polling)
no listener is ever triggered at that point
So you can safely reinitialize all of your code and attach listeners again properly.
I have the code below:
var last_will = new Paho.MQTT.Message("last message");
last_will.destinationName = "Bridge123";
client = new Paho.MQTT.Client("broker.mqttdashboard.com", Number("8000"), "AX123");
client.onConnectionLost = onConnectionLost;
client.onMessageArrived = onMessageArrived;
client.connect({onSuccess:onConnect} , {willMessage:last_will});
When I disconnect the client, I expect a last-will message to be sent to the topic I have created. I am using Paho's MQTT version 3.1. The WebSockets are getting created fine, but I do not see the last-will message.
Can anyone guide me here?
Adding the bigger picture:
I have a Python script gathering the current on/off status of an IoT device in the local environment and publishing it to a topic "IOT1" over MQTT. I do not want the Python script running all the time to get the status from the IoT device, as it overloads the device. To solve this, I need to find the active clients for the "IOT1" topic, so that I can run or pause the thread sending requests to the IoT device in the local environment. Is there a way other than the last-will message to know this?
Last Will and Testament messages are only published if the client does not disconnect cleanly.
If you close the connection gracefully, the will message will not be sent.
Only when the server fails to receive a message or ping packet within the timeout period will it publish the will message.
willMessage should be a property of the first object. See the code snippet below:
client.connect({onSuccess:onConnect, willMessage:last_will});
I have a page which opens a websocket connection (socket.io) to a Node.js server.
For testing purposes I want to open the page with a headless browser, using CasperJS (I also tried pure PhantomJS, with the same result).
Summary:
Open socket.io connection using Chrome always works.
Open socket.io connection using CasperJs always works (at least the corresponding callbacks on client and server are executed).
Message exchange (socket.emit) using Chrome always works.
Message exchange (socket.emit) using CasperJs sometimes works.
I get the same behaviour when I configure socket.io to use polling only instead of websocket. I know "sometimes" isn't very accurate, but I haven't found a pattern for when it happens yet.
Have you used a headless browser to open a page with socket.io/websockets successfully? Do you have any hints about what might be the reason?
Some more details:
My client performs:
var socket = io.connect('http://localhost:3000');
After that, the connection handler is called on the server (it prints the transport type and socket id for debugging):
io.on('connection', function (socket) {
    console.log('Client connect ('+ socket.client.conn.transport.constructor.name +'): ' + socket.id);
    ...
});
The output is as expected (socket.io uses XHR at first and then upgrades to websocket):
Client connect (XHR): d-moyZ_D6Z6eG7SNAAAA
Also on the client-side the connect callback is executed:
socket.on('connect', function () {
    console.log("Client successfully connected to data server. Transport type: " + socket.io.engine.transport.constructor.name);
    ...
});
The output on the client console is as expected:
Client successfully connected to data server. Transport type: XHR
If I use a normal browser, I can exchange messages without problems. If I use my Casper script, the message exchange sometimes works; in most cases nothing happens at all when I call socket.emit. The Casper script is pretty basic. I had the feeling that it might be a timing issue, so I added a wait for the JavaScript resource which performs the connection; no change:
casper = require('casper').create({
    verbose: true,
    logLevel: "debug"
});

casper.start('page.html', function() {
    this.echo(this.getTitle());
});

casper.waitForResource("dataConnection.js", function() {
    this.echo('socket.io has been loaded.');
    this.wait(14000, function() {
        this.echo("I've waited for some seconds.");
    });
    this.capture('casper.png');
});

casper.on('remote.message', function(message) {
    console.log(message);
});

casper.run();
The disconnect callbacks on client/server are also executed as expected. You can see that the upgrade to websocket worked:
Client disconnect (WebSocket): d-moyZ_D6Z6eG7SNAAAA connectionCount: 0
Thank you!
I'm using Node.js, Socket.io and WebSocket.
Every time I emit something, it gets written to my console log. I emit a lot of stuff, so my console log becomes totally unusable for debugging purposes.
Is there any way to prevent the websocket debug output from writing all emit events to my console?
An example of what is written:
debug - websocket writing 5:::{"stuff...."}
debug - websocket writing 5:::{"more stuff...."}
debug - websocket writing 5:::{"even more stuff...."}
You need to set the log level option to 0, like this:
var io = require('socket.io').listen(port, host);
io.set('log level', 0);
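Note that this applies to socket.io 0.9.x, which your debug output matches. In socket.io 1.x and later the 'log level' option was removed, and logging is controlled by the debug module via the DEBUG environment variable instead (app.js below stands for your own entry point):

```shell
# socket.io 1.x and later:
DEBUG= node app.js             # no debug output at all
DEBUG=socket.io:* node app.js  # verbose socket.io output only
```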