I've set up a standard Phoenix websocket/channel environment, but I am not using the provided socket.js; I have my own (very simple) code that connects to the channels and topics. However, I can't get the socket to persist beyond a minute or so. Is there any way to define the timeout for sockets? I don't have any special configuration on the Phoenix side (all standard, as per the documentation).
My JavaScript code is as follows:
const ws = new WebSocket(sock_url);

ws.onmessage = (msg) => {
  const { payload, event } = JSON.parse(msg.data);
  // Ignore Phoenix protocol events (phx_reply, phx_close, ...)
  if (!event.startsWith("phx_")) {
    onMessage(payload.body);
  }
};

ws.onclose = (event) => {
  // onclose receives a single CloseEvent, not separate arguments
  onClose(event.code, event.reason);
};

ws.onopen = () => {
  ws.send(JSON.stringify({
    topic: `users_socket:${user_id}`,
    event: "phx_join",
    payload: {},
    ref: '1'
  }));
};
Update: I ended up using the socket.js file that comes with Phoenix as everyone suggested - it just does everything I need. Thanks to everyone who answered :)
I am developing a project with websockets (using Go, not Phoenix or Elixir) and I've had the same disconnection problems, which I managed to solve (at least it has not been timing out since) by "pinging" the websocket, i.e. sending a message at specific intervals.
Perhaps you could have something like this in your JavaScript.
ws.onopen = () => {
  ws.send(/** YOUR CODE */);
  // Send a ping event every 10 seconds
  setInterval(() => ws.send(JSON.stringify({ event: "ping" })), 10000);
};
And handle this new event type accordingly server-side. You can also monitor the onclose event and, depending on the reason, re-open the connection. You can find a list of such event codes in the Mozilla docs.
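For example, a minimal reconnect sketch (connect() is an assumed wrapper of your own that creates the WebSocket and re-attaches the handlers):

ws.onclose = (event) => {
  // 1000 = normal closure; on anything else, try again after a second
  if (event.code !== 1000) {
    setTimeout(connect, 1000);
  }
};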
The Phoenix backend expects a ping every 30 seconds by default. You can re-configure it like so:
defmodule UserSocket do
  use Phoenix.Socket

  ## Transports
  transport :websocket, Phoenix.Transports.WebSocket,
    timeout: 300_000, # 5 minutes
    transport_log: :debug

  ...
end
If you do not care about the timeout, you can set it very high; the code above sets it to 5 minutes.
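If you keep your hand-rolled client instead, a minimal heartbeat sketch, assuming the standard Phoenix channels wire format that phoenix.js itself uses (topic "phoenix", event "heartbeat"):

let ref = 2; // '1' was already used for the phx_join above
setInterval(() => {
  ws.send(JSON.stringify({
    topic: "phoenix",
    event: "heartbeat",
    payload: {},
    ref: String(ref++)
  }));
}, 30000);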
In general, phoenix.js will implement all of this for you. It is a very small lib; you will find at the end that you have implemented everything in the lib, plus a bunch of things you got wrong :-)
Related
I use node.js with the net module to connect multiple Raspberry Pis.
//Server
const net = require('node:net');
const server = net.createServer();
//Client
const tcpConnect = net.createConnection(SERVERINFO);
With this I can easily register multiple clients and open/close streams. Everything works without any problems.
But now I noticed that the callbacks
tcpConnect.on('end', () => { console.log('end triggered') });
tcpConnect.on('close', () => { console.log('close triggered') });
tcpConnect.on('error', () => { console.log('error triggered') });
do not work if the server suddenly has a power failure (e.g. power supply is pulled). The clients don't trigger an error/end or close, so I can't close the connection properly.
As a result, an error is not triggered until the next attempt to write something to an existing stream. But this is not an option for me, especially for "online monitoring".
Does anyone know a way to detect the shut-off of the server immediately?
You need to enable TCP keep-alives for the client to detect that the server is no longer responding, and you can tune how often it checks on an idle connection. (I've written a language-generic description at the beginning of this answer.)
In Node it appears you can do this with socket.setKeepAlive(true), i.e. so this client detects the server is gone while the socket is idle:
//Client
const tcpConnect = net.createConnection(SERVERINFO);
tcpConnect.setKeepAlive(true);
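The optional second argument sets the idle delay before the first probe; the probe interval and count beyond that are OS-level settings. A sketch with an arbitrary example value:

// Send the first keep-alive probe after 10 seconds of idle time
tcpConnect.setKeepAlive(true, 10000);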
So I wrote a simple video-creator script in NodeJS.
It runs on a scheduled cron job.
I have a panel written in PHP; the user enters details and clicks the "Submit new Video Job" button.
This new job is saved to the DB with its details, a jobId, and status="waiting".
The PHP API is responsible for returning one job at a time: it checks for status="waiting", limits the query to 1, then returns the data with the jobID when asked.
The video-creation script polls that API every x seconds, asking whether a new job is available.
It has 5 tasks:
available = true
1. Check whether a new job order is available (with a GET request every 20 seconds); if there is a new job: available = false
2. Get the details (name, picture URL, etc.)
3. Create the video with those details.
4. Upload the video to FTP.
5. Post data to the API to update the details and mark the job as "done"; then available = true
These tasks are async, so every task has to wait for the previous one to finish.
Right now, polling the API with a GET or POST request every 20 seconds (the exact interval doesn't matter) to see whether a new job is available seems like a bad approach to me.
So any way / package / system to accomplish this behavior?
Code Example:
const cron = require('node-cron');
const axios = require('axios');

let available = true;

const scheduler = cron.schedule(
  '*/20 * * * * *',
  () => {
    if (available) {
      makevideo();
    }
  },
  {
    scheduled: false,
    timezone: 'Europe/Istanbul',
  }
);

const makevideo = async () => {
  available = false;
  const { data } = await axios.get('https://api/checkJob');
  if (data == 0) {
    console.log('No Job');
    available = true;
  } else {
    const jobid = data.id;
    await createvideo();
    await sendToFTP();
    await axios.post('https://api/saveJob', {
      id: jobid,
      videoPath: 'somevideopath',
    });
    available = true;
  }
};

scheduler.start();
RabbitMQ is also a good queueing system.
Why?
It's really well documented (examples for many languages, including JavaScript & PHP).
The tutorials are simple while exposing real use cases.
It has a REST API.
It ships with a monitoring UI.
How to use it to solve your problem?
On the job-producer side: send messages (jobs) to a queue by following tutorial 1.
To consume jobs with your nodejs process: see RabbitMQ's tutorial 2.
Other suggestions:
Use a prefetch value of 1 and publisher confirms so you can ensure that a consumer instance will not receive messages while there's a job running (see the consumer sketch below).
Roadmap for a quick prototype: tutorial 1... then tutorial 2 x). After sending and receiving messages you can explore the options you can set on queues and messages.
Nodejs package : http://www.squaremobius.net/amqp.node/
PHP package : https://github.com/php-amqplib/php-amqplib
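A minimal consumer sketch with amqplib (the 'video_jobs' queue name is an assumption, and makevideo stands in for your create/upload/save steps):

const amqp = require('amqplib');

async function consume() {
  const conn = await amqp.connect('amqp://localhost');
  const ch = await conn.createChannel();
  await ch.assertQueue('video_jobs', { durable: true });
  ch.prefetch(1); // at most one unacknowledged job at a time
  ch.consume('video_jobs', async (msg) => {
    const job = JSON.parse(msg.content.toString());
    await makevideo(job); // create the video, upload to FTP, save via the API
    ch.ack(msg);          // acknowledge only once the job is fully done
  });
}

consume();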
While it is possible to use the database as a queue, it is commonly known as an anti-pattern (next to using the database for logging), and as you are looking for:
So any way / package / system to accomplish this behavior?
I use the free form of your question, thanks to the placed bounty, to suggest: Beanstalk.
Beanstalk is a simple, fast work queue.
Its interface is generic, but was originally designed for reducing the latency of page views in high-volume web applications by running time-consuming tasks asynchronously.
It has client libraries in the languages you mention in your question (and many more), is easy to develop with and to run in production.
What you are doing is a very standard system-design paradigm, done with Apache Kafka or any queue-based implementation (e.g., RabbitMQ). You can read up on Kafka/RabbitMQ, but basically, without going into details:
There is a central Queue.
When user submits a job the job gets added to the Queue.
The video processor runs indefinitely subscribing to the queue.
You can go ahead and look at https://www.gentlydownthe.stream/ and you will recognize the similarities to what you are doing.
Here you don't need to poll yourself; you subscribe to an event, and the rest is managed by the respective queues.
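For instance, a minimal producer sketch with amqplib, to be called when the user submits a new job (the 'video_jobs' queue name and connection URL are assumptions):

const amqp = require('amqplib');

async function submitJob(job) {
  const conn = await amqp.connect('amqp://localhost');
  const ch = await conn.createChannel();
  await ch.assertQueue('video_jobs', { durable: true });
  ch.sendToQueue('video_jobs', Buffer.from(JSON.stringify(job)), { persistent: true });
  await ch.close();
  await conn.close();
}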
Can't receive any notifications sent from the Server peripheral.
I am using an ESP32 as Server with the "BLE_notify" code that you can find in the Arduino app (File > Examples > ESP32 BLE Arduino > BLE_notify).
With this code the ESP32 starts notifying new messages every second once a Client connects.
The client is a Raspberry Pi with the Noble node library installed on it (https://github.com/abandonware/noble). This is the code I am using:
noble.on('discover', async (peripheral) => {
  console.log('found peripheral:', peripheral.advertisement);
  await noble.stopScanningAsync();
  await peripheral.connectAsync();
  console.log("Connected");
  try {
    const services = await peripheral.discoverServicesAsync([SERVICE_UUID]);
    const characteristics = await services[0].discoverCharacteristicsAsync([CHARACTERISTIC_UUID]);
    const ch = characteristics[0];
    ch.on('read', function(data, isNotification) {
      console.log(isNotification);
      console.log('Temperature Value: ', data.readUInt8(0));
    });
    ch.on('data', function(data, isNotification) {
      console.log(isNotification);
      console.log('Temperature Value: ', data.readUInt8(0));
    });
    ch.notify(true, function(error) {
      console.log(error);
      console.log('temperature notification on');
    });
  } catch (e) {
    // handle error
    console.log("ERROR: ", e);
  }
});
SERVICE_UUID and CHARACTERISTIC_UUID are obviously the UUIDs coded in the ESP32.
This code sort of works: it can find services and characteristics, and it can successfully connect to the peripheral, but it cannot receive message notifications.
I also tried an Android app that works as a client; from that app I can get all the messages notified by the peripheral once connected to it. So there is something missing on the noble client side.
I think there is something wrong in the on('read')/on('data')/notify(true) callback methods. Maybe these are not the methods to receive notifications from the Server?
I also tried the subscribe methods, but it is still not working.
The official documentation is not clear. Has anyone managed to get this up and running? Please help.
on('read')/on('data') are event listeners. There is nothing wrong with them; they are invoked when the corresponding event occurs.
For example, calling characteristic.read([callback(error, data)]) would have invoked the 'read' listener.
From the source, the event is emitted when:
- a characteristic read has completed, as the result of characteristic.read(...)
- the characteristic value has been updated by the peripheral via notification or indication, after having been enabled with characteristic.notify(true[, callback(error)])
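If notify(true, ...) gives you nothing, a minimal sketch using the subscribe method from the same library (per the abandonware/noble README) may be worth trying:

// Enable notifications, then handle them via the 'data' event
ch.subscribe(function(error) {
  if (error) {
    console.log('subscribe failed:', error);
  } else {
    console.log('temperature notifications on');
  }
});
ch.on('data', function(data, isNotification) {
  console.log('Temperature Value: ', data.readUInt8(0));
});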
I resolved it by using the following two environment variables: NOBLE_MULTI_ROLE=1 and NOBLE_REPORT_ALL_HCI_EVENTS=1 (see the documentation at https://github.com/abandonware/noble).
I'm writing a Node.js app and I use Socket.IO as the data-transfer system, so requests should be specific to each user. How can I do this?
My current code:
node:
io.on('connection', (socket) => {
  socket.on('loginP', data => {
    console.log(data);
  });
});
js:
var socket = io('', { forceNew: false });

$("#loginbutton").click(function() {
  var sessionInfo = {
    name: $("#login input[name='username']").val(),
    pass: $("#login input[name='pass']").val()
  };
  socket.emit("loginP", sessionInfo);
});
It returns one more piece of data per request, and this is a problem for me. Can I do this with Socket.IO, or should I use another module, and if I should, which one?
If I understand your question correctly (It's possible I don't), you want to have just one connection from each user's browser to your nodejs program.
On the nodejs side, your io.on('connection'...) event fires with each new incoming user connection and gives you the socket for that specific connection. So, keep track of your sockets; you'll have one socket per user (see the sketch below).
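A minimal sketch of that bookkeeping, keying on the name field from your loginP payload (the field choice is an assumption):

const userSockets = new Map(); // user name -> socket

io.on('connection', (socket) => {
  socket.on('loginP', (data) => {
    userSockets.set(data.name, socket); // remember this user's socket
  });
  socket.on('disconnect', () => {
    // drop whichever entry pointed at this socket
    for (const [name, s] of userSockets) {
      if (s === socket) userSockets.delete(name);
    }
  });
});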
On the browser side, you should build your code to ensure it only calls
var socket = io(path, ...);
once for each path (your path is ''). The forceNew option is for situations where you have multiple paths in one program.
I am implementing socket.io in node.js for a Windows Azure project. I have to send data to all the connected clients at regular intervals.
I am new to node.js, but I guess multi-threading is not possible. The purpose of socket.io is to support real-time applications, so is there any way I can send data continuously to all my connected clients at regular intervals, while also processing whatever data the clients send to the socket.io server?
EDIT:
This is roughly my socket.io implementation
var io = require('socket.io').listen(80);

io.sockets.on('connection', function (socket) {
  socket.emit('first', "connected");
  socket.on('clientData', function (data) {
    // processing the data
  });
});

function requestQueue() {
  // some processing
  io.sockets.emit('broadcast', recordsQueue);
  // should go to sleep for a time period
}
Essentially I want the requestQueue method to run continuously, like a thread, emitting data to the connected clients at particular intervals. And if a client sends any data to the "clientData" event, I should be able to receive and process it.
Any idea on how I can do it?
Thanks
My solution:
var io = require('socket.io').listen(80);

io.sockets.on('connection', function (socket) {
  socket.emit('first', "connected");
  socket.on('clientData', function (data) {
    // processing the data
  });
});

function requestQueue() {
  var sleepTime = 0;
  // calculate sleepTime
  io.sockets.emit('broadcast', recordsQueue);
  // go to sleep for a time period, then run again
  if (sleepTime != 0)
    setTimeout(function () { requestQueue(); }, sleepTime);
}
As others have suggested, you can achieve this by using the setInterval or setTimeout functions to periodically emit data to connected clients. Although everything runs within the same controlling thread, all of the I/O is done asynchronously, so your application will still be responsive; i.e., any data received while the timer is running will be queued up and processed after the timer function returns.
See Timers in the node.js manual for more information.
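For example, a minimal sketch (the 5-second interval is an arbitrary choice; recordsQueue is from your code):

// Broadcast the current queue to every connected client every 5 seconds
setInterval(function () {
  io.sockets.emit('broadcast', recordsQueue);
}, 5000);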
On a side note, you will want to have your node.js application focus on processing asynchronous I/O in near real time, as that is the strength of node.js. If you need to do any heavy computation, then you will want to offload that to another thread/process somehow, simply making an async request for the result.
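A minimal sketch of such offloading with the built-in child_process module ('./heavy-task.js' is a hypothetical worker script):

const { fork } = require('child_process');

const child = fork('./heavy-task.js'); // runs in a separate process
child.on('message', function (result) {
  // consume the computed result without blocking the main event loop
  console.log('result:', result);
});
child.send({ cmd: 'compute' }); // kick off the heavy work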
You are right, node is single-threaded. But it makes heavy use of events.
The strategy you should go for is to listen for events on the connections to retrieve data, and to send data back in response to other events being emitted.
If you take a look at the examples on the socket.io frontpage, you will see how they use events to start processing the streams. The on(eventName, function) method is how you listen for an event in NodeJS.
If you wish to do something based on timing (generally a bad idea), take a look at the setInterval method.
Server
var io = require('socket.io').listen(80);

io.sockets.on('connection', function (socket) {
  socket.emit('news', { hello: 'world' });
  socket.on('my other event', function (data) {
    console.log(data);
  });
});
Client
<script src="/socket.io/socket.io.js"></script>
<script>
  var socket = io.connect('http://localhost');
  socket.on('news', function (data) {
    console.log(data);
    socket.emit('my other event', { my: 'data' });
  });
</script>
There is also a Node.js library for doing threads: node-webworker-threads
https://github.com/audreyt/node-webworker-threads
This basically implements the Web Worker browser API for node.js.
Note that this does not align with node.js's philosophy of single-threadedness, but it's there if you need it.
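A minimal sketch along the lines of the project's README (assuming the package builds on your platform):

var Worker = require('webworker-threads').Worker;

// The function body runs on a separate thread
var worker = new Worker(function () {
  postMessage('hello from the worker thread');
});
worker.onmessage = function (event) {
  console.log(event.data); // 'hello from the worker thread'
  worker.terminate();
};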
Multi-threading in Node.js?
Yes, it is possible in nodejs using the npm module 'java4node'. In this package there is a method:
.multiThreading(tasks, callbacks)
This function is used to perform multithreading in a JavaScript manner.
To use this module
link: https://www.npmjs.com/package/java4node
run the command:
npm install java4node
var java = require('java4node');

java.multiThreading({
  one: function(callback, parameters) {
    console.log("function one is running independently");
    callback();
  },
  two: function(callback, parameters) {
    console.log("function two is running independently");
    callback();
  },
  three: function(callback, parameters) {
    console.log("function three is running independently");
    callback();
  }
}, function(err, results) {
  // all tasks finished
  console.log(err, results);
});
The logic you define inside setInterval etc. does not run in a separate thread, so it can end up blocking the main one. Say there are 1000 users and you call setTimeout etc. 1000 times: eventually all those callbacks run on the main thread and spend that precious time. Only the final I/O operations are non-blocking!
You may consider investigating nodeJX for multithreading in node.js.