I'm using Sails.js, and when I run sails lift it serves the website properly. However, how can I create a TCP server that also starts on sails lift and constantly listens for raw TCP messages from clients? I have found this simple TCP server example for Node.js (which should work equally well under Sails):
var net = require('net');

var server = net.createServer(function (socket) {
  socket.write('Echo server\r\n');
  socket.pipe(socket);
});

server.listen(1337, '127.0.0.1');
How can I integrate this into Sails? Do I need to modify the app.js file (which is what I presume gets run when sails lift is entered)? Any ideas?
I'm a little bit late to the party, but I recently had the same requirement and came up with the following code:
let net = require('net');

net.createServer(function (socket) {
  socket.on('data', function (data) {
    // Build a minimal request object for Sails' virtual router
    const req = {
      url: '/controllername/method',
      method: 'get'
    };
    const res = {
      _clientCallback: function _clientCallback(clientRes) {
        // TODO: do something useful with clientRes
        let message = clientRes.body;
        if (clientRes.body && clientRes.body.message) {
          message = clientRes.body.message;
        }
        process.stdout.write(message);
      }
    };
    // Route the raw TCP payload through the Sails router
    sails.router.route(req, res);
  });
}).listen(1338);
Have a look at the project sails-hook-sockets too.
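For completeness, here is a minimal sketch of wiring the echo server from the question into config/bootstrap.js so it starts on sails lift (an illustration assuming the classic callback-style bootstrap; newer Sails versions also accept an async function):

// config/bootstrap.js - Sails runs this once during `sails lift`
var net = require('net');

module.exports.bootstrap = function (done) {
  var tcpServer = net.createServer(function (socket) {
    socket.write('Echo server\r\n');
    socket.pipe(socket); // echo every byte back to the client
  });

  tcpServer.listen(1337, '127.0.0.1', function () {
    sails.log.info('TCP echo server listening on 127.0.0.1:1337');
  });

  // Tell Sails that bootstrapping is finished
  return done();
};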
This code is in my Node.js backend (https://backend.example.com) server.js file:
const WebSocket = require('ws');

const server = new WebSocket.Server(
  { port: 7500 },
  () => {
    console.log('Server started on port 7500');
  }
);
This code is in my Next.js frontend chat page (http://frontend.example.com/chat) file:
React.useEffect(() => {
  ws.current = new WebSocket("ws://localhost:7500");
  ws.current.onopen = () => {
    console.log("Connection opened");
    setConnectionOpen(true);
  };
  ws.current.onmessage = (event) => {
    const data = JSON.parse(event.data);
    setMessages((_messages) => [..._messages, data]);
  };
  return () => {
    console.log("Cleaning up...");
    ws.current.close();
  };
}, []);
It works fine on localhost, but on the deployed live server the WebSocket does not communicate. What is wrong with my code?
EDIT: I have updated the useEffect() to:
React.useEffect(() => {
  ws.current = new WebSocket("wss://backend.example.com:7500");
  ws.current.onopen = () => {
    console.log("Connection opened");
    setConnectionOpen(true);
  };
  ws.current.onmessage = (event) => {
    const data = JSON.parse(event.data);
    setMessages((_messages) => [..._messages, data]);
  };
  return () => {
    console.log("Cleaning up...");
    ws.current.close();
  };
}, []);
but it still does not work. If I visit https://backend.example.com I get Upgrade Required.
If this is the code you deploy on the live server, then I think the following points have to be addressed.
On the client you point to localhost; you should use the server name instead.
More importantly: locally you're serving the app over http, while on the live server it is https?
In that case the WebSocket URL should change its protocol from ws to wss.
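A small sketch of how the client could pick the right scheme automatically, deriving it from how the page itself was served instead of hard-coding ws or wss (using the same ws.current ref as in the question):

// Match the WebSocket scheme to the page's own protocol
const scheme = window.location.protocol === "https:" ? "wss" : "ws";
ws.current = new WebSocket(`${scheme}://backend.example.com:7500`);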
UPDATE
Another point of attention is your server.
I don't see code that handles the connection, as in the documentation example.
import WebSocket from 'ws';

const ws = new WebSocket('ws://www.host.com/path');

// This handles the connection
ws.on('open', function open() {
  ws.send('something');
});

ws.on('message', function message(data) {
  console.log('received: %s', data);
});
If this is not the library you're using, please update your question with all the references to the library you're using or the documentation you're referring to when writing your code.
About the port: there is nothing special about it; the only issue is that it may be blocked, either by a firewall on your client or by the network gateway of your server.
But that depends on your environment; you should check whether the port is usable.
A simple test is to publish a small server with an HTML page on port 7500 on your server. If you can see the page, the port is fine.
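A minimal sketch of such a probe (the markup is just an assumption; any page that loads will do):

// probe.js - if this page loads from another machine, port 7500 is reachable
const http = require('http');

http.createServer((req, res) => {
  res.setHeader('content-type', 'text/html');
  res.end('<h1>Port 7500 is reachable</h1>');
}).listen(7500, '0.0.0.0');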
Also, you should not reuse your HTTP server's port; pick another one, because the HTTP server has already bound that port and your WebSocket server will fail when attempting to bind.
But you should see an error on the server if that happened.
If you want to use your application server's port instead of starting a different server, then follow this example.
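A minimal sketch of that approach using the ws library's external-server mode, so the WebSocket server shares the HTTP server's port (the handler bodies and the port are illustrative):

const http = require('http');
const WebSocket = require('ws');

// One HTTP server handles both normal requests and WebSocket upgrades
const server = http.createServer((req, res) => {
  res.end('hello');
});

// Passing the existing server instead of a port attaches ws to it
const wss = new WebSocket.Server({ server });

wss.on('connection', (ws) => {
  ws.on('message', (data) => console.log('received: %s', data));
});

server.listen(7500);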
I have a data provider which gives me stock prices via TCP connection. The data provider only allows a static IP to connect to their service.
But since I need to format the data before sending it to my front-end I want to use my express back-end as a proxy.
What that means is:
I need to connect my back-end to my data provider via websocket(socket.io) in order to get the data (back-end acts as client)
I need my back-end to broadcast this received data to my front-end(back-end acts as server)
My question is: Is that possible at all? Is there an easier way to achieve this? Is there documentation on how to use an Express app as a WebSocket server and client at once?
EDIT:
I got this working now. But my current solution kills my AWS EC2 instance because of huge CPU usage. This is how I've implemented it:
const net = require('net');
const app = require('express')();
const httpServer = require('http').createServer(app);
const client = new net.Socket();

const options = {
  cors: {
    origin: 'http://someorigin.org',
  },
};
const io = require('socket.io')(httpServer, options);

client.connect(1337, 'some.ip', () => {
  console.info('Connected to some.ip');
});

client.on('data', async (data) => {
  // parse data
  const parsedData = {
    identifier: data.identifier,
    someData: data.someData,
  };
  // broadcast data
  io.emit('randomEmitString', parsedData);
});

client.on('close', () => {
  console.info('Connection closed');
});

httpServer.listen(8081);
Does anyone have an idea why this causes a huge CPU load? I've tried to profile my code with clinicjs but I couldn't find an apparent problem.
EDIT2: To be more specific: my data provider supplies stock quotes, so every time a quote changes I get new data. I then parse this data and emit it via io.emit. Could this be some kind of bottleneck?
This is the profile I get after I run clinicjs:
I don't know how many resources you have on your AWS, but 1,000 clients shouldn't be a problem.
I have personally encountered 2 bottlenecks:
Clients connecting with Ajax polling instead of WS (this used to be a common problem with old socket.io).
The socket.io client libraries were served by Node rather than Nginx/Apache; Node is poor at keep-alive management.
Check also:
How often do you get data from some.ip? It's a good idea to aggregate and filter it.
Do you need to notify all clients of everything? Is it enough to inform just the interested ones (a live zone)?
Maybe it is worth moving the serving to serviceWorker.js or Push Events?
As part of debugging, log events yourself: data received, clients connecting and disconnecting. Observe the server logs.
Or maybe this code is not responsible for the problems, but rather the initial data download for the first view. Do you have the data in a buffer, or do you read it on GET index.html?
To understand what was going on with your situation, I created an elementary TCP server that published JSON messages every 1ms to each client that connects to it. Here is the code for the server:
var net = require('net');

var server = net.createServer(function (socket) {
  socket.pipe(socket);
});

server.maxConnections = 10;

server.on('close', () => console.log('server closed'));
server.on('error', (err) => console.error(err));
server.on('listening', () => console.log('server is listening'));

server.on('connection', (socket) => {
  console.log('- client connected');
  socket.setEncoding('utf8');

  // Push a small JSON message to this client every 1 ms; a plain counter
  // is used as the id, since a Timeout object itself is not reliably
  // JSON-serializable
  var msgId = 0;
  var intervalId = setInterval(() => socket.readyState === "open" &&
    socket.write(JSON.stringify({
      id: ++msgId,
      timestamp: Date.now(),
    }) + '\n'), 1);

  socket.on('error', (err) => console.error(err));
  socket.on('close', () => {
    clearInterval(intervalId);
    console.log('- client closed the connection');
  });
});

server.listen(1337, '0.0.0.0');
As you see, we set up a setInterval function that will emit a simple JSON message to each connected client every 1 ms.
For the client, I used something very similar to what you have. At first, I tried pushing every message received from the server straight to the browser over the WebSocket connection. In my case, it also pushed the CPU to 100%. I don't know exactly why.
Nonetheless, even though your data is being updated every 1 ms, it is doubtful that you need to refresh your webpage at that rate. Most websites work at 60 fps. That would mean updating the data every 16ms. So, a straightforward solution would be to batch the data and send it to the browser every 16 ms. Just this modification greatly increases performance. You can go even further by extending the batch time or filtering some of the sent data.
Here is the code for the client, taking advantage of batch messages. Bear in mind that this is a very naive implementation made to show the idea. A better adjustment would be to work the streams with libraries like RxJS.
// tcp-client.js
const express = require('express');
const http = require('http');
const { Server } = require("socket.io");
const net = require('net');

const app = express();
const server = http.createServer(app);
const io = new Server(server);
const client = new net.Socket();

app.get('/', (req, res) => {
  res.setHeader('content-type', 'text/html');
  res.send(`
    <!doctype html>
    <html lang="en">
      <head>
        <meta charset="utf-8">
        <title>TCP - Client</title>
      </head>
      <body>
        <script src="/socket.io/socket.io.js"></script>
        <script>
          var socket = io();
          socket.on('msg', (msg) => document.body.textContent = msg);
        </script>
      </body>
    </html>
  `);
});

io.on('connection', (socket) => {
  console.log('- user connected');
  socket.on('disconnect', () => {
    console.log('- user disconnected');
  });
});

// Collect incoming TCP messages and flush them to the browser every 16 ms
var buffer = [];
setInterval(() => {
  io.emit("msg", JSON.stringify(buffer));
  buffer = [];
}, 16);

client.connect(1337, '127.0.0.1', function () {
  console.log('- connected to server');
});

client.on('data', function (data) {
  buffer.push(data.toString("utf8"));
});

client.on('close', function () {
  console.log('- connection to server closed');
});

server.listen(3000, () => {
  console.log('listening on 0.0.0.0:3000');
});
I am making a control panel for my Minecraft mining turtle and I need to communicate between the two using WebSockets. I have troubleshot the Lua side of things, and it works as intended when I connect it to an echo server.
Code here:
local ws, err = http.websocket("wss://localhost:5757")
if ws then
  ws.send("Hello")
  print(ws.receive())
  ws.close()
end
However, I cannot get the Node.js side to work. I have tried different ports, and I've tried opening ports.
Code here:
const WebSocket = require('ws');

const wss = new WebSocket.Server({ port: 5757 });

wss.on('connection', function connection(ws) {
  ws.on('message', function incoming(message) {
    console.log('received: %s', message);
  });
  ws.send('testing 123');
});
I can't figure out where I have gone wrong. All help is appreciated.
EDIT: Thought I'd add that it's not giving errors either, that I am using the ws npm package, and that I'm running the latest LTS Node version.
EDIT 2: I tried this code example here
const WebSocket = require('ws');

const ws = new WebSocket('wss://echo.websocket.org/');

ws.on('open', function open() {
  ws.send('testing 123');
});

ws.on('message', function incoming(data) {
  console.log(data);
});
And it worked and replied with 'testing 123', so it seems that the WebSocket doesn't want to run on localhost.
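One detail worth checking (an assumption on my part, since the question doesn't call it out): the Lua client dials wss://localhost:5757, a TLS connection, while the Node server above speaks plain ws. A plain-ws test client against that same local server would look like this:

const WebSocket = require('ws');

// Plain ws://, matching the non-TLS server listening on 5757
const ws = new WebSocket('ws://localhost:5757');

ws.on('open', function open() {
  ws.send('Hello');
});

ws.on('message', function incoming(data) {
  console.log('received: %s', data);
});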
I'm trying to push messages from the server to the client using the ws module in a Node.js application.
Please find below the code in app.js.
var server = app.listen(port);
var WebSocketServer = require('ws').Server;
var wss = new WebSocketServer({ server: server });

wss.on('connection', function (ws) {
  console.log('connected');
  ws.on('message', function (data, flags) {
    // handle incoming message
  });
});
Code in the client-side JS file:
window.WebSocket = window.WebSocket || window.MozWebSocket;

var host = window.document.location.host.replace(/:.*/, '');
var port = window.document.location.port;
var socketConnection = new WebSocket('ws://' + host + ':' + port);

socketConnection.onerror = function (error) {
  // TODO: report error here..
};

socketConnection.onmessage = function (message) {
  var obj = JSON.parse(message.data);
};
The problem: when I'm running the Node application locally, i.e. at http://localhost:port, and I try to handshake the ws connection through ws://localhost:port, the handshake happens and I am able to send and receive messages.
But when I access the same application using the IP of my machine, i.e. http://10.xx.yy.zz:port, I am able to open the application and do everything, but the handshake with WebSockets times out. The same happens when the app is running on another machine and I try to connect to it from my machine's browser: the handshake does not happen and it times out.
I have tried the same with the demos provided out of the box by the ws module, and the same happens. Am I missing something when creating the WebSocket server?
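One thing worth verifying (an assumption, since the host argument to app.listen isn't shown): make sure the HTTP server is bound to all interfaces rather than only the loopback, or handshakes from other machines can never reach it. A sketch:

// Binding to 0.0.0.0 accepts connections from other machines;
// listening on '127.0.0.1' would accept only local ones.
var server = app.listen(port, '0.0.0.0', function () {
  console.log('listening on 0.0.0.0:' + port);
});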
I just started learning how to make web applications. I am making a web server in Node.js (a to-do list app). I am using the Express framework and MongoDB as the database. For communication between the client and the server I am using socket.io.
I can't find a way to make it so that when the server emits an event, the client updates the info in all of its open windows of the page. Right now the info updates only in the window that triggered the event on the server. This is the server code:
var io = require('socket.io').listen(server);

io.of('/home').on('connection', function (socket) {
  socket.on('newListGroup', function (data) {
    ...
    socket.emit('groupNo', obj);
  });
});
Client JavaScript:
var socket = io.connect('http://localhost/login');

socket.on('groupNo', function (data) { ... });

$('#newListGroup').blur(function () {
  socket.emit('newListGroup', { newGroup: newGroup });
});
Can this work or should I take another approach?
You can broadcast a message to all sockets like this:
var io = require('socket.io').listen(server);

io.of('/home').on('connection', function (socket) {
  socket.on('newListGroup', function (data) {
    socket.broadcast.emit('groupNo', obj);
  });
});
It should be limited to the namespace but you will probably have to implement your own logic for broadcasting only to windows on the same client (probably using authentication) if that is what you want to do.
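Note that socket.broadcast.emit excludes the sender's own socket. If the window that triggered the event should update as well (which the question seems to want), one option is to emit on the namespace itself, which reaches every connected socket including the sender. A sketch:

io.of('/home').on('connection', function (socket) {
  socket.on('newListGroup', function (data) {
    // ... build obj as before ...
    // Emits to every socket in /home, including the sender's other windows
    io.of('/home').emit('groupNo', obj);
  });
});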