Socket.io loses connection when phone locks in React Native - javascript

I have a simple application written in React Native that talks to a Node.js server over socket.io. My problem is that when the phone screen is locked, the socket disconnects from the server. I need the connection to stay alive at all times.
//server
const express = require('express');
const app = express();
const server = require('http').Server(app);
const io = require('socket.io')(server, {
  pingInterval: 20000, // send a heartbeat ping every 20s
  pingTimeout: 10000,  // drop the connection if no pong arrives within 10s
});

//client mobile
// assuming: import socket from 'socket.io-client';
const io = socket("http://192.168.0.20:3003");
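For reference, a client configured with socket.io-client's standard reconnection options would look roughly like this (a sketch; the option names follow the socket.io-client documentation, and reconnection settings alone will not keep the socket alive while the OS suspends the app on screen lock):

// assuming: import socketIOClient from 'socket.io-client';
const socket = socketIOClient("http://192.168.0.20:3003", {
  transports: ['websocket'],      // skip the HTTP long-polling fallback
  reconnection: true,             // re-establish the connection after a drop
  reconnectionAttempts: Infinity, // keep retrying indefinitely
  reconnectionDelay: 1000,        // start retrying after 1s
  reconnectionDelayMax: 5000,     // back off to at most 5s between retries
});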

It is difficult to keep a socket.io connection alive while the screen is locked.
(It is possible if you declare VoIP usage in the app's manifest.)
I ran into the same problem and worked around it.
I published the workaround to npm as "syncsocketio":
https://www.npmjs.com/package/syncsocketio
"syncsocketio" wraps socket.io.
"syncsocketio" is written in TypeScript, but it can also be used from JavaScript.
Note that you need to use "syncsocketio" on both the server and the client.
The npm page is in Japanese, but the sample code makes it easy to get started.
GitHub also contains server and client test code, so check it out:
https://github.com/codianz/syncsocketio
I will be happy if it helps.

Related

How to run socketIO server together with main server on nodejs

Currently, we have a main server hosted on localhost:3000, but to run our socket.io functionality we need it on the same server. However, right now we have to run it separately (a separate "npm start"). Is there a way to run them together, or on the same server, without it crashing?
You cannot run socket.io in a separate process on the same port as some other web server running in another process. The OS will not allow that, as only one process can have a listening server on a specific port. If they are in the same process, that's easy as pie, since socket.io is built to share an http server within the same process (one listening server internally, with traffic divided between the two uses). But it cannot be done from separate processes.
To do that, you'd have to use something like nginx on your port 3000 to proxy plain web requests to one server on some other port, say 3001, and socket.io requests to another server on another port, say 3002. The client would only deal with port 3000, and nginx would direct the traffic to the right server on the different ports.
I'm thinking that when you say "npm start separately", you must have some other problem you're trying to solve with that statement. We could probably help with a better way to solve that actual problem (if you disclosed what that actual requirement is) while keeping socket.io and the http server in the same process, with no need for a proxy to divide the traffic between two separate servers.
For example, you could start up your web server with no socket.io server started and then you could tell your web server process to start up the socket.io server later. Or you could start both the web server and socket.io server in the same process at initialization time, but have a temporary server configuration that blocks incoming socket.io connections until some other requirement is met.
But, without understanding what the real requirement is, you're just lobbing us an XY problem where you describe your attempted solution rather than the actual problem that needs to be solved. When we explain that your attempted solution is the wrong way to go, we need to know what the real problem is to help further.
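To illustrate that first suggestion (starting the web server right away and attaching socket.io later in the same process), here is a minimal sketch, not taken from the question's code:

const express = require('express');
const http = require('http');

const app = express();
const httpServer = http.createServer(app);
httpServer.listen(3000); // plain web traffic is served immediately on port 3000

let io = null;
function enableSocketIo() {
  if (io) return io; // already attached
  // socket.io can be attached to an http server that is already listening,
  // and it shares the same port 3000 with the Express routes
  io = require('socket.io')(httpServer);
  io.on('connection', (socket) => {
    console.log('socket.io client connected:', socket.id);
  });
  return io;
}

// stand-in for whatever real condition gates the socket.io startup
setTimeout(enableSocketIo, 10000);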
This is a simple Socket.IO server code example.
It is not client code.
You have to install Socket.IO from npm.
const express = require('express');
const http = require('http');
const SocketIo = require('socket.io');

const app = express();
app.set('view engine', 'pug');
app.set('views', __dirname + '/views');                     // Pug templates
app.use('/public', express.static(__dirname + '/public'));  // static assets
app.get('/', (req, res) => res.render('home'));

const handleListen = () => console.log('Listening on http://localhost:3000');

// one HTTP server shared by Express and Socket.IO on the same port
const httpServer = http.createServer(app);
const wsServer = SocketIo(httpServer);

wsServer.on('connection', (socket) => {
  console.log('someone joined!');
  socket.on('join_room', (roomName) => {
    socket.join(roomName);
    socket.to(roomName).emit('welcome');
  });
});

httpServer.listen(3000, handleListen);
For more info, visit the official documentation:
https://socket.io/get-started/chat

How to connect to an express app in a VM from the internet?

I am running an express app in Node.js on an Azure Linux VM, and I want to connect to this website from my personal computer.
const express = require('express');
const app = express();

app.listen(3000, () => {
  console.log("Server is running..");
});
And, for example, on my computer I can just open localhost:3000, but on the VM I can't reach it.
The public IP address of the VM is 40.XXX.XX.252, and I try to connect in my browser like this: http://40.XXX.XX.252:3000, but it doesn't work either.
I know it is a really common question, but I tried everything that was suggested and I couldn't fix it.
PS: I tried to run this app on my computer and, again, I could connect through localhost:3000 but not from another computer using my computer's public IP.
Cause of the problem:
It must be that port 3000 is not open. You can open port 3000 with a few commands (the firewall settings also need to be checked).
But I personally think this is not safe.
I recommend using a tunneling tool instead, which is the safest way: you access the app through the link it generates, and it also helps because sometimes the public IP assigned by the provider is not fixed. Here I recommend using the Linux version of ngrok.
You can also refer to my answer in another post:
sending http request from azure web app to my machine
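For completeness, a minimal sketch of the server side (the explicit bind address is an assumption, not from the question): Express will accept external connections as long as it is not bound to 127.0.0.1 only, and inbound TCP 3000 is allowed both in the Azure network security group and in any OS firewall on the VM.

const express = require('express');
const app = express();

app.get('/', (req, res) => res.send('reachable from outside'));

// bind explicitly to all interfaces; reachability from the internet still
// depends on the Azure NSG rule and the VM firewall allowing TCP 3000
app.listen(3000, '0.0.0.0', () => {
  console.log("Server is running on 0.0.0.0:3000..");
});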

Moving ExpressJS API files onto Server

So I created a simple API using Express.js that connects to MongoDB to perform CRUD operations. Currently I am able to get it running on localhost by executing "npm nodemon" in the source folder, and it worked when I tested it with Postman. I am wondering how to deploy it on a server. The server runs Linux, and I have this line of code in my root file "server.js":
const port = process.env.PORT || 5000;
I think process.env.PORT needs to be changed in order to make it work on the server?
In addition, I did look into an AWS EC2 server, but it is so complicated that I was immediately overwhelmed. I am hoping someone can recommend, to a dummy like me, a simple and very specific way to have a server run my Express.js scripts. Thank you.
I'm assuming your question is "How do I deploy an Express app to a server?"
You can read some advanced topics on http://expressjs.com/, which cover best practices and other useful material. But the thing you want to look at now is "Things to do in your environment / setup".
The important parts are:
Keep your Express app running on port 5000.
Run your app in cluster mode.
Run your app behind a proxy server like Nginx.
You can check this nice guide (steps 3 and 4) on how to deploy your Express app to a Linux server with PM2 and Nginx.
So in the end, your Express app will run on port 5000 (or whatever port you desire), Nginx will run on port 80, and Nginx will forward every request to your Express app.
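As a concrete example of the cluster and port points, a PM2 ecosystem file along these lines (file and app names are illustrative) runs the API in cluster mode on port 5000, with Nginx on port 80 proxying to it:

// ecosystem.config.js — started with: pm2 start ecosystem.config.js
module.exports = {
  apps: [
    {
      name: 'express-api',   // illustrative name
      script: './server.js',
      instances: 'max',      // one worker per CPU core
      exec_mode: 'cluster',  // PM2 cluster mode
      env: {
        NODE_ENV: 'production',
        PORT: 5000,          // picked up by process.env.PORT || 5000
      },
    },
  ],
};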

Supertest returns 301 using express app. App only takes hosted server

Hi, I am using supertest to test my Node.js Express server application.
Here is what I am trying to achieve.
let request = require('supertest');
let app = require('./server.js');

request(app).get("/api").then(data => { /* do something here */ });
However, I am getting 301 Moved Permanently.
If I actually start my server on port 8008 and then change the test to
let request = require('supertest');
let app = require('./server.js');

let agent = request.agent('localhost:8008');
agent.get("/api").then(data => { /* do something here */ });
then I get the correct API responses that I expect.
Is there a way to make it work and get a 200 response by using request(app) instead of localhost:8008?
I will be running these tests as part of continuous integration, and I don't have full control of the testing environment, so I won't be able to run a test server just to have access to localhost.
Thanks.
I found out that it was an issue with the SSL enforcement I had configured in Express.
I conditionally turned off SSL enforcement in the testing environment, and it works as I expected!
Hope this helps anyone who has the same issue in the future :)
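A minimal sketch of that fix (the redirect middleware here is a stand-in, not the asker's actual code): only enforce the HTTPS redirect outside the test environment, so supertest's plain-HTTP requests against the app object return 200 instead of 301.

const express = require('express');
const app = express();

if (process.env.NODE_ENV !== 'test') {
  // redirect plain HTTP to HTTPS everywhere except under test
  app.use((req, res, next) => {
    if (req.secure || req.headers['x-forwarded-proto'] === 'https') return next();
    return res.redirect(301, 'https://' + req.headers.host + req.originalUrl);
  });
}

app.get('/api', (req, res) => res.json({ ok: true }));

module.exports = app; // request(app).get('/api') now resolves with a 200 in tests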

Azure Web Site starting my Hapi Node.js site with socket protocol

Whenever I deploy my Hapi.js web application to Azure, it starts the server using the socket protocol (see output below).
socket:\\.\pipe\b5c0af85-9393-4dcb-bd9a-3ba9b41ed6fb
GET /
GET /{param*}
GET /api/employees
POST /api/employees
GET /api/employees/{id}
PUT /api/employees/{id}
DELETE /api/employees/{id}
POST /api/worklog
GET /login
POST /login
Hapi server started # socket:\\.\pipe\b5c0af85-9393-4dcb-bd9a-3ba9b41ed6fb
150914/214730.270, [response], socket:\\.\pipe\b5c0af85-9393-4dcb-bd9a-3ba9b41ed6fb: get / {} 200 (316ms)
However, whenever I run this locally, it starts using http... I have not run into this issue using Express or LoopBack, only Hapi. Is there some sort of configuration that I am missing? This is the server.connection call:
var server = new Hapi.Server();
var host = process.env.host || '0.0.0.0';
var port = process.env.port || 3000;
server.connection({host: host, port: port});
The reason this is a big deal is that I cannot pass socket://<mydomain> to Google as a callback URI for OAuth.
You shouldn't need to pass socket://<domain> to Google; you'd pass the normal https://yourDomain.com, or even the https://yourSiteName.azurewebsites.net, to Google for the OAuth callback, and it should work as you would expect.
The fact that the Node application is listening on a pipe rather than a normal TCP socket is just an implementation detail of iisnode. Basically, the problem is that Node has its own web server, so you can't use it directly with other web servers like IIS, Apache, nginx, etc. iisnode bridges the gap between IIS and Node: it lets IIS listen on the machine's HTTP port 80, and when IIS gets a request on that port, it simply forwards it to the Node process that is listening on a named pipe. This allows you to manage your sites in IIS as you normally would on a Windows Server machine, while actually writing your app in Node.
You can think of it as two web servers running on the box: one (IIS) acts as a proxy for the other (Node), where all the work actually happens. The iisnode developer's choice of a named pipe instead of a normal TCP socket is odd (though kind of understandable, since you can't easily reserve a port per se as you can a pipe), but it's the way it is.
