Does setInterval in node.js start running when computer falls asleep? - javascript

I'm trying to fetch an API from my node.js server and update the stored data every 3 hours using setInterval. The code looks like this (I'm using MongoDB as the database):
mongoose.connect(
  constants.db,
  {
    useNewUrlParser: true,
    useUnifiedTopology: true,
    reconnectTries: 30000,
  },
  () => {
    app.listen(5000, () => {
      console.log("server is running");
      const time = 10800000; // for 3 hours
      setInterval(dataFetch, time);
      dataFetch();
    });
  }
);
The problem is that I want to use Heroku's free web-hosting plan, and as you know, in that case Heroku puts the server to sleep after an hour without requests, until it receives one. I want to make sure this code still does its job if the server sleeps.
Please guide me on how to handle this situation. Thank you :)
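Worth knowing: a setInterval callback does not fire while the process is suspended, and missed runs are not replayed on wake. On Heroku's free plan the dyno process is stopped outright and restarted on the next request, so the dataFetch() call after listen() runs again on wake-up anyway. A more general pattern, for when the process may be suspended rather than restarted, is to record the last fetch time and re-check it on every request. A minimal sketch reusing the question's dataFetch and app; the helper and the middleware are illustrative assumptions, not a definitive fix:
const THREE_HOURS = 3 * 60 * 60 * 1000;
let lastFetch = 0;

function dataFetchIfStale() {
  if (Date.now() - lastFetch >= THREE_HOURS) {
    lastFetch = Date.now();
    dataFetch(); // the question's fetch-and-store routine
  }
}

// keep the interval for stretches when the process stays awake...
setInterval(dataFetchIfStale, THREE_HOURS);

// ...and also run the check on every incoming request, so a wake-up
// after a sleep triggers a catch-up fetch
app.use((req, res, next) => {
  dataFetchIfStale();
  next();
});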

Related

mongodb connections and sessions not working as intended

For context, I am fairly new to MongoDB transactions. I was initially working with a standalone database, but I am using Atlas now so that I can test my transactions without affecting my main database.
Here is how I used to connect to the Atlas database:
export var client = mongoose
.connect(CONNECTION_URL, { useNewUrlParser: true, useUnifiedTopology: true })
.then(() =>
app.listen(PORT, () => console.log(`SERVER RUNNING ON PORT: ${PORT}`))
)
.catch(() => console.log("launch error, probably ip address on mongo db"));
However, I kept getting a
TypeError: client.startSession is not a function
error, so I changed my client to this:
export var client = await MongoClient.connect(CONNECTION_URL, { useNewUrlParser: true, useUnifiedTopology: true })
app.listen(PORT, () => console.log(`SERVER RUNNING ON PORT: ${PORT}`));
which is incorrect in a sense, but it seems to get my session to start. However, it fails all my model queries (the error is: usermodels.findOne() buffering timed out after 10000ms), and it only works as intended when I use my client to get a collection instead of using the model directly:
const Collection = client.db("myFirstDatabase").collection("usermodels");
and then running
const user = await Collection.find({});
instead of
const user = await User.find({}); (where User is just the model I import)
I can't seem to understand why I am unable to use my models as I used to, and why I have to call the collections each time I attempt to run a query on my DB. Why isn't it possible to run queries with models and with collections at the same time?
I tried going through the official and unofficial documentation about connections/sessions/transactions/queries, and it's been days since I started reading, but I feel like I am on the wrong track or missing something obvious. Any help or pointers would be appreciated. I tried looking into previous SO questions, but most of them are either unanswered or missing crucial context!
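For reference, here is a sketch of how a session is usually obtained when connecting through mongoose, so that the models and the transaction share one underlying connection. This assumes mongoose 5.2+ on Atlas (transactions require a replica set) and is illustrative, not a confirmed fix for this exact setup:
await mongoose.connect(CONNECTION_URL, { useNewUrlParser: true, useUnifiedTopology: true });
app.listen(PORT, () => console.log(`SERVER RUNNING ON PORT: ${PORT}`));

// sessions come from mongoose itself, not from the promise returned by connect()
const session = await mongoose.startSession();
session.startTransaction();
try {
  // User is the mongoose model from the question
  const user = await User.findOne({}).session(session);
  await session.commitTransaction();
} catch (err) {
  await session.abortTransaction();
  throw err;
} finally {
  session.endSession();
}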

Use express as proxy for websocket

I have a data provider which gives me stock prices via TCP connection. The data provider only allows a static IP to connect to their service.
But since I need to format the data before sending it to my front-end, I want to use my express back-end as a proxy.
What that means is:
I need to connect my back-end to my data provider via websocket (socket.io) in order to get the data (back-end acts as client).
I need my back-end to broadcast this received data to my front-end (back-end acts as server).
My question is: Is this possible at all? Is there an easier way to achieve it? Is there documentation on how to use an express app as a websocket server and client at once?
EDIT:
I got this working now. But my current solution kills my AWS EC2 instance because of huge CPU usage. This is how I've implemented it:
const net = require('net');
const app = require('express')();
const httpServer = require('http').createServer(app);
const client = new net.Socket();
const options = {
  cors: {
    origin: 'http://someorigin.org',
  },
};
const io = require('socket.io')(httpServer, options);

client.connect(1337, 'some.ip', () => {
  console.info('Connected to some.ip');
});

client.on('data', async (data) => {
  // parse data
  const parsedData = {
    identifier: data.identifier,
    someData: data.someData,
  };
  // broadcast data
  io.emit('randomEmitString', parsedData);
});

client.on('close', () => {
  console.info('Connection closed');
});

httpServer.listen(8081);
Does anyone have an idea why this causes a huge CPU load? I've tried to profile my code with clinicjs, but I couldn't find an apparent problem.
EDIT 2: To be more specific, my data provider provides me with stock quotes, so every time a quote changes I get new data. I then parse this data and emit it via io.emit. Could this be some kind of bottleneck?
This is the profile I get after running clinicjs: (profile screenshot omitted)
I don't know how many resources you have on your AWS instance, but 1,000 clients shouldn't be a problem.
I have personally encountered two bottlenecks:
Clients connected with Ajax, not WS (this used to be a common problem with old socket.io).
The socket.io libraries were served by Node, not Nginx/Apache; Node is poor at keep-alive management.
Check also:
How often do you get data from some.ip? A good idea is to aggregate and filter it.
Do you need to notify all customers of everything, or is it enough to inform just the interested ones (a live zone)?
Maybe it is worth moving the serving to serviceWorker.js or Push Events?
As part of the debugging process, log events: receiving data, clients connecting and disconnecting. Then observe the server logs.
Or maybe this code is not responsible for the problems, but rather the data download for the first view. Do you have the data in a buffer, or do you read it on GET index.html?
To understand what was going on in your situation, I created an elementary TCP server that publishes a JSON message every 1 ms to each client that connects. Here is the code for the server:
var net = require('net');
var server = net.createServer(function (socket) {
  socket.pipe(socket); // echo anything the client sends back to it
});
server.maxConnections = 10

server.on('close', () => console.log('server closed'))
server.on('error', (err) => console.error(err))
server.on('listening', () => console.log('server is listening'))

var msgId = 0 // simple counter; a Timeout object itself cannot be JSON.stringified
server.on('connection', (socket) => {
  console.log('- client connected')
  socket.setEncoding('utf8')
  // push a small JSON message to this client every 1 ms
  var intervalId = setInterval(() => socket.readyState === "open" &&
    socket.write(JSON.stringify({
      id: ++msgId,
      timestamp: Date.now(),
    }) + '\n'), 1)
  socket.on('error', (err) => console.error(err))
  socket.on('close', () => {
    clearInterval(intervalId)
    console.log('- client closed the connection')
  })
})
server.listen(1337, '0.0.0.0');
As you can see, we set up a setInterval that writes a simple JSON message to each connected client every 1 ms.
For the client, I used something very similar to what you have. At first, I tried pushing every message received from the server to the browser over the WebSocket connection. In my case, this also pushed the CPU to 100%; I don't know exactly why.
Nonetheless, even though your data is updated every 1 ms, it is doubtful that you need to refresh your webpage at that rate. Most websites work at 60 fps, which would mean updating the data every 16 ms. So a straightforward solution is to batch the data and send it to the browser every 16 ms. This modification alone greatly improves performance. You can go even further by extending the batch interval or filtering some of the data before sending it.
Here is the code for the client, taking advantage of batched messages. Bear in mind that this is a very naive implementation, made to show the idea. A better approach would be to process the streams with a library like RxJS.
// tcp-client.js
const express = require('express');
const http = require('http');
const { Server } = require("socket.io");
const net = require('net')

const app = express();
const server = http.createServer(app);
const io = new Server(server);
const client = new net.Socket()

app.get('/', (req, res) => {
  res.setHeader('content-type', 'text/html')
  res.send(`
    <!doctype html>
    <html lang="en">
      <head>
        <meta charset="utf-8">
        <title>TCP - Client</title>
      </head>
      <body>
        <script src="/socket.io/socket.io.js"></script>
        <script>
          var socket = io();
          socket.on('msg', (msg) => document.body.textContent = msg);
        </script>
      </body>
    </html>
  `);
});

io.on('connection', (socket) => {
  console.log('- user connected');
  socket.on('disconnect', () => {
    console.log('- user disconnected');
  });
});

// collect incoming TCP messages and flush them to the browser every 16 ms
var buffer = []
setInterval(() => {
  io.emit("msg", JSON.stringify(buffer))
  buffer = []
}, 16)

client.connect(1337, '127.0.0.1', function () {
  console.log('- connected to server');
});

client.on('data', function (data) {
  buffer.push(data.toString("utf8"))
});

client.on('close', function () {
  console.log('- connection to server closed');
});

server.listen(3000, () => {
  console.log('listening on 0.0.0.0:3000');
});

Front and back end timer/event syncing?

I have an API endpoint that accepts some data from the client. There is also a 1-minute timer which is visible to the client.
What I hope to achieve is this:
While the timer is active (> 0), any posts sent to the API are kept in storage or in an array (or something). Once the timer reaches zero, the client can no longer make requests to the API, and any requests that were made and stored while the timer was active are processed through a function. For the sake of example, let's just say this function logs all the data to the screen.
Perhaps I'm thinking of this in the wrong way, but how do I sync a front-end and back-end timer so that both the server and the client side know when to stop processing POST requests, and so the server knows when it's time to process all the data that was sent during that one minute?
var express = require("express");
var app = express();

app.post("/api/data", function (req, res) {
  // do something here - no clue
});

app.listen(process.env.PORT, () => {
  console.log("server running on port: ", process.env.PORT);
});
Apologies if I've explained this poorly.
Appreciate any help I can get, thank you :)
What do you want to achieve? Letting the client decide is the worst choice you can make.
Maybe pass the server time to the client, calculate the difference, and then start counting. But check on the server whether the client is actually allowed to post.
Or let the server calculate how many seconds are left until the countdown ends and pass these to the client. But the server still needs to check that the time is valid.
I would pick one of those. Let the server be the deciding factor and don't depend on the client, since PC time can easily be changed.
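As an illustration of letting the server be the deciding factor, here is a minimal sketch with a fixed one-minute window. The endpoint names, the in-memory submissions array, and starting the window at boot are assumptions made for the example:
var express = require("express");
var app = express();
app.use(express.json());

// the server owns the deadline; clients only display it
var deadline = Date.now() + 60 * 1000;
var submissions = [];

// the client asks how much time is left instead of trusting its own clock
app.get("/api/time-left", function (req, res) {
  res.json({ msLeft: Math.max(0, deadline - Date.now()) });
});

app.post("/api/data", function (req, res) {
  if (Date.now() >= deadline) {
    return res.status(410).send("window closed");
  }
  submissions.push(req.body);
  res.sendStatus(202);
});

// process everything once, when the window actually closes
setTimeout(function () {
  submissions.forEach(function (data) {
    console.log(data);
  });
}, deadline - Date.now());

app.listen(process.env.PORT);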
Alternatively: when the timer on the client side reaches 0, send an API call to the server. When the server gets this call, it sets a flag, and while that flag is set, incoming data is ignored. What I mean in code is something like this:
var ignore = false;

app.post("/api/data/stop", function (req, res) {
  ignore = true;
});

app.post("/api/data", function (req, res) {
  if (!ignore) {
    // listening to data
  }
});

app.listen(process.env.PORT, () => {
  console.log("server running on port: ", process.env.PORT);
});

Node JS mqtt client doesn't receive subscribed messages when broker goes down and comes up

I have created an MQTT Node.js client. My connection options are as follows:
mqttOptions = {
  clientId: '100',
  keepAlive: 1000,
  clean: false,
  reconnectPeriod: '1000',
  will: willMessage
};
I disconnected the broker and brought it up again while the client was still running. The client had logic to publish every 1 second. Though the client was publishing again after the reconnect, it was not receiving messages, even though it was subscribed to its own message topic. Since I set the clean option to false, shouldn't it re-subscribe to the topics on reconnect and start receiving them?
Below is how I'm establishing the connection.
this.client = mqtt.connect(url, mqttOptions);
and below is how I'm subscribing.
this.client.subscribe(topic);
What am I doing wrong here? Please advise.
We faced this issue with EMQ as the broker, using the mqtt library for Node.js. With mosquitto as the broker, the client reconnects and gets all the messages it had subscribed to, but if it subscribes again it gets n copies of the same message. Per the library documentation, it is recommended to check connack and connack.sessionPresent before re-subscribing.
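For reference, a sketch of that connack check with the Node.js mqtt library (the topic name is a placeholder):
client.on('connect', (connack) => {
  // if the broker restored our session, the old subscriptions still exist
  if (!connack.sessionPresent) {
    client.subscribe('my/topic');
  }
});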
We subscribed to all of the client's events and found that offline is the one that fires when the broker goes down; then reconnect and close fire repeatedly until the broker is back up. Hence, here is how we did it: on offline, end the client forcefully, and on completion of end, create a new client from the same function that created it in the first place:
doConnect() {
  this.client = mqtt.connect('mqtt://myhost', this.myOptionsIfAny);
  this.client.on('connect', () => {
    this.client.subscribe('mytopics');
  });
  this.client.on('message', (topic, message) => {
    // do processing
  });
  this.client.on('offline', () => {
    // broker went down: end this client forcefully and build a fresh one
    this.client.end(true, () => {
      this.doConnect();
    });
  });
}
Regarding clean: 'false', should 'false' definitely be a string? I presume it should be a boolean.

nodeJS video streaming, connection not closing when user disconnects or moves to a different page

I am building a video streaming server in Node.js using express, and I am using the request-progress module to get the progress status.
It (the video streaming) is working fine.
But the problem is that even after I close the browser or move to the next page, the server keeps streaming data; I can see that in the console.
Here is the code for it:
app.route('/media/*').get(function (req, res) {
  var originalUrl = "http://myactualserver.com";
  var resourceUrl = originalUrl.split('media');
  var requestURL = BASE_URL + resourceUrl[1];

  req.on('close', function () {
    console.log('Client closed the connection');
  });

  var options = {};

  progress(request(requestURL), {
    throttle: 1000,
    delay: 0,
    lengthHeader: 'content-length',
  })
    .on('progress', function (state) {
      console.log('progress', state);
    })
    .on('error', function (err) {
      console.log('err');
    })
    .on('end', function () {
      console.log('end');
    })
    .pipe(res);
});
I went through the following posts and finally added the
req.on("close", function(){});
handler:
Video Streaming with nodejs and Express
node.js http server, detect when clients disconnect
Node.js server keeps streaming data even after client disconnected
My questions are:
1. How do I trigger either the "error" or "end" handler of "request-progress"?
2. How do I close the stream?
3. Why is console.log('Client closed the connection') called twice?
Answers to your questions:
1. You can probably use emit from events.EventEmitter; refer to this example:
https://github.com/IndigoUnited/node-request-progress/blob/master/test/test.js
Alternatively, you can use the last progress state in the close handler.
2. Not sure what the question is here; the progress will continue until the socket is closed.
3. If you are making the request from the browser (default player), it makes two requests.
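To make the second point concrete: one common way to stop the upstream transfer is to keep a reference to the request and abort it when the client disconnects. A sketch based on the question's setup (request and request-progress as above); the abort-on-close wiring is a suggested pattern, not something the module does for you:
app.route('/media/*').get(function (req, res) {
  var requestURL = BASE_URL + "/some/resource";

  // keep a handle on the upstream request so it can be cancelled
  var upstream = request(requestURL);

  req.on('close', function () {
    console.log('Client closed the connection');
    upstream.abort(); // stops the upstream download
  });

  progress(upstream, { throttle: 1000 })
    .on('progress', function (state) { console.log('progress', state); })
    .on('error', function (err) { console.log('err', err); })
    .on('end', function () { console.log('end'); })
    .pipe(res);
});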
