How to use setInterval (with clearInterval) to send requests using Bluebird Promises? - javascript

I'm using node-request to send requests to a server to get a report. The thing is, the server needs some time to generate the report, so it responds with the report state. I'm checking the report state with the setInterval() function and calling clearInterval() when the server sends a ready response. But with this approach, even after I call clearInterval, responses to earlier requests keep coming in, and the response handler runs again and again. This doesn't cause much harm, but I believe it can be done better.
Here is my code:
checkForReportReady = setInterval =>
  #request URL, options, (err, res, body) =>
    console.log err if err
    body = JSON.parse body
    if body['status'] is 'ready'
      clearInterval checkForReportReady
      #processReport body
, 1000
What I need: make a request, wait for the response, check the status; if the status is not ready, make another request after some timeout, repeating until the status in the response is ready. If the status is ready, exit the loop (or clear the interval) and run #processReport.
I tried making a promisified request and putting it into setInterval, but the result was the same.
P.S. I do not control the server, so I can't change the way it responds or deals with the report.

I would recommend not putting requests in an interval callback. This gets ugly when they a) fail or b) take longer than the interval.
Instead, put a setTimeout in the success handler and try again after (and only if) receiving a response.
This is rather easy with promises:
# promisify with multiArgs so .spread receives (res, body)
request = Promise.promisify require('request'), multiArgs: true

getReport = () =>
  request URL, options
  .spread (res, body) =>
    body = JSON.parse body
    if body.status is 'ready'
      body
    else
      Promise.delay 1000
      .then getReport # try again

getReport().then(#processReport, (err) -> console.log(err))

It seems like you can just use a setTimeout() in your response handler:
function checkForReportReady() {
    request(URL, options, function(err, res, body) {
        if (err) {
            console.log(err);
        } else {
            if (body.status === "ready") {
                processReport(body);
                // do any other processing here on the result
            } else {
                // try again in 20 seconds
                setTimeout(checkForReportReady, 20 * 1000);
            }
        }
    });
}
This will run one request, wait for the response, check the response, then if it's ready it will process it and if it's not ready, it will wait a period of time and then start another request. It will never have more than one request in-flight at the same time.
If you want to use Bluebird promises, you can do that too, though in this case it doesn't reduce the complexity much:
// promisify with multiArgs so .spread() receives (res, body)
var request = Promise.promisify(require('request'), {multiArgs: true});

function checkForReportReady() {
    return request(URL, options).spread(function(res, body) {
        if (body.status === "ready") {
            return body;
        } else {
            // try again in 20 seconds
            return Promise.delay(20 * 1000).then(checkForReportReady);
        }
    });
}

checkForReportReady().then(function(body) {
    processReport(body);
}, function(err) {
    // error here
});
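One caveat with a loop like the one above: if the server never reports ready, it polls forever. A sketch of the same pattern with a retry cap; `pollUntilReady` and `fetchStatus` are illustrative names, with `fetchStatus` standing in for the promisified request call that resolves with the parsed body:

```javascript
// Poll until `fetchStatus` resolves with { status: "ready" }, but give up
// after `maxAttempts` tries. `fetchStatus` is a placeholder for whatever
// returns a promise of the parsed response body.
function pollUntilReady(fetchStatus, maxAttempts, delayMs) {
  function attempt(remaining) {
    return fetchStatus().then(function (body) {
      if (body.status === "ready") {
        return body;
      }
      if (remaining <= 1) {
        throw new Error("Report not ready after " + maxAttempts + " attempts");
      }
      // wait, then recurse; the promise chain stays flat
      return new Promise(function (resolve) {
        setTimeout(resolve, delayMs);
      }).then(function () {
        return attempt(remaining - 1);
      });
    });
  }
  return attempt(maxAttempts);
}
```

The caller then gets a rejection it can handle instead of a request loop that silently runs forever.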

Related

Node.js - Fastify: Connection closed during sleep (setTimeout)

I am having a problem with a Node.js server using Fastify.
At some point during the execution of a request, the server seems to close the connection, and the client gets a socket hang up error.
The logic in the server is:
The Fastify handler calls a service.
The service sends an HTTP request using Axios to fetch certain information. The service implements a retry mechanism, and before every attempt it waits 15 seconds to make sure the information is available.
The code is as follows:
Fastify server:
fastify.post('/request', async (request, reply) => {
    try {
        const result = await service.performOperation(request.body);
        return result;
    } catch (error) {
        console.error('Error during operation: %s', error.toString());
        throw error;
    }
})

fastify.addHook('onError', (request, reply, error, done) => {
    console.error('onError hook: %o', error);
    done();
})
Service:
async function performOperation(request) {
    let attempt = 0;
    let latestErrorMessage;
    while (attempt++ < 5) {
        try {
            await waitBeforeAttempt();
            return await getInfoFromServer(request);
        } catch (error) {
            latestErrorMessage = getErrorMessage(error);
            if (attempt < 5) {
                console.log(`Re-attempting after error: ${latestErrorMessage}`);
            }
        }
    }
    throw new Error(`Error after 5 attempts. Last error: ${latestErrorMessage}`);
}

function waitBeforeAttempt() {
    return new Promise(resolve => setTimeout(resolve, 15000));
}

async function getInfoFromServer(request) {
    const response = await axios.post('http://localhost:3000/service', request, {timeout: 120000});
    return response.data.toString();
}
The problem is that the server seems to be closing the connection.
According to the logs, this happens after the 15-second wait and before the Axios call, i.e. before the first attempt even finishes.
You can see in the logs that after the connection is closed, the logic carries on and completes all the attempts without any problem.
There is nothing in the logs about why the connection was closed, not even from the Fastify onError hook declared above.
Nothing from Axios either; I assume any timeout would have thrown an exception and been logged.
Important note
I noticed that connections are not dropped if I change the waitBeforeAttempt implementation to a busy wait instead of setTimeout, i.e.:
function waitBeforeAttempt() {
    const start = new Date();
    let now;
    while (true) {
        now = new Date();
        if (now - start >= 15000) {
            break;
        }
    }
}
Is there anything I'm doing wrong that is causing the connections to be dropped? Perhaps the 15-second wait is too long? I have other setTimeout calls in the code via Puppeteer (same implementation as mine) that don't seem to cause the problem.
Just answering my own question: the problem turned out to be unrelated to the wait or the timeouts.
It wasn't happening when the Node.js service ran locally, only intermittently when running on Kubernetes + Nginx.
Nginx was restarting without any apparent reason.
Nginx has since been updated and the issue no longer shows up.

Wrapping node.js request into promise and piping

Here is my scenario:
I want to fetch an external resource (a binary file) using the request library and pipe it to the client of my own application. If the response code is != 200, or there are problems reaching the remote server, I want to intercept that and provide a custom error message instead. Ideally, if the response is fine, I want the original headers to be preserved.
I was able to achieve this with the first piece of code pasted below. However, my whole application is based on the Promise API, so I wanted to be consistent and wrap this in a promise too. When I do that, it no longer works. First I tried request-promise, without success; then I prepared a very simple example on my own, still no luck.
Working example
var r = request.get('http://foo.bar');
r.on('response', result => {
    if (result.statusCode === 200) {
        r.pipe(res);
    } else {
        res.status(500).send('custom error message');
    }
});
r.on('error', error => res.status(500).send('custom error message'));
Not working example
var r;
return new Promise((resolve, reject) => {
    r = request.get('http://foo.bar');
    r.on('response', result => {
        if (result.statusCode === 200) {
            resolve();
        } else {
            reject();
        }
    });
    r.on('error', reject);
}).then(() => {
    r.pipe(res);
}).catch(() => {
    res.status(500).json('custom error message');
});
By 'not working' I mean: no response is delivered, and the request stays pending until it times out.
I've also tried calling .pipe() on the result passed to resolve instead of on r; that responds to the client, but the response is empty.
Finally, I tried replacing the request lib with plain http.get(). With that, the server returns the file to the client, but headers (like Content-Type or Content-Length) are missing.
I've googled a lot and tried several request versions, and nothing works.
The problem is that when "response" is triggered, the promise resolves immediately, but the then callback always executes asynchronously. By the time it runs, the file data has already streamed through (and past) the response stream, so there is no data flowing through it anymore. Instead, you could just use the body parameter of the callback:
request.get('http://foo.bar', function(error, response, body) {
    if (!error && response.statusCode === 200) {
        res.end(body);
    } else {
        res.status(500).end();
    }
});
For working with streams, request seems a bit buggy; axios handles this better:
axios.get("http://foo.bar", {
    validateStatus: status => status === 200,
    responseType: "stream"
}).then(({data: stream}) => {
    stream.pipe(res);
}).catch(error => {
    res.status(500).json(error);
});

Node - send response to browser when loop has finished

I am making a POST request in one of my routes, add-users. I have created an array called success. In each loop iteration, provided the API POST succeeds, I add the string 'user added' to the array. Once the loop has completed I want to send the success array in the response to the browser.
I have noticed something strange: when I type the start of the add-users URL into the address bar, the loop runs before I even hit enter to navigate to the page. This seems strange. Is Node listening and predicting which URL I am going to hit?
Here is my current attempt, but it's not working for some reason.
app.get('/add-users', function (req, res) {
    var success = [];
    var count = 0;
    users.forEach(function(user, i) {
        request({
            url: url,
            method: 'POST',
            json: true
        }, function(err, resp, body) {
            if (!err && resp.statusCode === 200) {
                success.push('user added');
            }
        });
        if (count === users.length) {
            res.json(success);
        }
    });
});
Regarding the browser fetching the URL before you hit enter: modern browsers sometimes prefetch or prerender URLs typed into the address bar, which would trigger your route early, so this is not as unusual as it looks.
Regarding your code: count is not incremented anywhere in the forEach loop, so it remains 0 forever and never equals users.length. The loop ends, but a response is never sent.
You are also testing the equality of count and users.length in the wrong place: it must happen inside the request callback, after each response arrives, not in the loop body.
This code should work:
app.get('/add-users', function (req, res) {
    var success = [];
    var count = 0;
    users.forEach(function(user) {
        request({
            url: url,
            method: 'POST',
            json: true
        }, function(err, resp, body) {
            if (!err && resp.statusCode === 200) {
                success.push('user added');
            }
            count++; // <<=== increment count
            if (count === users.length) { // and then test if all done
                res.json(success);
            }
        });
    });
});
The problem here is that you are mixing synchronous and asynchronous code in the wrong way. Note that forEach is synchronous and request is asynchronous, so the loop over users finishes before the first result ever comes back from the request method.
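Since forEach gives you nothing to wait on, another option besides the counter is to turn each request into a promise and respond once all of them have settled via Promise.all. A sketch with illustrative names; `postUser` stands in for a promisified request() call that resolves on a 200 response and rejects otherwise:

```javascript
// Map every user to a promise, then respond once all of them settle.
// `postUser(user)` is a stand-in for a promisified request() call.
function addUsers(users, postUser) {
  return Promise.all(users.map(function (user) {
    return postUser(user)
      .then(function () { return "user added"; })
      .catch(function () { return "user failed"; }); // keep going on errors
  }));
}

// In the route handler:
// addUsers(users, postUser).then(function (success) { res.json(success); });
```

The per-user catch keeps one failed request from rejecting the whole Promise.all and losing the other results.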
#SantanuBiswas has one way to solve the problem of the response returning before your requests are all complete, though depending on how many users are in your array and how slow the upstream service is, this can make for a poor user experience: it waits until all the requests are complete and only then fires back a response.
A better solution (in my opinion) would be to respond immediately with a 202 Accepted status code, and then update your DB with information about each user's status in the request handler for later reporting or debugging. Something like this (assuming you're using mongoose for your local user storage):
app.get('/add-users', function (req, res) {
    res.status(202).send(); // you can put more info in the body if desired
    users.forEach(function(user) {
        request({
            url: url,
            method: 'POST',
            json: true
        }, function(err, resp, body) {
            // you might want to put more info somewhere, e.g. an error log,
            // if there is an error
            const status = err ? 'error' : 'complete';
            User.update({_id: user.id}, {status: status}, function(e) {
                if (e) console.error(e);
            });
        });
    });
});
Still another way, though it adds complexity to your solution, is to implement websockets to have your client get updated each time a request is complete. That example is a bit longer than I have time to post here, but the docs on the site I linked are excellent.

Send response and continue to perform tasks Express | Node.js

In Node.js (which I'm new to) I am trying to perform a series of tasks after a request comes in, but I want to make the response time as fast as possible. I don't need to return the results of these tasks to the client, so I'm trying to send the response immediately.
My current implementation is roughly:
var requestTime = Date.now;

app.post('/messages', function (req, res) {
    console.log("received request");
    // handle the response
    var body = req.body;
    res.send('Success');
    res.end();
    console.log("sent response");
    performComplexTasks(body);
})

function performComplexTasks(body) {
    // perform work with body data here;
    console.log("finished tasks:", Date.now() - requestTime, "ms");
}
// -------LOG-----------
// received request
// POST /api/messages 200 3.685 ms - 59
// sent response
// finished tasks: 2500ms
The client making the request seems to hang until performComplexTasks() is finished. (The POST finishes in 3.685ms, but the response takes 2500ms to finish.)
Is there a way to send the response immediately and complete other tasks without having the client wait/hang? (In my case, the client cannot make multiple API calls.)
If your job is not super CPU-intense and you can tolerate some work on the main server thread, you can break execution so that the response gets sent first. You can do this with either setTimeout or await.
// This line is wrong too - ask a separate question if needed
var requestTime = Date.now;

app.post('/messages', async function (req, res) {
    console.log("received request");
    // handle the response
    var body = req.body;
    res.status(200).send({ success: true });
    console.log("sent response");
    // Method 1:
    await performComplexTasks(body);
    // Method 2 (alternative - use one or the other):
    // setTimeout(() => performComplexTasks(body), 0);
})

async function performComplexTasks(body) {
    // The line below is required if using the `await` method -
    // it breaks execution and allows the request handler to
    // continue; otherwise we would only resume it after this
    // function completed.
    await new Promise(resolve => setTimeout(resolve, 1));
    // perform work with body data here;
    console.log("finished tasks:", Date.now() - requestTime, "ms");
}
This isn't really a fantastic solution and you'd need to use worker threads for long operations.
Am I right that you're executing a CPU-intensive job in performComplexTasks? If so, the event loop is blocked by that task, and new requests wait until the job is finished.
It's bad practice in Node.js to execute such 'complex' tasks in the same process as the HTTP server. Consider using background workers, queues, or something similar.
See this topic for details: Node.js and CPU intensive requests

How to put a wait in between the execution of two functions in nodejs

Here is my code. In the wpt.runTest function I am hitting some URL and getting a URL back in the response; I then use this URL in my request function, but I need to wait some time first, because it takes a while before the data is available at the URL used by the request module.
wpt.runTest('http://some url ', function(err, data) {
    //console.log("hello -->", err || data);
    data_url = data.data.summaryCSV;
    console.log(data_url);
    console.log('-----------');
    request({uri: data_url, method: 'GET'}, function (error, response, body) {
        console.log('----######----');
        console.log(response.headers);
        console.log('----######----');
        if (error) {
            console.log('got an error' + error);
        }
        var data = body;
        console.log('here is the body' + body);
    });
});
What I want is a time gap or pause before my request function executes. How can I achieve this?
The above problem can be solved by:
using the setTimeout function
using a cron job
Using setTimeout
wpt.runTest('http://some url ', function(err, data) {
    //console.log("hello -->", err || data);
    data_url = data.data.summaryCSV;
    console.log(data_url);
    console.log('-----------');
    var delay = 5000;
    setTimeout(function() {
        request({uri: data_url, method: 'GET'}, function (error, response, body) {
            // handle error and response
        });
    }, delay);
});
Using cron
This is really helpful if you have extended cases of this particular problem, e.g. you need to keep checking the resource at various times, say every hour or every day. If you expect such cases, cron is worth considering.
Steps of execution with cron:
Register a cron job using a cron pattern.
Start the job.
When the pattern matches the current time, your supplied function is executed. Put your implementation (requesting the resource with the request library) in that function.
Helper libraries for cron in nodejs are Lib1, Lib2.
Examples lib1, lib2
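If pulling in a cron library feels heavy, the recurring-check idea in the steps above can also be sketched with a self-rescheduling setTimeout, which, unlike a plain setInterval, never starts the next run before the current one has finished. Names here are illustrative:

```javascript
// Run `task` every `intervalMs`, scheduling the next run only after the
// current run finishes; returns a handle with stop().
function schedule(task, intervalMs) {
  let timer = null;
  let stopped = false;
  function tick() {
    Promise.resolve()
      .then(task) // task may be sync or return a promise
      .catch(function (err) { console.error("task failed:", err); })
      .then(function () {
        if (!stopped) timer = setTimeout(tick, intervalMs);
      });
  }
  timer = setTimeout(tick, intervalMs);
  return {
    stop: function () {
      stopped = true;
      clearTimeout(timer);
    }
  };
}
```

Because the next timer is only set after the task settles, slow requests stretch the cycle instead of piling up, which is the same overlapping-callback problem the first question in this page ran into with setInterval.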
