I am making a POST request in one of my routes, add-users. I have created an array called success. On each loop iteration, provided the API POST runs successfully, I add the string 'user added' to the array. Once all the requests have completed I want to send the success array to the browser as the response.
I have noticed something strange: when I type the start of add-users into the URL bar, the loop runs before I even hit Enter to navigate to the page. This seems strange. Is Node listening and predicting which URL I am going to hit?
Here is my current attempt, but it's not working for some reason.
app.get('/add-users', function (req, res) {
    var success = [];
    var count = 0;
    users.forEach(function (user, i) {
        request({
            url: url,
            method: 'POST',
            json: true
        }, function (err, resp, body) {
            if (!err && resp.statusCode === 200) {
                success.push('user added');
            }
        });
        if (count === users.length) {
            res.json(success);
        }
    });
});
Regarding the browser fetching the response before you hit Enter: that is most likely your browser's URL prediction, not malware. Chrome, for example, can prefetch a likely URL while you are still typing it in the address bar, and because your route does real work on a GET request, the prefetch triggers it. This is one reason mutating operations should not be exposed as GET endpoints.
Regarding your code: count is not incremented anywhere in the forEach loop, so it stays 0 forever and never equals users.length. The loop will finish, but a response will never be sent.
Also, you are testing for equality between count and users.length in the wrong place: the check runs synchronously during the loop instead of inside the request callback.
This code should work:
app.get('/add-users', function (req, res) {
    var success = [];
    var count = 0;
    users.forEach(function (user) {
        request({
            url: url,
            method: 'POST',
            json: true
        }, function (err, resp, body) {
            if (!err && resp.statusCode === 200) {
                success.push('user added');
            }
            count++; // <<=== increment count
            if (count === users.length) { // and then test if all done
                res.json(success);
            }
        });
    });
});
The problem here is that you are mixing synchronous and asynchronous code in the wrong way. Note that forEach is synchronous and request is asynchronous, so looping over users finishes before the first result comes back from the request method.
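As an alternative to the manual counter, the same "wait until every request has finished" idea can be sketched with Promise.all. Here postUser is a hypothetical stand-in for a promisified version of the request() call, not part of the original code:

```javascript
// Sketch only: postUser(user) is assumed to return a promise that
// resolves on a 200 response and rejects otherwise.
function addUsers(users, postUser) {
  return Promise.all(users.map(function (user) {
    return postUser(user)
      .then(function () { return 'user added'; })
      .catch(function () { return null; }); // a failed request adds nothing
  })).then(function (results) {
    return results.filter(Boolean); // keep only the successes
  });
}
```

The route handler would then end with something like `addUsers(users, postUser).then(function (success) { res.json(success); });`, with no counter to maintain.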
@SantanuBiswas shows one way to keep the response from returning before your requests are all complete. However, depending on how many users are in your array and how slow the upstream service is, this can be a disastrous user experience: the client waits until every request has completed before any response is fired back.
A better solution (in my opinion) would be to respond immediately with a 202 Accepted status code, and then update your DB with information about each user's status in the request handler for later reporting or debugging. Something like this (assuming you're using mongoose for your local user storage):
app.get('/add-users', function (req, res) {
    res.status(202).send(); // you can put more info in the body if desired
    users.forEach(function (user) {
        request({
            url: url,
            method: 'POST',
            json: true
        }, function (err, resp, body) {
            const status = err ? 'error' : 'complete'; // you might want to put more info somewhere, e.g. an error log, if there is an error
            User.update({_id: user.id}, {status: status}, function (e) {
                if (e) console.error(e);
            });
        });
    });
});
Still another way, though it adds complexity to your solution, is to implement websockets to have your client get updated each time a request is complete. That example is a bit longer than I have time to post here, but the docs on the site I linked are excellent.
I am running a cron job every 5 minutes to get data from a 3rd-party API; there can be N requests at a time from the NodeJS application. Below are the details and code samples:
1> Running cron Job every 5 mins:
const cron = require('node-cron');
const request = require('request');
const otherServices = require('./services/otherServices');

cron.schedule("0 */5 * * * *", function () {
    initiateScheduler();
});
2> Get the list of elements for which I want to initiate the request. It can receive N elements. I call the request function (getSingleElementUpdate()) in the forEach loop:
var initiateScheduler = function () {
    // Database call to get elements list
    otherServices.moduleName()
        .then((arrayList) => {
            arrayList.forEach(function (singleElement, index) {
                getSingleElementUpdate(singleElement, 1);
            }, this);
        })
        .catch((err) => {
            console.log(err);
        });
};
3> Start initiating the request for singleElement. Please note I don't need any callback if I receive a successful (200) response from the request. I just have to update my database entries on success.
var getSingleElementUpdate = function (singleElement, count) {
    var bodyReq = {
        "id": singleElement.elem_id
    };
    var options = {
        method: 'POST',
        url: 'http://example.url.com',
        body: bodyReq,
        dataType: 'json',
        json: true,
        crossDomain: true
    };
    request(options, function (error, response, body) {
        if (error) {
            if (count < 3) {
                count = count + 1;
                getSingleElementUpdate(singleElement, count); // retry this element, not the whole scheduler
            }
        } else {
            // Request Success
            // In this: No callback required
            // Just need to update database entries on successful response
        }
    });
};
I have already checked this:
request-promise: But I don't need any callback after a successful request, so I didn't see any advantage to using it in my code. Let me know if you see a positive reason to add it.
I need your help with the following:
I have checked performance when receiving 10 elements in arrayList in step 2. The problem is that I have no clear picture of what will happen when I start receiving 100 or 1000 elements in step 2. Do I need to update my code for that scenario, or is there anything I have missed that would degrade performance? Also, how many requests can I make at a time at maximum? Any help is appreciated.
Thanks!
AFAIK there is no hard limit on the number of requests. However, there are (at least) two things to consider: your hardware limits (memory/CPU) and remote-server latency (can it respond to all requests within 5 minutes, before the next batch starts). Without knowing the context it's also impossible to predict what scaling mechanism you might need.
The question is actually more about app architecture than about a specific piece of code, so you might want to try Software Engineering Stack Exchange instead of SO.
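That said, one common scaling mechanism when the batch grows to hundreds or thousands of elements is to cap how many requests are in flight at once, instead of firing them all from forEach. A minimal, dependency-free sketch of such a limiter (the async library's eachLimit does the same job; the function names here are my own):

```javascript
// Run promise-returning tasks with at most `limit` in flight at a time.
function runLimited(tasks, limit) {
  var i = 0;
  var results = [];
  function next() {
    if (i >= tasks.length) return Promise.resolve();
    var idx = i++;                // claim the next task
    return tasks[idx]().then(function (r) {
      results[idx] = r;           // store result in original order
      return next();              // this worker picks up another task
    });
  }
  var workers = [];
  for (var w = 0; w < Math.min(limit, tasks.length); w++) workers.push(next());
  return Promise.all(workers).then(function () { return results; });
}
```

initiateScheduler could then map arrayList to tasks like `function () { return getSingleElementUpdate(singleElement, 1); }` (once getSingleElementUpdate is made to return a promise) and pass them to `runLimited(tasks, 10)`, tuning the limit to your hardware and the remote server.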
I have a settings page where you can insert and delete filters. For the delete request I used this:
$('#delete-filter').click(function (e) {
    var filtername = $('#filter-list').val();
    var filterCount = $('#filter-list option').length;
    var retVal = confirm("Are you sure to delete this filter?");
    if (retVal == true) {
        $.ajax({
            url: "/settings?filtername=" + filtername,
            method: 'DELETE',
            dataType: "json",
            success: function (result) {
            }
        });
    } else {
        return false;
    }
});
And here is my route for this page:
router.delete('/settings', ensureAuthenticated, function (req, res, next) {
    var promise = user.deleteFilter(req.session.user_id, req.query.filtername);
    var promise2 = promise.then(function (data) {
        req.session.selected_filter = data.selected;
        res.redirect('/settings');
    }, function (error) {
        console.log(error);
    });
});
Essentially I want to redirect to /settings so the page reloads with the new data, but in the promise chain the redirect has no visible effect in the browser. Am I using the redirect wrong, or can't I send a response inside a promise chain?
You've misidentified the problem.
When you issue an HTTP redirect, you say "The thing you were looking for? Get it from here instead."
This is not the same as "The browser should display this URL as a new page".
The HTTP redirect is followed, the settings page is delivered to the browser, then the browser makes it available as the result in your success function. (You then completely ignore it as you haven't put anything in that function).
If you want the browser to load a new page, then you need to deliver the URL as data (not as a redirect) and then assign that value to location.href.
The settings page probably shouldn't be determined dynamically, so you can probably just hard code the URL into the success function.
Hard coding it would make more sense, since you shouldn't send a redirect in response to a DELETE request:
    If a DELETE method is successfully applied, the origin server SHOULD send a 202 (Accepted) status code if the action will likely succeed but has not yet been enacted, a 204 (No Content) status code if the action has been enacted and no further information is to be supplied, or a 200 (OK) status code if the action has been enacted and the response message includes a representation describing the status.
I'm new to node.js so I'll try my best to explain the problem here. Let me know if any clarification is needed.
In my node.js application I'm trying to take a code (received in the response of the 1st call to an API) and use that code to make a 2nd (GET) request to another API service. The callback URL of the 1st call is /pass. However, I get an empty response from the service on this 2nd call.
My understanding is that after the callback from the 1st call, the function in app.get('/pass', function (req, res)...) gets invoked and it sends a GET request. What am I doing wrong here? Many thanks in advance!
Here is the part where I try to make a GET request from node.js server and receive an empty response:
app.get('/pass', function (req, res) {
    var options = {
        url: 'https://the url that I make GET request to',
        method: 'GET',
        headers: {
            'authorization_code': code,
            'Customer-Id': 'someID',
            'Customer-Secret': 'somePassword'
        }
    };
    request(options, function (err, res, body) {
        console.log(res);
    });
});
I'm a little confused by what you are asking, so I'll just try to cover what I think you're looking for.
app.get('/pass', (req, res) => {
    res.send("hello!"); // localhost:port/pass will return hello
});
Now, if you are trying to make a GET request with the request library when the /pass endpoint is called, things are still similar. First, you can remove the 'method': 'GET' key and value, as GET is the default. The code will be mostly the same as before, except for the response.
app.get('/pass', (req, res) => {
    var options = {
        url: 'https://the url that I make GET request to',
        headers: {
            'authorization_code': code,
            'Customer-Id': 'someID',
            'Customer-Secret': 'somePassword'
        }
    };
    // note: name this parameter something other than `res`,
    // or it will shadow Express's `res` and you can't respond to the browser
    request(options, function (err, response, body) {
        // may need to JSON.parse the body before sending, depending on what is expected
        res.send(body); // this sends the data back
    });
});
I'm using node-request to send requests to a server to get a report. The server needs some time to generate the report, so it responds with the report state. I check the report state with setInterval() and call clearInterval() when the server sends a 'ready' response. But with this approach, even after I call clearInterval, responses to earlier requests keep coming and the response handler runs again and again. This doesn't cause much harm, but I believe it can be done better.
Here is my code:
checkForReportReady = setInterval =>
  @request URL, options, (err, res, body) =>
    console.log err if err
    body = JSON.parse body
    if body['status'] is 'ready'
      clearInterval checkForReportReady
      @processReport body
, 1000
What I need: make a request, wait for the response, check the status; if the status is not ready, make another request after some timeout, and repeat until the status in the response is ready. If the status is ready, exit the loop (or clear the interval) and run #processReport.
I tried making a promisified request and putting it into setInterval, but the result was the same.
P.S. I do not control the server, so I can't change the way it responds or deals with the report.
I would recommend not putting requests in an interval callback. This can get ugly when they (a) fail or (b) take longer than the interval.
Instead, put a setTimeout in the success handler and try again after (and only if) receiving a response.
This is rather easy with promises:
request = Promise.promisify require('request'), multiArgs: true  # promisify so request() itself returns a promise; multiArgs gives .spread both res and body

getReport = () =>
  request URL, options
  .spread (res, body) =>
    body = JSON.parse body
    if body.status is 'ready'
      body
    else
      Promise.delay 1000
      .then getReport # try again

getReport().then(@processReport, (err) -> console.log(err))
It seems like you can just use a setTimeout() in your response handler:
function checkForReportReady() {
    request(URL, options, function (err, res, body) {
        if (err) {
            console.log(err);
        } else {
            if (body.status === "ready") {
                processReport(body);
                // do any other processing here on the result
            } else {
                // try again in 20 seconds
                setTimeout(checkForReportReady, 20 * 1000);
            }
        }
    });
}
This will run one request, wait for the response, check the response, then if it's ready it will process it and if it's not ready, it will wait a period of time and then start another request. It will never have more than one request in-flight at the same time.
If you want to use Bluebird promises, you can do that also, though in this case it doesn't seem to change the complexity particularly much:
var request = Promise.promisify(require('request'), {multiArgs: true});

function checkForReportReady() {
    return request(URL, options).spread(function (res, body) {
        if (body.status === "ready") {
            return body;
        } else {
            // try again in 20 seconds
            return Promise.delay(20 * 1000).then(checkForReportReady);
        }
    });
}

checkForReportReady().then(function (body) {
    processReport(body);
}, function (err) {
    // error here
});
I send JSON POST data via a form in a MEAN environment to my server. On the server side, I process the data inside of a waterfall function, using the async library, including various operations such as:
[...]
- create a database entry for a new author
- create a database entry for a new book
- associate the new book to an author (reference to book ID)
[...]
This is the method called by my route, which handles the associated POST-request:
exports.createAuthor = function (req, res) {
    console.log(req.url + ' !!!POST REQUEST INCOMING!!! ' + req.body);
    async.waterfall([
        function (callback) {
            // create Author db entry
        },
        function (parameter, callback) {
            // add author to additional directory (db action)
        },
        function (parameter, callback) {
            // create book db entry
        },
        function (parameter, callback) {
            // associate book to author (db action)
        }
    ], function (err, result) {
        console.log('DONE!!!');
        res.send('200');
    });
};
This is the client-side AngularJS controller code:
searchApp = angular.module("searchApp", []);

searchApp.controller('authorCreator', function ($scope, $http) {
    $scope.tags = [];
    $scope.sendAuthor = function () {
        alert('I was called!');
        $http({
            method: 'POST',
            url: '/newauthor/',
            data: { 'authorname': $scope.authorName,
                    'authordescription': $scope.authorDescr,
                    'bookname': $scope.bookName,
                    'tags': $scope.tags }
        })
        .success(function (data) {
            // no actions yet
        })
        .error(function () {
            // no actions yet
        });
    };
});
This is the AngularJS form:
<div ng-controller="authorCreator">
    <form>
        <p>Author name: <input ng-model="authorName"></p>
        <p>Author description: <input ng-model="authorDescr"></p>
        <p>Book name: <input ng-model="bookName"></p>
        <p>Tags: <input ng-model="tags"></p>
        <p><button ng-click="sendAuthor()">Send</button></p>
    </form>
</div>
I noticed that if the waterfall process gets "stuck" somewhere, so that the client never gets an answer to its request, the POST request seems to be sent a second time automatically (as soon as the browser hits its timeout, according to Firebug). Firebug does not show the browser sending a second POST itself, so the call must be initiated from somewhere else.
I found this out by checking the database (multiple documents with identical values, except the ObjectID of course) and monitoring the node.js console window where I log incoming POST data. Again: as soon as the entire waterfall process completes, so the client does not abort the POST request after a while, and res.send('200') executes, the error does not occur (= no duplicate db entries).
Can anyone please tell me what initiates this second POST request and how I can deactivate it?
Cheers
Igor
Try adding this:
exports.createAuthor = function (req, res) {
    if (req.method == 'POST' && req.url === 'REQUESTEDURL') {
        console.log('POST REQUEST INCOMING!!! ' + req.body);
        async.waterfall([
            // TODO...
        ]);
    }
};
Maybe the problem is that the favicon or some other resource is doing a request to
After spending some time on this issue I found that the error is caused by missing answers to the client (be it via res.json, res.sendfile, ...). The client seems to re-send the request after some time, thus executing server-side code a second time. Responding to the client within a reasonable time solves the issue. Sorry for the confusion.
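For example, a final waterfall callback that answers the client on every path, including errors, avoids the hang entirely. A sketch (finishRequest is a hypothetical helper; res is Express's response object):

```javascript
// Returns a waterfall-style final callback that always responds,
// so the browser never times out and silently retries the POST.
function finishRequest(res) {
  return function (err, result) {
    if (err) {
      // answer with an error instead of leaving the request hanging
      res.status(500).send({ error: String(err.message || err) });
      return;
    }
    res.status(200).send({ result: result });
  };
}
```

It would be used as `async.waterfall([...], finishRequest(res));`, so even a failing step still produces a response.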
I "fixed" this by adding:
app.get('/favicon.ico', (req, res) => {
    res.status(204).end(); // nothing to send, but answer so the request doesn't hang and retry
});