JavaScript: AJAX trouble with request arrays

I'm fairly new to JavaScript and I'm still trying to grasp callback concepts. Even so, I'm not sure whether my issue is with callbacks or with AJAX itself.
So I have this callback function, which basically takes an array of objects from somewhere else in my code.
function powerAmplifierInfo(id, cb) {
    ApiGet('/system/radio-frequency/tx/' + id + '/power-amplifier', function (err, result) {
        if (err && !TESTING) {
            alertError("Connection error: " + err.statusText);
            return;
        } else if (result && result.result.code !== "OK") {
            alertError(result.result.message);
            return;
        }
        if (TESTING) {
            result = { powerAmplifier: "TEST model", uptime: Date.now() };
        }
        cb(result.powerAmplifier);
    });
}
This function relies on another function that uses AJAX to send a GET request to a server.
function ApiGet(urlSuffix, cb) {
    var url = apiPrefix + "/" + urlSuffix;
    $.ajax({
        url: url,
        type: 'GET',
        success: function (result) {
            cb(null, result);
        },
        error: function (err) {
            cb(err, null);
        }
    });
}
Now, the request chain over my list works, but sometimes requests get mixed up when I rerun the code in a loop: it seems AJAX swaps the order of some ports while sending requests. I discovered this issue while troubleshooting my local network with Wireshark. Here's an image showing it:
The GET requests are sent, but between the 3rd and the 4th there is a port mixup. As we can see, the server replies without the port-order mixup.
I'm wondering: is this a known issue, or could there be a problem with my code, or even with my server? I do not get this error when my callback functions handle a single object instead of an array of objects.
Thanks in advance!

Related

Pass additional parameters to JavaScript request 3-argument callback

I am trying to nest several callbacks with the goal of programmatically navigating a page using POST requests. I have a list of different files I am trying to download, and want to assign each of them a different filename so that I can make asynchronous download calls and perform OCR on them.
The callback that actually downloads the PDF is here:
filename = "doodle" + Math.floor((Math.random() * 100) + 1) + ".pdf";
request.post(
{
url:'https://ecorp.sos.ga.gov/BusinessSearch/DownloadFile',
form: {
'documentId': body
},
headers: {
'Referer': 'https://ecorp.sos.ga.gov/BusinessSearch/BusinessFilings'
}
}, getPDF).pipe(fs.createWriteStream(filename));
Notice that the request sends the resulting page to a getPDF function, which actually opens the PDF for OCR after the request has been made. Currently I have a filename hardcoded in the OCR method, because I do not know how to pass an additional variable to the callback function. I would like to be able to write to a randomly generated file, and then retrieve that filename in the callback method.
function getPDF(error, response, body) {
    if (!error) {
        console.log(body);
        filename = "/doodle" + Math.floor((Math.random() * 100) + 1) + ".pdf";
        //console.log(__dirname + filename);
        /*pdfText(__dirname + "/doodle.pdf", function(err, chunks) {
            //console.log(chunks);
        });*/
        var pdf_path = __dirname + filename;
        // option to extract text from page 0 to 10
        var option = { from: 0, to: 10 };
        var pdf_body;
        pdfUtil.pdfToText(pdf_path, option, function (err, data) {
            pdf_body = data;
            var result;
            try {
                var namez = /AUTHORIZED SIGNATURE:\s+(.+?)\s{2}/.exec(data)[1];
                var emailz = /Email: (.+?)\s{2}/.exec(data)[1];
                result = { query: query, info: { name: namez, email: emailz } };
            } catch (err) {
                //result = {query:query, error:"non-selectable text in PDF"};
            }
            results.push(result);
            //console.log(JSON.stringify(result));
        });
    }
}
The library I am using is called request, and the documentation shows the following:
The callback argument gets 3 arguments:
An error when applicable (usually from http.ClientRequest object)
An http.IncomingMessage object
The third is the response body (String or Buffer, or JSON object if the json option is supplied)
The most straightforward approach would be to simply change the signature of getPDF to
function getPDF (filename, error, response, body) {...}
then you'd call it from within an anonymous function as the callback handler of request.post:
request.post({...}, function (error, response, body) {
    getPDF('filename', error, response, body);
});
ES6 notation makes this even nicer (note that arrow functions don't bind their own arguments, so rest parameters are used instead):
request.post({...}, (...args) => {
    getPDF('filename', ...args);
});
You can accomplish this using a more sophisticated method by partially applying the additional argument and then calling it in point-free fashion:
function getPDF(filename, error, response, body) {...}

getPdf2 = getPDF.bind(null, "filename");
request.post({...}, getPdf2);
The approaches amount to the same thing, and I might use either depending on circumstances. Arguably, the intent of the second approach is easier to read. It's certainly more fashionable.
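Putting it together with the question's download code, a minimal sketch might look like the following (the per-request filename generation mirrors the question's doodle naming; getPDF is assumed to have the extra filename parameter described above):
// Sketch: generate a fresh filename per download and partially apply it
// so the callback knows which file it is working with.
var filename = __dirname + "/doodle" + Math.floor((Math.random() * 100) + 1) + ".pdf";

request.post(
    {
        url: 'https://ecorp.sos.ga.gov/BusinessSearch/DownloadFile',
        form: { 'documentId': body },
        headers: { 'Referer': 'https://ecorp.sos.ga.gov/BusinessSearch/BusinessFilings' }
    },
    getPDF.bind(null, filename)          // filename arrives as the first argument of getPDF
).pipe(fs.createWriteStream(filename));  // the same filename is used for the write stream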

AJAX calls are not executed on the server side in the same order as called on the client side

I have an ASP.NET MVC application with the following JavaScript function in a view:
function OnAmountChanged(s, fieldName, keyValue, url) {
    console.log("Get Number: " + s.GetNumber());
    console.log(" ");
    $.ajax({
        type: "POST",
        url: url,
        data: { key: keyValue, field: fieldName, value: s.GetNumber() },
        success: function () {
            reloadShoppingCartSummary();
        }
    });
}
With an up/down button I can increase/decrease the number in my field, and each click calls this function.
With the console.log I can see that the JavaScript function is called in the correct order (the order in which I changed the number value).
But by debugging on the server side I noticed that the order in which the calls are executed is different.
What can cause this problem?
And how can I solve or workaround this problem?
This is not surprising at all, and it is not something you can rely on. If you want requests to be processed in a specific order, then you have to send one request, wait for its response, then send the next. You can't just send all of them and expect them to be processed in a specific order.
In your specific case, if a request is "in-flight", you would have to either disable the button or queue up the next request and replay it when the previous request finishes.
There are also various strategies for how you send the data that can prevent, adapt to, or detect out-of-sequence issues, but which to use and how to do it depends on the specific data you're sending and how it works on both the client and server side of things.
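As one illustration of the "detect" option (a sketch only; the seq counter and lastApplied bookkeeping are assumptions, not part of the original code), the client can tag each request and ignore stale responses:
// Sketch: tag each request with a sequence number and skip handling
// responses that arrive after a newer request has already completed.
var seq = 0;
var lastApplied = 0;

function OnAmountChangedDetect(s, fieldName, keyValue, url) {
    var mySeq = ++seq;
    $.ajax({
        type: "POST",
        url: url,
        data: { key: keyValue, field: fieldName, value: s.GetNumber(), seq: mySeq },
        success: function () {
            // only refresh if no newer request has finished in the meantime
            if (mySeq > lastApplied) {
                lastApplied = mySeq;
                reloadShoppingCartSummary();
            }
        }
    });
}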
There are many possible coding strategies for dealing with this issue. Here's one method that queues any requests that come in while another request is in process, forcing them to be processed in order by the server:
var changeQueue = [];
changeQueue.inProcess = false;

function OnAmountChanged(s, fieldName, keyValue, url) {
    // if already processing a request, then queue this one until the current request is done
    if (changeQueue.inProcess) {
        changeQueue.push({ s: s, fieldName: fieldName, keyValue: keyValue, url: url });
    } else {
        changeQueue.inProcess = true;
        console.log("Get Number: " + s.GetNumber());
        console.log(" ");
        $.ajax({
            type: "POST",
            url: url,
            data: { key: keyValue, field: fieldName, value: s.GetNumber() },
            success: function () {
                reloadShoppingCartSummary();
            },
            complete: function () {
                changeQueue.inProcess = false;
                if (changeQueue.length) {
                    var next = changeQueue.shift();
                    OnAmountChanged(next.s, next.fieldName, next.keyValue, next.url);
                }
            }
        });
    }
}
FYI, I think we can probably get promises to do the queuing work for us, but I'm still working out some details for doing that. Here's one idea that's in progress: http://jsfiddle.net/jfriend00/4hfyahs3/. If you press the button multiple times rapidly, you can see that it queues the presses and processes them in order.
Here's a specific idea for a generic promise serializer:
// utility function that works kind of like `.bind()`
// except that it works only on functions that return a promise
// and it forces serialization whenever the returned function is called
// no matter how many times it is called in a row
function bindSingle(fn) {
    var p = Promise.resolve();
    return function (/* args */) {
        var args = Array.prototype.slice.call(arguments, 0);
        function next() {
            return fn.apply(null, args);
        }
        p = p.then(next, next);
        return p;
    };
}

function OnAmountChanged(s, fieldName, keyValue, url) {
    console.log("Get Number: " + s.GetNumber());
    console.log(" ");
    return $.ajax({
        type: "POST",
        url: url,
        data: { key: keyValue, field: fieldName, value: s.GetNumber() }
    }).then(reloadShoppingCartSummary);
}

var OnAmountChangedSingle = bindSingle(OnAmountChanged);
So, to use this code, you would then pass OnAmountChangedSingle to your event handler instead of OnAmountChanged and this will force serialization.
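For example, the wiring might look like this (a sketch only; the button selector and the way s, fieldName, keyValue, and url are obtained are placeholders, not from the original code):
// Hypothetical wiring: every click goes through the serialized wrapper,
// so each POST waits for the previous one to finish before being sent.
$("#upButton").on("click", function () {
    OnAmountChangedSingle(spinEdit, "Amount", currentKey, "/Cart/UpdateAmount");
});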

Abstracting the making of requests with Node.js

Traditionally I use jQuery for all my JS code, but I've been tasked with launching a simple API with Node.js. Today is my first day with Node, but I know enough about JS and closures to do OK. One of the tasks of the API is to authenticate against a third-party service, and being a Python guy, I wanted to abstract all my outbound request calls like so:
EDIT
var http = require('http');

var init = function(nconf) {
    var methods = {
        /*
        Helper method to create the request header
        */
        headers: function(params) {
            var content = JSON.stringify(params);
            return {
                'Content-Type': 'application/json',
                'Content-Length': content.length
            };
        },
        /*
        Helper method to create the options object
        which is used in making any type of
        outbound http request
        */
        options: function(host, path, method, params) {
            return {
                host: host,
                port: 80,
                path: path,
                method: method,
                headers: methods.headers(params)
            };
        },
        /*
        Helper method to abstract the making of
        outbound http requests
        */
        call: function(options, params, success, err) {
            var req = http.request(options, success);
            req.on('error', err);
            req.write(params);
            req.end();
        },
        /*
        Helper method to parse the response
        and return a json object
        */
        parse: function(res, result) {
            var responseString = '';
            res.on('data', function(data) {
                responseString += data;
            });
            res.on('end', function() {
                result = JSON.parse(responseString);
            });
        },
        /*
        API method to return the latest
        release and tag names
        */
        latest: function(req, res, next) {
            // // var url = nconf.get('prod:authenticate');
            // authenticate the test user
            msg = methods.authenticate(nconf.get('test:user'), nconf.get("test:password"));
            res.send(msg);
            next();
        },
        /*
        Method used by this API to authenticate users.
        It is used to de-couple this API from the Database
        Schema by calling out to the TTCPAS App and requesting it
        to handle the authentication
        */
        authenticate: function(username, password) {
            // create post parameters with API key
            var params = {"username": username, "password": password, "api_key": nconf.get('api_key')};
            // construct options object with params and header
            var options = methods.options(nconf.get('ttcpas:host'), nconf.get('ttcpas:auth_url'), 'POST', params);
            var result;
            var success = function(res) {
                res.setEncoding('utf-8');
                methods.parse(res, result);
            };
            methods.call(options, params, success, function(err) {});
            while (typeof(result.statusCode) == 'undefined') {
                // wait 1 second;
                setTimeout(function() {
                    console.log("waiting on request at " + nconf.get('ttcpas:host') + nconf.get('ttcpas:auth_url'));
                }, 1000);
            }
            // then down here
            if (result.statusCode == 200) { return result; }       // success
            if (result.statusCode == 403) { return "forbidden"; }  // forbidden
        }
    };
    return methods;
};

module.exports.init = init;
@jfriend00 As I said, I don't know how Node.js code is supposed to be styled. I just wanted to abstract as much as possible to make the code clean and reusable.
Now when I do http://localhost:9000/latest/
I get:
{"code":"InternalError","message":"first argument must be a string or Buffer"}
Uhhh, this part will simply not work:
while (typeof(result.statusCode) == 'undefined') {
    // wait 1 second;
    setTimeout(function() {
        console.log("waiting on request at " + nconf.get('ttcpas:host') + nconf.get('ttcpas:auth_url'));
    }, 1000);
}
If result.statusCode is ever undefined, this will spin forever piling up setTimeout() calls in the event queue until eventually something fills up or you run out of memory.
Because node.js is primarily single threaded, you can't loop waiting for something to change. Because you never finish this while loop, no other node.js code gets to run so result.statusCode can never change. Thus, you have an infinite loop here.
All of your Node.js code needs to be event-driven, not spin/wait loops. FYI, this is similar to browser-based JavaScript.
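A sketch of the event-driven shape being described here (the (err, result) callback signature, the simplified response handling, and passing nconf in as a parameter are illustrative assumptions, not the original API):
var http = require('http');

// Sketch only: an event-driven replacement for the spin-wait version of
// authenticate(). The result is delivered through a callback instead of
// being returned synchronously.
function authenticate(nconf, username, password, cb) {
    var params = { username: username, password: password, api_key: nconf.get('api_key') };
    var body = JSON.stringify(params);
    var options = {
        host: nconf.get('ttcpas:host'),
        port: 80,
        path: nconf.get('ttcpas:auth_url'),
        method: 'POST',
        headers: {
            'Content-Type': 'application/json',
            'Content-Length': Buffer.byteLength(body)
        }
    };

    var req = http.request(options, function (res) {
        var responseString = '';
        res.setEncoding('utf-8');
        res.on('data', function (chunk) { responseString += chunk; });
        res.on('end', function () {
            // hand back the parsed result via the callback
            if (res.statusCode === 200) {
                cb(null, JSON.parse(responseString));
            } else if (res.statusCode === 403) {
                cb(null, "forbidden");
            } else {
                cb(new Error("unexpected status " + res.statusCode));
            }
        });
    });
    req.on('error', cb);
    req.write(body); // note: write a string, not an object
    req.end();
}
A route handler like latest() would then pass a callback that calls res.send with the result, rather than expecting a return value.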

NodeJS unable to make a GET request asynchronously

I am a rookie in Node.js and asynchronous programming. I am having a problem executing a GET request inside an asynchronous function. Here I am posting the whole code. I am trying to pull a list of URLs, add them to a list, and send the list for processing to another function.
My problem is with processing them. In turn, for each URL I execute a GET request to fetch the body and look for image elements in it. I want to pass the image URL to a third-party API as a GET param. I am unable to execute the GET request, as control doesn't seem to reach there at all.
var async = require("async"),
    request = require("request"),
    cheerio = require("cheerio");

async.waterfall([
    function (callback) {
        var url = "someSourceUrl";
        var linkList = [];
        request(url, function (err, resp, body) {
            var $ = cheerio.load(body);
            $('.list_more li').each(function () {
                // Find all urls and add them to a list
                $(this).find('a').each(function () {
                    linkList.push($(this).attr('href'));
                });
            });
            callback(null, linkList);
        });
    },
    // pass all the links as a list to callback
    function (liksListFetched, callback) {
        for (var i in liksListFetched) {
            callback(null, liksListFetched[i]);
        }
    }],
    //***********My problem is with the below code**************
    function (err, curUrl) {
        var cuResp = "";
        console.log("Currently Processing Url : " + curUrl);
        request(curUrl, function (err, resp, body) {
            var $ = cheerio.load(body);
            var article = $("article");
            var articleImage = article.find("figure").children('img').attr('src');
            var responseGrabbed = "API response : ";
            // check if there is an IMG element
            if (articleImage === undefined) {
                console.log("No Image Found.");
                articleImage = 'none';
            } else {
                // if there is an img element, pass this image url to an API,
                // so do a GET call by passing imageUrl to the API as a GET param
                request("http://apiurl.tld?imageurl=" + articleImage, function (error, response, resp) { // code doesn't seem to reach here
                    // I would like to grab the response and concatenate it to the responseGrabbed var.
                    console.log(resp);
                    responseGrabbed += resp;
                });
            }
            console.log(responseGrabbed); // api response never gets concatenated :(
            console.log("_=_=_=_=_=_=__=_=_=_=_=_=__=_=_=_=_=_=__=_=_=_=_=_=_");
            process.exit(0);
        });
    });
I would appreciate it if anyone could help me understand the root cause. Thanks in advance.
request() is asynchronous, so when you console.log the string, the string hasn't been built yet; you have to do the console.log inside the callback:
request("http://apiurl.tld?imageurl=" + articleImage, function(error, response, resp) {
responseGrabbed += resp;
console.log(responseGrabbed);// api response never gets concatenated :(
console.log("_=_=_=_=_=_=__=_=_=_=_=_=__=_=_=_=_=_=__=_=_=_=_=_=_");
});
The same goes for terminating the process, which should only be done once all the requests have finished.
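One common pattern for that (a sketch only; the pending counter and the fetchApiResponse helper are hypothetical names, not from the original code) is to count outstanding requests and exit after the last callback fires:
// Sketch: track how many image-API requests are still in flight and only
// terminate the process once the last one has completed.
var pending = 0;

function fetchApiResponse(articleImage) {
    pending++;
    request("http://apiurl.tld?imageurl=" + articleImage, function (error, response, resp) {
        console.log("API response : " + resp);
        pending--;
        if (pending === 0) {
            // nothing left in flight, safe to stop now
            process.exit(0);
        }
    });
}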

Backbone sync callback always uses error

I am having a hard time figuring out how to get the callback to work correctly when using Backbone sync. I am looking at my return packets and the response code is 200, which is not an error, yet the alert("fail") statement gets called. I am requesting a response from a Java servlet. Any ideas, guys? Thanks
Backbone.sync("read", this.model, {
url : "some url",
success: function(model, response) {
alert(response);
},
error: function(model, response) {
alert("fail");
}
});
I don't understand what you are doing...
Use these methods instead of sync:
model.fetch();
model.save();
model.destroy();
They will call sync, and they work perfectly.
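For example, a fetch call wired up with the same handlers as the question's code (the url and handlers are placeholders) might look like this:
// Sketch: fetch() calls sync("read", ...) under the hood and wires up
// the success/error callbacks for you.
this.model.fetch({
    url: "some url",
    success: function (model, response) {
        alert(response);
    },
    error: function (model, response) {
        alert("fail");
    }
});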
I don't think it's necessary to override the original sync; it is good enough. I created a mock sync for an example application, and this is how it works:
var User = Backbone.Model.extend({
    notAllowedEmailHost: "gmail.com",
    sync: function (method, model, options) {
        if (method == "read" || method == "delete")
            throw new Error("Example is not prepared for these methods.");
        var email = model.get("email");
        var status = 201;
        if (email.indexOf(this.notAllowedEmailHost) != -1)
            status = 400;
        else if (method == "update")
            status = 500;
        options.xhr = {
            status: status
        };
        if (status >= 400)
            options.error(options.xhr);
        else
            options.success({
                id: 1
            });
    }
});
The methods above create wrapper functions around your callbacks, and in sync those wrappers are called with the result. So the callback you see inside sync is not the same callback you pass when calling the fetch, save, or destroy functions...
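For instance, saving a new User through the mock sync above might look like this (the email value and the handlers are illustrative; exact callback arguments depend on the Backbone version):
// Sketch: save() wraps these callbacks before they reach the custom sync,
// which then calls the wrapped options.success / options.error.
var user = new User({ email: "someone@example.org" });
user.save(null, {
    success: function (model, response) {
        console.log("created with id " + model.id); // mock sync responded with { id: 1 }
    },
    error: function (model, xhr) {
        console.log("rejected with status " + xhr.status);
    }
});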
Make sure that your servlet returns a JSON object, even if it's empty. This fixed the issue in my case.
