Why does node-debug always break at _tickCallback function? - javascript

It may be a stupid question, but I haven't been able to figure it out in two hours and can't find any answer on Google.
I'm trying to debug my controller by setting a breakpoint in my save function, on the line var profile = req.body:
function save(collectionName) {
    return function (req, res, next) {
        var profile = req.body,
            query = {};
        ...
        ...
    };
}
However, the app always breaks inside _tickCallback function placed in node.js file:
// Run callbacks that have no domain.
// Using domains will cause this to be overridden.
function _tickCallback() {
    var callback, threw, tock;
    scheduleMicrotasks();
    while (tickInfo[kIndex] < tickInfo[kLength]) {
        tock = nextTickQueue[tickInfo[kIndex]++];
        callback = tock.callback;
        threw = true;
        try {
            callback();
            threw = false;
        } finally {
            if (threw)
                tickDone();
        }
        if (1e4 < tickInfo[kIndex])
            tickDone();
    }
    tickDone();
}
So I tried stepping over until execution left the function, but that just resumed the application without ever returning to my breakpoint. Any help would be really appreciated.

I think this situation happens when you use the node-debug command with Node 0.12.*.
This is a Node.js bug: https://github.com/joyent/node/issues/25266
As a workaround you can use a debugger statement (with node-inspector >= 0.10.1; I recommend 0.11.0), or use io.js.
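For example, with the handler from the question, the debugger statement would go where the breakpoint was meant to be (a sketch; the statement is a no-op unless a debugger is attached):

```javascript
function save(collectionName) {
    return function (req, res, next) {
        debugger; // execution pauses here when node-inspector is attached
        var profile = req.body,
            query = {};
        // ... rest of the handler
    };
}
```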

Related

Sinon stub check that external SOAP call was called with specific arg

I'm using Jasmine and sinon to test some node.js code. I'm using sinon stubs in this example.
I am trying to stub out a SOAP call to an external service, and I'm wondering if there's any way to ensure that the correct argument is being used in the call.
I have successfully checked that functions return the correct arguments in other circumstances, but unfortunately this scenario is inside a nested callback, so I'm not sure how to do it.
Here's my code snippet (I'm trying to test that "client.ExternalService.HttpPort.actualCall" is being called with the "args" variable I am expecting):
class ExternalServiceCaller extends BaseService {
    constructor(util) {
        super(util);
        this.util = util;
    }

    callExternalService(body, callback) {
        let url = this.util.config.get('blah.my.url');
        let args = {
            'request': {
                'Property1': body.Property1,
                'Property2': body.Property2,
                'Property3': body.Property3,
                'Property4': body.Property4
            }
        };
        //soap.request()
        soap.createClient(url, sslOptions, function(err, client) {
            //client.[wsdl name].[binding name].[operation]
            client.ExternalService
                .HttpPort
                .actualCall(args, function(err, result) {
                    if (!err) {
                        callback(null, result);
                    }
                }, sslOptions);
        });
    }
}
As I said above, I'm trying to write a test to make sure that actualCall is using the expected "args" variable (making sure that the incoming body is being formatted correctly to be passed to the external call). I can do this pretty easily for the url by stubbing out soap.createClient and using sinon.assert.calledWith() like below:
describe('The function', function() {
    let service;
    let externalServiceStub;
    let externalRequest = helper.myExternalRequestObject;

    describe('should use the correct URL', function() {
        beforeEach(function() {
            service = new ExternalServiceCaller(tools);
            externalServiceStub = sinon.stub(soap, 'createClient');
        });

        it('and uses the correct URL when successful', function() {
            let url = tools.config.get('blah.my.url');
            service.callExternalService(myExternalRequestObject, callback => {});
            sinon.assert.calledWith(externalServiceStub, url);
            externalServiceStub.restore();
        });
    });
});
Unfortunately, I have no idea how to check that actualCall is being called with the "args" variable I'm expecting. I could use a fake object to check against, but I'm not sure how to do that check in the first place in this scenario.
I looked into soap stub, but there isn't a whole lot of documentation and the example didn't make sense to me.
Any help would be greatly appreciated. :)

Node.js + Actionhero code not running consecutively

I'm obviously missing something very important about how Node.js and Actionhero work.
Can someone please explain to me why the following problem occurs:
I have this simple code:
Basically, it reads a .css file.
buildAdsHTML: function (ads, type) {
    var cssDir = path.normalize(__dirname + '/../public/css/');
    switch (type) {
        case 'text':
            var result = '';
            console.log("STEP 1");
            fs.readFile(cssDir + type + 'Ads.css', {encoding: 'utf8'}, function (err, data) {
                console.log(err);
                console.log(data);
                if (!err) {
                    console.log("STEP 2");
                    result += '<style>' + data + '</style>';
                } else {
                }
            });
            break;
    }
    console.log('STEP 3');
    return result;
}
Now when this code runs I get an empty string as the result. The console output is the following:
STEP 1 <- so far so good
STEP 3 <- this should be last!
null <- this is the error variable
.some-random-css-class{
display: block;
width: 100;
min-height: 250px;
}
STEP 2
Now at some point I figured out that fs.readFile is an async function.
So I naturally changed it to the sync version, fs.readFileSync.
Now the console output is even worse:
STEP 1
STEP 3
That's it! Nothing else.
And I still get an empty string as the result.
It's like the code isn't even going through the switch.
I've noticed this behavior in all functions and methods of my Actionhero project,
most notably when calling next(connection). You can't just call it at the end of the method.
For every if or switch
I have to call it inside to have any actual control over the result.
Otherwise, any new data I've added to the connection is lost.
What's with that? Please explain this behavior in detail so I don't make any dumb mistakes while coding.
Finally figured it out.
The problem was that I didn't notice that fs.readFileSync takes no callback function.
So the proper use of that function should have been:
var result = fs.readFileSync(pathToFile, options);
return result;
I could still use some help with the problem I mentioned with next(connection); I don't really know why it skips whole parts of the code like that.
I've found that next() doesn't like it if you pass it any parameters. I had a bug in my application for a good day or two because I was doing .then(next);

node, async programming, callback hell

I'm trying to understand callbacks and async programming, but I'm having a bit of trouble.
Here's some pseudocode:
var lines = [];
var arrayOfFeedUrls = [url1, url2, ...];

function scrape(url) {
    http.get(url, function(res) {
        res.pipe(new FeedParser([options]))
            .on('readable', function () {
                var stream = this, item;
                while (item = stream.read()) {
                    line = item.title;
                    lines.push(line);
                }
            });
    });
}

for (i in arrayOfFeedUrls) {
    scrape(arrayOfFeedUrls[i]);
}
console.log(lines.length);
It obviously logs 0, as the scrape function executes asynchronously. I understand that much, but I've tried many intricate approaches and can't figure out how to write it properly. Any help/explanation would be greatly appreciated. I've read (and I'm still reading) a lot of tutorials and examples, but I think the only way for me to get it is to write some code myself. If I solve this I'll post the answer.
You might want to check this article for an introduction to Node that could help you understand async programming in Node a little better.
As far as async programming goes, async is a very popular module in Node's userland which helps you write asynchronous code effortlessly. For instance (untested pseudo-code):
var async = require('async');

function scrape(done) {
    http.get(url, done);
}

function parse(res, done) {
    var lines = [];
    res.pipe(new FeedParser([options]))
        .on('readable', function () {
            var stream = this, item;
            while (item = stream.read()) {
                lines.push(item.title);
            }
        })
        .on('end', function () {
            done(null, lines);
        });
}

function done(err, lines) {
    if (err) { throw err; }
    console.log(lines.length);
}

async.waterfall([scrape, parse], done);
This depends on whether you want to scrape all urls in parallel or in series.
If you were to do it in series, think of it like this:
Start with the first url. Scrape. In its callback, scrape the next url. In that callback, scrape the next url.
This gives you the notorious callback hell you're talking about, but that's the principle at least. That's where libraries like async remove a lot of the headache.
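The series version described above can be sketched like this; scrapeOne is a hypothetical stand-in for the real http.get + FeedParser work:

```javascript
// each url is processed only after the previous one's callback fires
const visited = [];

function scrapeOne(url, done) {
    // stand-in for the real async scrape of a single feed
    setImmediate(function () { visited.push(url); done(); });
}

function scrapeAll(urls, index, done) {
    if (index >= urls.length) return done();
    scrapeOne(urls[index], function () {
        scrapeAll(urls, index + 1, done); // next url, only after this one
    });
}

scrapeAll(['url1', 'url2', 'url3'], 0, function () {
    console.log(visited.length); // prints 3
});
```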
When programming async calls in this manner, functions and instructions that you want to chain onto the end, such as console.log(lines.length);, must also be callbacks. So for instance, try something like this:
var lines = [];
var arrayOfFeedUrls = [url1, url2, ...];

function scrape(url) {
    http.get(url, function(res) {
        res.pipe(new FeedParser([options]))
            .on('readable', function () {
                var stream = this, item;
                while (item = stream.read()) {
                    line = item.title;
                    lines.push(line);
                    done();
                }
            });
    });
}

for (i in arrayOfFeedUrls) {
    scrape(arrayOfFeedUrls[i]);
}

function done() {
    if (lines.length == arrayOfFeedUrls.length) {
        console.log(lines.length);
    }
}
You may also want to look into promises, an alternative programming style to callbacks, which aims to avoid callback hell.
Have to admit that I'm very new to node.js and struggling to grok the callback stuff. In my limited experience, adding one more parameter to the callback function may be the trick. The hard question is, which parameter?
In your example, if the function scrape had an extra boolean "lastOne", then it could call console.log(lines) itself. Or, if it understood that a null url meant to stop. However, I don't think even this works, as I'm not sure everything will get done in order. If the 2nd URL takes forever, the last one may complete first, right??? (You could try it). In other words, I still don't know which parameter to add. Sorry...
What seems more reliable is to set a counter to urls.length, and for scrape() to decrement it each time. When the counter reaches 0, it knows that the entire process is done and it should log (or do whatever) with the result. I'm not 100% sure where to declare this counter. Coming from Java I still have little idea what is a static global, what is an instance, whatever...
Now, a true-blue node.jser would pass a function to doWhatever as an extra parameter to scrape(), so that you can do something other than console.log(). :-) But I'd settle for the check for zero.
To elaborate slightly, add a callWhenDone parameter to scrape(), and add (somewhere in all that nesting!!!):
if (--counter <= 0)
    callWhenDone(lines);
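A self-contained sketch of that counter idea, with a hypothetical fetchTitles standing in for the http.get + FeedParser work:

```javascript
const urls = ['url1', 'url2', 'url3'];
const lines = [];
let counter = urls.length;

// stand-in for asynchronously fetching one feed's titles
function fetchTitles(url, cb) {
    setImmediate(function () { cb([url + ' title']); });
}

function scrape(url, callWhenDone) {
    fetchTitles(url, function (titles) {
        lines.push.apply(lines, titles);
        if (--counter <= 0) callWhenDone(lines); // last one to finish fires this
    });
}

urls.forEach(function (url) {
    scrape(url, function (result) {
        console.log(result.length); // prints 3, once, after every url is done
    });
});
```

Note that the order of completion does not matter here, which is exactly why the counter works for parallel requests.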
OK, so here's how I've solved the problem; feel free to comment and tell me if it's right.
var lines = [];
var arrayOfFeedUrls = [url1, url2, ...];

function scrapeFeeds(array) {
    var url = array.shift();
    http.get(url, function(res) {
        res.pipe(new FeedParser([options]))
            .on('readable', function () {
                var stream = this, item;
                while (item = stream.read()) {
                    line = item.title;
                    lines.push(line);
                }
            })
            .on('end', function () {
                if (array.length) {
                    scrapeFeeds(array);
                }
            });
    });
}

scrapeFeeds(arrayOfFeedUrls);
Thanks for all the answers. I'm looking more in depth at async, as I've got more complicated stuff to do. Let me know what you think of my code; it's always useful.

Trying to iterate through an array and add a get function for each

I want to iterate through the style tags and register a GET route for each. The problem is that the GET callback ends up with a reference to styleTags[i] instead of the value styleTags[i] held when the route was defined.
var styleTags = ['cont', 'ecce'];
for (var i = 0; i < styleTags.length; i++) {
    app.get('/photos-' + styleTags[i], selectNav, function(req, res) {
        getDynPhotos(req, res, styleTags[i]);
    });
}
I'm not entirely clear what problem you are asking about, but I do see an issue with your use of i in the callback. Because the callback you pass to app.get() runs some time later, when a matching request arrives, the loop will have finished and i will no longer hold the value you want. You need to create a closure that captures the value of i. There are several ways to do that. Here's one way, using an immediately invoked function expression (often abbreviated IIFE):
var styleTags = ['cont', 'ecce'];
for (var i = 0; i < styleTags.length; i++) {
    (function(index) {
        app.get('/photos-' + styleTags[index], selectNav, function(req, res) {
            getDynPhotos(req, res, styleTags[index]);
        });
    })(i);
}
This will freeze the value of i in the function argument that I've named index so it will still have the right value at the later time when the callback is called.
If this isn't what you were asking about (though it is still something that needs to be fixed), then please describe in more detail what you were asking about.
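Another way to get the same per-iteration capture is Array.prototype.forEach, which gives each callback its own styleTag binding. Sketched here with a stand-in app that just records its routes (not real Express), so the capture can be checked directly:

```javascript
// stand-in route table instead of a real Express app
const routes = {};
const app = {
    get: function (path, handler) { routes[path] = handler; }
};

const styleTags = ['cont', 'ecce'];
styleTags.forEach(function (styleTag) {
    // styleTag is a fresh binding per iteration -- no IIFE needed
    app.get('/photos-' + styleTag, function () { return styleTag; });
});

console.log(routes['/photos-cont']()); // prints "cont"
console.log(routes['/photos-ecce']()); // prints "ecce"
```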
Don't use a plain for loop here, because var doesn't give you a per-iteration closure. Instead, I recommend async.each:
var async = require('async');
var styleTags = ['cont', 'ecce'];

async.each(styleTags, function(styleTag, callback) {
    app.get('/photos-' + styleTag, selectNav, function(req, res) {
        getDynPhotos(req, res, styleTag);
    });
    callback();
});
I see you're trying to build a Node.js Express route.
I'm somewhat surprised this didn't work, as I thought the declaration of these routes would be synchronous.
Could you look at app.routes to see what Node is putting in place for you? I'm guessing you may have done this already, but I thought I'd mention it.
I have two suggestions if that doesn't work: use regular expressions in your route to isolate the category section of your route (then validate the route at request time), or insert your route directly into the app.routes object structure.

How can I make node wait? or perhaps a different solution?

I am using https://github.com/gpittarelli/node-ssq to query a bunch of TF2 game servers to find out if they are on, and if so, how many players are inside.
Once I find a server that is on and has less than 6 players in it, I want to use that server's Database ID to insert into somewhere else.
Code looks like this:
for (var i = 0;i < servers.length;i++) {
ssq.info(""+servers[i].ip, servers[i].port, function (err, data) {
serverData = deepCopy(data);
serverError = deepCopy(err);
});
if (!serverError) {
if (serverData.numplayers < 6){
//its ok
currentServer = servers[i].id;
i = 99999;
}
}
else {
if (i == servers.length-1){
currentServer = 666;
}
}
}
And then right after I insert into database with https://github.com/felixge/node-mysql .
If I put a console.log(serverData) in there, the info shows up in the console AFTER the insert into the DB and a couple of other things have already happened.
So how do I "stop" node, or should I be looking at / doing this differently?
Update:
A simple solution here is to just move your if statements inside the callback function:
for (var i = 0; i < servers.length; i++) {
    ssq.info("" + servers[i].ip, servers[i].port, function (err, data) {
        serverData = deepCopy(data);
        serverError = deepCopy(err);
        // moving inside the function, so we have access to defined serverData and serverError
        if (!serverError) {
            if (serverData.numplayers < 6) {
                //its ok
                currentServer = servers[i].id;
                i = 99999;
                /* add an additional function here, if necessary */
            }
        } else {
            if (i == servers.length - 1) {
                currentServer = 666;
                /* add an additional function here, if necessary */
            }
        }
    });
    // serverData and serverError are undefined outside of the function
    // because node executes these lines without waiting to see if ``ssq.info``
    // has finished.
}
Any additional functions within the callback to ssq.info will have access to variables defined within that function. Do be careful with nesting too many anonymous functions.
Original (nodesque) Answer
If ssq.info is an asynchronous function (which it seems to be), Node will execute it immediately and move on, only dealing with the callback function (which you passed as the last parameter) when ssq.info has finished. That is why your console.log statement executes immediately. This is the beauty/terror of Node's asynchronous nature :)
You can use setTimeout to make Node wait, but that will hold up every other process on your server.
The better solution, in my opinion, would be to make use of Node's Event Emitters, to:
Watch for an event (in this case, when a player leaves a server)
Check if the number of players is less than 6
If so, execute your query function (using a callback)
A good primer on this is: Mixu's Node Book - Control Flow. Also, see this SO post.
You should use a callback:
connection.query('INSERT INTO table', function(err, rows, fields) {
    if (err) throw err;
    // entry should be inserted here.
});
Also, the http://www.sequelizejs.com/ library is a bit more mature; it could be an implementation problem with node-mysql.