I'm obviously missing something very important about how Node.js and Actionhero work.
Can someone please explain to me why the following problem occurs:
I have this simple code:
Basically, it reads a .css file.
buildAdsHTML: function (ads, type) {
    var cssDir = path.normalize(__dirname + '/../public/css/');
    switch (type) {
        case 'text':
            var result = '';
            console.log("STEP 1");
            fs.readFile(cssDir + type + 'Ads.css', {encoding: 'utf8'}, function (err, data) {
                console.log(err);
                console.log(data);
                if (!err) {
                    console.log("STEP 2");
                    result += '<style>' + data + '</style>';
                } else {
                }
            });
            break;
    }
    console.log('STEP 3');
    return result;
}
Now, when this code is run, I get an empty string as the result. The console output is the following:
STEP 1 <- so far so good
STEP 3 <- this should be last!
null <- this is the error variable
.some-random-css-class{
display: block;
width: 100;
min-height: 250px;
}
STEP 2
Now at some point I figured out that fs.readFile is an async function.
So I naturally changed it to the sync version, fs.readFileSync.
Now the console output is even worse:
STEP 1
STEP 3
That's it! Nothing else.
And I still get an empty string as the result.
It's like the whole code isn't even going through the switch.
I've noticed this behavior in all functions and methods of my Actionhero project,
most notably when calling next(connection). You can't just call it at the end of the method.
For every if or switch,
I have to call it inside to have any actual control over the result.
Otherwise, every piece of new data I've added to the connection is lost.
What's with that? Please explain that functionality in detail so I don't make any dumb mistakes while coding.
Finally figured it out.
The problem was that I didn't notice that fs.readFileSync takes no callback function.
So the proper use of that function should have been:
var result = fs.readFileSync(pathToFile, {encoding: 'utf8'});
return result;
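If you'd rather keep the asynchronous fs.readFile, the result has to be handed back through a callback instead of being returned. Here is a minimal sketch; the extra done parameter is my own illustrative addition, not part of the original method:
buildAdsHTML: function (ads, type, done) {
    var cssDir = path.normalize(__dirname + '/../public/css/');
    switch (type) {
        case 'text':
            // Hand the result to the caller only once the file has been read.
            fs.readFile(cssDir + type + 'Ads.css', {encoding: 'utf8'}, function (err, data) {
                if (err) { return done(err); }
                done(null, '<style>' + data + '</style>');
            });
            break;
    }
}
A caller would then do buildAdsHTML(ads, 'text', function (err, html) { ... }) instead of relying on the return value.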
I could still use some help on the problem I mentioned with next(connection); I don't really know why it skips whole parts of the code like that.
I've found next() doesn't like it if you pass it any parameters. I had a bug in my application for a good day or two because I was doing '.then(next);'
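The reason data added to the connection gets lost when next(connection) is only called at the end of the method is the same asynchrony issue as with fs.readFile above: if the data is added inside an asynchronous callback, next fires before that callback has run. A rough sketch of the principle; the exact Actionhero action signature and the arguments next expects vary between versions, so treat the surrounding shape as an assumption:
// Hand control back only after the async work is done.
run: function (api, connection, next) {
    var cssDir = path.normalize(__dirname + '/../public/css/');
    fs.readFile(cssDir + 'textAds.css', {encoding: 'utf8'}, function (err, data) {
        if (!err) {
            connection.response.adsHtml = '<style>' + data + '</style>';
        }
        next(connection); // calling next(connection) at the bottom of run() instead would fire before readFile finishes
    });
}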
output = true;
if ($("#password-field").css('display') != 'none') {
    if (!($("#verificationCode").val())) {
        output = false;
        $("#code-error").html("required");
    }
    var codeverify = function () {
        var code = document.getElementById("verificationCode").value;
        coderesult
            .confirm(code)
            .then(function (result) {
                if (result.user.uid) {
                    let phoneNumber = result.user.phoneNumber;
                    //alert(output);
                    alert("Verification done");
                    console.log(phoneNumber);
                } else {
                    alert(output);
                    $("#code-error").html("no user");
                    output = false;
                }
            })
            .catch(function (error) {
                output = false;
                $("#code-error").html("wrong");
                alert(error.message);
            });
    }();
}
return output;
When I run this code, everything works fine, but it returns output as true before codeverify has finished its check, even when the logic inside codeverify() sets it to false.
P.S. I am using a wizard form.
This comes down to how you write JavaScript code. I've found that usually, when I get to a point where my procedures are out of sync, it means that I have done something wrong in previous steps. This is usually only fixed by refactoring.
Remember JavaScript does not behave the same as other languages.
What I can see from your procedure is that you are trying to do many things in one go.
Although I do not have a solution I have a suggestion, consider each action that you want your procedure to execute. Declare a separate function for each of these steps, even if your function only has one line to execute.
If there are dependencies make sure they can be resolved by parameterization.
And lastly, think pure functions. Try and structure every function to receive something and return something.
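To make the "one small function per step" idea concrete, here is a tiny hypothetical sketch reusing the selectors from the question; the function names are my own:
// Hypothetical helpers: each does one thing and returns something.
function getVerificationCode() {
    return $("#verificationCode").val();
}

function isCodeFilledIn(code) {
    return Boolean(code); // pure: value in, boolean out
}

function showCodeError(message) {
    $("#code-error").html(message);
}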
Another tip I can give: write your procedure to select and hold elements in variables until they are required. Consider which elements are required during execution and which of those are already in the DOM when execution starts, and set them to variables before you start executing. Then, during execution, if elements are added that may be required later, select them immediately after they are placed in the DOM. This means that as your procedure executes, all the ingredients are available to do whatever must be done; it doesn't have to go find what it needs on the fly.
Good Luck and happy coding.
Your coderesult.confirm(code) uses a promise (then & catch), so I assume it is asynchronous. It's worth doing some reading on what async means.
One important aspect of JS behavior is that JS always processes the rest of the code without waiting for an async function in between to finish.
Sample:
console.log(1)
setTimeout(() => {
    console.log(2, "I'm supposed to be at position 2, but I'm last. This is because setTimeout is an async function")
}, 1000)
console.log(3)
In your case, your codeverify function has async code (.confirm()) in between, so JS will process the code after codeverify (return output) without waiting for codeverify to fully complete.
Since your output was set to true at the beginning, it will be returned as true from the last line of code (return output) before your codeverify completes; this is why you always get true. If you change the first line to output = undefined, I believe you will see an undefined result.
To solve this, one way is to wrap the entire codeverify as a Promise.
function codeverify() {
    return new Promise((resolve, reject) => {
        var code = document.getElementById("verificationCode").value;
        coderesult.confirm(code).then(function (result) {
            if (result.user.uid) {
                let phoneNumber = result.user.phoneNumber;
                //alert(output);
                alert("Verification done");
                console.log(phoneNumber);
                output = true;
                resolve(output);
            } else {
                alert(output);
                $("#code-error").html("no user");
                output = false;
                resolve(output);
            }
        }).catch(function (error) {
            output = false;
            $("#code-error").html("wrong");
            alert(error.message);
            reject(output); // you can reject(error.message) instead so you can pass the error message back
        });
    });
}
You can execute the function like this:
codeverify().then(successData => {
    console.log(successData); // from what you resolve
}).catch(failedData => {
    console.log(failedData); // from what you reject
});
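If your environment supports async/await, the same call can be written more linearly; this is just an equivalent way of consuming the promise above, and the validate wrapper is hypothetical:
// Hypothetical wrapper: wait for the verification before deciding the result.
async function validate() {
    try {
        return await codeverify(); // the resolved output value
    } catch (failedData) {
        return false;              // the promise was rejected (wrong code, etc.)
    }
}
Note that whatever calls validate() also has to wait on the returned promise; a synchronous return from the wizard's validation hook still won't see the result.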
Lastly, take some time to study what asynchronous means and what Promises have to do with async, because it is everywhere in JS.
I am using the writeFileSync function to write a file locally. The file does get written; however, the callback function is never called.
I did some googling; some other posts have the issue that either 1) passing the content went wrong, or 2) there are two write functions running at the same time.
My problem is that there are some other places in my code that use writeFileSync, but they are on different routes (not sure if this is the right terminology, localhost:port#/differentroutes <- something like this). I am testing only on my own route, so those write functions shouldn't even be called.
Here is my code:
if (!fs.existsSync(dir)) {
    fs.mkdirSync(dir);
}
// content is just a string var, I swear it's just a string
fs.writeFileSync('./pages/SubmissionProcess.html', content, function (err) {
    if (err) {
        throw err;
    } else {
        console.log("YES");
    }
});
I never see this "YES" nor an error in my console, even though the file is already written...
writeFileSync doesn't take a callback :D
Take a look at the documentation :
https://nodejs.org/api/fs.html#fs_fs_writefilesync_file_data_options
The parameters are (path, data, options)
If you want to check whether the file was actually written, you can readFileSync it after writing, or check the last time the file was updated. Otherwise, you should try using the async approach.
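For comparison, the asynchronous fs.writeFile is the variant that does take a callback; a minimal sketch using the same file and content variable from the question:
// fs.writeFile (async) accepts a completion callback, unlike writeFileSync.
fs.writeFile('./pages/SubmissionProcess.html', content, function (err) {
    if (err) {
        throw err;
    }
    console.log("YES"); // runs only after the write has completed
});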
All of the synchronous methods throw rather than passing an error to a callback.
try {
    fs.writeFileSync('./pages/SubmissionProcess.html', content);
    console.log('YES');
} catch (e) {
    console.error(e);
}
I've stumbled upon (or created myself, of course) an error that I cannot model in my head. I'm iteratively calling a URL using the webdriverio client with different IDs and parsing the resulting HTML. However, the html variable gets overwritten with the last element in the loop, which results in the array containing multiple duplicates of the last html value:
async.forEach(test, function (id, callback) {
    self.url('https://<api-page>?id=' + id).getHTML('table tbody', true).then(function (html) {
        // Parse HTML
        parser.write(html);
        parser.end();
        // Add course to person, proceed to next.
        callback();
    });
}, function (err) {
    self.end().finally();
    res.json(person);
});
Parsing is done using the htmlparser2 NPM library. The html variable always holds the last element, even though I can see it going through the different API IDs with different data. I would think the error lies in how I get the HTML and return it, but I cannot say why, nor have any of my fixes worked.
Hopefully someone more skilled than me can see the error.
Thanks in advance,
Chris
UPDATE/Solution - See solution below
I am not sure if I understood the context quite well, but the html variable is not overridden; it is just the last chunk that you've retrieved from the self.url function call. If you want to have the whole result saved in a variable, you should keep appending the result on every loop. Probably, you need something like this:
var html = '';
async.forEach(test, function (id, callback) {
    self.url('https://<api-page>?id=' + id).getHTML('table tbody', true).then(function (tmpHtml) {
        // Parse HTML
        parser.write(tmpHtml);
        parser.end();
        html += tmpHtml;
        // Add course to person, proceed to next.
        callback();
    });
}, function (err) {
    self.end().finally();
    res.json(person);
});
I finally figured it out: I had missed that async.forEach executes the function in parallel, whereas the function I needed was async.timesSeries, which executes the functions in a loop, waiting for each function to finish before starting the next! I've attached the working code below:
async.timesSeries(3, function (n, next) {
    self.url('<api-page>?id=' + n).then(function () {
        console.log("URL Opened");
    }).getHTML('table tbody', true).then(function (html) {
        console.log("getHTML");
        parser.write(html);
        parser.end();
        next();
    });
}, function (err, results) {
    // Add to person object!
    self.end().finally();
    res.json(person);
});
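If you would rather keep iterating your original array of IDs instead of a fixed count, async.eachSeries gives the same one-at-a-time behaviour. A rough sketch along the lines of the code above, reusing the test array and URL pattern from the question:
// Same idea as timesSeries, but driven by the array of IDs in series.
async.eachSeries(test, function (id, next) {
    self.url('https://<api-page>?id=' + id).getHTML('table tbody', true).then(function (html) {
        parser.write(html);
        parser.end();
        next(); // only move on once this ID has been parsed
    });
}, function (err) {
    self.end().finally();
    res.json(person);
});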
It's kind of a stupid question, but I can't really figure it out after 2 hours and can't find any answer on Google.
I'm trying to debug my controller by dropping a breakpoint in my save function, on the line var profile = req.body:
function save(collectionName) {
    return function (req, res, next) {
        var profile = req.body,
            query = {};
        ...
        ...
    };
}
However, the app always breaks inside the _tickCallback function in Node's node.js file:
// Run callbacks that have no domain.
// Using domains will cause this to be overridden.
function _tickCallback() {
    var callback, threw, tock;
    scheduleMicrotasks();
    while (tickInfo[kIndex] < tickInfo[kLength]) {
        tock = nextTickQueue[tickInfo[kIndex]++];
        callback = tock.callback;
        threw = true;
        try {
            callback();
            threw = false;
        } finally {
            if (threw)
                tickDone();
        }
        if (1e4 < tickInfo[kIndex])
            tickDone();
    }
    tickDone();
}
So I tried to step over until it went out of the function; however, that also resumed the application without going back to my breakpoint. Any help would be really appreciated.
I think this situation happens when you use the node-debug command with Node 0.12.*.
This is a Node.js bug: https://github.com/joyent/node/issues/25266
As a workaround, you can use a debugger statement (with node-inspector >= 0.10.1; I recommend node-inspector 0.11.0), or use io.js.
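For reference, the debugger statement workaround just means dropping the keyword where you wanted the breakpoint, for example in the save function from the question:
function save(collectionName) {
    return function (req, res, next) {
        debugger; // execution pauses here whenever a debugger is attached
        var profile = req.body,
            query = {};
        ...
    };
}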
I'm trying to understand callbacks and async programming, but I'm having a bit of trouble.
Here's some pseudocode:
var lines = [];
var arrayOfFeedUrls = [url1, url2, ...];

function scrape(url) {
    http.get(url, function (res) {
        res.pipe(new FeedParser([options]))
            .on('readable', function () {
                var stream = this, item;
                while (item = stream.read()) {
                    line = item.title;
                    lines.push(line);
                }
            });
    });
}

for (i in arrayOfFeedUrls) {
    scrape(arrayOfFeedUrls[i]);
}

console.log(lines.length);
It obviously prints 0, as the scrape function is executed asynchronously. I understand that much, but I've tried many intricate ways and can't figure out how to write it properly. Any help/explanation would be greatly appreciated. I've read (and I'm still reading) a lot of tutorials and examples, but I think the only way for me to get it is to write some code myself. If I solve this, I'll post the answer.
You might want to check this article for an introduction to Node that could help you understand async programming in Node a little better.
As far as async programming goes, async is a very popular module in Node's userland which helps you write asynchronous code effortlessly. For instance (untested pseudo-code):
function scrape(done) {
    http.get(url, function (res) {
        done(null, res); // hand the response to the next waterfall step
    });
}

function parse(res, done) {
    var lines = [];
    res.pipe(new FeedParser([options]))
        .on('readable', function () {
            var stream = this, item;
            while (item = stream.read()) {
                line = item.title;
                lines.push(line);
            }
        })
        .on('end', function () {
            done(null, lines);
        });
}

function done(err, lines) {
    if (err) { throw err; }
    console.log(lines.length);
}

async.waterfall([scrape, parse], done);
This depends on whether you want to scrape all URLs in parallel or in series.
If you were to do it in series, you should think of it like this:
Start with the first URL and scrape it. In the callback, scrape the next URL. In that callback, scrape the next one.
This will give you the notorious callback hell you are talking about, but that is the principle at least. That's where libraries like async remove a lot of the headache.
When programming async calls in this manner, functions and instructions that you want to chain onto the end, such as console.log(lines.length);, must also be called from callbacks. So, for instance, try something like this:
var lines = [];
var arrayOfFeedUrls = [url1, url2, ...];
var completed = 0;

function scrape(url) {
    http.get(url, function (res) {
        res.pipe(new FeedParser([options]))
            .on('readable', function () {
                var stream = this, item;
                while (item = stream.read()) {
                    line = item.title;
                    lines.push(line);
                }
            })
            .on('end', function () {
                done(); // this feed is finished
            });
    });
}

for (i in arrayOfFeedUrls) {
    scrape(arrayOfFeedUrls[i]);
}

function done() {
    completed++;
    if (completed == arrayOfFeedUrls.length) {
        console.log(lines.length);
    }
}
You may also want to look into promises, an alternative programming style to callbacks, which aims to avoid callback hell.
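To make that concrete, here is a rough sketch of the same scraping done with promises and Promise.all; the scrape wrapper is my own illustration, reusing the http/FeedParser pieces from the question:
// Hypothetical promise wrapper around the scrape from the question.
function scrape(url) {
    return new Promise(function (resolve, reject) {
        var lines = [];
        http.get(url, function (res) {
            res.pipe(new FeedParser([options]))
                .on('error', reject)
                .on('readable', function () {
                    var item;
                    while (item = this.read()) {
                        lines.push(item.title);
                    }
                })
                .on('end', function () {
                    resolve(lines); // this feed's titles
                });
        });
    });
}

// Wait for every feed before counting the collected lines.
Promise.all(arrayOfFeedUrls.map(scrape)).then(function (results) {
    var lines = [].concat.apply([], results); // flatten the per-feed arrays
    console.log(lines.length);
});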
I have to admit that I'm very new to node.js and struggling to grok the callback stuff. In my limited experience, adding one more parameter to the callback function may be the trick. The hard question is, which parameter?
In your example, if the function scrape had an extra boolean "lastOne", then it could call console.log(lines) itself. Or it could understand that a null url means to stop. However, I don't think even this works, as I'm not sure everything will get done in order. If the 2nd URL takes forever, the last one may complete first, right??? (You could try it.) In other words, I still don't know which parameter to add. Sorry...
What seems more reliable is to set a counter to urls.length, and have scrape() decrement it each time. When the counter reaches 0, it knows that the entire process is done and it should log (or do whatever) with the result. I'm not 100% sure where to declare this counter. Coming from Java, I still have little idea what a static global is, what an instance is, whatever...
Now, a true-blue node.jser would pass a doWhatever function as an extra parameter to scrape(), so that you can do something other than console.log(). :-) But I'd settle for the check for zero.
To elaborate slightly, add a callWhenDone parameter to scrape(), and add (somewhere in all that nesting!!!):
if (--counter <= 0)
    callWhenDone(lines);
OK, so here's how I've solved the problem; feel free to comment and tell me if it's right.
var lines = [];
var arrayOfFeedUrls = [url1, url2, ...];

function scrapeFeeds(array) {
    var url = array.shift();
    http.get(url, function (res) {
        res.pipe(new FeedParser([options]))
            .on('readable', function () {
                var stream = this, item;
                while (item = stream.read()) {
                    line = item.title;
                    lines.push(line);
                }
            }).on('end', function () {
                if (array.length) {
                    scrapeFeeds(array);
                }
            });
    });
}

scrapeFeeds(arrayOfFeedUrls);
Thanks for all the answers. I'm looking more in depth at async, as I've got more complicated stuff to do. Let me know what you think of my code; it's always useful.