I have read many posts on how to create async functions in Node.js, but I can't figure it out! I know this is one of the most asked topics, but look at this sample code:
function test2() {
    console.log("Check x");
}

function test(callback) {
    for (var i = 0; i < 1000000000000; i++) {}
    callback();
}
console.log("Check 1");
test(test2);
console.log("Check 2");
console.log("Check 3");
Now, shouldn't Node.js consider test to be an async function?
And if not, how do I write it so that I can reach the logging of Check 2 and Check 3 without waiting for the loop to end?
Not everything in Node.js is asynchronous.
Asynchronous processing only happens when I/O or events are involved, such as accessing the file system, handling network requests, or reading data from a database.
Example:
var fs = require('fs'); // Node.js built-in file system module, which requires I/O from storage

function getDataFromFile(callback) {
    // fs.readFile is an asynchronous operation
    fs.readFile('path/to/file', (err, data) => {
        if (err) throw err;
        callback(data);
    });
}

getDataFromFile(function(data) {
    // this is the asynchronous callback from getDataFromFile()
    console.log('data ' + data);
});
I am currently using AngularJS as my frontend framework and Express on Node.js to provide the REST API as my backend.
In my insertService function in Node.js, after I insert some values into the database, I want to commit and release the connection. However, I am getting the following error:
NJS-032: connection cannot be released because a database call is in progress
These are my commit and release functions:
function doRelease(connection)
{
    console.log("before release");
    connection.release(
        function(err) {
            if (err)
                console.error(err.message);
        });
    console.log("after release");
}
function doCommit(connection)
{
    console.log("before commit");
    connection.commit(
        function(err) {
            if (err)
                console.error(err.message);
        });
    console.log("after commit");
    doRelease(connection);
}
This is how I am calling them:
app.post('/addService', function(req, res)
{
    console.log("addService is called");
    oracledb.getConnection(
        DBconfig,
        function(err, connection)
        {
            if (err) { console.error(err); return; }
            connection.execute(
                "Insert into mylist Values (MYLIST_ID_SEQUENCE.nextval ,'"+req.body.name+"','"+req.body.description+"')",
                function(err, result)
                {
                    if (err) { console.error(err); doRelease(connection); return; }
                    console.log("added mylist: "+req.body.name);
                    doCommit(connection);
                }
            );
        }
    );
})
This is the print out:
addService is called
before commit
after commit
before release
after release
NJS-032: connection cannot be released because a database call is in progress
How should I handle this issue? Should I sleep for 1 second before calling release? Should I recursively call release until it is successful?
Thanks
Before I answer your question I have to point out that your code is currently open to SQL injection vulnerabilities. Values from end users (in this case from req.body) should not be concatenated into the SQL; they should be "bound" in with bind variables.
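For illustration, a minimal sketch of the same insert using named bind variables (adapt to your schema; error handling stays as before):
connection.execute(
    "Insert into mylist Values (MYLIST_ID_SEQUENCE.nextval, :name, :description)",
    { name: req.body.name, description: req.body.description }, // user input is bound, never concatenated
    function(err, result) {
        // handle err/result exactly as before
    }
);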
Also, your API will not scale if you're getting one-off connections. You should create a connection pool and get connections from the pool.
Finally, you can use autoCommit (in the execute options object) to save an unnecessary round trip.
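A rough sketch of both suggestions together (pool sizing options are omitted; sql and binds stand in for the statement and bind object above):
// created once at application start-up
var pool;
oracledb.createPool(DBconfig, function(err, p) {
    if (err) { console.error(err); return; }
    pool = p;
});

// inside the route handler, instead of oracledb.getConnection
pool.getConnection(function(err, connection) {
    if (err) { console.error(err); return; }
    connection.execute(
        sql,
        binds,
        { autoCommit: true }, // commits on success, saving a separate commit round trip
        function(err, result) {
            if (err) { console.error(err); }
            doRelease(connection); // returns the connection to the pool
        }
    );
});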
Now to your question, you have to wait until the commit finishes before releasing the connection. In doCommit, move the call to doRelease so that it's in the callback to connection.commit:
function doCommit(connection) {
    console.log("before commit");
    connection.commit(function(err) {
        if (err) {
            console.error(err.message);
        }
        console.log("after commit");
        doRelease(connection);
    });
}
On another note, I have a series on building REST APIs you might want to check out. In part 2, on database basics, I show how you can simplify these types of simple statement executions. There are links to the code in GitHub so you should be able to pull it down to see how it works for you.
Here's my code:
console.log("Path 1: " + fullName);
fs.stat(fullName, function(err, stats) {
console.log("Path 2: " + fullName);
}, function(err) { // I don't know if this part actually does something
console.log("An error occurred: " + err); // but I saw it in a different SO answer
});
The code simply doesn't run for some files. That is, my logs will show "Path 1" with the file but not "Path 2" with it, and also no "An error occurred" message. I was thinking maybe the files have an invalid character, because they all have equal signs in them. They look like this:
...file.device_type=mobile.jsx
Even if that's the case, why no error or anything? And if so, how can I actually stat these files?
You aren't logging or checking for an error. fs.stat() accepts two parameters only, the filename and a callback function. You are passing three parameters, the filename and two separate callbacks. So that second callback is just wrong. Then, in the first callback, you need to check the err variable to see if an error occurred.
This is the correct usage:
fs.stat(fullName, function(err, stats) {
if (err) {
console.log("Error in fs.stat(): ", err);
} else {
console.log("Got stats: ", stats);
}
});
If you're using this proper form and you still don't see either message in the console, then I'd suggest putting an exception handler around it to see if something else is going on:
try {
console.log("about to call fs.stat()");
fs.stat(fullName, function(err, stats) {
if (err) {
console.log("Error in fs.stat(): ", err);
} else {
console.log("Got stats: ", stats);
}
});
} catch(e) {
console.log("fs.stat() exception: ", e);
}
In looking at the source code for fs.stat(), there are several ways it could throw a synchronous exception, particularly if it detects invalid arguments passed to it. As usual, the node.js documentation does not describe that behavior, but you can see it in the code. This is the code for fs.stat():
fs.stat = function(path, callback) {
callback = makeStatsCallback(callback);
if (handleError((path = getPathFromURL(path)), callback))
return;
if (!nullCheck(path, callback)) return;
var req = new FSReqWrap();
req.oncomplete = callback;
binding.stat(pathModule._makeLong(path), req);
};
Both makeStatsCallback() and handleError() can throw an exception (when you look at their implementations in that same file).
I do not know where you got the notion of passing two callbacks to fs.stat(). As documented here, it does not accept two callback functions. It looks remotely like a promisified version of the fs library where every async operation returns a promise and then you can pass two callbacks to fs.statPromise.then(fn1, fn2), but I have no idea if that's where you saw this or not.
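For reference, newer Node versions (10+) do ship a built-in promise-based API under fs.promises, and with it a two-handler form is legitimate because it goes through .then():
const fsp = require('fs').promises;

fsp.stat(fullName).then(
    function(stats) { console.log("Got stats: ", stats); },      // fulfilled handler
    function(err) { console.log("Error in fs.stat(): ", err); }  // rejection handler
);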
https://nodejs.org/api/fs.html#fs_fs_stat_path_callback
As per the documentation, fs.stat() doesn't take that third parameter (the second callback function).
The async callback in Meteor.call does not wait for the result from the Meteor method. This is the code:
Meteor.call("fetchData",function (err,res) {
if (err){
console.log("error ", err);
}else {
console.log("success ", res);
return res;
}
});//calling this from onRendered of client/somejs.js
Here is the method:
fetchData: function() {
    HTTP.call("POST", "http://localhost:8080", {
        data: '{"apple":"grape"}'
    }, function (err, res) {
        if (err) {
            console.log("error ", err);
        } else {
            console.log("success ", res);
            return res;
        }
    });
} // Server/methods.js
When Meteor.call is triggered, I get a log on the server as success with its result.
On the client I get success undefined.
The call on the client does not wait for the result. I also tried Fibers and synchronous execution on the server; that does not work for me. In that case a publish is blocked (I guess due to the API call).
Another thing is that I tried the same with a DB query instead of the API call. That works fine; I get the result from the method.
Where am I going wrong? Help.
Thanks,
Sanjith.
You were on the right path with futures. By default, Meteor's methods are async, so some "waiting" mechanism is needed on the client. For this, I'd recommend either using Meteor.wrapAsync or Promises. Here are two detailed explanations on implementing both:
https://themeteorchef.com/snippets/synchronous-methods/#tmc-using-wrapasync
https://themeteorchef.com/snippets/promise-based-modules/#tmc-calling-our-promise-based-module-from-the-client
The second link is more focused on structuring your code using promises, but gives a nice demo of how to call a method that relies on a Promise's response.
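As a rough sketch of the wrapAsync option (reusing the endpoint and payload from the question), the method could look something like this:
// Server/methods.js
Meteor.methods({
    fetchData: function () {
        var syncPost = Meteor.wrapAsync(HTTP.call, HTTP);
        // blocks this method (in a Fiber) until the HTTP response arrives,
        // so the value really is returned to the client's Meteor.call callback
        return syncPost("POST", "http://localhost:8080", {
            data: '{"apple":"grape"}'
        });
    }
});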
I am building a Node.js server using Express 4. I use this server as a middleman between a frontend Angular app and a 3rd party API.
I created a certain path that my frontend app requests, and on that path I wish to call the API multiple times, merge all of the responses, and then send the resulting response.
I am not sure how to do this, as I need to wait until each API call is finished.
Example code:
app.post('/SomePath', function(req, res) {
var merged = [];
for (var i in req.body.object) {
// APIObject.sendRequest uses superagent module to handle requests and responses
APIObject.sendRequest(req.body.object[i], function(err, result) {
merged.push(result);
});
}
// After all is done send result
res.send(merged);
});
As you can see, I'm calling the API within a loop, invoking APIObject.sendRequest once for each item received in the request.
How can I send a response after everything is done and the API responses are merged?
Thank you.
Check out this answer; it uses the Async module to make a few requests at the same time and then invokes a callback when they are all finished.
As per #sean's answer, I believe each would fit better than map.
It would then look something like this:
var async = require('async');
var merged = [];

async.each(req.body.object, function(item, callback) {
    APIObject.sendRequest(item, function(err, result) {
        if (err)
            callback(err);
        else {
            merged.push(result);
            callback();
        }
    });
}, function(err) {
    if (err)
        res.sendStatus(500); // Example
    else
        res.send(merged);
});
First of all, you can't just fire async calls in a loop and send the response right after the loop; the callbacks won't have finished yet.
You can use the async module's map function instead.
var async = require('async');

app.post('/SomePath', function(req, res) {
    async.map(req.body.object, APIObject.sendRequest, function(err, result) {
        if (err) {
            res.status(500).send('Something broke!');
            return;
        }
        res.send(result);
    });
});
I have a node.js server connected to a PostgreSQL database using the pg module. At one point I will insert data into two different database tables for a single HTTP POST. If the first query fails, the second should not be executed, but I have some trouble achieving this.
My generalized query function looks like this:
// General insertion function. If endResponse is true, the response will be ended;
// if it is false, it will just write to the response and not end it (unless an error occurs).
function performInsertQuery(request, response, query, endResponse) {
    var pg = require('pg');
    var client = new pg.Client(request.app.get('postgreConnection'));
    client.connect(function(error) {
        if (error)
        {
            var message = 'Could not connect to PostgreSQL database: ' + error;
            response.end(message);
            console.error(message);
            client.end();
        }
        else
        {
            client.query(query, function (error, result)
            {
                if (error)
                {
                    var message = 'Error running query ' + query + ': ' + error;
                    response.writeHead(500, {'content-type': 'application/json'});
                    response.end(message);
                    console.error(message);
                    client.end();
                }
                else
                {
                    var message = 'Query performed: ' + query;
                    if (endResponse)
                    {
                        response.end(message);
                    }
                    else
                    {
                        response.write(message + '\n');
                    }
                    console.log(message);
                    client.end();
                }
            });
        }
    });
}
Later, I have something like the following:
// query = some query
performInsertQuery(request, response, query, false);
// Stop running here if there was any errors running the query.
// Currently, this gets executed even though the first query fails.
// anotherQuery = another query
performInsertQuery(request, response, anotherQuery, true);
I have tried returning true and false from performInsertQuery, but since these are inner functions the result is not returned properly from the outer function. Also, some of it runs asynchronously, which makes things a bit harder as well. I was not able to make it work with try/catch around the call to performInsertQuery either. I guess I could do another query to see if the data was inserted, but this seems unnecessary and not very robust.
What would be the best way to return a success or failure state from performInsertQuery?
I know this doesn't exactly handle your question as you intended (dealing with this entirely in Node.js), but this sounds like an excellent use case for a Postgres transaction: do not commit results unless all insertions/updates are successful. SQL transactions are built for scenarios like yours.
Here are the docs and example code for it with your node module.
https://github.com/brianc/node-postgres/wiki/Transactions
http://www.postgresql.org/docs/9.2/static/tutorial-transactions.html
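To sketch what the transaction approach could look like with the pg client from the question (insertQuery1 and insertQuery2 are placeholders for your two INSERT statements, and the error handling is kept minimal):
var pg = require('pg');

// Sketch only: wraps both inserts in one transaction so the second never runs
// (and nothing is kept) if the first one fails.
function performBothInserts(request, response, insertQuery1, insertQuery2) {
    var client = new pg.Client(request.app.get('postgreConnection'));
    client.connect(function(error) {
        if (error) { response.end('Could not connect: ' + error); return; }

        var rollback = function(error) {
            client.query('ROLLBACK', function() {
                client.end();
                response.end('Transaction aborted: ' + error);
            });
        };

        client.query('BEGIN', function(error) {
            if (error) { return rollback(error); }
            client.query(insertQuery1, function(error) {
                if (error) { return rollback(error); } // first insert failed: stop here
                client.query(insertQuery2, function(error) {
                    if (error) { return rollback(error); } // second insert failed: undo the first
                    client.query('COMMIT', function(error) {
                        client.end();
                        response.end(error ? 'Commit failed: ' + error : 'Both queries performed');
                    });
                });
            });
        });
    });
}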