Please could I ask for some advice on a control flow issue with node and redis? (aka Python coder trying to get used to JavaScript)
I don't understand why client.smembers and client.get (Redis lookups) need to be callbacks rather than simply being statements - it makes life very complicated.
Basically I'd like to query a set, and then when I have the results for the set, I need to carry out a get for each result. When I've got all the data, I need to broadcast it back to the client.
Currently I do this inside two callbacks, using a global object, which seems messy. I'm not even sure if it's safe (will the code wait for one client.get to complete before starting another?).
The current code looks like this:
var all_users = [];
// Get all the users for this page.
client.smembers("page:" + current_page_id, function (err, user_ids) {
    // Now get the name of each of those users.
    for (var i = 0; i < user_ids.length; i++) {
        client.get('user:' + user_ids[i] + ':name', function (err, name) {
            var myobj = {};
            myobj[user_ids[i]] = name;
            all_users.push(myobj);
            // Broadcast when we have got to the end of the loop,
            // so all users have been added to the list -
            // is this the best way? It seems messy.
            if (i === (user_ids.length - 1)) {
                socket.broadcast('all_users', all_users);
            }
        });
    }
});
But this seems very messy. Is it really the best way to do this? How can I be sure that all lookups have been performed before calling socket.broadcast?
*scratches head* Thanks in advance for any advice.
I don't understand why client.smembers and client.get (Redis lookups) need to be callbacks rather than simply being statements - it makes life very complicated.
That's what Node is. (I'm pretty sure that this topic was discussed more than enough times here, look through other questions, it's definitely there)
How can I be sure that all lookups have been performed before calling socket.broadcast?
That's what the err parameter in the callback is for. It's a Node standard: the first parameter in a callback is an error object (null if everything is fine). So use something like this to be sure no errors occurred:
if (err) {
    ... // handle errors
    return; // or not, it depends
}
... // process results
But this seems very messy.
You'll get used to it. I actually find it nice when the code is well formatted and the project is cleverly structured.
Other ways are:
Using libraries to control async code-flow (Async.js, Step.js, etc.)
If spaghetti-style code is what you consider messy, define named functions to process the results and pass them as parameters instead of anonymous ones.
If you totally dislike writing stuff callback-style, you might want to try streamlinejs:
var all_users = [];
// Get all the users for this page.
var user_ids = client.smembers("page:" + current_page_id, _);
// Now get the name of each of those users.
for (var i = 0; i < user_ids.length; i++) {
    var name = client.get('user:' + user_ids[i] + ':name', _);
    var myobj = {};
    myobj[user_ids[i]] = name;
    all_users.push(myobj);
}
socket.broadcast('all_users', all_users);
Note that a disadvantage of this variant is that only one username will be fetched at a time. Also, you should still be aware of what this code really does.
Async is a great library and you should take a look. Why? Clean code, a clear process, easy to track, etc.
Also, keep in mind that all your async functions will be processed after your for loop. In your example, that may result in a wrong "i" value. Use a closure:
for (var i = 0; i < user_ids.length; i++) {
    (function (i) {
        client.get('user:' + user_ids[i] + ':name', function (err, name) {
            var myobj = {};
            myobj[user_ids[i]] = name;
            all_users.push(myobj);
            // Broadcast when we have got to the end of the loop,
            // so all users have been added to the list -
            // is this the best way? It seems messy.
            if (i === (user_ids.length - 1)) {
                socket.broadcast('all_users', all_users);
            }
        });
    })(i);
}
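The loop-variable problem that the IIFE fixes can be seen in a few lines of plain Node. Here setImmediate stands in for the asynchronous Redis callback, and the arrays just collect the results so you can compare the two loops:

```javascript
var plain = [];
var closed = [];

// Without a closure: every callback sees the final value of i.
for (var i = 0; i < 3; i++) {
  setImmediate(function () { plain.push(i); });
}

// With the IIFE: each callback captures its own copy of j.
for (var j = 0; j < 3; j++) {
  (function (j) {
    setImmediate(function () { closed.push(j); });
  })(j);
}

// Scheduled last, so it runs after all six callbacks above.
setImmediate(function () {
  console.log('plain:  ', plain);  // [ 3, 3, 3 ]
  console.log('closure:', closed); // [ 0, 1, 2 ]
});
```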
To know when everything has finished, use a pattern like the one Async implements (I think); it's much simpler than doing it yourself.
async.series({
    getMembers: function (callback) {
        client.smembers("page:" + current_page_id, callback);
    }
}, function (err, results) {
    var all_users = [];
    async.forEachSeries(results.getMembers, function (item, cb) {
        all_users.push(item);
        cb();
    }, function (err) {
        socket.broadcast('all_users', all_users);
    });
});
This code may not be valid, but you should be able to figure out how to do it.
The Step library is good too (and only ~30 lines of code, I think).
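If you'd rather not pull in a library, the "how do I know when every lookup has finished" question can also be answered with a simple completion counter. A minimal sketch; the Redis client is stubbed with an in-memory object (the keys and names are made up) so the example is self-contained:

```javascript
// Stub standing in for the real Redis client; it follows the standard
// (err, result) callback convention, but answers synchronously from memory.
var fakeDb = {
  'user:1:name': 'alice',
  'user:2:name': 'bob'
};
var client = {
  smembers: function (key, cb) { cb(null, ['1', '2']); },
  get: function (key, cb) { cb(null, fakeDb[key]); }
};

function getAllUsers(pageId, done) {
  client.smembers('page:' + pageId, function (err, userIds) {
    if (err) return done(err);
    var allUsers = [];
    var completed = 0;
    userIds.forEach(function (id) {
      client.get('user:' + id + ':name', function (err, name) {
        if (err) return done(err);
        var entry = {};
        entry[id] = name;
        allUsers.push(entry);
        // The counter, not the loop index, tells us when we're done --
        // safe even if replies arrive out of order.
        if (++completed === userIds.length) done(null, allUsers);
      });
    });
  });
}

getAllUsers(42, function (err, users) {
  console.log(users); // [ { '1': 'alice' }, { '2': 'bob' } ]
});
```

Unlike the `i === (user_ids.length - 1)` check, the counter is correct even when replies complete out of order, and forEach sidesteps the loop-variable closure problem entirely.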
I don't understand why client.smembers and client.get (Redis lookups)
need to be callbacks rather than simply being statements - it makes
life very complicated.
Right, so everyone agrees callback hell is no bueno. As of this writing, callbacks are falling out of favor in Node. Unfortunately, the Redis library does not have native support for returning Promises.
But there is a module you can require in like so:
const util = require("util");
This is a standard library that is included in the Node runtime and has a bunch of utility functions we can use, one of them being "promisify":
https://nodejs.org/api/util.html#util_util_promisify_original
Now of course, when you asked this question seven years ago, util.promisify(original) did not exist, as it was added in the v8.0.0 release, so we can now give this question an updated answer.
So promisify is a function we can pass another function, such as client.get(), and it will return a new function that takes the nasty callback behavior and wraps it up nice and neat so that it returns a Promise.
In other words, promisify takes any function that accepts a callback as its last argument and makes it return a Promise instead - which sounds like exactly the behavior you wanted seven years ago and that we're afforded today.
const util = require("util");
client.get = util.promisify(client.get);
So we are passing a reference to the .get() function to util.promisify().
This takes your function, wraps it up so instead of implementing a callback, it instead returns a Promise. So util.promisify() returns a new function that has been promisified.
So you can take that new function and override the existing one on client.get().
Nowadays, you do not have to use a callback for Redis lookup. So now you can use the async/await syntax like so:
const cachedMembers = await client.get('user:' + user_ids[i] + ':name');
So we wait for this to be resolved and whatever it resolves with will be assigned to cachedMembers.
The code can be cleaned up further by using an ES6 array helper method instead of your for loop. I hope this answer is useful for current readers, even though the original question is old.
Related
I have a tool whose basic idea is as follows:
//get a bunch of couchdb databases. this is an array
const jsonFile = require('jsonfile');
let dbList = getDbList();
const filePath = 'some/path/to/file';
const changesObject = {};
//iterate the db list. do asynchronous stuff on each iteration
dbList.forEach(function(db){
    let merchantDb = nano.use('db');
    //get some changes from the database. validate inside callback
    merchantDb.get("_changes", function(err, changes){
        validateChanges(changes);
        changesObject['db'] = changes.someAttribute;
        //write changes to file
        jsonFile.writeFile(filePath, changesObject, function (err) {
            if (err) {
                logger.error("Unable to write to file: ");
            }
        });
    });
});

const validateChanges = function(changes) {
    if (!validateLogic(changes)) sendAlertMail();
}
For performance reasons the iteration is not done synchronously, so multiple iterations can be running in 'parallel'. My question is: can this cause any data inconsistencies and/or issues with the file-writing process?
Edit:
The same file gets written to on each iteration.
Edit:2
The changes are stored as a JSON object with key value pairs. The key being the db name.
If you're really writing to a single file, which you appear to be (though it's hard to be sure), then yes: you have a race condition in which multiple callbacks will try to write to the same file, possibly at the same time (remember, I/O isn't done on the JavaScript thread in Node unless you use the *Sync functions). At best that means the last one wins; at worst it means I/O errors because of overlap.
If you're writing to separate files for each db, then provided there's no cross-talk (shared state) amongst validateChanges, validateLogic, sendAlertMail, etc., that should be fine.
Just for detail: it will start the tasks (jobs) that get the changes and then write them out; the callbacks of the calls to get won't run until later, after all of those jobs are queued.
You are creating closures in loops, but the way you're doing it is okay, both because you're doing it within the forEach callback and because you're not using db in the get callback (which would be fine with the forEach callback but not with some other ways you might loop arrays). Details on that aspect in this question's answers if you're interested.
This line is suspect, though:
let merchantDb = nano.use('db');
I suspect you meant (no quotes):
let merchantDb = nano.use(db);
For what it's worth, it sounds from the updates to the question and your various comments like the better solution would be not to write out the file separately each time. Instead, you want to gather up the changes and then write them out.
You can do that with the classic Node-callback APIs you're using like this:
let completed = 0;
//iterate the db list. do asynchronous stuff on each iteration
dbList.forEach(function(db) {
    let merchantDb = nano.use(db);
    //get some changes from the database. validate inside callback
    merchantDb.get("_changes", function(err, changes) {
        if (err) {
            // Deal with the fact there was an error (don't return)
        } else {
            validateChanges(changes);
            changesObject[db] = changes.someAttribute; // <=== NOTE: This line had 'db' rather than db, I assume that was meant to be just db
        }
        if (++completed === dbList.length) {
            // All done, write changes to file
            jsonFile.writeFile(filePath, changesObject, function(err) {
                if (err) {
                    logger.error("Unable to write to file: ");
                }
            });
        }
    });
});
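For what it's worth, if the operations can be wrapped in promises, the same gather-then-write-once logic can be expressed with Promise.all, which waits for every fetch before the single file write. A sketch with stubbed stand-ins for nano and jsonfile (the database names and attributes are made up) so it runs on its own:

```javascript
// Stand-ins for nano and jsonfile so the sketch is self-contained.
const fakeChanges = {
  db1: { someAttribute: 'a' },
  db2: { someAttribute: 'b' }
};
const getChanges = db => Promise.resolve(fakeChanges[db]);
const writeFile = (path, obj) => Promise.resolve(console.log('wrote', path, obj));

function collectChanges(dbList) {
  return Promise.all(dbList.map(db =>
    getChanges(db).then(changes => [db, changes.someAttribute])
  )).then(pairs => {
    // Build the object only after every fetch has finished,
    // then hand it back so it can be written exactly once.
    const changesObject = {};
    for (const [db, attr] of pairs) changesObject[db] = attr;
    return changesObject;
  });
}

collectChanges(['db1', 'db2'])
  .then(changesObject => writeFile('some/path/to/file', changesObject));
```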
Following the documentation, I am using this in my JS:
var store = StoreClient('my secret');
store.set('active', true);
var status = store.get('active');
The variable status never has a value. I'm clearly not using the library correctly.
For context, this is inside a switch statement that does something like this for many of the cases, where some of them need to set or get a value from the StoreClient.
The documentation uses this example:
var store = StoreClient('your secret here');
store
    .set('hello', 'world')
    .then(function() {
        return store.get('hello');
    })
    .then(function(value) {
        // value === 'world'
        return store.delete('hello');
    })
    .then(function() {
        callback();
    })
    .catch(callback);
Because I'm on the amateur side, I'm not super familiar with promises. In that example, it's unclear to me which parts are required in order to [a] set and, eventually, [b] get a value. I suggest including an example that doesn't have set/get/delete combined into one.
I tried this:
var store = StoreClient('my secret');
store
    .set('active', true)
    .then(function() {
        return store.get('active');
    })
    .then(function() {
        callback();
    })
    .catch(callback);
... but then I get an error that there is no output variable, even though I haven't touched the output variable at the bottom of the script.
David from Zapier's Platform team here.
Sorry about the confusion in the docs. I'll give you a quick answer on how to fix your code and a long one as to why. In the meantime, I'll make a note to update the docs with more sensical examples.
Short
Two big things:
A. Promises pass along whatever was returned from the previous function; if you don't bring the value along, it's lost. Your code should read:
.then(function(storedVal) { // <- that parameter is missing in your code
    console.log('stored val is', storedVal);
})
B. You need to provide a value to the second argument of callback. There's a better example here.
.then(function(storedVal) {
    callback(null, {active: storedVal});
})
Long
Here's some of the nitty gritty on how to make all Zapier code work great.
Callback
Your code runs inside AWS Lambda, which always needs to know when you're finished. It executes all of your code in a special function with a certain set of arguments. The pertinent one here is callback, a function that you can call when you're ready to exit (or have an error). You can read more about that setup here.
Like most node callbacks, the callback has the function signature callback (error, result). To throw an error, you pass something in the first spot:
callback({msg: 'thing went wrong'});
To pass a result, use the second (and nothing in the first)
callback(null, {myData: 4});
So, not passing anything there is why the zap result isn't seeing any data.
Promises
In general, callbacks suck and are confusing to work with, so we designed StoreClient to return promises. There's a lot of materials about promises online so I won't go into the details here. The important thing is that whatever gets returned from a promise's function is the argument in the next one. For example:
Promise.resolve(1)
    .then(function(val) {
        // val === 1
        return Promise.resolve(val + 1)
    })
    .then(function(val) {
        // val === 2
    })
There's a more practical example in these docs:
var store = StoreClient('your secret here');
var outCount;
store
    .get('some counter')
    .then(function(count) {
        count = (count || 0) + 1;
        outCount = count;
        return store.set('some counter', count);
    })
    .then(function() {
        callback(null, {'the count': outCount});
    })
    .catch(callback);
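Putting both fixes together, the original snippet would look roughly like this. StoreClient and callback are stubbed here (an in-memory Map mimicking the promise-returning API from the docs) so the sketch is self-contained; in a real Zap you'd use the real client and the callback Lambda hands you:

```javascript
// In-memory stand-in for Zapier's StoreClient, mimicking its
// promise-returning set/get API (an assumption for this sketch).
function StoreClient(secret) {
  const data = new Map();
  return {
    set(key, value) { data.set(key, value); return Promise.resolve(); },
    get(key) { return Promise.resolve(data.get(key)); }
  };
}
// Stand-in for the Lambda-provided callback.
const callback = (err, result) => console.log(err || result);

const store = StoreClient('my secret');
store
  .set('active', true)
  .then(function () {
    return store.get('active');
  })
  .then(function (storedVal) {             // <- the resolved value arrives here
    callback(null, { active: storedVal }); // <- and must be handed to callback
  })
  .catch(callback);
```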
Hopefully that clears things up a bit!
Also, if you want to give Python a try, you can do the same code, but much simpler (example here).
Either way, let us know if there's anything else we can do to help!
I'm writing a Service Wrapper in AngularJS for Odoo Server, which has all the methods that the server supports and return a deferred promise when the service is call. E.g.:
$scope.runRPC = function(){
    odooSvc.fields_get('res.users').then(
        function(result){
            console.log(result); // returns a list of users
        }
    );
}
However, I need it to be synchronous. Here's why.
Odoo has its own JSON-RPC API, with several methods that depend on each other.
For example:
search_read: gives you a list of everything on the model you query
fields_get: gives you the list of fields the model has
and much more.
Usually, in a working application, we need to call two or more API methods to get the final data we want. However, because everything in JavaScript works asynchronously, the code I imagine would be nested and complicated.
So when I make API calls that depend on one another, it looks like this:
$scope.setLoginToContactEmail = function(){
    odooSvc.search_read('res.users').then(
        function(users){
            for(var i=0; i < user.length; i++){
                user = users[0];
                login = user.login
                partner_id = user.partner_id
                odooSvc.read_model('res.partner', partner_id).then(
                    function(partner){
                        if(login === partner.email){
                            odooSvc.write_model('res.partner', partner_id, {email: login}).then(function(msg){
                                console.log(msg);
                            });
                        }
                    }
                )
            }
        }
    );
}
Versus, if I could run those API calls synchronously, or await the data before proceeding to the next call, it would look much simpler:
$scope.setLoginToContactEmail = function(){
    var users = odooSvc.search_read('res.users');
    for(var i=0; i < user.length; i++){
        user = users[0];
        login = user.login
        partner_id = user.partner_id
        partner = odooSvc.read_model('res.partner', partner_id);
        if (login === partner.email){
            odooSvc.write_model('res.partner', partner_id, {email: login});
        }
    }
}
Please advise. Thanks.
Here is a plunker that takes care of the Babel transpilation:
<body ng-app="app" ng-controller="AsyncController">
<p>{{ message }}</p>
</body>
angular.module('app', []).controller('AsyncController', ['$timeout', '$scope', async function ($timeout, $scope) {
$scope.message = 'no timeout';
$scope.message = await $timeout(() => 'timeout', 2000);
$scope.$apply();
}]);
async...await is as simple as that in TypeScript and ES.next. Two things should be noticed here.
The first is the this context inside the async controller - it may differ from what is expected. This is not a problem when classes are used and async methods are bound as necessary, but it is a problem for non-OOP code where the constructor function is async and cannot reach its this right from the start.
The other is that async...await is powered by native Promises. This means that $scope.$apply() should be called to trigger a digest cycle, despite the fact that the digest-friendly $timeout has been used.
Async is better than sync. However, callbacks can become very messy, so we have promises; promises can also become messy, although not quite as bad. Async-await is the best way to get sync-looking async code, but you have to transpile, so it depends how much tooling you want to use.
Here is how I would write your ES5 code (without starting a new line after .then(, which reduces the indentation a bit; I also made some changes in the for loop, as I wasn't sure what you meant):
$scope.setLoginToContactEmail = function () {
    odooSvc.search_read('res.users').then(function (users) {
        for (var i = 0; i < users.length; i++) {
            var user = users[i]
            var login = user.login
            var partner_id = user.partner_id
            odooSvc.read_model('res.partner', partner_id).then(function (partner) {
                if (login === partner.email) {
                    odooSvc.write_model('res.partner', partner_id, { email: login }).then(function (msg) {
                        console.log(msg)
                    })
                }
            })
        }
    })
}
With ES6 and the proposal for async functions that can become:
$scope.setLoginToContactEmail = async function () {
    const users = await odooSvc.search_read('res.users')
    for (let user of users) {
        const { login, partner_id } = user
        const partner = await odooSvc.read_model('res.partner', partner_id)
        if (login === partner.email) {
            const msg = await odooSvc.write_model('res.partner', partner_id, { email: login })
            console.log(msg)
        }
    }
}
It depends on how much transpiling you want to do. Personally, I would adopt part of ES6 (let/const, destructuring, arrow functions, template strings, modules, unicode improvements, spread operator / rest parameters, improved object literals, and possibly class literals), the stuff that you use most frequently / isn't too difficult to transpile. Maybe also use async-await: it's not a part of ES6 or ES2016, but it is at stage 3 now so it is pretty stable, and it does make async code a lot easier to read. The caveat is that you have to transpile new ES6/ES2016/etc. features using Babel or TypeScript, and use a polyfill for promises (which async-await uses internally).
TL;DR: if you find yourself descending into async hell, the async-await proposal is probably the best solution.
Asynchronous methods are a power of JavaScript that we should utilize: if a task will take a long time to execute, we don't have to wait for it and can meanwhile put another task into execution.
We can see the advantage in the UI, where we can put a long-running task in a callback and avoid the UI being grayed out.
So I would suggest adopting the async approach.
But the way you are sequencing the method calls is not the best way in Angular.
Angular provides $q.when(method()).then(successCallback); with it, control reaches successCallback only once method() execution is done, and method()'s response can be used to make another promise call. This approach helps reduce the complexity of the code, because the chain of callbacks you wrote is not conventionally correct.
$scope.setLoginToContactEmail = function(){
    $q.when(searchRead()).then(function(res) {
        modelRead(res);
    });
}

function searchRead() {
    return odooSvc.search_read('res.users').then(function(res) {
        // #TODO
        return res;
    });
}

function modelRead(res) {
    return odooSvc.read_model('res.partner').then(function(partner) {
        // #TODO
    });
}
I have a function that pulls out from database a random question from Questions collection.
Game_Questions.js - the console.log below prints out the correct value (the string I need), so I thought return would let yield give me back the same value.
exports.random_Question = function *() {
    yield Questions.findRandom().limit(1).exec(function(err, question) {
        console.log("rand q: " + question[0].text);
        return question[0].text;
    });
}
Game.js:
var Game_Questions = require('./backend/Game_Questions');
And here I want to access the question[0].text value from the random_Question function in the snippet above (Game_Questions.js). What I've tried so far:
var found_Question = Game_Questions.random_Question();
var found_Question = Game_Questions.random_Question().next().value;
Those two return [Object object] which after using JSON.stringify() shows that the object is:
{"value":{"emitter":{"domain":null,"_events":{}},"emitted":{},"ended":true},"done":false}
I also tried using co(function*()), but it didn't let me take out the value either. Please advise on how to access it.
The answer by @remus is a callback approach, and Koa was designed explicitly to ditch callbacks. So while it's perfectly good code that would fit an Express application, it is completely at odds with the design philosophy behind Koa.
From the looks of it you are using Mongoose, which has supported promises for async operations since version 4.0 (released April 2015), so a yield approach can be taken. Note that I'm assuming you're working with Mongoose - I hope I'm not wrong!
Here is some nice documentation on how Mongoose fits nicely with Koa.
So first of all, make sure you are using a version of Mongoose that supports yield. If not, you'll have to use @remus's approach, or manually wrap each of your methods so they are yield-compatible (i.e. wrap them with promises).
But if you are using a compatible version (4.0 and upwards) then your code would look something like the following:
exports.random_Question = function *() {
    var result;
    try {
        result = yield Questions.findRandom().limit(1).exec();
    } catch(e) {
        console.log(e.stack);
        throw e;
    }
    console.log("rand q: " + result[0].text);
    return result[0].text;
}
Note that I'm assuming the result is an array based on the code you supplied.
The above example doesn't necessarily have to be a generator function. It could also be a normal function that returns a Promise. So alternatively something like this could also be done:
exports.random_Question = function() {
    return Questions.findRandom()
        .limit(1)
        .exec()
        .then(function(result) {
            // exec() resolves with the query result
            var question = result[0];
            console.log("rand q: " + question.text);
            return question.text;
        }).catch(function(e) {
            console.log(e.stack);
            throw e;
        });
}
So for the randomQuestion function all that is important is that it can be yielded by co which handles the Koa application flow control – check tj/co on GitHub for the different objects you can yield.
So finally getting back to the Koa Middleware we can yield either of the above code snippets in the exact same manner. So we'd do:
var koa = require("koa");
var app = module.exports = koa();
var Game_Questions = require('./backend/Game_Questions');

app.use(function*() {
    var resultText;
    try {
        resultText = yield Game_Questions.random_Question();
    } catch(e) {
        this.throw(500);
    }
    this.body = resultText;
    this.status = 200;
});
app.listen(3000);
Something else to note is that I'm a little unsure of the findRandom method in the mongoose query since I don't know if it plays nicely with the Promise features of mongoose. Personally I'd get a normal mongoose query working using yield before reintroducing findRandom just to make sure it's not causing an issue.
My answer is getting a bit long at this point so I'll leave it at that.
Your syntax is pretty strange, but I'm not sure if that's specific to Koa or not.
Because Node.js is event based, use a callback instead:
exports.random_Question = function(callback) {
    Questions.findRandom().limit(1).exec(function(err, question) {
        callback(err, question);
    });
}
}
And use it:
var Game_Questions = require('./backend/Game_Questions');
Game_Questions.random_Question(function(err, question) {
console.log(question);
});
Of some concern as well is your question states you're trying to reference Game_Questions.randomQuestion() when your function is actually named random_Question.
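If you later want this callback version to fit Koa's yield flow after all, it can be wrapped in a promise by hand. A sketch with the Mongoose query stubbed out (the question text is made-up data) so it runs standalone:

```javascript
// Stub standing in for the Mongoose-backed random_Question:
// asynchronous, with the standard (err, result) callback shape.
function random_Question(callback) {
  setImmediate(function () {
    callback(null, [{ text: 'What is Node?' }]);
  });
}

// Promise wrapper: resolves with the question text, rejects on error,
// so it can be yielded directly from Koa middleware.
function randomQuestionP() {
  return new Promise(function (resolve, reject) {
    random_Question(function (err, question) {
      if (err) return reject(err);
      resolve(question[0].text);
    });
  });
}

randomQuestionP().then(function (text) {
  console.log(text); // "What is Node?"
});
```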
As a js/node newcomer, I'm having some problems understanding how I can get around this issue.
Basically I have a list of objects that I would like to save to a MongoDB database if they don't already exist.
Here is some code:
var getDataHandler = function (err, resp, body) {
    var data = JSON.parse(body);
    for (var i = 0; i < data.length; i++) {
        var item = data[i];
        models.Entry.findOne({id: item.id}, function(err, result) {
            if (err) { }
            else if (result === null) {
                var entry = new models.Entry(item);
                entry.save(function(err, result) {
                    if (err) {}
                });
            }
        });
    }
}
The problem I have is that, because it is asynchronous, by the time the new models.Entry(item) line executes, the value of item will be equal to the last element of the data array for every single callback.
What kind of pattern can I use to avoid this issue ?
Thanks.
Two kinds of patterns are available:
1) Callbacks. That is, you keep calling functions from your functions, passing them as parameters. Callbacks are generally fine but, especially server-side when dealing with a database or other asynchronous resources, you quickly end up in "callback hell", and you may grow tired of looking for tricks to reduce the indentation levels of your code. You may also sometimes wonder how you really deal with exceptions. But callbacks are the basis: you must understand how to solve this problem using callbacks.
2) Promises. Using promises, you may have something like this (example from my related blog post):
db.on(userId) // get a connection from the pool
.then(db.getUser) // use it to issue an asynchronous query
.then(function(user){ // then, with the result of the query
ui.showUser(user); // do something
}).finally(db.off); // and return the connection to the pool
Instead of passing the next function as a callback, you just chain with then (in fact it's a little more complex; there are other functions, for example to deal with collections, parallel resolution, or clean error catching).
Regarding your scope problem, where the variable changes before the callback is called, the standard solution is this one:
for (var i = 0; i < n; i++) {
    (function(i){
        // any function defined here (a callback) will use the value of i fixed when iterating
    })(i);
}
This works because calling a function creates a scope, and the callback you create in that scope retains a pointer to that scope, where it will fetch i (that's called a closure).
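Applied to the findOne loop from the question, the pattern looks like this. The database call is stubbed with setImmediate so the sketch runs standalone; with real Mongoose you'd keep your models.Entry calls:

```javascript
// Stub standing in for models.Entry.findOne: asynchronous, and it always
// reports "not found" so every item gets "saved" in this sketch.
function findOne(query, cb) {
  setImmediate(function () { cb(null, null); });
}

var data = [{ id: 1 }, { id: 2 }, { id: 3 }];
var savedIds = [];

for (var i = 0; i < data.length; i++) {
  (function (item) {            // the IIFE freezes `item` for this iteration
    findOne({ id: item.id }, function (err, result) {
      if (err) return;
      if (result === null) {
        savedIds.push(item.id); // each callback sees its own `item`
      }
    });
  })(data[i]);
}

// Scheduled after the three lookups, so it runs once they are done.
setImmediate(function () {
  console.log(savedIds); // [ 1, 2, 3 ]
});
```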