Waiting for async data in AngularJS or JavaScript (browser) - javascript

I'm writing a service wrapper in AngularJS for an Odoo server. It has all the methods that the server supports and returns a deferred promise when a service method is called. E.g.:
$scope.runRPC = function () {
  odooSvc.fields_get('res.users').then(
    function (result) {
      console.log(result); // returns a list of users
    }
  );
}
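For context, here is a minimal sketch of what one such wrapper method might look like; the module name, endpoint URL, and response shape are my assumptions, not Odoo's actual API:

angular.module('app').factory('odooSvc', ['$http', '$q', function ($http, $q) {
  // Hypothetical JSON-RPC endpoint; adjust to your Odoo server's routes.
  var RPC_URL = '/web/dataset/call_kw';

  function rpc(model, method, args) {
    var deferred = $q.defer();
    $http.post(RPC_URL, { model: model, method: method, args: args || [] })
      .then(function (response) { deferred.resolve(response.data.result); },
            function (error) { deferred.reject(error); });
    return deferred.promise;
  }

  return {
    fields_get: function (model) { return rpc(model, 'fields_get'); },
    search_read: function (model) { return rpc(model, 'search_read'); }
  };
}]);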
However, I need it to be synchronous, and here's why.
Odoo has its own JSON-RPC API, with several methods that depend on each other.
For example:
search_read: gives you a list of everything on the model you query
fields_get: gives you the list of fields the model has
and much more.
Usually in a working application we need to call two or more API methods to get the final data we want. However, because everything in JavaScript works asynchronously, the code I imagine would be nested and complicated.
So when I make API calls that depend on one another, it looks like this:
$scope.setLoginToContactEmail = function () {
  odooSvc.search_read('res.users').then(
    function (users) {
      for (var i = 0; i < users.length; i++) {
        var user = users[i];
        var login = user.login;
        var partner_id = user.partner_id;
        odooSvc.read_model('res.partner', partner_id).then(
          function (partner) {
            if (login === partner.email) {
              odooSvc.write_model('res.partner', partner_id, { email: login }).then(function (msg) {
                console.log(msg);
              });
            }
          }
        );
      }
    }
  );
}
Versus, if I could get those APIs to run synchronously, or await the data before proceeding to the next call, it would look much simpler:
$scope.setLoginToContactEmail = function () {
  var users = odooSvc.search_read('res.users');
  for (var i = 0; i < users.length; i++) {
    var user = users[i];
    var login = user.login;
    var partner_id = user.partner_id;
    var partner = odooSvc.read_model('res.partner', partner_id);
    if (login === partner.email) {
      odooSvc.write_model('res.partner', partner_id, { email: login });
    }
  }
}
Please advise. Thanks.

Here is a plunker that takes care of the Babel transpilation:
<body ng-app="app" ng-controller="AsyncController">
  <p>{{ message }}</p>
</body>

angular.module('app', []).controller('AsyncController', ['$timeout', '$scope', async function ($timeout, $scope) {
  $scope.message = 'no timeout';
  $scope.message = await $timeout(() => 'timeout', 2000);
  $scope.$apply();
}]);
async...await is as simple as that in TypeScript and ES.next. Two things should be noticed here.
The first one is the this context inside an async controller - it may differ from what is expected. This may not be a problem when classes are used and async methods are bound where necessary. It is a problem for non-OOP code, where the constructor function is async and cannot reach its this right from the start.
Another one is that async...await is powered by native promises. This means that $scope.$apply() should be called to trigger a digest cycle, despite the fact that the digest-friendly $timeout has been used.

Async is better than sync.
Callbacks can become very messy, though, so we have promises. Promises can become messy as well, although not quite as bad. Async-await is the best way to get sync-looking async code, but you have to transpile; it depends how much tooling you want to use.
Here is how I would write your ES5 code (without starting a new line after .then(, which reduces the indents a bit; I also replaced the for loop with forEach, both because I wasn't sure what you meant and because var-scoped loop variables would otherwise be shared across the async callbacks):
$scope.setLoginToContactEmail = function () {
  odooSvc.search_read('res.users').then(function (users) {
    users.forEach(function (user) {
      var login = user.login
      var partner_id = user.partner_id
      odooSvc.read_model('res.partner', partner_id).then(function (partner) {
        if (login === partner.email) {
          odooSvc.write_model('res.partner', partner_id, { email: login }).then(function (msg) {
            console.log(msg)
          })
        }
      })
    })
  })
}
With ES6 and the proposal for async functions that can become:
$scope.setLoginToContactEmail = async function () {
  const users = await odooSvc.search_read('res.users')
  for (let user of users) {
    const { login, partner_id } = user
    const partner = await odooSvc.read_model('res.partner', partner_id)
    if (login === partner.email) {
      const msg = await odooSvc.write_model('res.partner', partner_id, { email: login })
      console.log(msg)
    }
  }
}
It depends on how much transpiling you want to do. Personally, I would adopt part of ES6 (let/const, destructuring, arrow functions, template strings, modules, unicode improvements, spread operator / rest parameters, improved object literals, and possibly class literals), the stuff that you use most frequently / isn't too difficult to transpile. Maybe also use async-await: it's not a part of ES6 or ES2016, but it is at stage 3 now so it is pretty stable, and it does make async code a lot easier to read. The caveat is that you have to transpile new ES6/ES2016/etc. features using Babel or TypeScript, and use a polyfill for promises (which async-await uses internally).
TL;DR: if you find yourself descending into async hell, the async-await proposal is probably the best solution.

Asynchronous methods are a power of JavaScript that we should utilize. The reason is that if a task takes a long time to execute, we do not have to wait for it; meanwhile other tasks can be executed.
We can see the advantage in the UI, where we can put a long-running task in a callback and avoid greying out the UI.
So I would suggest adopting the async approach.
But the way you are sequencing the method calls is not the best way to do it in Angular.
Angular provides $q.when(method()).then(successCallback); with it, control reaches successCallback only when method() has finished, and method()'s response can then be used to make another promise call. This approach helps reduce the complexity of the code compared to the chain of callbacks you built, which is not conventionally correct.
$scope.setLoginToContactEmail = function () {
  $q.when(searchRead()).then(function (res) {
    modelRead(res);
  });
}

function searchRead() {
  return odooSvc.search_read('res.users').then(function (users) {
    // #TODO
    return users;
  });
}

function modelRead(res) {
  return odooSvc.read_model('res.partner').then(function (partner) {
    // #TODO
  });
}
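To make that concrete, here is a hedged sketch of the question's actual task written as a $q chain; it assumes each odooSvc method returns a promise, and uses $q.all so the final log only happens after every partner has been processed:

$scope.setLoginToContactEmail = function () {
  odooSvc.search_read('res.users').then(function (users) {
    // Map each user to a read-then-maybe-write promise and wait for all of them.
    return $q.all(users.map(function (user) {
      return odooSvc.read_model('res.partner', user.partner_id).then(function (partner) {
        if (user.login === partner.email) {
          return odooSvc.write_model('res.partner', user.partner_id, { email: user.login });
        }
      });
    }));
  }).then(function () {
    console.log('all partners processed');
  });
};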

Related

What is the correct pattern with generators and iterators for managing a stream

I am trying to figure out how to arrange a pair of routines to control writing to a stream using the generator/iterator functions in ES2015. It's a simple logging system to use in nodejs.
What I am trying to achieve is a function that external processes can call to write to a log. I am hoping that the new generator/iterator functions mean that if it needs to suspend inside this routine, that is transparent.
stream.write should normally return immediately, but can return false to say that the stream is full. In this case it needs to wait for stream.on('drain', cb) to fire before returning.
I am thinking that the actual software that writes to the stream is a generator function which yields when it is ready to accept another request, and that the function I provide for external callers to write to the stream is an iterator, but I might have this the wrong way round.
So, something like this
var stopLogger = false;
var it = writer();

function writeLog(line) {
  it.next(line);
}

function *writer() {
  while (!stopLogger) {
    var line = yield;
    if (!stream.write(line)) {
      yield *waitDrain(); // can't continue until we get drain
    }
  }
}

function *waitDrain() {
  // Not sure what to do here to avoid waiting
  stream.on('drain', () => {/* do I yield here or something? */});
}
I found the answer here: https://davidwalsh.name/async-generators
I had it backwards.
The code above should be
var stopLogger = false;

function *writeLog(line) {
  yield writer(line);
}

var it = writeLog();

function writer(line) {
  if (stopLogger) {
    setTimeout(() => { it.next(); }, 1); // needed so we can get to yield
  } else {
    if (stream.write(line)) {
      setTimeout(() => { it.next(); }, 1); // needed so we can get to yield
    }
  }
}

stream.on('drain', () => {
  it.next();
});
I haven't quite tried this, just translated from the above article, and there is some complication around errors etc., which the article suggests can be solved by enhancing the iterator to return a promise that gets resolved in a "runGenerator" function. But it solved my main issue, which was about how the pattern should work.
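For reference, the "runGenerator" idea from that article looks roughly like this sketch (not verbatim from the article): it resumes the iterator whenever a yielded promise settles, which is also where error handling can hook in.

function runGenerator(gen) {
  var it = gen();
  (function iterate(val) {
    var ret = it.next(val);
    if (ret.done) return;
    if (ret.value && typeof ret.value.then === 'function') {
      // Resume the generator when the yielded promise settles.
      ret.value.then(iterate, function (err) { it.throw(err); });
    } else {
      // Defer immediate values to avoid deep synchronous recursion.
      setTimeout(function () { iterate(ret.value); }, 0);
    }
  })();
}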

Getting value of generator function, koa

I have a function that pulls a random question out of the Questions collection in the database.
Game_Questions.js - the console.log below prints the correct value (the string I need), so I thought that return would let yield give me back the same value.
exports.random_Question = function *() {
  yield Questions.findRandom().limit(1).exec(function (err, question) {
    console.log("rand q: " + question[0].text);
    return question[0].text;
  });
}
Game.js:
var Game_Questions = require('./backend/Game_Questions');
And here I want to access the question[0].text value from the random_Question function in the code snippet above (Game_Questions.js). What I've tried so far:
var found_Question = Game_Questions.random_Question();
var found_Question = Game_Questions.random_Question().next().value;
Those two return [object Object], which after using JSON.stringify() shows that the object is:
{"value":{"emitter":{"domain":null,"_events":{}},"emitted":{},"ended":true},"done":false}
I also tried using co(function*()), but it also didn't let me take out the value. Please help me access it.
The answer by #remus is a callback approach and Koa was designed explicitly to ditch callbacks. So while it's perfectly good code and would fit an Express application it is completely at odds with the design philosophy behind Koa.
From the looks of it you are using Mongoose which has supported promises for async operations since version 4.0 (which was released Apr 2015) which should allow a yield approach to be taken. Note I'm making an assumption you are working with Mongoose - I hope I'm not wrong!
Here is some nice documentation on how Mongoose would fit nicely with koa.
So first of all, make sure you are using a version of Mongoose that supports using yield. If not, you'll have to use #remus's approach or manually wrap each of your methods so they are yield-compatible (i.e. wrap them with promises).
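As a sketch of what that manual wrapping might look like (assuming the usual error-first callback signature on exec()):

exports.random_Question = function *() {
  // Wrap the callback-style exec() in a promise so co can yield it.
  var result = yield new Promise(function (resolve, reject) {
    Questions.findRandom().limit(1).exec(function (err, questions) {
      if (err) return reject(err);
      resolve(questions);
    });
  });
  return result[0].text;
};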
But if you are using a compatible version (4.0 and upwards) then your code would look something like the following:
exports.random_Question = function *() {
  var result;
  try {
    result = yield Questions.findRandom().limit(1).exec();
  } catch (e) {
    console.log(e.stack);
    throw e;
  }
  console.log("rand q: " + result[0].text);
  return result[0].text;
}
Note that I'm assuming the result is an array based on the code you supplied.
The above example doesn't necessarily have to be a generator function. It could also be a normal function that returns a Promise. So alternatively something like this could also be done:
exports.random_Question = function () {
  return Questions.findRandom()
    .limit(1)
    .exec()
    .then(function (result) {
      // exec() resolves with the query result,
      // which is passed to then() as an argument
      var question = result[0];
      console.log("rand q: " + question.text);
      return question.text;
    }).catch(function (e) {
      console.log(e.stack);
      throw e;
    });
}
So for the random_Question function, all that is important is that it can be yielded by co, which handles the Koa application flow control – check tj/co on GitHub for the different objects you can yield.
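As a quick illustration of co's yieldables (a sketch, not an exhaustive list – the co README has the full set), any of these can appear after yield inside code that co drives, such as Koa 1.x middleware:

function *example() {
  var a = yield Promise.resolve(1);          // a promise
  var b = yield [Promise.resolve(2),
                 Promise.resolve(3)];        // an array of promises (run in parallel)
  var c = yield function (done) {            // a thunk taking an error-first callback
    setTimeout(function () { done(null, 4); }, 10);
  };
  console.log(a, b, c);                      // 1 [ 2, 3 ] 4
}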
So finally, getting back to the Koa middleware, we can yield either of the above code snippets in exactly the same manner. So we'd do:
var koa = require("koa");
var app = module.exports = koa();
var Game_Questions = require('./backend/Game_Questions');

app.use(function *() {
  var resultText;
  try {
    resultText = yield Game_Questions.random_Question();
  } catch (e) {
    this.throw(500);
  }
  this.body = resultText;
  this.status = 200;
});

app.listen(3000);
Something else to note is that I'm a little unsure of the findRandom method in the mongoose query since I don't know if it plays nicely with the Promise features of mongoose. Personally I'd get a normal mongoose query working using yield before reintroducing findRandom just to make sure it's not causing an issue.
My answer is getting a bit long at this point so I'll leave it at that.
Your syntax is pretty strange, but I'm not sure if that's specific to Koa or not.
Because Node.js is event based, use a callback instead:
exports.random_Question = function (callback) {
  Questions.findRandom().limit(1).exec(function (err, question) {
    callback(err, question);
  });
}
And use it:
var Game_Questions = require('./backend/Game_Questions');
Game_Questions.random_Question(function (err, question) {
  console.log(question);
});
Of some concern as well: your question states you're trying to reference Game_Questions.randomQuestion(), when your function is actually named random_Question.

Writing insert statements with knex.js on Node API confusion

I have a problem I can't really seem to wrap my head around. It's very specific to the Knex.JS implementation, and I'm sure it has nothing to do with PostgreSQL.
The following implementation works when inserting a moderate amount (~500 statements); on larger amounts it fails for other reasons. Regardless, the following will not work for my use case – I need something like the next section.
import knex = require("knex");

(function (items) {
  let db = knex.table("items");
  db.truncate();
  let foo = [];
  items.forEach(function (item) {
    foo.push({
      id: item.id,
      item_data: JSON.stringify(item)
    });
  });
  db.insert(foo).then(function () { /* .. */ });
}(items))
But the following doesn't:
import knex = require("knex");

(function (items) {
  let db = knex.table("items");
  db.truncate();
  let foo = [];
  items.forEach(function (item) {
    db.then(function () {
      return db.insert(foo).into("items");
    });
  });
  db.then(function () { console.log("Done"); });
}(items))
What doesn't work is this:
An inconsistent number of rows gets inserted. In some runs it's a lot MORE rows than I have items (?!)
I get a lot of duplicate key errors in this implementation, since I have a unique constraint.
Additional information:
The set contains no duplicate keys.
I'm using PostgreSQL as the backend.
The question is mostly how to implement the desired behaviour. The ideal solution would deal in chunks of, say, 500 items. I've already posted an issue with the project (https://github.com/tgriesser/knex/issues/826), but I'm hoping some people from the Knex.JS community are more active here on SO.
Your solution is correct (promise chaining); however, since you're using Knex, it ships with Bluebird, which already provides a utility method for this:
var Promise = require("bluebird"); // also used internally by Knex so free to require
Promise.each(items, db.insert.bind(db));
Would do the same thing as:
items.forEach(function (item) {
  chain = chain.then(function () {
    return db.insert(item);
  });
});
I've found the solution. I'm not entirely convinced the problem is the fault of Knex.js; it may be my own lack of experience with promises in general.
I found inspiration in the work done by Tim Griesser here: https://github.com/tgriesser/knex/blob/batch-insert/lib/util/batch-insert.js
Basically what he did is add chunks to a promise chain. Perhaps this could be done directly in the Knex library, but for readability I've kept it separate.
import knex = require("knex");

(function (items) {
  let db = knex.table("items");
  // This is the basic operation that starts the promise chain.
  let chain = db.truncate();
  items.forEach(function (item) {
    // Add db.insert() promises to our promise chain.
    // This can easily be changed to include chunks and/or streams.
    chain = chain.then(function () {
      return db.insert(item);
    });
  });
  // The chain resolves once every queued operation has run.
  return chain;
}(items));
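If you want the batches of ~500 items mentioned above, the same chaining idea works on slices of the array. A sketch (the chunking helper is mine, not part of Knex):

function insertInChunks(knex, items, chunkSize) {
  // Clear the table first, then queue one insert per chunk.
  let chain = knex("items").truncate();
  for (let i = 0; i < items.length; i += chunkSize) {
    const slice = items.slice(i, i + chunkSize);
    chain = chain.then(function () {
      // A fresh query builder per batch insert.
      return knex("items").insert(slice);
    });
  }
  return chain;
}

insertInChunks(knex, items, 500).then(function () {
  console.log("Done");
});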

Node.js async to sync

How can I make this work
var asyncToSync = syncFunc();

function syncFunc() {
  var sync = true;
  var data = null;
  query(params, function (result) {
    data = result;
    sync = false;
  });
  while (sync) {}
  return data;
}
I tried to get a sync function from an async one;
I need it to use the FreeTDS async query as a sync one.
Use deasync – a module written in C++ which exposes the Node.js event loop to JavaScript. The module also exposes a sleep function that blocks subsequent code but doesn't block the entire thread or incur busy waiting. You can put the sleep function in your while loop:
var asyncToSync = syncFunc();

function syncFunc() {
  var sync = true;
  var data = null;
  query(params, function (result) {
    data = result;
    sync = false;
  });
  while (sync) { require('deasync').sleep(100); }
  return data;
}
Nowadays this generator pattern can be a fantastic solution in many situations:
// nodejs script doing sequential prompts using the async readline.question function
var main = (function* () {
  // just import and initialize 'readline' in nodejs
  var r = require('readline')
  var rl = r.createInterface({ input: process.stdin, output: process.stdout })
  // magic here: the callback is the iterator's next()
  var answerA = yield rl.question('do you want this? ', res => main.next(res))
  // and again, in a sync fashion
  var answerB = yield rl.question('are you sure? ', res => main.next(res))
  // readline boilerplate
  rl.close()
  console.log(answerA, answerB)
})() // <-- executed: iterator created from generator

main.next() // kick off the iterator:
            // runs until the first 'yield', including the code to its right,
            // and waits until another main.next() happens
You can do it with the node-sync lib:
var sync = require('sync');

sync(function () {
  var result = query.sync(query, params);
  // result can be used immediately
})
Notice: your query must use a standard error-first callback: callback(error, result).
If you can't change the query method, just create an .async() wrapper (see the GitHub link), as in the sketch below.
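A minimal sketch of such an adapter, assuming query invokes its callback with the result only (no error argument):

// Adapts query(params, function (result) {...})
// to queryAsync(params, function (err, result) {...}),
// the error-first shape that sync() expects.
function queryAsync(params, callback) {
  try {
    query(params, function (result) {
      callback(null, result);
    });
  } catch (err) {
    callback(err);
  }
}

// Inside sync(function () { ... }):
//   var result = queryAsync.sync(null, params);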
I've been using synchronize.js with great success. There's even a pending pull request (which works quite well) to support async functions that have multiple parameters. Far better and easier to use than node-sync, imho. Added bonus: it has easy-to-understand and thorough documentation, whereas node-sync does not.
The issue you are having is that your tight while loop is blocking, so I don't think your query callback will ever run. I think you need to use setTimeout or the like to keep the function from blocking, but then the function will return before the callback is called. This functionality must be implemented at a lower level.
If you are in the browser, you might check out this article. In Node you have to rely on the implementation of whatever you're querying; it may or may not provide synchronous methods.
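In practice the usual way out is to stop fighting the event loop and hand back a promise instead of a synchronous value. A minimal sketch, assuming query takes (params, callback):

function queryPromise(params) {
  return new Promise(function (resolve) {
    query(params, function (result) {
      resolve(result);
    });
  });
}

// The data is consumed in a then() instead of being returned synchronously.
queryPromise(params).then(function (data) {
  console.log(data);
});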

Control flow issue with node/redis and callbacks?

Please could I ask for some advice on a control flow issue with node and redis? (aka Python coder trying to get used to JavaScript)
I don't understand why client.smembers and client.get (Redis lookups) need to be callbacks rather than simply being statements - it makes life very complicated.
Basically I'd like to query a set, and then when I have the results for the set, I need to carry out a get for each result. When I've got all the data, I need to broadcast it back to the client.
Currently I do this inside two callbacks, using a global object, which seems messy. I'm not even sure if it's safe (will the code wait for one client.get to complete before starting another?).
The current code looks like this:
var all_users = [];

// Get all the users for this page.
client.smembers("page:" + current_page_id, function (err, user_ids) {
  // Now get the name of each of those users.
  for (var i = 0; i < user_ids.length; i++) {
    client.get('user:' + user_ids[i] + ':name', function (err, name) {
      var myobj = {};
      myobj[user_ids[i]] = name;
      all_users.push(myobj);
      // Broadcast when we have got to the end of the loop,
      // so all users have been added to the list -
      // is this the best way? It seems messy.
      if (i === (user_ids.length - 1)) {
        socket.broadcast('all_users', all_users);
      }
    });
  }
});
But this seems very messy. Is it really the best way to do this? How can I be sure that all lookups have been performed before calling socket.broadcast?
*scratches head* Thanks in advance for any advice.
I don't understand why client.smembers and client.get (Redis lookups) need to be callbacks rather than simply being statements - it makes life very complicated.
That's what Node is. (I'm pretty sure this topic has been discussed more than enough times here; look through other questions, it's definitely there.)
How can I be sure that all lookups have been performed before calling socket.broadcast?
That's what err is for in the callback function. This is kind of a Node standard – the first parameter in a callback is an error object (null if everything is fine). So just use something like this to be sure no errors occurred:
if (err) {
  ... // handle errors.
  return // or not, it depends.
}
... // process results
But this seems very messy.
You'll get used to it. I actually find it nice when the code is well formatted and the project is cleverly structured.
Other ways are:
Using libraries to control async code flow (Async.js, Step.js, etc.)
If spaghetti-style code is what you think mess is, define some named functions to process the results and pass them as parameters instead of anonymous ones, as in the sketch below.
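A rough sketch of the question's code with named handlers (it reuses the question's variables and omits the "all done" check for brevity):

// Named handlers keep the nesting shallow and readable.
function onMembers(err, user_ids) {
  if (err) return console.error(err);
  user_ids.forEach(function (user_id) {
    client.get('user:' + user_id + ':name', onName.bind(null, user_id));
  });
}

function onName(user_id, err, name) {
  if (err) return console.error(err);
  var myobj = {};
  myobj[user_id] = name;
  all_users.push(myobj);
}

client.smembers("page:" + current_page_id, onMembers);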
If you totally dislike writing stuff callback-style, you might want to try streamlinejs:
var all_users = [];

// Get all the users for this page.
var user_ids = client.smembers("page:" + current_page_id, _);

// Now get the name of each of those users.
for (var i = 0; i < user_ids.length; i++) {
  var name = client.get('user:' + user_ids[i] + ':name', _);
  var myobj = {};
  myobj[user_ids[i]] = name;
  all_users.push(myobj);
}

socket.broadcast('all_users', all_users);
Note that a disadvantage of this variant is that only one username will be fetched at a time. Also, you should still be aware of what this code really does.
Async is a great library and you should take a look. Why? Clean code and flow, easy to track, etc.
Also, keep in mind that all your async callbacks will run after your for loop has finished. In your example, that results in the wrong "i" value inside them. Use a closure:
for (var i = 0; i < user_ids.length; i++) {
  (function (i) {
    client.get('user:' + user_ids[i] + ':name', function (err, name) {
      var myobj = {};
      myobj[user_ids[i]] = name;
      all_users.push(myobj);
      // Broadcast when we have got to the end of the loop,
      // so all users have been added to the list -
      // is this the best way? It seems messy.
      if (i === (user_ids.length - 1)) {
        socket.broadcast('all_users', all_users);
      }
    });
  })(i)
}
To know when everything has finished, use a recursive pattern like async does (I think). It's much simpler than doing it yourself.
async.series({
  getMembers: function (callback) {
    client.smembers("page:" + current_page_id, callback);
  }
}, function (err, results) {
  var all_users = [];
  async.forEachSeries(results.getMembers, function (user_id, cb) {
    client.get('user:' + user_id + ':name', function (err, name) {
      var myobj = {};
      myobj[user_id] = name;
      all_users.push(myobj);
      cb(err);
    });
  }, function (err) {
    socket.broadcast('all_users', all_users);
  });
});
This code may not be valid, but you should be able to figure out how to do it.
The Step library is good too (and only ~30 lines of code, I think).
I don't understand why client.smembers and client.get (Redis lookups) need to be callbacks rather than simply being statements - it makes life very complicated.
Right, so everyone agrees callback hell is no bueno. As of this writing, callbacks are a dying feature of Node. Unfortunately, the Redis library does not have native support for returning Promises.
But there is a module you can require in like so:
const util = require("util");
This is a standard module included in the Node runtime that has a bunch of utility functions we can use, one of them being promisify:
https://nodejs.org/api/util.html#util_util_promisify_original
Now of course, when you asked this question seven years ago, util.promisify(original) did not exist – it was added in the v8.0.0 release – so we can now update this question with an up-to-date answer.
So promisify is a function: we can pass it a function like client.get, and it will return a new function that takes the nasty callback behavior and wraps it up nice and neat so that it returns a Promise instead.
So promisify takes any function that accepts a callback as its last argument and makes it return a Promise instead, and that sounds like exactly the behavior you wanted seven years ago and are afforded today.
const util = require("util");
client.get = util.promisify(client.get);
So we are passing a reference to the .get() function to util.promisify().
This takes your function and wraps it up so that, instead of taking a callback, it returns a Promise. So util.promisify() returns a new function that has been promisified.
So you can take that new function and override the existing one on client.get().
Nowadays, you do not have to use a callback for Redis lookup. So now you can use the async/await syntax like so:
const cachedMembers = await client.get('user:' + user_ids[i]);
So we wait for this to be resolved and whatever it resolves with will be assigned to cachedMembers.
The code can be cleaned up even further by using an ES6 array helper method instead of your for loop, as in the sketch below. I hope this answer is useful for current users, even though the original question is long obsolete.
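For example, with both lookups promisified, the whole operation from the original question collapses into a map plus Promise.all (a sketch reusing the question's key scheme and variables):

const util = require("util");
client.smembers = util.promisify(client.smembers);
client.get = util.promisify(client.get);

async function broadcastUsers() {
  const user_ids = await client.smembers("page:" + current_page_id);
  // Fire all the GETs in parallel and wait until every name has resolved.
  const all_users = await Promise.all(
    user_ids.map(async (id) => ({ [id]: await client.get("user:" + id + ":name") }))
  );
  socket.broadcast("all_users", all_users);
}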
