I have a problem I can't really seem to wrap my head around. It's very specific to the Knex.js implementation, and I'm sure it has nothing to do with PostgreSQL.
The following implementation works when inserting a moderate amount (~500 statements); on larger amounts it fails for other reasons. Regardless, it will not work for my use case: I need something like the next section.
import knex = require("knex");
(function (items) {
    let db = knex.table("items");
    db.truncate();
    let foo = [];
    items.forEach(function (item) {
        foo.push({
            id: item.id,
            item_data: JSON.stringify(item)
        });
    });
    db.insert(foo).then(function () { /*..*/ });
}(items));
But the following doesn't:
import knex = require("knex");
(function (items) {
    let db = knex.table("items");
    db.truncate();
    let foo = [];
    items.forEach(function (item) {
        db.then(function () {
            return db.insert(foo).into("items");
        });
    });
    db.then(function () { console.log("Done"); });
}(items));
What goes wrong is this:
An inconsistent number of rows is inserted. In some runs it's a lot MORE than I have items (?!)
I get a lot of duplicate key errors in this implementation, since I have a unique constraint
Additional information:
The set contains no duplicate keys
I'm using PostgreSQL as backend
The question is mostly how to implement the desired behaviour. The ideal solution deals in chunks of, say, 500 items. I've already posted an issue on the project (https://github.com/tgriesser/knex/issues/826), but I'm hoping some people from the Knex.js community are more active here on SO.
Your solution is correct (promise chaining); however, since you're using Knex, it ships with Bluebird, which already provides a utility method for this:
var Promise = require("bluebird"); // also used internally by Knex so free to require
Promise.each(items, db.insert.bind(db));
Would do the same thing as:
items.forEach(function (item) {
    chain = chain.then(function () {
        return db.insert(item);
    });
});
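And for the chunks of ~500 the question mentions, here is a minimal sketch built on the same idea (the chunk helper is hand-rolled for illustration; lodash's _.chunk would work just as well):
var Promise = require("bluebird");

// Split an array into chunks of `size`.
function chunk(arr, size) {
    var chunks = [];
    for (var i = 0; i < arr.length; i += size) {
        chunks.push(arr.slice(i, i + size));
    }
    return chunks;
}

// Promise.each waits for each insert to finish before starting the next,
// so only one batch of 500 rows is in flight at a time.
Promise.each(chunk(items, 500), function (batch) {
    return db.insert(batch);
});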
I've found the solution. I'm not entirely convinced whether the problem is the fault of Knex.js or my own lack of experience with Promises in general.
I found inspiration in the work done by Tim Griesser here: https://github.com/tgriesser/knex/blob/batch-insert/lib/util/batch-insert.js
Basically, what he did is add chunks to a promise chain. Perhaps this could be done directly in the Knex library, but for readability I've kept it separate.
import knex = require("knex");
(function (items) {
    let db = knex.table("items");
    // The chain must start from a resolved promise we can hang .then() off of.
    let chain = Promise.resolve();
    // This is the basic operation to add a promise to the chain.
    chain = chain.then(function () { return db.truncate(); });
    items.forEach(function (item) {
        // Add db.insert() promises to our promise chain.
        // This can easily be changed to include chunks and/or streams.
        chain = chain.then(function () {
            return db.insert(item);
        });
    });
    // Start resolving the promises once our db.then() is invoked.
    return db.then(function () {
        return chain.then();
    });
}(items));
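For what it's worth, the batch-insert utility linked above later landed in Knex itself as knex.batchInsert(tableName, rows, chunkSize). If your Knex version ships it, the chunked insert reduces to something like this sketch (reusing the foo rows built in the first snippet):
// batchInsert splits the rows into chunks and inserts each chunk for you.
db.truncate().then(function () {
    return knex.batchInsert("items", foo, 500);
}).then(function () {
    console.log("Done");
});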
I just started playing around with Jasmine and I'm still struggling with the spyOn/mocking things. E.g., I have a function:
module.exports = (() => {
    ....
    function getUserInfo(id) {
        return new Promise((resolve, reject) => {
            redis.getAsync(id).then(result => {
                resolve(result)
            })
        })
    }
    return { getUserInfo: getUserInfo }
})()
Then I start writing the Jasmine spec
describe('Test user helper', () => {
    let userInfo
    beforeEach(done => {
        userHelper.getUserInfo('userid123')
            .then(info => {
                userInfo = info
                done()
            })
    })
    it('return user info if user is found', () => {
        expect(userInfo).toEqual('info of userid 123')
    })
})
It runs well, but my question is how can I mock the redis.getAsync call, so it can become a real isolated unit test?
Thanks.
Good question. You can mock out the redis dependency, but only if you rewrite your code slightly to be more testable.
Here, that means making redis a parameter to the factory that returns the object containing getUserInfo.
Of course, this changes the API, callers now need to call the export to get the object. To fix this, we can create a wrapper module that calls the function with the standard redis object, and returns the result. Then we move the actual factory into an inner module, which still allows it to be tested.
Here is what that might look like.
user-helper/factory.js
module.exports = redis => {
    ....
    function getUserInfo(id) {
        return redis.getAsync(id); // note: simplified, as new Promise was not needed
    }
    return {getUserInfo};
};
user-helper/index.js
// this is the wrapper that preserves the existing API
const redis = require('../redis-client'); // hypothetical module exporting your promisified redis client
module.exports = require('./factory')(redis);
And now for the test
const userHelperFactory = require('./user-helper/factory');

function createMockRedis() {
    const users = [
        {userId: 'userid123'},
        // etc.
    ];
    return {
        getAsync: function (id) {
            // Note: I do not know off hand what redis returns, or if it throws,
            // if there is no matching record - adjust this to match.
            return Promise.resolve(users.find(user => user.userId === id));
        }
    };
}
describe('Test user helper', () => {
    const mockRedis = createMockRedis();
    const userHelper = userHelperFactory(mockRedis);
    let userInfo;

    beforeEach(async () => {
        userInfo = await userHelper.getUserInfo('userid123');
    });

    it('must return user info when a matching user exists', () => {
        expect(userInfo).toEqual('info of userid 123');
    });
});
NOTE: As discussed in comments, this was just my incidental approach to the situation at hand. There are plenty of other setups and conventions you can use but the primary idea was just based on the existing export of the result of an IIFE, which is a solid pattern, and I leveraged the NodeJS /index convention to preserve the existing API. You could also use one file and export via both module.exports = factory(redis) and module.exports.factory = factory, but that would, I believe, be less idiomatic in NodeJS. The broader point was that being able to mock for tests, and testability in general is just about parameterization.
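As a sketch, that single-file dual export might look like this (the redis client module path is hypothetical):
// user-helper.js - single file, dual export
const redis = require('./redis-client'); // hypothetical module exporting a promisified client

function factory(redisClient) {
    function getUserInfo(id) {
        return redisClient.getAsync(id);
    }
    return {getUserInfo};
}

module.exports = factory(redis);  // default instance for consumers
module.exports.factory = factory; // the factory itself, for tests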
Parameterization is wonderfully powerful, and its simplicity is why developers working in functional languages sometimes laugh at OOP programmers, such as yours truly, and our clandestine incantations like "Oh glorious Dependency Injection Container, bequeath unto me an instanceof X" :)
It is not that OOP or DI get it wrong; it is that testability, DI, IoC, etc. are just about parameterization.
Interestingly, if we were loading redis as a module, and if we were using a configurable module loader such as SystemJS, we could do this simply with loader configuration at the test level. Even Webpack lets you do this to some extent, but for NodeJS you would need to monkey-patch the require function or create a bunch of fake packages, neither of which is a good option.
To the OP's specific response
Thanks! That's a good idea, but practically it seems quite strange when I have tons of files to test, for each of which I will need to create a factory and an index.js.
You would need to restructure your API surface to simply export factories that consuming code must call, rather than the result of applying those factories, to reduce the burden; but there are tradeoffs, and default instances are helpful to consumers.
I'm writing a service wrapper in AngularJS for an Odoo server, which has all the methods that the server supports and returns a deferred promise when the service is called. E.g.:
$scope.runRPC = function () {
    odooSvc.fields_get('res.users').then(
        function (result) {
            console.log(result); // returns a list of users
        }
    );
}
However, I need it to be synchronous, and here is the reason why.
Odoo has its own JSON-RPC API, which has several methods that depend on each other.
For example,
search_read: gives you a list of everything on the model you query
fields_get: gives you the list of fields the model has
and much more.
Usually in a working application we need to call two or more API methods to get the final data we want. However, because everything in JavaScript works asynchronously, the code I imagine would be nested and complicated. So when I make API calls that depend on one another, it looks like this:
$scope.setLoginToContactEmail = function () {
    odooSvc.search_read('res.users').then(
        function (users) {
            for (var i = 0; i < user.length; i++) {
                user = users[0];
                login = user.login
                partner_id = user.partner_id
                odooSvc.read_model('res.partner', partner_id).then(
                    function (partner) {
                        if (login === partner.email) {
                            odooSvc.write_model('res.partner', partner_id, {email: login}).then(function (msg) {
                                console.log(msg);
                            });
                        }
                    }
                )
            }
        }
    );
}
Versus, if those API calls could run synchronously, or await the data to arrive before I proceed to another call, it would look much simpler:
$scope.setLoginToContactEmail = function () {
    var users = odooSvc.search_read('res.users');
    for (var i = 0; i < user.length; i++) {
        user = users[0];
        login = user.login
        partner_id = user.partner_id
        partner = odooSvc.read_model('res.partner', partner_id);
        if (login === partner.email) {
            odooSvc.write_model('res.partner', partner_id, {email: login});
        }
    }
}
Please advise. Thanks.
Here is a plunker that takes care of Babel transpilation:
<body ng-app="app" ng-controller="AsyncController">
    <p>{{ message }}</p>
</body>

angular.module('app', []).controller('AsyncController', ['$timeout', '$scope', async function ($timeout, $scope) {
    $scope.message = 'no timeout';
    $scope.message = await $timeout(() => 'timeout', 2000);
    $scope.$apply();
}]);
async...await is as simple as that in TypeScript and ES.next. Two things should be noted here.
The first is the this context inside the async controller - it may differ from what is expected. This may not be a problem when classes are used and async methods are bound as necessary, but it is a problem for non-OOP code where the constructor function is async and cannot reach its this right from the start.
The other is that async...await is powered by native Promises. This means that $scope.$apply() should be called to trigger a digest cycle, despite the fact that the digest-friendly $timeout was used.
Async is better than sync.
Callbacks can become very messy, so we have promises; however, promises can also become messy, although not quite as bad. Async-await is the best way to get sync-looking async code, but you have to transpile. It depends on how much tooling you want to use.
Here is how I would write your ES5 code (without starting a new line after .then(, which reduces the indentation a bit; I also replaced the for loop with forEach, partly because I wasn't sure what you meant and partly because var-declared loop variables would be captured incorrectly by the async callbacks):
$scope.setLoginToContactEmail = function () {
    odooSvc.search_read('res.users').then(function (users) {
        // forEach gives each iteration its own scope, so the async callbacks
        // below capture the right login/partner_id; a plain for loop with
        // var-declared variables would leak the last iteration's values
        // into every callback.
        users.forEach(function (user) {
            var login = user.login
            var partner_id = user.partner_id
            odooSvc.read_model('res.partner', partner_id).then(function (partner) {
                if (login === partner.email) {
                    odooSvc.write_model('res.partner', partner_id, { email: login }).then(function (msg) {
                        console.log(msg)
                    })
                }
            })
        })
    })
}
With ES6 and the proposal for async functions that can become:
$scope.setLoginToContactEmail = async function () {
    const users = await odooSvc.search_read('res.users')
    for (let user of users) {
        const { login, partner_id } = user
        const partner = await odooSvc.read_model('res.partner', partner_id)
        if (login === partner.email) {
            const msg = await odooSvc.write_model('res.partner', partner_id, { email: login })
            console.log(msg)
        }
    }
}
It depends on how much transpiling you want to do. Personally, I would adopt part of ES6 (let/const, destructuring, arrow functions, template strings, modules, unicode improvements, spread operator / rest parameters, improved object literals, and possibly class literals), the stuff that you use most frequently / isn't too difficult to transpile. Maybe also use async-await: it's not a part of ES6 or ES2016, but it is at stage 3 now so it is pretty stable, and it does make async code a lot easier to read. The caveat is that you have to transpile new ES6/ES2016/etc. features using Babel or TypeScript, and use a polyfill for promises (which async-await uses internally).
TL;DR: if you find yourself descending into async hell, the async-await proposal is probably the best solution.
Asynchronous methods are a strength of JavaScript, and we should make use of them: if a task takes a long time to execute, we don't have to wait for it and can meanwhile put other tasks into execution.
We can see this advantage in the UI, where we can run a long task in a callback and avoid the UI greying out.
So I would suggest adopting the async approach.
But the way you are sequencing the method calls is not the best way to do it in Angular.
Angular provides $q.when(method()).then(successCallback); control reaches successCallback only once method() has finished, and method()'s response can then be used to make another promise call. This approach helps reduce the complexity of the code; the chain of nested callbacks you wrote is not conventionally correct.
$scope.setLoginToContactEmail = function () {
    $q.when(searchRead()).then(function (res) {
        modelRead(res);
    });
}

function searchRead() {
    return odooSvc.search_read('res.users').then(function (res) {
        // #TODO
        return res;
    });
}

function modelRead(res) {
    return odooSvc.read_model('res.partner').then(function (partner) {
        // #TODO
    });
}
I have a question about the best way to use Koa with Postgres. I also (really) like using Bluebird, so I've gone with this approach.
'use strict';
var db = require('./modules/db.js');
var koa = require('koa');
var app = koa();

app.use(function *() {
    yield db.withConnection(function *(client) {
        let id = this.request.query.id;
        let simple_data = yield client.queryAsync('select name from table1 where id = $1', [id]);
        this.response.body = simple_data;
    }.bind(this));
});

app.listen(3000);
This is the db.js file, basically it uses things mentioned in the Bluebird docs.
... bla bla, some imports

Promise.promisifyAll(pg);

function getSqlConnection() {
    var close;
    return pg.connectAsync(connectionString).spread(function (client, done) {
        close = done;
        return client;
    }).disposer(function () {
        if (close) close();
    });
}

function* withConnection(func) {
    yield Promise.using(getSqlConnection(), function (client) {
        return Promise.coroutine(func)(client);
    });
}

module.exports.withConnection = withConnection;
Do you have any suggestions for improving this? I really like it so far; I've tested it extensively (under load, with errors/exceptions, etc.) and it seems to work correctly. I'm pretty new to generators and other ES6 stuff, so it's possible I'm missing something.
My question is basically why so few people use this approach (I find it hard to find examples of it online)?
I'm also fine with using libraries other than pg and bluebird, but I like those because of the number of downloads they have. I prefer popular libraries because I find it easier to find blog posts, help and documentation for them. Thanks!
Bluebird is a promise library, and a very good one at that, but it should not be used as guidance for how, or with which database library, to work. All that Promise.promisifyAll(pg); stuff is actually quite poor next to the promise solutions that exist out there - knex, massive.js, pg-promise, etc.
And if you want the best combination of pg + bluebird, then try pg-promise.
var promise = require('bluebird');

var options = {
    promiseLib: promise // use Bluebird as the promise library
};

var pgp = require('pg-promise')(options);
var db = pgp('postgres://username:password@host:port/database');

db.query('select name from table1 where id = $1', [1])
    .then(function (data) {
        // success;
    })
    .catch(function (error) {
        // error;
    });
The library supports ES6 generators also, so you can write the code exactly like in your example.
From Tasks example:
db.task(function * (t) {
    let user = yield t.one("select * from users where id=$1", 123);
    return yield t.any("select * from events where login=$1", user.name);
})
    .then(function (events) {
        // success;
    })
    .catch(function (error) {
        // error;
    });
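Applied to the Koa middleware from the question, that might look like the following sketch (assuming the same db object as above; error handling omitted):
app.use(function *() {
    let id = this.request.query.id;
    // db.task acquires a connection and releases it when the task settles.
    this.response.body = yield db.task(function *(t) {
        return yield t.query('select name from table1 where id = $1', [id]);
    });
});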
You could also try pg-native.
From pg module docs:
"node-postgres contains a pure JavaScript protocol implementation
which is quite fast, but you can optionally use native bindings for a
20-30% increase in parsing speed. Both versions are adequate for
production workloads. To use the native bindings, first install
pg-native. Once pg-native is installed, simply replace require('pg')
with require('pg').native."
https://github.com/brianc/node-postgres
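In the db.js above, that swap would be a one-line change (assuming pg-native is installed):
// Same pg API as before; simply swap the require to use the native bindings.
var pg = require('pg').native;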
I have a function that pulls out from database a random question from Questions collection.
Game_Questions.js - the console.log below prints the correct value (the string I need), so I thought that return would let yield give me back the same value.
exports.random_Question = function *() {
    yield Questions.findRandom().limit(1).exec(function (err, question) {
        console.log("rand q: " + question[0].text);
        return question[0].text;
    });
}
Game.js:
var Game_Questions = require('./backend/Game_Questions');
And here I want to access the question[0].text value from the random_Question function in the code snippet above (Game_Questions.js). What I've tried so far:
var found_Question = Game_Questions.random_Question();
var found_Question = Game_Questions.random_Question().next().value;
Both return [object Object], which after JSON.stringify() turns out to be:
{"value":{"emitter":{"domain":null,"_events":{}},"emitted":{},"ended":true},"done":false}
I also tried using co(function*()), but it didn't let me get the value out either. Please help me figure out how to access it.
The answer by #remus is a callback approach, and Koa was designed explicitly to ditch callbacks. So while it's perfectly good code that would fit an Express application, it is completely at odds with the design philosophy behind Koa.
From the looks of it you are using Mongoose which has supported promises for async operations since version 4.0 (which was released Apr 2015) which should allow a yield approach to be taken. Note I'm making an assumption you are working with Mongoose - I hope I'm not wrong!
Here is some nice documentation on how Mongoose would fit nicely with koa.
So first of all, make sure you are using a version of Mongoose that supports using yield. If not, you'll have to use the #remus approach, or manually wrap each of your methods so they are yield-compatible (i.e. wrapping with promises).
But if you are using a compatible version (4.0 and upwards) then your code would look something like the following:
exports.random_Question = function *() {
    var result;
    try {
        result = yield Questions.findRandom().limit(1).exec();
    } catch (e) {
        console.log(e.stack);
        throw e;
    }
    console.log("rand q: " + result[0].text);
    return result[0].text;
}
Note that I'm assuming the result is an array based on the code you supplied.
The above example doesn't necessarily have to be a generator function. It could also be a normal function that returns a Promise. So alternatively something like this could also be done:
exports.random_Question = function () {
    return Questions.findRandom()
        .limit(1)
        .exec()
        .then(function (result) {
            // exec() resolves with the query result, which is passed
            // to this fulfillment handler as its argument
            var question = result[0];
            console.log("rand q: " + question.text);
            return question.text;
        }).catch(function (e) {
            console.log(e.stack);
            throw e;
        });
}
So for the random_Question function, all that matters is that it can be yielded by co, which handles the Koa application flow control - check tj/co on GitHub for the different objects you can yield.
So, finally getting back to the Koa middleware, we can yield either of the above code snippets in the exact same manner. So we'd do:
var koa = require("koa");
var app = module.exports = koa();
var Game_Questions = require('./backend/Game_Questions');

app.use(function*() {
    var resultText;
    try {
        resultText = yield Game_Questions.random_Question();
    } catch (e) {
        this.throw(500);
    }
    this.body = resultText;
    this.status = 200;
});

app.listen(3000);
Something else to note is that I'm a little unsure of the findRandom method in the Mongoose query, since I don't know if it plays nicely with Mongoose's promise features. Personally, I'd get a normal Mongoose query working using yield before reintroducing findRandom, just to make sure it's not causing an issue.
My answer is getting a bit long at this point so I'll leave it at that.
Your syntax is pretty strange, but I'm not sure if that's specific to Koa or not?
Because Node.js is event based, use a callback instead:
exports.random_Question = function (callback) {
    Questions.findRandom().limit(1).exec(function (err, question) {
        callback(err, question);
    });
}
And use it:
var Game_Questions = require('./backend/Game_Questions');

Game_Questions.random_Question(function (err, question) {
    console.log(question);
});
Of some concern as well: your question states you're trying to reference Game_Questions.randomQuestion(), while your function is actually named random_Question.
Please could I ask for some advice on a control flow issue with node and redis? (aka Python coder trying to get used to JavaScript)
I don't understand why client.smembers and client.get (Redis lookups) need to be callbacks rather than simply being statements - it makes life very complicated.
Basically I'd like to query a set, and then when I have the results for the set, I need to carry out a get for each result. When I've got all the data, I need to broadcast it back to the client.
Currently I do this inside two callbacks, using a global object, which seems messy. I'm not even sure if it's safe (will the code wait for one client.get to complete before starting another?).
The current code looks like this:
var all_users = [];

// Get all the users for this page.
client.smembers("page:" + current_page_id, function (err, user_ids) {
    // Now get the name of each of those users.
    for (var i = 0; i < user_ids.length; i++) {
        client.get('user:' + user_ids[i] + ':name', function (err, name) {
            var myobj = {};
            myobj[user_ids[i]] = name;
            all_users.push(myobj);
            // Broadcast when we have got to the end of the loop,
            // so all users have been added to the list -
            // is this the best way? It seems messy.
            if (i === (user_ids.length - 1)) {
                socket.broadcast('all_users', all_users);
            }
        });
    }
});
But this seems very messy. Is it really the best way to do this? How can I be sure that all lookups have been performed before calling socket.broadcast?
*scratches head* Thanks in advance for any advice.
I don't understand why client.smembers and client.get (Redis lookups) need to be callbacks rather than simply being statements - it makes life very complicated.
That's what Node is. (I'm pretty sure this topic has been discussed more than enough times here; look through other questions, it's definitely there.)
How can I be sure that all lookups have been performed before calling socket.broadcast?
That's what err is for in the callback function. This is kind of a Node standard - the first parameter in a callback is an error object (null if everything is fine). So just use something like this to be sure no errors occurred:
if (err) {
    ... // handle errors.
    return // or not, it depends.
}
... // process results
But this seems very messy.
You'll get used to it. I actually find it nice when code is well formatted and the project is cleverly structured.
Other ways are:
Using libraries to control async code-flow (Async.js, Step.js, etc.)
If spaghetti-style code is what you think mess is, define some functions to process results and pass them as parameters instead of anonymous ones.
If you totally dislike writing stuff callback-style, you might want to try streamlinejs:
var all_users = [];

// Get all the users for this page.
var user_ids = client.smembers("page:" + current_page_id, _);

// Now get the name of each of those users.
for (var i = 0; i < user_ids.length; i++) {
    var name = client.get('user:' + user_ids[i] + ':name', _);
    var myobj = {};
    myobj[user_ids[i]] = name;
    all_users.push(myobj);
}

socket.broadcast('all_users', all_users);
Note that a disadvantage of this variant is that only one username will be fetched at a time. Also, you should still be aware of what this code really does.
Async is a great library and you should take a look. Why? Clean code and flow, easy to track, etc.
Also, keep in mind that all your async callbacks will be processed after your for loop. In your example, this may result in the wrong "i" value. Use a closure:
for (var i = 0; i < user_ids.length; i++) { (function (i) {
    client.get('user:' + user_ids[i] + ':name', function (err, name) {
        var myobj = {};
        myobj[user_ids[i]] = name;
        all_users.push(myobj);
        // Broadcast when we have got to the end of the loop,
        // so all users have been added to the list -
        // is this the best way? It seems messy.
        if (i === (user_ids.length - 1)) {
            socket.broadcast('all_users', all_users);
        }
    });
})(i)}
To know when everything is finished, use a recursive pattern like the one async (I think) uses internally. It's much simpler than doing it yourself.
async.series({
    getMembers: function (callback) {
        client.smembers("page:" + current_page_id, callback);
    }
}, function (err, results) {
    var all_users = [];
    async.forEachSeries(results.getMembers, function (item, cb) {
        all_users.push(item);
        cb();
    }, function (err) {
        socket.broadcast('all_users', all_users);
    });
});
This code may not be valid, but you should be able to figure out how to do it.
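For reference, a variant that also performs the per-user name lookups might look like this sketch using async.map (untested, with the names from the question):
var async = require("async");

client.smembers("page:" + current_page_id, function (err, user_ids) {
    if (err) return console.error(err);
    // Look up every name in parallel; async.map collects the results in order.
    async.map(user_ids, function (user_id, cb) {
        client.get('user:' + user_id + ':name', function (err, name) {
            var myobj = {};
            myobj[user_id] = name;
            cb(err, myobj);
        });
    }, function (err, all_users) {
        if (err) return console.error(err);
        socket.broadcast('all_users', all_users);
    });
});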
The Step library is good too (and only ~30 lines of code, I think).
I don't understand why client.smembers and client.get (Redis lookups) need to be callbacks rather than simply being statements - it makes life very complicated.
Right, so everyone agrees callback hell is no bueno. As of this writing, callbacks are a dying feature of Node. Unfortunately, the Redis library does not have native support for returning Promises.
But there is a module you can require in like so:
const util = require("util");
This is a standard library included with the Node runtime that has a bunch of utility functions we can use, one of them being promisify:
https://nodejs.org/api/util.html#util_util_promisify_original
Now, of course, when you asked this question seven years ago, util.promisify(original) did not exist - it was added in the v8.0.0 release - so we can now give this question an updated answer.
So promisify is a function: we can pass it a function like client.get(), and it will return a new function that takes the nasty callback behavior and instead wraps it up nice and neat to return a Promise.
So promisify takes any function that accepts a callback as the last argument and makes it return a Promise instead, which sounds like exactly the behavior you wanted seven years ago and which we are afforded today.
const util = require("util");
client.get = util.promisify(client.get);
So we are passing a reference to the .get() function to util.promisify(). This takes your function and wraps it up so that, instead of taking a callback, it returns a Promise. You can then take the new, promisified function and override the existing client.get() with it.
Nowadays, you do not have to use a callback for a Redis lookup. You can use the async/await syntax like so:
const cachedMembers = await client.get('user:' + user_ids[i]);
So we wait for this to resolve, and whatever it resolves with will be assigned to cachedMembers.
The code could be cleaned up even further by using an ES6 array helper method instead of your for loop. I hope this answer is useful for current readers, even if it comes too late for the OP.
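As a sketch, a fully modernized version of the original snippet might look like this (assuming client.smembers is promisified the same way as client.get):
const util = require("util");
client.smembers = util.promisify(client.smembers);
client.get = util.promisify(client.get);

async function broadcastAllUsers(currentPageId) {
    const userIds = await client.smembers("page:" + currentPageId);
    // Fire off all the name lookups in parallel and wait for every one to resolve.
    const allUsers = await Promise.all(userIds.map(async (userId) => {
        const name = await client.get("user:" + userId + ":name");
        return { [userId]: name };
    }));
    socket.broadcast("all_users", allUsers);
}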