Exporting Node module from promise result - javascript

I'm trying to rewrite a module to return a different value than before, but now it uses an async call to get that value (with child_process, if it matters). I've wrapped it in a Promise, but that's not critical to me; it could stay in the original child_process callback. The problem is that I can't chain the promise everywhere in the app, because I need this to behave synchronously. Here's my module:
const { exec } = require("child_process");

const platformHome = process.env[process.platform === "win32" ? "USERPROFILE" : "HOME"];

const getExecPath = new Promise((resolve, reject) => {
  const path = process.env.GEM_HOME;
  if (path) {
    resolve(path);
    return;
  }
  exec("gem environment", (err, stdout, stderr) => {
    if (err) {
      reject(err);
      return;
    }
    const line = stdout.split(/\r?\n/).find(l => ~l.indexOf("EXECUTABLE DIRECTORY"));
    if (line) {
      resolve(line.substring(line.indexOf(": ") + 2));
    } else {
      reject(undefined);
    }
  });
});

let GEM_HOME = undefined;

getExecPath
  .then(path => (GEM_HOME = path))
  .catch(() => (GEM_HOME = `${platformHome}/.gem/ruby/2.3.0`))
  .then(
    () => (module.exports = GEM_HOME) // or simply return it
  );
Clearly, when requiring the module, this doesn't work. And if I export the promise itself and use then after require, then my next module.exports becomes async too, and the chain carries on through the whole app. How do I avoid this pattern?

Modules in Node that you load with require() are loaded synchronously and it is not possible for require to return any value that is loaded asynchronously. It can return a promise but then users of that module would have to use it as:
require('module-name').then(value => {
  // you have your value here
});
It would not be possible to write:
var value = require('module-name');
// you cannot have your value here because this line
// will get evaluated before that value is available
Of course you can have the promise resolved inside of your module and make it set a property on the exported object by adding something like this:
module.exports = { GEM_HOME: null };
and changing:
module.exports = GEM_HOME
to:
module.exports.GEM_HOME = GEM_HOME
In that case, every other module that uses this module as:
var x = require('module-name');
will have x.GEM_HOME originally set to null but it would eventually get changed to a correct value some time later. It would not be available right away though, because require() returns before the promise is settled and the value is set.
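Put together, the whole module could look something like this sketch (the gem command, parsing, and fallback path come from the question; the null placeholder is an assumption about what consumers can tolerate):

// module.js - a sketch: export an object synchronously, mutate it when the promise settles
const { exec } = require("child_process");

const platformHome = process.env[process.platform === "win32" ? "USERPROFILE" : "HOME"];

module.exports = { GEM_HOME: null };

new Promise((resolve, reject) => {
  exec("gem environment", (err, stdout) => {
    if (err) return reject(err);
    const line = stdout.split(/\r?\n/).find(l => l.includes("EXECUTABLE DIRECTORY"));
    line ? resolve(line.substring(line.indexOf(": ") + 2)) : reject();
  });
})
  .then(path => { module.exports.GEM_HOME = path; })
  .catch(() => { module.exports.GEM_HOME = `${platformHome}/.gem/ruby/2.3.0`; });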
There is an ongoing discussion to introduce asynchronous module loading with different syntax and semantics that may be suited to your use case. It's a controversial subject and it's worth reading all of the rationale behind it - see:
Node.js, TC-39, and Modules by James M Snell from IBM
ES6 Module Interoperability - Node.js Enhancement Proposals
In Defense of .js - A Proposal for Node.js Modules by Dave Herman, Yehuda Katz and Caridy Patiño
Discussion on the Pull Request #3 of node-eps (002: ES6 module interop)
See also this answer for more details:
javascript - Why is there a spec for sync and async modules?

Node.js modules are loaded synchronously.
You can deal with this by exporting the promise itself:
// your module.js
module.exports = new Promise((resolve, reject) => {
  // resolve(value) once it is available
});
and:
// your app.js
const mod = require('./module');
mod.then(value => /* use the value here */);
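Applied to the question's module, that could look like this sketch, reusing getExecPath and the fallback path from above:

// module.js - a sketch: export the promise of the value itself
module.exports = getExecPath.catch(() => `${platformHome}/.gem/ruby/2.3.0`);

// app.js
require('./module').then(GEM_HOME => {
  // GEM_HOME is usable here, but only inside the then callback
});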

Related

refactoring code from synchronous to asynchronous flow

What I have
I have to take code that was built for the web with React and adapt it to React Native. One of the biggest problems is that web storage (localStorage) works synchronously, while React Native storage implementations work asynchronously (AsyncStorage). So all code that uses web storage must now be wrapped inside async/await or use promises in the React Native implementation.
Code before refactoring - with synchronous storage
// get items from localstorage - synchronous wrapper implementation
export const getTokens = () => ({
  auth:
    localStorage.getItem(TOKEN_STORAGE_KEY.AUTH) ||
    sessionStorage.getItem(TOKEN_STORAGE_KEY.AUTH),
  refresh:
    localStorage.getItem(TOKEN_STORAGE_KEY.REFRESH) ||
    sessionStorage.getItem(TOKEN_STORAGE_KEY.REFRESH)
});

const refreshPromise = useRef<Promise<boolean>>();

const refreshToken = (): Promise<boolean> => {
  if (!!refreshPromise.current) {
    return refreshPromise.current;
  }
  return new Promise(resolve => {
    // getting items from storage synchronous
    const token = getTokens().refresh;
    // nested promise x_x
    return tokenRefresh({ variables: { token } }).then(refreshData => {
      if (!!refreshData.data.tokenRefresh?.token) {
        // this is other method that use localstorage inside promises, more problems x_x
        setAuthToken(refreshData.data.tokenRefresh.token, persistToken);
        return resolve(true);
      }
      return resolve(false);
    });
  });
};
Problem
The problem is that I have encountered situations where storage is used inside a promise, and to adapt it to React Native I have to use an asynchronous flow, i.e. use async/await inside a new Promise() constructor, which is an anti-pattern.
This brings a lot of problems: although using async/await inside the promise constructor is valid, it firstly makes the code less readable and secondly makes it more difficult to debug and check for errors. Another problem is that the synchronous implementation also nests promises, which makes the code more complex to refactor.
Current code - with asynchronous storage
// get items from localstorage - asynchronous wrapper implementation
export const getTokens = async () => {
  const auth = await SecureStore.getItemAsync(TokenStorageKey.AUTH);
  const refresh = await SecureStore.getItemAsync(TokenStorageKey.REFRESH);
  return auth && refresh
    ? {
        auth,
        refresh,
      }
    : null;
};

const refreshPromise = useRef<Promise<boolean>>();

const refreshToken = (): Promise<boolean> => {
  if (refreshPromise.current) {
    return refreshPromise.current;
  }
  return new Promise(async resolve => {
    // getting items from storage asynchronous
    const tokens = await getTokens();
    const refresh = tokens?.refresh;
    // nested promise x_x
    return tokenRefresh({ variables: { refresh } }).then(async refreshData => {
      if (refreshData.data?.tokenRefresh?.token) {
        // this is other method that use localstorage inside promises, more problems x_x
        await setAuthToken(refreshData.data.tokenRefresh.token);
        return resolve(true);
      }
      return resolve(false);
    });
  });
};
What I want
according to this eslint rule:
If a Promise executor function is using await, this is usually a sign that it is not actually necessary to use the new Promise constructor, or the scope of the new Promise constructor can be reduced.
I'm not clear on how to achieve what the eslint rule mentions, or how to make this code more scalable and easier to debug, removing unnecessary promises and avoiding nested asynchronous processes.
Indeed, you don't need the new Promise(...) wrapper, since you already have promise-driven operations. You were just wrapping existing promises in a new, manually created promise, which is not necessary (and is considered an anti-pattern). You can change to this:
const refreshToken = async (): Promise<boolean> => {
  if (refreshPromise.current) {
    return refreshPromise.current;
  }
  // getting items from storage asynchronous
  const tokens = await getTokens();
  const refresh = tokens?.refresh;
  const refreshData = await tokenRefresh({ variables: { refresh } });
  if (refreshData.data?.tokenRefresh?.token) {
    // this is other method that use localstorage inside promises, more problems x_x
    await setAuthToken(refreshData.data.tokenRefresh.token);
    return true;
  }
  return false;
};
Note, I also removed the mixture of await and .then() since you usually don't want to mix those two programming styles.
FYI, I don't know your typed syntax here, so please forgive any slight syntax mistakes in that regard. This should still show you the general way this can be done without wrapping everything in new Promise(...).
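One detail the refactor above drops is that refreshPromise.current is never populated, so the early return can't fire. If that memoization matters, a sketch of keeping it (type annotations omitted) could be:

const refreshToken = () => {
  if (refreshPromise.current) {
    return refreshPromise.current;
  }
  // store the in-flight promise so concurrent callers share one refresh
  refreshPromise.current = (async () => {
    const tokens = await getTokens();
    const refreshData = await tokenRefresh({ variables: { refresh: tokens?.refresh } });
    if (refreshData.data?.tokenRefresh?.token) {
      await setAuthToken(refreshData.data.tokenRefresh.token);
      return true;
    }
    return false;
  })();
  return refreshPromise.current;
};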

Wait all promises in a map function

I want to wait for all my pictures to be read inside a map function.
I tried this:
let buffer = [];
// Folder of the dataset.
const rootFolder = './dataset'
console.log("Entering in folder dataset");
fs.readdirSync(rootFolder);
// For each folder
const files = fs.readdirSync(rootFolder).map(dirName => {
  if (fs.lstatSync(path.join(rootFolder, dirName)).isDirectory()) {
    console.log(`Entering in folder ${path.join(rootFolder, dirName)}`);
    // For each file
    fs.readdirSync(path.join(rootFolder, dirName)).map(picture => {
      if (fs.lstatSync(path.join(rootFolder, dirName, picture)).isFile()) {
        if (picture.startsWith("norm")) {
          return fileToTensor(path.join(rootFolder, dirName, picture)).then((img) => {
            buffer.push(img);
          }).catch((error) => { console.log(error) });
        }
      }
    });
  }
});

Promise.all(files);
console.log(buffer);

async function fileToTensor(path) {
  return await sharp(path)
    .removeAlpha()
    .raw()
    .toBuffer({ resolveWithObject: true });
}
But my buffer is still empty...
I know promises exist, but I don't know how I can include them in map(map()).
Thank you :)
I would refactor the above code to this:
let files = [];
// loop each dir.
fs.readdirSync(rootFolder).forEach(dirName => {
  // if it's a directory, proceed.
  if (fs.lstatSync(path.join(rootFolder, dirName)).isDirectory()) {
    console.log(`Entering in folder ${path.join(rootFolder, dirName)}`);
    fs.readdirSync(path.join(rootFolder, dirName)).forEach(picture => {
      if (fs.lstatSync(path.join(rootFolder, dirName, picture)).isFile()) {
        // If lstatSync says it's a file and it starts with "norm"
        if (picture.startsWith("norm")) {
          // push a new promise to the array.
          files.push(new Promise((resolve, reject) => {
            fileToTensor(path.join(rootFolder, dirName, picture)).then((img) => {
              buffer.push(img);
              resolve();
            }).catch((error) => { console.log(error); reject(error); });
          }));
        }
      }
    });
  }
});

// Resolve all promises.
Promise.all(files).then(() => {
  // Then do whatever you need to do.
  console.log(buffer);
}).catch((errors) => {
  console.log('one or more errors occurred', errors);
});
Basically, here is what I did:
Removed .map, since it's not necessary in this context; not all code paths returned a value, so the resulting array contained undefined entries.
Pushed each needed item to the files array, which is a Promise[].
Called Promise.all on the files array. Each resolved promise will push its result to the buffer array. I would've handled it in a different way, but still, this is the fastest approach I could think of.
Registered a callback on Promise.all, so that buffer will be populated.
As a side note, there are a lot of third-party libraries that help you avoid nested loops and promises when looping over the file system. I've just posted this to try to give something that could actually work from the existing code, although an entire refactor would be wise here, and a preliminary analysis of available Node libraries would also help to make the code easier to read and maintain.
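As hinted above ("I would've handled it in a different way"), one cleaner variant is to drop both the shared buffer array and the manual new Promise, and let Promise.all collect the results itself; a sketch under the same assumptions (rootFolder and fileToTensor as in the question):

const tensorPromises = [];
fs.readdirSync(rootFolder).forEach(dirName => {
  const dirPath = path.join(rootFolder, dirName);
  if (!fs.lstatSync(dirPath).isDirectory()) return;
  fs.readdirSync(dirPath).forEach(picture => {
    const picPath = path.join(dirPath, picture);
    if (fs.lstatSync(picPath).isFile() && picture.startsWith("norm")) {
      // fileToTensor already returns a promise; no wrapper needed
      tensorPromises.push(fileToTensor(picPath));
    }
  });
});

Promise.all(tensorPromises)
  .then(buffer => console.log(buffer)) // results arrive in order, no shared state
  .catch(error => console.log('one or more errors occurred', error));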
First of all, a few pieces of advice:
DON'T use arrow functions for anything you cannot put in a single line (they aren't intended for that, and it wrecks readability).
Check that each callback you pass to .map() actually returns something (the first one doesn't; it seems you missed a return before the inner fs.readdirSync(...)).
Try to name all your functions (except arrow functions in the cases where they're a good choice). This way, not only could I name them to better identify them in the previous point, but stack traces would also be much more readable and useful (traceable).
That being said, you are reading directories (and subdirectories) synchronously only to finally return promises (I understand that fileToTensor() is expected to return a promise). It may not have a major impact on the overall execution time, because I suppose the actual file processing is much more expensive, BUT this is a bad pattern because you are blocking the event loop during the tree scan (so, if your code is for a server that needs to serve other requests, you are dragging performance down a bit; see the sketch below).
Finally, as others have already said, there are libraries, such as glob, that ease that task.
On the other hand, if you want to do it by yourself (as an exercise in understanding), I implemented my own library for the same task before knowing about glob, which could serve you as a simpler example.
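For illustration, a non-blocking version of the scan could look like this sketch, assuming Node's fs.promises API and the fileToTensor() from the question:

const fsp = require('fs').promises;
const path = require('path');

// walk rootFolder/<dir>/norm* without blocking the event loop
async function collectNormPictures(rootFolder) {
  const images = [];
  for (const dirName of await fsp.readdir(rootFolder)) {
    const dirPath = path.join(rootFolder, dirName);
    if (!(await fsp.lstat(dirPath)).isDirectory()) continue;
    for (const picture of await fsp.readdir(dirPath)) {
      const picPath = path.join(dirPath, picture);
      if ((await fsp.lstat(picPath)).isFile() && picture.startsWith("norm")) {
        images.push(await fileToTensor(picPath)); // sequential; use Promise.all to parallelize
      }
    }
  }
  return images;
}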
Hey, I've updated your code a bit, please go through it once. It might be helpful :)

const Util = require('util');
const Fs = require('fs');

let fsReadDir = Util.promisify(Fs.readdir);
let fsStat = Util.promisify(Fs.stat);
let picturePromises = [];
let directories = await fsReadDir(rootFolder);

for (let dirIndex = 0; dirIndex < directories.length; dirIndex++) {
  let dirName = directories[dirIndex];
  let stat = await fsStat(path.join(rootFolder, dirName));
  if (stat.isDirectory()) {
    let pictures = await fsReadDir(path.join(rootFolder, dirName));
    for (let picIndex = 0; picIndex < pictures.length; picIndex++) {
      let picture = pictures[picIndex];
      let picStat = await fsStat(path.join(rootFolder, dirName, picture));
      if (picStat.isFile() && picture.startsWith("norm")) {
        let pTensor = fileToTensor(path.join(rootFolder, dirName, picture)).then((img) => {
          buffer.push(img);
        }).catch((error) => { console.log(error) });
        picturePromises.push(pTensor);
      }
    }
  }
}

// wait for every picture before reading the buffer
await Promise.all(picturePromises);
console.log(buffer);

async function fileToTensor(path) {
  return await sharp(path)
    .removeAlpha()
    .raw()
    .toBuffer({ resolveWithObject: true });
}

Merged gulp tasks never fire `end` event

I've got a gulp task that loops through a folder looking for sub folders and outputs a JavaScript file based upon the contents of each folder. Below is a more visual example.
src
  assets
    scripts
      critical
        loadCSS.init.js
      legacy
        flexibility.init.js
        picturefill.init.js
      modern
        connectivity.js
        layzr.init.js
        menu_list.js
        scroll-hint.init.js
        slideout.init.js
        swiper.init.js
      service-worker
        service-worker.js
becomes:
dev
  assets
    scripts
      critical.js
      legacy.js
      modern.js
      service-worker.js
This is achieved by reading the contents of the src/assets/scripts directory, then running a loop against each folder (critical, legacy, modern, service-worker) and sending the contents of each folder to a Gulp task; the resulting streams get merged together with merge-stream.
All this works great, except that once the tasks are merged back together, I want to trigger a notification if the compilation succeeded. If I try to pipe anything to the merged streams, it doesn't work. It just returns the merged streams, and never continues on.
If I un-promisify my PROCESS_SCRIPTS function and don't use merge-stream (i.e. only processing one manually specified folder), it works fine, so I'm at a loss as to what's going on.
Here's my full task:
module.exports = {
    scripts(gulp, plugins, ran_tasks, on_error) {
        // task-specific plugins
        const ESLINT = require("gulp-eslint");
        const WEBPACK = require("webpack-stream");

        // process scripts
        const PROCESS_SCRIPTS = (js_directory, destination_file_name = "modern.js", compare_file_name = "modern.js", source = [global.settings.paths.src + "/assets/scripts/*.js"]) => {
            return new Promise((resolve, reject) => {
                const WEBPACK_CONFIG = {
                    mode: "development",
                };

                // update webpack config for the current target destination and file name
                WEBPACK_CONFIG.mode = plugins.argv.dist ? "production" : WEBPACK_CONFIG.mode;
                WEBPACK_CONFIG.output = {
                    filename: destination_file_name
                };

                const TASK = gulp.src(source)
                    // prevent breaking on error
                    .pipe(plugins.plumber({errorHandler: on_error}))
                    // check if source is newer than destination
                    .pipe(plugins.newer(js_directory + "/" + compare_file_name))
                    // lint all scripts
                    .pipe(ESLINT())
                    // print lint errors
                    .pipe(ESLINT.format())
                    // run webpack
                    .pipe(WEBPACK(WEBPACK_CONFIG))
                    // generate a hash and add it to the file name
                    .pipe(plugins.hash({template: "<%= name %>.<%= hash %><%= ext %>"}))
                    // output scripts to compiled directory
                    .pipe(gulp.dest(js_directory))
                    // generate a hash manifest
                    .pipe(plugins.hash.manifest(".hashmanifest-scripts", {
                        deleteOld: true,
                        sourceDir: js_directory
                    }))
                    // output hash manifest in root
                    .pipe(gulp.dest("."))
                    // reject after errors
                    .on("error", () => {
                        reject(TASK);
                    })
                    // return the task after completion
                    .on("end", () => {
                        resolve(TASK);
                    });
            });
        };

        // scripts task, lints, concatenates, & compresses JS
        return new Promise((resolve) => {
            // set JS directory
            const JS_DIRECTORY = plugins.argv.dist ? global.settings.paths.dist + "/assets/scripts" : global.settings.paths.dev + "/assets/scripts";
            // set the source directory
            const SOURCE_DIRECTORY = global.settings.paths.src + "/assets/scripts";
            // set up an empty merged stream
            const MERGED_STREAMS = plugins.merge();
            // get the script source folder list
            const SCRIPT_FOLDERS = plugins.fs.readdirSync(SOURCE_DIRECTORY);
            // get the script destination file list
            const SCRIPT_FILES = plugins.fs.existsSync(JS_DIRECTORY) ? plugins.fs.readdirSync(JS_DIRECTORY) : false;

            // process all the script folders
            const PROCESS_SCRIPT_FOLDERS = () => {
                return Promise.resolve().then(() => {
                    // shift to the next folder
                    const FOLDER_NAME = SCRIPT_FOLDERS.shift();
                    // find the existing destination script file name
                    const FILE_NAME = SCRIPT_FILES ? SCRIPT_FILES.find((name) => {
                        return name.match(new RegExp(FOLDER_NAME + ".[a-z0-9]{8}.js"));
                    }) : FOLDER_NAME + ".js";
                    // process all scripts, update the stream
                    return PROCESS_SCRIPTS(JS_DIRECTORY, FOLDER_NAME + ".js", FILE_NAME, SOURCE_DIRECTORY + "/" + FOLDER_NAME + "/**/*").then((processed) => {
                        MERGED_STREAMS.add(processed);
                    });
                }).then(() => SCRIPT_FOLDERS.length > 0 ? PROCESS_SCRIPT_FOLDERS() : resolve());
            };

            PROCESS_SCRIPT_FOLDERS().then(() => {
                // wrap up
                return MERGED_STREAMS
                    // prevent breaking on error
                    .pipe(plugins.plumber({
                        errorHandler: on_error,
                    }))
                    // notify that task is complete, if not part of default or watch
                    .pipe(plugins.gulpif(gulp.seq.indexOf("scripts") > gulp.seq.indexOf("default"), plugins.notify({
                        title: "Success!",
                        message: "Scripts task complete!",
                        onLast: true,
                    })))
                    // push task to ran_tasks array
                    .on("data", () => {
                        if (ran_tasks.indexOf("scripts") < 0) {
                            ran_tasks.push("scripts");
                        }
                    })
                    // resolve the promise on end
                    .on("end", () => {
                        resolve();
                    });
            });
        });
    }
};
Also visible on my GitHub: https://github.com/JacobDB/new-site/blob/master/gulp-tasks/scripts.js
EDIT: I've tried a few things; I'll detail them here:
1. console.log("hello world") never fires after MERGED_STREAMS.on("data"), MERGED_STREAMS.on("error"), or MERGED_STREAMS.on("end").
2. Moving const MERGED_STREAMS = plugins.merge(); to a module-level variable (i.e. just after const WEBPACK = require("webpack-stream")) does not change the outcome.
3. Doing #2 and then using MERGED_STREAMS.add(gulp.src(source) ...) instead of adding the stream after the promise completes does not change the outcome, except when leaving in .pipe(gulp.dest(".")), which is required to output a .hashmanifest, and always marks the task as ran.
4. Disabling webpack, hash, or eslint, in any combination, does not make a difference.
5. Changing PROCESS_SCRIPTS from returning a promise to returning the stream, then processing each folder as an individual variable, then merging them manually does appear to correctly trigger the task as ran, but webpack can only be run once, so it only outputs one file - critical.hash.js. Note: I haven't tested this method in conjunction with disabling hash, which may be causing it to be marked as correctly ran if .hashmanifest is always being output.
6. Splitting the linting step and the webpack step into separate tasks kind of causes the task to be correctly marked as ran, but only if the lint task is not a promise, which results in unexpected end of stream errors in the console.
EDIT 2: Updated with a revised version of my task based on @Louis's advice.
There are many problems with the code above. One major issue that makes the code hard to follow and debug is that you use new Promise where you don't need it. Generally, if you have new Promise and the logic inside the promise executor will resolve or reject depending on the result of another promise, then you don't need to use new Promise.
Sometimes people have code like this:
function foo() {
  const a = getValueSynchronously();
  const b = getOtherValueSynchronously();
  return doSomethingAsynchronous(a, b).then(x => x.someMethod());
}
Suppose that doSomethingAsynchronous returns a promise. The problem with the foo function above is that if getValueSynchronously or getOtherValueSynchronously fail, then foo will raise an exception, but if doSomethingAsynchronous fails, it will reject a promise. So code that uses foo has to handle synchronous exceptions and asynchronous rejections if it wants to handle all possible failures. Sometimes people feel they can fix the issue by making all failures promise rejections:
function foo() {
  return new Promise((resolve, reject) => {
    const a = getValueSynchronously();
    const b = getOtherValueSynchronously();
    doSomethingAsynchronous(a, b).then(x => x.someMethod()).then(resolve, reject);
  });
}
In the code above, if getValueSynchronously or getOtherValueSynchronously fail, that's a promise rejection. But the problem with this code is that it is easy to get wrong. You can forget to call reject everywhere it is needed. (As a matter of fact, you do have this error in your code. You have nested promises whose rejections won't be propagated up. They are just lost, which means that if an error occurs your code will simply stop, without you knowing why.) Or you may be tempted to call resolve way down in a nested function, which makes the logic hard to follow.
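For illustration, here is a minimal sketch of such a lost rejection; doSomethingElse is a hypothetical promise-returning operation:

function foo() {
  return new Promise((resolve, reject) => {
    doSomethingAsynchronous().then(x => {
      // hypothetical nested step: if doSomethingElse(x) rejects,
      // nothing calls reject, so foo's promise never settles
      doSomethingElse(x).then(resolve);
    }, reject);
  });
}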
You can just as well do:
function foo() {
  return Promise.resolve().then(() => {
    const a = getValueSynchronously();
    const b = getOtherValueSynchronously();
    return doSomethingAsynchronous(a, b);
  }).then(x => x.someMethod());
}
You can use Promise.resolve() to enter the promisified world (hmm... "the promised land?"). In the code above, you do not have to remember to call reject. If the code inside the callback to .then fails for any reason, you get a rejected promise.
I also noticed in a number of places you return a value from the executor function you pass to new Promise. Your code would behave exactly the same if you did not use return there. To illustrate, this code:
function foo() {
  return new Promise((resolve, reject) => {
    return doSomethingAsynchronous().then(resolve, reject);
  });
}
behaves exactly the same way as this code:
function foo() {
  return new Promise((resolve, reject) => {
    doSomethingAsynchronous().then(resolve, reject);
  });
}
The value returned from the executor is ignored. End of story. If you think the values you return from your executors are doing something, that's incorrect.

Return when first promise resolves

Goal
I have a bunch of file names in an array, and would like to read the contents of the first of the files that exists. They're config files, so it's important that the order is deterministic, so I can't use .race(). The version I have below maps over each file in order, tries to load it, and if it loads successfully, calls resolve.
Problems
Here are a couple of issues with this implementation:
Calling resolve(...) doesn't actually exit the loop, so the program opens every file in the list, even when it doesn't need to.
The rejection condition (which is required so the promise rejects when we don't receive any files) seems like a hack. However, if it's not there, the promise is never rejected.
The resolution code seems suspiciously like a promise anti-pattern.
Are there any better ways to structure this? I could probably do it with a single Promise.filter call, but I don't want to query every file if I don't need to.
Thanks
Code
var Promise = require('bluebird');
var fs = Promise.promisifyAll(require('fs'));
var _ = require('lodash');
new Promise((resolve, reject) => {
  // Resolve with the first of the files below that exists
  return Promise.mapSeries(
    ['./file_that_doesntexist.json', '../file_that_might_exist.json', './file_that_doesnt_exist_either.json', '../file_that_exists.json'],
    (filename) => fs.readFileAsync(filename, 'utf-8')
      .then(file => {
        resolve([filename, file]);
        return true;
      })
      .catch(_.stubFalse)
  )
  .then(files => { // this is required to reject when we don't receive any files
    if (!files.some(x => x))
      reject('did not receive any files');
  });
})
.then(function([filename, configFile]) {
  // do something with filename and configFile
})
.catch(function(err) {
  console.error(err)
})
This can be achieved by recursion but also by building a catch chain using Array#reduce():
var paths = ['./file_that_doesntexist.json', '../file_that_might_exist.json', './file_that_doesnt_exist_either.json', '../file_that_exists.json'];

// Resolve with the first of the files below that exists
paths.reduce(function(promise, path) {
  return promise.catch(function(error) {
    return fs.readFileAsync(path, 'utf-8').then(file => [path, file]);
  });
}, Promise.reject())
.then(function([filename, configFile]) {
  // do something with filename and configFile
})
.catch(function(err) {
  console.error('did not receive any files', err);
});
The catch chain ensures that every time fs.readFileAsync(path, 'utf-8') fails, the next path is tried.
The first successful fs.readFileAsync(path, 'utf-8') will drop through to .then(function([filename, configFile]) {...}.
Total failure will drop through to .catch(function(err) {...}.
If you want sequential iteration, just use a recursive approach:
var Promise = require('bluebird');
var fs = Promise.promisifyAll(require('fs'));
function readFirstOf(filenames) {
  if (!filenames.length)
    return Promise.reject(new Error('did not receive any files'));
  return fs.readFileAsync(filenames[0], 'utf-8')
    .then(file =>
      [filenames[0], file]
    , err =>
      readFirstOf(filenames.slice(1))
    );
}

readFirstOf(['./file_that_doesntexist.json', '../file_that_might_exist.json', './file_that_doesnt_exist_either.json', '../file_that_exists.json'])
  .then(function([filename, configFile]) {
    // do something with filename and configFile
  })
  .catch(function(err) {
    console.error(err)
  })
If you want to try to read them all in parallel and then select the first successful one in the list, you can use Promise.map + .reflect() and then just filter the results (e.g. via _.find), as sketched below.
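A minimal sketch of that parallel approach, assuming Bluebird's Promise.map and .reflect() inspection API:

var filenames = ['./file_that_doesntexist.json', '../file_that_might_exist.json', './file_that_doesnt_exist_either.json', '../file_that_exists.json'];

Promise.map(filenames, filename =>
  fs.readFileAsync(filename, 'utf-8').reflect()
).then(inspections => {
  // pick the first fulfilled read, preserving the original order
  var index = inspections.findIndex(i => i.isFulfilled());
  if (index < 0)
    throw new Error('did not receive any files');
  return [filenames[index], inspections[index].value()];
}).then(function([filename, configFile]) {
  // do something with filename and configFile
}).catch(function(err) {
  console.error(err);
});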
There is this hackish approach to solve the problem neatly. You may invert the promises, like:
var invert = pr => pr.then(v => Promise.reject(v), x => Promise.resolve(x));
which in fact comes in handy when used with Promise.all() to get the first resolving promise while ignoring the rejected ones. I mean, once inverted, all rejected (now resolved) promises may go unnoticed, while the first resolving (now rejecting) one gets caught at the .catch() stage of Promise.all(). Cool..!
Watch this:
var invert = pr => pr.then(v => Promise.reject(v), x => Promise.resolve(x)),
    promises = [Promise.reject("No such file"),
                Promise.reject("No such file either"),
                Promise.resolve("This is the first existing files content"),
                Promise.reject("Yet another missing file"),
                Promise.resolve("Another file content here..!")];

Promise.all(promises.map(pr => invert(pr)))
       .catch(v => console.log(`First successfully resolving promise is: ${v}`));

How to unit test a function which calls another that returns a promise?

I have a node.js app using express 4 and this is my controller:
var service = require('./category.service');

module.exports = {
  findAll: (request, response) => {
    service.findAll().then((categories) => {
      response.status(200).send(categories);
    }, (error) => {
      response.status(error.statusCode || 500).json(error);
    });
  }
};
It calls my service which returns a promise. Everything works but I am having trouble when trying to unit test it.
Basically, I would like to make sure that based on what my service returns, I flush the response with the right status code and body.
So with mocha and sinon it looks something like:
it('Should call service to find all the categories', (done) => {
  // Arrange
  var expectedCategories = ['foo', 'bar'];
  var findAllStub = sandbox.stub(service, 'findAll');
  findAllStub.resolves(expectedCategories);

  var response = {
    status: () => { return response; },
    send: () => {}
  };
  sandbox.spy(response, 'status');
  sandbox.spy(response, 'send');

  // Act
  controller.findAll({}, response);

  // Assert
  expect(findAllStub.called).to.be.ok;
  expect(findAllStub.callCount).to.equal(1);
  expect(response.status).to.be.calledWith(200); // not working
  expect(response.send).to.be.called; // not working
  done();
});
I have tested similar scenarios where the function under test itself returns a promise, since there I can hook my assertions into the then.
I also tried to wrap controller.findAll in a Promise and resolve it from response.send, but that didn't work either.
You should move your assert section into the res.send method to make sure all async tasks are done before the assertions:
var response = {
  status: () => { return response; },
  send: () => {
    try {
      // Assert
      expect(findAllStub.called).to.be.ok;
      expect(findAllStub.callCount).to.equal(1);
      expect(response.status).to.be.calledWith(200); // not working
      // expect(response.send).to.be.called; // not needed anymore
      done();
    } catch (err) {
      done(err);
    }
  },
};
The idea here is to have the promise which service.findAll() returns accessible inside the test's code without calling the service. As far as I can see, sinon-as-promised (which you are probably using) does not allow this. So I just used a native Promise (I hope your Node version is not too old for it).
const aPromise = Promise.resolve(expectedCategories);
var findAllStub = sandbox.stub(service, 'findAll');
findAllStub.returns(aPromise);

// response = { .... }

controller.findAll({}, response);

aPromise.then(() => {
  expect(response.status).to.be.calledWith(200);
  expect(response.send).to.be.called;
});
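Note that mocha will only surface a failing assertion inside that then callback if the test returns the chain (or calls done in both branches); a sketch of wiring it up:

it('responds with 200 and sends the categories', () => {
  controller.findAll({}, response);
  // return the chain so mocha waits for it and reports assertion failures
  return aPromise.then(() => {
    expect(response.status).to.be.calledWith(200);
    expect(response.send).to.be.calledWith(expectedCategories);
  });
});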
When code is difficult to test, it can indicate that there are different design possibilities to explore which promote easier testing. What jumps out is that service is enclosed in your module, and the dependency is not exposed at all. I feel like the goal shouldn't be to find a way to test your code AS IS, but to find an optimal design.
IMO the goal is to find a way to expose service so that your test can provide a stubbed implementation, so that the logic of findAll can be tested in isolation, synchronously.
One way to do this is to use a library like mockery or rewire. Both are fairly easy to use (in my experience mockery starts to degrade and becomes very difficult to maintain as your test suite and number of modules grow). They would allow you to patch the var service = require('./category.service'); by providing your own service object with its own findAll defined.
Another way is to rearchitect your code to expose the service to the caller in some way. This would allow your caller (the unit test) to provide its own service stub.
One easy way to do this would be to export a function constructor instead of an object.
module.exports = function (userService) {
  // default to the required service
  this.service = userService || service;
  this.findAll = (request, response) => {
    this.service.findAll().then((categories) => {
      response.status(200).send(categories);
    }, (error) => {
      response.status(error.statusCode || 500).json(error);
    });
  };
};

var ServiceConstructor = require('yourmodule');
var service = new ServiceConstructor();
Now the test can create a stub for service and provide it to the ServiceConstructor to exercise the findAll method. Removing the need for an asynchronous test altogether.
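For instance, a sketch of such a test, reusing the stubbed-promise trick from the earlier answer (names are illustrative):

var ControllerConstructor = require('yourmodule');

it('sends categories with status 200', () => {
  var expectedCategories = ['foo', 'bar'];
  var findAllResult = Promise.resolve(expectedCategories);
  var serviceStub = { findAll: () => findAllResult };
  var controller = new ControllerConstructor(serviceStub);
  var response = {
    status: sinon.stub().returnsThis(), // lets response.status(200).send(...) chain
    send: sinon.spy()
  };

  controller.findAll({}, response);

  // wait for the stubbed promise so the controller's handlers have run
  return findAllResult.then(() => {
    expect(response.status).to.be.calledWith(200);
    expect(response.send).to.be.calledWith(expectedCategories);
  });
});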
