I have a Node application, and I took the following functionality and put it in a separate file, in a new folder, as a new module. In this file I need to handle actions like save, delete, edit, etc. I have two questions:
Should I separate the functionality inside this file into actions and expose each one differently?
In any case, how should I call this functionality with the parameters that all the actions need, like req, res, and the path?
I'm looking for concrete examples.
This is the code that I use:
module.exports = function () {
  const fs = require('fs')

  function fileAction (req, res, urlAction) {
    switch (urlAction) {
      case 'save':
        const writeStream = fs.createWriteStream('c://myfile.txt', { flags: 'w' })
        req.pipe(writeStream)
        req.on('end', function () {
          console.log('Finished updating the data file')
        })
        res.end()
        break
      case 'delete':
      case 'update':
      default:
    }
  }
  // fileAction is defined here but never returned or exported
}
I like this approach more than defining the functions directly on the export object.
With this approach, the concern of "exporting" is separated from the implementation of the functions. You can also rename the functions as you export them, and, most importantly, you have better control over what you do and do not want to export.
// "delete" is a reserved word, so it cannot be used as a variable name,
// but it is fine as a property name on module.exports.
var remove = function () {
};
var update = function () {
};
var save = function () {
};
module.exports.update = update;
module.exports.delete = remove;
module.exports.save = save;
Then you'll be able to call methods from your main file:
var file = require('./file.js');
file.save();
file.delete();
file.update();
You should do something more object-oriented:
module.exports = {
save: function () {
},
delete: function () {
},
update: function () {
}
}
Then you'll be able to call methods from your main file:
const FileLib = require('./fileLib.js')
FileLib.save()
If you plan to use this as logic inside an Express application, you do not really need to use req and res directly from inside your module, unless you are writing an Express middleware or a router.
What I would recommend is calling your library from the router:
const FileLib = require('./fileLib.js')
router.put('/file/:id', function (req, res) {
  // Do your stuff with your library
  // (req.param() is deprecated in Express 4; prefer req.params, req.body or req.query)
  FileLib.save(req.param('fileToSave'))
  res.send()
})
Your library should not be too coupled to the Express architecture unless it's a middleware.
Writing RESTful Express routing might also be a good idea. Use HTTP verbs to specify your action to the API.
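For example, the save, update, and delete actions could be mapped onto HTTP verbs roughly like this. This is only a sketch: the route paths, status codes, and the FileLib method signatures are assumptions, not part of the module above.
// routes/files.js (hypothetical wiring of the library onto REST routes)
const express = require('express')
const FileLib = require('./fileLib.js')

const router = express.Router()

// POST /files -> save a new file
router.post('/files', function (req, res) {
  FileLib.save(req.body)
  res.sendStatus(201)
})

// PUT /files/:id -> update an existing file
router.put('/files/:id', function (req, res) {
  FileLib.update(req.params.id, req.body)
  res.sendStatus(204)
})

// DELETE /files/:id -> delete a file
router.delete('/files/:id', function (req, res) {
  FileLib.delete(req.params.id)
  res.sendStatus(204)
})

module.exports = router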
There is a function in one of my project files whose name I changed, and now I can't access it by the new name! The old name is still available to call.
I tried deleting node_modules and installing it again using npm i.
Here is the code for both files I'm using:
soapCall.js
before:
function call(username, password){
...
}
module.exports = call
after:
function checkUser(username, password){
...
}
module.exports = checkUser
how I imported and used:
app.js
const soap = require('../../models/soapCall');
...
soap.checkUser(username, password);
It's weird that I still can't access the function by its new name.
I was using the name call before that, and I can STILL use the call function in my app.js file.
call is already available as a method on the function prototype - Function.prototype.call. This means soap is a function, which is why call works, but checkUser doesn't.
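To make the trap concrete, here is a small sketch using the import from the question (username and password are whatever variables you already have in scope):
const soap = require("../../models/soapCall"); // soap === the exported checkUser function

soap(username, password);      // the direct, intended call
soap.call(username, password); // Function.prototype.call: runs the function with `this` set
                               // to username and only password passed as an argument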
soap is a function because you're exporting the function itself from your file and simply giving it a different name in your main file. If you want to change the name, either change the import name:
const checkUser = require("../../models/soapCall");
Or export an object and use it as such:
module.exports = { checkUser };
// Main file
const soap = require("../../models/soapCall");
soap.checkUser(...);
The object method will also allow you to export multiple functions from the one file - you can get these into their own variables with destructuring:
module.exports = { checkUser, otherFunc };
// Main file
const { checkUser, otherFunc } = require("../../models/soapCall");
checkUser(...); // Calls checkUser function
otherFunc(...); // Calls otherFunc function
You are exporting a function, not an object, so you need to call soap() directly.
As for being able to run call: it is part of the function prototype, so you got confused with Function.prototype.call().
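In other words, with your original export the working call is the function itself:
const soap = require('../../models/soapCall');
soap(username, password); // invokes the exported checkUser function directly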
Alternatively, use it like this, as per the comment by @Wiktor Zychla:
soapCall.js
exports.checkUser = function (username, password) {
};
app.js
const soap = require('../../models/soapCall');
soap.checkUser(username, password);
Within an Express API I'm using the Postmark library to send an email. The client is initialized like this:
var postmark = require("postmark");
var client = new postmark.Client("aaaa-bbbbb-cccc");
And then used to send a password reset mail later on with:
client.sendEmailWithTemplate(
// Options here
);
Now, I would like to test that this function has been called, but I'm having difficulty figuring out how to mock/spy on it.
I have tried the following (simplified):
const request = require("supertest");
const app = require("../server/app");
const postmark = require("postmark");
jest.mock("postmark");
describe("API Tests", () => {
test("it should give a reset link when requesting with existing e-mail address", () => {
return request(app)
.post("/api/auth/passwordreset")
.send({
email: "user1#test.test"
})
.then(response => {
expect(postmark.Client).toHaveBeenCalled();
});
});
});
This works, but it only tests that postmark has been used, since I can't figure out how to actually test the client.sendEmailWithTemplate method.
Any suggestions on how to accomplish this?
EDIT: following up on @samanime's answer, I created a repo to illustrate the 'challenge':
https://github.com/Hyra/jest_test_example
You can specifically mock out the Client function that is returned by the mocked postmark to return an object with mocked functions.
In Jest, you can provide specific mocking code for a node_modules package by creating a folder named __mocks__ at the same level as node_modules, i.e.:
/project-root
  /node_modules
  /__mocks__
Note, that is two underscores on each side.
In there, create a file named <package_name>.js (in your case, postmark.js). Jest will then load whatever that file exports when you use the mock.
In that file, you can mock it out as needed. Something like this would probably work:
// global jest
module.exports = {
Client: jest.fn(() => ({
sendEmailWithTemplate: jest.fn(() => {})
}))
};
It doesn't have to be as compact as this, but basically it makes postmark have a function called Client which returns an object that has a function called sendEmailWithTemplate, both of which are mocks/spies.
Then you can check that postmark.Client was called, and that sendEmailWithTemplate was called on the instance it returned.
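For example, the test from the question could assert on the instance that the mocked constructor returned. This is only a sketch: the route and payload follow the question, and mock.results is standard Jest, but double-check it against your Jest version.
const request = require("supertest");
const postmark = require("postmark"); // resolved to the manual mock above
const app = require("../server/app");

jest.mock("postmark");

test("it sends the reset mail with a template", async () => {
  await request(app)
    .post("/api/auth/passwordreset")
    .send({ email: "user1@test.test" });

  expect(postmark.Client).toHaveBeenCalled();

  // The sendEmailWithTemplate spy lives on the object the mocked constructor returned
  const clientInstance = postmark.Client.mock.results[0].value;
  expect(clientInstance.sendEmailWithTemplate).toHaveBeenCalled();
});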
The one gotcha is that you'll need to reset all of these between tests. You could do this manually in your beforeEach(), but if you are going to reuse it, I like to add an extra function named __reset() which resets the mocks, and just call that:
// global jest
// Share a single spy for sendEmailWithTemplate so it can be cleared and asserted on
const sendEmailWithTemplate = jest.fn(() => {});

const mockedPostmark = {
  Client: jest.fn(() => ({
    sendEmailWithTemplate: sendEmailWithTemplate
  }))
};

mockedPostmark.__reset = () => {
  mockedPostmark.Client.mockClear();
  sendEmailWithTemplate.mockClear();
};

module.exports = mockedPostmark;
You can add additional functions as needed as well.
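A test file that relies on this mock can then reset it between tests, for example (a sketch, assuming the __reset helper above):
const postmark = require("postmark"); // the manual mock from __mocks__/postmark.js

beforeEach(() => {
  postmark.__reset();
});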
I'm new to Node.js development, coming from the .NET world.
I'm searching the web for best practices regarding DI / DIP in JavaScript.
In .NET I would declare my dependencies in the constructor, whereas in JavaScript I see that a common pattern is to declare dependencies at the module level via a require statement.
To me it looks like when I use require I'm coupled to a specific file, while using a constructor to receive my dependency is more flexible.
What would you recommend as a best practice in JavaScript? (I'm looking for the architectural pattern, not an IoC technical solution.)
Searching the web, I came across this blog post (which has some very interesting discussion in the comments):
https://blog.risingstack.com/dependency-injection-in-node-js/
It summarizes my conflict pretty well.
Here's some code from the blog post to illustrate what I'm talking about:
// team.js
var User = require('./user');
function getTeam(teamId) {
return User.find({teamId: teamId});
}
module.exports.getTeam = getTeam;
A simple test would look something like this:
// team.spec.js
var Team = require('./team');
var User = require('./user');

describe('Team', function() {
  it('#getTeam', function* () {
    var users = [{id: 1}, {id: 2}];

    this.sandbox.stub(User, 'find', function() {
      return Promise.resolve(users);
    });

    var team = yield Team.getTeam();

    expect(team).to.eql(users);
  });
});
VS DI:
// team.js
function Team(options) {
  this.options = options;
}

Team.prototype.getTeam = function(teamId) {
  return this.options.User.find({teamId: teamId});
};

function create(options) {
  return new Team(options);
}

module.exports.create = create;
test:
// team.spec.js
var Team = require('./team');

describe('Team', function() {
  it('#getTeam', function* () {
    var users = [{id: 1}, {id: 2}];
    var fakeUser = {
      find: function() {
        return Promise.resolve(users);
      }
    };

    var team = Team.create({
      User: fakeUser
    });

    var result = yield team.getTeam();

    expect(result).to.eql(users);
  });
});
Regarding your question: I don't think that there is a common practice in the JS community. I've seen both types in the wild, require modifications (like rewire or proxyquire) and constructor injection (often using a dedicated DI container). However, personally, I think not using a DI container is a better fit with JS. And that's because JS is a dynamic language with functions as first-class citizens. Let me explain that:
Using DI containers enforces constructor injection for everything. It creates a huge configuration overhead, for two main reasons:
Providing mocks in unit tests
Creating abstract components that know nothing about their environment
Regarding the first argument: I would not adjust my code just for my unit tests. If it makes your code cleaner, simpler, more versatile and less error-prone, then go for it. But if your only reason is your unit test, I would not take the trade-off. You can get pretty far with require modifications and monkey patching. And if you find yourself writing too many mocks, you should probably not write a unit test at all, but an integration test. Eric Elliott has written a great article about this problem.
Regarding the second argument: This is a valid argument. If you want to create a component that only cares about an interface, but not about the actual implementation, I would opt for a simple constructor injection. However, since JS does not force you to use classes for everything, why not just use functions?
In functional programming, separating stateful IO from actual processing is a common paradigm. For instance, if you're writing code that is supposed to count file types in a folder, one could write this (especially when coming from a language that enforces classes everywhere):
const fs = require("fs");
class FileTypeCounter {
countFileTypes(dirname, callback) {
fs.readdir(dirname, function (err) {
if (err) return callback(err);
// recursively walk all folders and count file types
// ...
callback(null, fileTypes);
});
}
}
Now if you want to test that, you need to change your code in order to inject a fake fs module:
class FileTypeCounter {
constructor(fs) {
this.fs = fs;
}
countFileTypes(dirname, callback) {
this.fs.readdir(dirname, function (err) {
// ...
});
}
}
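For example, a unit test could now hand the class a stub instead of the real fs module. This is just a sketch: the fake directory listing and the expected counts are illustrative, since the counting logic above is elided.
// a fake fs exposing only what FileTypeCounter uses
const fakeFs = {
    readdir(dirname, callback) {
        callback(null, ["a.txt", "b.js", "c.js"]);
    }
};

const counter = new FileTypeCounter(fakeFs);
counter.countFileTypes("/any/dir", function (err, fileTypes) {
    // assert here, e.g. that fileTypes equals { txt: 1, js: 2 }
});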
Now, everyone who is using your class needs to inject fs into the constructor. Since this is boring and makes your code more complicated once you have long dependency graphs, developers invented DI containers where they can just configure stuff and the DI container figures out the instantiation.
However, what about just writing pure functions?
function fileTypeCounter(allFiles) {
// count file types
return fileTypes;
}
function getAllFilesInDir(dirname, callback) {
// recursively walk all folders and collect all files
// ...
callback(null, allFiles);
}
// now let's compose both functions
function getAllFileTypesInDir(dirname, callback) {
getAllFilesInDir(dirname, (err, allFiles) => {
callback(err, !err && fileTypeCounter(allFiles));
});
}
Now you have two super-versatile functions out of the box, one that does IO and one that processes the data. fileTypeCounter is a pure function and super easy to test. getAllFilesInDir is impure, but such a common task that you'll often find it already on npm, where other people have written integration tests for it. getAllFileTypesInDir just composes your functions with a little bit of control flow. This is a typical case for an integration test where you want to make sure that your whole application is working correctly.
By separating your code between IO and data processing, you won't find the need to inject anything at all. And if you don't need to inject anything, that's a good sign. Pure functions are the easiest thing to test and are still the easiest way to share code between projects.
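To illustrate how little friction that means, here is one possible fileTypeCounter implementation together with a test that needs no mocks at all (the extension-counting logic is just an illustration):
const assert = require("assert");

// one possible implementation of the pure function above
function fileTypeCounter(allFiles) {
    return allFiles.reduce(function (fileTypes, file) {
        const ext = file.split(".").pop();
        fileTypes[ext] = (fileTypes[ext] || 0) + 1;
        return fileTypes;
    }, {});
}

// testing it needs no stubs, no DI container, no file system
assert.deepStrictEqual(
    fileTypeCounter(["a.txt", "b.js", "c.js"]),
    { txt: 1, js: 2 }
);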
In the past, DI containers as we know them from Java and .NET did not exist. With Node 6 came ES6 Proxies which opened up the possibility of such containers - Awilix for example.
So let's rewrite your code to modern ES6.
class Team {
  constructor ({ User }) {
    this.User = User
  }

  getTeam (teamId) {
    return this.User.find({ teamId: teamId })
  }
}
And the test:
import Team from './Team'

describe('Team', function () {
  it('#getTeam', async function () {
    const users = [{ id: 1 }, { id: 2 }]
    const fakeUser = {
      find: function () {
        return Promise.resolve(users)
      }
    }

    const team = new Team({
      User: fakeUser
    })

    const result = await team.getTeam()

    expect(result).to.eql(users)
  })
})
Now, using Awilix, let's write our composition root:
import { createContainer, asClass } from 'awilix'
import Team from './Team'
import User from './User'
const container = createContainer()
.register({
Team: asClass(Team),
User: asClass(User)
})
// Grab an instance of Team
const team = container.resolve('Team')
// Alternatively: const team = container.cradle.Team

// Use it
team.getTeam(123) // calls User.find()
That's as simple as it gets; Awilix can handle object lifetimes as well, just like the .NET / Java containers do. This lets you do cool stuff like injecting the current user into your services, instantiating your services once per HTTP request, etc.
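As a rough sketch of what per-request lifetimes can look like with Express (the builder methods shown, .scoped(), createScope() and asValue, exist in recent Awilix versions, but check your version's docs; req.user and currentUser are hypothetical):
import express from 'express'
import { createContainer, asClass, asValue } from 'awilix'
import Team from './Team'
import User from './User'

const app = express()

const container = createContainer()
  .register({
    Team: asClass(Team).scoped(), // one instance per scope (here: per request)
    User: asClass(User).scoped()
  })

// Create a child scope per request and register request-specific values on it
app.use((req, res, next) => {
  req.scope = container.createScope()
  req.scope.register({ currentUser: asValue(req.user) })
  next()
})

app.get('/team/:id', (req, res) => {
  const team = req.scope.resolve('Team')
  team.getTeam(req.params.id).then(result => res.send(result))
})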
I am trying to follow the facade design pattern in a Node.js application, where I have one object, controller.js, that is used by the rest of the application as the facade. The controller delegates calls to user.js, animal.js, and house.js, which are all separate files.
In controller.js I do
var housecontroller = require("./controllers/housecontroller");
...
I want to call something like controller.getHouse() in another file (client). How do I make it so that I can do that and not have to call housecontroller.getHouse()?
Each of my controllers is formatted as follows:
module.exports = {
getHouse:function(){...},
...
}
I'm a bit confused about how to properly export things in order to get this to work. I import/export the controllers and their methods in controller.js as follows:
module.exports = {
getHouse : housecontroller.getHouse,
...
};
I use only the house controller in the examples, but it's implied that I do the same for user and animal, which all have several methods. In the client, I just import controller.js and use its methods.
var controller = require("./controller");
controller.getHouse();
According to your code/naming, you could have a file controller.js in the controllers folder with something like this:
var housecontroller = require('./housecontroller');
var carcontroller = require('./carcontroller');

module.exports = {
  getHouse: housecontroller.getHouse,
  getCar: carcontroller.getCar
};
The client code could be:
var controller = require('./controllers/controller');
controller.getHouse();
I added carcontroller as an example of extension.
If you only have one function to offer from each controller you can change your code and these examples to:
//housecontroller.js
module.exports = getHouse;
//controller.js
var housecontroller = require('./housecontroller');
var carcontroller = require('./carcontroller');
module.exports = {
getHouse: housecontroller,
getCar: carcontroller
};
I don't recommend it, though, because it reduces the opportunity to offer more functions from that module in the future.
In the routes folder of a Node.js app I have a file entries.js which has the following function:
exports.form = function(req, res){
res.render('post', { title: 'Post' });
};
Is it actually possible to launch something like this from another exports function in the same file, such as:
exports.something = function(req, res){
  this.form(req, res);
};
Where this.form refers to the exports.form function in the same file.
Thank you!
By default, the value of this points to the object on which the function is called. In this case, both functions are members of the same exports object, so calling this.form from inside exports.something works as expected when something is invoked as a member of that object.
The exceptions to this rule are when the function in question is used together with bind, call, or apply, or when it is called detached from its object.
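As a sketch of both options (the extra somethingElse handler and the explicit exports.form reference are illustrations, not something the answer above requires):
// entries.js
exports.form = function (req, res) {
  res.render('post', { title: 'Post' });
};

exports.something = function (req, res) {
  // Works when called as entries.something(req, res),
  // because `this` is then the exports object.
  this.form(req, res);
};

// If the function might be detached (e.g. passed directly as a route handler),
// referencing the export explicitly avoids relying on `this`:
exports.somethingElse = function (req, res) {
  exports.form(req, res);
};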