Mediate and share data between different modules - javascript

I am just trying to get my head around event-driven JS, so please bear with me. There are different kinds of modules within my app: some just encapsulate data, others manage a part of the DOM. Some modules depend on others; sometimes one module depends on the state of multiple other modules. I don't want them to communicate directly or pass one module to another just for easy access.
I tried to create the simplest scenario possible to illustrate my problem (the actual modules are much more complex of course):
I have a dataModule that just exposes some data:
var dataModule = { data: 3 };
There is a configModule that exposes modifiers for displaying that data:
var configModule = { factor: 2 };
Finally there is a displayModule that combines and renders the data from the two other modules:
var displayModule = {
  display: function(data, factor) {
    console.log(data * factor);
  }
};
I also have a simple implementation of pub-sub, so I could just mediate between the modules like this:
pubsub.subscribe("init", function() {
displayModule.display(dataModule.data, configModule.factor);
});
pubsub.publish("init"); // output: 6
However, this way I seem to end up with a mediator that has to know all of the module instances explicitly - is there even a way to avoid that? I also don't know how this would work if there are multiple instances of these modules. What is the best way to avoid global instance variables? I guess my question is: what would be the most flexible way to manage something like this? Am I on the right track, or is this completely wrong? Sorry for not being very precise with my question; I just need someone to push me in the right direction.

You are on the right track, I'll try to give you that extra push you're talking about:
If you want loose coupling, pub-sub is a good way to go.
But you don't really need that "mediator"; each module should ideally be autonomous and encapsulate its own logic.
This is done in the following way: each module depends on the pubsub service, subscribes to all relevant events and acts upon them. Each module also publishes events which might be relevant to others (code samples in a minute, bear with me).
I think the bit you might be missing here is that modules which use events will hardly ever be just plain models. They will have some logic in them and can also hold a model (which they update when receiving events).
So instead of a dataModule you are more likely to have a dataLoaderModule which will publish the data model (e.g. {data: 3}) once it finishes loading.
Another great requirement you set is sharing data while avoiding global instance variables - this is a very important concept and also a step in the right direction. What your solution is missing here is Dependency Injection, or at least a module system which allows defining dependencies.
You see, having an event-driven application doesn't necessarily mean that every piece of the code should communicate using events. An application configuration model or a utility service is definitely something I would inject (when using DI, like in Angular), require (when using AMD/CommonJS) or import (when using ES6 modules), rather than communicating with it using events.
In your example it's unclear whether configModule is a static app configuration or some knob I can tweak from the UI. If it's a static app config - I would inject it.
Now, let's see some examples:
Assuming the following:
Instead of a dataModule we have a dataLoaderModule
configModule is a static configuration model.
We are using AMD modules (and not ES6 modules, which I prefer), since I see you've stuck to ES5 features (I see no classes or consts).
We would have:
data-loader.js (aka dataLoaderModule)
define(['pubsub'], function (pubsub) {
  // ... load data using some logic ...
  // and publish it
  pubsub.publish('data-loaded', {data: 3});
});
configuration.js (aka configModule)
define([], function () {
  return {factor: 2};
});
display.js (aka displayModule)
define(['configuration', 'pubsub'], function (configuration, pubsub) {
  var displayModule = {
    display: function (data, factor) {
      console.log(data * factor);
    }
  };

  // the published payload is {data: 3}, so unwrap it before displaying
  pubsub.subscribe('data-loaded', function (payload) {
    displayModule.display(payload.data, configuration.factor);
  });
});
That's it.
You will notice that we have no global variables here (not even pubsub); instead, we are requiring (or injecting) our dependencies.
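One caveat worth spelling out: data-loader.js publishes as soon as its factory runs, so the entry point has to make sure display.js (the subscriber) is loaded first. A minimal main.js sketch (the file name is my assumption, not part of your setup):

// main.js - load the subscriber before the publisher,
// because data-loader publishes immediately when defined
require(['display'], function () {
  require(['data-loader']);
});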
Here you might be asking: "and what if I meant for my config to change from the UI?", so let's see that too:
In this case, I'd rather rename configModule to settingsDisplayModule (following your naming convention).
Also, in a more realistic app, UI modules will usually hold a model, so let's do that too.
And let's also call them "views" instead of "displayModules", so we will have:
data-loader.js (aka dataLoaderModule)
define(['pubsub'], function (pubsub) {
  // ... load data using some logic ...
  // and publish it
  pubsub.publish('data-loaded', {data: 3});
});
settings-view.js (aka settingsDisplayModule, aka config)
define(['pubsub'], function (pubsub) {
  var settingsModel = {factor: 2};

  var settingsView = {
    display: function () {
      console.log(settingsModel);
      // and when settings (aka config) change due to user interaction,
      // we publish the new settings ...
      pubsub.publish('setting-changed', settingsModel);
    }
  };
});
data-view.js (aka displayModule)
define(['pubsub'], function (pubsub) {
  var model = {
    data: null,
    factor: 0
  };

  var view = {
    display: function () {
      if (model.data && model.factor) {
        console.log(model.data * model.factor);
      } else {
        // whatever you do/show when you don't have data
      }
    }
  };

  // again, the payload is {data: 3}, so unwrap it into the model
  pubsub.subscribe('data-loaded', function (payload) {
    model.data = payload.data;
    view.display();
  });

  pubsub.subscribe('setting-changed', function (settings) {
    model.factor = settings.factor;
    view.display();
  });
});
And that's it.
Hope it helps :)
If not - comment!

You do not need a mediator. Just import data, config, and display and call display(data, config) where you need to.
// import data
// import config
// import display
function render() {
  display(data, config);
}
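Concretely, with ES modules that could look like this (the file names are assumed for illustration):

// data.js
export const data = 3;

// config.js
export const config = { factor: 2 };

// display.js
export function display(data, config) {
  console.log(data * config.factor);
}

// render.js
import { data } from './data.js';
import { config } from './config.js';
import { display } from './display.js';

function render() {
  display(data, config);
}

render(); // output: 6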

Related

Javascript dependency injection & DIP in node: require vs constructor injection

I'm new to NodeJS development, coming from the .NET world, and I'm searching the web for best practices regarding DI / DIP in JavaScript.
In .NET I would declare my dependencies in the constructor, whereas in JavaScript I see that a common pattern is to declare dependencies at the module level via a require statement.
To me it looks like when I use require I'm coupled to a specific file, while using a constructor to receive my dependency is more flexible.
What would you recommend doing as a best practice in JavaScript? (I'm looking for the architectural pattern, not an IoC technical solution.)
Searching the web I came across this blog post (which has some very interesting discussion in the comments):
https://blog.risingstack.com/dependency-injection-in-node-js/
It summarizes my conflict pretty well.
Here's some code from the blog post to illustrate what I'm talking about:
// team.js
var User = require('./user');

function getTeam(teamId) {
  return User.find({teamId: teamId});
}

module.exports.getTeam = getTeam;
A simple test would look something like this:
// team.spec.js
var Team = require('./team');
var User = require('./user');

describe('Team', function() {
  it('#getTeam', function* () {
    var users = [{id: 1}, {id: 2}];
    this.sandbox.stub(User, 'find', function() {
      return Promise.resolve(users);
    });
    var team = yield Team.getTeam();
    expect(team).to.eql(users);
  });
});
vs. DI:
// team.js
function Team(options) {
  this.options = options;
}

Team.prototype.getTeam = function(teamId) {
  return this.options.User.find({teamId: teamId});
};

function create(options) {
  return new Team(options);
}

// exported so the test below can call Team.create(...)
module.exports.create = create;
test:
// team.spec.js
var Team = require('./team');

describe('Team', function() {
  it('#getTeam', function* () {
    var users = [{id: 1}, {id: 2}];
    var fakeUser = {
      find: function() {
        return Promise.resolve(users);
      }
    };
    var team = Team.create({
      User: fakeUser
    });
    var result = yield team.getTeam();
    expect(result).to.eql(users);
  });
});
Regarding your question: I don't think that there is a common practice in the JS community. I've seen both types in the wild, require modifications (like rewire or proxyquire) and constructor injection (often using a dedicated DI container). However, personally, I think not using a DI container is a better fit with JS. And that's because JS is a dynamic language with functions as first-class citizens. Let me explain that:
Using DI containers enforces constructor injection for everything. It creates a huge configuration overhead, for two main reasons:
Providing mocks in unit tests
Creating abstract components that know nothing about their environment
Regarding the first argument: I would not adjust my code just for my unit tests. If it makes your code cleaner, simpler, more versatile and less error-prone, then go for it. But if your only reason is your unit test, I would not take the trade-off. You can get pretty far with require modifications and monkey patching. And if you find yourself writing too many mocks, you should probably not write a unit test at all, but an integration test. Eric Elliott has written a great article about this problem.
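For illustration, here is what a require-modification test could look like with proxyquire (a sketch, reusing team.js from above; fakeUser is a stand-in):

// team.spec.js - proxyquire swaps './user' for our stub
var proxyquire = require('proxyquire');

var users = [{id: 1}, {id: 2}];
var fakeUser = {
  find: function () {
    return Promise.resolve(users);
  }
};

// every require('./user') inside team.js now yields fakeUser
var Team = proxyquire('./team', { './user': fakeUser });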
Regarding the second argument: This is a valid argument. If you want to create a component that only cares about an interface, but not about the actual implementation, I would opt for a simple constructor injection. However, since JS does not force you to use classes for everything, why not just use functions?
In functional programming, separating stateful IO from actual processing is a common paradigm. For instance, if you're writing code that is supposed to count file types in a folder, one could write this (especially when coming from a language that enforces classes everywhere):
const fs = require("fs");

class FileTypeCounter {
  countFileTypes(dirname, callback) {
    fs.readdir(dirname, function (err, files) {
      if (err) return callback(err);
      // recursively walk all folders and count file types
      // ...
      callback(null, fileTypes);
    });
  }
}
Now if you want to test that, you need to change your code in order to inject a fake fs module:
class FileTypeCounter {
  constructor(fs) {
    this.fs = fs;
  }
  countFileTypes(dirname, callback) {
    this.fs.readdir(dirname, function (err) {
      // ...
    });
  }
}
Now, everyone who is using your class needs to inject fs into the constructor. Since this is boring and makes your code more complicated once you have long dependency graphs, developers invented DI containers where they can just configure stuff and the DI container figures out the instantiation.
However, what about just writing pure functions?
function fileTypeCounter(allFiles) {
  // count file types
  return fileTypes;
}

function getAllFilesInDir(dirname, callback) {
  // recursively walk all folders and collect all files
  // ...
  callback(null, allFiles);
}

// now let's compose both functions
function getAllFileTypesInDir(dirname, callback) {
  getAllFilesInDir(dirname, (err, allFiles) => {
    callback(err, !err && fileTypeCounter(allFiles));
  });
}
Now you have two super-versatile functions out of the box: one doing IO and one processing data. fileTypeCounter is a pure function and super easy to test. getAllFilesInDir is impure, but such a common task that you'll often find it already on npm, where other people have written integration tests for it. getAllFileTypesInDir just composes your functions with a little bit of control flow. This is a typical case for an integration test where you want to make sure that your whole application is working correctly.
By separating your code between IO and data processing, you won't find the need to inject anything at all. And if you don't need to inject anything, that's a good sign. Pure functions are the easiest thing to test and are still the easiest way to share code between projects.
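To see how cheap the pure part is to test, here is a sketch (it assumes fileTypeCounter takes a list of file names and returns a map of extension to count; the real signature isn't spelled out above):

const assert = require("assert");

// fileTypeCounter is the pure function from above;
// assumed shape: ['a.txt', 'b.txt', 'c.js'] -> { txt: 2, js: 1 }
const fileTypes = fileTypeCounter(["a.txt", "b.txt", "c.js"]);
assert.deepStrictEqual(fileTypes, { txt: 2, js: 1 });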
In the past, DI containers as we know them from Java and .NET did not exist in JavaScript. With Node 6 came ES6 Proxies, which opened up the possibility of such containers - Awilix, for example.
So let's rewrite your code to modern ES6.
class Team {
  constructor ({ User }) {
    this.User = User
  }

  getTeam (teamId) {
    return this.User.find({ teamId: teamId })
  }
}
And the test:
import Team from './Team'

describe('Team', function() {
  it('#getTeam', async function () {
    const users = [{id: 1}, {id: 2}]
    const fakeUser = {
      find: function() {
        return Promise.resolve(users)
      }
    }
    const team = new Team({
      User: fakeUser
    })
    const result = await team.getTeam()
    expect(result).to.eql(users)
  })
})
Now, using Awilix, let's write our composition root:
import { createContainer, asClass } from 'awilix'
import Team from './Team'
import User from './User'

const container = createContainer()
  .register({
    Team: asClass(Team),
    User: asClass(User)
  })

// Grab an instance of Team
const team = container.resolve('Team')
// Alternatively...
// const team = container.cradle.Team

// Use it
team.getTeam(123) // calls User.find()
That's as simple as it gets; Awilix can handle object lifetimes as well, just like the .NET / Java containers do. This lets you do cool stuff like injecting the current user into your services, or instantiating your services once per HTTP request.
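A sketch of what that per-request scoping could look like (the Express middleware wiring here is my assumption, not part of the original setup):

import { createContainer, asClass, asValue } from 'awilix'
import Team from './Team'

const container = createContainer()
  .register({
    Team: asClass(Team).scoped() // one instance per scope
  })

// app is an Express app (assumed): one container scope per request
app.use((req, res, next) => {
  req.scope = container.createScope()
  req.scope.register({
    currentUser: asValue(req.user) // inject the current user
  })
  next()
})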

Using a shared module (between separate modules)

So I have a module I've created that does a kind of "state" routing for me. I made my own little version to get my exact intended effect, and it seemed to be working great until I plugged it into separate modules to test.
I inject it into the two separate modules, define the information I need in the .config of each module, then call it in a controller to get my change-state kind of effect.
It had been going pretty well until I plugged it into separate modules; now what seems to be happening is that the module I created to handle all of this is creating separate instances for each module. Let me show you what I mean:
Here is an example of one of the modules using it for testing -
angular
  .module('urlTesting2', ['urlTesting'])
  .config(function($moduleObjectProvider) {
    var callback = function(name, obj) {
      console.log(name, obj);
    };
    $moduleObjectProvider.$get().set("module2", callback)
      .addState("calender", ["day", "week", "month"]);
  })
  .controller("testControl2", function($scope, checkUrl) {
    $scope.addSecond = function() {
      checkUrl.goState("module2", "calender", ["yes", "no", "maybe"]);
    };
  });
So it's injected, and in the config I call the provider and set a new module with states. In the controller I just call goState. This works great when it's just by itself. The issue arises when I add a separate module doing the same thing. I have a fiddle here showing the problem -
https://jsfiddle.net/7hn3ovgz/1/
So - I'd like to test this in my own browser window, but the fiddle seems to be the easiest way to share it. It will not change the actual URL in the browser, but it will still log all the effects.
Basically, what I think is happening is that when I click to change state in a module, it fires twice and looks for the state in the other module too (where it isn't defined). My desired effect was for ALL modules' config to end up in one place: the .set call just adds the object into a variable called currentModules in the provider. It seems like the configs are setting up separate instances (like a closure) of this, instead of pushing everything from set() into one big object for reference.
Apologies if this is unclear; hopefully the fiddle shows the problem clearly enough. Thank you for taking the time to read.
Seems like the issue is the injector for the provider: every time it is called, it creates a new instance of that function. So all you should have to do is switch
function $moduleObjectProvider() {
  var currentModules = {};

to

var currentModules = {};

function $moduleObjectProvider() {
or restructure the provider so it isn't an injected function, if possible.
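To make the fix concrete, here is a minimal sketch of the provider with the registry hoisted out (the internals are assumptions; the full provider isn't shown in the question):

// shared registry: created once, visible to every injector call
var currentModules = {};

function $moduleObjectProvider() {
  this.$get = function () {
    return {
      set: function (name, callback) {
        currentModules[name] = { callback: callback, states: {} };
        return this; // allow chaining .addState(...)
      },
      addState: function (state, params) {
        // details assumed; the real implementation isn't shown
        return this;
      }
    };
  };
}

angular.module('urlTesting', [])
  .provider('$moduleObject', $moduleObjectProvider);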

How can requirejs provide a new "class" instance for each define with properties based on the filename?

The use case is implementing a mediator pattern where each instance is a postal.js channel.
Explaining it with the actual code would require some postal.js knowledge, so my example below is conceptually similar but only uses requirejs.
// animalSounds.js
define([], function() {
  return {
    cat: 'meow',
    dog: 'woof'
  };
});

// animalFactory.js ... the requirejs plugin
define(['animalSounds'], function(animalSounds) {
  function animalClass(name) {
    this.sound = animalSounds[name];
  }
  return {
    load: function(name, req, onload, config) {
      onload(new animalClass(name));
    }
  };
});

// cat.js
define(['animalFactory'], function(animalInstance) {
  animalInstance.sound === 'meow'; // should be true
});

// dog.js
define(['animalFactory'], function(animalInstance) {
  animalInstance.sound === 'woof'; // should be true
});
Basically it's a factory that returns a new instance whenever it's required. That part I can create fine. What I'm struggling with is getting dynamic properties from other sources based on which module is requiring it. While I could define the key as a string in each file, I'm hoping to use the filename as a key to stay DRY.
I would just minimally break the DRY principle and have cat, for instance, like this:
define(['animalFactory!cat'], function(animalInstance) {
  animalInstance.sound === 'meow'; // should be true
});
The fact of the matter is that while it would be possible to do what you want using a loader plugin, the resulting code would be complicated and would obscure the logic of your application. For instance, requiring animalFactory multiple times and expecting different results (which is what you have in your question) goes against one of the fundamental principles of RequireJS: requiring the same module name over and over again should return the same value over and over again.
Note that it is possible to require the module module (it is a special module) and then use module.id to learn the name of the module you are in, but unfortunately this is not available for building the dependency list of your define.
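For illustration, the special module dependency looks like this (note it only works inside the module's own factory, not in the dependency array):

// cat.js - module.id is the resolved name of this module
define(['module', 'animalSounds'], function (module, animalSounds) {
  return {
    sound: animalSounds[module.id] // 'cat', if the file maps to 'cat'
  };
});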

Pattern for sharing a library between angularjs and node.js

How can I share a library between angularjs and node.js?
For example an angularjs service is often a reusable piece of code. Let's take a URL library as an example (pick apart and construct URLs).
The same library should be usable in node.js.
My constraint is that I want to share the library code, but I do not want to restrict myself to any loader library on the browser side. So if I need to use RequireJS in the browser, I need to be able to disable the loading part so it can be controlled elsewhere.
So how do I share code?
What you'll see in a lot of places that support multiple environments is the entire returned value of your 'service' wrapped in a closure and passed as a parameter to an initialization function. The one catch to keep in mind with Angular is that the service probably shouldn't have other dependencies, so it remains environment-agnostic (if this were a simple utility file, for example, there would likely be no conflict).
As an example, consider:
(function (myService) {
  if (typeof module !== 'undefined' && module.exports) {
    module.exports = myService;
  } else if (typeof angular !== 'undefined') {
    angular.module('yourModule', [])
      .factory('serviceNameHere', function () { return myService; });
  } else {
    window.myService = myService;
  }
}(function () {
  function foo() { /* Do something */ }
  function bar() { /* Do something else */ }
  return {
    foo: foo,
    bar: bar
  };
}()));
You could still have dependencies if desired, via Node's require syntax or Angular's dependency injection, but the service would likely need modification as it moved from one project to another.
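Consuming the wrapped service then looks the same as any other dependency in both environments (a sketch, assuming the file above is saved as my-service.js):

// Node
var myService = require('./my-service');
myService.foo();

// Angular (after the script is included on the page)
angular.module('app', ['yourModule'])
  .controller('MainCtrl', function (serviceNameHere) {
    serviceNameHere.foo();
  });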

Node.JS - Using prototype in a module

So I'm writing a whole bunch of vendor-specific files in Node which all have a similar controller pattern, so it makes sense for me to cut them out and put them into a common file.
You can see my common controller file here: https://gist.github.com/081a04073656bf28f46b
Now when I use them in my multiple modules, each consecutively loaded module overwrites the first. This is because the file is only required once and passed dynamically through to each module on load (this allows me to add extra modules, and these modules are able to add their own routes, for example). You can see an example module here: https://gist.github.com/2382bf93298e0fc58599
You can see on line 53 that I've realised we need to create a separate instance every time, so I've tried to create a new instance by copying the standardControllers object into a new object, then initialising the new object. This has zero impact on the code; it behaves in exactly the same way.
Any ideas guys? I'm in a bit of a jam with this one!
First thing I'd do is try to make things simpler and reduce coupling by invoking the single responsibility principle, et al.
http://www.codinghorror.com/blog/2007/03/curlys-law-do-one-thing.html
Put those Schemas into their own files, e.g.
models/client.js
models/assistant.js
models/contact.js
I've also found that embedded docs + mongoose is generally a PITA. I'd probably promote all those to top level docs.
You don't need to enclose your object's keys in quotes.
routes = {
  list: function() {} // no quotes is a-ok
}
Also 'list' in typical REST apps is called 'index'. Anyway.
OK, I'd break this up differently. Since you're requiring stuff from the index.js file in the middleware, they become tightly coupled, which is bad. In fact, I think I'd rewrite this whole thing so it's tidier. Sorry.
I'd probably replace your 'middleware' file with an express-resource controller:
https://github.com/visionmedia/express-resource (built by the author of Express). This is a good framework for RESTful controllers, such as what you're building. The auto-loader is really sweet.
You may also want to look at: http://mcavage.github.com/node-restify/ It's new, I haven't tried it out, but I've heard good things.
Since what you're building is basically an automated mongoose CRUD system with optional overriding, I'd create an express-resource controller as your base:
/controllers/base_controller.js
and it might look like
var BaseController = function() {} // BaseController constructor

BaseController.prototype.index = function() {
  // copy from your middleware
}

BaseController.prototype.show = function() {
  // copy from your middleware
}

BaseController.prototype.create = function() {
  // copy from your middleware
}

// etc

module.exports = BaseController
Then I'd do something like:
/controllers/some_resource_controller.js
which might look something like:
var BaseController = require('./base_controller')

var NewResourceController = function() {
  // Apply BaseController constructor (i.e. call super())
  BaseController.apply(this, arguments)
}

NewResourceController.prototype = new BaseController()

NewResourceController.prototype.create = function() {
  // custom create method goes here
}

module.exports = NewResourceController
Then to use it, you can do:
var user = app.resource(myResourceName, new NewResourceController());
…inside some loop which sets myResourceName to whatever CRUD resource you're trying to set up.
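That loop could look something like this sketch (the resource names and controller files are assumptions):

// wire each resource name to its controller
var resources = {
  clients: require('./controllers/client_controller'),
  contacts: require('./controllers/contact_controller')
};

Object.keys(resources).forEach(function (myResourceName) {
  var Controller = resources[myResourceName];
  app.resource(myResourceName, new Controller());
});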
Here are some links for you to read:
http://tobyho.com/2011/11/11/js-object-inheritance/
http://yehudakatz.com/2011/08/12/understanding-prototypes-in-javascript/
Also, it sounds like you're not writing tests. Write tests.
http://www.codinghorror.com/blog/2006/07/i-pity-the-fool-who-doesnt-write-unit-tests.html
