I am trying to get the knockout.mapping.merge library working in Node, but I cannot work out how to extend ko objects within the module scope, as I am unable to extend the underlying ko instance.
So, for example, my module exports all the methods relating to the mapping.merge logic, but there is also some logic that extends the base observables to give them the ability to register their merge preferences:
knockout.observable.fn.withMergeMethod = function(method) {
    this.mergeMethod = method;
    return this;
};
knockout in the snippet above would be the result of require("knockout"), but that instance is only available within this module's scope. I was thinking I could expose some sort of init method that is passed the knockout instance so I can enrich it in a Node environment, but I was wondering if there is a better solution?
It seems like Node's require only ever holds one instance of a given dependency, so if you extend it in one module those changes will apply to every module that requires it:
Modules initialization in node.js and persisting their state between requests
Or so says that question, although I am not sure whether that is a nice way to go in Node, so I will wait and see if there are any better answers.
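For illustration, a minimal sketch of what that would look like, assuming the merge module simply requires knockout and extends the shared instance (the file names here are hypothetical):
// knockout-mapping-merge.js (hypothetical file name)
var ko = require("knockout");

// Node caches the module, so this fn extension is visible to every
// other module that also calls require("knockout").
ko.observable.fn.withMergeMethod = function(method) {
    this.mergeMethod = method;
    return this;
};

module.exports = {
    // ...merge-related methods exported here
};

// consumer.js (hypothetical) - gets the same cached knockout instance
var ko = require("knockout");
var name = ko.observable("foo").withMergeMethod("merge"); // extension is available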
Coming from a C# background, I used interfaces as the basis for my mock objects. I created custom mock objects myself, basing the mock implementation on a C# interface.
How do you do something like this in JS or Node? Create an interface that you can "mock" off of, and that also serves as the contract for the real class that implements it? Does this even make sense in JS or the Node world?
For example, in Java it's the same deal: define an interface with method stubs and use that as the basis for both the real class and the mock class.
Unfortunately you're not going to find a standard interface construct as part of JavaScript. I've never used C#, but I've used Java, and correct me if I'm wrong, but it looks like you're talking about creating interfaces and overriding methods both for mock-testing purposes and so that those interfaces can be implemented by real classes.
Because this isn't a standard JavaScript feature, I think you'll find that there are going to be a lot of very broad answers here. However, to get an idea of how some popular libraries approach this, I might suggest looking at how AngularJS handles mock testing (there are many resources online; as a starting point, look at how they use the ngMock module with Karma and Jasmine).
Also, because of JavaScript's very flexible nature, you'll find that you can override any sort of "class method" (that is, any function object that is a member of another object, whether that be a new'ed "class" or a plain object) by simply re-implementing it wherever you need to... there's no special syntax for it. To understand where and how you would accomplish this, I'd suggest looking from the ground up at how JavaScript uses prototypal inheritance. A starting point might be an example like this:
function Cat(config) {
    if (typeof config !== 'undefined') {
        this.meow = config.meow; // where config can possibly implement certain mock methods
    }
}

Cat.prototype = {
    meow: function() {
        // the meow that you want to use as part of your "production" code
    }
};

var config = {};
config.meow = function() {
    // some mock "meow" stuff
};

var testCat = new Cat(config); // this one will use the mock "Cat#meow"
var realCat = new Cat(); // this one will use the prototype "Cat#meow"
In the above example, because of how JavaScript looks up the prototype chain, if it finds an implementation on the instance itself, it stops there and uses that method (thus, you've "overridden" the prototype method). However, in that example, if you don't pass in a config, the lookup falls through to the prototype for the Cat#meow method and uses that one.
TL;DR: there's not one good way to implement JavaScript interfaces, especially ones that double as mocks (there's not even a best way to implement dependency injection; that's also a foreign concept to JavaScript itself, even though many libraries do successfully implement it for certain use-cases).
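That said, if you just need a lightweight stand-in for an interface, a simple duck-typing check is a common approach. Here's a minimal sketch (the helper and the names are made up for illustration):
// a tiny duck-typing "interface" check: verify an object exposes the
// methods you expect before using it as a real or mock implementation
function implementsInterface(obj, methodNames) {
    return methodNames.every(function(name) {
        return typeof obj[name] === 'function';
    });
}

var catInterface = ['meow'];
var mockCat = { meow: function() { /* some mock "meow" stuff */ } };

console.log(implementsInterface(mockCat, catInterface)); // true
console.log(implementsInterface({}, catInterface));      // false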
I struggle to find a satisfying solution for how to expose service instances whose methods need to be accessed from multiple parts of my application.
The situation
First things first, by a 'service', I mean an instance of a function that holds properties & methods which are exposed through an API.
Consider a REST service whose purpose is to provide convenient methods for accessing REST endpoints on a server. I would make the following assumptions about that service:
It must be available throughout the application. It is likely that as the app grows, there will be new components that need access.
There is no need of multiple instances of this service. We can consider it a singleton.
My solutions
I can think of 2 possible solutions:
Concatenating scripts & utilizing the global object
I could combine all my script files (e.g. rest.service.js, app.js) into a single file and create an object property on the global object (like App).
Then, I could attach service instances to this object. This allows me to do something like this from everywhere within the app:
App.restService.get()
However, even if I wrap each service in an IIFE, I still have to add some variables to window in order to retrieve the instances.
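A minimal sketch of that approach, with a hypothetical restService (the names are only illustrative):
// concatenated bundle (e.g. app.js) - everything hangs off one global
window.App = window.App || {};

App.restService = (function() {
    // private state stays inside the IIFE
    var baseUrl = '/api';

    return {
        get: function(path) {
            // perform the request, e.g. via XMLHttpRequest or a library
        }
    };
})();

// anywhere else in the bundle:
// App.restService.get('/users');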
Using CommonJS / AMD modules
I could require() my service instances from everywhere by using RequireJS / Browserify.
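A minimal sketch of the CommonJS variant, assuming a hypothetical rest.service.js. Because Node/Browserify cache modules, every require() of this file returns the same object, which gives the singleton behaviour for free:
// rest.service.js (hypothetical) - exports one shared instance
var baseUrl = '/api';

module.exports = {
    get: function(path) {
        // perform the request against baseUrl + path
    }
};

// any other file in the bundle:
// var restService = require('./rest.service');
// restService.get('/users'); // same cached instance everywhere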
The issues
Now I have a headache because, on the one hand, people are telling me that polluting the global object is bad practice, and that singletons are bad practice too.
On the other hand, we put a lot of effort into 'uglifying' scripts, where every byte saved counts as an improvement. Using Browserify, though, would lead to the same script being injected into multiple bundles (I'm using web components, so I've got a lot of isolated scripts). Not to mention the fact that I have no idea how to provide a state-safe service using Browserify.
So how should I approach this problem?
How should I expose standard services that may or may not be instantiated multiple times? How should I implement state-safe ones?
Just a starting point (this is too long to be a comment): I really enjoy the strategy used by AngularJS, where you always register services within a container (a module), and every time you register something you also specify which dependencies should be injected into it:
angular.module('myApp.services', []); // the second argument is the list of module dependencies (an empty array here)
At any point, you can retrieve your modules and add functionalities:
var services = angular.module('myApp.services');
services.factory('yourServiceName', [
    'other', 'service', 'dependencies', // names of the injected dependencies
    function(other, service, dependencies) {
        other.doStuff();
        service.doStuff();
        dependencies.doStuff();
        // ...
        return {
            // a factory must return the service instance it creates
        };
    }
]);
You can then inject your module into other modules:
var myApp = angular.module('myApp', ['myApp.services']); // depend on the module that registers the service
In Angular, the app is bootstrapped by the framework itself, but I guess you can develop an entry point for your app so that you can use your services.
Unfortunately, I do not know exactly how this pattern is implemented internally; probably all the modules are stored within an instance of the application, so the global namespace is not polluted.
This problem confuses me a lot too. I think there are two points I can make:
1) There must be an entry point for each service in the global scope, otherwise it is impossible to reach the one you need from everywhere. It's not good to add many things to the global object, but I think a service reference is one that deserves it.
2) Configure the service object instead of creating new instances; for example, there can be a single ajax service object that is given different configurations to do different things. Services are objects, so they can be merged and extended.
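As a rough sketch of point 2, with made-up names and the actual request left out:
// one shared ajax service, configured per call rather than re-instantiated
var ajaxService = {
    defaults: { baseUrl: '/api', timeout: 5000 },

    request: function(options) {
        // merge the per-call configuration over the defaults
        var config = Object.assign({}, this.defaults, options);
        // ...perform the request using `config`
    }
};

// the same object, used with different configurations:
ajaxService.request({ url: '/users' });
ajaxService.request({ url: '/reports', timeout: 30000 });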
This is an interesting topic and I would like to see more opinions about it, not just on the management of services but also of other resources like templates, objects, files, etc.
I've been tinkering with AngularJS and I've built up a small collection of directives and services that I would like to package into a single JS file so that I can use them anywhere.
I have some website specific settings that my module will need for API calls and that sort of thing. I'm just wondering what the Angular way of making configurable modules is. Obviously I don't want to have to modify my reusable JS file for each website, as that kind of defeats the purpose of having it. Seeing as the values are going to remain the same for each website it seems like an awful lot of hassle to pass them in as an argument on each function call, and I'd rather stay away from global variables as much as possible.
I've searched a lot of questions for the answers I seek, and the closest pattern I've found so far is to have my reusable module depend on a module called "settings" (or something similar) that it does not include itself, and then define that module in each website's page-specific JS file, allowing the reusable module to pull the values from it. Here's a rough example to show what I mean.
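Something along these lines, where the module and constant names are made up for illustration:
// reusable.js - the reusable module depends on a "settings" module
// that it does not define itself
angular.module('reusable', ['settings'])
    .factory('apiClient', ['apiBaseUrl', function(apiBaseUrl) {
        return {
            get: function(path) {
                // ...call apiBaseUrl + path
            }
        };
    }]);

// site-specific app.js - each website defines the "settings" module
angular.module('settings', [])
    .constant('apiBaseUrl', 'https://example.com/api');

angular.module('myApp', ['reusable']);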
This seems backwards to me. It's kind of like having a function pull values from global values instead of passing the values in as arguments.
Is this really the best way of doing this, or is there an alternative?
It sounds like you're looking for a provider.
You should use the Provider recipe only when you want to expose an API for application-wide configuration that must be made before the application starts. This is usually interesting only for reusable services whose behavior might need to vary slightly between applications.
Here's a very basic example of a provider:
myMod.provider('greeting', function() {
    var text = 'Hello, ';

    this.setText = function(value) {
        text = value;
    };

    this.$get = function() {
        return function(name) {
            alert(text + name);
        };
    };
});
This creates a new service, just like you might with myMod.service or myMod.factory, but provides an additional API that is available at config time—namely, a setText method. You can get access to the provider in config blocks:
myMod.config(function(greetingProvider) {
    greetingProvider.setText("Howdy there, ");
});
Now, when we inject the greeting service, Angular will call the provider's $get method (injecting any services it asks for in its parameters) and give you whatever it returns; in this case, $get returns a function that, when called with a name, will alert the name prefixed with whatever we've set with setText:
myMod.run(function(greeting) {
    greeting('Ford Prefect');
});
// Alerts: "Howdy there, Ford Prefect"
This is exactly how other providers, like $httpProvider and $routeProvider, work.
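For comparison, configuring $routeProvider in a config block follows exactly the same shape (a small sketch that assumes the ngRoute module is loaded; the template and controller names are made up):
myMod.config(function($routeProvider) {
    // only the provider is injectable here; the $route service itself
    // is not available until after the config phase
    $routeProvider
        .when('/users', { templateUrl: 'users.html', controller: 'UsersCtrl' })
        .otherwise({ redirectTo: '/users' });
});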
For more information on providers and dependency injection in general, check out this SO question on dependency injection.
I'm currently using a mediator that sits in between all my modules and allows them to communicate with one another. All modules must go through the mediator to send out messages to anything that's listening. I've been doing some reading on RequireJS but I've not found any documentation on how best to facilitate communication between modules.
I've looked at signals but if I understand correctly signals aren't really that useful if you're running things through a mediator. I'm just left wondering what else I could try. I'm quite keen on using a callback pattern of some kind but haven't got past anything more sophisticated than a simple lookup table in the mediator.
Here's the signal implementation I found: https://github.com/millermedeiros/js-signals
Here's something else I found: http://ryanflorence.com/publisher.js/
Is there a standardized approach to this problem or must everything be dependency-driven?
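For context, the mediator I have is essentially just a lookup table mapping event names to callbacks, along the lines of this simplified sketch (the names are illustrative):
// mediator.js - a simplified lookup-table mediator as an AMD module
define(function() {
    var channels = {};

    return {
        subscribe: function(channel, callback) {
            (channels[channel] = channels[channel] || []).push(callback);
        },
        publish: function(channel) {
            var args = Array.prototype.slice.call(arguments, 1);
            (channels[channel] || []).forEach(function(callback) {
                callback.apply(null, args);
            });
        }
    };
});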
Using a centralized event manager is a fairly common and pretty scalable approach. It's hard to tell from your question what problem, if any, you're having with an events model. The typical thing is as follows (using publisher):
File 1:
require(['publisher', 'module1'], function(Publisher, Module1) {
    var module = new Module1();
    Publisher.subscribe('globaleventname', module.handleGlobalEvent, module);
});
File 2:
require(['publisher', 'module2'], function(Publisher, Module2) {
    var module = new Module2();
    module.someMethod = function() {
        // method code
        // when this method needs module1 to run its handler
        Publisher.publish('globaleventname', 'arguments', 'to', 'eventhandlers');
    };
});
The main advantage here is loose coupling; rather than objects knowing methods of other objects, objects can fire events and other objects know how to handle that particular application state. If an object doesn't exist that handles the event, no error is thrown.
What problems are you having with this approach?
Here's something you might want to try out:
https://github.com/naugtur/overlord.js
It can do a bit more than an ordinary publisher or mediator. It allows creating a common API for accessing any methods of any modules.
This is kind of a shameless plug, because it's my own tool, but it seems quite relevant to the question.
Support for require.js has been added.
I am looking for ways to build a system where I do not need to load all source files in order to run the application. My past project had over 200 .js files (I am not kidding!) and reloading the page to test a feature you are developing was really slow.
I looked into Dojo and I see how they have built a dynamic loader. Basically you just load a single core component, then everything else will be loaded when needed.
I am thinking about implementing a factory method in my application that allows me to construct new instances of objects in JavaScript:
var user = MyApp.create('MyApp.model.User');
instead of:
var user = new MyApp.model.User();
The reason I'd like to ditch the new keyword is that the former approach allows me to load the component lazily if it does not already exist. The factory method can simply check whether the target constructor is defined and, if it is not, load it first.
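A rough sketch of what such a factory could look like; MyApp.loadScript and the way class names map to files are hypothetical:
var MyApp = MyApp || {};

// hypothetical factory: resolves a dotted class name, lazy-loading the
// defining script first if the constructor is not present yet
MyApp.create = function(className) {
    var resolve = function() {
        return className.split('.').reduce(function(obj, part) {
            return obj && obj[part];
        }, window);
    };

    var Constructor = resolve();
    if (typeof Constructor !== 'function') {
        MyApp.loadScript(className); // hypothetical synchronous loader
        Constructor = resolve();
    }
    return new Constructor();
};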
The only problem I am facing with that is that IDEs no longer understand that user is of type MyApp.model.User, which is certainly not a good thing.
Is there a way to solve this dilemma? Can I somehow document that factory method with JSDoc?
If your factory method returns various types of objects based on its argument, then you can't document the return value of the factory method itself (using @returns) in a way that IDEs can make sense of. At least I'm not aware of any way to do it.
But you can solve your problem, easily, by annotating the variable which holds the object, like this:
/**
 * @type {MyApp.model.User}
 */
var user = MyApp.create('MyApp.model.User');
Of course, I don't know if your particular IDE can interpret this. Mine can (using Komodo).