Event-Driven Node.js App Architecture with Lifecycle Hooks - javascript

I have a general question about developing a Node.js app with its own lifecycle and events to track several activities globally.
What do I need? Any:
- References
- Tips and tricks
- Blueprints
- Frameworks
Background
I have developed a backend with a database, authentication, and file upload. I built it very pragmatically and it works fine. But now I want to design a new, better architecture that is event-driven to track activities and uses lifecycle hooks to initialize modules.
I have been researching libraries, which seem great, but I need to know if I am on the right track.
Here is the current backend, which I want to rebuild from the core:
#fractools/node
Goal
The node needs to be more abstract and event-driven to track activities more globally for the whole process.
I was thinking about a core module which defines the foundation for events and hooks that listen to the whole application process.
Other modules like database handling or authentication can then be wrapped onto it.
My questions
How can I have a helper like an activity logger that listens throughout the whole app, so I don't need to import a logger in every single action? Is that possible? Does it make any sense?
Example Case
The core module tracks database activities:
The system puts data into the database; the core listens to this, tracks it, and pushes the information into a logger, which itself uses a database.
Issue
I am stuck, for example, on one simple architecture problem:
If I have a Database class, and I want the Core, which extends EventEmitter, to listen to database actions, I can't have a logger inside Core that uses the database, because I can't reference those modules to each other at once (a circular dependency):
// Core Module
const EventEmitter = require('events');
const Database = require('./database'); // circular: database.js also requires this file

module.exports = class Core extends EventEmitter {
  constructor() {
    super();
    this.logger = new Database('Logger');
    this.on('mounted', () => {
      this.logger.put('instance mounted');
    });
    this.on('putData', () => {
      this.logger.put('dataPut');
    });
    this.emit('mounted');
  }
}
// Database Module
const Core = require('./core'); // circular: core.js also requires this file
const PouchDB = require('pouchdb'); // Which database lib to use is not important yet

let core = new Core();

module.exports = class Database {
  constructor(name, options) {
    this.id = generateID();
    this.name = name;
    this.created = JSON.stringify(new Date()); // TODO doubled quotes
    this.options = options || {};
    this.db = new PouchDB(name);
    core.emit('mounted');
  }
  put(data) {
    this.db.put(data);
    core.emit('dataPut');
  }
}
So I noticed I need some kind of starting point. :s
I have had a look into the Node.js EventEmitter and found out that the Node.js process object is itself built on EventEmitter.
I have also looked at several tutorials on Node.js lifecycle events, but they don't really help me start the architecture.
I hope some of you have good references or 'blueprints' that help me develop a good core for my node. :)
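To make the question more concrete, here is a rough sketch of the direction I imagine (this is only a sketch, and the module names are made up): a standalone event bus that both the database and the logger require, so that neither module has to require the other:

// bus.js - a standalone, shared event emitter that no other module owns
const EventEmitter = require('events');
module.exports = new EventEmitter();

// database.js - emits on the bus, never requires the core or the logger
const bus = require('./bus');
const PouchDB = require('pouchdb');

module.exports = class Database {
  constructor(name) {
    this.name = name;
    this.db = new PouchDB(name);
    bus.emit('mounted', name);
  }
  put(doc) {
    bus.emit('dataPut', this.name);
    return this.db.put(doc);
  }
};

// logger.js - subscribes once, globally, so no single action has to import a logger
const bus = require('./bus');
const Database = require('./database');

const log = new Database('Logger');
bus.on('dataPut', (name) => {
  if (name === 'Logger') return; // guard: don't log the logger's own writes
  log.put({ _id: new Date().toISOString(), event: 'dataPut', source: name });
});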
Have a nice day so far and stay safe!
Greetings

Related

How do I correctly make an entry point to my module, which contains multiple classes?

I've started to develop a desktop app with Node and Electron. It has a package which implements a connection to some API. It is structured as one base class and some derived classes, in this way:
ApiBase
  ApiAuth extends ApiBase
    ApiAuth.login()
    ApiAuth.logout()
    etc...
  ApiTasks extends ApiBase
    ApiTasks.getTaskList()
    etc...
  etc...
And now I want to make a nice and convenient way to use these classes in my app. So I need to create some entry point which will provide access to my API implementation. But I do not have much experience to get this right.
I thought about something like this:
index.js:
const ApiAuth = require('./amazing-time-api-auth');
const ApiTasks = require('./amazing-time-api-tasks');

const apiAuth = new ApiAuth('www.sample-host.com');
const apiTasks = new ApiTasks('www.sample-host.com');

module.exports = {
  login: apiAuth.login,
  logout: apiAuth.logout,
  getTaskList: apiTasks.getTaskList,
  etc...
}
somewhere in the app:
const api = require("./lib/someApi");
// need to get task list for some reason
api.getTaskList(param1, param2)
But there are some problems I ran into with this approach:
- it is a problem to pass the host param to the constructors in index.js dynamically
- I am not sure if creating these instances every time index.js is required is the right thing
So I want to know about some approaches I can use here, because I do not even know where to start researching. Thank you.
I think that you identified some of the most crucial decisions with this:
"it is a problem to pass the host param to the constructors in index.js dynamically"
IMO configuration and the interface are important considerations. Even though it can be refactored after the fact, an easy-to-configure and easy-to-consume interface will help drive adoption of your library. As you pointed out, the configuration is static right now and very brittle, i.e. a change to the URL will cascade to all clients and require all of them to update.
A first intuitive alternative may be to allow dynamic configuration of the current structure:
const apiAuth = new ApiAuth(process.env.API_AUTH_URL || 'www.sample-host.com');
const apiTasks = new ApiTasks(process.env.API_TASKS_URL || 'www.sample-host.com');
While this allows clients to dynamically configure the URL, the configuration is "implicit". IMO this is unintuitive and difficult to document, and it requires a client to look in the code to see the environment variables and the instantiation flow.
I would favor exposing these classes to the client directly. I would consider this approach "explicit", as it forces the client to explicitly configure/instantiate your components. It's like providing your clients with primitives and allowing them to compose, build, and configure them in whatever way they want:
const ApiAuth = require('./amazing-time-api-auth');
const ApiTasks = require('./amazing-time-api-tasks');

module.exports = {
  auth: ApiAuth,
  tasks: ApiTasks
}
This automatically namespaces the API behind its functions (auth|tasks) AND requires that the client instantiate the classes before using them:
const api = require("./lib/someApi");
const auth = new api.auth(process.env.SOMETHING, 'some-url');
This pulls the configuration further out in the architecture. It forces the client to decide how it wants to get the URL and to explicitly instantiate the library. What if one of your clients doesn't use login/logout? This approach may be more flexible in that case.
"I am not sure if creating these instances every time index.js is required is the right thing"
If instantiation should remain hidden, another alternative would be to provide a builder function in order to encapsulate it:
const ApiAuth = require('./amazing-time-api-auth');
const ApiTasks = require('./amazing-time-api-tasks');

module.exports = {
  auth: {
    build: (url) => {
      return new ApiAuth(url);
    }
  },
  tasks: {
    build: (url) => {
      return new ApiTasks(url);
    }
  }
}
This still hides each class but allows the client to decide how it configures each one:
const api = require("./lib/someApi");
const auth = api.auth.build('my-url');
auth.login();

Reusing Database Connections With Azure Functions Using JavaScript

I cannot find clear information on how to manage database connections (MongoDB in my case) from an Azure Function written in JavaScript.
The Microsoft document below says not to create a connection for each invocation of the function; in C#, using the .NET Framework Data Provider for SQL Server, this is done with static variables, and pooling is handled by the client connection. It does not describe how to do this in JavaScript.
https://learn.microsoft.com/en-us/azure/azure-functions/manage-connections
A solution of creating a global variable to hold the database client between invocations is described here, but the author is not confident it is the correct way to do it:
http://thecodebarbarian.com/getting-started-with-azure-functions-and-mongodb.html
Has anyone used this in production, or do you understand whether this is the correct approach?
Yes, there's a very close equivalence between C#/SQL storing a single SqlConnection instance in a static variable and JS/MongoDB storing a single Db instance in a global variable. The basic pattern for JS/MongoDB in Azure Functions is as follows (assuming you're up to date with async/await; alternatively you can use callbacks as per your linked article):
// getDb.js
const { MongoClient } = require('mongodb');
// for example, read the connection string from the Function App's settings
const uri = process.env.MONGODB_URI;

let dbInstance;
module.exports = async function () {
  if (!dbInstance) {
    dbInstance = await MongoClient.connect(uri);
  }
  return dbInstance;
};
// function.js
const getDb = require('./getDb.js');

module.exports = async function (context, trigger) {
  let db = await getDb();
  // ... do stuff with db ..
};
This will mean you only instantiate one Db object per host instance. Note this isn't one per Function App: if you're using a dedicated App Service plan, there will be as many instances as you've specified in the plan, and if you're using a Consumption plan, it'll vary depending on how busy your app is.

How do I access an object in the main process from a renderer process? [Electron]

I'm trying to create a tool for editing files that contain an object related to my company's business logic. I'm using Electron to do so.
I've created a JavaScript class which represents the object, handles its internals, and provides business functions on it:
class Annotation {
  constructor() {
    this._variables = []
    this._resourceGenerators = []
  }
  get variables() {
    return this._variables
  }
  get resourceGenerators() {
    return this._resourceGenerators
  }
  save(path) {
    ...
  }
  static load(path) {
    ...
  }
};
module.exports = Annotation;
I create the object in my main process, and I have an event handler which gives renderer processes access to it:
const { ipcMain } = require('electron')
const Annotation = require('./annotation.js');

// ... Do electron window stuff here ...

var annotation = new Annotation()
ipcMain.on('getAnnotation', (event, path) => {
  event.returnValue = annotation
})
I've just found out that sending an object through ipcMain.sendSync uses JSON.stringify to pass the annotation, meaning it loses its getters/functions.
I'm fairly new to web/Electron development; what is the proper way of handling this? Previously I had handlers in main for dealing with most of the functions that the renderer processes needed, but main started to become very bloated, so I'm trying to refactor it somewhat.
TL;DR: Reconstruct the object on the receiver side.
Description: Electron's architecture is based on multiple processes, separating the main (Node.js) process from each renderer (Chromium) process, and allowing communication between processes via its IPC mechanism. For several reasons (efficiency, performance, security, etc.), Electron's out-of-the-box IPC only allows serializable POJOs to be sent and received. Once the receiver has that data, you may need to reconstruct the desired object from it.
If your intention is to share references like a true singleton, that's not available.
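For illustration, a minimal sketch of that reconstruction on the renderer side (fromJSON is a hypothetical helper added to Annotation, and the path argument is a placeholder):

// annotation.js - hypothetical addition inside the Annotation class:
// rebuild a full instance from the plain data that came over IPC
static fromJSON(plain) {
  const annotation = new Annotation();
  annotation._variables = plain._variables;
  annotation._resourceGenerators = plain._resourceGenerators;
  return annotation;
}

// renderer process
const { ipcRenderer } = require('electron');
const Annotation = require('./annotation.js');

// the sync reply arrives as a POJO: getters and methods are stripped
const plain = ipcRenderer.sendSync('getAnnotation', '/some/path');
const annotation = Annotation.fromJSON(plain); // methods and getters work again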
The first thing I would suggest is that in most cases you don't need to transfer anything to the main process. The main process is mostly for creating windows and accessing Electron APIs which are restricted to the main process. Everything else can and should be done from the renderer, including access to all Node modules. You can write files, access databases, etc. all from the renderer.
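For example, assuming nodeIntegration is enabled for the window, the renderer can own the object outright (a sketch using the question's own Annotation class, with no IPC round trip):

// renderer process - the object lives here, so its getters/functions stay intact
const Annotation = require('./annotation.js');

const annotation = new Annotation();
// ... edit annotation via its getters and business functions ...
annotation.save('annotation.json'); // the class's own save(path)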
Read this article about the differences between the main and renderer processes and what you should be using each for.

Should I cache Firebase refs?

I'm developing a web app backed by the Firebase Realtime Database.
The app's frontend is quite complex and there are several methods that write data to the db. I have several utils that look like this:
var utils = {
  setSomething: function(id, item) {
    var myRef = firebase.database().ref('my/path');
    myRef.set(item).then(something);
  }
}
The question here is: is it okay to create a new ref inside the method (thereby creating a new ref with each call), or should I "cache" the ref somewhere else (just like we cache jQuery objects)?
I could do something like this first:
var cachedRefs = {
  myRef: firebase.database().ref('my/path'),
  yourRef: firebase.database().ref('your/path'),
  herRef: firebase.database().ref('her/path')
}
And then the former method could be rewritten as:
var utils = {
  setSomething: function(id, item) {
    cachedRefs.myRef.set(item).then(something);
  }
}
Is there any performance gain besides having less code repetition?
Firebaser here.
References just contain the location in the database; they are cheap.
Adding the first listener to a reference requires that we start synchronizing the data, so that is as expensive as the data you listen to. Adding extra listeners is then relatively cheap, since we de-duplicate the data synchronization across listeners.
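To make that concrete, a small sketch (the path and the callbacks are made up):

// cheap: a ref is just a location, no network activity yet
var ref = firebase.database().ref('my/path');

// first listener: starts synchronizing the data at 'my/path'
ref.on('value', function(snap) { renderItem(snap.val()); });

// second listener on the same location: relatively cheap, because the
// underlying data synchronization is shared between both listeners
ref.on('value', function(snap) { logItem(snap.val()); });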

Making a RESTful API call from React.js

I am doing a POC for an isomorphic JavaScript application to render HTML from the server side. The POC works with simple HTML, but I want to make an API call, get the JSON response, and pass it to the render function. I tried various ways but it is not working.
What am I missing? I am very new to React.js.
loadCategoriesFromServer: function() {
  var self = this;
  // get walking directions from central park to the empire state building
  var http = require("http");
  var url = "api url here";
  var request = http.get(url, function (response) {
    // data is streamed in chunks from the server,
    // so we have to handle the "data" event
    var buffer = "",
        data,
        route;
    response.on("data", function (chunk) {
      buffer += chunk;
    });
    response.on("end", function (err) {
      data = JSON.parse(buffer);
      //console.log(data.d);
      //console.log(data.d.Items);
      self.setState({
        categories: data.d.Items
      });
    });
  });
}, // load from server end
getInitialState: function() {
  return { categories: [] };
},
componentWillMount: function() {
  console.log("calling load categories")
  this.loadCategoriesFromServer();
},
render: function () {
  //console.log("data");
  //console.log(this.state.categories);
  var postNodes = this.state.categories.map(function (cat) {
    console.log(cat);
  });
  return (
    <div id="table-area">
      {/* i want to paint the data here.. */}
    </div>
  )
}
});
Fetching inside the component using componentWillMount is not the right place when you need to render on the server side. You need to somehow move it out of the component and pass the actual data as props after it is fetched, for example as @JakeSendar suggested in his answer.
I have some experience doing isomorphic apps with React, and the main problem I faced is how to wait until all data is loaded before the first render.
As @FakeRainBrigand already mentioned in the comments, there is not only one way to do this, and it depends on your requirements.
There are a few ways to build an isomorphic app; the ones I find interesting are: https://github.com/webpack/react-starter and http://fluxible.io/
But the most elegant way to do this, as I figured out for myself, is to organise asynchronous rendering for React components, in particular using RxJS.
In general my application is structured as follows:
- views: React components without any logic (just a view)
- models: Observables with the current state (initial data is loaded using superagent, then combined with other models and/or action results). In the simple case it is something like:
  Rx.Observable.defer(fetchData).concat(updatesSubject).shareReplay()
- actions (or intents): Observers used to collect user input, do something, and dispatch action results to subscribed models and/or other actions. In the simple case something like:
  updatesSubject = new Rx.Subject();
  action = new Rx.Subject();
  action.switchMap(asyncRequest).subscribe(updatesSubject)
- components: Observables (streams of virtual DOM elements) combined from models, other components and actions (I have a note about this, explaining how and why to create Observable React elements with RxJS); I am also planning to add partial components (a tuple of: React component, observables, observers, and properties, partially filled using DI)
- router: the component responsible for handling location changes; its main feature is to map location changes to a stream of virtual DOM elements and meta information. In detail it is a bit more complicated in my case (URL generation, active URL highlighting, handling scroll position when navigating; it also supports nested routes and multiple views)
All this is assembled together using a DI container, in my case similar to the Angular 2 DI container but much simplified for my specific needs.
Components, models and actions are created using DI.
On the server side the application looks like this:
var rootInjector = new Injector();
// setup server specific providers
rootInjector.provide(..., ...)

app.get('/*', function(req, res) {
  var injector = rootInjector.createChild();
  // setup request specific providers
  injector.provide(..., ...);
  injector.get(Router)
    .first()
    .subscribe(function(routingResult) {
      res.render('app', {
        title: routingResult.title,
        content: React.renderToString(routingResult.content)
      });
    });
});
and similarly on the client side:
var rootInjector = new Injector();
// setup client specific providers
// (actually this is omitted in my case, because the default providers are client side)
rootInjector.provide(..., ...)

var contentElement = document.getElementById('content');
rootInjector.get(Router)
  .subscribe(function(routingResult) {
    document.title = routingResult.title;
    React.render(routingResult.content, contentElement);
  });
In comparison to Flux, it is a more declarative and more powerful way to organise an app. And in the case of an isomorphic app, to me it looks much better than various hacks with Flux. But of course there are drawbacks... it is more complicated.
Likely I will open-source all this later, but for now it is not quite ready to be published.
UPD1:
The original answer is a bit outdated (I plan to update it later), and I have made some progress in this area.
Links to the code mentioned above, already open-sourced:
- DI container: di1
- Container for React components (connecting views to observables and observers): rx-react-container
- Starter template for implementing isomorphic widgets using RxJS, React, and the libraries above: Reactive Widgets
About the complete application (work is still in progress, and the documentation is not quite good yet, but in general it should be clear):
- Router built especially for isomorphic reactive applications: router1, plus React components to use it: router1-react
- Application template with the router and all the libraries mentioned above: router1-app-template
React's renderToString method (for rendering components on the server) is synchronous. Therefore, any sort of async task, such as your API request, will still be pending by the time the component has rendered.
There are a couple of ways you can fix this, depending on whether you want to fetch your data on the server or on the client.
If you choose to fetch the data on the server, first move your API-request logic outside of your component. Then, render your component in the callback, passing the fetched data as a prop. It would look something like this:
response.on("end", function (err) {
  var data = JSON.parse(buffer);
  var markup = React.renderToString(Component({categories: data}));
});
Inside your component, you'd be able to access the data via this.props.categories.
The other option is to handle the API request on the client. You would make an AJAX request in componentDidMount and set the component's state from the fetched data. It would look very similar to what you have now, the key difference being that your request logic would live in componentDidMount (async, called on the client) rather than componentWillMount (not async, called on the server).
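A rough sketch of that client-side variant, using a plain XMLHttpRequest (the URL is the question's own placeholder):

componentDidMount: function () {
  // runs only in the browser, after the first render
  var self = this;
  var xhr = new XMLHttpRequest();
  xhr.open("GET", "api url here");
  xhr.onload = function () {
    var data = JSON.parse(xhr.responseText);
    self.setState({ categories: data.d.Items });
  };
  xhr.send();
},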
You should use superagent; it works really well for me. You are also missing the most important part: you should use Flux to fetch data from a server. Flux is the architecture Facebook strongly recommends, and it's pretty easy to adopt.
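For instance, a minimal superagent version of the question's loader (again with the placeholder URL) might look like:

loadCategoriesFromServer: function () {
  var self = this;
  var request = require("superagent");
  request.get("api url here").end(function (err, res) {
    if (err) return console.error(err);
    // res.body is the parsed JSON response
    self.setState({ categories: res.body.d.Items });
  });
},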
