Dynamically construct list of all objects in package/directory - javascript

I have a directory containing all display types and I need to expose these types via an API. This set of display types will grow significantly over this application's lifecycle, so it'd be nice if adding a new type didn't require adding new code to our service layer.
So in a scenario where a subdir src/domain/types/ contains Type1.js, Type2.js, Type3.js, and Type4.js, is it possible to have a piece of code that instantiates all these types and adds those to some array? Something like:
// I know this won't work as written, but you get the idea
const dir = require('../src/domain/types/')

function getAllDisplayTypes() {
  const types = []
  for (const type in dir) {
    // auto-magically create new type
    types.push(someMagicallyCreatedType)
  }
  return types
}
where types would be an array in which each entry holds the default values set in the constructor of TypeX.js.
Alternatively, if you have a creative way to expose a set of domain object definitions, I'm all ears.

Figured it out. Using a package called require-dir that lets you require an entire directory, I was able to use some JS devilry to get this working:
const require_dir = require('require-dir')
const allTypes = require_dir('../domain/types/')
// ...
function getDisplayTypes() {
  const types = {}
  for (const typeName of Object.keys(allTypes)) {
    const typeConstructor = allTypes[typeName]
    const typeInstance = new typeConstructor()
    types[typeName] = typeInstance
  }
  return types
}
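If you would rather not add a dependency, the same thing can be done with Node's built-in fs and path modules. A minimal sketch, assuming every file in the directory exports a constructor (the directory path and loadAllTypes name are illustrative):
const fs = require('fs')
const path = require('path')

function loadAllTypes(dir) {
  const types = {}
  for (const file of fs.readdirSync(dir)) {
    if (path.extname(file) !== '.js') continue // skip non-JS files
    const TypeConstructor = require(path.join(dir, file))
    types[path.basename(file, '.js')] = new TypeConstructor()
  }
  return types
}

const allTypes = loadAllTypes(path.join(__dirname, '../domain/types'))
This is essentially what require-dir does under the hood, minus its handling of index files and nested directories.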

Related

Destructuring an object from a dynamic string name

I have this config.json file:
"signs_table": {
"id_field": "dummy_id_field_name",
"prop" : "dummy_prop_name"
...
}
This file contains tons of configuration for a huge number of tables stored in a database. Every table has different field names, but the configuration for my code is the same for each table (an id field name and various property fields; only the field names change from table to table).
So, in my code, I get a data object and I want to be able to destructure it into dynamically named properties (taken from the configuration), like so:
const { dummy_id_field_name, dummy_prop_name } = this.props.data
but this is a hard-coded way. I would like to load the property names from the configuration file.
Something like:
const IdField = config.get("signs_table").id_field // retrieves the actual field name from config.json; I want to pass it into the destructuring
const PropField = config.get("signs_table").prop
const { IdField, PropField } = data
Here config.get("signs_table") is a class method that manages my config.json file; it basically retrieves the property.
So far I found this useful approach:
ES6 — How to destructure from an object with a string key?
But this does not help me, since I first need to retrieve the field name from the configuration file...
Any idea?
You cannot avoid fetching the field names from the config file first:
const { id_field: IdField, prop: PropField } = config.get("signs_table"); // this retrieves the actual field names from config.json
Then you can use those as computed property names when destructuring your actual data:
const { [IdField]: idValue , [PropField]: propValue } = this.props.data;
console.log(idValue, propValue);
You can destructure arbitrary property names, as I'll show below, but the question is why you are forcing yourself into a syntax you are unfamiliar with. Stick with a readable, straightforward approach instead of a fancy, convoluted one.
const idField = config.get("signs_table").id_field; // retrieves the actual field name from config.json
const propField = config.get("signs_table").prop;
const { [idField]: idValue, [propField]: propValue } = data;
It would be more straightforward to simply avoid the de-structuring altogether and access the fields directly.
const idField = config.get("signs_table").id_field; // retrieves the actual field name from config.json
const propField = config.get("signs_table").prop;
const idValue = data[idField];
const propValue = data[propField];
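For reference, here is the whole pattern end to end with made-up sample data (the field names and values are hypothetical, not from the question):
const config = { signs_table: { id_field: "sign_id", prop: "sign_label" } }; // hypothetical config shape
const data = { sign_id: 42, sign_label: "Stop" };                            // hypothetical data row

const { id_field: idField, prop: propField } = config.signs_table;
const { [idField]: idValue, [propField]: propValue } = data;
console.log(idValue, propValue); // 42 "Stop"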

Node library exposing components with dependencies

I'm developing a Node.js ORM library based on Knex, similar to Bookshelf, for use in other personal projects.
Some components of my library require an initialised instance of Knex, so I wrapped them in an object that receives a Knex instance in its constructor, using a wrapper function to inject the Knex object so the user doesn't have to pass it in every time they use the library. I tried to do it the way Knex and Bookshelf do, but I found that code hard to read; besides, I use ES6 classes, so it's not quite the same.
This is my current code:
const _Entity = require('./Entity.js');

class ORM {
  constructor(knex) {
    // wrapper for exposed class with Knex dependency;
    // knex is the first argument of Entity's constructor
    this.Entity = function(...args) {
      return new _Entity(knex, ...args);
    };
    // exposed class without Knex dependency
    this.Field = require('./Field.js');
  }
}

function init(knex) {
  return new ORM(knex);
}

module.exports = init;
The idea is that the user can use it something like this:
const ORM = require('orm')(knex);
const Entity = ORM.Entity;
const Field = ORM.Field;

const User = new Entity('user', [
  new Field.Id(),
  new Field.Text('name'),
  // define columns...
]);

let user = User.get({id: 5});
It bothers me that Entity is only indirectly exposed and the code looks odd to me. Is there any more elegant or a "standard" way to expose components with dependencies?
Just use a regular function:
const _Entity = require('./Entity.js');
const Field = require('./Field.js');

module.exports = function init(knex) {
  return {
    Field,
    // bound functions are still constructable: `new Entity(...)` becomes
    // `new _Entity(knex, ...)`, with knex prepended to the arguments
    Entity: _Entity.bind(_Entity, knex)
  };
};
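Usage then stays exactly as sketched in the question. For the avoidance of doubt, new works fine on a bound constructor, because the bound this is ignored under new and the partial arguments are prepended:
const { Entity, Field } = require('orm')(knex);
// equivalent to: new _Entity(knex, 'user', [...])
const User = new Entity('user', [new Field.Id(), new Field.Text('name')]);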

Require module that uses a singleton array

I want to create a really basic CRUD (sort-of) example app, to see how things work.
I want to store items (items of a shopping-list) in an array, using functions defined in my listService.js such as addItem(item), getAllItems() and so on.
My problem comes when using the same module (listService.js) in different files: it creates the array in which it stores the data multiple times, and I want it to be a single static "global" (but not a global variable) array.
listService.js looks like this:
const items = [];

function addItem(item) {
  items.push(item);
}

function getItems() {
  return items;
}

module.exports = {addItem, getItems};
and I want to use it in mainWindowScript.js and addWindowScript.js: in addWindowScript.js to add elements to the array, and in mainWindowScript.js to get the elements and put them in a table. (Later I will implement the Observer pattern to handle adding rows to the table when needed.)
addWindowScript.js looks something like this:
const electron = require('electron');
const {ipcRenderer} = electron;
const service = require('../../service/listService.js');

const form = document.querySelector('form');
form.addEventListener('submit', submitForm);

function submitForm(e) {
  e.preventDefault();
  const item = document.querySelector("#item").value;
  service.addItem(item);
  console.log(service.getItems());
  // This prints all the items I add just fine
  // ...
}
and mainWindowScript.js like this:
const electron = require('electron');
const service = require('../../service/listService.js');

const buttonShowAll = document.querySelector("#showAllBtn");
buttonShowAll.addEventListener("click", () => {
  console.log(service.getItems());
  // This just shows an empty array, even after I add the items in the add window
});
In Java, C#, C++, or whatever, I would just create a class for each of those, and in main I'd create an instance of the service and pass a reference to it to the windows. How can I do something similar here?
When I first wrote the example (from a YouTube video), I handled this by sending messages through ipcRenderer to the main module and then forwarding them to the other window, but I don't want to deal with this every time one window signals another.
ipcRenderer.send('item:add', item);
and in main
ipcMain.on('item:add', (event, item) => {
  mainWindow.webContents.send('item:add', item);
});
So, to sum up, I want to do something like: require the module, use its functions wherever I need them, and have only one instance of the object.
require the module, use its functions wherever I need them, and have only one instance of the object
TL;DR: no, that isn't possible.
Long version: Electron is multi-process by nature. Code running in the main process (the Node.js side) and in a renderer (a Chromium browser window) runs in different processes. So even if you require the same module file, the object is created in each process's own memory, and the copies are independent. There is no way to share an object between processes except by synchronizing it via IPC communication. There are a couple of handy synchronization modules out there, or you could write your own module to do that job, like:
// module.js
if (/* main process */) {
  // set up the object;
  // listen for changes from renderers, update the object, broadcast back to renderers
} else { /* renderer process */
  // send changes to main
  // listen for changes from main
}
but in either case you can't get away from IPC.
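To make that shape concrete, here is a minimal sketch of an IPC-synchronized list; the channel names are made up and error handling is omitted:
// main.js - the single source of truth lives in the main process
const { ipcMain, BrowserWindow } = require('electron');
const items = [];

ipcMain.on('items:add', (event, item) => {
  items.push(item);
  // broadcast the new state to every open window
  for (const win of BrowserWindow.getAllWindows()) {
    win.webContents.send('items:changed', items);
  }
});

// in any renderer window script
const { ipcRenderer } = require('electron');
ipcRenderer.send('items:add', 'milk'); // mutate via main
ipcRenderer.on('items:changed', (event, items) => {
  console.log(items); // every window sees the same list
});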

How can I use lodash or a JS functional programming approach to retrieve attributes from nested data structures?

My goal is to get the attributes from the object: attributes like Color, Icon, Name.
(For now, let's call the object a project.)
The tricky part is that I need to get all the attributes from all the projects the user is a member of.
This should work:
var projects = [/* Your big array */];
var userID = 1223456;
var userProjects = [];

for (const project of projects) {          // for...of iterates values, not keys
  for (const member of project.Members) {  // each project's list of members
    if (member.MemberId == userID) {
      userProjects.push(project);
      break; // don't push the same project twice
    }
  }
}
There might be a more optimized solution though.
I solved the issue using filter, map, and includes. I forgot to mention that I'm using immutable data structures.
Something like this:
const memberProjects = projects.filter(project => project.get('Members').map(member => member.get('MemberId')).includes(memberId));
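If the data were plain JS objects rather than immutable collections, the same query could use some() instead of map().includes() (a sketch, using the same field names as above):
const memberProjects = projects.filter(project =>
  project.Members.some(member => member.MemberId === memberId)
);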

Parse and share obj resources in module

I wanted to know whether it's good practice to use code like the following, since it relies on a module-level field, cacheObj.
I need to parse the data and share it between other modules; any module can take any property, but only the first module that calls this parser is responsible for providing the data to parse (the parse needs to happen just once, with the properties shared across different modules).
This code is from another SO post and I want to use it:
var Parser = require('myParser'),
    _ = require('lodash');

var cacheObj; // <-- singleton, will hold value and will not be reinitialized on myParser function call

function myParser(data) {
  if (!(this instanceof myParser)) return new myParser(data);
  if (!_.isEmpty(cacheObj)) {
    this.parsedData = cacheObj;
  } else {
    this.parsedData = Parser.parse(data);
    cacheObj = this.parsedData;
  }
}

myParser.prototype = {
  // remove `this.cacheObj`
  getPropOne: function () {
    return this.parsedData.propOne;
  },
  getPropTwo: function () {
    return this.parsedData.propTwo;
  }
};

module.exports = myParser;
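For what it's worth, usage across modules would look something like this (file names are illustrative; only the first caller needs to pass data, since later calls hit cacheObj):
// moduleA.js - first caller, triggers the actual parse
const myParser = require('./myParser');
const parsed = myParser(rawData);
console.log(parsed.getPropOne());

// moduleB.js - later caller; the cached result is reused, so no data is needed
const myParser = require('./myParser');
console.log(myParser().getPropTwo());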
It kind of looks like the Context Object pattern, which is used for maintaining state and for sharing information. Some consider it a bad practice and prefer a Singleton when it comes to sharing an object between layers, but if it suits your case (within the same module), my advice is to use it.
UPDATE
The main reason why you shouldn't use a Context Object throughout your layers is that it binds all sub-systems together (one object references everything else). A Singleton, meanwhile, is not just for creating objects: it also serves as an access point that can be loaded by the corresponding sub-system. Having a Singleton represent each service access point allows for seamless vertical integration of cooperating components/modules. A simple code example:
Singleton:
// returns the "global" time
var time = Clock.getInstance().getTime();
Context object:
// allows different timezones to coexist within one application
var time = context.getTimezoneOffset().getTime();
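Worth adding for Node specifically: require() caches a module after its first load, so within a single process a module-level variable already behaves like a singleton, and the instanceof dance above can be avoided. A minimal sketch, reusing the Parser dependency from the question (the getParsedData name is illustrative):
// parsedData.js - the module body runs once per process thanks to the require cache
const Parser = require('myParser');

let cache = null;

module.exports = function getParsedData(data) {
  if (!cache) cache = Parser.parse(data); // only the first caller's data is parsed
  return cache;
};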
