While working on many different Node.js projects, I kept finding common code that I wanted to move out into a new Node.js package so that I don't rewrite the same code multiple times. I now have a new Node.js module, and I'm using it in my projects via npm link.
Now, I'm a bit confused as to how to structure this common library in order to properly modularize it. This is what I have right now in my common library:
// "my-common-lib"'s app.js
module.exports = {
math: require("./math/mathLib"),
network: require("./network/networkLib")
};
--
//mathLib.js
exports.pi = 3.14;
This works; I can do the following in another Node.js project:
var commonLibrary = require("my-common-lib");
var commonMath = commonLibrary.math;
console.log("Pi: " + commonMath.pi);
While this solves the issue, I would prefer something similar to how lodash does it:
var commonMath = require("my-common-lib/math");
console.log("Awesome pi: " + commonMath.pi);
I can't quite figure out how lodash does it, and I would definitely like to avoid having a humongous main js file.
TL;DR: I want to modularize a Node.js module so that I can require submodules (require("my-common-lib/myCommonMathLib")). How can I do this?
lodash does it with a dedicated modular build. Look at the ES6 build for example. Every "sub-project" has a dedicated module, in a dedicated '.js' file. The aggregating file (lodash.js) simply imports all other modules.
If you want the nice lib/module convention, simply have your lib.js file (the aggregator) at the top level, next to a directory of the same name where all the internal modules are kept.
Another option for the require("lib") part is to set "main": "lib.js" in your package.json.
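For example, a minimal package.json for the library could look like this (the name and version are illustrative), so that require("my-common-lib") resolves to lib.js:
{
    "name": "my-common-lib",
    "version": "1.0.0",
    "main": "lib.js"
}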
For example, to support require('lodash/array'), lodash has an array.js file with the following:
module.exports = {
    'chunk': require('./array/chunk'),
    'compact': require('./array/compact'),
    // ...
};
So you can easily have a math.js at the top level of your package, which contains something like:
module.exports = {
    pi: 3.14
    // or: pi: require('./math/pi'), and have a pi.js file inside a math folder
};
This way you can require it with a short path:
var math = require('my-common-lib/math');
math.pi; // 3.14
Related
In Node.js, is there any way to require a file from the same package without using relative paths? For example, here's a snippet of code from ESLint.
const rule = require("../../../lib/rules/accessor-pairs"),
{ RuleTester } = require("../../../lib/rule-tester");
The fact that we have to walk all the way up the tree (../../../) to get to the root is not only annoying; it's also brittle, because I can't move the code without updating all of the dependency references.
Yet somehow Node.js developers seem to have lived with it for the past 10 years. I can't find anything in the docs or on Stack Overflow that solves this problem other than a third-party dependency called require-self. Nor have I been able to find a definitive statement that using relative paths is the only non-hacky way for a file to require another file in the same module.
If there's a way to specify a path relative to the package root in ECMAScript Modules (ESM) but not CommonJS (CJS), or vice versa, I would like to know that as well.
To be clear, I don't think there is a solution to the problem. If there is, great. Otherwise, I'm looking for confirmation with an authoritative reference.
This isn't necessarily limited to the same package - it won't be useful if you are writing libraries, but it works if you are writing the "final application", the thing that actually gets run:
One option:
If the NODE_PATH environment variable is set to a colon-delimited list of absolute paths, then Node.js will search those paths for modules if they are not found elsewhere.
So you can do any of:
1.
export NODE_PATH=.
node app.js
2.
NODE_PATH=. node app.js
3.
// app.js (or whatever your entry point is), before *any* require() calls
process.env.NODE_PATH = __dirname;
require('module').Module._initPaths();
Or, another way, use require.main:
"The Module object representing the entry script loaded when the Node.js process launched."
https://nodejs.org/api/modules.html#modules_require_main
So you can just do:
const rule = require.main.require("./lib/rules/accessor-pairs")
anytime you want it to be relative to the root (assuming that is how you have your project structured).
You can use the package name itself as a "symlink" to the package root.
Example - foo package imports bar script relative to the foo package root.
package.json
{
"dependencies": {
"foo": "file:./foo"
}
}
index.js
const foo = require('foo');
console.log(foo.bar); // prints "hello"
foo/index.js
const bar = require('foo/bar'); // import relative to the package root
module.exports = {
bar: bar
}
foo/bar.js
module.exports = 'hello';
If you use VS Code then you're in luck: a jsconfig.json in the project root handles this nicely for CommonJS, ES6, AMD, UMD, etc.
The jsconfig.json file specifies the root files and the options for the features provided by the JavaScript language service.
jsconfig.json
{
    "compilerOptions": {
        "module": "commonjs",
        "baseUrl": ".",
        "paths": {
            "#rules/*": ["path/to/lib/rules/*"]
        }
    }
}
and then to use the alias:
const rule = require('#rules/accessor-pairs'),
{ RuleTester } = require('#rules/rule-tester');
Read more:
https://code.visualstudio.com/docs/languages/jsconfig
Bear with me as I lead you through the process that elicited my question.
I'm working on a CLI app in node and I'm using objects to encapsulate my business logic using this pattern:
// my-project/lib/widget/myobject.js
var MyObject = function(x) {
this.x = x;
};
MyObject.prototype.getX = function() {
return this.x;
};
module.exports = MyObject;
I'm also testing these objects:
// my-project/test/lib/widget/myobject.spec.js
var MyObject = require('../../../lib/widget/myobject.js');
describe('MyObject', function() {
...
});
At one point I was unhappy with the naming and directory structure I had chosen. I found myself tediously counting those parent directory references (..) in several spec files when rewriting the relative paths. I figured there must be an easier way to reference a root directory containing these object definitions.
One of the recommendations I found here suggested "putting application-specific modules into node_modules".
Now, as I understand modules, they are the packages I download from npm and use in my project. They contain libraries of useful things with a single API exported to me when I call require. This is not how I view the simple single-purpose classes built specifically for the internal use of my application.
If you've stuck with me this far, thank you! Here is my question:
How do I make the internals of my application more "modular" so it properly follows the intent of the Node module system while remaining object oriented?
I'm not sure how suitable this is for production or for modules you plan on distributing, but in your main file you could add this:
process.env.NODE_PATH = __dirname;
require('module').Module._initPaths();
which would let you always require modules relative to the folder containing your main file. For example, if you had a file in:
library/some_file.js
then in tests/some_other_file.js you could just do:
require('library/some_file');
Or as an alternative you could add this in your main file:
global.__base = __dirname + '/';
and then in your other modules require using:
var MyObject = require(__base + 'lib/widget/myobject');
I'm trying to figure out how to perform dynamic import of classes in ES6 on the server side (Node.js with Babel).
I would like to have some functionality similar to what reflection offers in Java. The idea is to import all the classes in a specific folder and instantiate them dynamically.
So, for example, I could have multiple classes declared in a folder like the one below:
export default class MyClass {
constructor(somevar) {
this._somevar = somevar
}
//...
//some more instance level functions here
}
and then somewhere else in my app's code I could have a function that finds all the classes in a specific folder and tries to instantiate them:
//somewhere else in my app
instanciationFunction(){
    //find all the classes in a specific folder
    var classFiles = glob.sync(p + '/path_to_classes/**/*.js', {
        nodir: true
    });
    _.each(classFiles, async function (file) {
        console.log(file);
        var TheClass = import(file);
        var instance = new TheClass();
        //and then do whatever I want with that new instance
    });
}
I've tried doing it with require but I get errors. Apparently the constructor can't be found.
Any idea would be greatly appreciated.
Thanks
ES module definitions are declarative, and the current direction tools are taking is one where dependencies are determined during parsing (via static analysis), way before any of the code is executed. This means dynamic and conditional imports go against that direction. It's not like in Node, where imports are resolved at execution time, when require is actually called.
If you want dynamic, runtime imports, consider taking a look at SystemJS. If you're familiar with RequireJS, it takes the same concept, but expands it to multiple module formats, including ES6. It has SystemJS.import which appears to do what you want, plus handles the path resolution that you're currently doing.
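As a rough sketch of what that could look like for the folder-scanning code above (assuming the older SystemJS API, installed and configured to resolve and transpile those files; this is not a drop-in solution):
var SystemJS = require('systemjs'); // the SystemJS loader instance
var classFiles = glob.sync(p + '/path_to_classes/**/*.js', { nodir: true });
_.each(classFiles, function (file) {
    SystemJS.import(file).then(function (mod) {
        var TheClass = mod.default;           // the default-exported class
        var instance = new TheClass('some value');
        // ...do whatever you want with that new instance
    });
});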
Alternatively, if your intention is to shed off excess code, consider using Rollup. It will analyze code for you and only include code that's actually used. That way, you don't need to manually do conditional loading.
You need to preprocess with Babel, because dynamic imports are not yet part of Node (for that matter, neither are static imports; Node uses require).
https://github.com/airbnb/babel-plugin-dynamic-import-node
steps:
pre. npm i -D babel-cli (or npm i -D babel)
1. npm i -D babel-plugin-dynamic-import-node
2. add a .babelrc:
{
    "plugins": ["dynamic-import-node"]
}
ready, go!
babel-node test_imports.js for babel-cli, or for raw babel:
a. edit package.json:
"scripts": {
    "pretest": "babel test_imports.js -o dist/test_imports.js",
    "test": "node dist/test_imports.js"
    //...
}
b. node test
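For reference, a minimal sketch of what test_imports.js itself might contain (the path, class, and constructor argument are placeholders); the plugin transpiles the import() call into a deferred require:
// test_imports.js - minimal sketch; the imported path and class are placeholders
import('./path_to_classes/MyClass.js').then(function (mod) {
    const TheClass = mod.default;             // the default-exported class
    const instance = new TheClass('some value');
    console.log(instance);
});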
I had the same use case and I managed to dynamically load and instantiate default-exported classes using:
const c = await import("./theClass.js");
const i = new c.default();
using node v16.4.0
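Top-level await only works inside an ES module; in a CommonJS file the same thing can be wrapped in an async function (the path is a placeholder):
// minimal sketch for CommonJS code, where top-level await isn't available
async function instantiate(path) {
    const mod = await import(path); // dynamic import() returns a promise, even from CJS
    return new mod.default();       // instantiate the default-exported class
}

instantiate('./theClass.js').then((instance) => console.log(instance));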
Is there a Gulp plugin that allows me to include/concatenate JavaScript files together?
I'm trying to have a way in which I can "include" the contents of one JavaScript file into others. By include, I mean having something like this:
// main.js
var a = 2;
///include an-include-file.import.js
// an-include-file.import.js
var b = 5;
"Compile" to something like this:
// compiled.js
var a = 2;
var b = 5;
Or, probably even better, something like this:
// compiled.js v2
var a = 2;
// wrapped in an anonymous, self-calling function to isolate scope
(function () {
var b = 5;
})();
I wrote a plugin myself to do just that, but I'd like to be able to use source maps. Implementing source maps myself is a bit more effort than I'd like to devote to this little project.
Now, I know I could use something like gulp-concat, but there isn't an easy way to control the order of the files. I'd have to modify the gulpfile every time I add a new file, and manually list them all out (or have lots of complicated patterns), which is a rather large pain.
I'd prefer something where I can use an import or include to precisely control where the file goes, and control it from the scripts themselves, not the build tool. Very similar to how LESS or something does it.
For LESS, what I do is suffix files with ".import.less" if I don't want them to generate their own standalone file, and then #import them where I want them in other files. This makes it very easy to only generate the files I want, without simply creating one giant file.
I ended up taking Mike's idea and using WebPack, in addition to Babel, to create ES6-style modules.
The basics look like this:
// math.js
export function sum (a, b) { return a + b; }
// main.js
import sum from "math";
console.log(sum(1, 2));
I use Gulp to handle my build process, which in a simplified manner looks like this:
var gulp = require('gulp'),
webpack = require('webpack-stream');
gulp.task('build', function () {
    return gulp.src('main.js')
        .pipe(webpack())
        .pipe(gulp.dest('dist'));
});
My actual usage is much more complex, but that's the basic idea. I have to use babel-loader (with the ES2015 preset) in my webpack config to have it process the ES6 into ES5, then WebPack puts everything together into one file.
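As a rough sketch (not my exact config; the file names and options are illustrative), the babel-loader piece can be passed straight into the webpack() call, using the webpack 1 style options:
var gulp = require('gulp'),
    webpack = require('webpack-stream');

gulp.task('build', function () {
    return gulp.src('main.js')
        .pipe(webpack({
            output: { filename: 'bundle.js' },      // name of the bundled output file
            module: {
                loaders: [{
                    test: /\.js$/,                  // run every .js file through babel-loader
                    exclude: /node_modules/,
                    loader: 'babel-loader',
                    query: { presets: ['es2015'] }  // transpile ES6 down to ES5
                }]
            }
        }))
        .pipe(gulp.dest('dist'));
});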
Related reading:
Babel - https://babeljs.io/
Babel ES2015 Preset - https://babeljs.io/docs/plugins/preset-es2015/
WebPack - https://webpack.github.io/
WebPack with Gulp - https://webpack.github.io/docs/usage-with-gulp.html
babel-loader - https://github.com/babel/babel-loader
I am currently using requirejs to manage module js/css dependencies.
I'd like to discover the possibilities of having node do this via a centralized config file.
So instead of manually doing something like
define([
    'jquery',
    'lib/somelib',
    'views/someview'
])
within each module.
I'd have node inject the dependencies, i.e.
require('moduleA').setDeps('jquery','lib/somelib','views/someview')
Anyway, I'm interested in any projects looking at dependency injection for node.
thanks
I've come up with a solution for dependency injection. It's called injectr, and it uses node's vm library and replaces the default functionality of require when including a file.
So in your tests, instead of require('libToTest'), use injectr('libToTest', { 'libToMock': myMock });. I wanted to make the interface as straightforward as possible, with no need to alter the code being tested. I think it works quite well.
It's just worth noting that injectr files are relative to the working directory, unlike require which is relative to the current file, but that shouldn't matter because it's only used in tests.
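A minimal sketch of how that reads in a test file (the paths and the mock's shape are placeholders, not injectr's real file layout):
// test file - minimal sketch; paths and the mock are illustrative
var injectr = require('injectr');

var myMock = {
    doSomething: function () { return 'mocked'; } // fake out whatever libToMock normally does
};

// the path is resolved from the working directory, not from this file
var libToTest = injectr('./lib/libToTest.js', { './libToMock': myMock });

// exercise libToTest as usual; its require('./libToMock') now returns myMock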
I've previously toyed with the idea of providing an alternate require to make a form of dependency injection available in Node.js.
Module code
For example, suppose you have following statements in code.js:
fs = require('fs');
console.log(fs.readFileSync('text.txt', 'utf-8'));
If you run this code with node code.js, then it will print out the contents of text.txt.
Injector code
However, suppose you have a test module that wants to abstract away the file system.
Your test file test.js could then look like this:
var origRequire = global.require;
global.require = dependencyLookup;
require('./code.js');

function dependencyLookup (file) {
    switch (file) {
        case 'fs': return { readFileSync: function () { return "test contents"; } };
        default: return origRequire(file);
    }
}
If you now run node test.js, it will print out "test contents", even though it includes code.js.
I've also written a module to accomplish this; it's called rewire. Just use npm install rewire and then:
var rewire = require("rewire"),
myModule = rewire("./path/to/myModule.js"); // exactly like require()
// Your module will now export a special setter and getter for private variables.
myModule.__set__("myPrivateVar", 123);
myModule.__get__("myPrivateVar"); // = 123
// This allows you to mock almost everything within the module e.g. the fs-module.
// Just pass the variable name as first parameter and your mock as second.
myModule.__set__("fs", {
readFile: function (path, encoding, cb) {
cb(null, "Success!");
}
});
myModule.readSomethingFromFileSystem(function (err, data) {
console.log(data); // = Success!
});
I was inspired by Nathan MacInnes's injectr but used a different approach. I don't use vm to eval the test module; in fact, I use node's own require. This way your module behaves exactly as it does with require() (except for your modifications). Debugging is also fully supported.