I'm trying to combine JavaScript files. I tried uglifyjs and it works, but I want to eliminate duplicate requires for the same npm library.
Use case:
I have file1.js and file2.js. Both files use, for example, the request npm module, and when I combine the two files into one, the require('request') is duplicated. Is there an option or technique to eliminate this issue?
Thanks.
One way to do that is to pass the module as a parameter to the other file.
For example:
file1.js
var request = require('request');
var file2 = require('./file2')(request);
module.exports = {...};
file2.js
module.exports = function(request) {
// use "request" parameter, instead of requiring.
return {...};
};
Update: with ES module syntax the same pattern works, but the factory has to be imported as a default export (a namespace object created with import * as is not callable):
import file2 from './file2';
const file2Module = file2(request);
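For reference, a fuller sketch of both files under this pattern using ES modules only; the file names and the returned object are illustrative, not part of the original question:
// file2.mjs
// Export a factory that receives the shared request instance.
export default function (request) {
    return {
        get: (url, cb) => request(url, cb)
    };
}

// file1.mjs
import request from 'request';
import createFile2 from './file2.mjs';

const file2 = createFile2(request);
export default { /* ... */ };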
I want to use a variable from one .js file in another .js file. Right now I have
main.js
const balances = require('./balance');
console.log(balances.balanceBTC)
and I have
balance.js
const balanceBTC = () => {
return arrayCleaned[0];
};
exports.balanceBTC = balanceBTC;
And I am getting the error
const balances = require('./balance');
ReferenceError: require is not defined
I am running this code via Windows PowerShell and the Node version is v14.10.1.
Node.js might be treating your code as an ES module, and CommonJS variables like "require" are not available in ES modules. Try one of the options below:
As mentioned here, declare require before using it:
import { createRequire } from 'module';
const require = createRequire(import.meta.url);
const balances = require('./balance');
[...]
If you have "type" : "module" in your package.json, remove it
It looks like the problem is coming from the environment where you are running your code.
Check the following links and you'll find the answer:
Node | Require is not defined
https://www.thecrazyprogrammer.com/2020/05/require-is-not-defined.html
Require is not defined nodejs
https://github.com/nodejs/node/issues/33741
In my index.js file, I have const config = require('config'); written as one of the first lines.
And I have a file in my project folder called config.js.
But my console keeps telling me that it Cannot find module 'config'.
My config file is this basically:
module.exports = {
'secretKey': 'mySecretCode12232',
'mongoUrl' : 'mongodb://localhost:27017/test'
};
This doesn't make any sense; it should be working.
const path = require('path');
const config = require( path.join(__dirname, 'config' + '.js') );
I also have my own function which automatically loads modules from a subdirectory specified at its definition; it saves a lot of time.
When you don't provide any path selector in the require statement (i.e. require('config') rather than require('./config')), require assumes you passed a package name and starts searching for a package called config (e.g. in your node_modules; the full search-path rules are not a trivial topic :) ), so it fails because it cannot find that specific package.
If you want to require the module from another file, you have to provide a correct path to it, so assuming your config.js resides in the same directory as your other file, the correct statement would be:
const config = require('./config'); // Extension can be omitted
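To make the difference concrete, a small sketch (the layout is illustrative; the field name comes from the config.js above):
// Project layout (illustrative):
//   my-app/
//     index.js
//     config.js
//
// index.js
const config = require('./config');  // "./" => resolved relative to this file, loads ./config.js
// const config = require('config'); // no "./" => treated as a package name, searched in node_modules
console.log(config.mongoUrl);        // 'mongodb://localhost:27017/test'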
[EDIT]
Thanks to Stafano, who formalized my question in a better way:
-) You have a module
-) There are several files in this module
-) All these files depend on a configuration whose path is unknown to the module itself
-) This module does not do much on its own, and is meant to be used by other applications
-) These applications should inject a configuration path into the module before it can be used
So I have this module, used from another application. It's composed of other submodules and I want to configure it using a configuration object.
I already tried to inject the configuration into my submodules, but I had the same problem described in the original question.
For example, my module uses MongoDB (with Mongoose) as a store.
// app.js
// in the config object i have the URI to the mongo instance (in order to create a connection).
var myModule = require('myModule')(config);
// myModule.js
// files
// myModule/index.js expose the module's functionalities
// is the entry point so I create the mongoose connection
var mongoose = require('mongoose');
module.exports = function(config){
var connection = mongoose.createConnection(config.store.URL);
// I need to expose this connection to the other submodules.
}
// myModule/storeController.js contains the business logic that uses the store (createItem, deleteItem, get...) and requires mongoose and my models (stored in the models folder)
var mongoose = require('mongoose');
var Item = require('./models/item.js');
exports.createItem = function(item){
    Item.save(item, function(err, item){
        if (err) throw err;
        ...
    });
}
// myModule/models/item.js
// In this module I need to use the connection created from the application's configuration.
var mongoose = require('mongoose');
var connection = // I don't know how to get it
var ItemSchema = new mongoose.Schema({
name: String
});
module.exports = mongoose.model('item', ItemSchema);
If I inject the configuration object into item.js, I can't do the module.exports of my model.
I hope that this example clarifies my question; the problem is simply how to expose an object after receiving it as a parameter.
[PREVIOUS]
I have a node.js application that requires a module. This module accepts the configuration file path (a JSON file).
I need to load that configuration on require and expose it to the module.
How can I achieve this behavior?
Something like:
// app.js
var myModule = require('myModule')(__dirname + '/config/myModuleCnfig.json');
// myModule.js
module.exports = function(configPath){
var config = require(configPath);
module.exports = config; // This is wrong
}
Is there another way to get the configuration path, configure the module and share the configuration?
By "share the configuration" I mean that I want to give the other files of my module the possibility to use that configuration.
Thanks for any suggestions!
FINAL EDIT:
After many misunderstandings, your problem is finally clear to me. To summarise what's in the comments, here is the situation:
-) You have a module
-) There are several files in this module
-) All these files depend on a configuration whose path is unknown to the module itself
-) This module does not do much on its own, and is meant to be used by other applications
-) These applications should inject a configuration path into the module before it can be used
Since you cannot dynamically modify what a module exports, you should use another approach. As with most situations that you encounter in programming, there is not one way which is always right; much depends on your requirements and limitations.
The easiest way to do this (which I don't recommend) is to use a global variable, which you set in your myModule.js file and which will be used by the other files in your module. The biggest drawback of this approach is that you wouldn't be able to use multiple instances of the module at the same time with different configurations. Also, any other module could easily modify (deliberately or not) your configuration at any time, simply by changing the value of the global variable, so it's also a security risk.
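Purely for illustration, the discouraged global-variable version would look roughly like this (the property name is arbitrary):
// myModule.js
var fs = require('fs');
module.exports = function (configPath) {
    // Anything in the process can read or overwrite this, which is why it's discouraged.
    global.myModuleConfig = JSON.parse(fs.readFileSync(configPath));
};

// any other file inside the module
var config = global.myModuleConfig;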
A much better way, which will probably require more work on your part - depending on how many files you have - is to implement some kind of Inversion of Control (IoC). In your case, you could turn all your exports into functions that accept a config, and then initialise them by passing the active configuration after you require the module. I don't know the specifics of your implementation, so here is some sample code:
// submodule1.js
module.exports = function(config) {
    // return something that uses the configuration
}

// myModule.js
var fs = require('fs');
var submodule1 = require('./submodule1');
var submodule2 = require('./submodule2');
// ...
module.exports = function(configPath){
    var config = JSON.parse(fs.readFileSync(configPath));
    var sm1 = submodule1(config);
    var sm2 = submodule2(config);
    return /* an object that uses sm1 and sm2 */;
}
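A short sketch of how an application would then consume this, using the same call shape as your app.js above:
// app.js
var myModule = require('myModule')(__dirname + '/config/myModuleCnfig.json');
// myModule is whatever the factory returned, with sm1 and sm2 already
// initialised against the loaded configuration.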
If your module is quite complex, you can use an IoC library that does the binding for you. A good one could be Electrolite.
Hope this helps.
PREVIOUS ANSWER:
You can use a library called jsop:
var jsop = require('jsop');
var config = jsop('./config/myModuleCnfig.json');
If you don't want to add a dependency to this module, the linked GitHub page also has a snippet that you can use to load the json config using only native methods.
EDIT: I just realised that this module is only for node 0.11, which you are probably not using. Since you probably don't need the writing functionality, you can use the following snippet instead:
var fs = require('fs')
var config = JSON.parse(fs.readFileSync('./config/myModuleCnfig.json'))
EDIT 2:
Now I think I understand your problem better. To pass the path to the required configuration, you can do something like this:
// myModule.js
var fs = require('fs')
module.exports = function(configPath){
var config = JSON.parse(fs.readFileSync(configPath))
return config;
}
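And on the application side it is then used exactly like the call in the original question:
// app.js
var config = require('myModule')(__dirname + '/config/myModuleCnfig.json');
// config is now the parsed JSON object returned above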
I have multiple sets of js modules that I would like to concat into separate files. I don't want to have to create a separate concat task for each file. It would make more sense to be able to pass arguments into the gulp task "concat". Unfortunately gulp doesn't allow arguments to be passed into tasks (I'm sure for good reason).
Any ideas of how I can accomplish this?
Use Case
A specific scenario would be a website that has a global.js file for all pages as well as page-specific js files.
Creating a task for each page-specific js file will quickly make the gulpfile.js hard to manage as the site grows.
My dev environment:
I have a dev/js/ directory which has multiple sub-directories. Each sub-directory contains modules for a specific js file, so each sub-directory needs to be concatenated into its own file within lib/js/.
Perhaps requirejs?
Maybe I should just look into using a module loader like requirejs.
I needed to take modules from my source sub-directory (src/modules/), concatenate a specific file to each individually (src/distribution), then pipe the result to a sub-directory in my distribution folder (dist/js/modules/).
I wasn't sure how many modules would end up being written for this project so I wanted to do it dynamically and found this to be the best (simplest) solution:
gulp.task("modules:js", () => {
let modules = fs.readdirSync("src/modules");
let concatModule = (module) => {
return gulp.src([
'src/distribution',
module
])
.pipe(concat(module))
.pipe(gulp.dest("build/js/modules"));
}
for (let module of modules) {
concatModule(module);
};
});
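One caveat: the task above never returns anything, so gulp cannot tell when all the concat streams have finished. If that matters (for example when other tasks depend on this one), one option is to merge the streams with the merge-stream package and return the result. A rough sketch, assuming concatModule is the same helper as above but moved out of the task body:
const merge = require("merge-stream");

gulp.task("modules:js", () => {
    const modules = fs.readdirSync("src/modules");
    // Returning the merged stream lets gulp signal completion correctly.
    return merge(...modules.map((module) => concatModule(module)));
});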
You could make concatJS a higher-order function:
var concatJS = function (src, filename, dest) {
    return function() {
        // Return the stream so gulp knows when the task has finished.
        return gulp.src(src)
            .pipe(concat(filename))
            .pipe(gulp.dest(dest));
    };
};
gulp.task('concat-1', concatJS('src/module-1', 'module-1.js', 'build/module-1'));
gulp.task('concat-2', concatJS('src/module-2', 'module-2.js', 'build/module-2'));
//etc...
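If you'd rather not write each gulp.task line by hand, you could also generate the tasks from a small map, building on the same higher-order function (the bundle names and paths are just examples):
var bundles = {
    'module-1': { src: 'src/module-1', dest: 'build/module-1' },
    'module-2': { src: 'src/module-2', dest: 'build/module-2' }
};

Object.keys(bundles).forEach(function (name) {
    var b = bundles[name];
    gulp.task('concat-' + name, concatJS(b.src, name + '.js', b.dest));
});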
Note: You'd probably be better off using a bundler like browserify or webpack. Since asking this question I have switched to browserify rather than trying to roll my own solution.
Improved Solution:
var fs = require("fs");
/* other requires omitted */
/* Set JS dev directory in one place */
var jsSrc = "dev/js/";
var jsDest = "lib/js/";
/* Renamed to avoid shadowing the gulp-concat plugin ("concat"),
 * which is used inside this function. */
var concatDir = function (path) {
    path = path.replace(/\\/g, "/");
    var src = path.replace(/(\/[^\/]+?\.js$)|(\/$)/, "/*.js");
    var filename = src.match(/\/([^\/]+?)(\/[^\/]+?\.js$)/)[1] + ".js";
    gulp.src(src)
        .pipe(concat(filename))
        .pipe(gulp.dest(jsDest));
};
/* The concat task just runs the concatDir function for
 * each directory in the javascript development directory.
 * It will take a performance hit, but allows concat to be
 * run as a dependency in a pinch.
 */
gulp.task("concat", function () {
    var dirArr = fs.readdirSync(jsSrc);
    for (var d in dirArr) {
        var path = jsSrc + dirArr[d] + "/";
        concatDir(path);
    }
});
/* Run "concat" as a dependency of the default task */
gulp.task("default", ["concat"], function () {
    var JSWatcher = gulp.watch([jsSrc + "**/*.js"]);
    JSWatcher.on("change", function (event) {
        concatDir(event.path);
    });
});
Alright, I think this works. It's a little bit of a hack though, and doesn't work for all use cases.
... removed previous example to save space ...
I am currently using requirejs to manage module js/css dependencies.
I'd like to discover the possibilities of having node do this via a centralized config file.
So instead of manually doing something like
define([
    'jquery',
    'lib/somelib',
    'views/someview'
], ...)
within each module, I'd have node inject the dependencies, i.e.
require('moduleA').setDeps('jquery', 'lib/somelib', 'views/someview')
Anyway, I'm interested in any projects looking at dependency injection for node.
thanks
I've come up with a solution for dependency injection. It's called injectr, and it uses node's vm library and replaces the default functionality of require when including a file.
So in your tests, instead of require('libToTest'), use injectr('libToTest', { 'libToMock': myMock });. I wanted to make the interface as straightforward as possible, with no need to alter the code being tested. I think it works quite well.
It's just worth noting that injectr paths are relative to the working directory, unlike require, which resolves relative to the current file. That shouldn't matter, though, because it's only used in tests.
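A minimal, hypothetical sketch of what that looks like inside a test file; the mock's shape is made up, only the injectr(path, mocks) call form comes from above:
var injectr = require('injectr');

// Made-up stand-in for the dependency we want to replace.
var myMock = {
    doSomething: function () { return 'mocked result'; }
};

// The path is resolved relative to the working directory, not to this test file.
var libToTest = injectr('libToTest', { 'libToMock': myMock });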
I've previously toyed with the idea of providing an alternate require to make a form of dependency injection available in Node.js.
Module code
For example, suppose you have following statements in code.js:
fs = require('fs');
console.log(fs.readFileSync('text.txt', 'utf-8'));
If you run this code with node code.js, then it will print out the contents of text.txt.
Injector code
However, suppose you have a test module that wants to abstract away the file system.
Your test file test.js could then look like this:
var origRequire = global.require;
global.require = dependencyLookup;
require('./code.js');
function dependencyLookup (file) {
    switch (file) {
        case 'fs': return { readFileSync: function () { return "test contents"; } };
        default: return origRequire(file);
    }
}
If you now run node test.js, it will print out "test contents", even though it includes code.js.
I've also written a module to accomplish this; it's called rewire. Just use npm install rewire and then:
var rewire = require("rewire"),
myModule = rewire("./path/to/myModule.js"); // exactly like require()
// Your module will now export a special setter and getter for private variables.
myModule.__set__("myPrivateVar", 123);
myModule.__get__("myPrivateVar"); // = 123
// This allows you to mock almost everything within the module e.g. the fs-module.
// Just pass the variable name as first parameter and your mock as second.
myModule.__set__("fs", {
readFile: function (path, encoding, cb) {
cb(null, "Success!");
}
});
myModule.readSomethingFromFileSystem(function (err, data) {
console.log(data); // = Success!
});
I've been inspired by Nathan MacInnes's injectr but used a different approach. I don't use vm to eval the test module; in fact, I use node's own require. This way your module behaves exactly as it does with require() (except for your modifications). Debugging is also fully supported.