Require in require in JavaScript

I set up a folder structure in JavaScript using require, including files that themselves require other files. Since a file that requires another file has to wrap and re-export all of its functions again, I was wondering if there is an easier way to do this.
var file2 = require('./file2.js');
var IncludeAll = {
    func1: function (msg) {
        return file2.function1(msg);
    },
    func2: function (msg) {
        return file2.function2(msg);
    }
};
module.exports = IncludeAll;

You can create an exporter script, "exporter.js", like below.
// exporter.js
module.exports = {
    File2: require('./file2'),
    File3: require('./file3')
};
Then you can import and call it like this (this assumes ./file2 is the module that exposes func1 and func2):
const { File2 } = require('./exporter');
const param = 5;
File2.func1(param);
File2.func2(param);
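For completeness, a minimal sketch of what such a module might look like, assuming file2.js simply exports its functions directly (the function bodies here are placeholders, not code from the question):
// file2.js -- a plain module that exports its functions directly
module.exports = {
    func1: function (msg) {
        return 'func1 got: ' + msg;
    },
    func2: function (msg) {
        return 'func2 got: ' + msg;
    }
};
With the exporter in place, consumers need only a single require and can pick out the modules they want via destructuring, so no wrapper file that re-declares every function is necessary.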

Related

Unable to read config.json file in node.js

I am writing a unit test case; my question is related to the one in this link: How to stub/mock submodules of a require of nodejs using sinon
When I include a require
const index = require('./index.js');
it has a library require inside it:
const library = require('./library.js');
The library.js file has a require which reads the config.json file (this config file is also required inside the above index.js), as below:
const readConfig = require('read-config');
const config = readConfig('./config.json');
I have tried many of the ways suggested in the above link, but I am failing:
const stubs = {
    './library': function (response) {
        assert.equal(some, null);
        return 'Some ' + argument;
    },
    '../library1.js': {
        function(paths, opts) {
            var config = './config.json';
            return config;
        }
    },
};
const index = proxyquire('./index.js', stubs);
When I run my unit test case, I am still getting the error below:
throw configNotFound(configPath);
^
ReadConfigError: Config file not found: ./config.json
I would like to know which part of the code I am missing such that it throws this error.
I am trying to edit index.js and all the related files where the config is read, using the code below:
var fs = require('fs');
var path = require('path');
var pathToJson = path.resolve(__dirname, '../config.json');
// Load config
var config = fs.readFile(pathToJson, 'utf8', function (err, data) {
    if (err) throw err;
    config = JSON.parse(data);
});
The challenge here is that I cannot change the Node code.
Your problem is likely to be path resolution. If ./config.json is relative to where you are running Node from (process.cwd()), then it'll work. If it's relative to your library module, then you can do something like:
// Option 1: works for JS and JSON files
const configPath = require.resolve('./config.json');
// Option 2: works in general
const configPath = require('path').join(__dirname, 'config.json');
// Then, with whichever option you picked:
const readConfig = require('read-config');
const config = readConfig(configPath);
It's difficult to say if this is the case without knowing more about your project layout and how you're starting your app.
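If you do need to intercept the read-config call made inside library.js from the test instead of fixing the path, one possible approach (a sketch, not verified against your project layout) is to stub the read-config module itself and mark the stub as global so proxyquire applies it to transitive requires too:
const proxyquire = require('proxyquire');
// Stub read-config itself; the '@global' flag tells proxyquire to apply the stub
// to requires made anywhere in the dependency tree, not only inside index.js.
const readConfigStub = function (configPath) {
    return { some: 'fake config value' };   // whatever shape your code expects
};
readConfigStub['@global'] = true;
const index = proxyquire('./index.js', {
    'read-config': readConfigStub
});
Globally scoped stubs are documented by proxyquire as a last resort, so prefer fixing the path resolution as described above when you can.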

Node.js Path infinity loop

I have two files in a folder, murder_model and mission_model, plus one more file in the same folder called minions_model (the source for all three is below).
When I try to call murder_model with:
var murder_model = require('./murder_model.js');
and inspect it to see its functions, I get:
{}
while I should see something like:
{ xxx: [Function: xxx] }
Update:
It seems like mission_model fails to get the murder file because of minions_model.
I noticed that if I removed the minions_model reference from the murder file, it would work.
But minions_model crashes because of mission_model, so if I remove minions_model from mission_model it would also work. It's an infinite loop!
What causes this and how can I fix it?
Source:
minions:
var path = require('path');
var missionsComplicated = require('./mission_model.js');
var mongoose = require('mongoose');
function yyy() {
    console.log("inside minions");
    return 499;
}
module.exports = {
    yyy: yyy,
};
Murder:
var path = require('path');
var MinionModel = require('./minions_model.js');
function xxx() {
    console.log("inside murder model");
}
module.exports = {
    /*botAttack : botAttack,*/
    xxx: xxx,
};
mission:
var xau = require('./murder_model.js');
function getMission(userid) {
    console.log("??:)");
    console.log(xau);
    console.log(xau.xxx());
}
module.exports = {
    getMission: getMission,
};
It's weird, because if I require it anywhere else (not in minions, murder, or mission, for example in server.js), it works perfectly.
This happens because you are replacing the entire module.exports object while the modules are requiring each other, causing a cyclic dependency. You can solve it by not replacing the module.exports object.
Try this:
module.exports.getMission = getMission;
module.exports.xxx = xxx;
module.exports.yyy = yyy;
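To see why this helps (a brief sketch using the same file and function names as above): in a require cycle, Node hands the still-loading module's partially filled module.exports object to the module that requires it. If you later reassign module.exports to a brand-new object, the other module keeps holding the old, empty object, which is why it logs {}. Mutating the existing exports object keeps every reference pointing at the same object, so the functions appear once loading finishes:
// murder_model.js -- attach to the existing exports object instead of replacing it
var MinionModel = require('./minions_model.js'); // may still be partially loaded during the cycle
function xxx() {
    console.log("inside murder model");
}
// Modules that already hold a reference to this file's exports (because of the
// cycle) will see xxx appear as soon as this line runs.
module.exports.xxx = xxx;
Another common fix is to move the require call inside the function that uses it, so the cyclic require only runs after all modules have finished loading.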

Problems at require(<Module>) in self written Node-RED node

I wrote my own WebSocket client library. Requiring it from plain Node.js works fine, and so does using it from Node-RED's function node after registering it in settings.js and retrieving it with global.get("RWSjs").
Now I have written a node myself and want to require this file, but it doesn't work. Node-RED always gives me the "node not deployed" error, which I think is caused by a JavaScript syntax error.
How can I require a self-written module in my own node's .js file?
Thanks a lot in advance, Peter :)
Edit:
some Code:
eval-R-char.js (Code for the node)
module.exports = function(RED) {
    // doesn't work:
    var RWSjs = global.get("RWSjs");
    function EvalRCharNode(config) {
        RED.nodes.createNode(this, config);
        this.instruction = config.instruction;
        var node = this;
        this.on('input', function(msg) {
            //msg.payload = msg.payload.toLowerCase();
            msg.payload = "Instruction: " + this.instruction;
            node.send(msg);
        });
    }
    RED.nodes.registerType("eval-R-char", EvalRCharNode);
}
You shouldn't use the context to require modules when writing your own nodes, it is purely a workaround as you can't use require in the function node.
You should just require as normal in your custom node.
So in this case:
module.exports = function(RED) {
    // assuming your module is in the RWS.js file in the same directory
    var RWSjs = require('./RWS.js');
    function EvalRCharNode(config) {
        RED.nodes.createNode(this, config);
        this.instruction = config.instruction;
        var node = this;
        this.on('input', function(msg) {
            //msg.payload = msg.payload.toLowerCase();
            msg.payload = "Instruction: " + this.instruction;
            node.send(msg);
        });
    }
    RED.nodes.registerType("eval-R-char", EvalRCharNode);
}
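For this to work, RWS.js just needs to be a regular Node module sitting next to the node's .js file, since require('./RWS.js') resolves relative to the requiring file's directory. A minimal sketch of the shape such a module could have (the send function below is a placeholder, not the real library's API):
// RWS.js -- placeholder module shape for illustration; the real WebSocket
// client library will expose its own functions instead of send()
module.exports = {
    send: function (instruction) {
        // ... talk to the WebSocket server here ...
        return "sent: " + instruction;
    }
};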

Accessing typescript file variable values using gulp

I have several TypeScript files; some of them export a const named APIS.
I'm trying to access those exports (I want to concatenate all of them into a single file), but it doesn't seem to work. I'm obviously doing something wrong, but I'm not sure what.
For example, I have a folder named services, with 2 files: service1.ts, service2.ts.
service1.ts:
...
export const APIS = [ { "field1" : "blabla" } ];
service2.ts: does not contain the APIS var.
This is my gulpfile.js:
var gulp = require('gulp');
var concat = require('gulp-concat');
var map = require('gulp-map');
gulp.task('default', function() {
    return gulp.src('.../services/*.ts')
        .pipe(map(function(file) {
            return file.APIS;
        }))
        .pipe(concat('all.js'))
        .pipe(gulp.dest('./test/'));
});
When I run this task, I get nothing. When I added console.log(file.APIS); to the map function, I got undefined for all the values (although APIS is defined in service1.ts!).
This is a follow-up to: Extracting typescript exports to json file using gulp
EDIT: OK, so I tried saving the exports in a .js file instead of a .ts file, and now I can access those vars using require:
gulp.task('default', function() {
    return gulp.src('./**/*.service.export.js')
        .pipe(map(function(file) {
            var fileObj = require(file.path);
            ...
        }))
Now if I try console.log(fileObj.APIS); I get the correct values. What I'm still confused about is how I can pass these values on and create a single file out of all of them. Is it possible to push them into an array?
This will not work the way you think it will. Gulp itself knows nothing about TypeScript files; the file in the stream is a vinyl file and has no knowledge of the TypeScript code within its content.
Edit
Based on your example, you can do something like this:
var gulp = require('gulp');
var concat = require('gulp-concat');
var map = require('gulp-map');
var fs = require('fs');
gulp.task('test', function () {
    var allConstants = [];
    var stream = gulp.src('./**/*.service.export.js')
        .pipe(map(function (file) {
            var obj = require(file.path);
            if (obj.APIS != null)
                allConstants = allConstants.concat(obj.APIS);
            return file;
        }));
    stream.on("end", function () {
        // Do your own formatting here
        var content = allConstants.map(function (constants) {
            return Object.keys(constants).reduce(function (aggregatedString, key) {
                return aggregatedString + key + " : " + constants[key];
            }, "");
        }).join(", ");
        fs.writeFile('filename.txt', content, function (err) {
            if (err) throw err;
        });
    });
    return stream;
});
Suggestion
If you want to collect multiple variables into a single file, i.e. a common variables file, I suggest gulp-replace.
Steps
Create a template file, require the modules that hold your variables, and use tags within the template as placeholders for those variables (see the sketch below).
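A minimal sketch of that idea, assuming a template file constants.template.js that contains a placeholder tag such as /*__APIS__*/ (both the file name and the tag are made up for illustration):
var gulp = require('gulp');
var replace = require('gulp-replace');
gulp.task('build-constants', function () {
    // Collected beforehand, e.g. by requiring the *.service.export.js files as shown above
    var allConstants = [{ "field1": "blabla" }];
    return gulp.src('./constants.template.js')
        // Swap the placeholder tag for the serialized constants
        .pipe(replace('/*__APIS__*/', JSON.stringify(allConstants)))
        .pipe(gulp.dest('./test/'));
});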
Advice
If you are already using services, don't create an array. Instead, create an object (JSON) where every property is a constant, i.e.:
var constants = {
    const_1: 0,
    const_2: 1,
    const_3: 2,
};

Common logging for node, express application -- best practice?

I'm working on a node.js application with several dozen modules and using bunyan for logging (JSON output, multiple configurable streams). I've been looking for good examples of how to share a single logger instance across all the modules, but haven't seen a really clean example I can learn from.
The code below illustrates an approach that works, but seems quite inelegant (ugly) to me. I'm new to Node and CommonJS JavaScript in general, so I'm looking for recommendations on how to improve it.
module: ./lib/logger
// load config file (would like this to be passed in to the constructor)
var nconf = require('nconf');
nconf.file({ file: fileConfig });
var logSetting = nconf.get('log');
// instantiate the logger
var Bunyan = require('bunyan');
var log = new Bunyan({
    name: logSetting.name,
    streams: [
        { stream: process.stdout,
          level: logSetting.stdoutLevel },
        { path: logSetting.logfile,
          level: logSetting.logfileLevel }
    ],
    serializers: Bunyan.stdSerializers
});
function Logger() {
}
Logger.prototype.info = function info(e) { log.info(e); };
Logger.prototype.debug = function debug(e) { log.debug(e); };
Logger.prototype.trace = function trace(e) { log.trace(e); };
Logger.prototype.error = function error(e) { log.error(e); };
Logger.prototype.warn = function warn(e) { log.warn(e); };
module.exports = Logger;
module: main app
// create the logger
var logger = require('./lib/logger');
var log = new logger();
// note: would like to pass in options --> new logger(options)
module: any project module using logger
// open the logger (new, rely on singleton...)
var logger = require('./lib/logger');
var log = new logger();
or view the gist
Any recommendations?
EDIT:
I've modified the constructor, making the singleton pattern explicit (rather than implicit as part of the require behaviour).
var log = null;
function Logger(option) {
    // make the singleton pattern explicit
    if (!Logger.log) {
        Logger.log = this;
    }
    return Logger.log;
}
and then changed the initialization to take an options parameter
// initialize the logger
Logger.prototype.init = function init(options) {
    log = new Bunyan({
        name: options.name,
        streams: [
            { stream: process.stdout,
              level: options.stdoutLevel },
            { path: options.logfile,
              level: options.logfileLevel }
        ],
        serializers: Bunyan.stdSerializers
    });
};
Singleton pattern in nodejs - is it needed?
Actually, a singleton is probably not needed in Node's environment: modules are cached after the first require, so every file that requires the same module gets the same instance. All you need to do is create the logger in a separate file, say logger.js:
var bunyan = require("bunyan"); // Bunyan dependency
var logger = bunyan.createLogger({name: "myLogger"});
module.exports = logger;
Then, retrieve this logger from another module:
var logger = require("./logger");
logger.info("Anything you like");
If you are using Express with Node.js, you can try this. By default, logging is disabled in Express, so you have to do some setup to get logs working for your app. For access logs you need to use a logger middleware; for error logs you can use Forever. Hope it will help you.
Here is a good example: How to Logging Access and Errors in node.js
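As a sketch of the access-log part: morgan is one commonly used logger middleware for Express (the answer above only says "logger middleware", so take the package choice as an assumption):
var express = require('express');
var morgan = require('morgan');
var app = express();
// Log every incoming request in Apache "combined" format
app.use(morgan('combined'));
app.get('/', function (req, res) {
    res.send('hello');
});
app.listen(3000);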
