I have a Node.js server for a web chat application, but I have a problem. In one file, I want to get a function from another file with require and module.exports. In my first file (server.js, which is in the root path), require works, but in the /js folder (which is not in the root), it does not. I installed npm and all packages globally.
All my files:
Code in chat.js:
const {verifUserConnected, getUserInfo} = require('express');
console.log(verifUserConnected)
Code in connect.js :
function verifUserConnected(){
return isConnected;
}
function getUserInfo(){
return null;
}
module.exports = {
verifUserConnected,
getUserInfo
};
In "Server.js" The require method works
You've put connect.js underneath a folder named "public", which implies you are serving it to the browser and trying to run it client-side.
Browsers do not have native support for CommonJS modules (i.e. module.exports and require).
Your starting options are:
Rewrite the client-side code to not use modules
Rewrite the client-side code to use JavaScript modules (i.e. using import, export and <script type="module">); see the sketch after this list.
Transpile the modules for use in the browser (e.g. using a tool like Webpack or Parcel.js).
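A rough sketch of the second option, assuming connect.js and chat.js both live in the /js folder served to the browser (the paths here are assumptions, not taken from your project):
// js/connect.js, rewritten as an ES module
// isConnected is assumed to be defined elsewhere, as in the original code
export function verifUserConnected() {
  return isConnected;
}
export function getUserInfo() {
  return null;
}

// js/chat.js, loaded with <script type="module" src="/js/chat.js"></script>
import { verifUserConnected, getUserInfo } from './connect.js';
console.log(verifUserConnected);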
However … chat.js attempts to require('express'). Express will not run in the browser and doesn't export anything named verifUserConnected either. You'll need to address that too.
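If chat.js is instead meant to run in Node alongside server.js, the require should point at connect.js itself rather than at express. Assuming the two files sit in the same folder (the exact relative path depends on your layout):
const { verifUserConnected, getUserInfo } = require('./connect');
console.log(verifUserConnected);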
In CommonJS (Node.js works with CommonJS by default):
const startServer = () => {
  // Code
};

module.exports = { startServer };

// Or:
exports.startServer = startServer;
To import:
const { startServer } = require("./path");
If you have any questions, ask me.
I want to get a variable from one .js file to another .js file. Right now I have
main.js
const balances = require('./balance');
console.log(balances.balanceBTC)
and I have
balance.js
const balanceBTC = () => {
  return arrayCleaned[0];
};
exports.balanceBTC = balanceBTC;
And I am getting the error
const balances = require('./balance');
ReferenceError: require is not defined
I am running this code via Windows PowerShell, and the Node version is v14.10.1.
Node.js might be treating your code as an ES module, and CommonJS variables like require are not available in ES modules. Try one of the options below:
As mentioned here, declare require before using it:
import { createRequire } from 'module';
const require = createRequire(import.meta.url);
const balances = require('./balance');
[...]
If you have "type" : "module" in your package.json, remove it
It looks like the problem is coming from the environment where you are running your code.
Check the following links and you'll find the answer:
Node | Require is not defined
https://www.thecrazyprogrammer.com/2020/05/require-is-not-defined.html
Require is not defined nodejs
https://github.com/nodejs/node/issues/33741
So let's say I have some code in JS:
const myApiKey = 'id_0001'
But instead of hardcoding it, I want to put it in some bash script with other env vars, read it from there, and then replace it in the JS.
So let's say for prod I would read from prod-env.sh, or for dev I would read from dev-env.sh, and then gulp or some other tool does the magic and replaces MY_API_KEY based on whatever is established inside prod-env.sh or dev-env.sh:
const myApiKey = MY_API_KEY
Update: I want to add that I only care about Unix OSes; I'm not concerned about Windows. In Go there is a way to read env vars, for example envVars.get('MY_API_KEY'); I'm looking for something similar, but for JS on the client side.
If you're using gulp, it sounds like you could use any gulp string replacer, like gulp-replace.
As for writing the gulp task(s): if you are willing to import the environment into your shell first, before running node, you can access it via process.env:
const gulp = require('gulp');
const replace = require('gulp-replace');

gulp.task('build', function () {
  return gulp.src(['example.js'])
    .pipe(replace('MY_API_KEY', process.env.MY_API_KEY))
    .pipe(gulp.dest('build/'));
});
If you don't want to import the environment files before running node, you can use a library like env2 to read shell environment files.
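For example, the gulpfile could load such a file at startup so its values land on process.env. This is only a sketch: prod.env is an assumed filename, and it assumes a plain KEY=value file in a format env2 accepts (check its README):
// gulpfile.js (sketch)
require('env2')('./prod.env'); // populates process.env from the file

const gulp = require('gulp');
const replace = require('gulp-replace');

gulp.task('build', function () {
  return gulp.src(['example.js'])
    .pipe(replace('MY_API_KEY', process.env.MY_API_KEY))
    .pipe(gulp.dest('build/'));
});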
Another option would be to use js/json to define those environment files, and load them with require.
prod-env.json
{
  "MY_API_KEY": "api_key"
}
gulpfile.js
const gulp = require('gulp');
const replace = require('gulp-replace');
const myEnv = require('./prod-env');

gulp.task('build', function () {
  return gulp.src(['example.js'])
    .pipe(replace('MY_API_KEY', myEnv.MY_API_KEY))
    .pipe(gulp.dest('build/'));
});
Also, for a more generic, loopy version of the replace you can do:
gulp.task('build', function () {
  let stream = gulp.src(['example.js']);
  for (const key in process.env) {
    stream = stream.pipe(replace('${' + key + '}', process.env[key]));
  }
  return stream.pipe(gulp.dest('build/'));
});
In that last example I added ${} around the environment variable name to make it less prone to accidents. So the source file becomes:
const myApiKey = ${MY_API_KEY}
This answer is an easy way to do this for someone who doesn't want to touch the code they are managing, for example if you are on the ops team but not the dev team and need to do what you are describing.
The environment variable NODE_OPTIONS can control many things about the node.js runtime - see https://nodejs.org/api/cli.html#cli_node_options_options
One such option is --require, which allows us to run code before anything else is even loaded.
So using this you can create an overwrite.js file to perform this replacement on any non-node_modules script files:
const fs = require('fs');
const original = fs.readFileSync;
// set some custom env variables
// API_KEY_ENV_VAR - the value to set
// API_KEY_TEMPLATE_TOKEN - the token to replace with the value
if (!process.env.API_KEY_TEMPLATE_TOKEN) {
  console.error('Please set API_KEY_TEMPLATE_TOKEN');
  process.exit(1);
}
if (!process.env.API_KEY_ENV_VAR) {
  console.error('Please set API_KEY_ENV_VAR');
  process.exit(1);
}

fs.readFileSync = (file, ...args) => {
  if (file.includes('node_modules')) {
    return original(file, ...args);
  }
  const fileContents = original(file, ...args).toString(
    /* set encoding here, or let it default to utf-8 */
  );
  return fileContents
    .split(process.env.API_KEY_TEMPLATE_TOKEN)
    .join(process.env.API_KEY_ENV_VAR);
};
Then use it with a command like this:
export API_KEY_ENV_VAR=123;
export API_KEY_TEMPLATE_TOKEN=TOKEN;
NODE_OPTIONS="--require ./overwrite.js" node target.js
Supposing you had a script target.js
console.log('TOKEN');
It would log 123. You can use this pretty much universally with node, so it should work fine with gulp, grunt, or any others.
Question: is there a way to tell webpack to have built-in modules like fs execute during the build, so the browser gets the result of the call rather than the call itself?
My Situation:
Currently I'm developing an application for the browser using webpack. I'm trying to use the node 'fs' module in one of my files to require the index.js files from other directories. For example:
plugins
├── plugin1
│   └── index.js (simply exports an object)
├── plugin2
│   └── index.js (simply exports an object)
├── plugin3
│   └── index.js (simply exports an object)
└── index.js (want to require all index.js from each plugin directory here)
I'm getting an error with webpack saying: Can't resolve 'fs' in somepath/node_modules/require-dir
My index.js file, located at plugins/index.js, is simply trying to require my other files:
// module from npm which uses the 'fs' module (I'm not explicitly using it)
const requireDir = require('require-dir');
const allPlugins = requireDir('./plugins/');
console.log(allPlugins);
Can't resolve 'fs' in '/some_path/node_modules/require-dir'
You have two options here.
I haven't used this personally, but you can use the node config value, as specified here:
node: {
  fs: 'empty'
}
Set fs to any of the values true, "mock", "empty", or false.
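For context, in a webpack 4-style config that option sits at the top level of webpack.config.js; a minimal sketch (the entry/output values are assumptions):
// webpack.config.js (sketch, webpack 4-style node option)
module.exports = {
  entry: './plugins/index.js',
  output: {
    filename: 'bundle.js'
  },
  node: {
    fs: 'empty' // stub out fs so webpack can resolve it in the browser bundle
  }
};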
Don't use the fs module. It is a built-in/native module which may rely on native V8/C++ functions/libraries. Remember that webpack typically bundles assets for the browser. So instead of relying on a plugin, you can manually import your plugins like this:
plugins/index.js
const plugin1 = require('./plugin1')
const plugin2 = require('./plugin2')
module.exports = {
  plugin1,
  plugin2
}
You could also use this answer to polyfill the require-dir module.
Thanks to Francisco Mateo's additional link about polyfilling require-dir, I learned about the context method that webpack adds to require.
This allows me to do dynamic requires like so in my plugins/index.js file:
// require all index.js files inside of the /plugins directory
const context = require.context('.', true, /index\.js$/);

const loadPlugins = function () {
  const keys = context.keys();
  const values = keys.map(context);
  return values;
};
//array of values from each index.js require
console.log('loadPlugins', loadPlugins());
I am using Browserify to compile a large Node.js application into a single file (using options --bare and --ignore-missing [to avoid troubles with lib-cov in Express]). I have some code to dynamically load modules based on what is available in a directory:
var fs = require('fs'),
    path = require('path');

fs.readdirSync(__dirname).forEach(function (file) {
  if (file !== 'index.js' && fs.statSync(path.join(__dirname, file)).isFile()) {
    module.exports[file.substring(0, file.length - 3)] = require(path.join(__dirname, file));
  }
});
I'm getting strange errors in my application where arbitrary text files are being loaded from the directory my compiled file is loaded in. I think it's because paths are no longer set correctly, and because Browserify won't be able to require() the correct files when they are dynamically loaded like this.
Short of making a static index.js file, is there a preferred method of dynamically requiring a directory of modules that is out-of-the-box compatible with Browserify?
This plugin allows you to require glob patterns: require-globify.
Then, with a little hack, you can add all the files at compile time without executing them:
// Hack to compile glob files. Don't call this function!
function ಠ_ಠ() {
  require('views/**/*.js', { glob: true });
}
And, for example, you could require and execute a specific file when you need it :D
var homePage = require('views/'+currentView)
Browserify does not support dynamic requires - see GH issue 377.
The only method I am aware of for dynamically requiring a directory is a build step that lists the directory files and writes the "static" index.js file.
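A minimal sketch of such a build step, assuming the modules live next to the generated index.js (the lib directory name and the .js filter are assumptions):
// generate-index.js (sketch): writes a static index.js that requires each sibling module
var fs = require('fs');
var path = require('path');

var dir = path.join(__dirname, 'lib');
var files = fs.readdirSync(dir).filter(function (file) {
  return file !== 'index.js' && /\.js$/.test(file);
});

var lines = files.map(function (file) {
  var name = path.basename(file, '.js');
  return 'module.exports[' + JSON.stringify(name) + '] = require(' +
    JSON.stringify('./' + file) + ');';
});

fs.writeFileSync(path.join(dir, 'index.js'), lines.join('\n') + '\n');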
There's also the bulkify transform, as documented here:
https://github.com/chrisdavies/tech-thoughts/blob/master/browserify-include-directory.md
Basically, you can do this in your app.js or whatever:
var bulk = require('bulk-require');
// Require all of the scripts in the controllers directory
bulk(__dirname, ['controllers/**/*.js']);
And my gulpfile has something like this in it:
gulp.task('js', function () {
  return gulp.src('./src/js/init.js')
    .pipe(browserify({
      transform: ['bulkify']
    }))
    .pipe(rename('app.js'))
    .pipe(uglify())
    .pipe(gulp.dest('./dest/js'));
});
I am currently using requirejs to manage module js/css dependencies.
I'd like to discover the possibilities of having node do this via a centralized config file.
So instead of manually doing something like
define([
  'jquery',
  'lib/somelib',
  'views/someview'
])
within each module.
I'd have node inject the dependencies, i.e.
require('moduleA').setDeps('jquery','lib/somelib','views/someview')
Anyway, I'm interested in any projects looking at dependency injection for node.
thanks
I've come up with a solution for dependency injection. It's called injectr, and it uses node's vm library and replaces the default functionality of require when including a file.
So in your tests, instead of require('libToTest'), use injectr('libToTest', { 'libToMock': myMock });. I wanted to make the interface as straightforward as possible, with no need to alter the code being tested. I think it works quite well.
It's just worth noting that injectr files are relative to the working directory, unlike require which is relative to the current file, but that shouldn't matter because it's only used in tests.
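A minimal test sketch following the call shown above; the file path and the mock's shape are placeholders, not injectr's real fixtures:
// test.js (sketch)
var injectr = require('injectr');

// mock for the dependency that libToTest would normally require
var myMock = {
  readFile: function (path, cb) { cb(null, 'mocked contents'); }
};

// the path is relative to the working directory, as noted above
var libToTest = injectr('./lib/libToTest.js', { libToMock: myMock });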
I've previously toyed with the idea of providing an alternate require to make a form of dependency injection available in Node.js.
Module code
For example, suppose you have the following statements in code.js:
var fs = require('fs');
console.log(fs.readFileSync('text.txt', 'utf-8'));
If you run this code with node code.js, then it will print out the contents of text.txt.
Injector code
However, suppose you have a test module that wants to abstract away the file system.
Your test file test.js could then look like this:
var origRequire = global.require;
global.require = dependencyLookup;
require('./code.js');

function dependencyLookup(file) {
  switch (file) {
    case 'fs': return { readFileSync: function () { return "test contents"; } };
    default: return origRequire(file);
  }
}
If you now run node test.js, it will print out "test contents", even though it includes code.js.
I've also written a module to accomplish this; it's called rewire. Just run npm install rewire and then:
var rewire = require("rewire"),
myModule = rewire("./path/to/myModule.js"); // exactly like require()
// Your module will now export a special setter and getter for private variables.
myModule.__set__("myPrivateVar", 123);
myModule.__get__("myPrivateVar"); // = 123
// This allows you to mock almost everything within the module, e.g. the fs module.
// Just pass the variable name as the first parameter and your mock as the second.
myModule.__set__("fs", {
  readFile: function (path, encoding, cb) {
    cb(null, "Success!");
  }
});
myModule.readSomethingFromFileSystem(function (err, data) {
  console.log(data); // = Success!
});
I was inspired by Nathan MacInnes's injectr but took a different approach. I don't use vm to eval the test module; in fact, I use node's own require. This way your module behaves exactly as if it were loaded with require() (except for your modifications). Debugging is also fully supported.