I am new to webpack and node, and I am wondering how to use the
__non_webpack_require__
function. I've visited webpack's website but am still confused as to what this function is and how I can use it. Could you provide a short description of a use case for this function and then how to use it in a node / react app?
Webpack processes every module that you use in your application, starting from the entry point(s) and following every module you import (with import or require), and includes them all in your bundle. __non_webpack_require__ is a function that webpack leaves as a plain require call in the generated output.
Let's take this entry point as an example:
const processedByWebpack = require("./module");
const notProcessed = __non_webpack_require__("./non-webpack");
console.log(processedByWebpack);
console.log(notProcessed);
Webpack will bundle that application and include every module you import, which in this case is only ./module.js. So the output will be:
/******/ ([
/* 0 */
/***/ (function(module, exports, __webpack_require__) {
const processedByWebpack = __webpack_require__(1);
const notProcessed = require("./non-webpack");
console.log(processedByWebpack);
console.log(notProcessed);
/***/ }),
/* 1 */
/***/ (function(module, exports) {
module.exports = "This module is bundled with webpack"
/***/ })
/******/ ]);
The ./module.js module was included in the bundle and would also have been processed by any loaders if applicable rules were present. On the other hand, ./non-webpack.js is not included in the bundle; webpack turned it into a plain require call. This means that ./non-webpack.js is only resolved when the bundle is executed, and it will fail with a runtime error if it isn't available or contains invalid JavaScript.
__non_webpack_require__ is a way to work around the fact that webpack processes all require calls. Because webpack bundles up all the modules, it must know which modules to include at compile time. This makes require more restrictive than it actually is in Node.js. For instance, you can't use fully dynamic requires, which means you can't use a variable as the module's path (see also webpack dynamic module loader by require). For example:
// Path to module as a variable (could be an argument to a function)
const modulePath = "./module";
const processedByWebpack = require(modulePath); // Fails
const notProcessed = __non_webpack_require__(modulePath);
With the regular require, webpack will fail because it doesn't know which modules to include to cover everything that could be referenced at runtime. In this example it might seem obvious, but it could go as far as using user input to determine the module to load. With __non_webpack_require__, webpack simply emits a require call and you'll have to deal with possible Module not found exceptions at runtime.
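For example, a minimal sketch of dealing with that at runtime (the plugins directory and plugin name here are made up for illustration):
// resolve a module path at runtime and handle a missing module gracefully
function loadPlugin(name) {
  try {
    return __non_webpack_require__(`./plugins/${name}.js`);
  } catch (err) {
    if (err.code === "MODULE_NOT_FOUND") {
      console.warn(`Plugin "${name}" could not be found`);
      return null;
    }
    throw err;
  }
}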
When should you use it?
Probably never. It's one of those functions that should be considered a last resort for when you need to sidestep webpack to get some dynamic module resolution. In most situations there are other solutions to achieve the same goal (e.g. deferring imports to runtime by using externals); everything else is an edge case.
You will have noticed that __non_webpack_require__ is transformed into a require call. This means that it only works in Node.js and fails in any browser environment, unless you have a global require defined that may or may not do something special. Another downside is that it is webpack-specific, and when you want to use another tool (for instance for testing), it won't work or you'll have a hard time working around it.
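If the same bundle may also run in a browser, one hedged workaround (a sketch, not an official webpack feature) is to only call it when you detect Node.js at runtime:
// only fall back to a real require when actually running under Node.js;
// in a browser the emitted plain `require` would throw a ReferenceError
let nativeFs = null;
if (typeof process !== "undefined" && process.versions && process.versions.node) {
  nativeFs = __non_webpack_require__("fs");
}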
Related
This is just something I ran into today, and I didn't find a lot of information about it, so I'm going to share these weird cases and how I personally solved them (if there's a better way please comment, but meanwhile this might help others ^^)
In a regular module, you would do something like this to export your function/library/object/data:
// regular NodeJS way:
module.exports = data;
// ES6 way
// ES6 way
// (will get transpiled by webpack to the regular way using the module variable)
export { data };
export default data;
When compiling a library, usually Babel or tsc is used, but if for any reason you want not only to compile (transpile) your library but also bundle it with webpack, you will run into this case.
As you know, in a webpack bundle the module variable is local to the bundle (every module/file gets wrapped in a function where module is a parameter, i.e. a local variable), so nothing really gets exported outside the bundle; it's all just nicely managed by webpack.
That also means you can't access the contents from outside using the regular require/import mechanisms.
In some cases you might find it necessary to export outside of webpack (i.e. you are trying to build a library using webpack and you want it to be accessible to other people). This basically means you need to access the original module variable, but webpack doesn't expose it the way it exposes __non_webpack_require__.
See also: Importing runtime modules from outside webpack bundle
The solution is to create our own __non_webpack_module__ (just as webpack does with __non_webpack_require__).
The way I did it is by using webpack.BannerPlugin to inject some code outside the bundle. This code is prepended to the build after minification is done, so it's preserved safely.
In your webpack.config.js:
plugins: [
  new BannerPlugin({
    raw: true,
    banner: `const __non_webpack_module__ = module;`,
  }),
]
And again, if you are using TypeScript, declare it in global.d.ts:
declare const __non_webpack_module__: NodeModule;
And now, you can do something like this in your code:
__non_webpack_module__.exports = /* your class/function/data/whatever */
This will allow it to be imported as usual from other files.
Tip: You might want to look at BannerPlugin to check other options, like include or exclude, so this variable is only generated for the desired files, etc.
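For reference, a minimal sketch of what the full config might look like (the entryOnly/include options and the "main" pattern shown here are illustrative choices, not requirements):
// webpack.config.js (sketch)
const { BannerPlugin } = require("webpack");

module.exports = {
  // ...entry, output, loaders, etc...
  plugins: [
    new BannerPlugin({
      raw: true,                 // inject as code, not as a comment
      entryOnly: true,           // only prepend to entry chunks
      include: /main/,           // limit to matching output file names
      banner: "const __non_webpack_module__ = module;",
    }),
  ],
};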
I am trying to answer: when to use import/export and when to use require()/module.exports? But as I try to dig in, it seems to get complicated.
Here's my understanding
require()/module.exports: this is Node.js's implementation of the module system (CommonJS). It loads modules synchronously.
With ES6, we can use import/export. The docs say:
The import statement is used to import bindings which are exported by another module. Imported modules are in strict mode whether you declare them as such or not. The import statement cannot be used in embedded scripts unless such script has a type="module".
Ques1: How does this work with babel or webpack or browsers in general?
As I was exploring I came across stuff like CommonJS, RequireJS, and Asynchronous Module Definition (AMD).
Ques2: I am more interested in knowing the timeline of how these things evolved in JavaScript?
How does this work with babel or webpack or browsers in general?
Babel and webpack follow the ES spec. As they also support the CommonJS require syntax, they usually transpile import statements to require() calls and export statements to module.exports assignments, and then ship a custom loader for the modules (webpack additionally bundles everything into one single file). If you have, for example:
// A.js
export default function() { }
// B.js
import A from "./A";
A();
Then it gets transpiled to the following require syntax:
//A.js
exports.default = function() {};
//B.js
var A = require("./A").default;
A();
That could then get wrapped to something like:
(function() { // prevent leaking to the global scope
  // emulated loader:
  var modules = {};
  function require(name) { return modules[name].exports; }
  function define(name, fn) {
    var module = modules[name] = { exports: {} };
    fn(module, module.exports, require);
  }
  // The code:
  define("A", function(module, exports, require) {
    // A.js
    exports.default = function() { };
  });
  define("B", function(module, exports, require) {
    // B.js
    var A = require("A").default;
    A();
  });
})();
how these things evolved in javascript ?
A few years ago, writing JS was restricted to browsers, and the only way to load multiple js sources was to use multiple <script> tags and use the global object to exchange functionality. That was ugly.
Then Node.js was invented; it needed a better way to work with modules, so the require() mechanism was introduced.
The writers of the spec saw a need for a native syntax for that, so import / export were introduced.
Babel and others then wrote transpilers.
What webpack the bundler does is the following:
You specify an input file in the config
You specify an output file in the config
Webpack will look at all the files which the input file requires (CommonJS module system) or imports (ES6 module system). It then funnels the code, based on file name extension, through loaders. Loaders can transpile the individual files into code the browser can understand. An example of a loader is babel-loader or the Sass/SCSS compiler (see the config sketch below).
After the different files are transpiled with loaders, plugins can transform the bundle of generated code into something else. The bundle is just a bunch of code which together forms a piece of functionality.
I won't go into the internals of webpack too deeply, but the most important thing to understand is:
You use webpack so you can split up your code into multiple files, which makes it more maintainable and easier to work with. However, requesting all these files from the client would be horrible for performance (the overhead of many HTTP requests). Therefore, we bundle the files into one file, or a couple, so this overhead is reduced.
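As an illustration, here is a minimal config sketch of those pieces (the file names and the babel-loader rule are assumptions for the example, not something from the question):
// webpack.config.js -- minimal sketch (assumes babel-loader and a Babel preset are installed)
const path = require("path");

module.exports = {
  entry: "./src/index.js",                  // the input file
  output: {
    path: path.resolve(__dirname, "dist"),  // where the bundle is written
    filename: "bundle.js",                  // the output file
  },
  module: {
    rules: [
      {
        test: /\.js$/,                      // funnel .js files...
        exclude: /node_modules/,
        use: "babel-loader",                // ...through the Babel loader
      },
    ],
  },
};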
Generally, you should write all modern code with import/export syntax if you are using a bundler like webpack, or translating with Babel... npm modules may favor require/module syntax but you can still import them.
Also worth noting is the import() method which returns a promise that should resolve to the root module being imported asynchronously. Webpack may bundle these as asynchronous modules if you have it configured to do so.
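For example, a small sketch (the './heavy-chart' module and its drawChart export are made up for illustration):
// webpack can emit the dynamically imported module as a separate chunk,
// loaded only when this function runs
async function renderChart(element) {
  const { drawChart } = await import("./heavy-chart"); // import() returns a promise
  drawChart(element);
}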
In practice, resolution through tooling like Babel and webpack falls back to Node-style behavior with the node_modules lookup, while the standard itself only specifies module paths (favoring relative paths); additional resolution support is per environment.
You can experiment with ESM support in modern browsers and in current Node.js (behind a flag as of this answer). The behaviors are somewhat inconsistent, but well defined. When in doubt, experiment and try.
I don't like the whole export/require stuff in Node; it takes too long. Let's say I have a file server.js and I want to use functions from whatever.js. In HTML I would just add this to the header:
<script src='whatever.js'></script>
and then I can just use all the functions of whatever.js in my body's script.
But in node, in the server.js file I'd do:
var myobject = require('./whatever.js');
but then I need to assign it to myobject, and furthermore I need to go into whatever.js and manually decide which functions I want to export. Not to mention that typing myobject.someFunction() is a lot longer to write than someFunction(), and I need to remember what I did and didn't expose.
I wanted something where I could just go:
require('./whatever.js');
and it puts it ALL in global, no bs. Like in good old HTML/JavaScript. Is there a way to do this in Node?
This will do the trick:
var fs = require('fs');
eval(fs.readFileSync('whatever.js')+'');
// here call functions from whatever.js file
(I realize this is an old thread but wanted to leave a note here for posterity.)
Here in 2022 there are several approaches for executing code from different files with Node.js:
ESM: Use standard ECMAScript modules
At the time of this writing, much of the node ecosystem (i.e. packages on npm) is in the process of transitioning to this paradigm, and there are some associated growing pains (e.g. things like __dirname are only available in CJS not ESM, though the workaround is easy).
For most developers, it would be advisable to become comfortable with this standard, as it transcends Node.js (i.e. it is implemented in other runtimes like Deno and web browsers) and has been years in the making.
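A minimal sketch of this style (the file names are illustrative; .mjs, or .js with "type": "module" in package.json, opts into ESM):
// math.mjs
export function add(a, b) {
  return a + b;
}

// main.mjs -- run with: node main.mjs
import { add } from "./math.mjs";
console.log(add(2, 3)); // 5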
CJS: Use the original "CommonJS" module mechanism, e.g. require('./some-script.js')
It should be noted, particularly for the OP, that even though the "intended" way to use CJS modules is to export functions, constants, etc. and import them explicitly, it is possible to define everything in global scope using globalThis, though I would not recommend this.
// my-script.js
require('./foo.js');
require('./bar.js');
foo(); // This is foo from <...>foo.js
console.log(`bar = ${bar} (in ${__filename})`); // bar = 123 (in <...>my-script.js)
// foo.js
globalThis.foo = function() {
console.log(`This is foo from ${__filename}`);
}
// bar.js
globalThis.bar = 123;
If you try omitting globalThis. you'll find that foo and bar are no longer defined in the main script because require "wraps them" in "module scope."
Use eval
In my experience, there are very few legitimate use cases for eval (see Never use eval()!). Nevertheless, the functionality requested in this question is precisely what eval provides: "run some code as if it were written right here", and you can feed it from a file, as explained above by Mehul Prajapati.
// include.js
// Defines a global function that works like C's "#include" preprocessor directive
const { readFileSync } = require('fs');
globalThis.include = function(scriptFile) {
console.warn('!!! EXTREMELY INSECURE !!!');
eval(readFileSync(scriptFile, 'utf-8'));
};
// main.js
require('./include.js'); // loads global include
// (I sure hope you completely trust these sources)
include('./foo.js');
include('./bar.js');
Note: Something that has contributed to much of my confusion in the past is that there have been competing standards/conventions/APIs that use some of the same identifiers, namely require, which require.js and other loaders that support AMD (Asynchronous Module Definition)
use with different semantics. So for someone building a web application (using AMD for modules in web browsers) with Node.js tooling (using CJS for modules locally), it can be frustrating to keep the functions straight, especially in an Electron application, which can expose Node.js APIs to scripts running in the renderer (browser). If you find yourself confused about why a module is "not found" in a situation like that, check the stack trace to see which require is being called (and you may have to wrap/rename them on globalThis or something to avoid collisions).
Further reading:
JavaScript Modules: A Brief History [2019]
How the module system, CommonJS & require works [updated 2022]
What is AMD, CommonJS, and UMD? [2014]
In a current webpack project my code is partitioned into modules, each of which has a main entry point with a .module.js ending. E.g. for the module Util, the entry point is util.module.js and I include it in other modules by writing import util from 'modules/util/util.module.js'
However, since all modules have the same syntax for the entry point, I would rather avoid specifying the full filename of the entry. E.g, it would be nice if I could just write import util from 'modules/util'.
I know this works if util.module.js is named index.js, but how can I tell webpack to pick up my current module entries in the same way?
I'll attempt to outline some readily available approaches, starting from your requirements and progressing towards something that's more conventional.
I sense that some overengineering of build tools might be creeping in, so I'd highly recommend you think again whether all of this is really necessary and reconsider the conventional approach.
It is analogous to how others would expect modules to get resolved, does not require webpack, and will not break other tools that internally attempt to resolve modules according to this strategy, like eslint-plugin-import.
NormalModuleReplacementPlugin
Since you need to include the directory name in the module you're trying to resolve, you can use the NormalModuleReplacementPlugin to construct the resolve path dynamically:
new webpack.NormalModuleReplacementPlugin(/modules\/(\w+)$/, (result) => {
  result.request += `/${result.request.split('/').pop()}.module.js`
})
You'd need to play around with this to ensure all valid names are covered.
resolve.extensions
If you really want to somehow distinguish these modules, then you could do so by providing a specific extension for them:
modules/util/index.module.js
modules/dashboard/index.module.js
It would be enough to configure:
{
  resolve: {
    extensions: ['.js', '.module.js']
  }
}
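With that config (and webpack's default resolve.mainFiles of ['index']), the directory import from the question resolves to the .module.js entry, e.g.:
// assuming modules/util/index.module.js exists and 'modules' is resolvable
// (a relative path or listed in resolve.modules):
import util from 'modules/util'; // -> modules/util/index.module.js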
Package main
Another approach would be to extract all your modules as packages, specifying {main} in package.json.
This will set you on a path towards a monorepository.
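For example, a sketch of modules/util/package.json (the name and file are illustrative):
{
  "name": "util",
  "main": "util.module.js"
}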
Conventional
The conventional approach would be to name the modules index.js:
modules/util/index.js
modules/dashboard/index.js
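Then the import from the question works with webpack's default resolution (assuming 'modules' is resolvable, as in your existing imports):
import util from 'modules/util';           // -> modules/util/index.js
import dashboard from 'modules/dashboard'; // -> modules/dashboard/index.js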
I am just getting used to CoffeeScript and I have gotten stuck with classes.
I want to have my files structured like in Node so that I can require a JavaScript file containing a class, like this.
Test = require "test.js"
Test.start()
Where start is a method of the Test Class.
Is this possible?
Is this possible?
Not exactly like in Node. There is no synchronous require in browser environments. Yet, you could try one of the many asynchronous module libraries to do that; have a look at AMD. The most famous implementation is require.js.
I've found that the simplest way to use CommonJS modules (the ones that Node.js uses) in browser environments is to use Browserify. I personally also prefer the CommonJS module definitions to the AMD ones, but that's just personal taste.
Also, take into account that in order to export your classes so that require 'test' will give you the class constructor directly, you would have to assign your class to module.exports:
# In test.coffee
module.exports = class Test
  @start: -> console.log 'start!'
Then compile that file to test.js and you're ready to use it:
Test = require './test'
Test.start()
In Node.js this will Just Work. In browsers, you will need to process the files first with Browserify (or some other tool) to get it working (it will create the proper require function as well as some exports and module.exports variables for CommonJS modules to work correctly).
Take a look at stitch, hem (which is inspired by stitch, but has more neat features) and browserify.
Personally, I prefer hem. You can do something like this with it:
# app/lib/model.coffee
module.exports = class Model
...
.
# app/lib/util.coffee
helper1 = -> ...
helper2 = -> ...
module.exports = {helper1, helper2}
.
# app/index.coffee
Model = require 'lib/model'
{helper1} = require 'lib/util'
# do whatever you want with required stuff
...
Hem takes care of compiling CoffeeScript on the fly and bundling all the needed code together (it also supports npm modules and arbitrary js libs as external dependencies for your code, see the docs for more details).