evaluate module in another module's context - javascript

I know that this is very un-Node/CommonJS-y—forgive me. (I'm writing a library of sorts and I'd like my library's require method to work exactly the same on the browser and on NodeJS.)
What I'd like to be able to do is evaluate the script in the context of the current module; that is, if I say exports.a = "100"; in the module, I'd like exports.a to equal "100" in all of the code in the requiring module after the require call.
If this isn't clear, I'd be happy to elaborate.

This won't be a complete answer, but hopefully will help you get in the right direction.
I've been messing with Node's system of creating modules for the last couple of days. Basically I wanted to create some modules that were invoked in an entirely fresh context and variable scope, for which I would define a limited subset and extension of Node's capabilities.
I ended up studying their source here, and giving particular attention to the NativeModule constructor and its methods.
You'll notice that the source of the module is read from a file, wrapped in a string representing a function and eval'd into actual code.
The wrapper:
NativeModule.wrapper = [
  '(function (exports, require, module, __filename, __dirname, define) { ',
  '\n});'
];
The wrapper function is then invoked, which runs the contained module code.
As you can see from the wrapper, the function takes half a dozen arguments, the first of which is the exports object (which starts off empty). It's also passed the require function, which is why you can access require as a variable even though require isn't a global.
The module code populates the exports object, and that object is then cached so the work doesn't need to be repeated: the next time require( 'someModule' ) is invoked, it just looks up the cached exports object and returns it.
I'm sure you could do something like this in your code as long as you can get the source for the module that you want to require.
Perhaps SomeModule.toString() will be sufficient for you. Not sure how consistent browser support is though.
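As a rough illustration of that wrap-and-cache approach, here is a minimal sketch (not Node's actual implementation; myRequire, moduleCache, and getSource are made-up names, and getSource stands in for however you obtain a module's source text):
var moduleCache = {};

function myRequire(name) {
  // Return the cached exports if this module was already loaded
  if (moduleCache[name]) return moduleCache[name].exports;

  var module = { exports: {} };
  moduleCache[name] = module;

  // getSource(name) is assumed to return the module's source code as a string
  var source = getSource(name);
  var wrapped = '(function (exports, require, module) {\n' + source + '\n})';

  // eval the wrapper to get a function, then invoke it with this module's own exports object
  var fn = eval(wrapped);
  fn(module.exports, myRequire, module);

  return module.exports;
}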
There's also a private API that's used to set up the environment for the modules.
process.binding('evals').Script
/*
{ [Function: Script]
  createContext: [Function],
  runInContext: [Function],
  runInThisContext: [Function],
  runInNewContext: [Function] }
*/
I ended up needing to use createContext and runInContext to get things working, but I'd guess you probably won't need anything like this.
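For reference, the public equivalent of that private binding nowadays is the built-in vm module. A minimal sketch of running code in a fresh context (the sandbox contents here are just for illustration):
const vm = require('vm');

// A plain object becomes the global scope of the new context
const sandbox = { exports: {}, console: console };
vm.createContext(sandbox);

// Run arbitrary source inside that isolated context
vm.runInContext('exports.a = "100"; console.log("ran in the sandbox");', sandbox);

console.log(sandbox.exports.a); // "100"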

(I'm writing a library of sorts and I'd like my library's require method to work exactly the same on the browser and on NodeJS.)

If I understand you correctly (please forgive me if not ;)), you are looking for something like node-browserify.
Browserify: browser-side require() for your node modules and npm packages.
Just point a javascript file or two at browserify and it will walk the AST to read all your require()s recursively. The resulting bundle has everything you need, including pulling in libraries you might have installed using npm!
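For example, a file like this can be written with plain CommonJS requires and then bundled for the browser (the module name and file names are just illustrative; the bundling command is shown as a comment):
// main.js: uses a CommonJS require that also works in the browser once bundled.
// Bundle it with: browserify main.js -o bundle.js
// and then load bundle.js with a plain <script> tag.
var uniq = require('uniq'); // an npm package installed with `npm install uniq`

console.log(uniq([1, 1, 2, 3, 3])); // [1, 2, 3]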

Related

Instantiate an object defined in module A and use its methods in module B javascript

I'm working on a web development project.
Right now I'm using a 3rd-party library, and I instantiate an object of that library in a file called, let's say, fileA.js,
so I do:
import libraryExport from "./librarymain.js"
var object = libraryExport( ... );
export default object;
Now, in fileB.js I want to use the methods of the instantiated object, for example:
import object from "fileA.js"
object.methodOfTheLibrary();
However, when I run this in my browser console I always get "methodOfTheLibrary is not a function", which suggests to me that the library is not being imported properly in fileB.js.
Note: I'm using webpack to bundle all of my files, and everything was compiling and bundling just fine until I ran into these issues. I know my way around C++ at a fairly advanced level, but with JS I still don't fully understand how to solve this kind of import issue.
Thank you for any help
It's generally* recommended that you avoid using an imported module directly in the body of another module. (One of your own modules, that is... as you'll see in a moment, third-party modules are generally fine to use.)
The big issue is load order. If your fileA also imports some other module, which imports another module which then imports fileB, then fileB will attempt to run and, since fileA is still trying to load dependencies, object will not actually have been instantiated yet.
You'll either need to carefully review your dependencies to find and eliminate such a loop or, if possible, restructure the consuming module (fileB here) to wrap your code in a function that can be called once the entry point has finished loading (which guarantees that all other modules have been resolved):
// fileB.js
import object from "./fileA.js";

// Defer using the imported object until init() is explicitly called
export function init() {
  object.methodOfTheLibrary();
}

// main.js (the entry point)
import { init } from "./fileB.js";

init();
* "generally" meaning that there are plenty of perfectly acceptable situations where it's fine. You just have to mindful of the pitfalls and mitigate against those situations.

Jest: automock modules, but only those defined in __mocks__, rather than all

TL;DR
I would like to have some kind of automock feature enabled, but only (and only!) for the modules that I have explicitly defined in a corresponding __mocks__ folder. Is there a way to do this in Jest?
General advice and suggestions are also welcome.
A bit of context: (optional)
Turns out I was totally misunderstanding Jest's automock feature. Btw, looking back now, I don't understand why, 'cause the docs are pretty clear on what it actually does:
This option tells Jest that all imported modules in your tests should be mocked automatically.
It's as if I just didn't notice the ALL keyword. Maybe I was just thinking: it doesn't make sense to have automock apply even to the imported function that I'm actually going to test here, does it? Obviously I would like to automock third-party stuff from node_modules, but not my own code. And it turns out that:
Note: Node modules are automatically mocked when you have a manual mock in place (e.g.: __mocks__/lodash.js).
Note: Core modules, like fs, are not mocked by default. They can be mocked explicitly, like jest.mock('fs')
So it's kind of doing the opposite of what I thought it was doing.
If I understand correctly, you have certain mock files in your __mocks__ folder which you would like to be globally replaced whenever they're needed as part of a test file imports.
Jest has a means of configuring that. If you go through the Jest Configuration documentation, you'll find a property named moduleNameMapper, which takes an object whose keys are regular expressions and whose values are the paths to the mocks for the modules that match those expressions.
In your jest.config.js, you would need to add a separate parameter and specify the mock pattern and path.
moduleNameMapper: {
  "^bootstrap$": "<rootDir>/tests/__mocks__/bootstrapMock.js",
  "^axios$": "<rootDir>/tests/__mocks__/axiosMock.js"
}
You can find more info about the usage in the Jest docs.
Alternatively, if you do not want to go through all this, you can place a __mocks__ folder adjacent to your node_modules folder. However, as the documentation for Manual Mocks notes:
If we want to mock Node's core modules (e.g.: fs or path), then explicitly calling e.g. jest.mock('path') is required, because core Node modules are not mocked by default.
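A minimal sketch of that caveat, assuming a manual mock for fs placed in a __mocks__ folder adjacent to node_modules (the file names and mocked return value are illustrative):
// __mocks__/fs.js
// Start from an auto-generated mock of the real module, then override what the tests need
const fs = jest.createMockFromModule('fs');
fs.readFileSync = jest.fn(() => 'mocked file contents');
module.exports = fs;

// readConfig.test.js
jest.mock('fs'); // required: core modules are not mocked automatically
const fs = require('fs');

test('uses the manual fs mock', () => {
  expect(fs.readFileSync('/etc/whatever.conf')).toBe('mocked file contents');
});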

Exporting outside webpack

This is just something I thought about today, and since I didn't see much information on it, I'm going to share these odd cases and how I personally solved them (if there's a better way please comment, but meanwhile this might help others ^^)
In a regular module, you would do something like this to export your function/library/object/data:
// regular NodeJS way:
module.exports = data;

// ES6 way
// (webpack will transpile this to the regular way using the module variable)
export { data };
// or, as a default export:
export default data;
When compiling a library, usually babel or tsc is used on its own, but if for any reason you want not only to compile (transpile) your library but also to pack it with webpack, you will run into this case.
As you know, in a webpack bundle the module variable is local to the bundle (every module/file gets wrapped in a function where module is a parameter, i.e. a local variable), so nothing really gets exported outside the bundle; it's all just managed internally by webpack.
That also means you can't access the bundle's contents using the regular require/import mechanisms.
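As a simplified illustration of that wrapping (not webpack's actual generated code, just the general shape of it):
// Roughly what each bundled file looks like inside the output bundle:
var __webpack_modules__ = {
  './src/index.js': function (module, exports, __webpack_require__) {
    // `module` here is webpack's own per-file object, not Node's real module,
    // so this assignment never leaves the bundle:
    module.exports = { hello: 'world' };
  },
};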
In some cases you might find it necessary to export outside webpack (i.e. you are trying to build a library using webpack and you want it to be consumable by other people). This basically means you need to access the original module variable, but webpack doesn't expose it the way it does with __non_webpack_require__.
See also: Importing runtime modules from outside webpack bundle
The solution is to create our own __non_webpack_module__ (just as webpack does with __non_webpack_require__).
The way I did it is by using webpack.BannerPlugin to inject some code outside the bundle wrapper. This code is prepended to the build after minification is done, so it's preserved safely.
In your webpack.config.js:
const { BannerPlugin } = require('webpack');

// ...
plugins: [
  new BannerPlugin({
    raw: true,
    banner: `const __non_webpack_module__ = module;`,
  }),
]
And if you are using TypeScript, declare it in global.d.ts:
declare const __non_webpack_module__: NodeModule;
And now, you can do something like this in your code:
__non_webpack_module__.exports = /* your class/function/data/whatever */
This will allow you to import it as usual from other files.
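For example, a consumer outside the bundle could then load the built file like any CommonJS module (the path and method name here are hypothetical):
// consumer.js (not processed by webpack)
const myLibrary = require('./dist/my-library.bundle.js');

myLibrary.doSomething();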
Tip: You might want to look at the other BannerPlugin options, like include or exclude, so this variable is only generated for the desired files, etc.

Using external javascript files in node.js without using export and require

I don't like the whole export/require stuff in node; it takes too long. Let's say I have a file server.js and I want to use functions from whatever.js. In HTML I'd just add this to the header:
<script src='whatever.js'></script>
and then I can just use all the functions of whatever.js in my body's script.
But in node, in the server.js file I'd do:
var myobject = require('./whatever.js');
but then I need to assign it to myobject, and furthermore I need to go into whatever.js and manually decide which functions I want to export. Not to mention that typing myobject.someFunction() is a lot longer to write than someFunction(), and I need to remember what I did and didn't expose.
I wanted something where I could just go:
require('./whatever.js');
and it puts it ALL in global, no bs, like in good old html/javascript. Is there a way to do this in node?
This will do the trick:
var fs = require('fs');
eval(fs.readFileSync('whatever.js')+'');
// here call functions from whatever.js file
(I realize this is an old thread but wanted to leave a note here for posterity.)
Here in 2022 there are several approaches for executing code from different files with Node.js:
ESM: Use standard ECMAScript modules
At the time of this writing, much of the Node ecosystem (i.e. packages on npm) is in the process of transitioning to this paradigm, and there are some associated growing pains (e.g. things like __dirname are only available in CJS, not ESM, though the workaround is easy).
For most developers, it is advisable to become comfortable with this standard, as it transcends Node.js (i.e. it is implemented in other runtimes like Deno and web browsers) and has been years in the making.
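A minimal sketch of the ESM style, including the usual workaround for __dirname mentioned above (file names are illustrative):
// math.mjs: a standard ES module
export function add(a, b) {
  return a + b;
}

// main.mjs
import { add } from './math.mjs';
import { fileURLToPath } from 'url';
import { dirname } from 'path';

// __dirname only exists in CJS; this is the common ESM workaround
const __dirname = dirname(fileURLToPath(import.meta.url));

console.log(add(2, 3), __dirname);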
CJS: Use the original "CommonJS" module mechanism, e.g. require('./some-script.js')
It should be noted, particularly for the OP, that even though the "intended" way to use CJS modules is to export functions, constants, etc. and import them explicitly, it is possible to define everything in global scope using globalThis, though I would not recommend this.
// my-script.js
require('./foo.js');
require('./bar.js');
foo(); // This is foo from <...>foo.js
console.log(`bar = ${bar} (in ${__filename})`); // bar = 123 (in <...>my-script.js)
// foo.js
globalThis.foo = function() {
  console.log(`This is foo from ${__filename}`);
};
// bar.js
globalThis.bar = 123;
If you try omitting globalThis. you'll find that foo and bar are no longer defined in the main script because require "wraps them" in "module scope."
Use eval
In my experience, there are very few legitimate use cases for eval (see Never use eval()!). Nevertheless, the functionality requested in this question is precisely what eval provides: "run some code as if it were written right here." You can feed it source from a file, as explained above by Mehul Prajapati.
// include.js
// Defines a global function that works like C's "#include" preprocessor directive
const { readFileSync } = require('fs');
globalThis.include = function(scriptFile) {
  console.warn('!!! EXTREMELY INSECURE !!!');
  eval(readFileSync(scriptFile, 'utf-8'));
};
// main.js
require('./include.js'); // loads global include
// (I sure hope you completely trust these sources)
include('./foo.js');
include('./bar.js');
Note: Something that has contributed to much of my confusion in the past is that there have been competing standards/conventions/APIs that use some of the same identifiers, namely require, which RequireJS and other loaders/bundlers that support AMD (Asynchronous Module Definition) use with different semantics. So for someone building a web application (using AMD for modules in web browsers) with Node.js tooling (using CJS for modules locally), it can be frustrating to keep the functions straight, especially in an Electron application, which can expose Node.js APIs to scripts running in the renderer (browser). If you find yourself confused about why a module is "not found" in a situation like that, check the stack trace to see which require is being called (and you may have to wrap/rename them on globalThis or something to avoid collisions).
Further reading:
JavaScript Modules: A Brief History [2019]
How the module system, CommonJS & require works [updated 2022]
What is AMD, CommonJS, and UMD? [2014]

Use r.js to package a library written using RequireJS for reuse in other RequireJS applications

So let's say I have some small bit of library code that I develop and test in isolation. I use RequireJS during development and have a root-level file that depends on one other file, so its define looks something like...
// lib/main.js
define(['lib/dep1'], function(dep1) {
  ...
})
I run r.js on the code which results in dist/myLibrary.js, which looks something like this:
define('lib/dep1',[], function(){...})
define('lib/main',["lib/dep1"], function(dep1){...})
If I pull myLibrary.js straight into another project it won't work: nothing defines itself as the module for that file. But if I append an actual module definition, it works:
define('lib/dep1',[], function(){...})
define('lib/main',["lib/dep1"], function(dep1){...})
define(['lib/main'], function(lib) {
return lib;
})
And the ['lib/main'] seems to be scoped to the module, because if I have an actual lib/main in my app, it doesn't get used.
Questions:
Regarding the scoping, is that the normal behavior? That is, lib/main is recognized as a module id from within the same file rather than being looked up someplace else; if I import 10 such libraries that all have a lib/main, they won't collide?
Is there a better way? I am, at least initially, unconcerned about supporting the non-AMD use case, since this is all internal lib development and we all use RequireJS. So within a fully AMD environment, is there another, better way to do this? Assuming there's no pitfall to this approach, it seems like fairly simple boilerplate to support.
