There is isomorphic-webcrypto, which claims to do this but doesn't: it produces a separate build for each target.
There is the noble-crypto approach, but it relies on if-else conditions and fails when I want isomorphic .mjs code.
Finally, there is the eval-require trick to slip past the bundler, but Node fails to use it in an .mjs file.
In brief:
const crypto = require("crypto"); // work only in node.js but not in mjs file.
const crypto = eval(`require("crypto")`); // pass-thru bundler, then work only in node.js but not in mjs file.
window.crypto; // work only in browser
import * as crypto from "crypto"; // could work from both but must be at top level of a module, so it can't be a conditional import.
I would like to use native crypto in Node.js and in the browser in an isomorphic way, so that native .mjs imports work transparently in both environments.
How can I do this?
Alright. Ready for something ugly? :-) Behold, my latest hackjob… IsomorphicCrypto.js:
export default
globalThis.crypto ||
(await import('node:crypto')).default.webcrypto
;
This works in Node.js v16 in module mode ("type": "module" in package.json, or equivalent CLI args), and will probably work with your bundler too… but who knows. ;-) Anyone using this code snippet should test thoroughly on whatever platforms they want to use it on.
In a nutshell:
We use globalThis, which maps to global under Node.js, window in most browser contexts, and self in a Worker context.
We first check to see if crypto is a thing. If it is, we're probably on a browser, and can just use that directly.
If crypto is not a thing, we're probably on Node.js and we need to import the module.
Because this is done dynamically, we need a dynamic import() rather than a true import.
import() is async and returns a Promise. But hey, it's all good, because top-level await is a thing in Node.js now!
To then use the module:
import crypto from './lib/IsomorphicCrypto.js';
console.log( crypto.randomUUID() );
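If top-level await isn't available (older Node.js versions, or some bundler configurations), roughly the same idea can be wrapped so the module exports a Promise instead. This is only a sketch along the same lines, with a hypothetical file name, not part of the original snippet:

// IsomorphicCryptoPromise.js (hypothetical variant): export a Promise, so no top-level await is needed
export default (async () => {
  return globalThis.crypto || (await import('node:crypto')).default.webcrypto;
})();

// consumer (the await just has to happen inside an async function):
// import cryptoPromise from './lib/IsomorphicCryptoPromise.js';
// const crypto = await cryptoPromise;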
Uggggly, but works for now. Hopefully, someone comes up with a better solution, or Node.js and browser contexts converge on naming in the future.
Related
Is there an existing API or library that can be used to load a JSON file in both the browser and Node?
I'm working on a module that I intend to run both from the command-line in NodeJS, and through the browser. I'm using the latest language features common to both (and don't need to support older browsers), including class keywords and the ES6 import syntax. The class in question needs to load a series of JSON files (where the first file identifies others that need to be loaded), and my preference is to access them as-is (they are externally defined and shared with other tools).
The "import" command looks like it might work for the first JSON file, except that I don't see a way of using that to load a series of files (determined by the first file) into variables.
One option is to pass in a helper function to the class for loading files, which the root script would populate as appropriate for NodeJS or the browser.
Alternatively, my current leading idea, but still not ideal in my mind, is to define a separate module with a "async function loadFile(fn)" function that can be imported, and set the paths such that a different version of that file loads for browser vs NodeJS.
This seems like something that should have a native option, or that somebody else would have already written a module for, but I've yet to find either.
For node, install the node-fetch module from npm.
Note that browser fetch can't talk directly to your filesystem -- it requires an HTTP server on the other side of the request. Node can talk to your filesystem, as well as making HTTP calls to servers.
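A minimal sketch of that setup, assuming node-fetch v2 (v3 is ESM-only and imported differently):

// Node.js side: fetch JSON over HTTP using node-fetch (v2, CommonJS style)
const fetch = require('node-fetch');

async function loadJSON(url) {
  const response = await fetch(url); // must be an HTTP(S) URL; fetch does not read local file paths
  return response.json();
}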
It sounds like as of now, there is no perfect solution here. The 'fetch' API is the most promising, but only if Node implements it some day.
In the meantime I've settled for a simple solution that works seamlessly with minimal dependencies, requiring only a little magic with my ExpressJS server paths to point the served web instance to a different version of utils.js.
Note: To use the ES-style import syntax for includes in NodeJS (v14+) you must set "type":"module" in your package.json. See https://nodejs.org/api/esm.html#esm_package_json_type_field for details. This is necessary for true shared code bases.
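For reference, the relevant package.json fragment is just this (the package name is a placeholder):

{
  "name": "my-shared-code",
  "type": "module"
}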
Module using it (NodeJS + browser running the same file):
import * as utils from "../utils.js";
...
var data = await utils.loadJSON(filename);
...
utils.js for browser:
async function loadJSON(fn) {
  return $.getJSON(fn); // Only because I'm using another JQuery-dependent lib
  /* Or natively something like
  let response = await fetch(fn);
  return response.json();
  */
}
export { loadJSON };
utils.js for nodeJS:
import * as fs from 'fs';
async function loadJSON(fn) {
  return JSON.parse(await fs.promises.readFile(fn));
}
export { loadJSON };
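The ExpressJS "path magic" mentioned above isn't shown; one hypothetical way to do it (directory names made up) is to serve the browser-specific utils.js at the URL the shared code imports from:

// server.js (sketch): requests for /utils.js get the browser implementation,
// while Node code imports the Node implementation directly from disk
import express from 'express';

const app = express();

app.use(express.static('public')); // the shared front-end files

app.get('/utils.js', (req, res) => {
  res.sendFile('browser/utils.js', { root: process.cwd() });
});

app.listen(3000);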
I have a module like this:
module.exports = class Edge {
  constructor(vertex1, vertex2) {
    this.vertex1 = vertex1;
    this.vertex2 = vertex2;
  }
}
I want to import it into some NodeJS files and some front-end files in Chrome. I know Chrome now supports ES6 modules, but importing is giving me trouble:
ReferenceError: module is not defined
I think I'm supposed to use export class { ... }, but that's NOT supported in NodeJS right? How can I make this module work with both Chrome and NodeJS?
ES6 modules are currently supported under a flag, so it is possible to have your file work natively in both environments. A few important things to note:
In Node, the file has to have an .mjs extension, so Node knows beforehand to load it as an ES module instead of a CommonJS module
Browsers don't automatically search for .js or .mjs extensions. You have to add them yourself when importing, e.g. import { Edge } from './edge.mjs'
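For the Edge class from the question, the ES-module version would look roughly like this (the file name is just an example):

// edge.mjs
export class Edge {
  constructor(vertex1, vertex2) {
    this.vertex1 = vertex1;
    this.vertex2 = vertex2;
  }
}

// elsewhere, in Node or the browser:
// import { Edge } from './edge.mjs';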
However, the technology is still new and experimental, and there's not a lot of documentation or material on the subject. That, and relying on native technology isn't a good idea if you want to support older node environments and browsers.
If you want to support older environments, you can use a tool like webpack to "bundle" up your files into one big JS file that any environment can run.
Lastly, look more into ES modules and gain a good understanding of how the syntax works in detail (defaults especially), so you'll run into fewer problems later.
Use Babel and compile your code
I don't like the whole export/require stuff in Node; it takes too long. Let's say I have a file server.js and I want to use functions in whatever.js. In HTML I just add this to the header:
<script src='whatever.js'></script>
and then I can just use all the functions of whatever.js in my body's script.
But in node, in the server.js file I'd do:
var myobject = require('./whatever.js');
but then I need to set it to myobject, and further I need to go to whatever.js and manually decide what functions I want to export. Not to mention that typing myobject.someFunction() is a lot longer to write than someFunction(), and I need to remember what I exposed/didn't expose.
I wanted something where I could just go:
require('./whatever.js');
and it puts it ALL in global, no BS, like in good old HTML/JavaScript. Is there a way to do this in Node?
This will do the trick,
var fs = require('fs');
eval(fs.readFileSync('whatever.js') + ''); // read the file and evaluate it in the current scope
// now the functions defined in whatever.js can be called here directly
(I realize this is an old thread but wanted to leave a note here for posterity.)
Here in 2022 there are several approaches for executing code from different files with Node.js:
ESM: Use standard ECMAScript modules
At the time of this writing, much of the node ecosystem (i.e. packages on npm) is in the process of transitioning to this paradigm, and there are some associated growing pains (e.g. things like __dirname are only available in CJS not ESM, though the workaround is easy).
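For example, the usual workaround for __dirname in ESM (not spelled out in the original answer) looks roughly like this:

// ESM replacement for the CommonJS __filename / __dirname globals
import { fileURLToPath } from 'node:url';
import { dirname } from 'node:path';

const __filename = fileURLToPath(import.meta.url);
const __dirname = dirname(__filename);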
For most developers, it would be advisable to become comfortable with this standard as it transcends node.js (i.e. is implemented in other runtimes like Deno and web browsers) and has been years in the making.
CJS: Use the original "CommonJS" module mechanism, e.g. require('./some-script.js')
It should be noted, particularly for the OP, that even though the "intended" way to use CJS modules is to export functions, constants, etc. and import them explicitly, it is possible to define everything in global scope using globalThis, though I would not recommend this.
// my-script.js
require('./foo.js');
require('./bar.js');
foo(); // This is foo from <...>foo.js
console.log(`bar = ${bar} (in ${__filename})`); // bar = 123 (in <...>my-script.js)
// foo.js
globalThis.foo = function() {
  console.log(`This is foo from ${__filename}`);
}
// bar.js
globalThis.bar = 123;
If you try omitting globalThis. you'll find that foo and bar are no longer defined in the main script because require "wraps them" in "module scope."
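That "module scope" comes from the function wrapper Node.js puts around every CommonJS file, which conceptually looks like this:

// Node.js conceptually wraps each CommonJS module like so, which is why
// top-level declarations become locals of the wrapper rather than globals:
(function (exports, require, module, __filename, __dirname) {
  // module source goes here
});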
Use eval
In my experience, there are very few legitimate use cases for eval (see Never use eval()!). Nevertheless, the functionality requested in this question is precisely what eval provides: "run some code as if it were written right here" and you can feed it from a file, as explained above by Mehul Prajapati
// include.js
// Defines a global function that works like C's "#include" preprocessor directive
const { readFileSync } = require('fs');
globalThis.include = function(scriptFile) {
  console.warn('!!! EXTREMELY INSECURE !!!');
  eval(readFileSync(scriptFile, 'utf-8'));
};
// main.js
require('./include.js'); // loads global include
// (I sure hope you completely trust these sources)
include('./foo.js');
include('./bar.js');
Note: Something that has contributed to much of my confusion in the past is that competing standards/conventions/APIs reuse some of the same identifiers, notably require, which require.js and other bundlers that support AMD (Asynchronous Module Definition) use with different semantics. So for someone building a web application (using AMD for modules in web browsers) with Node.js tooling (using CJS for modules locally), it can be frustrating to keep the functions straight, especially in an Electron application, which can expose Node.js APIs to scripts running in the renderer (browser). If you find yourself confused about why a module is "not found" in a situation like that, check the stack trace to see which require is being called (and you may have to wrap/rename them on globalThis or something to avoid collisions).
Further reading:
JavaScript Modules: A Brief History [2019]
How the module system, CommonJS & require works [updated 2022]
What is AMD, CommonJS, and UMD? [2014]
What is the best way to create an ES6 library, e.g. my-es6-crypto-lib, which can be used both in the browser and in Node.js, but for which the implementations are different on each platform?
(E.g. the Node.js implementation uses the built-in crypto module for better performance.)
ES6 module usage:
import { sha256 } from 'my-es6-crypto-lib'
let digest = sha256('abc')
console.log(digest)
Or Node.js-style requires:
let { sha256 } = require('my-es6-crypto-lib')
let digest = sha256('abc')
console.log(digest)
The package.json for my-es6-crypto-lib would include:
{
  "name": "my-es6-crypto-lib",
  "main": "transpiled-to-commonjs/node.js",
  "module": "es6-module/node.js",
  "browser": "es6-module/browser.js",
  ...
}
Node.js will follow the main key to resolve the CommonJS module.
Tools capable of consuming ES6 modules (like transpiler/bundling tools) follow the module key.
Tools which consume ES6 modules and bundle them for browsers (e.g. rollup-plugin-node-resolve) will follow the browser key.
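For example, a Rollup configuration honoring the browser key might look like this (a sketch; option names can differ between plugin versions):

// rollup.config.js (sketch): bundle for the browser, preferring each package's "browser" entry
import resolve from 'rollup-plugin-node-resolve';

export default {
  input: 'src/app.js', // hypothetical entry point
  output: { file: 'dist/bundle.js', format: 'iife' },
  plugins: [
    resolve({ browser: true }) // follow the "browser" field in package.json
  ]
};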
The actual implementation for Node.js would look something like this (es6-module/node.js, transpiled to transpiled-to-commonjs/node.js for the main entry):
// built-in module, faster than the pure JavaScript implementation
import { createHash } from 'crypto'

export function sha256 (message) {
  return createHash('sha256').update(message).digest('hex')
}
While the browser implementation would look something like this (es6-module/browser.js):
// a JavaScript-only implementation available at es6-module/hashFunctions.js
import { sha256 as sha256Impl } from './hashFunctions'

export function sha256 (message) {
  // slightly different API than the native module
  return sha256Impl(message, 'hex')
}
Note that the implementation of each function is different on each platform, but both sha256 functions take the same message parameter and return a string.
What is the best way to structure my ES6 module to provide each of these implementations? Are there any javascript libraries out there which do this?
Ideally:
the unused implementation should be able to be tree shaken, and
no runtime checks should be used to determine the current environment.
(I also opened a GitHub issue for Rollup →)
After a while, I now think the best way to do this is actually to export different functionality for each environment.
In the past, complex JavaScript libraries have used solutions like Browserify to bundle a version of their application for the browser. Most of these solutions work by allowing library developers to extensively configure and manually override various dependencies with respective browser versions.
For example, where a Node.js application might use Node.js' built-in crypto module, a browser version would need to fall back to a polyfill-like alternative dependency like crypto-browserify.
With ES6, this customization and configuration is no longer necessary. Your library can now export different functionality for different consumers. While browser consumers may import a native JavaScript crypto implementation which your library exports, Node.js users can choose to import a different, faster implementation which your library exports.
...
It's explained in depth, and with an example, in the typescript-starter readme →
I've been researching CommonJs, AMD, module loading, and related issues for over a week. I feel like nothing out there does what I need. My basic need is to share code seamlessly between frontend and backend. There are various issues around this including module formats for the client side, script loading, and module format conversions/wrapping. The piece I've been struggling with recently is how to use both CommonJS and AMD (or something AMD-like) in node.js.
You can't get away from commonJs in node.js, so my thinking is that if I want to use AMD, it has to work alongside commonJs. What tools, libraries, or techniques can I use to get something AMD-like working?
For example, I would like to be able to write a module like this:
var x = require('x')
module.exports = function(a, callback) {
  if (a) {
    require(['y', 'z'], function(y, z) {
      callback(x, y.o + z.k)
    })
  } else {
    callback(x, "ok")
  }
}
Ideally:
Both node.js and the amd-like modules will have paths interpreted in the node.js way (paying attention to node_modules unless the module path starts with "/", "./", or "../")
doesn't require source conversion for the server side in a build step (i.e. modules will run in node.js without each one being programmatically converted)
module or require don't need to be explicitly passed into the amd-like require function
uRequire is the perfect tool for this requirement; it's all about interoperability between the module formats and their incompatibilities.
Essentially uRequire converts or translates modules from nodejs to AMD and vice versa, plus the UMD format that runs on both nodejs and the browser, or a combined .js that requires no AMD loader in the browser.
It will require a build step though, but that is a minor concern compared to what it offers.
You could check out, http://dojotoolkit.org/documentation/tutorials/1.9/node/
I've only played with it a little, but it has worked with what I've tried. I got it working with node-orm and remember that being a pain to get going, but it might have just been me making a mess while playing with it.
Essentially you end up with AMD on the server, like:
require(["dojo/node!orm","other/amd/module"], function(orm){
//use third party commonjs module and your own amd modules here
}
It looks like you've already investigated RequireJS's suggestion to wrap CommonJS modules in an AMD require (most likely done automatically during the build using r.js).