How can I use modules loaded in one Node process from another Node process?
Example I run:
node my_modules
which loads MyModule
then I will run another nodejs process:
node grab_modules
which will run GrabModule
GrabModule will attempt to use the functions inside MyModule
Is this possible? And if so, how?
What you want is probably dnode. From its README:
The server (which hosts the functions to be run):
var dnode = require('dnode');
var server = dnode({
    zing: function (n, cb) { cb(n * 100); }
});
server.listen(5050);
The client (which calls functions on the server and gets their results in a callback):
var dnode = require('dnode');
dnode.connect(5050, function (remote) {
    remote.zing(66, function (n) {
        console.log('n = ' + n);
    });
});
It depends on what you are trying to do.
If you simply want to reuse the same module (MyModule) from two separate Node processes, that's quite easy: just put require('MyModule') in GrabModule, and MyModule is accessible when you run grab_modules. Note that each process then has its own independent copy of the module's state.
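A minimal sketch of that case (the file names and the greet function are assumptions):

my_module.js:

module.exports.greet = function (name) {
    return 'Hello, ' + name;
};

grab_modules.js:

// each process that requires this file gets its own copy of the module
var myModule = require('./my_module');
console.log(myModule.greet('world'));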
If you want to 'share' MyModule, including its global variables, between two processes, it is much more complex. You need to define an inter-process protocol between the two processes (typically REST over a socket) and use that protocol to access the module in one process from the other.
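A minimal sketch of that approach using Node's built-in http module (the port, route, and greet function are assumptions):

server.js (process A, which owns the module state):

var http = require('http');
var myModule = require('./my_module'); // the stateful module lives only here

http.createServer(function (req, res) {
    if (req.url === '/greet/world') {
        res.end(JSON.stringify({ result: myModule.greet('world') }));
    } else {
        res.statusCode = 404;
        res.end();
    }
}).listen(3000);

client.js (process B, which calls into process A instead of requiring the module):

var http = require('http');

http.get('http://localhost:3000/greet/world', function (res) {
    var body = '';
    res.on('data', function (chunk) { body += chunk; });
    res.on('end', function () {
        console.log(JSON.parse(body).result); // "Hello, world"
    });
});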
1) To use a module's implementation (not an instance already loaded into some process via require) from a different process, you only need to require that module wherever you need it.
For example, if you run two processes, a process A that uses 'MyModule' and a process B that uses 'GrabModule', and 'GrabModule' in process B only needs the exported properties of 'MyModule', then all you need in process B is require('path to MyModule').
2) On the other hand, if process B needs access to the state of a module that has already been loaded (via require somewhere) in process A, then you need an IPC (inter-process communication) mechanism that lets data be exchanged between process A and process B, and you must build or use the same protocol on top of it in both processes.
Depending on whether your processes run on the same machine or on different ones, you can use an IPC mechanism built into the OS, such as the one Node.js offers via child process fork (http://nodejs.org/api/child_process.html#child_process_child_process_fork_modulepath_args_options), or an IPC mechanism built on a network channel.
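A minimal sketch of the fork-based approach (the message shape and file names are assumptions):

parent.js:

var fork = require('child_process').fork;
var child = fork('./child.js');

child.on('message', function (msg) {
    console.log('result from child:', msg.result);
    child.disconnect(); // close the IPC channel so both processes can exit
});
child.send({ fn: 'greet', args: ['world'] });

child.js (owns the MyModule instance and serves requests over the IPC channel):

var myModule = require('./my_module');

process.on('message', function (msg) {
    process.send({ result: myModule[msg.fn].apply(myModule, msg.args) });
});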
For the network-channel option, you can use, for example, the publish/subscribe messaging system of Redis (http://redis.io/topics/pubsub).
What about this:
my_modules works as the program with the public API (REST API, XML-RPC, ...),
and grab_modules connects to that API and calls functions from my_modules.
If you also require interoperability with other languages and/or high speed, ZeroMQ is another option. While it is originally a plain C library, I have had good experiences with the Node.js bindings.
There are ZeroMQ bindings for almost all popular languages, see http://zeromq.org/bindings:_start
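A minimal request/reply sketch, assuming the legacy zmq npm binding (the endpoint and message format are assumptions):

server.js:

var zmq = require('zmq');
var responder = zmq.socket('rep');

responder.bindSync('tcp://127.0.0.1:5555');
responder.on('message', function (msg) {
    responder.send('echo: ' + msg.toString());
});

client.js:

var zmq = require('zmq');
var requester = zmq.socket('req');

requester.connect('tcp://127.0.0.1:5555');
requester.on('message', function (reply) {
    console.log(reply.toString()); // "echo: hello"
});
requester.send('hello');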
Related
Is there an existing API or library that can be used to load a JSON file in both the browser and Node?
I'm working on a module that I intend to run both from the command line in NodeJS and through the browser. I'm using the latest language features common to both (and don't need to support older browsers), including the class keyword and the ES6 import syntax. The class in question needs to load a series of JSON files (where the first file identifies the others that need to be loaded), and my preference is to access them as-is (they are externally defined and shared with other tools).
The "import" command looks like it might work for the first JSON file, except that I don't see a way of using that to load a series of files (determined by the first file) into variables.
One option is to pass a file-loading helper function into the class, which the root script would populate as appropriate for NodeJS or the browser (a sketch of this follows).
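A sketch of how that injection might look (the class name, file names, and the root file's "files" field are assumptions):

// json-set.js -- the class only knows about the injected helper, not the environment
class JsonSet {
    constructor(loadJSON) {
        this.loadJSON = loadJSON; // async (path) => parsed JSON object
    }
    async loadAll(rootPath) {
        const root = await this.loadJSON(rootPath);
        // hypothetical: the root file lists the other files to load
        const rest = await Promise.all(root.files.map(f => this.loadJSON(f)));
        return { root, rest };
    }
}
export { JsonSet };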
Alternatively, my current leading idea, though still not ideal in my mind, is to define a separate module with an "async function loadFile(fn)" function that can be imported, and to set the paths such that a different version of that file loads for the browser vs. NodeJS.
This seems like something that should have a native option, or that somebody else would have already written a module for, but I've yet to find either.
For Node, install the node-fetch module from npm.
Note that browser fetch can't talk directly to your filesystem -- it requires an HTTP server on the other side of the request. Node can talk to your filesystem, as well as make HTTP calls to servers.
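A minimal sketch with node-fetch (the URL is an assumption; the JSON file must be served over HTTP):

// loadJSON for Node via node-fetch -- same shape as the browser fetch version
const fetch = require('node-fetch');

async function loadJSON(fn) {
    const response = await fetch(fn); // e.g. 'http://localhost:3000/data.json'
    return response.json();
}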
It sounds like as of now, there is no perfect solution here. The 'fetch' API is the most promising, but only if Node implements it some day.
In the meantime I've settled on a simple solution that works seamlessly with minimal dependencies, requiring only a little magic with my ExpressJS server paths to point the served web instance at a different version of utils.js.
Note: To use the ES-style import syntax for includes in NodeJS (v14+) you must set "type":"module" in your package.json. See https://nodejs.org/api/esm.html#esm_package_json_type_field for details. This is necessary for true shared code bases.
Module using it (NodeJS + browser running the same file):
import * as utils from "../utils.js";
...
var data = await utils.loadJSON(filename);
...
utils.js for browser:
async function loadJSON(fn) {
    return $.getJSON(fn); // Only because I'm using another JQuery-dependent lib
    /* Or natively something like:
    let response = await fetch(fn);
    return response.json();
    */
}
export { loadJSON };
utils.js for NodeJS:
import * as fs from 'fs';
async function loadJSON(fn) {
    return JSON.parse(await fs.promises.readFile(fn));
}
export { loadJSON };
I have been studying NodeJS for about a year, and today I found something strange. We all know that in order to use a module (function, object or variable) in another module we must export and import it, except for built-in globals like String, Number, Promise, etc. I installed an external package for unit testing called Jest.
The strange thing here is that I have created two test modules, logic.js and logic.test.js, and in neither of them have I imported the Jest module; however, I can access all its methods. Let's look at some code:
logic.js
module.exports.add = function (a, b) {
    return a + b;
};
logic.test.js
const lib = require('../logic')
test('Add - Should return the sum of two numbers', () => {
    const result = lib.add(1, 3);
    expect(result).toBe(4);
});
As you can see, in logic.test.js I have access to the expect and test methods, even though I have not imported anything from Jest.
The questions here are:
How is this possible?
Is it good practice to do this with my own modules?
As Jonas W stated in the comments, they make use of the global object that is common to your whole application.
Using the global object is very simple:
test.js
global.myObject = 'MyMan'
app.js
require('./test')
console.log(myObject)
Running node app.js will print MyMan.
You might say that app.js explicitly requires the test module here, while your Jest tests import nothing.
The thing is that you execute your Node application using node yourFile.js, but you launch your Jest tests with the jest command line.
It's the jest command line that handles the binding between its framework (the expect and test methods) and your script.
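A toy sketch of what such a launcher does, reduced to its essence (the assertion logic here is a bare-bones assumption, not Jest's actual implementation):

// runner.js -- inject the framework globals, then load the test file
global.test = function (name, fn) {
    try {
        fn();
        console.log('PASS ' + name);
    } catch (e) {
        console.log('FAIL ' + name + ': ' + e.message);
    }
};
global.expect = function (actual) {
    return {
        toBe: function (expected) {
            if (actual !== expected) {
                throw new Error(actual + ' is not ' + expected);
            }
        }
    };
};
require('./logic.test.js'); // the test file sees test/expect as globals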
Is it a good practice?
I would say no, unless you plan to build a library like Jest that has its own command-line launcher and you want to provide tools like that to the users of your library.
The power of Node lies in its module organization; don't be afraid to use modules.
I'm dealing with (and wondering about) context issues and possibilities in our microservice backend, and I'm trying to better understand what exactly is shared across forked NodeJS cluster instances (for example, under PM2).
The situation: let's assume I have an NPM package that exports one instance of itself, as in:
class Content {
    constructor() {
        // static registry shared by every instance within one process
        if (!Content.Registry) Content.Registry = {};
    }
    registerStatic(key, obj) {
        Content.Registry[key] = obj;
    }
    getStatic(key) {
        return Content.Registry[key];
    }
}
module.exports = new Content(); // a single instance is exported
and my Node application includes this module at several points and registers instances of several objects using the static context methods.
Now, I know that Node's require registry caches instances when you require a package (or, to use wording closer to the Node documentation, it will 'probably' return the same cached instance).
Questions asked:
Can I count on the package being cached? I require it as an NPM module, not as a file (require('package-name')).
Even if the instance returned from the require registry is not the same, will the static context (Content.Registry) be the same within the same PID?
To complicate things a bit more: when running this whole thing under PM2, will every forked child PID have a different package instance AND a different static context? Will the static context be preserved? I have already noticed that Node shares some resources: the sharing of sockets via IPC is documented, but we found that even file handles appear to be shared when writing to the same file from different PM2 instances. Logic says each instance would try to acquire a lock on the file and only one would succeed; in reality they all write together (this was observed using winston).
Thanks for any help!
EDIT: would there even be a way to test for this? Could I somehow access, from within the NPM package, some Node/JavaScript functionality that would give me a UUID for the package's instance? A memory address?
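One way to test this empirically (a sketch; the ID stamping is an assumption you would add to the package yourself) is to tag the exported instance with a random ID at construction time and log it from every require site and every PM2 fork:

var crypto = require('crypto');

class Content {
    constructor() {
        if (!Content.Registry) Content.Registry = {};
        this.instanceId = crypto.randomBytes(8).toString('hex');
    }
}
module.exports = new Content();

// anywhere in the app:
// console.log(process.pid, require('package-name').instanceId);
// same PID and same ID      => the require cache returned the same instance
// different PIDs, fresh IDs => separate processes with separate static contexts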
I am trying to get some of the NPM functionality into my Node.js programs. In particular, I would like to be able to analyze the available Node modules on my system. "Module" here means "module identifier": either an identifier like "fs", or a file path; in other words, anything that can be put into a require() call and does load a module.
This question is split into three sub-problems:
1) Get a list of all core modules
2) Get a list of all loaded modules
3) Get a list of all installed and available modules.
Question 1 is answered by zeke's node-core-module-names list. Loading another module just to find the list of core modules is not elegant, and the list may become outdated, but it is an option and it does work. The (ordered) list is thus ['assert', 'buffer', 'child_process', ..., 'zlib'].
The second question can be answered by a call to Object.keys(require.cache), which returns a list of file paths.
What I cannot solve elegantly so far is the third question. There is the shell command npm ls, which prints a tree graph, but is there anything more usable?
Thanks for listening!
Tom
Here is something I found while tinkering around, and I think it should be valid.
The Node source (which embeds V8) has a standard set of bindings you have already seen when using Node. They include, but are not limited to:
fs
path
http
etc.
Also, there is a global variable named process. It exposes process-level information and functionality, and it also lets you get your hands on some of the native code through a function on the process object called binding.
The binding(...) function lets you interface with the exposed C++ libraries built into Node, or you can create your own NodeJS modules by following the V8 developer guide (beyond the scope of this answer, read more here).
A funny little line I saw in the node.cc file includes a check for a binding with the keyword natives. This returns, it seems, the list of system-level modules you are looking for, and then some.
So that being said, I went into the Node REPL and plugged in a couple of lines (which I am sure can be written in a more elegant, expressive manner). Also note that I am pruning out anything starting with an underscore (_) so as to skip private functions or bindings:
var natives = process.binding('natives');
for (var key in natives) {
    if (key.indexOf('_') !== 0) { // skip private/internal entries
        console.log(key);
    }
}
npm list has various output options/flags, including --json and --parseable (which outputs a list of paths).
Try this:
var exec = require('child_process').exec;
var cmd = 'npm ls --json';
exec(cmd, function (error, stdout, stderr) {
    var treeObject = JSON.parse(stdout);
});
The above requires no external packages but might need more code to work around the buffer limit: https://github.com/nodejs/node/issues/4236
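One workaround is exec's maxBuffer option (10 MB here is an assumption about what is big enough):

// raise the stdout buffer limit so a large dependency tree isn't truncated
exec(cmd, { maxBuffer: 10 * 1024 * 1024 }, function (error, stdout, stderr) {
    var treeObject = JSON.parse(stdout);
});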
Alternatively npm can also be used programmatically, perhaps through global-npm:
var npm = require('global-npm');
npm.load({}, function (err) {
    npm.commands.list(null, function (err, treeObject) {
        var firstLevelDependenciesArray = Object.keys(treeObject.dependencies);
    });
});
Let's say I need CasperJS to report progress steps to a localhost server. I couldn't use casper.open to send a POST request because it would "switch" pages, so to speak, and the script would be unable to continue its other steps properly.
I sidestepped this issue by evaluating an XMLHttpRequest() inside the browser to ping localhost. Not ideal, but it works.
As the number of scripts grows, I'd rather move this common functionality into a module; that is, I want to move a number of functions into a separate module.
It's my understanding that CasperJS doesn't work the way Node.js does, so the require() rules are different. How do I go about accomplishing this?
Since CasperJS is based on PhantomJS, you can use its module system, which is "modelled after CommonJS Modules 1.1".
You can require the module file by its path, full or relative.
var tools = require("./tools.js");
var tools = require("./lib/utils/tools.js");
var tools = require("/home/scraping/project/lib/utils/tools.js");
Or you can follow the Node.js convention and create a subfolder node_modules/module_name in your project's folder, placing the module's code in an index.js file. It would then reside at this path:
./node_modules/tools/index.js
After that, require it in the CasperJS script:
var tools = require("tools");
The module would export its functions in this way:
function test() {
    console.log("This is test");
}

module.exports = {
    test: test
};
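For completeness, a sketch of using it from a CasperJS script (the target URL is a placeholder):

// script.js -- run with: casperjs script.js
var casper = require('casper').create();
var tools = require('./tools.js');

casper.start('http://example.com/', function () {
    tools.test(); // prints "This is test"
});

casper.run();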