I am building an Atom Electron app. Right now I have this in the preload.js of one of my webviews:
var { requireTaskPool } = require('electron-remote');
var work = '';
var _ = require('lodash');
work = requireTaskPool(require.resolve('./local/path/to/js/file.js'));
function scriptRun() {
console.log('Preload: Script Started');
// `work` will get executed concurrently in separate background processes
// and resolve with a promise
_.times(1, () => {
work(currentTab).then(result => {
console.log(`Script stopped. Total time running was ${result} ms`);
});
});
}
module.exports = scriptRun;
scriptRun();
It gets a local script and then executes it in a background process.
I want to do the same exact thing, except I want to retrieve the script from an external source like so
work = requireTaskPool(require.resolve('https://ex.com/path/to/js/file.js'));
When I do this, I get errors like:
Uncaught Error: Cannot find module 'https://ex.com/path/to/js/file.js'
How can I load external scripts and then use them with my work function? My feeling is that require only works with local files. If AJAX is the answer, can I see an example of how to fetch a script and then pass it into work without executing it first?
I was able to load a remote JS file and execute the function defined in it; hopefully this will give you enough to start with...
my remote dummy.js, available online somewhere:
const dummy = () => console.log('dummy works');
my download.js:
const vm = require("vm");
const rp = require('request-promise');
module.exports = {
downloadModule: async () => {
try {
let body = await rp('http://somewhere.online/dummy.js');
let script = vm.createScript(body);
script.runInThisContext();
// this is the actual dummy method loaded from remote dummy.js
// now available in this context:
return dummy;
} catch (err) {
console.log('err', err);
}
return null;
}
};
You need to add the request-promise package.
Then in my main.js I use it like this:
const {downloadModule} = require('./download');
downloadModule().then((dummy) => {
if (dummy) dummy();
else console.log('no dummy');
});
When I run it, this is what I get:
$ electron .
dummy works
I wanted to create an actual module and require it, but I have not had the time to play with this further. If I accomplish that I will add it here.
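If you want to experiment in that direction, here is a rough sketch (untested, and requireRemote is a made-up name) of wrapping the downloaded source in the CommonJS module wrapper so it can populate module.exports instead of relying on a leaked global:
const vm = require('vm');
const rp = require('request-promise');

async function requireRemote(url) {
  const body = await rp(url);
  // Wrap the source the same way Node wraps modules, compile it in this
  // context, then call it with a fresh module object.
  const wrapped = `(function (exports, require, module, __filename, __dirname) {\n${body}\n})`;
  const moduleFn = vm.runInThisContext(wrapped, { filename: url });
  const fakeModule = { exports: {} };
  moduleFn(fakeModule.exports, require, fakeModule, url, '.');
  return fakeModule.exports;
}

// The remote file would then assign to module.exports, e.g.
//   module.exports = { dummy: () => console.log('dummy works') };
// requireRemote('http://somewhere.online/dummy.js').then(mod => mod.dummy());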
You have not provided any details on your file.js. But I can give you the general idea.
There are two things that you need at minimum to call your package a module:
file.js (of course you have it) and
package.json
The structure of your file.js should be something like this:
//load your dependencies here
var something = require("something");
//module.exports is necessary to export your code,
//so that you can fetch this code in another file by using require.
module.exports = {
  abc: function(){
    //code for abc function
  },
  xyz: function(){
    //code for xyz function
  }
};
Now, if you publish your package somewhere npm can reach (a git repository, or a gzipped tarball on any web server; npm cannot install a bare .js file by URL), you can install it with:
npm install https://ex.com/path/to/your-package.tgz
A copy of your package will then be placed in the node_modules folder.
So, now you can access it as:
var x = require('name-of-your-package-in-node-modules');
Now, you can also do:
var abc = require('name-of-your-package-in-node-modules').abc;
or
var xyz = require('name-of-your-package-in-node-modules').xyz;
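For completeness, the package.json next to file.js only needs a couple of fields; a minimal sketch (name and version are placeholders) might be:
{
  "name": "name-of-your-package-in-node-modules",
  "version": "1.0.0",
  "main": "file.js"
}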
Related
I am working on supporting a REST API that literally has thousands of functions/objects/stats/etc., and placing all those calls into one file does not strike me as very maintainable. What I want to do is have a 'base' file that has the main constructor function, a few utility and very common functions, and then files for each section of API calls.
The Problem: How do you attach functions from other files to the 'base' object so that referencing the main object gives access to the subsections you have added to your program?
Let me try and illustrate what I am looking to do:
1) 'base' file has the main constructor:
var IPAddr = "";
var Token = "";
exports.Main = function(opts) {
IPAddr = opts.IPAddr;
Token = opts.Token;
}
2) 'file1' has some subfunctions that I want to define:
Main.prototype.Function1 = function(callback) {
// stuff done here
callback(error, data);
}
Main.prototype.Function2 = function(callback) {
// stuff done here
callback(error,data);
}
3) Program file brings it all together:
var Main = require('main.js');
var Main?!? = require('file1.js');
Main.Function1(function(err,out) {
if(err) {
// error stuff here
}
// main stuff here
});
Is there a way to combine an Object from multiple code files?? A 120,000 line Node.JS file just doesn't seem to be the way to go to me....not to mention it takes too long to load! Thanks :)
SOLUTION: For those who may stumble upon this in the future... I took the source code for Object.assign and backported it to my v0.12 version of Node and got it working.
I used the code from here: https://github.com/sindresorhus/object-assign/blob/master/index.js and put it in a separate file that I just require('./object-assign.js') without assigning it to a var. Then my code looks something like this:
require('./object-assign.js');
var Main = require('./Main.js');
Object.assign(Main.prototype, require('./file1.js'));
Object.assign(Main.prototype, require('./file2.js'));
And all my functions from the two files show up under the Main() Object...too cool :)
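For reference, a polyfill along those lines boils down to roughly this (a simplified sketch, not the exact backported file):
if (typeof Object.assign !== 'function') {
  Object.assign = function (target) {
    if (target == null) {
      throw new TypeError('Cannot convert undefined or null to object');
    }
    var to = Object(target);
    for (var i = 1; i < arguments.length; i++) {
      var source = arguments[i];
      if (source == null) continue;
      for (var key in source) {
        if (Object.prototype.hasOwnProperty.call(source, key)) {
          to[key] = source[key];
        }
      }
    }
    return to;
  };
}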
First of all, each file runs in its own scope, so local variables are not shared between files.
However, as a hacky approach, you may simply add Main to the global scope, making it available everywhere, by writing global.Main = Main right after you define it; just make sure the main file is the first one in your list of requires.
The better approach is to extend the prototype of Main afterwards, although in this case you may need to update a lot of code. Just mix additional functionality into the base class:
file1.js
module.exports = {
x: function() {/*****/}
}
index.js
var Main = require('main.js');
Object.assign(Main.prototype, require('file1.js'));
Sure.
constructor.js
module.exports = function(){
//whatever
};
prototype.js
module.exports = {
someMethod(){ return "test";}
};
main.js
const Main = require("./constructor.js");
Object.assign( Main.prototype, require("./prototype.js"));
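And a quick usage check, assuming the three files above sit next to each other:
const Main = require("./constructor.js");
Object.assign(Main.prototype, require("./prototype.js"));

const instance = new Main();
console.log(instance.someMethod()); // prints "test"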
I would like to make use of a function called executeJavaScript() from the Electron webContents API. Since it is very close to eval() I will use this in the example.
The problem:
I have a decent sized script but it is contained in a template string.
Expanding this app, the script could grow a lot as a string.
I am not sure what the best practices are for this.
I also understand that eval() is dangerous, but I am interested in the principle of my question.
Basic eval example for my question:
// Modules
const fs = require('fs');
// CONSTANTS
const EXAMPLE_1 = 'EXAMPLE_1';
const EXAMPLE_2 = 'EXAMPLE_2';
const EXAMPLE_3 = 'EXAMPLE_3';
const exampleScriptFunction = require('./exampleScriptFunction');
const exampleScriptFile = fs.readFileSync('./exampleScriptFile.js');
// using direct template string
eval(`console.log(${EXAMPLE_1})`);
// using a method from a separate module, but this doesn't solve the neatness issue.
eval(exampleScriptFunction(EXAMPLE_2));
// What I want is to just use a JS file because it is neater.
eval(`${exampleScriptFile}`);
exampleScriptFunction.js
module.exports = function(fetchType) {
return `console.log(${fetchType});`;
}
This will allow me to separate the script into a new file, but what if I have many more than one variable?
exampleScriptFile.js:
console.log(${EXAMPLE_3});
This clearly does not work, but I am just trying to show my thinking: the backticks are not present in the file, fs loads it as a string, and the backticks are only in the main file.
It does not work, and I do not know how else to show what I mean.
Because I am loading this with readFileSync, I figured the ES6 template string would work.
This approach lets me write a plain JS file with proper syntax highlighting.
The issue is that the variables live on the page running the eval().
Perhaps I am completely wrong here and looking at this the wrong way. I am open to suggestions. Please do not mark me minus 1 because of my infancy in programming. I really do not know how else to ask this question. Thank you.
Assuming your source is stored in exampleScriptFile:
// polyfill
const fs = { readFileSync() { return 'console.log(`${EXAMPLE_3}`);'; } };
// CONSTANTS
const EXAMPLE_1 = 'EXAMPLE_1';
const EXAMPLE_2 = 'EXAMPLE_2';
const EXAMPLE_3 = 'EXAMPLE_3';
const exampleScriptFile = fs.readFileSync('./exampleScriptFile.js');
// What I want is to just use a JS file because it is neater.
eval(exampleScriptFile);
Update
Perhaps I wasn't clear. The ./exampleScriptFile.js should be:
console.log(`${EXAMPLE_3}`);
While what you're describing can be done with eval as #PatrickRoberts demonstrates, that doesn't extend to executeJavaScript.
The former runs in the caller's context, while the latter triggers an IPC call to another process with the contents of the code. Presumably this process doesn't have any information on the caller's context, and therefore, the template strings can't be populated with variables defined in this context.
Relevant snippets from electron/lib/browser/api/web-contents.js:
WebContents.prototype.send = function (channel, ...args) {
// ...
return this._send(false, channel, args)
}
// ...
WebContents.prototype.executeJavaScript = function (code, hasUserGesture, callback) {
// ...
return asyncWebFrameMethods.call(this, requestId, 'executeJavaScript',
// ...
}
// ...
const asyncWebFrameMethods = function (requestId, method, callback, ...args) {
return new Promise((resolve, reject) => {
this.send('ELECTRON_INTERNAL_RENDERER_ASYNC_WEB_FRAME_METHOD', requestId, method, args)
// ...
})
}
Relevant snippets from electron/atom/browser/api/atom_api_web_contents.cc
//...
void WebContents::BuildPrototype(v8::Isolate* isolate,
v8::Local<v8::FunctionTemplate> prototype) {
prototype->SetClassName(mate::StringToV8(isolate, "WebContents"));
mate::ObjectTemplateBuilder(isolate, prototype->PrototypeTemplate())
// ...
.SetMethod("_send", &WebContents::SendIPCMessage)
// ...
}
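Given that limitation, one workaround (just a sketch, not something the webContents API does for you) is to bake the values into the code string before handing it to executeJavaScript; the file and constant names below mirror the question:
const fs = require('fs');

const EXAMPLE_3 = 'EXAMPLE_3';

// Read the template as plain text and substitute the placeholder with a
// JSON-serialized value, so the target process needs no outside context.
// Assumes exampleScriptFile.js contains: console.log(${EXAMPLE_3});
const template = fs.readFileSync('./exampleScriptFile.js', 'utf8');
const code = template.replace(/\$\{EXAMPLE_3\}/g, JSON.stringify(EXAMPLE_3));

// webContents.executeJavaScript(code);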
I didn't find any really working solution to this issue on Google, so...
There is a module called vm, but people say it is very heavy. I need some simple function, like PHP's include, where the code of the included file is inserted right into the place in your code where you need it and then executed.
I tried to create such a function:
function include(path) {
eval( fs.readFileSync(path) + "" );
}
but it is not that simple... it is better if I show you why with an example.
Let's say I need to include a file named file.js with the content
var a = 1;
The file that includes it looks like this:
include("file.js");
console.log(a); // undefined
As you already realized, a is undefined because the eval happens inside the include function, so var a stays local to that function and never reaches the caller's scope.
It seems the only way to do that is to type this long, creepy code
eval( fs.readFileSync("file.js") + "" );
console.log(a); // 1
every time, with no wrapper function, in order to get all the functionality from a file as it is.
Using require with module.exports for each file is also a bad idea...
Any other solutions?
Using require with module.exports for each file is also a bad idea...
No, require is the way you do this in NodeJS:
var stuff = require("stuff");
// Or
var stuff = require("./stuff"); // If it's within the same directory, part of a package
Break your big vm into small, maintainable pieces, and if they need to be gathered together into one big thing rather than being used directly, have a vm.js that does that.
So for example
stuff.js:
exports.nifty = nifty;
function nifty() {
console.log("I do nifty stuff");
}
morestuff.js:
// This is to address your variables question
exports.unavoidable = "I'm something that absolutely has to be exposed outside the module.";
exports.spiffy = spiffy;
function spiffy() {
console.log("I do spiffy stuff");
}
vm.js:
var stuff = require("./stuff"),
morestuff = require("./morestuff");
exports.cool = cool;
function cool() {
console.log("I do cool stuff, using the nifty function and the spiffy function");
stuff.nifty();
morestuff.spiffy();
console.log("I also use this from morestuff: " + morestuff.unavoidable);
}
app.js (the app using vm):
var vm = require("./vm");
vm.cool();
Output:
I do cool stuff, using the nifty function and the spiffy function
I do nifty stuff
I do spiffy stuff
I also use this from morestuff: I'm something that absolutely has to be exposed outside the module.
What you are trying to do breaks modularity and goes against Node.js best practices.
Let's say you have this module (sync-read.js):
var fs = require('fs');
module.exports = {
a: fs.readFileSync('./a.json'),
b: fs.readFileSync('./b.json'),
c: fs.readFileSync('./c.json')
}
When you call the module the first time ...
var data = require('./sync-read');
... it will be cached and you won't read those files from disk again. With your approach you'll be reading from disk on every include call. No bueno.
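You can see the cache at work with a quick check:
var first = require('./sync-read');
var second = require('./sync-read');

// Both calls return the exact same object; the files were read from disk
// only once, when the module was loaded for the first time.
console.log(first === second); // true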
You don't need to append each of your vars to module.exports (as in comment above):
var constants = {
lebowski: 'Jeff Bridges',
neo: 'Keanu Reeves',
bourne: 'Matt Damon'
};
function theDude() { return constants.lebowski; };
function theOne() { return constants.neo; };
function theOnly() { return constants.bourne; };
module.exports = {
names: constants,
theDude : theDude,
theOne : theOne
// bourne is not exposed
}
Then:
var characters = require('./characters');
console.log(characters.names);
characters.theDude();
characters.theOne();
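For clarity, those calls resolve like this:
console.log(characters.theDude()); // "Jeff Bridges"
console.log(characters.theOne());  // "Keanu Reeves"
console.log(characters.theOnly);   // undefined (it was never exported)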
There are some third party Javascript libraries that have some functionality I would like to use in a Node.js server. (Specifically I want to use a QuadTree javascript library that I found.) But these libraries are just straightforward .js files and not "Node.js libraries".
As such, these libraries don't follow the exports.var_name syntax that Node.js expects for its modules. As far as I understand that means when you do module = require('module_name'); or module = require('./path/to/file.js'); you'll end up with a module with no publicly accessible functions, etc.
My question then is "How do I load an arbitrary javascript file into Node.js such that I can utilize its functionality without having to rewrite it so that it does do exports?"
I'm very new to Node.js so please let me know if there is some glaring hole in my understanding of how it works.
EDIT: Researching into things more and I now see that the module loading pattern that Node.js uses is actually part of a recently developed standard for loading Javascript libraries called CommonJS. It says this right on the module doc page for Node.js, but I missed that until now.
It may end up being that the answer to my question is "wait until your library's authors get around to writing a CommonJS interface or do it your damn self."
Here's what I think is the 'rightest' answer for this situation.
Say you have a script file called quadtree.js.
You should build a custom node_module that has this sort of directory structure...
./node_modules/quadtree/quadtree-lib/
./node_modules/quadtree/quadtree-lib/quadtree.js
./node_modules/quadtree/quadtree-lib/README
./node_modules/quadtree/quadtree-lib/some-other-crap.js
./node_modules/quadtree/index.js
Everything in your ./node_modules/quadtree/quadtree-lib/ directory consists of files from your 3rd-party library.
Then your ./node_modules/quadtree/index.js file will just load that library from the filesystem and do the work of exporting things properly.
var fs = require('fs');
// Read and eval library
var filedata = fs.readFileSync('./node_modules/quadtree/quadtree-lib/quadtree.js', 'utf8');
eval(filedata);
/* The quadtree.js file defines a class 'QuadTree' which is all we want to export */
exports.QuadTree = QuadTree
Now you can use your quadtree module like any other node module...
var qt = require('quadtree');
qt.QuadTree();
I like this method because there's no need to go changing any of the source code of your 3rd party library--so it's easier to maintain. All you need to do on upgrade is look at their source code and ensure that you are still exporting the proper objects.
There is a much better method than using eval: the vm module.
For example, here is my execfile module, which evaluates the script at path in either context or the global context:
var vm = require("vm");
var fs = require("fs");
module.exports = function(path, context) {
context = context || {};
var data = fs.readFileSync(path);
vm.runInNewContext(data, context, path);
return context;
}
And it can be used like this:
> var execfile = require("execfile");
> // `someGlobal` will be a global variable while the script runs
> var context = execfile("example.js", { someGlobal: 42 });
> // And `getSomeGlobal` defined in the script is available on `context`:
> context.getSomeGlobal()
42
> context.someGlobal = 16
> context.getSomeGlobal()
16
Where example.js contains:
function getSomeGlobal() {
return someGlobal;
}
The big advantage of this method is that you've got complete control over the global variables in the executed script: you can pass in custom globals (via context), and all the globals created by the script will be added to context. Debugging is also easier because syntax errors and the like will be reported with the correct file name.
The simplest way is: eval(require('fs').readFileSync('./path/to/file.js', 'utf8'));
This works great for testing in the interactive shell.
AFAIK, that is indeed how modules must be loaded.
However, instead of tacking all exported functions onto the exports object, you can also tack them onto this (what would otherwise be the global object).
So, if you want to keep the other libraries compatible, you can do this:
this.quadTree = function () {
// the function's code
};
or, when the external library already has its own namespace, e.g. jQuery (not that you can use that in a server-side environment):
this.jQuery = jQuery;
In a non-Node environment, this would resolve to the global object, thus making it a global variable... which it already was. So it shouldn't break anything.
Edit:
James Herdman has a nice writeup about node.js for beginners, which also mentions this.
I'm not sure if I'll actually end up using this because it's a rather hacky solution, but one way around this is to build a little mini-module importer like this...
In the file ./node_modules/vanilla.js:
var fs = require('fs');
exports.require = function(path, names_to_export) {
  var filedata = fs.readFileSync(path, 'utf8');
  eval(filedata);
  var exported_obj = {};
  for (var i in names_to_export) {
    var to_eval = 'exported_obj[names_to_export[i]] = '
      + names_to_export[i] + ';';
    eval(to_eval);
  }
  return exported_obj;
}
Then when you want to use your library's functionality you'll need to manually choose which names to export.
So for a library like the file ./lib/mylibrary.js...
function Foo() { /* Do something... */ }
biz = "Blah blah";
var bar = {'baz':'filler'};
When you want to use its functionality in your Node.js code...
var vanilla = require('vanilla');
var mylibrary = vanilla.require('./lib/mylibrary.js',['biz','Foo'])
mylibrary.Foo // <-- this is Foo()
mylibrary.biz // <-- this is "Blah blah"
mylibrary.bar // <-- this is undefined (because we didn't export it)
Don't know how well this would all work in practice though.
I was able to make it work by updating their script, very easily, simply adding module.exports = where appropriate...
For example, I took their file and copied it to './libs/apprise.js'. Then, where it starts with
function apprise(string, args, callback){
I assigned the function to module.exports instead, like this:
module.exports = function(string, args, callback){
Thus I'm able to import the library into my code like this:
window.apprise = require('./libs/apprise.js');
And I was good to go. YMMV, this was with webpack.
A simple include(filename) function that gives better error messages (stack, filename, etc.) than plain eval, in case of errors:
var fs = require('fs');
// circumvent nodejs/v8 "bug":
// https://github.com/PythonJS/PythonJS/issues/111
// http://perfectionkills.com/global-eval-what-are-the-options/
// e.g. a "function test() {}" will be undefined, but "test = function() {}" will exist
var globalEval = (function() {
  var isIndirectEvalGlobal = (function(original, Object) {
    try {
      // Does `Object` resolve to a local variable, or to a global, built-in `Object`,
      // reference to which we passed as a first argument?
      return (1, eval)('Object') === original;
    } catch (err) {
      // if indirect eval errors out (as allowed per ES3), then just bail out with `false`
      return false;
    }
  })(Object, 123);
  if (isIndirectEvalGlobal) {
    // if indirect eval executes code globally, use it
    return function(expression) {
      return (1, eval)(expression);
    };
  } else if (typeof window.execScript !== 'undefined') {
    // if `window.execScript` exists, use it
    return function(expression) {
      return window.execScript(expression);
    };
  }
  // otherwise, globalEval is `undefined` since nothing is returned
})();
function include(filename) {
  var file_contents = fs.readFileSync(filename, "utf8");
  try {
    //console.log(file_contents);
    globalEval(file_contents);
  } catch (e) {
    e.fileName = filename;
    var keys = ["columnNumber", "fileName", "lineNumber", "message", "name", "stack"];
    for (var i in keys) {
      var k = keys[i];
      console.log(k, " = ", e[k]);
    }
    fo = e; // keep the last error in a global for inspection
    //throw new Error("include failed");
  }
}
But it even gets dirtier with nodejs: you need to specify this:
export NODE_MODULE_CONTEXTS=1
nodejs tmp.js
Otherwise you cannot use global variables in files included with include(...).
Is it easy/possible to do a simple include('./path/to/file') type of command in node.js?
All I want to do is have access to local variables and run a script. How do people typically organize node.js projects that are bigger than a simple hello world? (A fully functional dynamic website)
For example I'd like to have directories like:
/models
/views
... etc
Just do a require('./yourfile.js');
Declare all the variables that you want outside access to as global variables.
So instead of
var a = "hello" it will be
GLOBAL.a="hello" or just
a = "hello"
This is obviously bad. You don't want to be polluting the global scope.
Instead, the suggested method is to export your functions/variables.
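For example, a minimal version of that (reusing yourfile.js from above; app.js is just a placeholder name for the requiring file):
// yourfile.js
exports.a = "hello";

// app.js
var yourfile = require('./yourfile.js');
console.log(yourfile.a); // "hello"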
If you want the MVC pattern take a look at Geddy.
You need to understand CommonJS, which is a pattern for defining modules. You shouldn't abuse the GLOBAL scope (that's always a bad thing to do); instead you can use the 'exports' token, like this:
// circle.js
var PI = 3.14; // PI will not be accessible from outside this module
exports.area = function (r) {
return PI * r * r;
};
exports.circumference = function (r) {
return 2 * PI * r;
};
And the client code that will use our module:
// client.js
var circle = require('./circle');
console.log( 'The area of a circle of radius 4 is '
+ circle.area(4));
This code was extracted from the Node.js API documentation:
http://nodejs.org/docs/v0.3.2/api/modules.html
Also, if you want to use something like Rails or Sinatra, I recommend Express (I couldn't post the URL, shame on Stack Overflow!)
If you are writing code for Node, using Node modules as described by Ivan is without a doubt the way to go.
However, if you need to load JavaScript that has already been written and isn't aware of node, the vm module is the way to go (and definitely preferable to eval).
For example, here is my execfile module, which evaluates the script at path in either context or the global context:
var vm = require("vm");
var fs = require("fs");
module.exports = function(path, context) {
var data = fs.readFileSync(path);
vm.runInNewContext(data, context, path);
}
Also note: modules loaded with require(…) don't have access to the global context.
If you are planning to load an external JavaScript file's functions or objects, load it in the current context using the following code (note the runInThisContext method):
var vm = require("vm");
var fs = require("fs");
var data = fs.readFileSync('./externalfile.js', 'utf8');
const script = new vm.Script(data);
script.runInThisContext();
// here you can use externalfile's functions or objects as if they were instantiated here. They have been added to this context.
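As an illustration (the file contents and function name are made up), if externalfile.js assigned a function without a declaration keyword, the assignment lands on the global object and is reachable afterwards:
// externalfile.js (hypothetical contents):
//   sayHello = function (name) { return 'Hello ' + name; };

// After script.runInThisContext() has run:
console.log(sayHello('world')); // "Hello world"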
Expanding on @Shripad's and @Ivan's answers, I would recommend that you use Node.js's standard module.exports functionality.
In your file for constants (e.g. constants.js), you'd write constants like this:
const CONST1 = 1;
module.exports.CONST1 = CONST1;
const CONST2 = 2;
module.exports.CONST2 = CONST2;
Then in the file in which you want to use those constants, write the following code:
const { CONST1, CONST2 } = require('./constants.js');
If you've never seen the const { ... } syntax before: that's destructuring assignment.
Sorry for the resurrection. You could use the child_process module to execute external JS files in Node.js:
var child_process = require('child_process');
//EXECUTE yourExternalJsFile.js
child_process.exec('node yourExternalJsFile.js', (error, stdout, stderr) => {
console.log(`${stdout}`);
console.log(`${stderr}`);
if (error !== null) {
console.log(`exec error: ${error}`);
}
});
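Note that this runs the external file in a completely separate Node process, so it cannot share variables with your code; it can only report back through stdout/stderr as above, or through the IPC channel if you start it with fork instead. A minimal fork sketch (child.js is hypothetical):
const { fork } = require('child_process');

// child.js (hypothetical) would contain something like:
//   process.send({ answer: 42 });

const child = fork('./child.js');
child.on('message', (msg) => {
  console.log('from child:', msg); // { answer: 42 }
});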