Builder design pattern in JavaScript

I need to construct a custom JSON file after parsing XML. To cut a long story short: I have an XML file from which a JSON file must be created, but not all of the information from the XML should be present in the JSON. It must be done using only JavaScript and Node.js. To parse the XML I used xmldom and fs.
Now the part I am stuck at. To create the JSON file I can't use code snippets already available on the Web, because there are very few examples with Node.js and because I need to omit many things from the XML. Plain looping is also not an option, since the XML DOM is a recursive structure. Right now I use this code to do it:
var builder = function() {
    this.meta = {};
}
builder.atRoot = function(library) {
    this.meta["!name"] = library;
    this.meta["!define"] = {};
    this.define = this.meta["!define"];
}
builder.atNamespace = function(name) {
    this.namespace = {};
    this.namespace["!category"] = "namespace";
    this.namespace["!description"] = "";
    this.define[name] = this.namespace;
}
builder.atDescription = function(method) {
    this.class = {};
    this.class["!Accordion"] = method;
    this.class["!AccordionSection"] = method;
    this.class["!ApplicationHeader"] = method;
    ....
}
// and so on
builder.toJson = function() {
    var s = JSON.stringify(this.meta);
    console.log(s);
}
But it doesn't work. Now, maybe I simply don't understand how a builder works, or this is the wrong approach altogether. Any ideas on what is wrong, or additional explanation of the builder design pattern, will be much appreciated.
Many thanks in advance.

Builder usually implies recursion. I don't fully understand what you need, but it seems that you should create a whitelist of attribute names and simply copy the whitelisted properties recursively into a new object, then serialize it as JSON.
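Here is a minimal sketch of that idea (the whitelist contents and attribute handling are illustrative, not taken from the question); it assumes doc is the document returned by xmldom's DOMParser:
// Keep only whitelisted attributes while walking the DOM recursively.
var whitelist = ['name', 'description'];
function toPlainObject(node) {
    var out = {};
    whitelist.forEach(function (attr) {
        if (node.hasAttribute && node.hasAttribute(attr)) {
            out[attr] = node.getAttribute(attr);
        }
    });
    var children = node.childNodes || [];
    for (var i = 0; i < children.length; i++) {
        var child = children[i];
        if (child.nodeType === 1) { // element nodes only
            out[child.nodeName] = toPlainObject(child);
        }
    }
    return out;
}
var json = JSON.stringify(toPlainObject(doc.documentElement));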

Related

Alternative to eval() in node script

I am working on a script that runs during our build process in Jenkins right before npm install. My issue is that I need to download a JavaScript file from an external resource and read a variable from it.
unzipper.on('extract', () => {
    const content = fs.readFileSync(`${outputDir}/js/en.js`, 'utf8');
    eval(content); // Less smelly alternative?
    if (obj) {
        const str = JSON.stringify(obj);
        fs.writeFileSync(`${outputDir}/public/data.json`, str);
    } else {
        throw 'Variable `obj` not found';
    }
});
I know that "eval is evil", but any suggested alternatives I've found online don't seem to work.
I have tried different variations of new Function(obj)(), but Node seems to exit the script afterwards (the if-case never runs).
Ideas?
Since Node.js provides an API to talk to the V8 runtime directly (the vm module), it might be a good idea to use it. Basically, it's the API that node's require uses under the hood.
Assuming the js file in question contains the variable obj we're interested in, we do the following:
read the code from the file
append ; obj to the code to make sure it's the last expression it evaluates
pass the code to V8 for evaluation
grab the return value (which is our obj):
const fs = require('fs'),
      vm = require('vm');
const code = fs.readFileSync('path-to-js-file', 'utf8');
const obj = vm.runInNewContext(code + ';obj');
This answer is heavily based on #georg's comments, but since it helped me I'll provide it as an alternative answer.
Explanation in the comments.
let content = fs.readFileSync(`${outputDir}/js/en.js`, 'utf8');
content += '; module.exports=obj'; // Export "obj" variable
fs.writeFileSync(`${outputDir}/temp`, content); // Create a temporary file
const obj = require(`${outputDir}/temp`); // Import the variable from the temporary file
fs.unlinkSync(`${outputDir}/temp`); // Remove the temporary file

Evaluate JS file with template strings from another file

I would like to make use of a function called executeJavaScript() from the Electron webContents API. Since it is very close to eval() I will use this in the example.
The problem:
I have a decent sized script but it is contained in a template string.
Expanding this app, the script could grow a lot as a string.
I am not sure what the best practices are for this.
I also understand that eval() is dangerous, but I am interested in the principle of my question.
Basic eval example for my question:
// Modules
const fs = require('fs');
// CONSTANTS
const EXAMPLE_1 = 'EXAMPLE_1';
const EXAMPLE_2 = 'EXAMPLE_2';
const EXAMPLE_3 = 'EXAMPLE_3';
const exampleScriptFunction = require('./exampleScriptFunction');
const exampleScriptFile = fs.readFileSync('./exampleScriptFile.js');
// using direct template string
eval(`console.log(${EXAMPLE_1})`);
// using a method, but this doesn't solve the neatness issue.
eval(exampleScriptFunction(EXAMPLE_2));
// What I want is to just use a JS file because it is neater.
eval(`${exampleScriptFile}`);
exampleScriptFunction.js
module.exports = function(fetchType) {
    return `console.log(${fetchType});`;
}
This will allow me to separate the script to a new file
But what if I have many more than one variable?
exampleScriptFile.js:
console.log(${EXAMPLE_3});
This clearly does not work, but I am just trying to show my thinking.
The backticks are not present in the file; fs loads it as a plain string, and only the main file has backticks.
This does not work. I do not know how else to show what I mean.
Because I am loading this with readFileSync, I figured the ES6 template string would work.
This would allow me to write a plain JS file with proper syntax highlighting.
The issue is that the variables live in the file that runs the eval().
Perhaps I am completely wrong here and looking at this the wrong way. I am open to suggestions. Please do not mark me minus 1 because of my infancy in programming. I really do not know how else to ask this question. Thank you.
Assuming your source is stored in exampleScriptFile:
// polyfill
const fs = { readFileSync() { return 'console.log(`${EXAMPLE_3}`);'; } };
// CONSTANTS
const EXAMPLE_1 = 'EXAMPLE_1';
const EXAMPLE_2 = 'EXAMPLE_2';
const EXAMPLE_3 = 'EXAMPLE_3';
const exampleScriptFile = fs.readFileSync('./exampleScriptFile.js');
// What I want is to just use a JS file because it is neater.
eval(exampleScriptFile);
Update
Perhaps I wasn't clear. The ./exampleScriptFile.js should be:
console.log(`${EXAMPLE_3}`);
While what you're describing can be done with eval as #PatrickRoberts demonstrates, that doesn't extend to executeJavaScript.
The former runs in the caller's context, while the latter triggers an IPC call to another process with the contents of the code. Presumably that process doesn't have any information about the caller's context, and therefore the template strings can't be populated with variables defined in it.
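A hedged workaround, assuming only the values (not live references) are needed: serialize the caller's values into the code string before handing it to executeJavaScript. The webContents variable and EXAMPLE_3 constant below are assumed to exist in the caller's scope:
// Only the code string crosses the IPC boundary, so bake the values in first.
// JSON.stringify turns EXAMPLE_3 into a quoted string literal inside the generated code.
const script = `console.log(${JSON.stringify(EXAMPLE_3)});`;
webContents.executeJavaScript(script);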
Relevant snippets from electron/lib/browsers/api/web-contents.js:
WebContents.prototype.send = function (channel, ...args) {
  // ...
  return this._send(false, channel, args)
}
// ...
WebContents.prototype.executeJavaScript = function (code, hasUserGesture, callback) {
  // ...
  return asyncWebFrameMethods.call(this, requestId, 'executeJavaScript',
  // ...
}
// ...
const asyncWebFrameMethods = function (requestId, method, callback, ...args) {
  return new Promise((resolve, reject) => {
    this.send('ELECTRON_INTERNAL_RENDERER_ASYNC_WEB_FRAME_METHOD', requestId, method, args)
    // ...
  })
}
Relevant snippets from electron/atom/browser/api/atom_api_web_contents.cc
// ...
void WebContents::BuildPrototype(v8::Isolate* isolate,
                                 v8::Local<v8::FunctionTemplate> prototype) {
  prototype->SetClassName(mate::StringToV8(isolate, "WebContents"));
  mate::ObjectTemplateBuilder(isolate, prototype->PrototypeTemplate())
      // ...
      .SetMethod("_send", &WebContents::SendIPCMessage)
      // ...
}

Node.js make initialized object available in all modules

I have an object that I initialize in my app.js file, and I would like to make this initialized object available in all modules. How could I do that? Passing the object to every module is one way, but I'm wondering if I'm missing anything or if it should be done differently.
I saw that mongoose supports a default connection, which I only need to initialize once in app.js; anywhere in other modules I can simply use it without passing it around. Is there any way I can do the same?
I also checked the global object docs from Node.js (http://nodejs.org/api/globals.html), and I'm wondering whether I should use global for this.
Thanks
A little advice:
You should only very rarely need to use a global. If you think you need one, you probably don't.
Singletons are usually an anti-pattern in Node.js, but sometimes (logging, config) they will get the job done just fine.
Passing something around is sometimes a useful and worthwhile pattern.
Here's an example of how you might use a singleton for logging:
lib/logger.js
var bunyan = require('bunyan'),
    mixIn = require('mout/object/mixIn'),
    // add some default options here...
    defaults = {},
    // singleton
    logger,
    createLogger = function createLogger(options) {
        var opts;
        if (logger) {
            return logger;
        }
        opts = mixIn({}, defaults, options);
        logger = bunyan.createLogger(opts);
        return logger;
    };
module.exports = createLogger;
lib/module.js
var logger = require('./logger.js'),
    log = logger();
log.info('Something happened.');
Hope that helps.
The solution, as you suggest, is to add the object as a property of the global object. However, I would recommend against doing this; instead, place the object in its own module that is required from every other module that needs it. You will gain several benefits later on. For one, it is always explicit where the object comes from and where it is initialized. You will never have a situation where you try to use the object before it is initialized (assuming that the module that defines it also initializes it). It will also make your code more testable.
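A rough sketch of that approach (file and property names are illustrative): the shared object lives in its own module, is initialized once in app.js, and every other require() returns the same cached instance because Node caches modules.
// lib/shared.js
var shared = {};
shared.init = function (options) {
    // store whatever needs to be available everywhere
    shared.config = options;
};
module.exports = shared;
// app.js
var shared = require('./lib/shared');
shared.init({ env: 'production' });
// any-other-module.js
var shared = require('./lib/shared');
console.log(shared.config.env); // 'production'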
There are multiple solutions to this problem, depending on how large your application is. The two solutions you have mentioned are the most obvious ones. I would rather go for a third, which is based on re-architecting your code. The solution that I am providing looks a lot like the executor pattern.
First create actions which require your common module that are in this particular form -
var Action_One = function(commonItems) {
    this.commonItems = commonItems;
};
Action_One.prototype.execute = function() {
    //..blah blah
    //Your action specific code
};
var Action_Two = function(commonItems) {
    this.commonItems = commonItems;
};
Action_Two.prototype.execute = function() {
    //..blah blah
    //Your action_two specific code
};
Now create an action initializer which will programmatically initialize your actions like this -
var ActionInitializer = function(commonItems) {
    this.commonItems = commonItems;
};
ActionInitializer.prototype.init = function(Action) {
    var obj = new Action(this.commonItems);
    return obj;
};
Next step is to create an action executor -
//You can create a more complex executor using `Async` lib or something else
var Executor = function(ActionInitializer, commonItems) {
    this.initializer = new ActionInitializer(commonItems);
    this.actions = [];
};
//Use this to add an action to the executor
Executor.prototype.add = function(action) {
    var result = this.initializer.init(action);
    this.actions.push(result);
};
//Executes all the actions
Executor.prototype.executeAll = function() {
    var result = [];
    for (var i = this.actions.length - 1; i >= 0; i--) {
        result[i] = this.actions[i].execute();
    }
    this.actions = [];
    return result;
};
The idea is to decouple every module so that only one module (the Executor in this case) depends on the common properties. Now let's see how it would work -
var commonProperties = {a: 1, b: 2};
//Pass the action initializer class and the common property object to just this one module
var e = new Executor(ActionInitializer, commonProperties);
e.add(Action_One);
e.add(Action_Two);
var results = e.executeAll();
console.log(results);
This way your program will be cleaner and more scalable. Shoot questions if it's not clear. Happy coding!

Load "Vanilla" Javascript Libraries into Node.js

There are some third party Javascript libraries that have some functionality I would like to use in a Node.js server. (Specifically I want to use a QuadTree javascript library that I found.) But these libraries are just straightforward .js files and not "Node.js libraries".
As such, these libraries don't follow the exports.var_name syntax that Node.js expects for its modules. As far as I understand that means when you do module = require('module_name'); or module = require('./path/to/file.js'); you'll end up with a module with no publicly accessible functions, etc.
My question then is "How do I load an arbitrary javascript file into Node.js such that I can utilize its functionality without having to rewrite it so that it does do exports?"
I'm very new to Node.js so please let me know if there is some glaring hole in my understanding of how it works.
EDIT: Researching into things more and I now see that the module loading pattern that Node.js uses is actually part of a recently developed standard for loading Javascript libraries called CommonJS. It says this right on the module doc page for Node.js, but I missed that until now.
It may end up being that the answer to my question is "wait until your library's authors get around to writing a CommonJS interface or do it your damn self."
Here's what I think is the 'rightest' answer for this situation.
Say you have a script file called quadtree.js.
You should build a custom node_module that has this sort of directory structure...
./node_modules/quadtree/quadtree-lib/
./node_modules/quadtree/quadtree-lib/quadtree.js
./node_modules/quadtree/quadtree-lib/README
./node_modules/quadtree/quadtree-lib/some-other-crap.js
./node_modules/quadtree/index.js
Everything in your ./node_modules/quadtree/quadtree-lib/ directory comes straight from your 3rd party library.
Then your ./node_modules/quadtree/index.js file will just load that library from the filesystem and do the work of exporting things properly.
var fs = require('fs');
// Read and eval library
var filedata = fs.readFileSync('./node_modules/quadtree/quadtree-lib/quadtree.js', 'utf8');
eval(filedata);
/* The quadtree.js file defines a class 'QuadTree' which is all we want to export */
exports.QuadTree = QuadTree;
Now you can use your quadtree module like any other node module...
var qt = require('quadtree');
qt.QuadTree();
I like this method because there's no need to go changing any of the source code of your 3rd party library--so it's easier to maintain. All you need to do on upgrade is look at their source code and ensure that you are still exporting the proper objects.
There is a much better method than using eval: the vm module.
For example, here is my execfile module, which evaluates the script at path in either context or the global context:
var vm = require("vm");
var fs = require("fs");
module.exports = function(path, context) {
context = context || {};
var data = fs.readFileSync(path);
vm.runInNewContext(data, context, path);
return context;
}
And it can be used like this:
> var execfile = require("execfile");
> // `someGlobal` will be a global variable while the script runs
> var context = execfile("example.js", { someGlobal: 42 });
> // And `getSomeGlobal` defined in the script is available on `context`:
> context.getSomeGlobal()
42
> context.someGlobal = 16
> context.getSomeGlobal()
16
Where example.js contains:
function getSomeGlobal() {
    return someGlobal;
}
The big advantage of this method is that you've got complete control over the global variables in the executed script: you can pass in custom globals (via context), and all the globals created by the script will be added to context. Debugging is also easier because syntax errors and the like will be reported with the correct file name.
The simplest way is: eval(require('fs').readFileSync('./path/to/file.js', 'utf8'));
This works great for testing in the interactive shell.
AFAIK, that is indeed how modules must be loaded.
However, instead of tacking all exported functions onto the exports object, you can also tack them onto this (what would otherwise be the global object).
So, if you want to keep the other libraries compatible, you can do this:
this.quadTree = function () {
    // the function's code
};
or, when the external library already has its own namespace, e.g. jQuery (not that you can use that in a server-side environment):
this.jQuery = jQuery;
In a non-Node environment, this would resolve to the global object, thus making it a global variable... which it already was. So it shouldn't break anything.
Edit:
James Herdman has a nice writeup about node.js for beginners, which also mentions this.
I'm not sure if I'll actually end up using this because it's a rather hacky solution, but one way around this is to build a little mini-module importer like this...
In the file ./node_modules/vanilla.js:
var fs = require('fs');
exports.require = function(path, names_to_export) {
    var filedata = fs.readFileSync(path, 'utf8');
    eval(filedata);
    var exported_obj = {};
    for (var i in names_to_export) {
        var to_eval = 'exported_obj[names_to_export[i]] = '
                    + names_to_export[i] + ';';
        eval(to_eval);
    }
    return exported_obj;
}
Then when you want to use your library's functionality you'll need to manually choose which names to export.
So for a library like the file ./lib/mylibrary.js...
function Foo() { /* Do something... */ }
biz = "Blah blah";
var bar = {'baz':'filler'};
When you want to use its functionality in your Node.js code...
var vanilla = require('vanilla');
var mylibrary = vanilla.require('./lib/mylibrary.js',['biz','Foo'])
mylibrary.Foo // <-- this is Foo()
mylibrary.biz // <-- this is "Blah blah"
mylibrary.bar // <-- this is undefined (because we didn't export it)
Don't know how well this would all work in practice though.
I was able to make it work by updating their script, very easily, simply adding module.exports = where appropriate...
For example, I took their file and I copied to './libs/apprise.js'. Then where it starts with
function apprise(string, args, callback){
I assigned the function to module.exports, thus:
module.exports = function(string, args, callback){
Thus I'm able to import the library into my code like this:
window.apprise = require('./libs/apprise.js');
And I was good to go. YMMV, this was with webpack.
A simple include(filename) function that gives better error messages (stack, filename, etc.) than plain eval when something goes wrong:
var fs = require('fs');
// circumvent nodejs/v8 "bug":
// https://github.com/PythonJS/PythonJS/issues/111
// http://perfectionkills.com/global-eval-what-are-the-options/
// e.g. a "function test() {}" will be undefined, but "test = function() {}" will exist
var globalEval = (function() {
    var isIndirectEvalGlobal = (function(original, Object) {
        try {
            // Does `Object` resolve to a local variable, or to a global, built-in `Object`,
            // reference to which we passed as a first argument?
            return (1, eval)('Object') === original;
        } catch (err) {
            // if indirect eval errors out (as allowed per ES3), then just bail out with `false`
            return false;
        }
    })(Object, 123);
    if (isIndirectEvalGlobal) {
        // if indirect eval executes code globally, use it
        return function(expression) {
            return (1, eval)(expression);
        };
    } else if (typeof window !== 'undefined' && typeof window.execScript !== 'undefined') {
        // if `window.execScript` exists, use it
        return function(expression) {
            return window.execScript(expression);
        };
    }
    // otherwise, globalEval is `undefined` since nothing is returned
})();
function include(filename) {
    var file_contents = fs.readFileSync(filename, "utf8");
    try {
        //console.log(file_contents);
        globalEval(file_contents);
    } catch (e) {
        e.fileName = filename;
        var keys = ["columnNumber", "fileName", "lineNumber", "message", "name", "stack"];
        for (var key in keys) {
            var k = keys[key];
            console.log(k, " = ", e[k]);
        }
        fo = e;
        //throw new Error("include failed");
    }
}
But it gets even dirtier with Node.js: you need to specify this:
export NODE_MODULE_CONTEXTS=1
nodejs tmp.js
Otherwise you cannot use global variables in files included with include(...).

Lazy Load External Javascript Files

I am trying to write a JavaScript class that loads script files as they are needed. I have most of this working. It is possible to use the library with the following syntax:
var scriptResource = new ScriptResource('location/of/my/script.js');
scriptResource.call('methodName', arg1, arg2);
I would like to add some additional syntactic sugar so you could write
var scriptResource = new ScriptResource('location/of/my/script.js');
scriptResource.methodName(arg1, arg2);
I'm almost certain that this isn't possible, but there may be an inventive solution. I guess what there needs to be is some sort of method-call event, so the following could work:
ScriptResource = function(scriptLocation)
{
    this.onMethodCall = function(methodName)
    {
        this.call(arguments);
    }
}
This code is obviously very incomplete, but I hope it gives an idea of what I am trying to do.
Is something like this even remotely possible?
There is a non-standard method, __noSuchMethod__, in Firefox that does what you're looking for.
Have a look at
https://developer.mozilla.org/en/Core_JavaScript_1.5_Reference/Global_Objects/Object/noSuchMethod
so you could define:
obj.__noSuchMethod__ = function( id, args ) {
    this[id].apply( this, args );
}
If the set of method names is limited, then you could generate those methods:
var methods = ["foo", "bar", "baz"];
for (var i=0; i<methods.length; i++) {
var method_name = methods[i];
WildCardMethodHandler[method_name] = function () {
this.handleAllMethods(method_name);
};
}
edit: Posted this answer before the question changed dramatically.
An intermediary solution might be to have syntax such as:
var extObj = ScriptResource('location/of/my/script.js');
extObj('methodname')(arg1,arg2);
the code might look like this:
function ScriptResource(file) {
    return function(method) {
        loadExternalScript(file);
        return window[method];
    }
}
All kinds of assumptions in the code above, which I'll let you figure out yourself. The most interesting, IMHO, is: in your original implementation, how do you get the proxied method to run synchronously and return a value? AFAIK you can only load external scripts asynchronously and handle them with an "onload" callback.
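For what it's worth, here is a hedged sketch (not part of the original answer) of one way to live with the asynchronous loading: return a Promise that resolves once the script's onload fires, at the cost of the proxied call itself becoming asynchronous. It assumes the loaded script attaches its functions to window:
function ScriptResource(file) {
    // Start loading the script once and remember the pending load.
    var loaded = new Promise(function (resolve, reject) {
        var s = document.createElement('script');
        s.src = file;
        s.onload = function () { resolve(); };
        s.onerror = reject;
        document.head.appendChild(s);
    });
    return function (method) {
        return function () {
            var args = arguments;
            // Wait for the script, then call the global it defined.
            return loaded.then(function () {
                return window[method].apply(null, args);
            });
        };
    };
}
// Usage: extObj('methodname')(arg1, arg2).then(function (result) { ... });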
