Node.js: require code instead of a file - javascript

I am trying to set up an interface where I can write one JS file that can be used on the server (Node.js) and on the client (browser JavaScript).
An example file would be a Vector object, that I would like to use on both the client and the server, as I am creating a multiplayer game.
In node.js, I know that you can use the following syntax to require source files...
var Vector = require('./vector');
Then you can access its module.exports by typing in Vector.
The problem here is that for the server I need an extra bit of code at the end of the file...
module.exports = Vector;
... which is not necessary on the client.
Is it possible to maybe require source code, something like the following?
var data = (...) // get data from vector.js file
var Vector = require_code(data + 'module.exports = Vector');
If not, there might be another way of doing what I am trying to accomplish.
That might sound a little confusing, but help is greatly appreciated!
Thanks in advance,
David.

Sounds like you are looking for UMDs - Universal Module Definitions.
(function (root, factory) {
    if (typeof define === "function" && define.amd) {
        define(["jquery", "underscore"], factory);
    } else if (typeof exports === "object") {
        module.exports = factory(require("jquery"), require("underscore"));
    } else {
        root.Requester = factory(root.$, root._);
    }
}(this, function ($, _) {
    // this is where I defined my module implementation
    var Requester = { /* ... */ };
    return Requester;
}));
You'll need to change the name in root.Requester to be the name of your module. root picks up the value of this, which will be the global object (what you normally call window in the browser).
This particular example looks for jQuery and Underscore as example dependencies, but they are easy enough to factor out if you don't need them.
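For the asker's use case (a Vector object shared by client and server), here is a minimal sketch, assuming a dependency-free module, of the same wrapper with the jQuery/Underscore dependencies factored out:
// vector.js - a sketch of the UMD pattern for a dependency-free module
(function (root, factory) {
    if (typeof define === "function" && define.amd) {
        define([], factory);                 // AMD / RequireJS
    } else if (typeof exports === "object") {
        module.exports = factory();          // Node.js / CommonJS
    } else {
        root.Vector = factory();             // plain browser global
    }
}(this, function () {
    function Vector(x, y) {
        this.x = x;
        this.y = y;
    }
    Vector.prototype.add = function (other) {
        return new Vector(this.x + other.x, this.y + other.y);
    };
    return Vector;
}));
On the server this file can then be loaded with var Vector = require('./vector');, while in the browser the same file simply defines window.Vector.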

Related

Trying to get one JavaScript file to be usable both by a node.js file and in a browser

I've got a relatively simple JavaScript file (call it foo.js) that has 3 functions that are being called by a second JS file in the browser. There are a few other functions in foo.js, but they are only used internally.
Now foo.js also needs to be able to be used by a JS file running in node.js. Same thing, only needs to access the three basic functions.
So I added module.exports around these three functions like so:
module.exports = {
    init_foo: function (bar) {
        return JSON.parse(bar);
    },
    export_foo: function (foobar) {
        return JSON.stringify(foobar);
    },
    switch_foo: function (boofar) {
        switch (boofar) {
            case 'A':
                return 1;
            case 'B':
                return 2;
            default:
                return 3;
        }
    }
};
So now my node.js file can get the code by using
var foo = require('./foo.js');
But of course the browser code can't use it anymore; it gives an error. While looking for a solution I found browserify, but I can't seem to get it to work (it keeps returning an empty file even when I follow the suggested tutorial; I'm guessing it's something to do with the setup of the system I am using, I'm just not sure what), and it seems more complex than I need anyway (I don't want to browserify the entire browser-side JavaScript, but I can't browserify foo.js on its own either; I'd have to make a new JS file that requires foo.js and uses it, then browserify that, effectively adding a middleman that wasn't needed before).
Seeing as the code I want to access from both Node.js and the browser is relatively simple, is there an easy way to do this? (Just writing the code twice isn't a solution; it is simple code, but I only want to have to edit it once for changes to propagate to both places.)
It is probably best to use a specialized package like browserify, but for a small thing like yours the following might be a better fit. (I used it in my primesieve module.)
var myModule = (function() {
    return {
        init_foo: function (bar) {
            return JSON.parse(bar);
        },
        export_foo: function (foobar) {
            return JSON.stringify(foobar);
        },
        switch_foo: function (boofar) {
            switch (boofar) {
                case 'A':
                    return 1;
                case 'B':
                    return 2;
                default:
                    return 3;
            }
        }
    };
})();

if (typeof module !== 'undefined' && typeof module.exports !== 'undefined') {
    module.exports = myModule;
} else {
    if (typeof define === 'function' && define.amd) {
        define([], function() {
            return myModule;
        });
    } else {
        window.myModule = myModule;
    }
}
Some time has passed since then and better methods might have evolved, but it is small and simple and it worked for me.
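As a quick usage sketch (my own illustration, assuming the snippet above is saved as foo.js): in Node you require the file, while in the browser a plain script tag makes it available as window.myModule.
// Node.js side
var foo = require('./foo.js');
console.log(foo.switch_foo('A'));       // 1
console.log(foo.init_foo('{"a": 1}'));  // { a: 1 }

// Browser side, after <script src="foo.js"></script>:
// console.log(myModule.switch_foo('B'));   // 2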
Welp, one thing you can do is assign to module.exports only if module exists, and hope that no other JavaScript has introduced module into your browser environment as a global variable.
(function() {
    var exports
    // ...
    if (this.hasOwnProperty('module'))
        module.exports = exports
})()
Note that the this there is the global object.

How to write a module that works with Node.js, RequireJS as well as without them

I am working on a JavaScript library for JSON/XML processing. My library works in the browser as well as in Node.js (with the xmldom and xmlhttprequest modules).
One of the users recently asked for RequireJS support. I have taken a look at the RequireJS/AMD thing and think it is a good approach so I'd like to provide this.
However I'd like to retain the portability: my library must work in browsers (with and without RequireJS) as well as Node.js. And in the browser environment I don't depend on xmldom or xmlhttprequest since these things are provided by the browser itself.
My question is: how can I implement my library so that it works in browsers as well as in Node.js, with and without RequireJS?
A bit of history and my current solution
I initially wrote my library for browsers. So it just created a global-scope object and put everything inside it:
var Jsonix = { ... };
Later on users asked for Node.js support. So I added:
if(typeof require === 'function'){
    module.exports.Jsonix = Jsonix;
}
I also had to import few modules mentioned above. I did it conditionally, depending on whether the require function is available or not:
if (typeof require === 'function')
{
    var XMLHttpRequest = require('xmlhttprequest').XMLHttpRequest;
    return new XMLHttpRequest();
}
Now there's this story with RequireJS. If RequireJS is present then the require function is present as well. But module loading works differently, and I have to use the define function etc. I also can't just require things, since require has an async API in RequireJS. Moreover, if my library is loaded via RequireJS, it seems to scan the source code and detect require('something') calls even if I guard them conditionally, like
if (typeof require === 'function' && typeof require.specified !== 'function') ...
RequireJS still detects require('xmlhttprequest') and tries to load the corresponding JS file.
Currently I'm coming to the following solution.
// Module factory function, AMD style
var _jsonix = function(_jsonix_xmldom, _jsonix_xmlhttprequest, _jsonix_fs)
{
    // Complete Jsonix script is included below
    var Jsonix = { ... };
    // Complete Jsonix script is included above
    return { Jsonix: Jsonix };
};

// If the require function exists ...
if (typeof require === 'function') {
    // ... but the define function does not exist, assume we're in the Node.js environment
    // In this case, load the define function via amdefine
    if (typeof define !== 'function') {
        var define = require('amdefine')(module);
        define(["xmldom", "xmlhttprequest", "fs"], _jsonix);
    }
    else {
        // Otherwise assume we're in the RequireJS environment
        define([], _jsonix);
    }
}
// Since the require function does not exist,
// assume we're neither in Node.js nor in the RequireJS environment
// This is probably a browser environment
else
{
    // Call the module factory directly
    var Jsonix = _jsonix();
}
And this is how I check for dependencies now:
if (typeof _jsonix_xmlhttprequest !== 'undefined')
{
    var XMLHttpRequest = _jsonix_xmlhttprequest.XMLHttpRequest;
    return new XMLHttpRequest();
}
If I have require but not define then I assume this is a Node.js environment. I use amdefine to define the module and pass the required dependencies.
If I have require and define then I assume this is a RequireJS environment, so I just use the define function. Currently I also assume this is a browser environment, so dependencies like xmldom and xmlhttprequest are not available and I don't require them. (This is probably not correct.)
If I don't have the require function then I assume this is a browser environment without RequireJS/AMD support so I invoke the module factory _jsonix directly and export the result as a global object.
So, this is my approach so far. It seems a little bit awkward to me, and as a newbie to RequireJS/AMD I'm seeking advice. Is it the right approach? Are there better ways to address the problem? I'd be grateful for your help.
Take a look at how underscore.js handles it.
// Export the Underscore object for **Node.js**, with
// backwards-compatibility for the old `require()` API. If we're in
// the browser, add `_` as a global object.
if (typeof exports !== 'undefined') {
    if (typeof module !== 'undefined' && module.exports) {
        exports = module.exports = _;
    }
    exports._ = _;
} else {
    root._ = _;
}
...
// AMD registration happens at the end for compatibility with AMD loaders
// that may not enforce next-turn semantics on modules. Even though general
// practice for AMD registration is to be anonymous, underscore registers
// as a named module because, like jQuery, it is a base library that is
// popular enough to be bundled in a third party lib, but not be part of
// an AMD load request. Those cases could generate an error when an
// anonymous define() is called outside of a loader request.
if (typeof define === 'function' && define.amd) {
    define('underscore', [], function() {
        return _;
    });
}
This is what I ended up with:
// If the require function exists ...
if (typeof require === 'function') {
    // ... but the define function does not exist
    if (typeof define !== 'function') {
        // Assume we're in the Node.js environment
        // In this case, load the define function via amdefine
        var define = require('amdefine')(module);
        // Use xmldom and xmlhttprequest as dependencies
        define(["xmldom", "xmlhttprequest", "fs"], _jsonix_factory);
    }
    else {
        // Otherwise assume we're in the browser/RequireJS environment
        // Load the module without the xmldom and xmlhttprequest dependencies
        define([], _jsonix_factory);
    }
}
// If the require function does not exist, we're not in Node.js and are therefore in a browser environment
else
{
    // Just call the factory and set Jsonix as a global.
    var Jsonix = _jsonix_factory().Jsonix;
}
Here is a template I'm currently using; it's both AMD and Node compatible, though not directly loadable stand-alone in the browser...
The main advantage to this approach is that the domain-specific code does not need to care about what imported it, for the general case.
/**********************************************************************
*
*
*
**********************************************************************/
((typeof define)[0]=='u'?function(f){module.exports=f(require)}:define)(
function(require){ var module={} // makes module AMD/node compatible...
/*********************************************************************/
/*********************************************************************/
/**********************************************************************
* vim:set ts=4 sw=4 : */ return module })
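Filled in with a trivial module, a minimal sketch of how the template could be used might look like this (the add function is purely illustrative and not part of the original answer):
((typeof define)[0] == 'u' ? function (f) { module.exports = f(require); } : define)(
function (require) { var module = {} // makes module AMD/node compatible...

    // domain-specific code simply attaches things to the local `module` object
    module.add = function (a, b) { return a + b; };

return module; });
Under Node the ternary resolves to module.exports = factory(require), so var m = require('./mymodule'); m.add(1, 2); works; under an AMD loader the same file is handed to define() with a CommonJS-style require argument.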

Sharing Code between Node.js and the browser

I'm working on a project that uses Node.js. I'm familiar with JavaScript, but not great. As part of that, I've run into a challenge that I'm not sure how to overcome.
I need to share some code on the server (Node.js) and my client-side (browser) app. I want to be able to access this code by typing the following:
myCompany.myProject.myFunction(someValue);
or just
myProject.myFunction(someValue);
In an attempt to do this, I have the following:
'use strict';

var myCompany = myCompany || {};
var myProject = myCompany.myProject || {};

myProject.myFunction = function (someValue) {
    console.log(someValue);
};
Inside of myFunction, I want to do one thing if I'm running on the server (Node.js) and something different if I'm running in the browser. However, I'm not sure how to do that. I reviewed this post and this SO question, yet I still don't understand it.
Thank you for your help!
You need something like this:
function someFunctionName() {
    // Common functionality
    if (typeof module !== 'undefined' && module.exports) {
        // Do something only in Node.js
    } else {
        // Do something else in the browser
    }
    // Common functionality
}
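Applied to the namespace from the question, a minimal sketch (my own illustration, not from the answer) could look like this; the same environment check also guards the export at the bottom of the file:
var myCompany = myCompany || {};
myCompany.myProject = myCompany.myProject || {};

myCompany.myProject.myFunction = function (someValue) {
    if (typeof module !== 'undefined' && module.exports) {
        // Node.js: e.g. write to stdout, touch the filesystem, etc.
        process.stdout.write('server: ' + someValue + '\n');
    } else {
        // Browser: e.g. touch the DOM, use window APIs, etc.
        console.log('browser: ' + someValue);
    }
};

// Export for Node.js only; in the browser, myCompany simply stays a global.
if (typeof module !== 'undefined' && module.exports) {
    module.exports = myCompany;
}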

Passing in window to this in browserify

One of my dependencies uses the following to pass in window to its closure
(function (window) {
    //
})(this)
For the time being I can just change it to something more sensible so that it doesn't break browserify, but is there some method whereby I can force a value for this in a browserified module?
I wrote a browserify transform called "moduleify" that should generally do what you want, i.e. wrap the offending code in an IIFE that looks kinda like this:
(function () {
    // this === window
}.call(window));
In fact, my implementation is not much more sophisticated than that.
The original idea was to export a globals-polluting "module" as if it were a CommonJS module (e.g. have AngularJS export window.angular), but because it contains that wrapper, it should do the trick.
For instructions, see the README. If the offending script doesn't actually have anything it could reasonably export, just have it export window (which will result in module.exports = window['window']) or an arbitrary name that doesn't exist (resulting in undefined).
If you want to access the window object in your own browserify code, also check out the global module, which provides a nice wrapper to access browser globals safely in CommonJS modules.
To solve this specific problem, a simple transform that wraps the code in a self-calling function will do the job.
CoffeeScript
through = require('through')

fenestrate = (file) ->
  data = ';(function() {\n'
  write = (buf) ->
    data += buf
  end = ->
    data += '\n}).call(window);'
    this.queue(data)
    this.queue(null)
  through(write, end)
JavaScript
var through = require('through');

var fenestrate = function(file) {
    var data, end, write;
    data = ';(function() {\n';
    write = function(buf) {
        return data += buf;
    };
    end = function() {
        data += '\n}).call(window);';
        this.queue(data);
        return this.queue(null);
    };
    return through(write, end);
};
Writing Transforms: https://github.com/substack/browserify-handbook#transforms
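As a usage sketch (my own illustration, not part of the answer): if the JavaScript version above is exported from a file, say fenestrate.js, via module.exports = fenestrate;, it can be registered with browserify either on the command line with -t ./fenestrate.js or programmatically:
// build.js - minimal sketch, assumes ./fenestrate.js exports the transform above
var browserify = require('browserify');
var fs = require('fs');

browserify('./main.js')
    .transform(require('./fenestrate.js'))   // wraps each file in (function(){...}).call(window)
    .bundle()
    .pipe(fs.createWriteStream('./bundle.js'));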

Load "Vanilla" Javascript Libraries into Node.js

There are some third party Javascript libraries that have some functionality I would like to use in a Node.js server. (Specifically I want to use a QuadTree javascript library that I found.) But these libraries are just straightforward .js files and not "Node.js libraries".
As such, these libraries don't follow the exports.var_name syntax that Node.js expects for its modules. As far as I understand that means when you do module = require('module_name'); or module = require('./path/to/file.js'); you'll end up with a module with no publicly accessible functions, etc.
My question then is "How do I load an arbitrary javascript file into Node.js such that I can utilize its functionality without having to rewrite it so that it does do exports?"
I'm very new to Node.js so please let me know if there is some glaring hole in my understanding of how it works.
EDIT: Researching into things more and I now see that the module loading pattern that Node.js uses is actually part of a recently developed standard for loading Javascript libraries called CommonJS. It says this right on the module doc page for Node.js, but I missed that until now.
It may end up being that the answer to my question is "wait until your library's authors get around to writing a CommonJS interface or do it your damn self."
Here's what I think is the 'rightest' answer for this situation.
Say you have a script file called quadtree.js.
You should build a custom node_module that has this sort of directory structure...
./node_modules/quadtree/quadtree-lib/
./node_modules/quadtree/quadtree-lib/quadtree.js
./node_modules/quadtree/quadtree-lib/README
./node_modules/quadtree/quadtree-lib/some-other-crap.js
./node_modules/quadtree/index.js
Everything in your ./node_modules/quadtree/quadtree-lib/ directory comes straight from your 3rd party library.
Then your ./node_modules/quadtree/index.js file will just load that library from the filesystem and do the work of exporting things properly.
var fs = require('fs');
// Read and eval library
filedata = fs.readFileSync('./node_modules/quadtree/quadtree-lib/quadtree.js','utf8');
eval(filedata);
/* The quadtree.js file defines a class 'QuadTree' which is all we want to export */
exports.QuadTree = QuadTree
Now you can use your quadtree module like any other node module...
var qt = require('quadtree');
qt.QuadTree();
I like this method because there's no need to go changing any of the source code of your 3rd party library--so it's easier to maintain. All you need to do on upgrade is look at their source code and ensure that you are still exporting the proper objects.
There is a much better method than using eval: the vm module.
For example, here is my execfile module, which evaluates the script at path in either context or the global context:
var vm = require("vm");
var fs = require("fs");
module.exports = function(path, context) {
context = context || {};
var data = fs.readFileSync(path);
vm.runInNewContext(data, context, path);
return context;
}
And it can be used like this:
> var execfile = require("execfile");
> // `someGlobal` will be a global variable while the script runs
> var context = execfile("example.js", { someGlobal: 42 });
> // And `getSomeGlobal` defined in the script is available on `context`:
> context.getSomeGlobal()
42
> context.someGlobal = 16
> context.getSomeGlobal()
16
Where example.js contains:
function getSomeGlobal() {
    return someGlobal;
}
The big advantage of this method is that you've got complete control over the global variables in the executed script: you can pass in custom globals (via context), and all the globals created by the script will be added to context. Debugging is also easier because syntax errors and the like will be reported with the correct file name.
The simplest way is: eval(require('fs').readFileSync('./path/to/file.js', 'utf8'));
This works great for testing in the interactive shell.
AFAIK, that is indeed how modules must be loaded.
However, instead of tacking all exported functions onto the exports object, you can also tack them onto this (what would otherwise be the global object).
So, if you want to keep the other libraries compatible, you can do this:
this.quadTree = function () {
    // the function's code
};
or, when the external library already has its own namespace, e.g. jQuery (not that you can use that in a server-side environment):
this.jQuery = jQuery;
In a non-Node environment, this would resolve to the global object, thus making it a global variable... which it already was. So it shouldn't break anything.
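A minimal end-to-end sketch of that idea (my own illustration; the real QuadTree constructor will differ): because a Node module's top-level this is module.exports, the very same file works both as a browser script and as a require()-able module.
// quadtree.js - in a browser, `this` is window; in Node, it is module.exports
this.QuadTree = function (bounds) {
    this.bounds = bounds;
};

// app.js (Node side):
// var qt = require('./quadtree.js');
// var tree = new qt.QuadTree({ x: 0, y: 0, width: 100, height: 100 });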
Edit:
James Herdman has a nice writeup about node.js for beginners, which also mentions this.
I'm not sure if I'll actually end up using this because it's a rather hacky solution, but one way around this is to build a little mini-module importer like this...
In the file ./node_modules/vanilla.js:
var fs = require('fs');

exports.require = function (path, names_to_export) {
    filedata = fs.readFileSync(path, 'utf8');
    eval(filedata);
    exported_obj = {};
    for (i in names_to_export) {
        to_eval = 'exported_obj[names_to_export[i]] = '
            + names_to_export[i] + ';'
        eval(to_eval);
    }
    return exported_obj;
}
Then when you want to use your library's functionality you'll need to manually choose which names to export.
So for a library like the file ./lib/mylibrary.js...
function Foo() { /* Do something... */ }
biz = "Blah blah";
var bar = {'baz':'filler'};
When you want to use its functionality in your Node.js code...
var vanilla = require('vanilla');
var mylibrary = vanilla.require('./lib/mylibrary.js',['biz','Foo'])
mylibrary.Foo // <-- this is Foo()
mylibrary.biz // <-- this is "Blah blah"
mylibrary.bar // <-- this is undefined (because we didn't export it)
Don't know how well this would all work in practice though.
I was able to make it work by updating their script, very easily, simply adding module.exports = where appropriate...
For example, I took their file and copied it to './libs/apprise.js'. Then where it starts with
function apprise(string, args, callback){
I assigned the function to module.exports, like so:
module.exports = function(string, args, callback){
Thus I'm able to import the library into my code like this:
window.apprise = require('./libs/apprise.js');
And I was good to go. YMMV, this was with webpack.
A simple include(filename) function with better error messages (stack, filename, etc.) than plain eval, in case of errors:
var fs = require('fs');

// circumvent nodejs/v8 "bug":
// https://github.com/PythonJS/PythonJS/issues/111
// http://perfectionkills.com/global-eval-what-are-the-options/
// e.g. a "function test() {}" will be undefined, but "test = function() {}" will exist
var globalEval = (function() {
    var isIndirectEvalGlobal = (function(original, Object) {
        try {
            // Does `Object` resolve to a local variable, or to a global, built-in `Object`,
            // reference to which we passed as a first argument?
            return (1, eval)('Object') === original;
        } catch (err) {
            // if indirect eval errors out (as allowed per ES3), then just bail out with `false`
            return false;
        }
    })(Object, 123);

    if (isIndirectEvalGlobal) {
        // if indirect eval executes code globally, use it
        return function(expression) {
            return (1, eval)(expression);
        };
    } else if (typeof window.execScript !== 'undefined') {
        // if `window.execScript` exists, use it
        return function(expression) {
            return window.execScript(expression);
        };
    }
    // otherwise, globalEval is `undefined` since nothing is returned
})();

function include(filename) {
    file_contents = fs.readFileSync(filename, "utf8");
    try {
        //console.log(file_contents);
        globalEval(file_contents);
    } catch (e) {
        e.fileName = filename;
        keys = ["columnNumber", "fileName", "lineNumber", "message", "name", "stack"]
        for (key in keys) {
            k = keys[key];
            console.log(k, " = ", e[k])
        }
        fo = e;
        //throw new Error("include failed");
    }
}
But it even gets dirtier with nodejs: you need to specify this:
export NODE_MODULE_CONTEXTS=1
nodejs tmp.js
Otherwise you cannot use global variables in files included with include(...).
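As a usage sketch (my own illustration): per the comment at the top of the snippet, the included file has to create its objects via assignments rather than plain function declarations, since declarations do not survive the indirect eval.
// libs/vector.js (hypothetical file) - note the assignment style:
//     Vector = function (x, y) { this.x = x; this.y = y; };

include('./libs/vector.js');   // defines the global Vector
var v = new Vector(1, 2);
console.log(v.x + v.y);        // 3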
