Require without keyword - javascript

I have some code that goes like this:
function foo() {
  return 'bar';
}
function bar() {
  return 'foo';
}
var hello = 'world';
var world = 'hello';
I want to use these functions and variables in a different file. I know I can use module.exports = {foo, bar, hello, world} and then const foofile = require('./foofile.js') in a different file, but that makes me use them like foofile.foo(). Is there any way I can avoid this, so I just have to type foo() in the other file?
Edit:
I see the two answers, but when I test the code, everything I require is undefined. I might as well just put it here, since it isn't that long anyway.
// vars.js (in root folder)
const sleep = require('./functions/sleep');
const scenarios = require('./scenarios.json');
const fs = require('fs');
const Discord = require('discord.js');
const client = new Discord.Client();
module.exports = {sleep, scenarios, fs, Discord, client};

// sleep.js (in subfolder functions)
function sleep(ms) {
  return new Promise((resolve) => {
    setTimeout(resolve, ms);
  });
}
module.exports = sleep;

// startTheGame.js (in subfolder functions)
const {sleep, scenarios, fs, Discord, client} = require('../vars');
async function startTheGame(channel) {
  await sleep(1500);
  const embed = Discord.MessageEmbed();
}
module.exports = startTheGame;
It's returning "sleep is not a function", and when I delete await sleep(1500); it returns "cannot read property MessageEmbed of undefined".
I know this is becoming pretty long, and it might be because of a simple slip-up by me, but if anyone could help me, that would be great. Hope this helps someone else.

You can import them using object destructuring like this:
const {foo, bar, hello, world} = require('./foofile.js');
Then, foo, bar, hello and world will be top level variables in your module and you can just refer to them directly as in foo().

You can destructure the content required from the module and thereby avoid the foofile.foo() syntax you mentioned.
Consider the module you described and an index.js that requires it using the destructuring syntax.
// mymodule.js
function foo() {
  return "bar";
}
function bar() {
  return "foo";
}
// Consider using const and let instead of var
const hello = "world";
const world = "hello";
module.exports = { foo, bar, hello, world };
// index.js
const { foo, bar, hello, world } = require('./mymodule');
console.log(foo()); // "bar"
console.log(bar()); // "foo"
console.log(hello); // "world"
console.log(world); // "hello"

How can I "recursively" stringify a javascript function which calls other scoped functions?

Because JavaScript functions are not serializable, in order to pass them into new contexts it can sometimes (albeit rarely) be useful to stringify them and re-evaluate them later, like:
const foo = () => { /* do something */ }
const fooText = foo.toString()
// later... in new context & scope
const fooFunc = new Function('return (' + fooText + ').apply(null, arguments)')
fooFunc() // works!
However, if foo references another function bar, the scope is not stringified, so if bar is not defined in the new context, the evaluated foo function will throw an error when called.
I'm wondering if there is a way to stringify a function recursively?
That is, not only stringifying the parent function, but also stringifying the contents of the child functions called from the parent.
For Example:
let bar = () => { alert(1) }
let foo = () => { bar() }
// what toString does
let fooString = foo.toString()
console.log(fooString) // "() => { bar() }"
// what we want
let recursiveFooString = foo.recursiveToString()
console.log(recursiveFooString) // "() => { alert(1) }"
Let me know if you have any ideas on how to accomplish something like a "recursiveToString"
The only good way to do this is to start from a parent scope that encloses all functions foo eventually references. For example, with your foo and bar, if you want to pass foo into another context such that bar is callable as well, pass a function that declares both foo and bar, and returns foo. For example:
const makeFoo = () => {
  let bar = () => { alert(1) }
  let foo = () => { bar() }
  return foo;
};
const makeFooStr = makeFoo.toString();
// ...
const makeFooFunc = new Function(' return (' + makeFooStr + ').apply(null, arguments)');
const foo = makeFooFunc();
foo();
Implementing this sort of thing well does require premeditated design like above (unfortunately). You can't really include all ancestor LexicalEnvironments (the internal map of variable names to values in a given scope) when stringifying.
I'm wondering if there is a way to stringify a function recursively?
I think we can fairly simply demonstrate that this is impossible in general.
Let's think about these two functions:
const greet = (greeting) => (name) => `${greeting} ${name}`
const sayHi = greet ('Hi')
sayHi ('Jane') //=> "Hi Jane"
While with your foo and bar example we could perhaps imagine something that parsed the function body, determined which local variables are actually used, and pulled everything it needs from the current scope to build your extended stringify function. (I'm guessing that this would be impossible too, for reasons related to Rice's Theorem, but we can certainly imagine it.)
But here, note that
sayHi.toString() //=> "(name) => `${greeting} ${name}`"
and so sayHi depends on a free variable that's not stored in our current scope, namely, greeting. We simply have not stored the "Hi" used to create that function anywhere except in the closure scope of sayHi, which is not exposed anywhere.
So even this simple function could not be reliably serialized; there seems little hope for anything more complex.
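A short runnable sketch of the greet/sayHi example above makes this concrete: the closed-over greeting never appears in the stringified source, so re-evaluating that source in a fresh scope throws a ReferenceError.

```javascript
// The closure variable `greeting` is not part of the stringified source.
const greet = (greeting) => (name) => `${greeting} ${name}`;
const sayHi = greet('Hi');

const source = sayHi.toString(); // "(name) => `${greeting} ${name}`"

// Re-evaluate the source in a fresh scope: `greeting` no longer exists.
const revived = new Function('return (' + source + ')')();
try {
  revived('Jane');
} catch (e) {
  console.log(e instanceof ReferenceError); // true: greeting is not defined
}
```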
What I ended up rolling with was inspired by @CertainPerformance's answer.
The trick is to build a function which defines all the child callee functions. Then you have everything you need to stringify the parent function.
Note: to allow for imported callee functions from other files, I decided to programmatically build a string with the callee definitions rather than defining them originally in the same scope.
The code:
// original function definitions (could be in another file)
let bar = () => { alert(1) }
let foo = () => { bar() }
const allCallees = [ bar, foo ]

// build string of callee definitions
const calleeDefinitions = allCallees.reduce(
  (definitionsString, callee) => {
    return `${definitionsString} \n const ${callee.name} = ${callee.toString()};`;
  },
  "",
);

// wrap the definitions in a function that calls foo
const fooString = `() => { ${calleeDefinitions} \n return foo(); \n }`;
console.log(fooString);
/**
 * fooString looks like this:
 * `() => {
 *   const bar = () => { alert(1) };
 *   const foo = () => { bar() };
 *   return foo();
 * }`
 **/

// in new context & scope
const evaluatedFoo = new Function(' return (' + fooString + ').apply(null, arguments)');
// works as expected
evaluatedFoo();

Stub an internal function?

I want to stub an internal function in my code when unit testing it. Example:
//foobar.js
const uuid = require('uuid');

function foo() {
  console.log('uuid: ' + uuid.v4());
  // Lots of timers
}
exports._foo = foo;

function bar() {
  //Logic...
  foo();
  //Logic...
}
exports.bar = bar;
And the unit test:
// test/foobar.js
const chai = require('chai'),
  expect = chai.expect,
  proxyquire = require('proxyquire'),
  sinon = require('sinon');

describe('bar', () => {
  it('call foo', () => {
    let foo = proxyquire('./foo.js', {
        uuid: {
          v4: () => {
            return '123456789';
          }
        }
      }),
      fooSpy = sinon.spy(foo._foo);
    foo.bar();
    expect(fooSpy.calledOnce);
  });
});
Now, when unit testing bar, I can spy on foo just fine.
However, the real foo makes a lot of time-consuming calls (DB calls, file IO...), and while I could use proxyquire to stub all the fs and db calls so they terminate immediately, that would duplicate code from foo's own tests, be unreadable, and be bad altogether.
The simple solution would be to stub foo itself, but proxyquire doesn't seem to like that. A naive foo._foo = stubFoo didn't work either. Rewire doesn't seem to handle this either.
What I could do would be to make a file that imports and re-exports foobar.js and use proxyquire on that, but the idea itself is already bad.
How can I stub the foo function when testing bar?
Sinon is perfectly capable of handling this if done correctly; you just need to remember that Sinon replaces references on the object, and the export is not where the function is declared. With the same code and tests, a simple modification to bar makes it work:
function bar() {
  //Logic...
  exports._foo();
  //Logic...
}
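To see why the call site matters, here is a minimal self-contained sketch (plain objects, no sinon or proxyquire; all names hypothetical): swapping the reference on the exports object only affects callers that go through that object.

```javascript
// Minimal stand-in for a module's exports object.
const fakeExports = {};

fakeExports._foo = function foo() { return 'real foo'; };

// bar calls through the exports object, so a swapped _foo is picked up.
fakeExports.bar = function bar() {
  return fakeExports._foo();
};

// "Stub" the internal function by replacing the reference,
// which is essentially what sinon.stub(obj, 'method') does.
const original = fakeExports._foo;
fakeExports._foo = function () { return 'stubbed foo'; };
console.log(fakeExports.bar()); // "stubbed foo"

// Restore after the test.
fakeExports._foo = original;
console.log(fakeExports.bar()); // "real foo"
```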

monkeypatch exported function

Running on Node v4.
Let's say that I have 2 libraries: foo and bar.
foo has this
var bar = require('./lib/bar');
exports = module.exports = function myApp(options) {
  [snip]
}
exports.bar = bar;
and bar has this
module.exports = function doStuff(moreOptions) {
  function doMoreStuff () {
  }
}
my app has
x = requires(foo);
What I would like to do is get my app to monkeypatch the doMoreStuff function - is this possible?
I have tried various libraries, but I suspect there's a fundamental problem with my understanding of js ;)
Let's look at what you have here. After
x = requires(foo);
you have the following, essentially:
x = function (options){ /* function work */ }
x.bar = function(moreOptions){
  function doMoreStuff(){
  }
}
This, while not 'illegal', is odd. It's the equivalent of:
a = function() { return "I'm a function"; }
a.bar = "I'm a string attached to a function";
console.log(a()); // => "I'm a function"
console.log(a.bar); // => "I'm a string attached to a function"
Your bar module doesn't do much. It's a function containing an inaccessible function. Let's assume you meant:
module.exports = function(moreOptions){
  return {
    doMoreStuff: function(){
      return "Bar doing more stuff";
    }
  }
}
If you want to new up a myApp and attach an external function, you would:
function myApp(options){
}
myApp.prototype.bar = bar(); // since you exported a function
module.exports = myApp;
In your main app you could now
x = require('foo');
var app = new foo(); // you MUST otherwise you get an empty function
app.bar.doMoreStuff(); // => "Bar doing more stuff"
Alternatively (and less prone to error) foo could be
var bar = require('bar');
module.exports = function (options) {
  // work here ...
  return {
    bar: bar(), // again, exported as a function
    appOptions: options,
    // ...
  }
}
Then in the main module
x = require('foo');
x.bar.doMoreStuff(); // => "Bar doing more stuff";
Hope this helps.
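As a sanity check, the second variant can be exercised in a single file (modules inlined as plain functions; the names mirror the answer above, the rest is hypothetical):

```javascript
// Stand-in for the bar module: exports a factory function.
const bar = function (moreOptions) {
  return {
    doMoreStuff: function () {
      return "Bar doing more stuff";
    }
  };
};

// Stand-in for the foo module (the object-returning variant).
const foo = function (options) {
  return {
    bar: bar(),          // exported as a function, so call it
    appOptions: options
  };
};

const x = foo({ debug: true });
console.log(x.bar.doMoreStuff()); // "Bar doing more stuff"

// Monkeypatch: replace the method on the returned object.
x.bar.doMoreStuff = function () { return "patched stuff"; };
console.log(x.bar.doMoreStuff()); // "patched stuff"
```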

JavaScript design patterns: Injecting a dependency that is not yet created

I have a CommonJS module:
// main-module
module.exports = function () {
  var foo,
    someModule = require('other-module')(foo);
  // A value is given to foo after other-module has been initialised
  foo = "bar";
}
As you can see, this requires other-module:
// other-module.js
module.exports = function (foo) {
  function example() {
    console.log(foo);
    // > "bar"
  }
}
I would like the example function inside of other-module to be aware of the foo variable inside of main-module, even though it is established after the module is required.
When other-module runs, foo will of course be undefined. However, the point is that by the time my example function runs, foo will have been given a value of "bar".
The pattern above obviously does not work. What design pattern do I need to implement?
I'm not super-familiar with CommonJS, so this might not be the idiomatic way to do it, but using a function instead of a variable should work:
// main-module
module.exports = function () {
  var foo,
    someModule = require('other-module')(function() { return foo; });
  foo = "bar";
}

// other-module.js
module.exports = function (fooFn) {
  function example() {
    console.log(fooFn());
  }
}
The foo value (a string) will be passed by value, so it's undefined inside other-module. You could use an options object that is passed by reference:
var options = {},
  someModule = require('other-module')(options);
options.foo = "bar";
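Both suggestions can be checked with a single-file sketch (module boundaries replaced by plain functions; names hypothetical):

```javascript
// Getter variant: other-module receives a function, not the raw value.
function otherModule(fooFn) {
  return {
    example: function () {
      return fooFn(); // reads foo lazily, at call time
    }
  };
}

let foo;                                   // not assigned yet
const someModule = otherModule(() => foo); // pass a getter
foo = "bar";                               // assigned after wiring
console.log(someModule.example()); // "bar"

// Options-object variant: the object is shared by reference.
function otherModule2(options) {
  return { example: () => options.foo };
}

const options = {};
const someModule2 = otherModule2(options);
options.foo = "bar";
console.log(someModule2.example()); // "bar"
```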

coffee -cj overwrites module.exports for each class

I just can't get my head around how this is supposed to work: as I understand it, a pretty common way to define a class/module in CoffeeScript is to use module.exports = class MyClass at the top of the file. I would also guess that the coffee compiler facilitates this pattern. Take this minimalist example:
# src/Foo.coffee
module.exports = class Foo
# src/Bar.coffee
module.exports = class Bar
Then compile and join the two with:
coffee -cj all.js src
The result is all.js where module.exports is redefined/overwritten for each module:
// Generated by CoffeeScript 1.4.0
(function() {
  var Bar, Foo;

  module.exports = Bar = (function() {
    function Bar() {}
    return Bar;
  })();

  module.exports = Foo = (function() {
    function Foo() {}
    return Foo;
  })();

}).call(this);
If I now try to do this, the result would be an error stating that the Foo module could not be found, and rightly so, because the last module (here: Bar) has redefined module.exports to contain only itself.
Foo = require('foo');
I guess this is quite the noob question but I can't seem to get a good answer anywhere.
This is pretty much the desired behaviour... You're merging two modules into one, and they both want to be at the top level, so one of them has to win.
One possible solution would be as follows:
# src/Foo.coffee
module.exports.Foo = class Foo
# src/Bar.coffee
module.exports.Bar = class Bar
which yields:
// all.js
(function() {
  var Bar, Foo;

  module.exports.Bar = Bar = (function() {
    function Bar() {}
    return Bar;
  })();

  module.exports.Foo = Foo = (function() {
    function Foo() {}
    return Foo;
  })();

}).call(this);
And then you can use (in CoffeeScript)
{Foo, Bar} = require "all"
to get at the classes contained therein.
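The compiled pattern behaves the same in plain JavaScript; a small stand-in sketch (hypothetical names) shows that attaching each class under its own key leaves both reachable:

```javascript
// Stand-in for the merged module's exports object.
const moduleExports = {};

// Each "file" attaches its class under its own key instead of
// overwriting module.exports wholesale.
moduleExports.Foo = class Foo {};
moduleExports.Bar = class Bar {};

// Destructure at the require site, mirroring `{Foo, Bar} = require "all"`.
const { Foo, Bar } = moduleExports;
console.log(new Foo() instanceof Foo); // true
console.log(new Bar() instanceof Bar); // true
```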
