Testing CommonJS modules that use browserify aliases and shims - javascript

Browserify allows creating aliases and shimming modules that are not directly CommonJS compatible. Since I'd like to run my tests in node CLI, can I somehow handle those aliases and shimmed modules in node?
For example, let's say I'm aliasing ./my-super-module to supermodule, and shimming a jQuery plugin (./vendor/jquery.plugin.js -> ./shims/jquery.plugin.shim.js) and aliasing it to jquery.plugin.
As a result, I can do this in my module:
var supermodule = require('supermodule');
require('jquery.plugin');

module.exports = function(input) {
    // do something useful...
    return supermodule.process(input);
};
Are there any established practices for testing this module in node.js/CLI so that the dependencies are resolved?

You might want to use proxyquire if you plan to test this module directly in node using any CLI runner.
Using mocha, it will look something like this:
describe('test', function () {
    // noCallThru() stops proxyquire from loading the real dependencies,
    // which would not resolve in node without the browserify aliases
    var proxyquire = require('proxyquire').noCallThru();

    it('should execute some test', function () {
        var myModule = proxyquire('./my-module', {
            // define your mocks to be used inside the module
            'supermodule': require('./mock-supermodule'),
            'jquery.plugin': require('./jquery-plugin-mock.js')
        });
    });
});
If you want to test this in a real browser, you might not need to mock your aliased modules at all; you can use browserify to run your tests in karma directly.
If you need to mock modules in that scenario, you can use proxyquireify, which allows you to do the same but with browserify.
There is also browsyquire, a fork of proxyquireify that I made with some extra features and a bug fix.
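For reference, a minimal proxyquireify setup might look like the sketch below (based on its README; the stub bodies are placeholders):
// test.js, bundled e.g. with: browserify -p proxyquireify/plugin test.js
var proxyquire = require('proxyquireify')(require);

var myModule = proxyquire('./my-module', {
    'supermodule': { process: function (input) { return input; } },
    'jquery.plugin': {}
});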

Related

Replacing dependencies in AMD module format with testdouble.js

I'm writing tests for a JS application using Jasmine and testdouble.js as a mocking library. I am using the AMD format to organize code in modules, and RequireJS as a module loader. I was wondering how to use testdouble.js to replace a dependency of the module under test when that dependency is an AMD module loaded via RequireJS. The documentation is unclear about this, or I am missing something, so I would appreciate it if someone could point me in the right direction.
I'll post the example below that illustrates my setup and the problem I am facing.
car.js
define("car", ["engine"], function(engine) {
function drive = {
engine.run();
}
return {
drive: drive
}
});
engine.js
define("engine", function() {
function run() {
console.log("Engine running!");
}
return {
run: run
}
});
car.spec.js
define(["car"], function(car) {
describe("Car", function() {
it("should run the motor when driving", function() {
// I am not sure how to mock the engine's object run method
// and where to place that logic, in beforeEach or...
td.replace(engine, "run");
car.drive();
// How to verify that when car.run() has executed, it calls this mocked method
td.verify(engine.run());
});
});
});
testdouble.js does not have any explicit support for AMD modules. The only module-related tricks it offers are Node.js specific and built on top of Node's CJS module loader.
What you would need to do in this case is require a reference to engine from the test and replace its run property, which it seems like you've tried to do (your example is incomplete: engine is never imported in the spec).
If you do this, don't forget to run td.reset() in an afterEach to restore the original properties on anything you replace!
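Putting it together, a working spec might look like this sketch (assuming td is available as a global in the test page, and that engine is required into the spec alongside car):
define(["car", "engine"], function(car, engine) {
    describe("Car", function() {
        afterEach(function() {
            td.reset(); // restores the real engine.run after each test
        });

        it("should run the motor when driving", function() {
            td.replace(engine, "run"); // swap engine.run for a test double
            car.drive();
            td.verify(engine.run());   // assert car.drive() called engine.run()
        });
    });
});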

Loading webpack module in a require.js based project returns null

I'm trying to load a library that is built with Webpack into a require.js project. While the library exposes an object, it returns null when required from the require.js project:
define(function(require, exports, module) {
    [...]
    require("./ext/mylib.core.js"); // -> null
});
Are there any flags that I can use in Webpack to enable AMD compliance? There are some references to AMD in the generated library, but as it is, it does not seem to do anything.
The solution was in the Webpack documentation: there is an output.libraryTarget option that can be set to "amd" or "umd", in which case Webpack produces AMD-compliant modules.
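For illustration, the relevant part of the configuration might look like this sketch (entry, paths, and the library name are placeholders):
// webpack.config.js
module.exports = {
    entry: './src/mylib.core.js',
    output: {
        path: __dirname + '/ext',
        filename: 'mylib.core.js',
        library: 'MyLib',       // name under which the library is exported
        libraryTarget: 'umd'    // or 'amd'; wraps the bundle in a define() call
    }
};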
EDIT 3:
Webpack is not cooperating, it may seem, so another possibility would be to expose the module with the shim config option:
require.config({
    paths: {
        // Tell require where to find the webpack thingy
        yourModule: 'path/to/the/webpack/asset'
    },
    shim: {
        // This lets require ignore that there is no define
        // call but will instead use the specified global
        // as the module export
        yourModule: {
            exports: 'theGlobalThatIsPutInPlaceByWebpack'
        }
    }
});
This obviously only works in the case that the webpack stuff is putting something in the global scope. Hope this helps!
EDIT 2:
So I got the question wrong as pointed out in the comments. I didn't find any built-in functionality to produce AMD modules from webpack - the end result seems to be a static asset js file. You could wrap the result in a
define(function () {
    return /* the object that webpack produces */;
});
block, maybe with the help of some after-build event (e.g. using this after build plugin for webpack). Then you should be able to require the module with an AMD loader.
Original Answer:
require.js loads its dependencies asynchronously; you have to declare them explicitly when you're not using the r.js optimizer or the like. So if the module exposes an AMD definition, it should work like this:
// It works the way you did it, provided the dependency is
// declared up front (note the special 'require' dependency) ...
define(['require', 'path/to/your/module'], function (require) {
    require('path/to/your/module'); // -> { ... }
});

// ... but I personally prefer this explicit syntax + it is
// friendlier to a code minifier
define(['path/to/your/module'], function (yourModule) {
    console.log(yourModule); // { ... }
});
Maybe you have to configure your require instance; there are docs for that.
EDIT 1: as pointed out, the way the module is being accessed is not wrong, but the dependencies were missing, so I added code that is closer to the original question.

Import existing AMD module into ES6 module

I have an existing application where I have AMD modules defined using RequireJS. I use the "text" and "i18n" plugins for requirejs extensively in my project.
I have been experimenting with ES6 modules lately and would like to use them while creating new modules in my application. However, I want to reuse the existing AMD modules and import them while defining my ES6 modules.
Is this even possible? I know Traceur and Babel can create AMD modules from ES6 modules, but that only works for new modules with no dependencies on existing AMD modules; I could not find an example of reusing existing AMD modules.
Any help will be appreciated. This is currently a blocker for me to start using all the ES6 goodies.
Thanks
Yes, it can be done. Create a new application with the following structure:
gulpfile.js
index.html
js/foo.js
js/main.es6
node_modules
Install gulp and gulp-babel. (I prefer to install gulp locally but you may want it globally: that's up to you.)
index.html:
<!DOCTYPE html>
<html>
  <head>
    <title>Something</title>
    <script src="https://cdnjs.cloudflare.com/ajax/libs/require.js/2.1.20/require.js"></script>
    <script>
      require.config({
        baseUrl: "js",
        deps: ["main"]
      });
    </script>
  </head>
  <body>
  </body>
</html>
gulpfile.js:
"use strict";
var gulp = require('gulp');
var babel = require('gulp-babel');
gulp.task("copy", function () {
return gulp.src(["./js/**/*.js", "./index.html"], { base: '.' })
.pipe(gulp.dest("build"));
});
gulp.task("compile-es6", function () {
return gulp.src("js/**/*.es6")
.pipe(babel({"modules": "amd"}))
.pipe(gulp.dest("build/js"));
});
gulp.task("default", ["copy", "compile-es6"]);
js/foo.js:
define(function () {
    return {
        "foo": "the value of the foo field on module foo."
    };
});
js/main.es6:
import foo from "foo";
console.log("in main: ", foo.foo);
After you've run gulp to build the application, open the file build/index.html in your browser. You'll see on the console:
in main: the value of the foo field on module foo.
The ES6 module main was able to load the AMD module foo and use the exported value. It would also be possible to have a native-AMD module load an ES6 module that has been converted to AMD. Once Babel has done its work, they are all AMD modules as far as an AMD loader is concerned.
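For illustration, Babel's AMD transform turns js/main.es6 into something along these lines (a sketch; the exact interop helper differs between Babel versions):
define(["foo"], function (_foo) {
    "use strict";

    // Babel wraps non-ES modules so that a default import still works
    function _interopRequireDefault(obj) {
        return obj && obj.__esModule ? obj : { "default": obj };
    }

    var foo = _interopRequireDefault(_foo)["default"];
    console.log("in main: ", foo.foo);
});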
In addition to @Louis's answer: assuming you already have a bunch of third-party libraries specified in your require.js configuration, then in your new ES6 modules, whenever you import a module, be it AMD or ES6, you'll be fine as long as you keep the imported module names consistent. For example:
Here is the gulpfile:
gulp.task("es6", function () {
return gulp.src("modules/newFolder//es6/*.js")
.pipe(babel({
"presets": ["es2015"],
"plugins": ["transform-es2015-modules-amd"]
// don't forget to install this plugin
}))
.pipe(gulp.dest("modules/newFolder/build"));
});
Here is the es6 file:
import d3 from 'd3';
import myFunc from 'modules/newFolder/es6module';
// ...
This will be compiled to something like this:
define(['d3', 'modules/newFolder/es6module'], function (_d, _myFunc) {
    'use strict';
    // ...
});
As long as the module ids in the define(['d3', 'modules/newFolder/es6module'], ... call of the compiled file would be valid in an original AMD file, it should work under the existing require.js setup, including steps such as file compression.
As for @coderC's question about require.js loader plugins: I was using i18n!nls/lang in AMD modules. At first I thought it would be really tricky to find an alternative to AMD plugin loaders in ES6 modules, and I switched to other localization tools such as i18next. But it turned out that it's okay to do this:
import lang from 'i18n!nls/lang';
// import other modules..
because it will be compiled by the gulp task to something like:
define(['d3', 'i18n!nls/lang'], function (_d, _lang) {
    // ...
});
This way, we don't have to worry about the require.js loader.
In a nutshell: in ES6 modules, if you want to use existing AMD plugins/modules, you just need to ensure that the compiled file conforms to the existing setup. Additionally, you can also try the ES6 module bundler Rollup to bundle all the new ES6 files.
Hope this can be helpful for those who are trying to integrate ES6 syntax into a project.
A few changes for the latest version of babel:
First, babel({"modules": "amd"}) doesn't work with the latest version of babel. Instead, use babel({"plugins": ["#babel/plugin-transform-modules-amd"]}). (You'll need to install that plugin as a separate module in npm, i.e. with npm install --save-dev #babel/plugin-transform-modules-amd.)
Second, the syntax for gulp.task no longer accepts arrays as its second argument. Instead, use gulp.parallel or gulp.series to create a compound task.
Your gulpfile will end up looking like this:
"use strict";
var gulp = require('gulp');
var babel = require('gulp-babel');
gulp.task("copy", function () {
return gulp.src(["./js/**/*.js", "./index.html"], { base: '.' })
.pipe(gulp.dest("build"));
});
gulp.task("compile-es6", function () {
return gulp.src("js/**/*.es6")
.pipe(babel({"plugins": ["#babel/plugin-transform-modules-amd"]}))
.pipe(gulp.dest("build/js"));
});
gulp.task("default", gulp.parallel("copy", "compile-es6"));

How to properly require modules from mocha.opts file

I'm using the expect.js library with my mocha unit tests. Currently, I'm requiring the library on the first line of each file, like this:
var expect = require('expect.js');

describe('something', function () {
    it('should pass', function () {
        expect(true).to.be(true); // works
    });
});
If possible, I'd like to remove the boilerplate require code from the first line of each file, and have my unit tests magically know about expect. I thought I might be able to do this using the mocha.opts file:
--require ./node_modules/expect.js/index.js
But now I get the following error when running my test:
ReferenceError: expect is not defined
This seems to make sense - how can it know that the reference to expect in my tests refers to what is exported by the expect.js library?
The expect library is definitely getting loaded, as if I change the path to something non-existent then mocha says:
"Error: Cannot find module './does-not-exist.js'"
Is there any way to accomplish what I want? I'm running my tests from a gulp task if perhaps that could help.
You are requiring the module properly, but as you figured out, the symbols that the module exports won't automatically find their way into the global scope. You can remedy this with your own helper module.
Create test/helper.js:
var expect = require("expect.js")
global.expect = expect;
and set your test/mocha.opts to:
--require test/helper
While Louis's answer is spot on, in the end I solved this with a different approach, using karma and the karma-chai plugin:
Install:
npm install karma-chai --save-dev
Configure:
// karma.conf.js
module.exports = function (config) {
    config.set({
        frameworks: ['mocha', 'chai']
        // ...
    });
};
Use:
describe('something', function () {
    it('should pass', function () {
        expect(true).to.be(true); // works
    });
});
Thanks to Louis's answer and a bit of fiddling around, I sorted out my test environment references using mocha.opts. Here is the complete setup.
My project is a legacy JavaScript application with a lot of "plain" js files which I wish to reference both in an html file using script tags and via require for unit testing with mocha.
I am not certain that this is good practice, but I am used to Mocha for unit testing in node projects and was eager to use the same tool with minimal adaptation.
I found that exporting is easy:
class Foo{...}
class Bar{...}
if (typeof module !== 'undefined') module.exports = { Foo, Bar };
or
class Buzz{...}
if (typeof module !== 'undefined') module.exports = Buzz;
However, trying to use require in all the files was an issue as the browser would complain about variables being already declared even when enclosed in an if block such as:
if (typeof require !== 'undefined') {
    var { Foo, Bar } = require('./foobar.js');
}
So I got rid of the require part in the files and set up a mocha.opts file in my test folder with this content. The paths are relative to the root folder:
--require test/mocha.opts.js
mocha.opts.js content. The paths are relative to the location of the file:
global.assert = require('assert');
global.Foo = require("../foobar.js").Foo;
global.Bar = require("../foobar.js").Bar;
global.Buzz = require("../buzz.js");
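With that in place, a spec file can use the globals directly; for example (a hypothetical test, just to illustrate):
// test/foo.spec.js
describe('Foo', function () {
    it('is available without an explicit require', function () {
        assert.ok(new Foo() instanceof Foo);
    });
});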

NodeJS and Javascript (requirejs) dependency injection

I am currently using requirejs to manage module js/css dependencies.
I'd like to discover the possibilities of having node do this via a centralized config file.
So instead of manually doing something like
define([
    'jquery',
    'lib/somelib',
    'views/someview'
], ...)
within each module.
I'd have node inject the dependencies, i.e.
require('moduleA').setDeps('jquery','lib/somelib','views/someview')
Anyway, I'm interested in any projects looking at dependency injection for node.
thanks
I've come up with a solution for dependency injection. It's called injectr, and it uses node's vm library and replaces the default functionality of require when including a file.
So in your tests, instead of require('libToTest'), use injectr('libToTest', { 'libToMock': myMock }). I wanted to make the interface as straightforward as possible, with no need to alter the code being tested. I think it works quite well.
It's just worth noting that injectr files are relative to the working directory, unlike require which is relative to the current file, but that shouldn't matter because it's only used in tests.
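A minimal sketch of what that looks like in a test file (the paths and the mock's doSomething method are hypothetical):
var injectr = require('injectr');

// Paths are resolved from the working directory, not the current file.
var libToTest = injectr('./lib/libToTest.js', {
    'libToMock': { doSomething: function () { return 'stubbed'; } }
});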
I've previously toyed with the idea of providing an alternate require to make a form of dependency injection available in Node.js.
Module code
For example, suppose you have following statements in code.js:
fs = require('fs');
console.log(fs.readFileSync('text.txt', 'utf-8'));
If you run this code with node code.js, then it will print out the contents of text.txt.
Injector code
However, suppose you have a test module that wants to abstract away the file system.
Your test file test.js could then look like this:
var origRequire = global.require;
global.require = dependencyLookup;

require('./code.js');

function dependencyLookup(file) {
    switch (file) {
        case 'fs': return { readFileSync: function () { return "test contents"; } };
        default: return origRequire(file);
    }
}
If you now run node test.js, it will print out "test contents", even though it includes code.js.
I've also written a module to accomplish this; it's called rewire. Just use npm install rewire and then:
var rewire = require("rewire"),
    myModule = rewire("./path/to/myModule.js"); // exactly like require()

// Your module will now export a special setter and getter for private variables.
myModule.__set__("myPrivateVar", 123);
myModule.__get__("myPrivateVar"); // = 123

// This allows you to mock almost everything within the module, e.g. the fs module.
// Just pass the variable name as the first parameter and your mock as the second.
myModule.__set__("fs", {
    readFile: function (path, encoding, cb) {
        cb(null, "Success!");
    }
});

myModule.readSomethingFromFileSystem(function (err, data) {
    console.log(data); // = Success!
});
I've been inspired by Nathan MacInnes's injectr but used a different approach. I don't use vm to eval the test module; in fact, I use node's own require. This way your module behaves exactly as with require() (except for your modifications). Also, debugging is fully supported.
