Running all browserified Mocha tests in a folder

I have a browser JavaScript app which uses Browserify, and Mocha tests which are run in PhantomJS and other browsers.
The tests use a test/tests.js file as an entry point where I require each file:
// ...
// Require test files here:
require('./framework/extendable.test');
require('./framework/creator.test');
require('./framework/container.test');
require('./framework/api_client.test');
// ...
This is very tedious and I would like to be able to require the entire folder.
I have tried using include-folder, which only loads the contents of each file (I don't want to eval for obvious reasons).
I have also looked at require-dir but Browserify does not seem to pick up on the require calls.

You can use Karma (https://github.com/karma-runner/karma) to run your Mocha tests in multiple browsers (PhantomJS, FF, IE, locally or remotely via WebDriver, as you want).
Then you can use the karma-bro (https://github.com/Nikku/karma-bro) preprocessor. It will bundle your tests on the fly with browserify, only for testing purposes.
So you can just specify the folder that contains your tests in the Karma config.
That's the way I do it.
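For reference, here is a rough karma.conf.js sketch, assuming karma-mocha, karma-phantomjs-launcher and karma-bro are installed (the test file pattern is an assumption based on the question):

module.exports = function (config) {
  config.set({
    // karma-bro registers both a framework and a preprocessor named 'browserify'
    frameworks: ['browserify', 'mocha'],
    files: ['test/**/*.test.js'],
    preprocessors: {
      'test/**/*.test.js': ['browserify']
    },
    browserify: {
      debug: true // source maps, so stack traces point at the original files
    },
    browsers: ['PhantomJS']
  });
};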

You could also write your own simple transform that simply replaces a specified folder name (or marker) with the list of require calls, even randomizing their order if necessary. It's not that hard. I make many transforms for myself to ease up on stuff like this; a rough sketch follows below.
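A minimal sketch of such a transform, assuming through2; the file name, the marker-comment syntax and the .test.js filter are arbitrary choices for illustration:

// require-folder-transform.js (hypothetical)
var through = require('through2');
var fs = require('fs');
var path = require('path');

module.exports = function (file) {
  var source = '';
  return through(
    function (chunk, enc, next) { source += chunk; next(); },
    function (done) {
      // Replace every "/* require-folder: ./some/dir */" marker with one
      // require() call per *.test.js file found in that directory.
      var out = source.replace(/\/\*\s*require-folder:\s*(\S+)\s*\*\//g, function (match, dir) {
        var abs = path.resolve(path.dirname(file), dir);
        return fs.readdirSync(abs)
          .filter(function (f) { return /\.test\.js$/.test(f); })
          .map(function (f) { return "require('" + dir + "/" + f + "');"; })
          .join('\n');
      });
      this.push(out);
      done();
    }
  );
};

With that in place, test/tests.js only needs to contain the marker comment, and the bundle is built with something like browserify -t ./require-folder-transform test/tests.js.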

Related

webpack build to multiple files

I have a webpack application and want the build to output multiple files, not a single JavaScript file. For example, if I have this folder structure:
components
1.js
2.js
actions
1.js
2.js
my webpack build has to compile the files into the same folder structure. How can I achieve that?
I tried babel cli:
babel ./src --out-dir ./lib --source-maps --presets es2015,react --plugins babel-plugin-add-module-exports,babel-plugin-transform-decorators-legacy,babel-plugin-transform-class-properties --watch
It outputs the files the way I wanted, but I get the error:
Cannot resolve module
because Babel does not know anything about webpack's resolve configuration.
Any suggestions?
Getting webpack to output multiple files is possible, but it does have limitations. First it's important to understand how this works. Webpack provides a runtime script that can load code "chunks" as they are needed; this is important for large applications so the user doesn't have to download the JavaScript for the entire app just to see the homepage. But webpack needs to keep track of these chunks and what they're named at build time to be able to load them correctly at run time. For that reason it has its own file naming conventions to enable this functionality. See more here: https://webpack.js.org/guides/code-splitting/. So you can require.ensure all of your deps, but they won't be named and foldered the way you describe, as webpack's runtime wouldn't be able to find them in that case.
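For illustration, here is a rough require.ensure sketch (the module path is made up); note that webpack, not you, decides the emitted chunk's file name and location:

// webpack emits ./components/heavy into a separate chunk at build time and
// its runtime fetches that chunk the first time loadHeavy() is called.
function loadHeavy(callback) {
  require.ensure([], function (require) {
    var heavy = require('./components/heavy');
    callback(heavy);
  });
}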
The other thing that's important to consider is that webpack is a bundler, so it's meant to bundle files. Essentially you're saying you don't want to bundle your files. So if that's the case, you should probably look into using require.js. Many people have moved from require.js to bundlers such as webpack and Browserify because it's typically not efficient to download every little module separately. But every app is different, so you may in fact have a good reason not to bundle.
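For context, a minimal AMD-style sketch of what the require.js route looks like (paths are hypothetical); each module stays in its own file and is fetched separately at runtime:

// components/1.js - declared as an AMD module instead of being bundled
define(['../actions/1'], function (action1) {
  return function component() { return action1(); };
});

// somewhere in the app - require.js loads the file (and its deps) on demand
require(['components/1'], function (component) {
  component();
});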
If you do in fact want to do some bundling but want to optimize how it's done, then I can help with that as well, and webpack is certainly a great tool for that. But I'll need to understand your use case a little more to give the best advice.

Bundling code for CasperJS+SlimerJS using Browserify?

TLDR; My question is: is there a way to make browserify NOT override require with its own implementation, and instead have it use a different method name (e.g. browserifyRequire) for all of its own internal requiring? To find out why I need to do this, please read on...
The Scenario
I'm trying to write some automated tests using CasperJS and run them in SlimerJS -- as opposed to the default PhantomJS (although for all I know, I would run into the same issues with PhantomJS).
I really want to figure out how to write these in CoffeeScript. As it turns out, neither CasperJS nor SlimerJS does well with CoffeeScript these days. The docs' recommendation is to compile to JS prior to running casper. OK... not super convenient, but I can handle it. In fact, I'm also finding out that the way require resolves paths in these tools is not as straightforward as in Node, so bundling ahead of running should help with that too.
But now I'm running into a new set of issues when trying to run the bundled code. I'm using Browserify for that.
The Problem
In my test code, I need to require('casper'). Standard practice in CasperJS world. So I had to tell browserify NOT to bundle CasperJS, by putting "browser": { "casper": false } in my package.json. No probs so far. But the problem comes next:
Browserify overrides the built-in require function, supplying its own implementation of require that does all the things that make browserify work. CasperJS is fine with it until it encounters the require('casper') directive. That's the one time CasperJS has to do the require'ing, not browserify. And that fails.
The Incomplete Solution
I'm quite sure that CasperJS just can't deal with the fact that Browserify overrides require, because CasperJS implements its own way of requiring. To test that hypothesis, I edited the resulting bundle manually, renaming every occurrence of require to browserifyRequire -- including browserify's implementation of require. The only require I left unchanged was the call to require('casper'), because that's the one time I need CasperJS to handle the requiring. And indeed, this made things work as expected.
The Question
Again, is there a way to make browserify use a different name for its own internal require? I suppose I can write a script to make this change after the bundling, but I'd much rather figure out how to do this via config.
An Alternate Question
Maybe instead of Browserify there's another solution for bundling and running CoffeeScript inside CasperJS? I haven't found one yet....
Found a reasonable solution, one that can be run as an npm script, e.g. npm run build-test-bundle, by adding this to package.json:
"scripts": {
"build-test-bundle": "browserify -t coffeeify casper-coffee-test.coffee | derequire | sed 's/_dereq_..casper../require(\"casper\")/g' > casper-coffee-test.compiled.js"
},
This sequence of commands does the following:
browserify -t coffeeify casper-coffee-test.coffee builds the bundle
| derequire pipes the browserify output to derequire, an npm module that renames all occurrences of the require function to _dereq_
| sed 's/_dereq_..casper../require(\"casper\")/g' pipes the previous output to sed, which replaces every occurrence of _dereq_("casper") back to a plain require("casper")

Compiling Modular Client-side Javascript

In Node.js, you can dynamically require() any JavaScript file, similar to PHP's require. I'd like to use this in my client-side code just for ease of development, but not actually call a JavaScript function; instead, I want a compiler to replace the line with the contents of the respective file, effectively concatenating the files, not one after another, but inline within the code of one of the files. The closest thing I have found to this is smash. Are there any compilers, minifiers, etc. that can do this?
Browserify might not be exactly what you want, but it definitely helps with the ease-of-development issue. When you use Browserify, your code is your build configuration: the require calls define what gets bundled. Browserify gives you all the benefits of writing code in Node (no anonymous functions to avoid globals, npm, simple requires, imports instead of namespaced globals), and it lets you package that code to run on the client with one command and load only one file.
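For instance, a minimal sketch of that workflow (file names are made up):

// math.js - a plain Node-style CommonJS module; nothing leaks into globals
module.exports.add = function (a, b) { return a + b; };

// main.js - the entry point; the require calls double as the dependency graph
var math = require('./math');
console.log(math.add(2, 3)); // 5

Bundling for the browser is then one command, e.g. browserify main.js -o bundle.js, and the page only loads bundle.js.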
You can check out my open-source JS framework Luc JS for an example. It runs on Node and IE6. I'm able to keep the code modular and build the single browser file with a one-line command.

Concatenate NPM package into one JS file

I am trying to get Swig (the template language) working on Parse Cloud Code with Express. Parse Cloud Code is a Node/Express host that doesn't allow npm. Ridiculous, I know. I can still load external files into code with require statements, however, so I think there's hope I can get this working.
So my question is: how do I get the entire Swig package into a single JS file that I can include from my Parse Express app like so:
var swig = require("./cloud/swig.js");
Worth noting that Parse breaks normal require statements, so the npm package as-is doesn't work without modifying each and every file in the node_modules folder to have cloud in its path (which is why my path above has cloud in it). Parse also chokes while uploading lots of small files. Concatenation is a necessity on this platform.
I have tried playing with browserify for hours, but no combination of anything I do exposes the Swig object when I load the browserified file with the require statement. I think it may be the right option since the browserified file includes all the files from Swig, but it doesn't expose them externally.
My question is: can this be done with browserify, and if so, how? Or is there another way to concatenate an npm package down to one file so it can be more easily included from this platform?
Thanks so much.
Browserify is not the right tool for the job.
As the name implies, browserify is intended to be used to generate files you want to execute in the browser. It walks the require calls from an entrypoint (i.e. some JS file you pass to browserify) and bundles them in an object that maps their names to functions wrapping the modules. It does not expect a require function to already exist and doesn't make any use of it. It substitutes its own implementation of require that only does one thing: look up names from the bundle, execute the matching function and return its exports.
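To make that concrete, here is a heavily simplified sketch of the shape of a browserify bundle (not the actual generated code): every module becomes a function keyed by an id, and browserify's own require only looks ids up in that map, so nothing is exported to the surrounding environment.

(function (modules, entryId) {
  var cache = {};
  // browserify's substitute for require: a lookup in the bundled module map
  function bundledRequire(id) {
    if (cache[id]) return cache[id].exports;
    var module = cache[id] = { exports: {} };
    modules[id](bundledRequire, module, module.exports);
    return module.exports;
  }
  bundledRequire(entryId);
})({
  1: function (require, module, exports) { module.exports = 'the swig sources would live here'; },
  2: function (require, module, exports) { var swig = require(1); /* entry point code */ }
}, 2);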
You could theoretically require a browserify bundle, but it would just return an empty object (although it might mess with globals). And in all likelihood it might break because the bundled modules think they are being executed in a browser. This won't do any good.
The only sane option, if you want to stick with the host, is to copy over the node_modules folder from your local project folder. This may not work if your computer and the server are not 100% compatible (e.g. 32-bit vs 64-bit, Debian vs RedHat, OSX/Windows vs Linux), but this mostly depends on your exact dependencies (basically anything that is built with node-gyp can be a problem).
Node.js uses the node_modules folder when looking up dependencies in require calls automagically. If you can somehow get a node_modules folder with the right contents on the server, require("foo") will work as long as node_modules contains a module foo.
Ultimately, you are trying to use npm modules in Parse Cloud Code, and currently that's not possible:
https://parse.com/questions/using-npm-modules-in-cloud-code
But if you are only trying to use Swig, then as a workaround you can consider using underscore templates instead. Parse already includes underscore:
https://parse.com/docs/cloud_modules_guide#underscore
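For reference, a minimal underscore template sketch, assuming the module is loaded with require('underscore') as in Parse's cloud modules guide:

var _ = require('underscore');

// Compile the template once; <%= %> interpolates values into the output.
var greet = _.template("Hello, <%= name %>!");
console.log(greet({ name: "Parse" })); // "Hello, Parse!"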

How can I specify the PATH to PhantomJS using GRUNT and QUNIT?

Grunt uses PhantomJS to run headless QUnit tests in a very interesting way (correct me if I'm wrong, please). Since I just started experimenting with these tools, I don't fully understand it and don't know how to configure or extend it.
I managed to get everything working on my machine, but I would like not to rely on the $PATH system variable. Instead, I would like to provide the path to the PhantomJS executable via a setting which I could easily change and port to other environments.
How can I achieve this?
I suppose there are many ways, and I think the QUnit task from Grunt might have an easy answer. Ideally it would be just a matter of defining the path in the grunt.js file, something like this:
qunit: {
  phantomjsPath: 'path/to/phantomjs',
  files: ['test/**/*.html']
},
My environment is Mac OS X, but I accept solutions for any kind of environment, like Windows (my build server).
Thanks in advance.
UPDATE: The version of Grunt I am using is v0.3.17. The next big version, v0.4.x, has many changes and some are not backwards compatible.
Well, I think you finally migrated to Grunt 0.4, and probably you use the grunt-contrib-qunit plugin for running QUnit tests under PhantomJS. Unfortunately, you'll have encountered the same issue: it's not possible to supply a path to the phantomjs executable. That's because grunt-contrib-qunit/grunt-contrib-phantomjs use the phantomjs npm module, which downloads PhantomJS on installation and hard-codes the path to the executable in its JS code. If you're experiencing such an issue, then please check my blog post.
Unfortunately, grunt 0.3.x doesn't have a built-in option to specify a path to phantomjs -- it just executes phantomjs directly on the command line. Take a look at this helper function:
https://github.com/gruntjs/grunt/blob/master/tasks/qunit.js#L231
The situation seems to have changed in the yet-to-be-released grunt 0.4, however:
https://github.com/gruntjs/grunt-lib-phantomjs/blob/master/lib/phantomjs.js#L22
As you can see, the next version of grunt uses the npm module phantomjs, which "exports a path string that contains the path to the phantomjs binary/executable". Since the phantomjs npm module is installed locally by grunt, it seems like this would avoid you having to worry about setting the PATH variable or installing a conflicting version of phantomjs.
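For reference, a tiny sketch of what that npm module provides:

// The phantomjs npm module downloads the binary at install time and exposes
// its location, so nothing has to be on $PATH:
var phantomjs = require('phantomjs');
console.log(phantomjs.path); // absolute path to the phantomjs executable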
Anyway, I'd consider taking a look at grunt-0.4 if you're willing to live on the bleeding edge.
Otherwise, you can always fork the qunit task and modify it to look at your custom configuration variable.
