TL;DR: My question is: is there a way to make Browserify NOT override require with its own implementation, and instead have it use a different method name (e.g. browserifyRequire) for all of its own internal requiring? To find out why I need to do this, please read on...
The Scenario
I'm trying to write some automated tests using CasperJS and running them in SlimerJS -- as opposed to the default PhantomJS (although for all I know, I would run into the same following issues with PhantomJS).
I really want to figure out how to write these in CoffeeScript. As it turns out, CasperJS and SlimerJS don't handle CoffeeScript well these days. The docs' recommendation is to compile to JS prior to running casper. Ok... not super convenient, but I can handle it. In fact, I'm also finding out that the way require resolves paths in these tools is not as straightforward as in Node, so bundling ahead of running should help with that too.
But now I'm running into a new set of issues when trying to run the bundled code. I'm using Browserify for that.
The Problem
In my test code, I need to require('casper'). Standard practice in CasperJS world. So I had to tell browserify NOT to bundle CasperJS, by putting "browser": { "casper": false } in my package.json. No probs so far. But the problem comes next:
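For reference, the relevant bit of my package.json looks roughly like this (just the browser field; everything else omitted):
{
  "browser": {
    "casper": false
  }
}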
Browserify overrides the built-in require function, supplying its own implementation of require that does all the things that make browserify work. CasperJS is fine with it until it encounters the require('casper') directive. That's the one time CasperJS has to do the require'ing, not browserify. And that fails.
The Incomplete Solution
I'm quite sure that CasperJS just can't deal with the fact that Browserify overrides require, because CasperJS implements its own way of require'ing. To test that hypothesis, I edited the resulting bundle manually, renaming every occurrence of require to browserifyRequire -- including browserify's implementation of require. The only require I left unchanged was that call to require('casper'), because that's the one time I need CasperJS to handle the require'ing. And indeed, this made things work as expected.
The Question
Again, is there a way to make browserify use a different name for its own internal require? I suppose I can write a script to make this change after the bundling, but I'd much rather figure out how to do this via config.
An Alternate Question
Maybe instead of Browserify there's another solution for bundling and running CoffeeScript inside CasperJS? I haven't found one yet....
Found a reasonable solution, one that can be run as an npm script (e.g. npm run build-test-bundle) by adding the following to package.json:
"scripts": {
"build-test-bundle": "browserify -t coffeeify casper-coffee-test.coffee | derequire | sed 's/_dereq_..casper../require(\"casper\")/g' > casper-coffee-test.compiled.js"
},
This sequence of commands does the following:
browserify -t coffeeify casper-coffee-test.coffee builds the bundle
| derequire pipes the browserify output to derequire, an npm module that renames all occurrences of browserify's require function to _dereq_
| sed 's/_dereq_..casper../require(\"casper\")/g' pipes the previous output to sed, which turns every occurrence of _dereq_("casper") back into a normal require("casper")
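With that script in place, building and running comes down to something along these lines (the --engine=slimerjs flag assumes you run CasperJS on SlimerJS as I do):
npm run build-test-bundle
casperjs test casper-coffee-test.compiled.js --engine=slimerjs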
Related
I developed a JavaScript web app with npm and webpack. I have now converted all those .js files to .ts using the PowerShell command stated here. The subsequent steps in that link use Grunt; I want to use a VS2015 TypeScript project directly, but I cannot find any reference on the net about what to do with node_modules and how to fully convert my package.json and webpack setup into a TypeScript project. The Task Runner Explorer in VS2015 only supports Grunt and Gulp tasks.
I recommend going with the "bare-foot" solution first and relying much less on VS2015. It may be the best IDE available, but JS and TS projects can be handled from the command line without relying on the magic of the IDE. This way you gain a deeper understanding of these technologies, and I think it would be easier right now too.
I recommend the following steps:
create a tsconfig.json in the root folder. Read around, there's plenty of info available. Just one hint: use 'filesGlob' to specify the files to compile (see the sketch after this list).
use TSD to get the .d.ts files of the libs you use from DefinitelyTyped. You might create, or at least just declare, the missing packages.
run 'tsc --project .' from the command line to compile everything. You'll see the errors that need to be solved.
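A minimal tsconfig.json sketch for the first step might look like this (note that 'filesGlob' is honored by tooling such as grunt-ts and atom-typescript, while plain tsc currently only reads 'files'; the option values here are just examples):
{
  "compilerOptions": {
    "target": "es5",
    "module": "commonjs",
    "sourceMap": true,
    "outDir": "dist"
  },
  "filesGlob": [
    "src/**/*.ts",
    "!node_modules/**"
  ]
}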
I'm typing from mobile; I can edit the code tomorrow. Any comments?
I have a browser JavaScript app which uses browserify, and Mocha tests which are run in PhantomJS and other browsers.
The tests use a test/tests.js file as an entry point where I require each file:
// ...
// Require test files here:
require('./framework/extendable.test');
require('./framework/creator.test');
require('./framework/container.test');
require('./framework/api_client.test');
// ...
This is very tedious and I would like to be able to require the entire folder.
I have tried using include-folder, which only loads the contents of each file (I don't want to eval for obvious reasons).
I have also looked at require-dir but Browserify does not seem to pick up on the require calls.
You can use Karma (https://github.com/karma-runner/karma) to run your Mocha tests in multiple browsers (PhantomJS, FF, IE, locally or remote via WebDriver, as you want).
Then you can use the karma-bro (https://github.com/Nikku/karma-bro) preprocessor. It will bundle your tests on the fly with browserify, purely for testing purposes.
So you can just specify the folder that contains your tests in the Karma config.
That's the way I do it.
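For what it's worth, the relevant part of a karma.conf.js could look roughly like this (a sketch; the *.test.js naming and folder layout are taken from the question and may differ in your project):
module.exports = function (config) {
  config.set({
    frameworks: ['mocha', 'browserify'],
    browsers: ['PhantomJS'],
    // Point Karma at the whole test folder instead of a hand-written list
    files: ['test/**/*.test.js'],
    // Let karma-bro bundle every matched file on the fly with browserify
    preprocessors: {
      'test/**/*.test.js': ['browserify']
    }
  });
};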
You could also write your own simple transform that simply replaces a specified folder name (or marker) with the list of require calls, even with some randomization in place if necessary. It's not that hard; I write many transforms for myself to ease up on stuff like this.
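For example, here is a rough sketch of such a transform (the file name require-tests-transform.js, the marker comment, and the .test.js filter are all made up for illustration):
// require-tests-transform.js
var through = require('through2');
var fs = require('fs');
var path = require('path');

module.exports = function (file) {
  var source = '';
  return through(function (chunk, enc, next) {
    source += chunk;
    next();
  }, function (done) {
    // Replace a marker comment in the entry file with explicit require
    // calls for every *.test.js file sitting next to it. Browserify runs
    // transforms before it scans for require calls, so the injected
    // requires are bundled as usual.
    var dir = path.dirname(file);
    var out = source.replace('/* require-test-folder */', function () {
      return fs.readdirSync(dir)
        .filter(function (f) { return /\.test\.js$/.test(f); })
        .map(function (f) { return "require('./" + f + "');"; })
        .join('\n');
    });
    this.push(out);
    done();
  });
};
Then the bundle is built with something like browserify -t ./require-tests-transform.js test/tests.js -o test/bundle.js. This sketch only scans one directory; nested folders like framework/ would need a small recursive walk, but the idea is the same.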
I am trying to get Swig (the template language) working on Parse Cloud Code with Express. Parse Cloud Code is a Node/Express host that doesn't allow NPM. Ridiculous, I know. I can still load external files into code with requires statements however, so I think that there's hope I can get this working.
So my question is how do I get the whole entire Swig package into a single JS file that I can include from my Parse Express app like so:
var swig = require("./cloud/swig.js");
Worth noting that Parse breaks normal require statements, so the npm package as-is doesn't work without modifying each and every file in the node_modules folder to have cloud in its path (which is why my path above has cloud in it). Parse also chokes while uploading lots of small files. Concatenation is a necessity on this platform.
I have tried playing with browserify for hours, but nothing I do exposes the Swig object when I load the browserified file with the require statement. I think it may be the right tool, since the browserified file includes all the files from Swig, but it doesn't expose them externally.
My question is: can this be done with browserify, and if so, how? Or is there another way to concatenate an npm package down to one file so it can be more easily included on this platform?
Thanks so much.
Browserify is not the right tool for the job.
As the name implies, browserify is intended to be used to generate files you want to execute in the browser. It walks the require calls from an entrypoint (i.e. some JS file you pass to browserify) and bundles them in an object that maps their names to functions wrapping the modules. It does not expect a require function to already exist and doesn't make any use of it. It substitutes its own implementation of require that only does one thing: look up names from the bundle, execute the matching function and return its exports.
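To make that concrete, the prelude of a browserify bundle boils down to something like this (a heavily simplified sketch, not the literal generated code):
(function (modules, cache, entryIds) {
  // browserify's own require: it only knows about ids in the bundle map
  function internalRequire(id) {
    if (!cache[id]) {
      var module = { exports: {} };
      cache[id] = module;
      // Each bundled module is a function of (require, module, exports)
      modules[id][0].call(module.exports, internalRequire, module, module.exports);
    }
    return cache[id].exports;
  }
  entryIds.forEach(internalRequire);
})({ /* id -> [wrapperFn, dependencyMap] */ }, {}, [ /* entry module ids */ ]);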
You could theoretically require a browserify bundle, but it would just return an empty object (although it might mess with globals). And in all likelihood it might break because the bundled modules think they are being executed in a browser. This won't do any good.
The only sane option, if you want to stick with the host, is to copy over the node_modules folder from your local project folder. This may not work if your computer and the server are not 100% compatible (e.g. 32-bit vs 64-bit, Debian vs RedHat, OSX/Windows vs Linux), but this mostly depends on your exact dependencies (basically anything that is built with node-gyp can be a problem).
Node.js uses the node_modules folder when looking up dependencies in require calls automagically. If you can somehow get a node_modules folder with the right contents on the server, require("foo") will work as long as node_modules contains a module foo.
Ultimately, you are trying to use npm modules in Parse Cloud code and currently it's not possible:
https://parse.com/questions/using-npm-modules-in-cloud-code
But if you are only trying to use Swig, then as a work-around, you can consider using underscore template instead. Parse already includes underscore:
https://parse.com/docs/cloud_modules_guide#underscore
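For example, something along these lines should work in Cloud Code (the template string is only illustrative):
var _ = require('underscore');
// Compile the template once, then render it with data
var greet = _.template('Hello, <%= name %>!');
var html = greet({ name: 'Parse' });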
As part of my build/CI process on TeamCity, I would like to specify a set of JavaScript files referenced in my HTML and combine them into one.
In other words, this is what I would like to accomplish.
Before:
<script src="1.js" />
<script src="2.js" />
After
<script src="combined.js" />
I am looking for a commandline tool to concatenate 1.js and 2.js into combined.js. Then, a commandline tool (or maybe the same tool) to replace the references in the HTML file to this new file. Please tell me how I could accomplish this.
What I have tried so far:
I have been looking at grunt-usemin which looks good, but requires the build server to perform an npm install on each build to get the dependencies, then run it. That takes too long and isn't a nice solution, because we build+deploy very frequently.
I know I could also add my node_modules folder to git but this is also undesirable. It would be nice if grunt could run these modules installed globally but it is not the grunt way (unless I am mistaken, grunt wants everything installed locally).
Someone also suggested running grunt on the developer machines. Again, undesirable as we have transient VMs and this would break the development flow.
It would be nice if I could run grunt-usemin locally without grunt!
For combining the files you can just use cat
cat 1.js 2.js > combined.js
To replace the chunk of HTML you could use sed. Note that sed normally works one line at a time, so to match across the newline between the two script tags you need GNU sed's -z option (or a similarly multi-line-capable tool):
sed -z -i 's/<script src="1.js">\n<script src="2.js">/<script src="combined.js">/g' *.html
This is a rather general solution, but it seems a bit clunky to me. If you're rendering this html in node you might consider replacing it in javascript if the combined file exists.
var fs = require('fs');
// Serve the combined script tag if the build produced combined.js,
// otherwise fall back to the original two script tags.
if (fs.existsSync('combined.js')) {
  res.end(html_contents.replace('<script src="1.js"/>\n<script src="2.js"/>', '<script src="combined.js"/>'));
} else {
  res.end(html_contents);
}
For better performance you can of course use the asynchronous version, fs.exists.
I ended up creating my own: https://npmjs.org/package/hashcat
npm install -g hashcat
@gustavohenke's recommendation was close, but ultimately h5bp proved too problematic for use on a build server. Creating this module did not take long and serves my needs for now. I will mark this as the answer until a better one comes along.
Grunt uses PhantomJS to run headless QUnit tests in a very interesting way (correct me if I'm wrong please). Since I just started experimenting with those tools I don't fully understand it and don't know how to configure nor how to extend it.
I manage to get all working on my machine but I would like to not use the $PATH system variable. Instead, I would like to provide the path to PhantomJS's executable file via a setting which I could easily change and port to other environments.
How can I achieve this?
I suppose there are many ways, and I think the QUnit task from Grunt might have an easy answer. Ideally it would be just a matter of defining the path in the grunt.js file, something like this:
qunit: {
  phantomjsPath: 'path/to/phantomjs',
  files: ['test/**/*.html']
},
My environment is Mac OS X, but I accept solutions for any kind of environment, e.g. Windows, which is my build server.
Thanks in advance.
UPDATE The version of Grunt I am using is v0.3.17. The next big version, v0.4.x, has many changes and some are not backwards compatible.
Well, I think you have finally migrated to Grunt 0.4, and you probably use the grunt-contrib-qunit plugin for running QUnit tests under PhantomJS. Unfortunately you'll encounter the same issue: it's not possible to supply the path to the phantomjs executable. That's because grunt-contrib-qunit/grunt-contrib-phantomjs use the phantomjs npm module, which downloads PhantomJS on installation and hard-codes the path to the executable in its JS code. If you're experiencing such an issue then please check my blog post.
Unfortunately, grunt 0.3.x doesn't have a built-in option to specify a path to phantomjs -- it just executes phantomjs directly on the command line. Take a look at this helper function:
https://github.com/gruntjs/grunt/blob/master/tasks/qunit.js#L231
The situation seems to have changed in the as-yet-unreleased grunt 0.4, however:
https://github.com/gruntjs/grunt-lib-phantomjs/blob/master/lib/phantomjs.js#L22
As you can see, the next version of grunt uses the npm module phantomjs, which "exports a path string that contains the path to the phantomjs binary/executable". Since the phantomjs npm module is installed locally by grunt, it seems like this would spare you from having to worry about setting the PATH variable or installing a conflicting version of phantomjs.
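In other words, once that module is installed you can get at the binary location programmatically, roughly like so:
// With the phantomjs npm module installed locally
var phantomjs = require('phantomjs');
console.log(phantomjs.path); // absolute path to the downloaded executable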
Anyway, I'd consider taking a look at grunt-0.4 if you're willing to live on the bleeding edge.
Otherwise, you can always fork the grunt qunit task and modify it to look at your custom configuration variable.
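For instance, inside your fork you could read a custom option and fall back to the PATH lookup (phantomjsPath is a made-up option name, mirroring the config sketched in the question):
// In the forked qunit task, instead of spawning the bare 'phantomjs' command:
var phantomPath = grunt.config('qunit.phantomjsPath') || 'phantomjs';
// ...then hand phantomPath to the spawn/exec call that launches PhantomJS.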