Output Jest text coverage reporter to .txt - javascript

Jest allows you to specify a coverage reporter in the package.json like so
{
...
"jest": {
"coverageDirectory": "./coverage",
"coverageReporters": ["json", "text", ...]
}
...
}
However, this only outputs the .json summary to the coverage directory. It only prints the text version to the console. Is there a way to output the text version to a .txt file in the coverage folder as well?
I've been referring to the docs here, which say that it is compatible with any of the Istanbul reporters. The text Istanbul reporter appears to have support for writing to a file. Is there any way to utilize this?

In your Jest config you can specify the file option (and optionally the dir option, which defaults to the current working directory) by passing the reporter as a [name, options] tuple:
"jest": {
"coverageReporters": [["text", { "file": "coverage.txt" }]]
}
See docs for available options here.
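For example, assuming a recent Jest version that accepts [reporter, options] tuples in coverageReporters, the relevant part of package.json might look like the sketch below; the text report should then end up inside coverageDirectory (here ./coverage/coverage.txt):
"jest": {
  "coverageDirectory": "./coverage",
  "coverageReporters": [
    "json",
    ["text", { "file": "coverage.txt" }]
  ]
}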

Add json to coverageReporters in your Jest config:
"coverageReporters": ["json"]
Then install istanbul:
npm i -D istanbul
Add this script to package.json:
"scripts": {
"test": "jest --coverage && istanbul report --include coverage/coverage-final.json text > coverage.txt",
...
}
The script will generate the code coverage report file coverage-final.json, and istanbul will then generate the text report, redirected to coverage.txt.

@KerSplosh We ended up writing a custom script with shelljs. Basically it runs the tests and writes the coverage table to a file with fs.
const shell = require("shelljs");
const path = require("path");
const fs = require("fs");
const result = shell.exec("yarn test --coverage");
fs.writeFileSync(
path.resolve(".", "coverage.txt"),
result.substring(result.indexOf("|\nFile") + 2)
);
if (result.code !== 0) {
shell.exit(1);
}
This isn't ideal though. Preferably this would be done through the Jest configuration, but at the time I implemented this, I don't think it was possible.

If you happen to use react scripts on top of jest:
Add these snippets to their sections in package.json:
"scripts": {
"cover:report": "react-scripts test --coverage .> coverage/coverage-report.txt",
}
...
"jest": {
"collectCoverageFrom": [
"src/**/*.ts*"
],
"coverageReporters": ["text"]
}
This generates a coverage report in the file coverage/coverage-report.txt.
The lone dot before the > redirect tells the script to run all test files (except for ignored ones) that match the "." pattern, which would typically be all of them.
Modify the "collectCoverageFrom" array of strings to include files/folders as needed.
Unfortunately this command doesn't exit on its own, so you have to press Ctrl+C to end it when it just hangs there after being done.
To run it in terminal: "yarn cover:report"
The result contains plaintext table of coverage results.
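If the hanging watcher is a problem, one option (a sketch, assuming a standard create-react-app setup) is to run the tests in CI mode so the runner exits after a single pass:
"scripts": {
  "cover:report": "CI=true react-scripts test --coverage . > coverage/coverage-report.txt"
}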

Related

Can I turn off create-react-app chunking mechanism?

I am setting up my React app project using create-react-app.
I was wondering if there is a way to turn off the chunking mechanism that is built into react-scripts. The thing is that I need to fix the name of the bundle created by the build.
It can be done by extending your CRA setup with the react-app-rewired package, which allows you to modify the webpack config.
Changes needed to remove the hash from build file names:
Install react-app-rewired
npm install react-app-rewired --save-dev
Create a config-overrides.js file in your root folder (where package.json is).
Place the following code in the config-overrides.js file. It keeps all CRA settings and only removes the hash part from the filenames.
module.exports = function override(config, env) {
config.output = {
...config.output, // copy all settings
filename: "static/js/[name].js",
chunkFilename: "static/js/[name].chunk.js",
};
return config;
};
Use the new config: in the package.json scripts section, replace "build": "react-scripts build" with "build": "react-app-rewired build".
Unless you are going to change more configuration, it is enough to use react-app-rewired only in build. Otherwise, replace react-scripts with react-app-rewired in the other scripts as well, except eject.
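For reference, the scripts section would then look something like this (a sketch with only build switched over):
"scripts": {
  "start": "react-scripts start",
  "build": "react-app-rewired build",
  "test": "react-scripts test",
  "eject": "react-scripts eject"
}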
I've found that you can disable chunking by setting the splitChunks webpack configuration. For more details check https://github.com/facebook/create-react-app/issues/5306#issuecomment-431431877
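The override from that comment looks roughly like the following when applied to the webpack config object (the same settings appear in the rewire-based answers below):
config.optimization.splitChunks = {
  cacheGroups: {
    default: false,
  },
};
config.optimization.runtimeChunk = false;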
However, this does not remove the contenthash part from the bundle name and you will still have that random string in the name.
To remove this, go to your webpack.config and edit the bundle name
'static/js/[name].[contenthash:8].js' => 'static/js/[name].js'
This is an extended and improved version of Darko's answer. I created it mostly to save time for others who are not fully satisfied with the solution mentioned in this comment and didn't have the patience to dig down to this comment, which solved the issue in a much nicer way.
The main idea of this "hacky" approach is to rewrite the standard react-scripts webpack configuration on the fly and inject it back into the original scripts.
For that you need to install the rewire package from npmjs.org, like so:
npm install rewire --save-dev
Then you create a separate build script that will "wrap" the original react-scripts build script and make sure it receives the corrected webpack configuration. The conventional way is to save this file inside the ./scripts folder, so let's call it ./scripts/build.js. Its content:
const rewire = require('rewire');
const path = require('path');
// Pointing to file which we want to re-wire — this is original build script
const defaults = rewire('react-scripts/scripts/build.js');
// Getting configuration from original build script
let config = defaults.__get__('config');
// If we want to move build result into a different folder, we can do that!
// Please note: that should be an absolute path!
config.output.path = path.join(path.dirname(__dirname), 'custom/target/folder');
// If we want to rename resulting bundle file to not have hashes, we can do that!
config.output.filename = 'custom-bundle-name.js';
// And the last thing: disabling splitting
config.optimization.splitChunks = {
cacheGroups: {
default: false,
},
};
config.optimization.runtimeChunk = false;
Then we should use this build script instead of the standard one in our package.json, something like so:
...
"scripts": {
"start": "react-scripts start",
"build": "node ./scripts/build.js",
"test": "react-scripts test",
"eject": "react-scripts eject"
},
...
As others have pointed out you can try this with react-app-rewired instead of ejecting. Here is a version that also handles css and media files:
After installing npm install react-app-rewired --save-dev I created a config-overrides.js with the following content:
module.exports = function override(config, env) {
if (env !== "production") {
return config;
}
// Get rid of hash for js files
config.output.filename = "static/js/[name].js"
config.output.chunkFilename = "static/js/[name].chunk.js"
// Get rid of hash for css files
const miniCssExtractPlugin = config.plugins.find(element => element.constructor.name === "MiniCssExtractPlugin");
miniCssExtractPlugin.options.filename = "static/css/[name].css"
miniCssExtractPlugin.options.chunkFilename = "static/css/[name].css"
// Get rid of hash for media files
config.module.rules[1].oneOf.forEach(oneOf => {
if (!oneOf.options || oneOf.options.name !== "static/media/[name].[hash:8].[ext]") {
return;
}
oneOf.options.name = "static/media/[name].[ext]"
});
return config;
};
I don't know how to turn off chunking, but here is what you could do to try to achieve your goal:
Update to the latest react and react-dom: run yarn add react@next react-dom@next (or the equivalent npm command).
You should now have the latest react versions - so you can code split using React.lazy/React.Suspense, use hooks and so on.
So now you can name your chunks (component and dependency examples below):
const MyComp = lazy(() => import(/* webpackChunkName: "MyChunkName" */ './MyComp'));
const myLib = await import(/* webpackChunkName: "myLib" */ 'myLib');
If you run into errors when using the dynamic import() syntax, you need the babel-plugin-syntax-dynamic-import plugin. Put it in the "babel" field in your package.json.
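A sketch of what that "babel" field might look like (the react-app preset assumes a create-react-app setup; adjust to your own presets):
"babel": {
  "presets": ["react-app"],
  "plugins": ["syntax-dynamic-import"]
}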
Now you can name your chunks and implement the latest way to code split - hope that helps. Here is a link to React.lazy React.Suspense - https://reactjs.org/blog/2018/10/23/react-v-16-6.html
There is a hack without needing eject:
yarn add --dev rewire
Create a file in the root and name it build-non-split.js.
Fill it with the code below:
const rewire = require('rewire');
const defaults = rewire('react-scripts/scripts/build.js');
let config = defaults.__get__('config');
config.optimization.splitChunks = {
cacheGroups: {
default: false,
},
};
config.optimization.runtimeChunk = false;
change the build script inside your package.json to:
"build": "node ./scripts/build-non-split.js",
yarn build

How do you use Istanbul Code Coverage with transpiled Typescript?

I've been reading articles on this all morning trying to get my environment set up correctly, but for some reason I'm not getting it. My setup:
/app
... source (mixed js and ts)
/scripts
... copied source (js)
typescripts.js // transpiled typescript with inline mapping
Tests run fine, and with the mapping, debugging in the Chrome debugger is mapped correctly. But Istanbul sees the typescripts.js file as one file instead of the concatenation of dozens of other files.
To generate the typescript source I'm using gulp-typescript. The source (excluding tests) are transpiled to the aforementioned typescripts.js, and the tests are transpiled individually and copied to /scripts.
var ts = require('gulp-typescript');
var sourcemaps = require('gulp-sourcemaps');
var concat = require('gulp-concat');
module.exports = function (gulp, config) {
'use strict';
// Runs dot ts files found in `www` through the typescript compiler and copies them as js
// files to the scripts directory
gulp.task('typescript', ['typescript:tests'], function () {
return gulp.src(config.paths.typescript) // [ './www/app/**/*.ts', '!./www/app/**/*.test.ts', '!./www/app/**/*.mock.ts' ]
.pipe(sourcemaps.init())
.pipe(ts(ts.createProject(config.paths.tsConfig))) // './tsconfig.json'
.js
.pipe(concat(config.sourcemaps.dest)) // typescripts.js
.pipe(sourcemaps.write(config.sourcemaps)) // { includeContent: false, sourceRoot: '/app' } - i've also tried absolute local path
.pipe(gulp.dest(config.paths.tmpScripts)); // ./www/scripts
});
gulp.task('typescript:tests', [], function() {
return gulp.src(config.paths.typescriptTests) // [ './www/app/**/*.test.ts', './www/app/**/*.mock.ts' ]
.pipe(ts(ts.createProject(config.paths.tsConfig))) // './tsconfig.json'
.pipe(gulp.dest(config.paths.tmpScripts)); // ./www/scripts
});
};
The resulting typescripts.js has the inline sourcemap. With the sourcemap, the dozen or so ts files results in 106kb.
So from here tests and debugging works fine.
Now, in an attempt to get Istanbul code coverage working properly, I've installed karma-sourcemap-loader and added it to the preprocessors.
preprocessors: {
'www/scripts/typescripts.js': ['sourcemap'],
'www/scripts/**/*.js': ['coverage']
},
I'd think this is what I'd need to do. But it does not show code coverage on the source files. I tried the absolute path from C:/ but that didn't work either. I also tried the different options in gulp-sourcemaps like adding source (which pushed the file to 160kb), but no luck either.
Has anyone gotten this to work? Any ideas what I could be doing wrong?
TL;DR: There is a tool: https://github.com/SitePen/remap-istanbul described as A tool for remapping Istanbul coverage via Source Maps
The article on SitePen describes it in more detail:
Intern as well as other JavaScript testing frameworks utilise Istanbul
for their code coverage analysis. As we started to adopt more and more
TypeScript for our own projects, we continued to struggle with getting
a clear picture of our code coverage as all the reports only included
the coverage of our emitted code. We had to try to use the compilers
in our minds to try to figure out where we were missing test coverage.
We also like to set metrics around our coverage to let us track if we
are headed the right direction.
A couple of us started exploring how we might be able to accomplish
mapping the coverage report back to its origins and after a bit of
work, we created remap-istanbul, a package that allows Istanbul
coverage information to be mapped back to its source when there are
Source Maps available. While we have been focused on TypeScript, it
can be used wherever the coverage is being produced on emitted code,
including the tools mentioned above!
How to use the tool with gulp: https://github.com/SitePen/remap-istanbul#gulp-plugin
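A minimal gulp sketch based on the plugin's README (the input and output paths are examples):
var gulp = require('gulp');
var remapIstanbul = require('remap-istanbul/lib/gulpRemapIstanbul');

gulp.task('remap-istanbul', function () {
  // Take the coverage produced against the emitted JS and map it back to the .ts sources
  return gulp.src('coverage/coverage-final.json')
    .pipe(remapIstanbul({
      reports: {
        'json': 'coverage/coverage-remapped.json',
        'html': 'coverage/html-report'
      }
    }));
});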
If you want source map support with Istanbul, you can use the 1.0 alpha release as the current release does not support source maps. I have it set up using ts-node in http://github.com/typings/typings (see https://github.com/typings/typings/blob/bff1abad91dabec1cd8a744e0dd3f54b613830b5/package.json#L19) and source code is being mapped. It looks great and is nice to have my tests and code coverage all running in-process with zero transpilation. Of course, you can use Istanbul 1.0 with the transpiled JavaScript.
For the browser implementation you're using, I'd have to see more of the code you're working with to tell whether this will just work for you, but try 1.0.0-alpha.2 and see what happens.
As blakeembrey mentioned, Istanbul 1.x handles it well.
Below an example of pure npm script that does it with Jasmine.
See https://github.com/Izhaki/Typescript-Jasmine-Istanbul-Boilerplate.
package.json (the relevant stuff)
{
"scripts": {
"postinstall": "typings install dt~jasmine --save --global",
"test": "ts-node node_modules/.bin/jasmine JASMINE_CONFIG_PATH=jasmine.json",
"test:coverage": "ts-node node_modules/istanbul/lib/cli.js cover -e .ts -x \"*.d.ts\" -x \"*.spec.ts\" node_modules/jasmine/bin/jasmine.js -- JASMINE_CONFIG_PATH=jasmine.json"
},
"devDependencies": {
"istanbul": "^1.1.0-alpha.1",
"jasmine": "^2.4.1",
"ts-node": "^0.9.3",
"typescript": "^1.8.10",
"typings": "^1.3.1"
},
}
Output
This repo works. I ran it and can see the tests running. The HTML view is also generated.
https://github.com/Izhaki/Typescript-Jasmine-Istanbul-Boilerplate
None of the examples provided worked for my Node.js project (written in TypeScript). I wanted to run unit tests in Jasmine, with coverage from Istanbul.
I ended up getting it working with the following.
package.json:
{
"scripts": {
"lint": "tslint 'src/**/*.ts'",
"remap": "./node_modules/.bin/remap-istanbul -i ./coverage/coverage-final.json -t html -o ./coverage && rimraf ./coverage/dist",
"test": "npm run lint && rimraf dist coverage && tsc --project tsconfig-test.json && ./node_modules/.bin/istanbul cover ./node_modules/.bin/jasmine JASMINE_CONFIG_PATH=jasmine.json && npm run remap"
},
"devDependencies": {
"#types/jasmine": "2.8.6",
"#types/node": "9.6.6",
"istanbul": "0.4.5",
"jasmine": "3.1.0",
"remap-istanbul": "0.11.1",
"rimraf": "2.6.2",
"tslint": "5.9.1",
"typescript": "2.8.1"
}
}
jasmine.json
{
"spec_dir": "dist",
"spec_files": [
"**/*.spec.js"
],
"stopSpecOnExpectationFailure": false,
"random": false
}
.istanbul.yml
instrumentation:
root: ./dist
excludes: ['**/*.spec.js', '**/fixtures/*.js']
include-all-sources: true
reporting:
reports:
- html
- json
- text-summary
dir: ./coverage
tsconfig-test.json
{
"compilerOptions": {
"declaration": true,
"lib": [
"dom",
"es6"
],
"module": "commonjs",
"noImplicitAny": true,
"outDir": "dist",
"sourceMap": true,
"target": "es5"
},
"include": [
"src/**/*.ts"
],
"exclude": [
"node_modules"
]
}

How to automatically zip files with Node.js and npm

Is there a way to automatically zip certain files at the build time with Node.js and npm?
For example, I have a project, that file structure looks like this:
Project/
--lib/
--node_modules/
--test/
--index.js
--package.json
I want to be able to zip lib folder, certain modules from node_modules and index.js into some zip archive to upload it on the AWS Lambda, for example. I do not need test folder or test Node.js modules (mocha and chai) to be zipped. I have even created a bash script for generating zip file, but is there a way to automatically execute this script, when 'npm install' is called?
This should be a standard problem and it should have a standard solution, but I was unable to discover such.
UPDATE
thanks to michael, decided to use gulp. This is my script, in case some one else will need it for AWS Lambda:
var gulp = require('gulp');
var clean = require('gulp-clean');
var zip = require('gulp-zip');
var merge = require('merge-stream');
gulp.task('clean', function () {
var build = gulp.src('build', {read: false})
.pipe(clean());
var dist = gulp.src('dist', {read: false})
.pipe(clean());
return merge(build, dist);
});
gulp.task('build', function() {
var index = gulp.src('index.js')
.pipe(gulp.dest('build'));
var lib = gulp.src('lib/**')
.pipe(gulp.dest('build/lib'));
var async = gulp.src('node_modules/async/**')
.pipe(gulp.dest('build/node_modules/async'));
var collections = gulp.src('node_modules/collections/**')
.pipe(gulp.dest('build/node_modules/collections'));
var underscore = gulp.src('node_modules/underscore/**')
.pipe(gulp.dest('build/node_modules/underscore'));
var util = gulp.src('node_modules/util/**')
.pipe(gulp.dest('build/node_modules/util'));
var xml2js = gulp.src('node_modules/xml2js/**')
.pipe(gulp.dest('build/node_modules/xml2js'));
return merge(index, lib, async, collections, underscore, util, xml2js);
});
gulp.task('zip', ['build'], function() {
return gulp.src('build/*')
.pipe(zip('archive.zip'))
.pipe(gulp.dest('dist'));
});
gulp.task('default', ['zip']);
I realize this answer comes years too late for the original poster. But I had virtually the same question about packaging up a Lambda function, so for posterity, here's a solution that doesn't require any additional devDependencies (like gulp or grunt) and just uses npm pack along with the following package.json (but does assume you have sed and zip available to you):
{
"name": "my-lambda",
"version": "1.0.0",
"scripts": {
"postpack": "tarball=$(npm list --depth 0 | sed 's/#/-/g; s/ .*/.tgz/g; 1q;'); tar -tf $tarball | sed 's/^package\\///' | zip -#r package; rm $tarball"
},
"files": [
"/index.js",
"/lib"
],
"dependencies": {
"async": "*",
"collections": "*",
"underscore": "*",
"util": "*",
"xml2js": "*"
},
"bundledDependencies": [
"async",
"collections",
"underscore",
"util",
"xml2js"
],
"devDependencies": {
"chai": "*",
"mocha": "*"
}
}
Given the above package.json, calling npm pack will produce a package.zip file that contains:
index.js
lib/
node_modules/
├── async/
├── collections/
├── underscore/
├── util/
└── xml2js/
The files array is a whitelist of what to include. Here, it's just index.js and the lib directory.
However, npm will also automatically include package.json, README (and variants like README.md), CHANGELOG (and its variants), and LICENSE (and the alternative spelling LICENCE) unless you explicitly exclude them (e.g. with .npmignore).
The bundledDependencies array specifies what packages to bundle. In this case, it's all the dependencies but none of the devDependencies.
Finally, the postpack script is run after npm pack because npm pack generates a tarball, but we need to generate a zip for AWS Lambda.
A more detailed explanation of what the postpack script is doing is available at https://hackernoon.com/package-lambda-functions-the-easy-way-with-npm-e38fc14613ba (and is also the source of the general approach).
If you're UNIX-based you could also just use the zip command in one of your scripts:
"scripts": {
"zip": "zip -r build.zip build/"
"build": "build",
"build-n-zip": "build && zip
}
The above creates a build.zip at the root, which is a zipped up version of the /build folder.
If you wanted to zip multiple folders/files, just add them to the end:
"scripts": {
"zip": "zip -r build.zip build/ some-file.js some-other-folder/"
}
Note
If a build.zip already exists in the folder, the default behaviour is for zip to add files to that existing archive. So many people who are continuously building will probably want to delete the build.zip first:
"scripts": {
"zip": "rm -f build.zip && zip -r build.zip build",
"build": "build",
"build-n-zip": "yarn build && yarn zip"
}
I would go with gulp, using gulp-sftp, gulp-tar and gulp-gzip, plus an alias as the command. Create a file called .bash_aliases in your user's home folder containing:
alias installAndUpload='npm install && gulp runUploader'
After opening a new terminal session you can call both actions at once with this alias.
A gulpfile could look something like this:
var gulp = require('gulp');
var watch = require('gulp-watch');
var sftp = require('gulp-sftp');
var tar = require('gulp-tar');
var gzip = require('gulp-gzip');
gulp.task('runUploader', function () {
gulp.src('.path/to/folder/to/compress/**')
.pipe(tar('archive.tar'))
.pipe(gzip())
.pipe(gulp.dest('path/to/folder/to/store')) // if you want a local copy
.pipe(sftp({
host: 'website.com',
user: 'johndoe',
pass: '1234'
}))
});
Of course, you can also add gulp-watch to automatically create the tar/zip and upload it whenever there is a change in the directory.
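A sketch of that, reusing the requires from the gulpfile above (the glob is an example):
gulp.task('watchUploader', function () {
  // Re-run the compress-and-upload task whenever something inside the folder changes
  return watch('path/to/folder/to/compress/**', function () {
    gulp.start('runUploader');
  });
});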
You should take a look at npm scripts.
You'll still need a bash script lying around in your repository, but it will be automatically triggered by some npm tasks when they are executed. For example:
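Something like this (the script path is a placeholder for your own zip script) would run it after every npm install:
"scripts": {
  "postinstall": "./scripts/zip-for-lambda.sh"
}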
npm-pack-zip worked for me.
npm install --save-dev npm-pack-zip
To publish the whole lambda to AWS I used this npm script in package.json:
"publish": "npm-pack-zip && aws lambda update-function-code --function-name %npm_package_name% --zip-file fileb://%npm_package_name%.zip && rm %npm_package_name%.zip"
You can use zip-build; this little package will use the data in your package.json file and create a compressed file named project-name_version.zip.
Disclaimer: I am a developer of this library.
How to use zip-build
Just install it in your project as a dev dependency with:
$ npm install --save-dev zip-build
Then modify the build script in your package.json, adding && zip-build at the end, like this:
"scripts": {
"build": your-build-script && zip-build
}
If your build directory is named something other than build, and your desired directory for compressed files is named something other than dist, you can provide the directory names as arguments to zip-build:
"scripts": {
"build": your-build-script && zip-build build-dirname zip-dirname
}
If you need to automate tasks, take a look at Grunt or Gulp.
In the case of Grunt, the needed plugins (a Gruntfile sketch follows the links):
https://www.npmjs.com/package/grunt-zip
https://www.npmjs.com/package/grunt-aws-lambda
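A minimal Gruntfile sketch using grunt-zip (the file list and archive name are examples, not a complete Lambda setup):
module.exports = function (grunt) {
  grunt.loadNpmTasks('grunt-zip');

  grunt.initConfig({
    zip: {
      lambda: {
        src: ['index.js', 'lib/**/*', 'node_modules/**/*'],
        dest: 'dist/archive.zip'
      }
    }
  });

  grunt.registerTask('default', ['zip']);
};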
Check out my gist at https://gist.github.com/ctulek/6f16352ebdfc166ce905
This uses gulp for all the tasks you mentioned except creating the lambda function initially (it only updates the code)
It assumes every lambda function is implemented in its own folder, and you need to define your AWS credential profile.

Unit testing using Jasmine and TypeScript

I am trying to get a unit test written in Typescript using Jasmine to compile. With the following in my unit-test file, Resharper prompts me with a link to import types from jasmine.d.ts.
/// <reference path="sut.ts" />
/// <reference path="../../../scripts/typings/jasmine/jasmine.d.ts" />
describe("Person FullName", function () {
var person;
BeforeEach(function () {
person = new Person();
person.setFirstName("Joe");
person.setLastName("Smith");
});
It("should concatenate first and last names", function () {
Expect(person.getFullName()).toBe("Joe, Smith");
});
});
So I click on the link and end up with the following (actually resharper only prefixed the describe function with "Jasmine.", so I manually prefixed the other Jasmine calls):
/// <reference path="sut.ts" />
/// <reference path="../../../scripts/typings/jasmine/jasmine.d.ts" />
import Jasmine = require("../../../Scripts/typings/jasmine/jasmine");
Jasmine.describe("Person FullName", function () {
var person;
Jasmine.BeforeEach(function () {
person = new Person();
person.setFirstName("Joe");
person.setLastName("Smith");
});
Jasmine.It("should concatenate first and last names", function () {
Jasmine.Expect(person.getFullName()).toBe("Joe, Smith");
});
});
However the import statement has a red squiggly line with error message "Unable to resolve external module ../../../scripts/typings/jasmine/jasmine. Module cannot be aliased to a non-module type"
Any idea what is causing this error? I've checked that the "Module System" option is set to AMD in my project build settings. I've also checked that the jasmine module is defined in jasmine.d.ts. I downloaded this file from DefinitelyTyped site.
declare module jasmine {
...
}
Here's (in my opinion) the best way to test a ts-node app as of 2018:
npm install --save-dev typescript jasmine @types/jasmine ts-node
In package.json:
{
"scripts": {
"test": "ts-node node_modules/jasmine/bin/jasmine"
}
}
In jasmine.json, change the file pattern to *.ts:
"spec_files": ["**/*[sS]pec.ts"],
In your spec files:
import "jasmine";
import something from "../src/something";
describe("something", () => {
it("should work", () => {
expect(something.works()).toBe(true);
});
});
To run the tests:
npm test
This will use the locally installed versions of ts-node and jasmine. This is better than using globally installed versions, because with local versions, you can be sure that everyone is using the same version.
Note: if you have a web app instead of a node app, you should probably run your tests using Karma instead of the Jasmine CLI.
Put this at the top of your typescript spec file:
/// <reference path="../../node_modules/#types/jasmine/index.d.ts" />
let Jasmine = require('jasmine');
You must install the following Jasmine modules for that to work:
$ npm install jasmine-core jasmine @types/jasmine @ert78gb/jasmine-ts --save-dev
Once you do that, the IDE (such as WebStorm) will recognize Jasmine and its functions such as describe(), it(), and expect(), so you don't need to prefix them with "Jasmine.". Also, you can run your spec files from the command line using the jasmine-ts module. Install these command-line tools globally:
$ npm install -g jasmine @ert78gb/jasmine-ts
Then configure the "jasmine" command line module so that Jasmine can find its configuration file. Then you should be able to run jasmine-ts and your spec file should run fine from the command line:
./node_modules/.bin/jasmine-ts src/something.spec.ts
.. and, you can configure your IDE to run it like that as well, and debug runs that way should also work (works for me).
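For the configuration step, a sketch of a jasmine.json pointed at TypeScript specs (generate the default file with jasmine init and then adjust it; spec_dir here is an example):
{
  "spec_dir": "src",
  "spec_files": ["**/*[sS]pec.ts"],
  "stopSpecOnExpectationFailure": false,
  "random": false
}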
Writing your tests this way, you can run a Jasmine test spec on the server side without Karma, or run it in a web browser using Karma. Same typescript code.
If you have issues with imports, use tsconfig-paths
npm i ts-node tsconfig-paths @types/jasmine jasmine --save-dev
Run typescript-enabled jasmine:
ts-node -r tsconfig-paths/register node_modules/jasmine/bin/jasmine.js
Ensure that your Jasmine config will search for .ts files:
"spec_files": [
"**/*[sS]pec.ts"
],
"helpers": [
"helpers/**/*.ts"
],
To test your scripts you may also need polyfills if you use them in your project. Create a helper file with the required imports, like helpers/global/polyfill.ts:
import 'core-js';
For me I did the following:
Install Typings
npm install typings --global
Then add the typings in for jasmine
typings install dt~jasmine --save --global
You could try a side-effect-only import, which brings in the @types/jasmine declarations and places the jasmine functions into the global scope, so you don't need to prefix each call with jasmine., allowing a quick port from existing unit tests; it still plays nice with webpack.
// tslint:disable-next-line:no-import-side-effect
import "jasmine";
describe("My Unit Test", () => { /* ... */ } );
Of course you still need to install jasmine and the typings:
$ npm i jasmine @types/jasmine --save-dev
But no need for specialized jasmine loaders for ts or node. Just run jasmine against the compiled js files:
$ node ./node_modules/jasmine/bin/jasmine.js --config=test/support/jasmine.json
Assuming your typescript files are within a "test" subdirectory compiling to bin/test and you have a test/support/jasmine.json with something like this:
{
"spec_dir": "bin/test",
"spec_files": [
"**/*[sS]pec.js"
],
"stopSpecOnExpectationFailure": false,
"random": false
}
P.S. all of the above works on Windows too
Include this in your Jasmine HTML file, ...
<script type="text/javascript" src="jasmine/lib/jasmine-2.0.0/jasmine.js"></script>
...or install the npm jasmine package:
npm install --save-dev jasmine
When you are using the second way (Jasmine as a module) you have to import it:
var jasmine = require('jasmine');
or
import jasmine from 'jasmine';
then change the other code:
jasmine.describe("Person FullName", function () {
var person;
jasmine.beforeEach(function () {
person = new Person();
person.setFirstName("Joe");
person.setLastName("Smith");
});
jasmine.it("should concatenate first and last names", function () {
jasmine.expect(person.getFullName()).toBe("Joe, Smith");
});
});
Personally, I would prefer the first way, without using the jasmine npm module. (I haven't tested the module yet.)
You didn't ask for this, but for bonus points: once you get AJ's answer up and running (using ts-node to invoke the Jasmine startup script), you can add a new task:
"scripts": {
"watch": "ts-node-dev --respawn -- ./node_modules/jasmine/bin/jasmine src/**.spec.ts"
}
Of course, you can pass your specs or any other arguments using Jasmine's config file instead, if you like. Now, Jasmine will run all your specs once, then ts-node-dev will sit in the background waiting for your tests or anything they might have require'd to change, at which point jasmine will be run again. I haven't worked out a way to only run the tests that have changed (or tests whose imports have changed) yet -- as far as I can tell, that's not supported anyway.
My folder structure
The spec folder is in the root of the project:
spec
\- dist // compiled tests
\- helpers // files modified testing env
\- ts-console.ts // pretty prints of results
\- support
\- jasmine.json
\- YourTestHere.spec.ts
\- tsconfig.json // tsconfig for your tests
File contents
ts-console.ts
const TSConsoleReporter = require("jasmine-console-reporter");
jasmine.getEnv().clearReporters();
jasmine.getEnv().addReporter(new TSConsoleReporter());
jasmine.json
{
"spec_dir": "spec/dist",
"spec_files": [
"**/*[sS]pec.js"
],
"helpers": [
"spec/helpers/**/*.js"
],
"stopSpecOnExpectationFailure": false,
"random": true
}
With an extra script in package.json:
"scripts": {
"test": "rm -rf ./spec/dist && tsc -p ./spec && jasmine"
}
and add the line "/spec/dist" to .gitignore.
Run your tests!
Run your tests with npm test.
How does it work?
The directory for tests is cleaned.
Tests are compiled to JS in the spec/dist folder.
Tests are run from this location.
I hope it will help you. Good coding.

How to cover React jsx files in Istanbul?

I'm trying to integrate my existing test processes to now include React, but am struggling on the code coverage part. I've been able to get my unit tests working fine by following this project/tutorial - https://github.com/danvk/mocha-react - http://www.hammerlab.org/2015/02/14/testing-react-web-apps-with-mocha/
I've been using Istanbul to cover my node code and it's working pretty well. However, I'm having trouble getting it to cover the jsx files that I'm using in my tests.
Here's an example of an existing Istanbul task, which also runs fine on vanilla js (node backend code)
var mocha = require('gulp-mocha');
var istanbul = require('gulp-istanbul');
gulp.task('test-api', function (cb) {
gulp.src(['api/**/*.js'])
.pipe(istanbul()) // Covering files
.pipe(istanbul.hookRequire()) // Force `require` to return covered files
.on('finish', function () {
gulp.src(['test/api/*.js'])
.pipe(mocha())
.pipe(istanbul.writeReports()) // Creating the reports after tests runned
.on('end', cb);
});
});
My issue (I think) is that I can't get Istanbul to recognize the jsx files, or they're not being matched to what was run in the tests. I tried using the gulp-react module to precompile the jsx to js so it can be used by Istanbul, but I'm not sure if it's working. It's not being covered somehow and I'm not sure where the issue is.
var mocha = require('gulp-mocha');
var istanbul = require('gulp-istanbul');
var react = require('gulp-react');
gulp.task('test-site-example', function (cb) {
gulp.src(["site/jsx/*.jsx"]) //Nothing is being reported by Istanbul (not being picked up)
.pipe(react()) //converts the jsx to js and I think pipes the output to Istanbul
.pipe(istanbul())
.pipe(istanbul.hookRequire()) // Force `require` to return covered files
.on('finish', function () {
gulp.src(['test/site/jsx/*.js'], { //tests run fine in mocha, but nothing being shown as reported by mocha (not covered)
read: false
})
.pipe(mocha({
reporter: 'spec'
}))
.pipe(istanbul.writeReports())
.on('end', cb);
});
});
Any ideas what I'm doing wrong? Or any clue on how to integrate a test runner (preferably Istanbul) into a Gulp-Mocha-React project?
Add a .istanbul.yml file to your app root and add '.jsx' to the instrumentation extensions...
Here is what I did.
// File .istanbul.yml
instrumentation:
root: .
extensions: ['.js', '.jsx']
To kickstart istanbul and mocha with jsx
$ istanbul test _mocha -- test/**/* --recursive --compilers js:babel/register
This will convert your .jsx files to .js and then istanbul will cover them.
I hope this helps. It worked for me.
There is a library you can take a look at, gulp-jsx-coverage (https://github.com/zordius/gulp-jsx-coverage).
In case someone else is having the same problem and the solutions above don't work, I found that adding a simple -e .jsx flag worked:
"scripts": {
"start": "meteor run",
"test": "NODE_ENV=test mocha --recursive --compilers js:babel-register --require tests/index.js ./tests/**/*.test.js",
"coverage": "NODE_ENV=test nyc -all -e .jsx npm test"
}
This solution was found at: http://www.2devs1stack.com/react/tape/testing/nyc/coverage/2016/03/05/simple-code-coverage-with-nyc.html
A great tutorial can be found at https://istanbul.js.org/docs/tutorials/es2015/
I just slightly modified it to include react. (I also used 'env' instead of 'es2015', but either should work.) Here are my configurations:
npm i babel-cli babel-register babel-plugin-istanbul babel-preset-env babel-preset-react cross-env mocha nyc --save-dev
.babelrc
{
"presets": ["env", "react"],
"env": {
"test": {
"plugins": [
"istanbul"
]
}
}
}
package.json
"scripts": {
"test": "cross-env NODE_ENV=test nyc mocha test/**/*.spec.js --compilers js:babel-register"
}
"nyc": {
"require": [
"babel-register"
],
"reporter": [
"lcov",
"text"
],
"sourceMap": false,
"instrument": false
}
