Jasmine Runs Test Three Times - javascript

I am running Karma/Jasmine/Angular 2.0 tests on my development box. Just recently, Jasmine on my development box decided to start running my tests three times. Yes, exactly three times, every time.
On the first run, everything passes as expected. However, on the second and third pass, all of the same things fail. It always acknowledges that there are 7 tests, but runs 21, and 10 fail (first-grade math out the window)?
This also fails on Travis with SauceLabs. (Note: that links to an older build which had 3 tests, but ran 9, with 5 failing.)
I have a screenshot, my karma.conf.js file, and the one suite which started this whole thing. Any help will be greatly appreciated.
Culprit [TypeScript] (Remove this and problem solved on my dev box):
Full source
describe('From the Conductor Service', () => {
    let arr: Array<ComponentStatusModel> = null;
    let svc: ConductorService = null;

    beforeEach(() => {
        arr = [/* Inits the array */];
        svc = new ConductorService();
    });

    describe('when it is handed a container to hold objects which need to be loaded', () => {
        // More passing tests...

        /// vvvvv The culprit !!!!!
        describe('then when you need to access the container', () => {
            beforeEach(() => {
                svc.loadedContainer = arr;
            });

            it('it should always be available', () => {
                assertIsLocalDataInTheService(arr, svc.loadedContainer);
            });
        });
        /// ^^^^^ End of culprit !!!!!
    });

    // More passing tests...
});
Failing Tests:
Browser Screenshots:
Not sure if this is related, but before all of the errors happen, the Jasmine call stack is smaller (left, observe scrollbar). After the errors start, the stack just gets bigger with repeating calls to the same functions (right, observe scrollbar).
Suite Stack is Wrong:
In my tests, the Nanobar and Conductor spec files are totally separate. However, you can see the suites array includes entries from both the Nanobar and Conductor specs. Somehow Jasmine mashed these two spec files together (after everything started failing), which resulted in my describe() statements not making any sense when published to the console.
Simplified karma.conf.js:
Full source
module.exports = function (config) {
    config.set({
        autoWatch: false,
        basePath: '.',
        browsers: ['Chrome'],
        colors: true,
        frameworks: ['jasmine'],
        logLevel: config.LOG_INFO,
        port: 9876,
        reporters: ['coverage', 'progress'],
        singleRun: true,
        coverageReporter: {
            // Code coverage config
        },
        files: [
            // Loads everything I need to work
        ],
        plugins: [
            'karma-chrome-launcher',
            'karma-coverage',
            'karma-jasmine'
        ],
        preprocessors: {
            'app/**/*.js': ['coverage']
        },
        proxies: {
            // Adjust the paths
        }
    })
}

Can you try restarting the browser before the first assertion in each of your test files?
Try this:
browser.restart();
I had the same problem and this fixed it for me.
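For context, browser.restart() is part of Protractor's API rather than plain Karma/Jasmine; if the tests do run under Protractor, a minimal sketch of where such a restart could go (the placement and spec name are my assumptions):
// Hypothetical spec file; assumes Protractor's global `browser` object is available.
describe('Conductor spec', () => {
    beforeAll(async () => {
        // Restart the browser so this spec file starts from a clean session.
        await browser.restart();
    });

    // ...the actual specs...
});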

The first thing to know is that these tests run in random order. If you set up some data in one test case and expect to reuse it in another, that will not work.
You have to declare the data in a beforeEach so that every test case gets fresh data.
All test cases run independently.
If you are using an array or an object, you must deep clone it first, because arrays and objects are passed by reference; if you mutate a value, you also change the original.
In most cases, when tests fail like this, there is an error in the data being passed into the test cases.
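A minimal sketch of that advice, reusing the names from the question above (the JSON round-trip is just one way to deep clone):
describe('From the Conductor Service', () => {
    const template = [/* canonical fixture data */];
    let arr;
    let svc;

    beforeEach(() => {
        // Deep clone the fixture so no spec can mutate the shared template by reference.
        arr = JSON.parse(JSON.stringify(template));
        svc = new ConductorService();
    });

    it('keeps the container available', () => {
        svc.loadedContainer = arr;
        assertIsLocalDataInTheService(arr, svc.loadedContainer);
    });
});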

I would try to debug this and pinpoint the exact cause.
This usually happens to me when I have redirection or page-reload code inside the functions I'm testing.
To narrow it down, you can prefix describe and it with an f (i.e. fdescribe and fit) to focus on a single suite or spec, as shown below.
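For example, to focus only on the suspect suite while debugging (Jasmine skips everything that is not focused):
// Only focused suites/specs run while any fdescribe/fit is present.
fdescribe('then when you need to access the container', () => {
    fit('it should always be available', () => {
        assertIsLocalDataInTheService(arr, svc.loadedContainer);
    });
});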

Related

Cypress: run entire test suite multiple times with different data

I have seen several posts about running a single test with different parameters. It is also documented here.
However, I couldn't find any examples of how to run the entire test suite, i.e. tests across multiple files in the cypress/integration folder multiple times with different data.
My scenario is that I want to stub different responses from an API I'm calling and run all test cases against the different responses. So for the 1st run, I would put in support/index.js:
beforeEach(() => {
    cy.intercept("GET", "example/API", { fixture: "fixture1.json" });
});
and for the 2nd run I would put:
beforeEach(() => {
    cy.intercept("GET", "example/API", { fixture: "fixture2.json" });
});
and so on. All my test cases are identical for different responses and I expect them to have the same result regardless of the data returned by the API.
Running a suite with different parameters
cypress run --env fixture=fixture1
// or
cypress run --env fixture=fixture2
Ref Environment Variables
In support/index.js
beforeEach(() => {
    const fixtureName = `${Cypress.env('fixture')}.json`
    cy.intercept("GET", "example/API", { fixture: fixtureName });
})
I expect them to have the same result regardless of the data returned by the API - I'm not sure what that means exactly but to me it suggests you don't have to worry about changing the fixture.
If you want a single call from the command line to run the whole suite, say 3 times with a different fixture each time, consider using the Cypress Module API
In a script file, e.g. /scripts/run-fixtures.js
const fixtures = ['fixture1', 'fixture2', 'fixture3']  // no ".json" here; support/index.js appends it
const cypress = require('cypress')

fixtures.forEach((fixtureName) => {
    cypress.run({
        reporter: 'junit',
        browser: 'chrome',
        config: {
            baseUrl: 'http://localhost:8080',
            video: true,
        },
        env: {
            fixture: fixtureName,
        },
    })
})
Run it with node /scripts/run-fixtures.
support/index.js is the same as above.
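One caveat worth adding (my note, not part of the answer above): cypress.run() returns a promise, so a plain forEach starts all the runs at once; if they need to run one after another, they can be awaited in sequence, for example:
const cypress = require('cypress')

const fixtures = ['fixture1', 'fixture2', 'fixture3']

async function runAll() {
    for (const fixtureName of fixtures) {
        // Wait for each Cypress run to finish before starting the next one.
        await cypress.run({
            env: { fixture: fixtureName },
        })
    }
}

runAll()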

protractor javascript execution context stack order

I'm trying to understand the order in which protractor executes according to the execution stack. What is the order in the execution stack after the global execution context (ec) is created and pushed? Is it?
stack
------
|spec1 ec|
|spec2 ec|
|spec3 ec|
|onPrepare ec|
|conf.js ec|
|global ec|
----------
I'm quite sure this is not correct because I'm just guessing here. Can someone shed some light on which execution contexts get created and when? Thanks.
I can offer some guidance, as far as my knowledge goes:
Protractor reads conf.js first, since that is what we invoke with protractor conf.js.
So the starting point is conf.js.
conf.js generally contains onPrepare, where you can keep environment details and report-generation options, either customized or pulled in from npm packages.
onPrepare is also one of the most useful parts of the config file, as it lets you define variables in one place and access them across the different spec.js files.
See example
Globals: It is possible to set globals from the Protractor config file with the help of params property:
exports.config = {
    // ...
    params: {
        glob: 'test'
    }
    // ...
};
You can use it in a spec like this:
browser.executeScript(function (glob) {
    // use passed variables on the page
    console.log(glob);
}, browser.params.glob);
Sample taken from here
conf.js, onPrepare, and globals are part of the setup and prerequisites for running the test cases that live in the specs (some of them being optional).
Once those have been created successfully, the specs are run in whatever way you define in conf.js: in parallel or sequentially, across different browsers.
Example:
multiCapabilities: [
    {
        shardTestFiles: true,
        maxInstances: 1,
        sequential: true,
        browserName: 'chrome',
        specs: ['specs/spec1.js', 'specs/spec2.js', 'specs/spec3.js']
    },
    {
        shardTestFiles: true,
        maxInstances: 1,
        sequential: true,
        browserName: 'chrome',
        specs: [
            'specs/spec4.js',
            'specs/spec5.js',
            'specs/spec6.js'
        ]
    }
]
You can also define suites such as regression, sanity, etc. and run them individually (a sketch of the suites config follows below):
protractor config.js --suite regression,sanity
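For completeness (not part of the original answer), a minimal sketch of how those suites could be declared in conf.js; the suite names and spec globs are placeholders:
exports.config = {
    // ...
    suites: {
        // run with: protractor conf.js --suite regression
        regression: ['specs/regression/**/*.spec.js'],
        sanity: ['specs/sanity/**/*.spec.js']
    }
    // ...
};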
For your question, the order is:
1) conf.js
2) globals & onPrepare
3) specs
I hope this clears things up.

Testing Angular with Gulp-mocha: "Window is not Defined"

I am setting up a project with Gulp to run unit tests with Mocha, including Angular tests. I have the basic setup working (indexOf, etc.); however, when I include angular-mocks I get this error or a node-module error:
ReferenceError in 'gulp-mocha': "Window is not defined"
I've tried including angular-module-mocks, using gulp-mocha-phantomjs... but the result is the same. (With mocha-phantomjs my error was 'Init timeout'.) I've seen many examples of configurations with Mocha and Angular or Gulp and Karma but have not yet found a solution for Gulp, Mocha and Angular alone.
I'm thinking of something similar to this Karma solution to correctly load angular-mocks by specifying it in a config file and forcing Gulp to load it (Angular testing with Karma: "module is not defined"). However, even if this would work, it seems like gulp-mocha does not support loading a configuration file (mocha.opts - https://github.com/sindresorhus/gulp-mocha/issues/26). I would be happy to hear a more straightforward solution.
I am using angular-mocks 1.2.22 and gulp-mocha 1.1.0.
Code snippets:
var mocha = require('gulp-mocha');

gulp.task('test', function () {
    return gulp.src('test/*.js', {read: false})
        .pipe(mocha({reporter: 'nyan', timeout: 400}));
});
test/test.js
var assert = require('assert');
var angular_mocks = require('angular-mocks'); //Fails only when this line is present
//tests
What finally worked for me with Gulp/Browserify/Mocha was using Karma and Mocha combined.
Specifically, I used gulp-karma, and defined the configuration at karma.config.js and used a dummy file for gulp.src as others have done:
var karma = require('gulp-karma');

gulp.task('test', function () {
    return gulp.src('./foobar.js').pipe(karma({
        configFile: 'karma.config.js',
        action: 'run'
    }))
    .on('error', handleErrors);
});
Then I used this karma.config.js file. I needed the npm modules karma-mocha, karma-chai, and karma-bro. (With only the first two, I was getting 'require is not defined'. Then of course I tried including karma-requirejs, but that does not work with Browserify. Then I tried karma-commonjs, which still didn't work. Then I tried karma-browserify, and got a strange error involving bundle() that no one seems to have solved (https://github.com/xdissent/karma-browserify/issues/46). Karma-bro did the trick.)
I also needed to preprocess each file referenced in the tests as well as the tests themselves. (For using phantomJS also include karma-phantomjs-launcher. And I am using the bower version of angular-mocks simply because it is more recent: v1.2.25 compared to 1.2.22 for npm - but the npm version might work.)
module.exports = function(config) {
    config.set({
        basePath: '',

        // frameworks to use
        frameworks: ['browserify', 'mocha', 'chai'],

        // list of files / patterns to load in the browser
        files: [
            'node_modules/angular/lib/angular.min.js',
            'bower_components/angular-mocks/angular-mocks.js',
            'source/javascript/controllers/*.js',
            'source/javascript/*.js',
            'test/*.js'
        ],

        reporters: ['progress'],
        port: 9876,
        colors: true,
        autoWatch: true,
        browsers: ['PhantomJS'],

        preprocessors: {
            'source/javascript/controllers/*.js': ['browserify'],
            'source/javascript/*.js': ['browserify'],
            'test/*.js': ['browserify']
        }
    });
};
And finally this test passes. At the end I needed to make sure the names of my modules and controllers were consistent (capitals etc.) to resolve 'Module not defined' errors. For debugging I replaced node_modules/angular/lib/angular.min.js with node_modules/angular/lib/angular.js in the files.
describe('Angular', function() {
    describe('App Controllers', function() {
        beforeEach(angular.mock.module('App'));

        describe('MessageCtrl', function() {
            it('should retrieve the correct amount of messages', angular.mock.inject(function($controller) {
                var scope = {},
                    ctrl = $controller('MessageCtrl', {$scope: scope});
                assert.equal(scope.messages.length, 2);
            }));
        });
    });
});
I do get this: 'WARNING: Tried to load angular more than once.' I can live with it.

Why is Karma refusing to serve my JSON fixture (which I'd like to use in my jasmine / angularjs tests)

As indicated in this stackoverflow answer, it looks like Karma will serve JSON fixtures. However, I've spent too many hours trying to get it to work in my environment. Reason: I'm doing angular testing and need to load mock HTTP results into the test, as Jasmine doesn't support any global setup/teardown with mock servers and stuff.
In my karma config file, I'm defining a fixture as so:
files: [
    // angular
    'angular/angular.min.js',
    'angular/angular-route.js',
    'angular/mock/angular-mocks.js',

    // jasmine jquery helper
    'jquery-1.10.2.min.js',
    'angular/jasmine-jquery.js',

    // our app
    '../public/js/FooApp.js',

    // our tests
    'angular/*-spec.js',

    // fixtures
    { pattern: 'node/mock/factoryResults.json',
      watched: 'true',
      served: 'true',
      included: 'false' }
]
Before I even attempt to use jasmine-jquery.js in my jasmine test to load the JSON, I see karma choking on trying to serve it:
...
DEBUG [web-server]: serving: /Users/XXX/FooApp/spec/node/mock/factoryResults.json
Firefox 25.0.0 (Mac OS X 10.8) ERROR
SyntaxError: missing ; before statement
at /Users/XXX/FooApp/spec/node/mock/factoryResults.json:1
...
Here's what factoryResults.json looks like:
{ "why": "WHY" }
Any idea what's going on here? I see plenty of examples on the web of folks successfully loading JSON into jasmine tests via karma fixtures. Karma can see the file; if I put the wrong path in my fixture block, I see an error stating that it couldn't find any files that match my fixture pattern. I've tried reformatting the .json file in different ways... Any ideas?
Your problem is that 'false' has to be a boolean, not a string.
There is already an open issue to validate the config better and catch mistakes like this.
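For reference, the corrected fixture entry with real booleans (paths carried over from the question) would be:
// fixtures
{ pattern: 'node/mock/factoryResults.json',
  watched: true,
  served: true,
  included: false }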
Also, you might write a simple "json" preprocessor (similar to karma-html2js) that would turn the JSON into valid JS and put it into some global namespace, so that you can keep the tests synchronous; see the sketch below.
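A rough sketch of what such an inline JSON preprocessor could look like; this is my own illustration, not an existing plugin, and the 'preprocessor:json' token and window.__json__ namespace are made-up names:
// karma.conf.js
var jsonToJs = function () {
    // Turn each .json file into JS that stores the parsed value on a global object.
    return function (content, file, done) {
        done(
            'window.__json__ = window.__json__ || {};\n' +
            'window.__json__["' + file.originalPath + '"] = ' + content + ';\n'
        );
    };
};

module.exports = function (config) {
    config.set({
        // ...
        plugins: [
            'karma-jasmine',
            { 'preprocessor:json': ['factory', jsonToJs] }
        ],
        preprocessors: {
            '**/*.json': ['json']
        }
    });
};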
I also needed json fixtures in my karma test suite.
I ended up just using the html2js preprocessor with json files as well as html.
karma.conf.js:
module.exports = function (config) {
    config.set({
        frameworks: ["jasmine"],
        files: [
            '**/*.js',
            '**/*.html',
            '**/*.json',
            '**/*.spec.js'
        ],
        plugins: [
            'karma-html2js-preprocessor'
        ],
        preprocessors: {
            '**/*.html': ['html2js'],
            '**/*.json': ['html2js']
        }
    });
};
Then it is just a matter of getting the json from the __html__ global.
e.g.
var exampleJson = __html__['example.json'];
var jsonObj = JSON.parse(exampleJson);
var exampleHtml = __html__['example.html'];
document.body.innerHTML = exampleHtml;
So, I had a lot of issues with jasmine-jquery and I got a pretty decent workaround.
It's a little hacky, but it works. Basically, I just create a function accessible on the window, then stack the JSON fixtures inside a little switch:
if (typeof window.fixtures === "undefined") {
    window.fixtures = {};
}

window.setFixture = function(type) {
    var json;

    if (type == "catalog") {
        json = { ... }
    }

    if (typeof type !== "undefined") {
        window.fixtures[type] = json;
    }

    return json;
}
Then, I can just stub it inline in the view:
describe "App.Models.Catalog", ->
it "provides the 'App.Models.Catalog' function", ->
expect(App.Models.Catalog).toEqual(jasmine.any(Function))
it "sets up a fixture", ->
setFixture("catalog")
console.log(fixtures["catalog"])
expect(fixtures["catalog"]).toBeDefined()
Boom, tests pass, and the object comes out in the log:
{
    catalog_id: '2212',
    merchant_id: '114',
    legacy_catalog_id: '2340',
    name: 'Sample Catalog',
    status: '1',
    description: 'Catalog Description '
}
Now, it's accessible within my test.
It's of course not perfect or ideal, but I kept hitting strange matchErrors and the like with the jasmine-jquery plugin, and it's simple enough (and fast) for me to paste in a couple of JSON blocks and get moving.
You also save yourself the time spent fiddling with the configuration and making changes to the Karma files list.
Anyone have any better suggestions or have any luck getting jasmine-jquery to work?

Output tests to browser with Karma

I am using Karma to test my project and can see tests passing and failing in the console window; however, how do I get these to show in the browser? The browser only has a green bar (even though a test is failing) with
Karma v0.10.2 - connected
written in it.
I have tried adding singleRun: false to the karma.config.js file.
The config file looks like this:
module.exports = function (config) {
    config.set({
        basePath: '../',
        files: [
            'app/lib/angular/angular.js',
            'app/lib/angular/angular-*.js',
            'test/lib/angular/angular-mocks.js',
            'app/js/**/*.js',
            'test/unit/**/*.js'
        ],
        autoWatch: true,
        singleRun: false,
        frameworks: ['jasmine'],
        browsers: ['Chrome'],
        plugins: [
            'karma-junit-reporter',
            'karma-chrome-launcher',
            'karma-firefox-launcher',
            'karma-jasmine'
        ],
        junitReporter: {
            outputFile: 'test_out/unit.xml',
            suite: 'unit'
        }
    })
}
matthias-schuetz wrote a plugin that claims to produce html test output.
https://npmjs.org/package/karma-htmlfile-reporter
Along with the instructions on the plugin page I had to include a reference to the plugin in the Karma config -
plugins: [
    'karma-htmlfile-reporter'
]
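For context, a sketch of how the reporter itself is typically enabled next to that plugin entry; the output path is only an example and the exact option names should be double-checked against the karma-htmlfile-reporter README:
reporters: ['progress', 'html'],

htmlReporter: {
    outputFile: 'test_out/units.html' // example path
}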
According to the documentation (even if it is not perfectly worded):
'singleRun: true' turns CI mode on; make it true, and you will see a red failed status in the top bars of the browsers.
There is no straightforward way to do this with Karma.
The "best" way to solve this problem would be to hunker down and write an HTML reporter for Karma (which would make a lot of us other Karma users very happy).
If this is too much work for you, the second best thing is to use the junit reporter, which generates an XML file. You can then post-process the XML file in some way that turns it into an HTML file, which you can then view in your browser.
I wanted to display HTML5 Web Notifications with Karma, so I wrote something quick to get it to work with Karma version 0.11. It might behave slightly differently with other versions. I load this script in with the rest of my application scripts; it stores the Karma test results, and after completion it determines the success of the run and then restores the original Karma functions so they are not changed when this script gets run again.
// store all my test results
var results = [];

// Wrap the karma result function
var resultFunc = window.__karma__.result;
window.__karma__.result = function (result) {
    // run the original function
    resultFunc(result);
    // push each result on my storage array
    results.push(result);
};

// wrap the karma complete function
var completeFunc = window.__karma__.complete;
window.__karma__.complete = function (result) {
    // run the original function
    completeFunc(result);
    // determine success
    var success = results.every(function (r) { return r.success });
    if (success) {
        // display a success notification
    }
    else {
        // display a test failure notification
    }
    // reset the result function
    window.__karma__.result = resultFunc;
    // reset the complete function
    window.__karma__.complete = completeFunc;
};
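To make the placeholders concrete, one way they might be filled in with the standard Web Notifications API (my assumption; the original snippet leaves this part out):
// Ask for permission once; without it, new Notification(...) does nothing visible.
if (window.Notification && Notification.permission !== 'granted') {
    Notification.requestPermission();
}

function notify(message) {
    if (window.Notification && Notification.permission === 'granted') {
        new Notification('Karma', { body: message });
    }
}

// Inside the wrapped complete handler:
//   success ? notify('All tests passed') : notify('Some tests failed');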
