I'm trying to make sure my app gets properly destroyed after all my Jest tests have run, but I'm running into some very strange behaviour trying to use Jest's global teardown config value.
Here's the situation: my app creates a database connection. It also has a destroy method that closes the database connection. This works.
I have a single test that starts the server and runs a query against the database connection. In my global teardown function I call app.destroy(), but the process hangs.
If I comment out the destroy call in the global teardown function and put app.destroy() in my test after the query, Jest doesn't hang and closes like it's supposed to. I can also put afterAll(() => app.destroy()) in my test file and things work properly.
Here is my jest.config.js
module.exports = {
  testEnvironment: 'node',
  roots: [
    '<rootDir>/src'
  ],
  transform: {
    '^.+\\.tsx?$': 'ts-jest'
  },
  testRegex: '(/__tests__/.*|(\\.|/)(test|spec))\\.tsx?$',
  moduleFileExtensions: [
    'ts',
    'tsx',
    'js',
    'jsx',
    'json',
    'node'
  ],
  globalSetup: '<rootDir>/src/testSetup.ts',
  globalTeardown: '<rootDir>/src/testTeardown.ts'
};
Here is the test (it's currently the only test in the app):
import app from '../..';
describe('User router', () => {
  it('Should respond with an array of user objects', async () => {
    await app.models.User.query();
  });
});
And here is the global teardown in <rootDir>/src/testTeardown.ts:
import app from './index';
module.exports = async function testTeardown() {
  await app.destroy();
};
Using the code above, the process hangs after the tests finish. I've tried adding a console.log to testTeardown and to the end of the test, and the logs happen in the correct order: test logs, then teardown logs. However, if I move app.destroy() into my test it works perfectly:
import app from '../..';
describe('User router', () => {
  it('Should respond with an array of user objects', async () => {
    await app.models.User.query();
    await app.destroy();
  });
});
This also works:
import app from '../..';
afterAll(() => app.destroy());
describe('User router', () => {
  it('Should respond with an array of user objects', async () => {
    await app.models.User.query();
  });
});
Why is this happening?
Also just for shits and giggles I tried setting a global._app in the test and then checking it in the teardown handler, but it was undefined. Do Jest's setup/teardown functions even get run in the same process as the tests?
No: Jest's globalSetup and globalTeardown files don't necessarily run in the same process as your tests. Jest parallelises your tests and runs each test file in a separate worker process, but there is only one global setup/teardown phase for the combined set of test files.
You can use setupFilesAfterEnv to add a file that gets run in-process with each test file. (Plain setupFiles runs before the test framework is installed in the environment, so hook globals like afterAll aren't defined there yet.) In that file you can put:
afterAll(() => app.destroy());
Your Jest config then becomes:
module.exports = {
  testEnvironment: 'node',
  roots: [
    '<rootDir>/src'
  ],
  transform: {
    '^.+\\.tsx?$': 'ts-jest'
  },
  testRegex: '(/__tests__/.*|(\\.|/)(test|spec))\\.tsx?$',
  moduleFileExtensions: [
    'ts',
    'tsx',
    'js',
    'jsx',
    'json',
    'node'
  ],
  setupFilesAfterEnv: ['<rootDir>/src/testSetup.ts']
};
For recent versions of Jest, include this configuration option in your Jest configuration file:
// globalSetup: '<rootDir>/tests_setup/testSetup.ts',
// globalTeardown: '<rootDir>/tests_setup/testTearDown.ts',
setupFilesAfterEnv: [
  '<rootDir>/tests_setup/testSetup.ts',
  '<rootDir>/tests_setup/testTearDown.ts',
],
With this, Jest will load the setup and teardown files before each test suite. Note that Jest only requires these files and never calls their exports, so each file must actually invoke its registration function (or register its beforeAll/afterAll hooks at the top level), i.e.:
const setupTests = () => {
  beforeAll(async () => {
    console.log('Test setup');
    mongoose.connection.on('connected', () => {
      console.log('db connection is now open');
    });
    // await these, otherwise beforeAll finishes before the connection is ready
    await mockMongoose.prepareStorage();
    await mongoose.connect('mongodb://localhost/test');
  });
};
// register the hook when Jest requires this file
setupTests();
export default setupTests;
and
const tearDownTests = () => {
  afterAll(async () => {
    console.log('Finished on tests');
    await mongoose.disconnect();
    await mockMongoose.killMongo();
  });
};
// register the hook when Jest requires this file
tearDownTests();
export default tearDownTests;
Related
I want to set NODE_ENV in one of my unit tests, but it's always set to test, so my tests fail.
loggingService.ts
...
const getTransport = () => {
  if (process.env.NODE_ENV !== "production") {
    let console = new transports.Console({
      format: format.combine(format.timestamp(), format.simple()),
    });
    return console;
  }
  const file = new transports.File({
    filename: "logFile.log",
    format: format.combine(format.timestamp(), format.json()),
  });
  return file;
};
logger.add(getTransport());
const log = (level: string, message: string) => {
  logger.log(level, message);
};
export default log;
loggingService.spec.ts
...
describe("production", () => {
  beforeEach(() => {
    process.env = {
      ...originalEnv,
      NODE_ENV: "production",
    };
    console.log("test", process.env.NODE_ENV);
    log(loglevel.INFO, "This is a test");
  });
  afterEach(() => {
    process.env = originalEnv;
  });
  it("should call log method", () => {
    expect(winston.createLogger().log).toHaveBeenCalled();
  });
  it("should not log to the console in production", () => {
    expect(winston.transports.Console).not.toBeCalled();
  });
  it("should add file transport in production", () => {
    expect(winston.transports.File).toBeCalledTimes(1);
  });
});
...
How can I set process.env.NODE_ENV to "production" in my tests (preferably in the beforeEach) so that the if block in my service is false and the file transport is returned? I have omitted some code for the sake of brevity.
The core problem you are facing is that as soon as you import the file under test into your test suite, its module-level code is evaluated immediately and any implicitly invoked functions are executed. That means logger.add(getTransport()); runs before hooks like beforeEach have a chance to set the environment variables.
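The timing issue can be shown in a few lines of plain Node (a minimal sketch with no winston involved): a value computed at module load captures whatever NODE_ENV is at that moment, and a later beforeEach-style change has no effect on it.

```javascript
// Module-level code runs once, at require time, so it sees whatever
// NODE_ENV is at that moment, not what a later beforeEach sets.
process.env.NODE_ENV = 'test';

// Stand-in for `logger.add(getTransport())` executing at module load:
const transportChosenAtImport =
  process.env.NODE_ENV !== 'production' ? 'console' : 'file';

// A beforeEach-style change afterwards is too late:
process.env.NODE_ENV = 'production';

console.log(transportChosenAtImport); // still 'console'
```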
One reliable way to get around this is the following approach:
You will first need to assign the process.env.NODE_ENV environment variable to a const variable within another file. Let's call it environmentVariables.ts, and the following will be its contents:
export const ENVIRONMENT = process.env.NODE_ENV;
We will then have to refactor getTransport to use this variable as follows:
const getTransport = () => {
  if (ENVIRONMENT !== "production") {
In your test suite, you will then have to mock out this module, which lets you control what the ENVIRONMENT variable is set to. Note that ../src/environmentVariables is an example path; substitute the actual location of the file. Additionally, make sure the mock is outside of the describe block, preferably above it for readability:
jest.mock('../src/environmentVariables', () => ({
  ENVIRONMENT: 'production',
}));
Your unit tests will then execute with the ENVIRONMENT being production.
I am trying to build a test suite with Mocha and mongodb-memory-server (as an in-memory db). I am trying to implement it as shown below.
Project structure:
test runner: (in package.json)
"scripts": {
  "test": "mocha 'app/**/*.spec.js' --recursive --exit"
},
So first I need to initialise the in-memory MongoDB; that's why I am using global.spec.js, which looks like this:
const { MongoMemoryServer } = require("mongodb-memory-server");
const mongoose = require("mongoose");
mongoose.set("usePushEach", true);
let mongoServer;
before(async function () {
  // allow extra time for the mongod binary download on first run
  this.timeout(30 * 1000);
  mongoServer = new MongoMemoryServer();
  const mongoUri = await mongoServer.getUri();
  await mongoose.connect(mongoUri, {});
  process.env.AGENDA_DB_URI = mongoUri;
});
after(function () {
  mongoose.disconnect();
  mongoServer.stop();
});
and a test-setup.js file which looks like this,
const { MongoMemoryServer } = require("mongodb-memory-server");
(async function () {
  // trigger downloading the mongodb executable on first run
  const mongoServer = new MongoMemoryServer();
  await mongoServer.getUri();
  mongoServer.stop();
})()
  .then(() => {
    process.exit(0);
  })
  .catch(err => {
    console.error(err);
    process.exit(1);
  });
All *.spec.js files live inside the modules folder; in simple terms, each folder inside modules has one .spec.js file. If I run this using the npm run test command, it throws an error that looks like this:
1) "before all" hook
0 passing (448ms)
1 failing
1) "before all" hook:
Uncaught Error: TypeError: logWarnFn is not a function
I believe this logWarnFn error is coming from i18n, but when I start the server normally it works fine.
versions:
"mongodb-memory-server": "^6.6.2",
"mocha": "^6.0.2",
"i18n": "0.8.3",
I'll start with an example of how I set up my tests for a backend server. TL;DR at bottom.
This file represents my server:
//server.js
const express = require('express');
class BackendServer {
  constructor(backingFileDestination, database) {
    this.server = express();
    /* Set up server ... */
    /* Set up a mysql connection ... */
  }
  closeMySQLConnectionPool = () => {
    /* Close the mysql connection ... */
  };
}
module.exports = BackendServer;
In my package.json, I have the following:
"jest": {
  "testEnvironment": "node",
  "setupFilesAfterEnv": [
    "<rootDir>/tests/setupTests.js"
  ]
}
Which allows me to use this setup file for my tests:
//setupTests.js
const { endpointNames } = require('../../src/common/endpointNames.js');
const BackendServer = require('../server.js');
const server = new BackendServer(
  '../backing_files_test/',
  'test',
);
const supertester = require('supertest')(server.server);

// drop, recreate, and populate the database once before any tests run
beforeAll(async () => {
  await supertester.post(endpointNames.DROP_ALL_TABLES);
  await supertester.post(endpointNames.CREATE_ALL_TABLES);
  await supertester.post(endpointNames.POPULATE_ALL_TABLES);
});

// clean up the local setupTests server instance after all the tests are done
afterAll(async () => {
  await server.closeMySQLConnectionPool();
});
Notice how I had to import the BackendServer class and instantiate it, then use that instance.
Now, I have other test files, for example test1.test.js:
//test1.test.js
const { endpointNames } = require('../../src/common/endpointNames.js');
const BackendServer = require('../server.js');
const server = new BackendServer(
  '../backing_files_test/',
  'test',
);
const supertester = require('supertest')(server.server);

// clean up the local server instance after all tests are done
afterAll(async () => {
  await server.closeMySQLConnectionPool();
});

test('blah blah', () => {
  /* Some test ... */
});
The problem is that when I go to write test2.test.js, it will be identical to test1.test.js. For every test file, I need to instantiate a new server and then have a separate afterAll() call that cleans up that server's SQL connection. I can't stick that afterAll() inside setupTests.js because it needs to operate on test1.test.js's local server instance.
TL;DR: Each of my test files instantiates a new instance of my server. What I want is to instantiate the server once in setupTests.js and then simply use that instance in all my tests. Is there a good way to share this single instance between all my test files?
I was able to figure out a way to achieve what I wanted. It involves instantiating variables in setupTests.js and then exporting getters for them.
//setupTests.js
const { endpointNames } = require('../../src/common/endpointNames.js');
const BackendServer = require('../server.js');
const server = new BackendServer(
  '../backing_files_test/',
  'test',
);
const supertester = require('supertest')(server.server);

// drop, recreate, and populate the database once before any tests run
beforeAll(async () => {
  await supertester.post(endpointNames.DROP_ALL_TABLES);
  await supertester.post(endpointNames.CREATE_ALL_TABLES);
  await supertester.post(endpointNames.POPULATE_ALL_TABLES);
});

// clean up the local setupTests server instance after all the tests are done
afterAll(async () => {
  await server.closeMySQLConnectionPool();
});

const getServer = () => { // <==== ADD THESE 2 FUNCTIONS
  return server;
};
const getSupertester = () => { // <==== ADD THESE 2 FUNCTIONS
  return supertester;
};
module.exports = { getServer, getSupertester }; // <==== EXPORT THEM
I added a couple functions to the end of setupTests.js that, when called, will return whatever the local variables point to at the time. In this case, server and supertester are declared with const, so I could have exported them directly I think, but in other cases I had some variables that I wanted to share that were declared with let, so I left it this way.
Now I have functions exported from setupTests.js that I can import in my test files like this:
//test1.test.js
const { endpointNames } = require('../../src/common/endpointNames.js');
const { getServer, getSupertester } = require('./setupTests.js');
test('blah blah', () => {
  /* Some test using getSupertester().post() or getServer().someFunction() ... */
});
So now I can have local variables inside setupTests.js that are accessible in my .test.js files.
This makes my whole testing process cleaner because I only need to set up and tear down one server, which means only one connection pool for my SQL server and no duplicated code to instantiate and clean up a new server in every .test.js file.
Cheers.
I'm trying to switch from Mocha and Chai to Jest. In my current setup I'm also using chai-files to compare the contents of two files:
import chai, { expect } from 'chai';
import chaiFiles, { file } from 'chai-files';
import fs from 'fs-extra';
import { exec } from 'child-process-promise';
chai.use(chaiFiles);
const tempConfigFile = './config.xml'; // the temporary copy created in the test below
describe('cli', () => {
  before(() => {
    process.chdir(__dirname);
  });
  it('should run', async () => {
    // make a copy of entry file
    fs.copySync('./configs/entry/config.version-and-build.xml', './config.xml');
    // executes code that changes temp files
    await exec('../dist/cli.js -v 2.4.9 -b 86');
    // checks if target file and change temp file are equal
    expect(file('./config.xml')).to.equal(file('./configs/expected/config.version-and-build.to.version-and-build.xml'));
  });
  afterEach(() => {
    if (fs.existsSync(tempConfigFile)) {
      fs.removeSync(tempConfigFile);
    }
  });
});
How should this be done in Jest? Will I need to load both files and compare the content?
Yes, simply load the contents of each like so:
expect(fs.readFileSync(actualPath)).toEqual(fs.readFileSync(expectedPath));
It seems Protractor doesn't provide any out-of-the-box solution for starting a server before it runs. Having to run multiple commands before functional tests can run is a bad user experience and bad for automated testing.
Angular-cli has its own solution that is rather complicated, which this plugin claims to duplicate, although it doesn't work for me and may be unmaintained. https://www.npmjs.com/package/protractor-webpack
EDIT: BETTER SOLUTION ACCEPTED BELOW
I came up with a solution using child_process.exec that seems to work well, although I don't like it very much. I'd like to share it in case anyone needs it and to see if anyone can come up with a better solution.
Launch the process in the beforeLaunch hook of protractor:
beforeLaunch: () => {
  webpackServerProcess = exec(`webpack-dev-server --port=3003 --open=false`, null, () => {
    console.log(`Webpack Server process reports that it exited. It's possible a server was already running on port 3003`);
  });
},
Then above the configuration block we set up the exit handlers to make positively sure that server gets killed when we are done.
let webpackServerProcess; // Set below in beforeLaunch hook
function cleanUpServer(eventType) {
  console.log(`Server Cleanup caught ${eventType}, killing server`);
  if (webpackServerProcess) {
    webpackServerProcess.kill();
    console.log(`SERVER KILLED`);
  }
}
[`exit`, `SIGINT`, `SIGUSR1`, `SIGUSR2`, `uncaughtException`].forEach((eventType) => {
  process.on(eventType, cleanUpServer.bind(null, eventType));
});
The various event listeners are needed to handle Ctrl+C and situations where the process is killed by ID. It's strange that Node does not provide a single event that encompasses all of these.
Protractor also has onCleanUp that will run after all the specs in the file have finished.
And you are doing the right thing by keeping a reference to your process so that you can kill it later.
let webpackServerProcess;

beforeLaunch: () => {
  webpackServerProcess = exec('blah'); // you could use spawn instead of exec
},
onCleanUp: () => {
  process.kill(webpackServerProcess.pid);
  // or webpackServerProcess.kill();
}
Since you are launching the server process with child_process.exec and not in a detached state, it will usually be terminated along with the main process, because Ctrl+C delivers SIGINT to the whole foreground process group. That isn't guaranteed for every way the process can die, though, so explicit cleanup is still the safer option.
I found a much more reliable way to do it using the webpack-dev-server node api. That way no separate process is spawned and we don't have to clean anything. Also, it blocks protractor until webpack is ready.
beforeLaunch: () => {
  return new Promise((resolve, reject) => {
    new WebpackDevServer(webpack(require('./webpack.config.js')()), {
      // Do stuff
    }).listen(APP_PORT, '0.0.0.0', function (err) {
      if (err) {
        console.log('webpack dev server error is ', err);
        reject(err);
      } else {
        resolve();
      }
    }).on('error', (error) => {
      console.log('dev server error ', error);
      reject(error);
    });
  });
},
// conf.js
var jasmineReporters = require('jasmine-reporters');
var Jasmine2HtmlReporter = require('protractor-jasmine2-html-reporter');
const path = require('path');
const WebpackDevServer = require('webpack-dev-server');
const webpack = require('webpack');
let webpackDevServer; // assigned in beforeLaunch, closed in onCleanUp
exports.config = {
  framework: 'jasmine',
  //seleniumAddress: 'http://localhost:4444/wd/hub',
  specs: ['Test/spec.js'],
  directConnect: true,
  // Capabilities to be passed to the webdriver instance.
  capabilities: {
    'browserName': 'chrome'/*,
    chromeOptions: {
      args: [ '--headless','--log-level=1', '--disable-gpu', '--no-sandbox', '--window-size=1920x1200' ]
    }*/
  },
  beforeLaunch: () => {
    return new Promise(resolve => {
      const compiler = webpack(require('./webpack.config.js'));
      webpackDevServer = new WebpackDevServer(compiler, {
        stats: 'errors-only'
      });
      webpackDevServer.listen(0, 'localhost', () => {
        // `server.listeningApp` is the underlying http server
        // in `webpack-dev-server#~2.0.0`
        const address = webpackDevServer.listeningApp.address();
        exports.config.baseUrl = `http://localhost:${address.port}`;
        resolve();
      });
    });
  },
  onPrepare: function() {
    jasmine.getEnv().addReporter(new jasmineReporters.JUnitXmlReporter({
      consolidateAll: true,
      filePrefix: 'guitest-xmloutput',
      savePath: 'reports'
    }));
    jasmine.getEnv().addReporter(new Jasmine2HtmlReporter({
      savePath: 'reports/',
      screenshotsFolder: 'images',
      takeScreenshots: true,
      takeScreenshotsOnlyOnFailures: true,
      cleanDestination: false,
      fileName: 'TestReport'
    }));
  },
  onCleanUp: () => {
    // close the dev server we started in beforeLaunch
    webpackDevServer.close();
  },
};