Jest: Share Variables Between Test Files

I'll start with an example of how I set up my tests for a backend server. TL;DR at bottom.
This file represents my server:
// server.js
const express = require('express');

class BackendServer {
  constructor(backingFileDestination, database) {
    this.server = express();
    /* Set up server ... */
    /* Set up a mysql connection ... */
  }

  closeMySQLConnectionPool = () => {
    /* Close the mysql connection ... */
  };
}

module.exports = BackendServer;
In my package.json, I have the following:
"jest": {
"testEnvironment": "node",
"setupFilesAfterEnv": [
"<rootDir>/tests/setupTests.js"
]
}
Which allows me to use this setup file for my tests:
// setupTests.js
const { endpointNames } = require('../../src/common/endpointNames.js');
const BackendServer = require('../server.js');

const server = new BackendServer(
  '../backing_files_test/',
  'test',
);
const supertester = require('supertest')(server.server);

// drop, recreate, and populate the database once before any tests run
beforeAll(async () => {
  await supertester.post(endpointNames.DROP_ALL_TABLES);
  await supertester.post(endpointNames.CREATE_ALL_TABLES);
  await supertester.post(endpointNames.POPULATE_ALL_TABLES);
});

// clean up the local setupTests server instance after all the tests are done
afterAll(async () => {
  await server.closeMySQLConnectionPool();
});
Notice how I had to import the BackendServer class, instantiate it, and then use that instance.
Now, I have other test files, for example test1.test.js:
// test1.test.js
const { endpointNames } = require('../../src/common/endpointNames.js');
const BackendServer = require('../server.js');

const server = new BackendServer(
  '../backing_files_test/',
  'test',
);
const supertester = require('supertest')(server.server);

// clean up the local server instance after all tests are done
afterAll(async () => {
  await server.closeMySQLConnectionPool();
});

test('blah blah', () => {
  /* Some test ... */
});
The problem is that when I go to write test2.test.js, it will be the same as test1.test.js. For every test file, I need to instantiate a new server and then have a separate afterAll() call that cleans up that server's SQL connection. I can't move that afterAll() into setupTests.js because it needs to operate on test1.test.js's local server instance.
TL;DR: Each of my test files instantiates a new instance of my server. What I want is to instantiate the server once in setupTests.js and then simply use that instance in all my tests. Is there a good way to share this single instance between all my test files?

I was able to figure out a way to achieve what I wanted. It involves instantiating variables in setupTests.js and then exporting getters for them.
// setupTests.js
const { endpointNames } = require('../../src/common/endpointNames.js');
const BackendServer = require('../server.js');

const server = new BackendServer(
  '../backing_files_test/',
  'test',
);
const supertester = require('supertest')(server.server);

// drop, recreate, and populate the database once before any tests run
beforeAll(async () => {
  await supertester.post(endpointNames.DROP_ALL_TABLES);
  await supertester.post(endpointNames.CREATE_ALL_TABLES);
  await supertester.post(endpointNames.POPULATE_ALL_TABLES);
});

// clean up the local setupTests server instance after all the tests are done
afterAll(async () => {
  await server.closeMySQLConnectionPool();
});

const getServer = () => { // <==== ADD THESE 2 FUNCTIONS
  return server;
};

const getSupertester = () => { // <==== ADD THESE 2 FUNCTIONS
  return supertester;
};

module.exports = { getServer, getSupertester }; // <==== EXPORT THEM
I added a couple of functions to the end of setupTests.js that, when called, return whatever the local variables point to at the time. In this case, server and supertester are declared with const, so I probably could have exported them directly, but in other cases I wanted to share variables declared with let, so I kept the getter pattern.
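For example, here is a minimal sketch of why the getter pattern matters for a let binding (the currentUser variable here is hypothetical, not part of my real setup):

// sketch: a let binding that gets reassigned during setup
let currentUser = null;

beforeAll(() => {
  currentUser = { id: 1, name: 'test user' }; // reassigned before tests run
});

// module.exports = { currentUser } would capture the value at require time
// (null); a getter reads whatever the binding points to when it is called
const getCurrentUser = () => currentUser;

module.exports = { getCurrentUser };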
Now I have functions exported from setupTests.js that I can import in my test files like this:
// test1.test.js
const { endpointNames } = require('../../src/common/endpointNames.js');
const { getServer, getSupertester } = require('./setupTests.js');

test('blah blah', () => {
  /* Some test using getSupertester().post() or getServer().someFunction() ... */
});
So now I can have local variables inside setupTests.js that are accessible in my .test.js files.
This makes my whole testing process cleaner: I only set up and tear down one server, which means a single connection pool for my SQL server and no duplicated code to instantiate and clean up a new server in every .test.js file.
Cheers.

Related

How to mock NODE_ENV in unit test using Jest

I want to set NODE_ENV in one of the unit tests, but it's always set to test, so my tests fail.
loggingService.ts
...
const getTransport = () => {
  if (process.env.NODE_ENV !== "production") {
    let console = new transports.Console({
      format: format.combine(format.timestamp(), format.simple()),
    });
    return console;
  }
  const file = new transports.File({
    filename: "logFile.log",
    format: format.combine(format.timestamp(), format.json()),
  });
  return file;
};

logger.add(getTransport());

const log = (level: string, message: string) => {
  logger.log(level, message);
};

export default log;
loggingService.spec.ts
...
describe("production", () => {
beforeEach(() => {
process.env = {
...originalEnv,
NODE_ENV: "production",
};
console.log("test", process.env.NODE_ENV);
log(loglevel.INFO, "This is a test");
});
afterEach(() => {
process.env = originalEnv;
});
it("should call log method", () => {
expect(winston.createLogger().log).toHaveBeenCalled();
});
it("should not log to the console in production", () => {
expect(winston.transports.Console).not.toBeCalled();
});
it("should add file transport in production", () => {
expect(winston.transports.File).toBeCalledTimes(1);
});
});
...
How can I set process.env.NODE_ENV to production in my tests, preferably in the beforeEach, so that the if block in my service is false and the file transport is returned? I have omitted some code for the sake of brevity.
The core problem is that as soon as you import the file under test into your test suite, the code in it is evaluated and any implicitly invoked functions run immediately. That means logger.add(getTransport()) is called before hooks like beforeEach get a chance to set the environment variables.
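To make the ordering concrete, here is a minimal sketch of what happens when the test file runs (the import path is illustrative):

// Jest sets NODE_ENV to 'test' before the test file runs.
// Importing the service evaluates its top-level code immediately:
import log from "./loggingService";
// logger.add(getTransport()) has already executed at this point,
// so a beforeEach that later sets NODE_ENV = 'production' changes nothing.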
One reliable way around this is the following approach. First, assign process.env.NODE_ENV to a const in a separate file; let's call it environmentVariables.ts, with the following contents:
export const ENVIRONMENT = process.env.NODE_ENV;
We will then have to refactor getTransport to use this variable (the rest of the function stays the same; the import path depends on your layout):

import { ENVIRONMENT } from "./environmentVariables";

const getTransport = () => {
  if (ENVIRONMENT !== "production") {
In your test suite, you then mock out that file, which lets you control what ENVIRONMENT is set to. Note that '../src/environmentVariables' is an example path; substitute the actual location of the file in your project. Also make sure the mock sits outside of the describe block, preferably at the top of the file for readability:
jest.mock('../src/environmentVariables', () => ({
  ENVIRONMENT: 'production',
}));
Your unit tests will then execute with ENVIRONMENT set to production.
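Putting it together, a minimal sketch of the spec file (the '../src/...' paths are examples and need to match your layout):

// loggingService.spec.ts
jest.mock('../src/environmentVariables', () => ({
  ENVIRONMENT: 'production',
}));

import log from '../src/loggingService';
// the jest.mock factory above is hoisted, so when loggingService is
// evaluated it sees ENVIRONMENT === 'production' and getTransport()
// returns the file transport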

How to mock a function using Frisby and Jest to return custom response?

I'm trying to mock a function using Frisby and Jest.
Here are some details about my code:
dependencies
axios: "^0.26.0",
dotenv: "^16.0.0",
express: "^4.17.2"
devDependencies
frisby: "^2.1.3",
jest: "^27.5.1"
When I mock using Jest, the real response from the API is still returned, which is not what I want. I want it to return a fake result like this: { a: 'b' }.
How to solve it?
I have the following code:
// (API Fetch file) backend/api/fetchBtcCurrency.js
const axios = require('axios');

const URL = 'https://api.coindesk.com/v1/bpi/currentprice/BTC.json';

const getCurrency = async () => {
  const response = await axios.get(URL);
  return response.data;
};

module.exports = {
  getCurrency,
};

// (Model using fetch file) backend/model/cryptoModel.js
const fetchBtcCurrency = require('../api/fetchBtcCurrency');

const getBtcCurrency = async () => {
  const responseFromApi = await fetchBtcCurrency.getCurrency();
  return responseFromApi;
};

module.exports = {
  getBtcCurrency,
};
// (My test file) /backend/__tests__/cryptoBtc.test.js
require("dotenv").config();
const frisby = require("frisby");

const URL = "http://localhost:4000/";

describe("Testing GET /api/crypto/btc", () => {
  beforeEach(() => {
    jest.mock('../api/fetchBtcCurrency');
  });

  it('Verify if returns correct response with status code 200', async () => {
    const fetchBtcCurrency = require('../api/fetchBtcCurrency').getCurrency;
    fetchBtcCurrency.mockImplementation(() => (JSON.stringify({ a: 'b' })));

    const defaultExport = await fetchBtcCurrency();
    expect(defaultExport).toBe(JSON.stringify({ a: 'b' })); // This assert works

    await frisby
      .get(`${URL}api/crypto/btc`)
      .expect('status', 200)
      .expect('json', { a: 'b' }); // Integration test with Frisby does not work correctly.
  });
});
Response [
  {
    // I hid the lines to save screen space.
  }
  ->>>>>>> does not contain provided JSON [ {"a":"b"} ]
];
This is a classic lost reference problem.
Since you're using Frisby, judging by your test, it seems you're starting the server in parallel, correct? You first start your server with, say, npm start, then you run your tests with npm test.
The problem with that is: by the time your test starts, your server is already running. Since you started the server with the real fetchBtcCurrency.getCurrency, Jest can't do anything from that point on. Your server will keep pointing at the real module, not the mocked one.
Check this illustration: https://gist.githubusercontent.com/heyset/a554f9fe4f34101430e1ec0d53f52fa3/raw/9556a9dbd767def0ac9dc2b54662b455cc4bd01d/illustration.svg
The assertion on the import inside the test works because that import happens after the mock has replaced the real module.
You didn't share your app or server file, but if you create the server and start listening in the same module, with both "hanging on global" (i.e. called from the body of the script rather than from inside a function), you'll have to split them: one file that creates the server (attaching routes, middleware, etc.), and a separate file that just imports the first one and starts listening.
For example:
app.js
const express = require('express');
const { getCurrency } = require('./fetchBtcCurrency');

const app = express();

app.get('/api/crypto/btc', async (req, res) => {
  const currency = await getCurrency();
  res.status(200).json(currency);
});

module.exports = { app };
server.js
const { app } = require('./app');

app.listen(4000, () => {
  console.log('server is up on port 4000');
});
Then your start script runs the server file, but your test imports the app file. You don't start the server in parallel; you start and stop it as part of the test setup/teardown.
This gives Jest the chance to replace the real module with the mocked one before the server starts listening (at which point Jest loses control over it).
With that, your test could be:
cryptoBtc.test.js
require("dotenv").config();
const frisby = require("frisby");
const URL = "http://localhost:4000/";
const fetchBtcCurrency = require('./fetchBtcCurrency');
const { app } = require('./app');
jest.mock('./fetchBtcCurrency')
describe("Testing GET /api/crypto/btc", () => {
let server;
beforeAll((done) => {
server = app.listen(4000, () => {
done();
});
});
afterAll(() => {
server.close();
});
it('Verify if returns correct response with status code 200', async () => {
fetchBtcCurrency.getCurrency.mockImplementation(() => ({ a: 'b' }));
await frisby
.get(`${URL}api/crypto/btc`)
.expect('status', 200)
.expect('json', { a: 'b'});
});
});
Note that the order of imports doesn't matter: you can put the jest.mock call below the real import. Jest is smart enough to hoist mocks so they effectively come first.
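For illustration, this is roughly what that hoisting amounts to:

// what you write:
const fetchBtcCurrency = require('./fetchBtcCurrency');
jest.mock('./fetchBtcCurrency');

// what effectively runs after Jest's transform hoists the mock call:
jest.mock('./fetchBtcCurrency');
const fetchBtcCurrency = require('./fetchBtcCurrency');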

node Mqtt.js structuring code / best practices

So I've been searching for a long time for mqtt.js examples of structure and best practices and haven't found anything worthwhile. Thus [main]: how do you structure your mqtt.js code in your node/express application?
[1] The mqttjs/async-MQTT libraries provide some examples of connecting and handling messages, but in a real app with lots of subscriptions and publishes, how do you structure the code so that it initializes in app.js and uses the same client (returned from mqtt.connect) for all the sub/pub in different files?
[2] Following from [1], should my app use only one client for all the work, or can it use multiple clients as needed in multiple files? (Say I have 3 files: mqttInit, subscriber, publisher. If I use the init in subscriber and get a client, should I export it, or just make a new client instance in the publisher file?)
[3] The mqtt.js API provides only an on("message") handler, so messages for all subscribed topics arrive in one place, and I use a switch or if/else to route them. If we have a lot of topics, how do you manage this?
[4] My current setup is kind of messed up. This is the initializer file, let's say:
mqttService.js
const mqtt = require("mqtt");
const { readFileSync } = require("fs");
module.exports = class mqttService {
constructor() {
this.client = mqtt.connect("mqtt://xxxxxxxxxxx", {
cert: readFileSync(process.cwd() + "/certificates/client.crt"),
key: readFileSync(process.cwd() + "/certificates/client.key"),
rejectUnauthorized: false,
});
this.client.on("error", (err) => {
console.log(err);
});
this.client.once("connect", () => {
console.log("connected to MQTT server");
});
}
};
subscriber.js
This is the function (subscribe()) that I call in app.js to initialize the MQTT side:
const { sendDeviceStatus, sendSensorStatus } = require("../socketApi");
const { client } = new (require("./mqttService"))();

function subscribe() {
  let state = {
    timer: false,
  };
  ...
  let topics = {
    ....
  };

  client.subscribe([...]);

  client.on("message", async (topic, buffer) => {
    if (topic) {
      ...
    }
  });
}

module.exports = {
  subscribe,
  client,
};
publish.js
const { AsyncClient } = require("async-mqtt");

const _client = require("./subscribe").client;
const client = new AsyncClient(_client);

async function sendSensorList(daqId) {
  let returnVal = await client.publish(
    `${daqId}-GSL-DFC`,
    JSON.stringify(publishObject),
    { qos: 1 }
  );
  console.log(returnVal);
  return publishObject;
}
.....

module.exports = {
  sendSensorList,
  .......
};
So as you can see from the code above, everything is tangled together and messy, which is why I need some guidance on how you structure your code.
Thanks for reading; any info is much appreciated.

Export logging in Nodejs between files

I have two files in Node.js:
index.js
function.js
index.js is my main file, in which I call the functions inside function.js. In function.js I need to use logging; the problem is I couldn't figure out how to do it.
function.js
module.exports = {
  Exemplfunciton: async () => {
    app.log('#### This is just an exemple im trying to run')
  },
  checkCalcul: async (a, b) => {
    log(`The Val of A : ${a}, the Val of B: ${b}`)
    return a + b
  }
}
index.js
const functionToCall = require('./function.js')

module.exports = app => {
  functionToCall.Exemplfunciton()
  functionToCall.checkCalcul(4, 5)
}
will return:
app is not defined
If I try it without the app in function.js, it returns:
log is not defined
I only need to use app.log between the two files (my main index.js and function.js).
Pass as an argument
module.exports = app => {
  functionToCall.Exemplfunciton(app) // add here
}
Then consume
module.exports = {
  Exemplfunciton: async (app) => { // add here
    app.log('#### This is just an exemple im trying to run')
  }
}
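Applying the same pattern to the question's other function, checkCalcul would receive app as well (a sketch extending the snippets above):

// function.js
module.exports = {
  Exemplfunciton: async (app) => {
    app.log('#### This is just an exemple im trying to run')
  },
  checkCalcul: async (app, a, b) => {
    app.log(`The Val of A : ${a}, the Val of B: ${b}`)
    return a + b
  }
}

// index.js
module.exports = app => {
  functionToCall.Exemplfunciton(app)
  functionToCall.checkCalcul(app, 4, 5)
}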
To log in Node.js, you should use console https://nodejs.org/api/console.html
Example
module.exports = {
  ExampleFunction: async () => {
    console.log('#### This is just an example I\'m trying to run')
  }
}

const functionToCall = require('./function.js')
functionToCall.ExampleFunction() // logs: #### This is just an example I'm trying to run
Consider extracting the log functionality out into its own file that can be referenced by function.js, index.js, and anything else in your app. For example:
logger.js
module.exports = {
  log: function() {
    /* aggregate logs and send to your logging service, like TrackJS.com */
  }
}
function.js
var logger = require("./logger.js");

module.exports = {
  exampleFunction: function() {
    logger.log("foo bar");
  }
};
index.js
var functions = require("./function.js");
var logger = require("./logger.js");

functions.exampleFunction();
logger.log("foo");
You should send the logs off to a service like TrackJS to aggregate, report, and alert you to production problems.

Outputting file details using ffprobe in ffmpeg AWS Lambda layer

I am trying to output the details of an audio file with ffmpeg using the ffprobe option, but it just returns null at the moment. I have added the ffmpeg layer in Lambda. Can anyone spot why this is not working?
const { spawnSync } = require("child_process");
const { readFileSync, writeFileSync, unlinkSync } = require("fs");
const util = require('util');
var fs = require('fs');
let path = require("path");
exports.handler = (event, context, callback) => {
spawnSync(
"/opt/bin/ffprobe",
[
`var/task/myaudio.flac`
],
{ stdio: "inherit" }
);
};
This is the official AWS Lambda layer I am using; it is a great project but a little lacking in documentation.
https://github.com/serverlesspub/ffmpeg-aws-lambda-layer
First of all, I would recommend using NodeJS 8.10 over NodeJS 6.10 (which will soon be EOL, although AWS is unclear on how long it will be supported).
Also, I would not use the old style handler with a callback.
A working example is below. Since it downloads a file from the internet (I couldn't be bothered to create a deployment package with the file included), give it a bit more time to run.
const { spawnSync } = require('child_process');
const util = require('util');
var fs = require('fs');
let path = require('path');
const https = require('https');

exports.handler = async (event) => {
  const source_url = 'https://upload.wikimedia.org/wikipedia/commons/b/b2/Bell-ring.flac';
  const target_path = '/tmp/test.flac';

  async function downloadFile() {
    return new Promise((resolve, reject) => {
      const file = fs.createWriteStream(target_path);
      const request = https.get(source_url, function(response) {
        const stream = response.pipe(file);
        stream.on('finish', () => { resolve(); });
      });
    });
  }

  await downloadFile();

  const test = spawnSync('/opt/bin/ffprobe', [
    target_path
  ]);
  console.log(test.output.toString('utf8'));

  const response = {
    statusCode: 200,
    body: JSON.stringify([test.output.toString('utf8')]),
  };
  return response;
};
NB! In production, be sure to generate a unique temporary file name, as the instances a Lambda function runs on are often shared from invocation to invocation; you don't want multiple invocations stepping on each other's files. When done, delete the temporary file, otherwise you might run out of free space on the instance executing your functions. The /tmp folder can hold 512 MB, so it can fill up fast if you work with many large flac files.
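For instance, a minimal sketch of that advice (the naming scheme is just an example):

const { randomBytes } = require('crypto');
const os = require('os');
const path = require('path');
const fs = require('fs');

// unique name per invocation so reused containers don't collide
const target_path = path.join(os.tmpdir(), `in-${randomBytes(8).toString('hex')}.flac`);

try {
  // ... download to target_path and run ffprobe on it, as above ...
} finally {
  // reclaim /tmp space for later invocations on the same instance
  if (fs.existsSync(target_path)) fs.unlinkSync(target_path);
}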
I'm not fully familiar with this layer; however, from looking at the git repo of the thumbnail-builder, it looks like child_process returns a promise, so you should wait for its result using .then(), otherwise it returns null because it doesn't wait for the result.
So try something like:
return spawnSync(
  "/opt/bin/ffprobe",
  [
    `var/task/myaudio.flac`
  ],
  { stdio: "inherit" }
).then(result => {
  return result;
})
.catch(error => {
  // handle error
});
