I have two files in Node.js:
index.js
function.js
index.js is my main file, from which I call the functions inside function.js. In function.js I need to use logging; the problem is I haven't figured out how to use it.
function.js
module.exports = {
  Exemplfunciton: async () => {
    app.log('#### This is just an exemple im trying to run')
  },
  checkCalcul: async (a, b) => {
    log(`The Val of A : ${a}, the Val of B: ${b}`)
    return a + b
  }
}
index.js
const functionToCall = require('./function.js')

module.exports = app => {
  functionToCall.Exemplfunciton()
  functionToCall.checkCalcul(4, 5)
}
This will return:
app is not defined
I tried it without the app in function.js, and it returned:
log not defined
I only need to use app.log across the functions (my main file, index.js, and function.js).
Pass as an argument
module.exports = app => {
functionToCall.Exemplfunciton(app) // add here
}
Then consume
module.exports = {
Exemplfunciton: async (app) => { // add here
app.log('#### This is just an exemple im trying to run')
}
}
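If checkCalcul needs logging too, the same pattern applies; here is a quick sketch building on the question's own code (hypothetical, with app passed to both functions):
function.js
module.exports = {
  Exemplfunciton: async (app) => {
    app.log('#### This is just an exemple im trying to run')
  },
  checkCalcul: async (app, a, b) => {
    app.log(`The Val of A : ${a}, the Val of B: ${b}`)
    return a + b
  }
}
index.js
const functionToCall = require('./function.js')

module.exports = app => {
  functionToCall.Exemplfunciton(app)
  functionToCall.checkCalcul(app, 4, 5)
}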
To log in Node.js, you should use console https://nodejs.org/api/console.html
Example
module.exports = {
ExampleFunction: async () => {
console.log('#### This is just an example I\'m trying to run')
}
}
const functionToCall = require('./function.js')
functionToCall.ExampleFunction() // logs #### This is just an example I\'m trying to run
Consider extracting the log functionality out into its own file that can be referenced by function.js, index.js, and anything else in your app. For example:
logger.js
module.exports = {
log: function() {
/* aggregate logs and send to your logging service, like TrackJS.com */
}
}
function.js
var logger = require("./logger.js");
module.exports = {
exampleFunction: function() {
logger.log("foo bar");
}
};
index.js
var functions = require("./function.js");
var logger = require("./logger.js");
functions.exampleFunction();
logger.log("foo");
You should send the logs off to a service like TrackJS to aggregate, report, and alert you to production problems.
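Until a logging service is wired in, logger.js could simply forward to the console as a placeholder; a minimal sketch (not part of the original answer):
// logger.js (placeholder implementation)
module.exports = {
  log: function (...args) {
    // Replace this console call with a send to your logging/aggregation
    // service (e.g. TrackJS) once it is set up.
    console.log(new Date().toISOString(), ...args);
  }
};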
Related
I am trying to write unit tests using Jest for a Node.js project. It imports all of its modules using require.main.require.
Below is a simulation of the issue. The code can be found here: https://stackblitz.com/edit/node-jest-demo?file=index.js
I have the following test file in my root directory, in which I import index.js:
./sample.pass.test.js
const { checkUser } = require('./index');
console.log(checkUser); // This is purely to check if I can access checkUser from this file or not
describe('Testing...', () => {
it('Should pass', () => {
expect(0).toBe(0);
});
});
In my index.js I am importing another function using require.main.require
const { getUserById } = require.main.require('./models/UserModel');
function checkUser(id) {
const user = getUserById(id);
return user ? 'Found' : 'Not Found';
}
module.exports.checkUser = checkUser;
The above test case passes. But if I place the same test file in some other directory (like __tests__), it fails.
E.g.: ./__tests__/sample.fail.test.js
Notice that here I adjusted the require statement for index, since it is now one level up:
const { checkUser } = require('../index');
console.log(checkUser);
describe('Testing...', () => {
it('Should pass', () => {
expect(0).toBe(0);
});
});
The result shows it is unable to access UserModel.
Cannot find module './models/UserModel' from '__tests__/sample.fail.test.js'
Require stack:
index.js
__tests__/sample.fail.test.js
> 1 | const { getUserById } = require.main.require('./models/UserModel');
| ^
2 |
3 | function checkUser(id) {
4 | const user = getUserById(id);
What could be the solution in this case?
Thanks in advance!
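For what it's worth, the error above ("Cannot find module './models/UserModel' from '__tests__/sample.fail.test.js'") suggests that under Jest the require.main.require call is being resolved relative to the test file rather than relative to index.js, which is why the relative path only works while the test sits in the project root. A minimal sketch of one possible workaround, switching to a plain relative require (an assumption, not something from the original post):
index.js
// Resolve UserModel relative to this file instead of via require.main,
// so the path no longer depends on which file Jest treats as the entry point.
const { getUserById } = require('./models/UserModel');

function checkUser(id) {
  const user = getUserById(id);
  return user ? 'Found' : 'Not Found';
}

module.exports.checkUser = checkUser;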
I'm trying to mock a function using Frisby and Jest.
Here are some details about my code:
dependencies
axios: "^0.26.0",
dotenv: "^16.0.0",
express: "^4.17.2"
devDependencies
frisby: "^2.1.3",
jest: "^27.5.1"
When I mock using Jest, the real response from the API is still returned, but I don't want that. I want it to return a fake result like this: { a: 'b' }.
How can I solve this?
I have the following code:
// (API Fetch file) backend/api/fetchBtcCurrency.js
const axios = require('axios');
const URL = 'https://api.coindesk.com/v1/bpi/currentprice/BTC.json';
const getCurrency = async () => {
const response = await axios.get(URL);
return response.data;
};
module.exports = {
getCurrency,
};
// (Model using fetch file) backend/model/cryptoModel.js
const fetchBtcCurrency = require('../api/fetchBtcCurrency');
const getBtcCurrency = async () => {
const responseFromApi = await fetchBtcCurrency.getCurrency();
return responseFromApi;
};
module.exports = {
getBtcCurrency,
};
// (My test file) /backend/__tests__/cryptoBtc.test.js
require("dotenv").config();
const frisby = require("frisby");
const URL = "http://localhost:4000/";
describe("Testing GET /api/crypto/btc", () => {
beforeEach(() => {
jest.mock('../api/fetchBtcCurrency');
});
it('Verify if returns correct response with status code 200', async () => {
const fetchBtcCurrency = require('../api/fetchBtcCurrency').getCurrency;
fetchBtcCurrency.mockImplementation(() => (JSON.stringify({ a: 'b'})));
const defaultExport = await fetchBtcCurrency();
expect(defaultExport).toBe(JSON.stringify({ a: 'b'})); // This assert works
await frisby
.get(`${URL}api/crypto/btc`)
.expect('status', 200)
.expect('json', { a: 'b'}); // Integration test with Frisby does not work correctly.
});
});
Response[
{
I hid the lines to save screen space.
}
->>>>>>> does not contain provided JSON [ {"a":"b"} ]
];
This is a classic lost reference problem.
Since you're using Frisby, judging by your test, it seems you're starting the server in parallel, correct? You first start your server with, say, npm start, then you run your tests with npm test.
The problem with that is: by the time your test starts, your server is already running. Since you started the server with the real fetchBtcCurrency.getCurrency, Jest can't do anything from that point on; the server will keep pointing at the real module, not the mocked one.
Check this illustration: https://gist.githubusercontent.com/heyset/a554f9fe4f34101430e1ec0d53f52fa3/raw/9556a9dbd767def0ac9dc2b54662b455cc4bd01d/illustration.svg
The reason the assertion on the import inside the test works is that the import happens after the mock has replaced the real module.
You didn't share your app or server file, but if you are creating the server and listening in the same module, and those calls are "hanging on global" (i.e. made from the body of the script rather than from inside a function), you'll have to split them: you'll need one file that creates the server (attaching any routes/middleware/etc. to it), and a separate file that just imports the first one and starts listening.
For example:
app.js
const express = require('express');
const { getCurrency } = require('./fetchBtcCurrency');
const app = express()
app.get('/api/crypto/btc', async (req, res) => {
const currency = await getCurrency();
res.status(200).json(currency);
});
module.exports = { app }
server.js
const { app } = require('./app');
app.listen(4000, () => {
console.log('server is up on port 4000');
});
Then, on your start script, you run the server file. But, on your test, you import the app file. You don't start the server in parallel. You'll start and stop it as part of the test setup/teardown.
This gives Jest the chance to replace the real module with the mocked one before the server starts listening (at which point Jest loses control over it).
With that, your test could be:
cryptoBtc.test.js
require("dotenv").config();
const frisby = require("frisby");
const URL = "http://localhost:4000/";
const fetchBtcCurrency = require('./fetchBtcCurrency');
const { app } = require('./app');
jest.mock('./fetchBtcCurrency')
describe("Testing GET /api/crypto/btc", () => {
let server;
beforeAll((done) => {
server = app.listen(4000, () => {
done();
});
});
afterAll(() => {
server.close();
});
it('Verify if returns correct response with status code 200', async () => {
fetchBtcCurrency.getCurrency.mockImplementation(() => ({ a: 'b' }));
await frisby
.get(`${URL}api/crypto/btc`)
.expect('status', 200)
.expect('json', { a: 'b'});
});
});
Note that the order of the imports doesn't matter: you can put the jest.mock call below the real import. Jest is smart enough to know that mocks should come first.
I've been searching for a long time for mqtt.js examples of structuring and best practices and haven't found anything worthwhile. Thus, [main] how do you structure your mqtt.js code in your Node/Express application?
[1] The mqttjs/async-mqtt libraries provide some examples of connecting and handling messages, but in a real app with lots of subscriptions and publishes, how do you structure the code so that it initializes in app.js and uses the same client (returned from mqtt.connect) for all the sub/pub in different files?
[2] Following on from [1], should my app use only one client for all the work, or can it use multiple clients as needed across multiple files? (Say I have three files: mqttInit, subscriber, publisher. If I use the init in subscriber and get a client, should I export it, or just create a new client instance in the publisher file? See the sketch after this question.)
[3] The mqtt.js API provides only a single message handler, so messages for all subscribed topics arrive there, and I use a switch or if/else to manage them. If we have a lot of topics, how do you manage this?
[4] My current setup is kind of messed up.
This is the initializer file, let's say:
mqttService.js
const mqtt = require("mqtt");
const { readFileSync } = require("fs");
module.exports = class mqttService {
constructor() {
this.client = mqtt.connect("mqtt://xxxxxxxxxxx", {
cert: readFileSync(process.cwd() + "/certificates/client.crt"),
key: readFileSync(process.cwd() + "/certificates/client.key"),
rejectUnauthorized: false,
});
this.client.on("error", (err) => {
console.log(err);
});
this.client.once("connect", () => {
console.log("connected to MQTT server");
});
}
};
subscriber.js
This is the function (subscribe()) that I call in app.js to initialize the MQTT side:
const { sendDeviceStatus, sendSensorStatus } = require("../socketApi");
const { client } = new (require("./mqttService"))();
function subscribe() {
let state = {
timer: false,
};
...
let topics = {
....
},
client.subscribe([...]);
client.on("message", async (topic, buffer) => {
if (topic) {
...
}
});
}
module.exports = {
subscribe,
client,
};
publish.js
const { AsyncClient } = require("async-mqtt");
const _client = require("./subscribe").client;
const client = new AsyncClient(_client);
async function sendSensorList(daqId) {
let returnVal = await client.publish(
`${daqId}-GSL-DFC`,
JSON.stringify(publishObject),
{ qos: 1 }
);
console.log(returnVal);
return publishObject;
}
.....
module.exports = {
sendSensorList,
.......
};
So, as you can see from the above code, everything is kind of linked together and messed up; thus I need some guidance on how you structure code.
Thanks for reading. Please feel free to provide any info; it is much appreciated.
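For reference, here is a minimal sketch of the single-shared-client idea touched on in [1] and [2] (an illustration with assumed file names, not from the original post). Because Node caches modules, a module that connects once and exports the client hands every file the same instance, and a topic-to-handler map keeps the single message callback manageable as topics grow (point [3]):
mqttClient.js
// Connect once; Node's module cache means every require() of this file
// receives the same client instance.
const mqtt = require("mqtt");

const client = mqtt.connect("mqtt://xxxxxxxxxxx");

client.on("error", (err) => console.log(err));
client.once("connect", () => console.log("connected to MQTT server"));

module.exports = client;
subscriber.js
const client = require("./mqttClient");

// One handler per topic avoids a growing switch/if-else chain.
const handlers = {
  "some/topic": (buffer) => { /* handle the message */ },
};

function subscribe() {
  client.subscribe(Object.keys(handlers));
  client.on("message", (topic, buffer) => {
    const handler = handlers[topic];
    if (handler) handler(buffer);
  });
}

module.exports = { subscribe };
A publisher file would require the same ./mqttClient module rather than creating a second connection; each additional mqtt.connect call opens another connection to the broker.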
I'll start with an example of how I set up my tests for a backend server. TL;DR at bottom.
This file represents my server:
//server.js
const express = require('express');
class BackendServer {
constructor(backingFileDestination, database) {
this.server = express();
/* Set up server ... */
/* Set up a mysql connection ... */
}
closeMySQLConnectionPool = () => {
/* Close the mysql connection ... */
};
}
module.exports = BackendServer;
In my package.json, I have the following:
"jest": {
"testEnvironment": "node",
"setupFilesAfterEnv": [
"<rootDir>/tests/setupTests.js"
]
}
Which allows me to use this setup file for my tests:
//setupTests.js
const { endpointNames } = require('../../src/common/endpointNames.js');
const BackendServer = require('../server.js');
const server = new BackendServer(
'../backing_files_test/',
'test',
);
const supertester = require('supertest')(server.server);
// drop, recreate, and populate the database once before any tests run
beforeAll(async () => {
await supertester.post(endpointNames.DROP_ALL_TABLES);
await supertester.post(endpointNames.CREATE_ALL_TABLES);
await supertester.post(endpointNames.POPULATE_ALL_TABLES);
});
// clean up the local setupTests server instance after all the tests are done
afterAll(async () => {
await server.closeMySQLConnectionPool();
});
Notice how I had to import the BackendServer class and instantiate it, then use that instance of it.
Now, I have other test files, for example test1.test.js:
//test1.test.js
const { endpointNames } = require('../../src/common/endpointNames.js');
const BackendServer = require('../server.js');
const server = new BackendServer(
'../backing_files_test/',
'test',
);
const supertester = require('supertest')(server.server);
// clean up the local server instance after all tests are done
afterAll(async () => {
await server.closeMySQLConnectionPool();
});
test('blah blah', () => {
/* Some test ... */
});
The problem is that when I go to write test2.test.js, it will be the same as test1.test.js: for every test file, I need to instantiate a new server and then have a separate afterAll() call that cleans up that server's SQL connection. I can't stick that afterAll() inside setupTests.js because it needs to operate on test1.test.js's local server instance.
TL;DR: Each of my test files instantiates a new instance of my server. What I want is to instantiate the server once in setupTests.js and then simply use that instance in all my tests. Is there a good way to share this single instance between all my test files?
I was able to figure out a way to achieve what I wanted. It involves instantiating variables in setupTests.js and then exporting getters for them.
//setupTests.js
const { endpointNames } = require('../../src/common/endpointNames.js');
const BackendServer = require('../server.js');
const server = new BackendServer(
'../backing_files_test/',
'test',
);
const supertester = require('supertest')(server.server);
// drop, recreate, and populate the database once before any tests run
beforeAll(async () => {
await supertester.post(endpointNames.DROP_ALL_TABLES);
await supertester.post(endpointNames.CREATE_ALL_TABLES);
await supertester.post(endpointNames.POPULATE_ALL_TABLES);
});
// clean up the local setupTests server instance after all the tests are done
afterAll(async () => {
await server.closeMySQLConnectionPool();
});
const getServer = () => { // <==== ADD THESE 2 FUNCTIONS
return server;
};
const getSupertester = () => { // <==== ADD THESE 2 FUNCTIONS
return supertester;
};
module.exports = { getServer, getSupertester }; // <==== EXPORT THEM
I added a couple of functions to the end of setupTests.js that, when called, return whatever the local variables point to at the time. In this case, server and supertester are declared with const, so I think I could have exported them directly, but in other cases I had variables I wanted to share that were declared with let, so I left it this way.
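To illustrate why getters matter for let variables (a small sketch, not part of the original answer): a CommonJS export captures the value at the moment module.exports is built, so a later reassignment of a let variable is not visible to importers unless they read it through a function:
// sharedState.js (illustrative)
let current = null;

function init() {
  current = { startedAt: Date.now() };
}

module.exports = {
  init,
  // Exporting `current` directly would hand importers the value it had
  // at export time (null); the getter always returns the latest value.
  getCurrent: () => current,
};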
Now I have functions exported from setupTests.js that I can import in my test files like this:
//test1.test.js
const { endpointNames } = require('../../src/common/endpointNames.js');
const { getServer, getSupertester } = require('./setupTests.js');
test('blah blah', () => {
/* Some test using getSupertester().post() or getServer().someFunction() ... */
});
So now I can have local variables inside setupTests.js that are accessible in my .test.js files.
This makes my whole testing process cleaner because I only need to set up and tear down one server, which means only 1 connection pool for my SQL server and less code duplication of having to instantiate and clean up a new server in every .test.js file.
Cheers.
I'm using Mocha and Sinon for Node.js unit tests. I have the following:
users.js
const Database = require('./lib/Database');
exports.setupNewUser = (name) => {
var user = {
name: name
};
try {
Database.save(user);
}
catch(err) {
console.error('something failed');
}
}
Database.js
exports.save = (user) => {
console.log(`saving: ${user}`);
};
userTest.js
const sinon = require('sinon');
require('chai').should();
const users = require('../src/users');
describe('users', () => {
it('should log an error when the Database save fails', () => {
var databaseSpy = sinon.spy(Database, 'save').throws(); // this is supposed to work??
users.setupNewUser('Charles');
databaseSpy.should.be.called;
});
});
According to the sinon tutorials I've read, I should be able to create that databaseSpy but I keep getting this error: ReferenceError: Database is not defined
What am I missing?
This seems like it might be a pathing issue. Your require might not be resolving to the correct path.
users.js
const Database = require('./lib/Database');
Where is lib/Database in relation to the users.js file? I think that would be a good place to start looking.
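For illustration, under an assumed layout (the question doesn't show one, so the paths below are hypothetical): require('./lib/Database') is resolved relative to the directory containing users.js, so the file would need to live at src/lib/Database.js if users.js sits in src/. Separately, for the Database identifier to exist inside userTest.js at all, the test needs its own require of that same module; Node's module cache makes it the same object users.js uses, so replacing its save method affects both files. A sketch, swapping the spy for a stub since only stubs can be told to throw:
// userTest.js (sketch; the '../src/...' paths are an assumption)
const sinon = require('sinon');
const Database = require('../src/lib/Database');
const users = require('../src/users');

describe('users', () => {
  it('should log an error when the Database save fails', () => {
    // A stub (unlike a plain spy) can force the method to throw.
    const databaseStub = sinon.stub(Database, 'save').throws();
    users.setupNewUser('Charles');
    sinon.assert.called(databaseStub);
    databaseStub.restore();
  });
});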