How to mock es6 class using Jest - javascript

I am attempting to mock a class Mailer using Jest and I can't figure out how to do it. The docs don't give many examples of how this works. The flow is that a node event password-reset is fired, and when that event fires I want to send an email using Mailer.send(to, subject, body). Here is my directory structure:
project_root
-- __test__
---- server
------ services
-------- emails
---------- mailer.test.js
-- server
---- services
------ emails
-------- mailer.js
-------- __mocks__
---------- mailer.js
Here is my mock file __mocks__/mailer.js:
const Mailer = jest.genMockFromModule('Mailer');
function send(to, subject, body) {
return { to, subject, body };
}
module.exports = Mailer;
and my mailer.test.js
const EventEmitter = require('events');
const Mailer = jest.mock('../../../../server/services/emails/mailer');
test('sends an email when the password-reset event is fired', () => {
const send = Mailer.send();
const event = new EventEmitter();
event.emit('password-reset');
expect(send).toHaveBeenCalled();
});
and finally my mailer.js class:
class Mailer {
constructor() {
this.mailgun = require('mailgun-js')({
apiKey: process.env.MAILGUN_API_KEY,
domain: process.env.MAILGUN_DOMAIN,
});
}
send(to, subject, body) {
return new Promise((resolve, reject) => {
this.mailgun.messages().send({
from: 'Securely App <friendly-robot@securelyapp.com>',
to,
subject: subject,
html: body,
}, (error, body) => {
if (error) {
return reject(error);
}
return resolve('The email was sent successfully!');
});
});
}
}
module.exports = new Mailer();
So, how do I successfully mock and test this class, using Jest? Many thanks for helping!

You don't have to mock your Mailer class, only the mailgun-js module. mailgun is a function that returns an object whose messages function in turn returns an object with a send function. So the mock will look like this.
For the happy path:
const happyPath = () => ({
messages: () => ({
send: (args, callback) => callback()
})
})
For the error case:
const errorCase = () => ({
messages: () => ({
send: (args, callback) => callback('someError')
})
})
As you have these two cases, it makes sense to mock the module inside your test. First mock it with a simple spy, so that we can set the implementation for each case later, and then import the module.
jest.mock('mailgun-js', () => jest.fn())
import mailgun from 'mailgun-js'
import Mailer from '../../../../server/services/emails/mailer'
As your module uses promises, we have two options: either return the promise from the test or use async/await. I use the latter; for more details have a look at the Jest docs on testing asynchronous code.
test('test the happy path', async () => {
//mock mailgun so it returns our happy-path implementation
mailgun.mockImplementation(happyPath)
//we need to use async/await here to let jest recognize the promise
const send = await Mailer.send();
expect(send).toBe('The email was sent successfully!')
});
If you would like to test that the mailgun send method was called with the correct parameter you need to adapt the mock like this:
const send = jest.fn((args, callback) => callback())
const happyPath = () => ({
messages: () => ({
send: send
})
})
Now you could check that the first parameter for send was correct:
expect(send.mock.calls[0][0]).toMatchSnapshot()
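For completeness, the error case can be covered the same way. A minimal sketch, assuming the errorCase mock defined above and that Mailer.send rejects with the error passed to the callback:

test('test the error case', async () => {
  // make mailgun return the implementation whose send() calls back with 'someError'
  mailgun.mockImplementation(errorCase)
  // the promise should reject with the callback error
  await expect(Mailer.send()).rejects.toBe('someError')
});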

Just for Googlers and future visitors, here's how I've set up Jest mocking for ES6 classes.
I also have a working example on GitHub, using babel-jest to transpile the ES module syntax so that Jest can mock the modules properly.
__mocks__/MockedClass.js
const stub = {
someMethod: jest.fn(),
someAttribute: true
}
// export a mock constructor that always returns the shared stub
module.exports = jest.fn(() => stub);
Your code can call this with new, and in your tests you can call the function and overwrite any default implementation.
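For context, the class under test might consume it roughly like this (a hypothetical AnotherClass, not from the original post; only the shape matters here):

// path/to/AnotherClass.js -- hypothetical consumer of the mocked class
const MockedClass = require("path/to/MockedClass");

class AnotherClass {
  init() {
    // under jest.mock, `new MockedClass()` returns the shared stub from __mocks__
    this.settings = new MockedClass().someMethod();
  }
}

module.exports = AnotherClass;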
example.spec.js
const mockedClass = require("path/to/MockedClass")();
const AnotherClass = require("path/to/AnotherClass");
let anotherClass;
jest.mock("path/to/MockedClass");
describe("AnotherClass", () => {
beforeEach(() => {
mockedClass.someMethod.mockImplementation(() => {
return { "foo": "bar" };
});
anotherClass = new AnotherClass();
});
describe("on init", () => {
beforeEach(() => {
anotherClass.init();
});
it("uses a mock", () => {
expect(mockedClass.someMethod).toHaveBeenCalled();
expect(anotherClass.settings)
.toEqual(expect.objectContaining({ "foo": "bar" }));
});
});
});

Related

Trying to stub a function results in Descriptor for property is non-configurable and non-writable

I'm trying to write a unit test that stubs the getSignedUrl function from the @aws-sdk/s3-request-presigner package, however when I try to stub out the function with sinon, I receive the error:
TypeError: Descriptor for property getSignedUrl is non-configurable and non-writable
const s3RequestSigner = require("#aws-sdk/s3-request-presigner");
const expect = require('chai').expect;
const sinon = require('sinon')
....
it('should throw an error when getSignedUrl rejects', async function() {
const sandbox = sinon.createSandbox();
sandbox.stub(s3RequestSigner, "getSignedUrl").rejects("fakeUrl");
sandbox.restore();
})
I'm using Node.js 16 and writing JavaScript rather than TypeScript. Is there a way to mock out my function? I'm struggling to write my tests otherwise.
I came up with the following workaround for ES6 modules. You can wrap getSignedUrl in your own module and mock that module instead. This approach should work for any modules where sinon is unable to mock a "non-configurable and non-writable" method.
For example:
my-s3-client-internals.js - Your custom wrapper module
// You'll need to import the original method, assign it to
// a new const, then export that const
import { getSignedUrl as getSignedUrl_orig } from '@aws-sdk/s3-request-presigner';
export const getSignedUrl = getSignedUrl_orig;
my-s3-client.js - Consumer of getSignedUrl
// Import the method from your custom wrapper instead of the AWS package
import { getSignedUrl } from './my-s3-client-internals';
// The client and command still come from the AWS SDK as usual
import { S3Client, GetObjectCommand } from '@aws-sdk/client-s3';

const client = new S3Client({});

// Call it however you normally would, for example:
export const getUrl = (bucket, key) => {
  const command = new GetObjectCommand({ Bucket: bucket, Key: key });
  return getSignedUrl(client, command, { expiresIn: 300 });
};
my-s3-client.spec.js - Unit tests for the consumer module
import { getUrl } from './my-s3-client';
import * as clientInternals from './my-s3-client-internals';
import sinon from 'sinon';
it('does something', async () => {
// Mock the method exported from your wrapper module
sinon.stub(clientInternals, 'getSignedUrl')
.callsFake(async (client, command, options) => {
return 'fake-url';
});
// Then call your consumer method to test
const url = await getUrl('test-bucket', 'test-key');
expect(url).to.equal('fake-url');
});
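One caveat worth adding to this approach (my note, not from the original answer): stubs created this way persist across tests unless restored, so a teardown along these lines keeps tests isolated:

afterEach(() => {
  // undo sinon.stub(clientInternals, 'getSignedUrl') so later tests see the real export
  sinon.restore();
});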
So I won't make this the official answer unless there are no better solutions, but here is the solution my research turned up.
The issue is related to this: https://github.com/sinonjs/sinon/issues/2377
sinon throws an error when the property descriptor is non-configurable.
There is no obvious way around that currently, as far as I can find. One way to solve it is to use proxyquire:
const sinon = require('sinon')
const proxyquire = require('proxyquire')
...
it('should throw an error when getSignedUrl rejects', async function() {
const fakeurl = 'hello world'
const fakeURL = sinon.stub().resolves(fakeurl)
const handler = proxyquire(
'../../handlers/presigned_url',
{
'@aws-sdk/s3-request-presigner': {
'getSignedUrl': async () => {
return fakeURL()
}
}
}
)
This will then resolve with whatever you want fakeurl to be.
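The snippet above is cut off before its assertion and before the it block is closed; assuming the handler module exports, say, an async getPresignedUrl function (a hypothetical name, since the handler file isn't shown), it might finish like this:

  // hypothetical export name; use whatever ../../handlers/presigned_url actually exports
  const url = await handler.getPresignedUrl()
  expect(url).to.equal(fakeurl)
})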
Another possible solution is to use mockery, e.g. to mock uuid:
import { expect } from 'chai';
import mockery from 'mockery';
import sinon from 'sinon';
describe('domain/books', () => {
let createBook;
let uuidStub;
before(async () => {
mockery.enable({
warnOnReplace: false,
warnOnUnregistered: false,
});
uuidStub = sinon.stub();
mockery.registerMock('uuid', { v4: uuidStub });
({ createBook } = await import('../domain/books.js'));
});
afterEach(() => {
sinon.resetHistory();
});
after(() => {
sinon.restore();
mockery.disable();
mockery.deregisterAll();
});
describe('createBook', () => {
it('should save a book and return the id', () => {
const id = 'abc123';
uuidStub.returns(id);
const { id: bookId } = createBook({
title: 'My Book',
author: 'Jane Doe',
});
expect(bookId).to.equal(id);
});
});
});
The mockery setup is a bit tedious, but the library saved me a number of times.
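For reference, the module under test might use uuid along these lines (a hypothetical sketch of domain/books.js, which the answer doesn't show; this is the import that mockery.registerMock('uuid', { v4: uuidStub }) replaces):

import { v4 as uuid } from 'uuid';

export function createBook({ title, author }) {
  // the generated id is what the test stubs and asserts on
  const id = uuid();
  return { id, title, author };
}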

How to mock a function using Frisby and Jest to return custom response?

I'm trying to mock a function using Frisby and Jest.
Here are some details about my code:
dependencies
axios: "^0.26.0",
dotenv: "^16.0.0",
express: "^4.17.2"
devDependencies
frisby: "^2.1.3",
jest: "^27.5.1"
When I mock using Jest, the real response from the API is still returned, but that is not what I want. I want to return a fake result like this: { a: 'b' }.
How do I solve it?
I have the following code:
// (API Fetch file) backend/api/fetchBtcCurrency.js
const axios = require('axios');
const URL = 'https://api.coindesk.com/v1/bpi/currentprice/BTC.json';
const getCurrency = async () => {
const response = await axios.get(URL);
return response.data;
};
module.exports = {
getCurrency,
};
// (Model using fetch file) backend/model/cryptoModel.js
const fetchBtcCurrency = require('../api/fetchBtcCurrency');
const getBtcCurrency = async () => {
const responseFromApi = await fetchBtcCurrency.getCurrency();
return responseFromApi;
};
module.exports = {
getBtcCurrency,
};
// (My test file) /backend/__tests__/cryptoBtc.test.js
require("dotenv").config();
const frisby = require("frisby");
const URL = "http://localhost:4000/";
describe("Testing GET /api/crypto/btc", () => {
beforeEach(() => {
jest.mock('../api/fetchBtcCurrency');
});
it('Verify if returns correct response with status code 200', async () => {
const fetchBtcCurrency = require('../api/fetchBtcCurrency').getCurrency;
fetchBtcCurrency.mockImplementation(() => (JSON.stringify({ a: 'b'})));
const defaultExport = await fetchBtcCurrency();
expect(defaultExport).toBe(JSON.stringify({ a: 'b'})); // This assert works
await frisby
.get(`${URL}api/crypto/btc`)
.expect('status', 200)
.expect('json', { a: 'b'}); // Integration test with Frisby does not work correctly.
});
});
Response[
{
I hid the lines to save screen space.
}
->>>>>>> does not contain provided JSON [ {"a":"b"} ]
];
This is a classic lost reference problem.
Since you're using Frisby, by looking at your test, it seems you're starting the server in parallel, correct? You first start your server with, say npm start, then you run your test with npm test.
The problem with that is: by the time your test starts, your server is already running. Since you started your server with the real fetchBtcCurrency.getCurrency, jest can't do anything from this point on. Your server will continue to point towards the real module, not the mocked one.
Check this illustration: https://gist.githubusercontent.com/heyset/a554f9fe4f34101430e1ec0d53f52fa3/raw/9556a9dbd767def0ac9dc2b54662b455cc4bd01d/illustration.svg
The reason the assertion on the import inside the test works is because that import is made after the mock replaces the real file.
You didn't share your app or server file, but if you are creating the server and listening in the same module, with those calls "hanging on global" (i.e. called from the body of the script rather than from inside a function), you'll have to split them. You'll need one file that creates the server (attaching any routes/middleware/etc. to it), and a separate file that just imports the first one and starts listening.
For example:
app.js
const express = require('express');
const { getCurrency } = require('./fetchBtcCurrency');
const app = express()
app.get('/api/crypto/btc', async (req, res) => {
const currency = await getCurrency();
res.status(200).json(currency);
});
module.exports = { app }
server.js
const { app } = require('./app');
app.listen(4000, () => {
console.log('server is up on port 4000');
});
Then, on your start script, you run the server file. But, on your test, you import the app file. You don't start the server in parallel. You'll start and stop it as part of the test setup/teardown.
This will give jest the chance of replacing the real module with the mocked one before the server starts listening (at which point it loses control over it)
With that, your test could be:
cryptoBtc.test.js
require("dotenv").config();
const frisby = require("frisby");
const URL = "http://localhost:4000/";
const fetchBtcCurrency = require('./fetchBtcCurrency');
const { app } = require('./app');
jest.mock('./fetchBtcCurrency')
describe("Testing GET /api/crypto/btc", () => {
let server;
beforeAll((done) => {
server = app.listen(4000, () => {
done();
});
});
afterAll(() => {
server.close();
});
it('Verify if returns correct response with status code 200', async () => {
fetchBtcCurrency.getCurrency.mockImplementation(() => ({ a: 'b' }));
await frisby
.get(`${URL}api/crypto/btc`)
.expect('status', 200)
.expect('json', { a: 'b'});
});
});
Note that the order of imports doesn't matter. You can put the jest.mock call below the real import; Jest hoists mocks so they always come first.

Remix: middleware pattern to run code before loader on every request?

Is there a recommended pattern in Remix for running common code on every request, and potentially adding context data to the request? Like a middleware? A usecase for this might be to do logging or auth, for example.
The one thing I've seen that seems similar to this is loader context via the getLoadContext API. This lets you populate a context object which is passed as an arg to all route loaders.
It does work, and initially seems like the way to do this, but the docs for it say...
It's a way to bridge the gap between the adapter's request/response API with your Remix app
This API is an escape hatch, it’s uncommon to need it
...which makes me think otherwise, because:
- This API is explicitly for custom integrations with the server runtime. But it doesn't seem like middlewares should be specific to the server runtime; they should just be part of the 'application' level as a Remix feature.
- Running middlewares is a pretty common pattern in web frameworks!
So, does Remix have any better pattern for middleware that runs before every loader?
Instead of middleware, you can call a function directly inside the loader; this is also more explicit. If you want to return a response early from those "middlewares", Remix lets you throw the Response object.
For example, if you wanted to check the user has a certain role you could create this function:
async function verifyUserRole(request: Request, expectedRole: string) {
let user = await getAuthenticatedUser(request); // somehow get the user
if (user.role === expectedRole) return user;
throw json({ message: "Forbidden" }, { status: 403 });
}
And in any loader call it this way:
let loader: LoaderFunction = async ({ request }) => {
let user = await verifyUserRole(request, "admin");
// code here will only run if user is an admin
// and you'll also get the user object at the same time
};
Another example could be to require HTTPS
function requireHTTPS(request: Request) {
let url = new URL(request.url);
if (url.protocol === "https:") return;
url.protocol = "https:";
throw redirect(url.toString());
}
let loader: LoaderFunction = async ({ request }) => {
await requireHTTPS(request);
// run your loader (or action) code here
};
There is no way inside Remix to run code before loaders.
As you found out, there is the loader context, but it runs even before Remix starts to do its job (so you won't know which route modules are matched, for example).
You can also run arbitrary code before handing the request to Remix in the JS file where you use the adapter for the platform you're deploying to (this depends on the starter you used; that file doesn't exist if you've chosen the built-in Remix server as your server).
For now this works for some use cases, but I agree it is a missing feature in Remix.
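For example, with the Express adapter that file might look roughly like this (a sketch based on the standard Remix Express template; the build path and the logging middleware are placeholders):

const express = require("express");
const { createRequestHandler } = require("@remix-run/express");

const app = express();

// runs on every request, before Remix sees it
app.use((req, res, next) => {
  console.log(req.method, req.url);
  next();
});

// hand everything else to Remix
app.all("*", createRequestHandler({ build: require("./build") }));

app.listen(3000);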
Inside app/root.tsx
export let loader: LoaderFunction = ({ request }) => {
const url = new URL(request.url);
const hostname = url.hostname;
const proto = request.headers.get("X-Forwarded-Proto") ?? url.protocol;
url.host =
request.headers.get("X-Forwarded-Host") ??
request.headers.get("host") ??
url.host;
url.protocol = "https:";
if (proto === "http" && hostname !== "localhost") {
return redirect(url.toString(), {
headers: {
"X-Forwarded-Proto": "https",
},
});
}
return {};
};
Source: https://github.com/remix-run/remix-jokes/blob/8f786d9d7fa7ea62203e87c1e0bdaa9bda3b28af/app/root.tsx#L25-L46
Here is my middleware implementation for Remix with TypeScript; it works well.
Whatever you pass to ctx.return() is what useLoaderData() receives.
import compose from '@utils/compose';
export default function Index() {
const ctx = useLoaderData();
return <div>{ctx.name}</div>;
}
type DefaultCtx = {
name: string;
} & Request;
export const loader = (...args) => compose<DefaultCtx>(
async (ctx, next) => {
ctx.name = 'first';
await next();
},
async (ctx, next) => {
ctx.name = 'second';
await next();
},
async (ctx, next) => {
ctx.name = 'third';
ctx.return(ctx);
await next();
}
)(args);
compose works the same way as Koa's compose; here is its implementation:
type Next = () => Promise<void>;
type Context = {};
type Middle<T = {}> = (ctx: Context & T, next: Next) => void;
const compose = <T>(...middlewares: Middle<T>[]) => {
return middlewares.reverse().reduce(
(dispatch, middleware) => {
return async ctx =>
middleware(ctx, async () => dispatch(ctx, async () => {}));
},
async () => {}
);
};
export type Middleware<T = {}, P = unknown> = (
ctx: Context & T & { return: (param: P) => void },
next: Next
) => void;
const returnEarly: Middleware = async (ctx, next) => {
return new Promise<any>(async resolve => {
ctx.return = resolve;
await next();
});
};
const componseWithReturn = <T>(...middlewares: Middleware<T>[]) =>
compose(returnEarly, ...middlewares) as (ctx: T) => void;
export default componseWithReturn;

How do I listen to events from a smart contract using ethers.js contract.on() in a node.js application?

I'm trying to listen to events emitted from the USDT contract Transfer function using ethers.js (not web3) in a node.js application.
When I run the script, the code runs with no errors and then quickly exits. I'd expect to get the event logs. I'm not sure what step I'm missing.
I've tested this script by calling the getOwner() method and console logging the result; this works fine, so my connection to mainnet is OK.
I'm using alchemy websocket.
My index.js file
const hre = require("hardhat");
const ethers = require('ethers');
const USDT_ABI = require('../abis/USDT_ABI.json')
async function main() {
const usdt = "0xdAC17F958D2ee523a2206206994597C13D831ec7";
const provider = new ethers.providers.WebSocketProvider("wss://eth-mainnet.ws.alchemyapi.io/v2/MY_API");
const contract = new ethers.Contract(usdt, USDT_ABI, provider)
contract.on('Transfer', (from, to, value) => console.log(from, to, value))
}
main()
.then(() => process.exit(0))
.catch(error => {
console.error(error);
process.exit(1);
});
My hardhat.config.js file
require("#nomiclabs/hardhat-waffle");
require('dotenv').config()
// This is a sample Hardhat task. To learn how to create your own go to
// https://hardhat.org/guides/create-task.html
task("accounts", "Prints the list of accounts", async () => {
const accounts = await ethers.getSigners();
for (const account of accounts) {
console.log(account.address);
}
});
// You need to export an object to set up your config
// Go to https://hardhat.org/config/ to learn more
/**
* @type import('hardhat/config').HardhatUserConfig
*/
module.exports = {
paths: {
artifacts: './src/artifacts',
},
networks: {
mainnet: {
url: "wss://eth-mainnet.ws.alchemyapi.io/v2/MY_API",
accounts: [`0x${process.env.PRIVATE_KEY}`]
},
hardhat: {
chainId: 1337
},
},
solidity: "0.4.8"
};
I solved this by removing
.then(() => process.exit(0))
.catch(error => {
console.error(error);
process.exit(1);
});
and just calling main(). The Hardhat docs recommend the .then/.catch pattern, but main() resolves as soon as contract.on() has registered the listener, so for a long-running process like this one .then(() => process.exit(0)) makes the script exit before any events arrive.
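In other words, the bottom of the script becomes something like this (a sketch of the change described above; keeping a .catch is fine, it is the .then(() => process.exit(0)) that kills the listener):

main().catch((error) => {
  // only exit on a startup error; otherwise stay alive and keep listening
  console.error(error);
  process.exit(1);
});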
I do this:
const ethers = require('ethers');
const abi = [{...}]
const contractAddress = '0x000...'
const webSocketProvider = new ethers.providers.WebSocketProvider(process.env.ETHEREUM_NODE_URL, process.env.NETWORK_NAME);
const contract = new ethers.Contract(contractAddress, abi, webSocketProvider);
contract.on("Transfer", (from, to, value, event) => {
console.log({
from: from,
to: to,
value: value.toString(),
data: event
});
});
The event parameter contains all data related to the event and the transaction.

FeathersJS: Injecting HTTP headers in Service Test

In a FeathersJS service, I have a before hook being run that expects a certain HTTP header to exist:
src/services/service_name/service_name.hooks.js
const validationHook = () => (context, next) => {
if (!context.params.headers.hasOwnProperty('header-wanted'))
throw new errors.BadRequest();
next(null, context);
};
module.exports = {
before: {
all: [cronValidationHook()],
...
..
.
When testing this service in a generated test file from feathers-cli, however, I haven't found a way to inject headers prior to the before hook being called. The test in question is:
test/services/service_name.test.js
describe('get', () => {
it('should run "id" endpoint', async () => {
const service = app.service('v1/cron');
const resp = await service.get('id', params);
// Assertions exist after this call
});
});
Is there a way to do this that does not require utilizing an HTTP call via node-fetch or requests?
params will be whatever you pass. Just set params.headers to what you would like to test, e.g.
const getParams = {
...params,
headers: { 'header-wanted': 'something' }
};
const resp = await service.get('id', getParams);
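Put together with the generated test from the question, it might look like this (the rest of params is whatever your test already sets up; only headers matters for the hook):

describe('get', () => {
  it('should run "id" endpoint', async () => {
    const service = app.service('v1/cron');
    const getParams = {
      ...params,
      headers: { 'header-wanted': 'something' },
    };
    const resp = await service.get('id', getParams);
    // assertions on resp go here, as in the generated test
  });
});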
