Next.js middleware Module not found: Can't resolve 'fs' - javascript

Getting this error in Next.js _middleware file when I try to initialize Firebase admin V9. Anyone know how to solve this issue?
./node_modules/@google-cloud/storage/build/src/bucket.js:22:0
Module not found: Can't resolve 'fs'
../../firebase/auth-admin
import * as admin from "firebase-admin";
if (!admin.apps.length) {
admin.initializeApp({
credential: admin.credential.cert({
projectId: process.env.NEXT_PUBLIC_FIREBASE_PROJECT_ID,
clientEmail: process.env.FIREBASE_CLIENT_EMAIL,
privateKey: process.env.FIREBASE_ADMIN_PRIVATE_KEY,
}),
});
}
const firestore = admin.firestore();
const auth = admin.auth();
export { firestore, auth };
Calling it in my _middleware
import { NextFetchEvent, NextRequest, NextResponse } from "next/server";
import { auth } from "../../firebase/auth-admin";
export default async function authenticate(
req: NextRequest,
ev: NextFetchEvent
) {
const token = req.headers.get("token");
console.log("auth = ", auth);
// const decodeToken = await auth.verifyIdToken(token);
return NextResponse.next();
}
I saw a solution here that customizes the webpack config, but it does not fix the issue.
/** @type {import('next').NextConfig} */
const nextConfig = {
reactStrictMode: true,
webpack: (config, { isServer, node }) => {
node = {
...node,
fs: "empty",
child_process: "empty",
net: "empty",
tls: "empty",
};
return config;
},
};
module.exports = nextConfig;

The Edge Runtime, which is used by Next.js Middleware, does not support Node.js native APIs.
From the Edge Runtime documentation:
The Edge Runtime has some restrictions including:
Native Node.js APIs are not supported. For example, you can't read or write to the filesystem
Node Modules can be used, as long as they implement ES Modules and do not use any native Node.js APIs
You can't use Node.js libraries that use fs in Next.js Middleware. Try using a client-side library instead.
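One possible workaround (my own sketch, not part of the answer above; the /api/verify-token route and the /login page are assumptions) is to keep firebase-admin in a regular API route, which runs in the Node.js runtime, and have the middleware simply forward the token there:

// middleware.ts (sketch): firebase-admin cannot run in the Edge Runtime, so delegate
// token verification to a regular API route running in the Node.js runtime.
import { NextRequest, NextResponse } from "next/server";

export default async function middleware(req: NextRequest) {
  const token = req.headers.get("token");
  if (!token) {
    return NextResponse.redirect(new URL("/login", req.url)); // hypothetical login page
  }

  // hypothetical API route that calls auth.verifyIdToken(token) with firebase-admin
  const verifyUrl = new URL("/api/verify-token", req.url);
  const result = await fetch(verifyUrl, { headers: { token } });

  return result.ok
    ? NextResponse.next()
    : NextResponse.redirect(new URL("/login", req.url));
}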

I wasted a lot of time trying to get this to work. The weird thing is that it does work in the API route itself.
So instead of calling the firebase-admin action in the _middleware file, call it in the API route itself, like:
import type { NextApiRequest, NextApiResponse } from 'next'
import { auth } from "../../firebase/auth-admin";
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
const authorization = req.headers.authorization
console.log(`Handler auth header: ${authorization}`)
if (!authorization) {
return res.status(401).json({ message: 'Authorisation header not found.' })
}
const token = authorization.split(' ')[1]
if (!token) {
return res.status(401).json({ message: 'Bearer token not found.' })
}
console.log(`Token: ${token}`)
try {
const { uid } = await auth.verifyIdToken(token)
console.log(`User uid: ${uid}`)
res.status(200).json({ userId: uid })
} catch (error) {
console.log(`verifyIdToken error: ${error}`)
res.status(401).json({ message: `Error while verifying token. Error: ${error}` })
}
}
A workaround to make this reusable is to create a wrapper function.
If anyone knows how to make this work in a _middleware file, I would be really grateful.
Edit: Gist for the wrapper middleware function:
https://gist.github.com/jvgrootveld/ed1863f0beddc1cc2bf2d3593dedb6da
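If you don't want to follow the link, a minimal sketch of such a wrapper for API routes could look like this (the shape and names below are my own, not copied from the gist):

// withAuth.ts (sketch): wraps an API handler and verifies the Firebase ID token first
import type { NextApiHandler, NextApiRequest, NextApiResponse } from "next";
import { auth } from "../../firebase/auth-admin";

export const withAuth =
  (handler: NextApiHandler) =>
  async (req: NextApiRequest, res: NextApiResponse) => {
    const token = req.headers.authorization?.split(" ")[1];
    if (!token) {
      return res.status(401).json({ message: "Bearer token not found." });
    }
    try {
      // attach the decoded uid to the request for downstream handlers
      (req as any).uid = (await auth.verifyIdToken(token)).uid;
      return handler(req, res);
    } catch (error) {
      return res.status(401).json({ message: `Error while verifying token: ${error}` });
    }
  };

// usage in an API route: export default withAuth(handler);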

Make sure you're not importing firebase-admin on the client:
import * as admin from "firebase-admin";

I've recently released a library that aims to solve the problem: https://github.com/ensite-in/next-firebase-auth-edge
It allows you to create and verify tokens inside Next.js middleware and Next.js 13 server components. It is built entirely upon the Web Crypto API.
Please note it does rely on Next.js ^13.0.5 experimental "appDir" and "allowMiddlewareResponseBody" features.

How can I configure PostgreSQL in the NestJS way?

So I'm in the process of learning the NestJS way of doing things. I have a small NestJS backend with only a few routes, some of which call PostgreSQL. I don't want to use an ORM; I use the pg package directly.
So my next step is learning how to use ConfigService. I have successfully used it to configure all env vars in the backend, but I'm struggling to use it in the small file I use to configure PostgreSQL. This is the configuration file (pgconnect.ts):
import { Pool } from 'pg';
import configJson from './config/database.json';
import dotenv from 'dotenv';
dotenv.config();
const config = configJson[process.env.NODE_ENV];
const poolConfig = {
user: config.username,
host: config.host,
database: config.database,
password: config.password,
port: config.port,
max: config.maxClients
};
export const pool = new Pool(poolConfig)
database.json is a JSON file where I have all connection values divided by environment. Then in service classes I just:
import { Injectable } from '@nestjs/common';
import { Response } from 'express';
import { pool } from 'src/database/pgconnect';
@Injectable()
export class MyService {
getDocumentByName(res: Response, name: string) {
pool.query(
<query, error treatment, etc>
});
}
<...> more queries for insert, update, other selects, etc
}
So how could I use ConfigService inside my configuration file? I already tried to instantiate the class like this:
let configService = new ConfigService();
and what I would like to do is:
const config = configJson[configService.get<string>('NODE_ENV')];
but it didn't work. You have to pass the .env file path to new ConfigService(), and I need the NODE_ENV var to get it, because it depends on the environment. To get NODE_ENV without using ConfigService I would have to use dotenv, but if I'm going to use dotenv I don't need ConfigService in the first place.
So then I tried to create a class:
import { Injectable, HttpException, HttpStatus } from '@nestjs/common';
import { ConfigService } from '@nestjs/config'
const { Pool } = require('pg');
import configJson from './config/database.json';
@Injectable()
export class PgPool {
constructor(private configService: ConfigService) { };
config = configJson[this.configService.get<string>('NODE_ENV')];
poolConfig = {
user: this.config.username,
host: this.config.host,
database: this.config.database,
password: this.config.password,
port: this.config.port,
max: this.config.maxClients
};
static pool = new Pool(this.poolConfig);
}
export const PgPool.pool;
But this doesn't work in several ways. If I use non-static members, I can't export the pool member, which is the only thing I need. If I use static members, one can't access the other, or at least I'm not understanding how one accesses the other.
So, the questions are: how do I use ConfigService outside of a class, or how can I change the pgconnect.ts file to do its job? If it's through a class, the best would be to export only the pool member.
Also, if you think there's a better way to configure PostgreSQL, I would be glad to hear it.
What I would do, if you're going to be using the pg package directly, is create a PgModule that exposes the Pool you create as a provider that can be injected. Then you can also create a provider for the options specifically for ease of swapping in test. Something like this:
@Module({
imports: [ConfigModule],
providers: [
{
provide: 'PG_OPTIONS',
inject: [ConfigService],
useFactory: (config) => ({
host: config.get('DB_HOST'),
port: config.get('DB_PORT'),
...etc
}),
},
{
provide: 'PG_POOL',
inject: ['PG_OPTIONS'],
useFactory: (options) => new Pool(options),
}
],
exports: ['PG_POOL'],
})
export class PgModule {}
Now, when you need to use the Pool in another service you add PgModule to that service's module's imports and you add @Inject('PG_POOL') private readonly pg: Pool to the service's constructor.
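For illustration, a consuming service could look like this (a sketch; UserService and the query are made up, only the 'PG_POOL' token comes from the module above):

// users.service.ts (sketch)
import { Inject, Injectable } from '@nestjs/common';
import { Pool } from 'pg';

@Injectable()
export class UserService {
  constructor(@Inject('PG_POOL') private readonly pg: Pool) {}

  async findByName(name: string) {
    // the shared pool handles connections; a parameterized query avoids SQL injection
    const { rows } = await this.pg.query('SELECT * FROM users WHERE name = $1', [name]);
    return rows;
  }
}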
If you want to see an overly engineered solution, you can take a look at my old implementation here
I normally have my own pg module handling the pool with either an additional config file (json) or via processing a .env file:
node-pg-sql.js:
/* INFO: Require json config file */
const fileNameConfigPGSQL = require('./config/pgconfig.json');
/* INFO: Require pg package */
const { Pool } = require('pg');
const pool = new Pool(fileNameConfigPGSQL);
module.exports = {
query: (text, params, callback) => {
const start = Date.now()
return pool.query(text, params, (err, res) => {
const duration = Date.now() - start
// console.log('executed query', { text, duration, rows: res.rowCount })
callback(err, res)
})
},
getClient: (callback) => {
pool.connect((err, client, done) => {
const query = client.query.bind(client)
// monkey patch for the query method to track last queries
client.query = (...args) => {
client.lastQuery = args
return query(...args)
}
// Timeout of 5 secs, then the last query is logged
const timeout = setTimeout(() => {
// console.error('A client has been checked out for more than 5 seconds!')
// console.error(`The last executed query on this client was: ${client.lastQuery}`)
}, 5000)
const release = (err) => {
// calling 'done'-method to return client to pool
done(err)
// clear timeout
clearTimeout(timeout)
// restore the original query method (undo the monkey patch)
client.query = query
}
callback(err, client, release)
})
}
}
pgconfig.json:
{
"user":"postgres",
"host":"localhost",
"database":"mydb",
"password":"mypwd",
"port":"5432",
"ssl":true
}
If you prefer processing a .env file:
NODE_ENV=development
NODE_PORT=45500
HOST_POSTGRESQL='localhost'
PORT_POSTGRESQL='5432'
DB_POSTGRESQL='mydb'
USER_POSTGRESQL='postgres'
PWD_POSTGRESQL='mypwd'
and process the file and export vars:
var path = require('path');
const dotenvAbsolutePath = path.join(__dirname, '.env');
/* INFO: Require dotenv package for retrieving and setting env-vars at runtime via absolute path due to pkg */
const dotenv = require('dotenv').config({
path: dotenvAbsolutePath
});
if (dotenv.error) {
console.log(`ERROR WHILE READING ENV-VARS:${dotenv.error}`);
throw dotenv.error;
}
module.exports = {
nodeEnv: process.env.NODE_ENV,
nodePort: process.env.NODE_PORT,
hostPostgresql: process.env.HOST_POSTGRESQL,
portPostgresql: process.env.PORT_POSTGRESQL,
dbPostgresql: process.env.DB_POSTGRESQL,
userPostgresql: process.env.USER_POSTGRESQL,
pwdPostgresql: process.env.PWD_POSTGRESQL,
};

ConnectionNotFoundError: Connection "default" was not found. Can someone help me? [duplicate]

I use TypeORM with NestJS and I am not able to save properly an entity.
The connection creation works, postgres is running on 5432 port. Credentials are OK too.
However, when I save a resource with entity.save() I get:
Connection "default" was not found.
Error
at new ConnectionNotFoundError (/.../ConnectionNotFoundError.ts:11:22)
I checked the source file of TypeORM's ConnectionManager (https://github.com/typeorm/typeorm/blob/master/src/connection/ConnectionManager.ts), and it seems that the first time TypeORM creates a connection it assigns it the name "default" if we don't provide one, which is the case for me.
I set up TypeORM with TypeOrmModule as
TypeOrmModule.forRoot({
type: config.db.type,
host: config.db.host,
port: config.db.port,
username: config.db.user,
password: config.db.password,
database: config.db.database,
entities: [
__dirname + '/../../dtos/entities/*.entity.js',
]
})
Of course my constants are correct. Any ideas?
You are trying to create a repository or manager without the connection being established.
Try calling const shopkeeperRepository = getRepository(Shopkeeper); inside a function; it will work.
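In other words, make sure the connection exists before the repository is requested; a minimal sketch with the TypeORM 0.2.x API (the entity path is an assumption):

import { createConnection, getRepository } from 'typeorm';
import { Shopkeeper } from './entity/Shopkeeper'; // assumption: your entity

async function main() {
  // establishes the "default" connection from ormconfig
  await createConnection();

  // called only after the connection exists, so no ConnectionNotFoundError
  const shopkeeperRepository = getRepository(Shopkeeper);
  const shopkeepers = await shopkeeperRepository.find();
  console.log(shopkeepers);
}

main().catch(console.error);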
The upvoted answer is not necessarily correct; if you don't specify the connection name, it will default to "default".
const manager = getConnectionManager().get('your_orm_name');
const repository = manager.getRepository<AModel>(Model);
If anyone else has this problem in the future, check this out just in case:
I accidentally did "user.save()" instead of "userRepo.save(user)".
(And of course, above that, I initialized the connection like this:
const userRepo = getConnection(process.env.NODE_ENV).getRepository(User)
We are using lerna and using code from library A in package B.
The problem was that the TypeORM versions in the two packages differed.
Solution is to make sure that you have exactly the same version installed in each package.
To be on the safe side, delete your node_modules directory and reinstall everything again with yarn install or npm install
Check your yarn.lock for multiple entries of typeorm and make sure there is only one.
If anyone is using Express Router with getRepository(), check the code below:
const router = Router();
router.get("/", async function (req: Request, res: Response) {
// here we will have logic to return all users
const userRepository = getRepository(User);
const users = await userRepository.find();
res.json(users);
});
router.get("/:id", async function (req: Request, res: Response) {
// here we will have logic to return user by id
const userRepository = getRepository(User);
const results = await userRepository.findOne(req.params.id);
return res.send(results);
});
Just make sure to call getRepository() in every route just like Saras Arya said in the accepted answer.
I follow the approach below, creating a Database class. If the connection doesn't exist it creates it; otherwise it returns the existing connection.
import { Connection, ConnectionManager, ConnectionOptions, createConnection, getConnectionManager } from 'typeorm';
export class Database {
private connectionManager: ConnectionManager;
constructor() {
this.connectionManager = getConnectionManager();
}
public async getConnection(name: string): Promise<Connection> {
const CONNECTION_NAME: string = name;
let connection: Connection;
const hasConnection = this.connectionManager.has(CONNECTION_NAME);
if (hasConnection) {
connection = this.connectionManager.get(CONNECTION_NAME);
if (!connection.isConnected) {
connection = await connection.connect();
}
} else {
const connectionOptions: ConnectionOptions = {
name: 'default',
type: 'mysql',
host: 'localhost',
port: 3306,
username: 'root',
password: 'password',
database: 'DemoDb',
synchronize: false,
logging: true,
entities: ['src/entities/**/*.js'],
migrations: ['src/migration/**/*.js'],
subscribers: ['src/subscriber/**/*.js'],
};
connection = await createConnection(connectionOptions);
}
return connection;
}
}
If you are using webpack, make sure entities are imported explicitly and returned in an array:
import {User} from 'src/entities/User';
import {Album} from 'src/entities/Album';
import {Photos} from 'src/entities/Photos';
const connectionOptions: ConnectionOptions = {
name: 'default',
type: 'mysql',
host: 'localhost',
port: 3306,
username: 'root',
password: 'password',
database: 'DemoDb',
synchronize: false,
logging: true,
entities: [User, Album, Photos],
migrations: ['src/migration/**/*.js'],
subscribers: ['src/subscriber/**/*.js'],
};
Finally
const connectionName = 'default';
const database = new Database();
const dbConn: Connection = await database.getConnection(connectionName);
const MspRepository = dbConn.getRepository(Msp);
await MspRepository.delete(mspId);
For those of you looking for another answer, check this out.
In my case, the issue was because I was passing name in my db config.
export const dbConfig = {
name: 'myDB',
...
}
await createConnection(dbConfig) // like this
As a result, the only connection the server knows about is myDB, not default.
At the same time, in my service, the repository was injected without a name, which falls back to default. (The service will look for the default connection as a result.)
@Service() // typedi
export class Service {
constructor(
// inject without name -> fallback to default
@InjectRepository() private readonly repository
) {}
}
As a fix, I removed the name property from my db config.
Alternatively, you can pass myDB as a parameter to InjectRepository, like @InjectRepository('myDB'); either way works.
In my own case, the actual problem was that my index file imports my router file which imports my controllers which then import my services (where the call to getRepository was made). So the imports were resolving (and thus the call to getRepository) before the connection was established.
I considered implementing Saras Arya's answer, but it would leave my code more verbose.
What I did was create a function to connect to the DB in a db/index.ts file
import { createConnection } from "typeorm";
export const getDBConnection = async () => {
const dbConnection = await createConnection();
if (!dbConnection.isConnected) await dbConnection.connect();
return dbConnection;
}
Then I create an async function to bootstrap my app. I wait for getDBConnection to resolve before instantiating the app, and only then import my router file. That way the import resolution (and thus the getRepository calls) only happens after the connection has been established.
routers/index.ts
import { Router } from 'express';
const router = Router();
/* ... route configurations ... */
export default router;
app.ts
const bootstrap = async () => {
try {
// wait on connection to be established
await getDBConnection();
} catch (error) {
// log error then throw
throw error;
}
// create app
const app = express();
// some middleware configuration...
// now import and setup the router
const { default: router } = await import("./routers");
app.use("/api", router);
// some more middleware configuration...
const server = http.createServer(app);
server.listen(3000, () => console.log('app running at port: 3000'));
};
bootstrap();
I got this error while using getConnectionOptions for different environments, using one database for development and another for testing. This is how I fixed it:
const connectionOptions = await getConnectionOptions(process.env.NODE_ENV);
await createConnection({...connectionOptions, name:"default"});
I use getConnectionOptions to get the connection for the current environment. In order to do that successfully, you have to change ormconfig.json to an array whose entries each have a "name" key for the environment you want, like so:
[
{
"name" : "development",
"type": "USER",
"host": "localhost",
"port": 5432,
"username": "postgres",
"password": "PASS",
"database": "YOURDB"
},
{
"name" : "test",
"type": "USERTEST",
"host": "localhost",
"port": 5432,
"username": "postgres",
"password": "PASSTEST",
"database": "YOURDBTEST"
}
]
Now connectionOptions will contain the connection parameters of the current environment, but loading it into createConnection threw the error you mentioned. Changing the connectionOptions name to "default" fixed the issue.
I know it is super weird, but someone might need this:
It's a Windows-related reason.
I had the same error, caused by the current location being set with the drive letter in lower case (d:/apps/app-name/etc).
The problem got fixed once I updated the directory change instruction to use a capital D (D:/apps/app-name/etc).
If the issue still persists after verifying that the TypeORM version is the same in both packages (the external package and the consumer repository), as mentioned by @InsOp, then the issue could be this:
Basically, when we create an external package, TypeORM tries to get the "default" connection option, but if it is not found it throws an error:
ConnectionNotFoundError: Connection "default" was not found.
We can solve this issue by doing a sanity check before establishing a connection; luckily getConnectionManager() has a .has() method.
import { Connection, getConnectionManager, getConnectionOptions,
createConnection, getConnection, QueryRunner } from 'typeorm';
async init() {
let connection: Connection;
let queryRunner: QueryRunner;
if (!getConnectionManager().has('default')) {
const connectionOptions = await getConnectionOptions();
connection = await createConnection(connectionOptions);
} else {
connection = getConnection();
}
queryRunner = connection.createQueryRunner();
}
Above is a quick code snippet that addresses the actual root cause of this issue, but if you are interested in complete working repositories (different examples):
External NPM Package :
Git Repo : git-unit-of-work (specific file- src/providers/typeorm/typeorm-uow.ts)
Published in NPM : npm-unit-of-work
Consumer of above package : nest-typeorm-postgre (specific files- package.json, src/countries/countries.service.ts & countries.module.ts)
In my case the problem was that I had an array of multiple connections instead of just one. You have 2 alternatives.
Have at least one connection named default, for example:
createConnections([
{
name: 'default',
type: 'mysql',
host: 'localhost',
port: 3306,
username: 'root',
password: 'root',
database: 'users',
entities: [`${__dirname}/entity/*{.js,.ts}`],
synchronize: true,
logging: true
}
]);
Or be specific when using the connection:
import {getConnection} from "typeorm";
const db1Connection = getConnection("db1Connection");
// you can work with "db1" database now...
I had this same problem with the following code:
import { HttpException, Inject, NotFoundException } from "@nestjs/common";
import { Not } from "typeorm";
import { Transactional } from "typeorm-transactional-cls-hooked";
import { TENANT_CONNECTION } from "../tenant/tenant.module";
import {Feriados} from './feriados.entity';
export class FeriadosService {
repository: any;
constructor(
@Inject(TENANT_CONNECTION) private connection)
{
this.repository = connection.getRepository(Feriados)
}
@Transactional()
async agregar(tablaNueva: Feriados): Promise<Number> {
const tablaAGuardar = await this.repository.create(tablaNueva)
return await this.guardar(tablaAGuardar)
}
@Transactional()
async actualizar(tablaActualizada: Feriados): Promise<Number>{
const tablaAGuardar = await this.repository.merge(tablaActualizada);
return await this.guardar(tablaAGuardar)
}
async guardar(tabla:Feriados){
await this.repository.save(tabla)
return tabla.id
}
}
I fixed it by removing the two @Transactional() decorators.
I hope this helps someone.
In TypeORM v0.3 the Connection API was replaced by the DataSource API. NestJS adopted this change as well, so if you relied on the old API (e.g. the getConnection method) you might see the Connection "default" was not found error.
You can read about the changes and the new API in the release notes: https://github.com/typeorm/typeorm/releases/tag/0.3.0
If you used getConnection you can use app.get(DataSource) instead.
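For example, in a NestJS bootstrap (or an e2e test) this might look like the following sketch; User is a placeholder entity:

import { NestFactory } from '@nestjs/core';
import { DataSource } from 'typeorm';
import { AppModule } from './app.module';
import { User } from './user.entity'; // placeholder entity

async function bootstrap() {
  const app = await NestFactory.create(AppModule);

  // replaces the old getConnection(): the DataSource is resolved from the DI container
  const dataSource = app.get(DataSource);
  const userRepository = dataSource.getRepository(User);
  console.log(await userRepository.count());

  await app.listen(3000);
}
bootstrap();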
In the new version of TypeORM (0.3.7), a solution to this problem is as follows:
In app.module.ts, add a constructor to the AppModule class and create a method that returns the DataSource:
import { DataSource } from 'typeorm';

export class AppModule {
constructor(private dataSource: DataSource) {}
getDataSource() {
return this.dataSource;
}
}
Then, in the file where you need it, add:
const repository = app
.get(AppModule)
.getDataSource()
.getRepository('Entity_name');
Although Saras Arya has provided the correct answer, I encountered the same error
ConnectionNotFoundError: Connection "default" was not found.
because my TypeORM entity had an @Entity() decorator and also extended BaseEntity.
The two can't live together.

Apollo Server studio not picking up schema from apollo set up

My stack is:
Apollo Server,
graphql,
prisma,
nextjs
I have added a resolver.ts and schema.ts for my graphql config under /graphql
resolver.ts
export const resolvers = {
Query: {
books: () => books,
},
};
const books = [
{
title: 'The Awakening',
author: 'Kate Chopin',
},
{
title: 'City of Glass',
author: 'Paul Auster',
},
];
schema.ts
import { gql } from "apollo-server-micro";
export const typeDefs = gql`
# This "Book" type defines the queryable fields for every book in our data source.
type Book {
title: String
author: String
}
# The "Query" type is special: it lists all of the available queries that
# clients can execute, along with the return type for each. In this
# case, the "books" query returns an array of zero or more Books (defined above).
type Query {
books: [Book]
}
`;
/pages/api/graphql.ts
// Next.js API route support: https://nextjs.org/docs/api-routes/introduction
import { ApolloServer } from 'apollo-server-micro';
import { typeDefs } from '../../graphql/schema';
import { resolvers } from '../../graphql/resolver';
const apolloServer = new ApolloServer ({typeDefs, resolvers});
const startServer = apolloServer.start();
export default async function handler(req, res) {
res.setHeader('Access-Control-Allow-Credentials', 'true');
res.setHeader(
'Access-Control-Allow-Origin',
'https://studio.apollographql.com'
);
res.setHeader(
'Access-Control-Allow-Headers',
'Origin, X-Requested-With, Content-Type, Accept'
);
if (req.method === 'OPTIONS') {
res.end();
return false;
}
await startServer;
await apolloServer.createHandler({
path: "/api/graphql",
})(req, res);
}
export const config = {
api: {
bodyParser: false
}
}
When I navigate to my API endpoint /api/graphql it takes me to the Apollo Studio explorer, but it's not picking up the endpoint or the schema. The errors in dev tools seem to come from Studio libraries specifically, and they don't seem very helpful:
StaleWhileRevalidate.js:112 Uncaught (in promise) no-response: no-response :: [{"url":"https://cdn.segment.com/analytics.js/v1/xPczztcxJ39mG3oX3wle6XlgpwJ62XAA/analytics.min.js"}]
at O._handle (https://studio.apollographql.com/service-worker.js:2:71211)
at async O._getResponse (https://studio.apollographql.com/service-worker.js:2:47966)
_handle # StaleWhileRevalidate.js:112
useTelemetryInitializer.ts:174 GET https://cdn.segment.com/analytics.js/v1/xPczztcxJ39mG3oX3wle6XlgpwJ62XAA/analytics.min.js net::ERR_FAILED
I don't think it's anything to do with Prisma, as all I have done is set up a PostgreSQL db and defined some basic schema. I don't see why Studio is not picking up my endpoint; it doesn't seem to be CORS related, as I'm not getting cross-origin errors.
Studio screenshot:
I'm sorry if this is too late, but the one thing that fixed this for me was clearing the site's storage in the browser.

How to store files in production in Next.js?

I have a file exchange Next.js app that I would like to deploy. In development, whenever a file is dropped, the app stores it in the root public folder, and when the file is downloaded the app takes it from there as well, using the <a> tag with an href attribute of uploads/{filename}. This all works pretty well in development, but not in production.
I know that whenever npm run build is run, Next.js takes the files from the public folder, and files added there at runtime will not be served.
The question is are there any ways of persistent file storage in Next.js apart from third party services like AWS S3?
Next.js does allow file storage at build time, but not at runtime, so Next.js will not be able to fulfill your file upload requirement. AWS S3 is the best option here.
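For example, a common pattern on Vercel is to hand out a presigned S3 upload URL from an API route and let the browser PUT the file directly to the bucket (a sketch; the route name and the bucket/region env vars are assumptions):

// pages/api/upload-url.ts (sketch): return a presigned S3 URL the client can upload to
import type { NextApiRequest, NextApiResponse } from 'next';
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';

const s3 = new S3Client({ region: process.env.AWS_REGION });

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  const filename = req.query.filename as string | undefined;
  if (!filename) {
    return res.status(400).json({ message: 'filename query param required' });
  }

  // assumption: bucket name comes from an env var; adjust to your setup
  const command = new PutObjectCommand({ Bucket: process.env.S3_BUCKET, Key: filename });
  const url = await getSignedUrl(s3, command, { expiresIn: 60 }); // valid for 60 seconds

  res.status(200).json({ url });
}

The client then uploads the file with a plain fetch(url, { method: 'PUT', body: file }) and later downloads it from S3 (or a CDN in front of it) instead of the public folder.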
Next.js API routes run in Node.js, so if your cloud provider lets you create a persistent disk and mount it to your project, then you can do it:
e.g. pages/api/saveFile.ts:
import { writeFileSync } from 'fs';
import { NextApiRequest, NextApiResponse } from 'next';
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
const { path = null } = req.query;
if (!path) {
res.status(400).json({ name: 'no path provided' })
} else {
// your file content here
const content = Date.now().toString();
writeFileSync(`/tmp/${path}.txt`, content);
res.json({
path,
content
})
}
}
/tmp works in almost every cloud provider (including Vercel itself), but those files will be lost on the next deployment; instead you should use your mounted disk path.
pages/api/readFile.ts
import { readFileSync } from 'fs';
import { NextApiRequest, NextApiResponse } from 'next';
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
const { path = '' } = req.query;
if (!path) {
res.status(400).json({ name: 'wrong' })
} else {
res.send(readFileSync(`/tmp/${path}.txt`));
}
}
Live running example:
fetch('https://eaglesdoc.vercel.app/api/writefile?path=test')
.then(res => res.text()).then(res => console.log(res)).then( () =>
fetch('https://eaglesdoc.vercel.app/api/readfile?path=test'))
.then(res => res.text()).then(res => console.log(res))

How do I stub a function that is not directly passed to the calling function?

I have an express app with API endpoints secured using JWT token. I have a method that verifies a received token.
// authentication.js
import jwt from 'jsonwebtoken';
import Settings from '../settings';
const AuthenticationMiddleware = {
verifyToken: (req, res, next) => {
const token = req.headers['x-access-token'];
if (!token) {
const msg = 'Include a valid token in the x-access-token header';
return res.status(422).json({
error: 'No token provided',
msg
});
}
try {
req.user = jwt.verify(token, Settings.jwtSecret);
req.token = token;
return next();
}
catch (e) {
return res.status(422).json({ error: 'Invalid token' });
}
}
};
export default AuthenticationMiddleware;
This works fine when I call the API endpoints from postman with the token header included.
Now I have a test such as the one shown below. There are about 40 of them, each requiring a token to be sent with each API request.
// should is not used directly in the file but is added as a mocha requirement
import supertest from 'supertest';
import app from '../app';
const server = supertest.agent(app);
const BASE_URL = '/api/v1';
describe('/loans: Get all loans', () => {
it('should return a list of all loans', done => {
server
.get(`${BASE_URL}/loans`)
.expect(200)
.end((err, res) => {
res.status.should.equal(200);
res.body.data.should.be.an.instanceOf(Array);
for (const each of res.body.data) {
each.should.have.property('id');
each.should.have.property('userid');
}
done();
});
});
});
I've looked at sinon and tried stubbing the verifyToken function in mocha's before hook like so
import sinon from 'sinon';
import AuthenticationMiddleware from '../middleware/authentication';
before(() => {
const stub = sinon.stub(AuthenticationMiddleware, 'verifyToken');
stub.returnsThis()
});
But I can already see a problem here. While the verifyToken stub may have been created, it is NOT used during the test. The verifyToken that is being called during the test is passed as middleware from the route like so
router.get('/loans', AuthenticationMiddleware.verifyToken, LoansController.get_all_loans);
I want a way to stub verifyToken during the test so that I can just return next() immediately.
My question is, is it possible to stub AuthenticationMiddleware.verifyToken universally during the test so that all calls to the API endpoint call the stubbed version?
According to these two posts, Sinon stub being skipped as node express middleware and How to mock middleware in Express to skip authentication for unit test?, the reason for my stub not being active was that the app was being imported and cached before the stub is even created, so the app uses the one it has cached.
So the solution was to alter the required function before the app gets a chance to cache it. What I did (I stumbled upon it by trial and error) was to create a file in my test folder called stubs.js; here's the content:
import sinon from 'sinon';
import AuthenticationMiddleware from '../middleware/authentication';
sinon.stub(AuthenticationMiddleware, 'verifyToken').callsFake(
(req, res, next) => next()
);
Then I require this file in my test runner in package.json like so
"scripts": {
"test": "nyc --reporter=html --reporter=text --reporter=lcov mocha -r #babel/register -r should -r test/stubs.js"
},
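Since the stub is created through Sinon's default sandbox, it can also be undone once the whole run is finished, in case a later suite needs the real middleware (a sketch; place it in a test file or a root hook):

import sinon from 'sinon';

after(() => {
  // restores AuthenticationMiddleware.verifyToken (and any other default-sandbox fakes)
  sinon.restore();
});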
