Is this a Vercel bug? Cannot find module './model' - javascript

I have 2 files:
controller.js
model.js
I'm making an express.js app.
So model.js is required inside controller.js, but I get an error when I call my API.
The logs show: Cannot find module './model'
But here is the problem: ./model.js really does exist; Vercel just doesn't resolve it.
In local development it is required correctly and everything works fine.
This is model.js:
const { nanoid } = require("nanoid");

const getDateStr = () => {
  const now = new Date();
  return `${now.getFullYear()}-${now.getMonth() + 1}-${now.getDate()}`;
};

const Purchase = ({ id_user_buyer, id_user_seller, id_product, quantity }) => ({
  id_user_buyer,
  id_user_seller,
  id_product,
  quantity,
  id_purchase: nanoid(),
  date: getDateStr(),
});

module.exports = {
  Purchase,
};
And this is controller.js:
const err = require("../../../utils/error");
const { Purchase } = require("./model");
// the other modules are imported fine, so the problem is only './model'
const userController = require("../user");
const productController = require("../product");
const cartController = require("../cart");

const TABLE = 'purchase';

function purchaseController(injectedStore) {
  // example code
  async function makePurchase(data) {
    const purchase = Purchase(data);
    await injectedStore.insert(TABLE, purchase);
  }

  return {
    makePurchase,
  };
}

module.exports = purchaseController;
As you can see, model.js is imported correctly inside controller.js, so I don't know why Vercel says ERROR Cannot find module './model'. Again: it works fine in local development, but not on Vercel.
A quick fix is to copy and paste all the code of model.js into controller.js. I tried that, deployed it, and it worked.
The whole app also works fine if I just comment out the line that imports ./model, but then the application obviously loses that functionality. So the first workaround is ugly but works; still, neither of these is a real solution. The real fix is for the file to be imported correctly.
Curious fact: I tried renaming the file and it didn't help either. It also doesn't work if I import a new file.
NOTE: I changed the Node.js version from 12 to 14; could that have something to do with it?
Just in case, here is my folder structure:
root
api
This is my vercel.json:
{
  "version": 2,
  "builds": [
    {
      "src": "/api/index.js",
      "use": "@vercel/node"
    }
  ],
  "routes": [
    {
      "src": "/api/auth(.*)",
      "dest": "/api/index.js"
    },
    {
      "src": "/api/users(.*)",
      "dest": "/api/index.js"
    },
    {
      "src": "/api/products(.*)",
      "dest": "/api/index.js"
    },
    {
      "src": "/api/cart(.*)",
      "dest": "/api/index.js"
    },
    {
      "src": "/api/purchases(.*)",
      "dest": "/api/index.js"
    },
    {
      "src": "/api/sales(.*)",
      "dest": "/api/index.js"
    }
  ]
}
I don't know if it's a Vercel bug or a mistake on my part. My application currently works, but only with the trick I mentioned before: putting all the code of model.js inside controller.js.
Thanks for reading the whole issue.

Well, I don't know why, but I think it was a Vercel quirk; the solution to my problem was to stop using optional chaining.
controller.js has more code than shown above and I was using optional chaining there; I just rewrote that logic and it works fine now. Vercel supports optional chaining on Node 14.x and I was using Node 12.x, so I switched to 14.x. That fixed the other files, but not controller.js.
So, if you are using newer JavaScript features such as optional chaining or the nullish coalescing operator while your Vercel project is on Node 12.x, that will produce errors. And switching an existing project to Node 14.x afterwards can give you bugs too.
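For example, the same check written with and without optional chaining (a generic illustration, not the actual code from controller.js):
// Needs Node 14+ (optional chaining and nullish coalescing):
const user = { address: { city: "Lima" } };
const city = user?.address?.city ?? "unknown";

// Equivalent logic that also runs on Node 12:
const cityOld = (user && user.address && user.address.city != null) ? user.address.city : "unknown";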
So the best you can do is make sure from the beginning that your Vercel project is set to Node.js 14.x.
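As far as I know, Vercel picks the Node.js version from the engines field in package.json (it can also be set in the project settings), so pinning it looks roughly like this:
// package.json (excerpt)
{
  "engines": {
    "node": "14.x"
  }
}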
how to change node version in vercel

Related

Module not found when requiring library submodule in Jest tests

I am currently trying to write some tests for a node-based GCP Cloud Functions application.
At this point I've stripped it down to the bare minimum:
// index.js
const functions = require("@google-cloud/functions-framework");
const testing = require('@google-cloud/functions-framework/testing');

functions.http("updateProvider", (req, res) => {
  res.send("OK");
});
My test file follows the sample here:
// index.spec.js
const { getFunction } = require('@google-cloud/functions-framework/testing');
require('../../');

describe("HelloTests", () => {
  test("is testable", () => {
  });
});
When I run jest I get the following error:
Cannot find module '@google-cloud/functions-framework/testing' from 'spec/unit/index.spec.js'
Some additional observations:
If I put that import statement into index.js and run the application, it imports just fine.
If I comment out the import statement from index.spec.js but leave it in index.js and run jest, I get the same error for the import in index.js.
This leads me to assume that Jest is not properly handling submodules. I've never worked with submodules like this before (that I can remember), so I'm at a complete loss. I did some digging and this is from the functions-framework node module's package.json:
"exports": {
".": {
"types": "./build/src/index.d.ts",
"default": "./build/src/index.js"
},
"./testing": {
"types": "./build/src/testing.d.ts",
"default": "./build/src/testing.js"
}
},
No idea if this is relevant but wanted to include it in case it's useful.
Any idea why I'm getting this error and/or how to resolve it without switching to ESM?
Update: I switched to ESM and get the exact same error.
This apparently got fixed earlier this year:
Issue: https://github.com/facebook/jest/issues/9771
Initial Release: https://github.com/facebook/jest/releases/tag/v28.0.0-alpha.3
I had copied an older (but still fairly recent!) package.json that was stuck on v27 so it wasn't picking up the latest library. Did a clean install and confirmed no further issue with at least v29.3.1.
Rookie mistake.
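In practical terms, the fix is just making sure the project actually resolves a Jest version that understands the exports map (v28 or later); something like this in package.json, followed by a clean install (the version number is only an example):
// package.json (excerpt)
{
  "devDependencies": {
    "jest": "^29.3.1"
  }
}
Then delete node_modules and the lockfile and run npm install (or yarn install) so the resolved version actually changes.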

Next.js (React) - Can't import local typescript file into config file

Situation
I would like to run some database code (MongoDB / mongoose) on server startup / during builds. Since Next.js doesn't have any lifecycle hooks you can easily hook into, I was trying to perform the database actions in my webpack configuration (next.config.mjs). However, I ran into problems importing local files.
Current setup
This is the code of my current next.config.mjs file. (PS. I have also tried the CommonJS way of requiring the needed files, but that also fails with the error message "module not found".)
None of the lines that import a local TypeScript file succeed, and I have checked the paths multiple times. They always fail with the error "ERR_MODULE_NOT_FOUND". Only imports of node_modules packages (the mongoose npm package) work as expected.
Code
/** @type {import('next').NextConfig} */
const { EmployeesSchema } = await import("./mongodb_schemas/employee_schema");
import { EmployeesSchema } from "./mongodb_schemas/employee_schema";
import "./util/test";
import mongoose from "mongoose";

const nextConfig = {
  experimental: {
    externalDir: true,
  },
  reactStrictMode: true,
  swcMinify: true,
  images: {
    domains: ["*", "**", "www.google.com"],
  },
  webpack: (
    config,
    { buildId, dev, isServer, defaultLoaders, nextRuntime, webpack }
  ) => {
    if (isServer) {
      console.log(process.cwd());
    }
    return config;
  },
};

export default nextConfig;
Does anyone have a clue why this might be happening, or any possible solutions? I have also tried a plain JavaScript file instead of a TypeScript file, which didn't work either. I have found some similar questions on Stack Overflow, but they were all left unanswered.
My guess for why this occurs: during the build of the project, i.e. when "npm run dev" is run, next.config.mjs is copied to a different location in the file structure, which means the relative paths are no longer correct and the files can't be found.
PS. My apologies if the question is unclear or in an unusual format; it is my first post, so I'm not used to it.
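If the relative-path theory is right, one way to test it is to resolve the import against the config file's own location instead of the working directory; a rough sketch (the ./util/test.js path and the assumption that a compiled .js file exists are mine, not from the question):
// next.config.mjs (sketch): build an absolute file:// URL from this file's location
const testModuleUrl = new URL("./util/test.js", import.meta.url);
const testModule = await import(testModuleUrl.href);
Note this only addresses path resolution; Node still cannot load a raw .ts file from an .mjs config without compiling it to JavaScript first.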

expo sqlite use existing database

According to the expo sqlite documentation for react-native I can initialize a db like so:
const db = SQLite.openDatabase('db.db');
This works and I can update the db like so:
update() {
  db.transaction(tx => {
    tx.executeSql(
      `select * from items where done = ?;`,
      [this.props.done ? 1 : 0],
      (_, { rows: { _array } }) => this.setState({ items: _array })
    );
  });
}
From my limited understanding, this creates a database on the device, which is then manipulated while keeping all the data local.
I have a database with all the necessary tables already setup. How can I have it use the current database I already have setup?
For example: (not correct syntax)
const db = SQLite.openDatabaseIShipWithApp('mypath/mydb.db');
I couldn't find any documentation to help me with this.
The only reason I mention the above is because I already have the db with the tables and data.
Any help would be appreciated!
I was able to achieve this by using expo's FileSystem.downloadAsync:
first I import it since I'm using expo managed app:
import { FileSystem } from 'expo';
Then I download it from a server like so:
// load DB for expo
FileSystem.downloadAsync(
  'http://example.com/downloads/data.sqlite',
  FileSystem.documentDirectory + 'data.sqlite'
)
  .then(({ uri }) => {
    console.log('Finished downloading to ', uri);
  })
  .catch(error => {
    console.error(error);
  });
The first parameter is the uri for the location, the second one is where I'd like to place it. Here I am using documentDirectory.
If using local prepopulated database in assets:
import * as FileSystem from "expo-file-system";
import {Asset} from "expo-asset";
async function openDatabaseIShipWithApp() {
  const internalDbName = "dbInStorage.sqlite"; // Call it whatever you want
  const sqlDir = FileSystem.documentDirectory + "SQLite/";
  if (!(await FileSystem.getInfoAsync(sqlDir + internalDbName)).exists) {
    await FileSystem.makeDirectoryAsync(sqlDir, { intermediates: true });
    const asset = Asset.fromModule(require("../assets/database/mydb.sqlite"));
    await FileSystem.downloadAsync(asset.uri, sqlDir + internalDbName);
  }
  this.database = SQLite.openDatabase(internalDbName);
}
This creates the SQLite directory and the database if they don't exist yet. Otherwise FileSystem.downloadAsync() would throw an error on a freshly installed app.
Some remarks:
You cannot use a variable in require() (only a string literal). See e.g. this.
You have to explicitly allow the file extensions .db or .sqlite to be loadable in Expo, see this. You have to create a file metro.config.js in the project root:
const defaultAssetExts = require("metro-config/src/defaults/defaults").assetExts;

module.exports = {
  resolver: {
    assetExts: [
      ...defaultAssetExts,
      "db", "sqlite"
    ]
  }
};
And you may add the following to app.json:
"expo": {
"assetBundlePatterns": [
"**/*"
]
}
If you want to delete the loaded database (e.g. for testing), you have to clear the whole Expo app data in the phone settings (deleting the cache is not sufficient), or write a method like this:
async function removeDatabase() {
  const sqlDir = FileSystem.documentDirectory + "SQLite/";
  await FileSystem.deleteAsync(sqlDir + "dbInStorage.sqlite", { idempotent: true });
}
It's pretty straightforward.
If you bundle your app, you have to move the database from the asset folder to the document directory first. To do that, check whether a folder named SQLite exists; if not, create it. Why do you need a folder called SQLite? Because SQLite.openDatabase(databaseName) looks by default in FileSystem.documentDirectory + 'SQLite'. Once the folder exists, you can download the database from the asset folder. Make sure your database is in a folder called asset; locate the asset folder under src/asset in your app's directory tree. Also make sure to configure your app.json and metro.config.js.
import * as SQLite from 'expo-sqlite';
import * as FileSystem from 'expo-file-system';
import { Asset } from 'expo-asset';

const FOO = 'foo.db';

if (!(await FileSystem.getInfoAsync(FileSystem.documentDirectory + 'SQLite')).exists) {
  await FileSystem.makeDirectoryAsync(FileSystem.documentDirectory + 'SQLite');
}

await FileSystem.downloadAsync(
  // the name 'foo.db' is hardcoded because it is used with require()
  Asset.fromModule(require('../../asset/foo.db')).uri,
  // use the FOO constant to access 'foo.db' wherever possible
  FileSystem.documentDirectory + `SQLite/${FOO}`
);

// Then you can use the database like this
SQLite.openDatabase(FOO).transaction(...);
// app.json
{
  "name": "Your App name",
  "displayName": "Your App name",
  "assetBundlePatterns": [
    "assets/**"
  ],
  "packagerOpts": {
    "assetExts": ["db"]
  }
}
// metro.config.js
const { getDefaultConfig } = require('@expo/metro-config');

const defaultConfig = getDefaultConfig(__dirname);

module.exports = {
  resolver: {
    assetExts: [...defaultConfig.resolver.assetExts, 'db', 'json'],
  },
  transformer: {
    getTransformOptions: async () => ({
      transform: {
        experimentalImportSupport: false,
        inlineRequires: false,
      },
    }),
  },
};
This is all extracted from the documentation of expo.
I don't believe this is possible in Expo. There is a way to use an existing database if you are using a bare Android project; it involves writing some native code to copy the database from the project assets to the standard location on the phone (/data/ etc.) for your application.
https://medium.com/@johann.pardanaud/ship-an-android-app-with-a-pre-populated-database-cd2b3aa3311f
I have always personally created the database myself with CREATE TABLE IF NOT EXISTS since sqlite requires you to define the schema before you query it. If you need to seed the database, this step would then be followed by additional steps to insert the required data.
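For illustration, here is a minimal sketch of that approach using the same expo-sqlite API as in the question (the items table and its columns are just an example):
import * as SQLite from 'expo-sqlite';

const db = SQLite.openDatabase('db.db');

// Runs on every startup; creates the schema only if it is missing.
db.transaction(tx => {
  tx.executeSql(
    'create table if not exists items (id integer primary key not null, done int, value text);'
  );
  // Seed data could be inserted here with further executeSql calls.
});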
In most cases you will also need to check your reference data and update it from the server at regular intervals (it might even change between publishing your APK and someone downloading the app), and this code also works when there is no reference data in the database yet.
There are a couple of services which try to take this hassle away from you (e.g. Parse), but you will need to decide whether you are happy with them hosting your data. I haven't used them, so I'm not sure exactly how this works, but I'm told they try to solve these offline-first problems.
Remember that in future iterations you may need to modify the structure (to add fields, etc.), so you will probably need code that runs when the application first starts, checks the database version, and applies any changes required to bring the database up to the appropriate level, as sketched below.
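A rough sketch of that idea, tracking the schema version in a small table (the meta table, the version numbers, and the example ALTER statement are assumptions, not part of the original answer):
const SCHEMA_VERSION = 2;

function migrate(db) {
  db.transaction(tx => {
    // Single-row table holding the current schema version.
    tx.executeSql('create table if not exists meta (version int not null);');
    tx.executeSql('select version from meta limit 1;', [], (_, { rows }) => {
      const current = rows.length ? rows._array[0].version : 0;
      if (!rows.length) {
        tx.executeSql('insert into meta (version) values (0);');
      }
      if (current < 2) {
        // Example migration introduced in schema version 2.
        tx.executeSql('alter table items add column note text;');
      }
      tx.executeSql('update meta set version = ?;', [SCHEMA_VERSION]);
    });
  });
}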

Vue cli 3 display info from the package.json

In a vue cli 3 project I want to display a version number in the webpage. The version number lies in the package.json file.
The only reference to this that I found is this link in the vue forum.
However, I can't get the proposed solution to work.
Things I tried
Use webpack.definePlugin as in the linked resource:
vue.config.js
const webpack = require('webpack');

module.exports = {
  lintOnSave: true,
  configureWebpack: config => {
    return {
      plugins: [
        new webpack.DefinePlugin({
          'process.env': {
            VERSION: require('./package.json').version,
          }
        })
      ]
    }
  },
}
Then in main.ts I read process.env, but it does not contain VERSION (I tried several variants of this, like generating a PACKAGE_JSON field as in the linked page, and using plain values like 'foo' instead of reading from package.json). It never worked; it is as if the code were being ignored. I guess process.env is being redefined later by Vue's webpack setup.
The process logged in main.ts does contain, however, all the stuff that process usually contains in a vue-cli project, like the mode and the VUE_APP variables defined in .env files.
Try to write to process directly in the configureWebpack function,
like:
configureWebpack: config => {
process.VERSION = require('./package.json').version
},
(to be honest I did not have much hope with this, but had to try).
Tried the other solution proposed in the linked page,
like:
// vue.config.js
module.exports = {
  chainWebpack: config => {
    config.plugin('define').tap(([options = {}]) => {
      return [{
        ...options, // these are the env variables from your .env file, if any are defined
        VERSION: JSON.stringify(require('./package.json').version)
      }]
    })
  }
}
But this fails silently too.
Use the config.plugin('define') syntax suggested by @Oluwafemi,
like:
chainWebpack: (config) => {
  return config.plugin('define').tap(
    args => merge(args, [VERSION])
  )
},
Where VERSION is a local variable defined as:
const pkgVersion = require('./package.json').version;

const VERSION = {
  'process.env': {
    VUE_APP_VERSION: JSON.stringify(pkgVersion)
  }
}
But this is not working either.
I am re-starting the whole project everytime, so that's not the reason why the process stuff does not show up.
My vue-cli version is 3.0.1.
I am adding my 2 cents, as I found a shorter way and apparently the right way (https://docs.npmjs.com/misc/scripts#packagejson-vars)
Add this in your vue.config.js file before the export, not inside it:
process.env.VUE_APP_VERSION = process.env.npm_package_version
And voilà!
You can then use it from a component with process.env.VUE_APP_VERSION
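For example, to show it in a component (a minimal sketch; the component and property names are arbitrary):
// AppVersion.vue, script section
export default {
  computed: {
    appVersion() {
      // Inlined at build time because the variable is prefixed with VUE_APP_
      return process.env.VUE_APP_VERSION;
    },
  },
};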
TLDR
The following snippet in the vue.config.js file will do the trick, and will allow you to access the version of your app as APPLICATION_VERSION:
const webpack = require('webpack');

module.exports = {
  configureWebpack: config => {
    return {
      plugins: [
        new webpack.DefinePlugin({
          'APPLICATION_VERSION': JSON.stringify(require('./package.json').version),
        })
      ]
    }
  },
}
TIP:
Don't even try to add some key to process.env via webpack.definePlugin: it won't work as you probably expect.
 Why my previous efforts did not work
In the end, I solved the issue via webpack.DefinePlugin. The main issue I had was that the original solution I found used DefinePlugin to write to a process.env.PACKAGE_JSON variable.
This suggests that DefinePlugin somehow allows you to add variables to process or process.env, which is not the case. Whenever I logged process.env in the console, I didn't find the variables I was trying to push into process.env, so I thought the DefinePlugin technique was not working.
Actually, what webpack.DefinePlugin does is look for the given strings at compile time and replace them with their values right in your code. So, if you define an ACME_VERSION variable via:
module.exports = {
  lintOnSave: true,
  configureWebpack: config => {
    return {
      plugins: [
        new webpack.DefinePlugin({
          'ACME_VERSION': 111,
        })
      ]
    }
  },
}
and then print console.log(ACME_VERSION) in main.js, you will get 111 properly logged.
Now, however, this is just a string replacement at compile time. If instead of ACME_VERSION you try to define process.env.VUE_APP_ACME_VERSION...
when you log process.env, the VUE_APP_ACME_VERSION key won't show up in the object. However, a direct console.log(process.env.VUE_APP_ACME_VERSION) will yield 111 as expected.
So, basically, the original link and the proposed solutions were correct to some degree. However, nothing was really being added to the process object. I was logging process.env during my initial tries, so I didn't see anything working.
Now, since the process object is not actually being modified, I strongly suggest that anyone trying to load variables into their Vue app at compile time not use it this way. It is misleading at best.
You can simply import your package.json file and use its variables.
import { version } from "../../package.json";
console.log(version)
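If the project is TypeScript (the question imports from main.ts), this direct JSON import may also need resolveJsonModule enabled; a tsconfig excerpt, assuming a fairly standard setup:
// tsconfig.json (excerpt)
{
  "compilerOptions": {
    "resolveJsonModule": true,
    "esModuleInterop": true
  }
}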
If you are using Webpack, you can inject the variable in the following way:
// webpack.config.js
plugins: [
  new webpack.DefinePlugin({
    VERSION: JSON.stringify(require("./package.json").version)
  })
]

// any-module.js
console.log("Version: " + VERSION);
https://github.com/webpack/webpack/issues/237
When building the Vue app, environment variables that don't begin with the VUE_APP_ prefix are filtered out. NODE_ENV and BASE_URL environment variables are the exception.
The above information applies when the environment variables are set prior to building the Vue app and not in this situation.
In a situation where environment variables are set during the build, it's important to look at what Vue CLI is doing.
The Vue CLI uses webpack.DefinePlugin to set environment variables using the object returned from the call to resolveClientEnv.
resolveClientEnv returns
{
  'process.env': {}
}
This means that when configuring your environment variables at build time, you need a way to merge with the existing ones.
You need to perform a deep merge of both arrays, so that the value for the process.env key is an object containing the keys from the resolved client environment as well as your own keys.
The chainWebpack key in the default export of vue.config.js is just one of the ways to get this done.
The arguments used to initialize DefinePlugin can be merged with the new environment variables you would like to configure, using the underlying webpack-chain API. Here is an example:
// vue.config.js
const merge = require('deepmerge');
const pkgVersion = require('./package.json').version;

const VERSION = {
  'process.env': {
    VERSION: JSON.stringify(pkgVersion)
  }
}

module.exports = {
  chainWebpack: config =>
    config
      .plugin('define')
      .tap(
        args => merge(args, [VERSION])
      )
}
Your initial attempt was fine, you were just missing the JSON.stringify part:
const webpack = require('webpack');

module.exports = {
  configureWebpack: config => {
    return {
      plugins: [
        new webpack.DefinePlugin({
          'process.env': {
            VERSION: JSON.stringify(require('./package.json').version),
          }
        })
      ]
    }
  },
}
Edit: although the webpack docs recommend the 'process.env.VERSION' way (yellow panel):
new webpack.DefinePlugin({
'process.env.VERSION': JSON.stringify(require('./package.json').version),
}),
Official solutions tend to be more reliable Modes and Environment Variables | Vue CLI
TIP
You can have computed env vars in your vue.config.js file. They still need to be prefixed with VUE_APP_. This is useful for version info
process.env.VUE_APP_VERSION = require('./package.json').version
module.exports = {
// config
}
I attempted the accepted answer and had errors. However, in the Vue docs I was able to find an answer similar (but not identical) to @antoni's answer.
In short, just have the following in vue.config.js:
process.env.VUE_APP_VERSION = require('./package.json').version
module.exports = {
// config
}
Docs 2020-10-27:
You can access env variables in your application code:
console.log(process.env.VUE_APP_NOT_SECRET_CODE)
During build, process.env.VUE_APP_NOT_SECRET_CODE will be replaced by the corresponding value. In the case of VUE_APP_NOT_SECRET_CODE=some_value, it will be replaced by "some_value".
In addition to VUE_APP_* variables, there are also two special variables that will always be available in your app code:
NODE_ENV - this will be one of "development", "production" or "test" depending on the mode the app is running in.
BASE_URL - this corresponds to the publicPath option in vue.config.js and is the base path your app is deployed at.
The answer for this on the official VueJS forum is like so:
chainWebpack: config => {
  config
    .plugin('define')
    .tap(args => {
      let v = JSON.stringify(require('./package.json').version)
      args[0]['process.env']['VERSION'] = v
      return args
    })
}
Description
Add this line to your vue.config.js file
module.exports = {
  ...
  chainWebpack: config => {
    config
      .plugin('define')
      .tap(args => {
        let v = JSON.stringify(require('./package.json').version)
        args[0]['process.env']['VERSION'] = v
        return args
      })
  }
};
Then you can use this in your vue files like so:
version: function () {
return process.env.VERSION
}
A one liner alternative:
//script tag
let packageJsonInfo = require("../../package.json");
//Then you can for example, get the version no
packageJsonInfo.version

How can I mock Webpack's require.context in Jest?

Suppose I have the following module:
var modulesReq = require.context('.', false, /\.js$/);
modulesReq.keys().forEach(function(module) {
  modulesReq(module);
});
Jest complains because it doesn't know about require.context:
FAIL /foo/bar.spec.js (0s)
● Runtime Error
- TypeError: require.context is not a function
How can I mock it? I tried using the setupTestFrameworkScriptFile Jest configuration, but the tests can't see any changes that I've made to require.
I had the same problem, and then I came up with a 'solution'.
I'm pretty sure this is not the best choice. I ended up no longer using it, for the reasons discussed here:
https://github.com/facebookincubator/create-react-app/issues/517
https://github.com/facebook/jest/issues/2298
But if you really need it, you should include the polyfill below in every file where you call it (not in the test file itself, because require will not be globally overridden in a Node environment).
// This condition should detect whether we are running in a Node environment
if (typeof require.context === 'undefined') {
  const fs = require('fs');
  const path = require('path');

  require.context = (base = '.', scanSubDirectories = false, regularExpression = /\.js$/) => {
    const files = {};

    function readDirectory(directory) {
      fs.readdirSync(directory).forEach((file) => {
        const fullPath = path.resolve(directory, file);
        if (fs.statSync(fullPath).isDirectory()) {
          if (scanSubDirectories) readDirectory(fullPath);
          return;
        }
        if (!regularExpression.test(fullPath)) return;
        files[fullPath] = true;
      });
    }

    readDirectory(path.resolve(__dirname, base));

    function Module(file) {
      return require(file);
    }

    Module.keys = () => Object.keys(files);

    return Module;
  };
}
With this function you don't need to change any require.context call; it will execute with the same behavior as before (under webpack it will just use the original implementation, and under Jest it will use the polyfill).
After spending some hours trying each of the answers above, I would like to contribute.
Adding the babel-plugin-transform-require-context plugin to .babelrc for the test env fixed all the issues.
Install babel-plugin-transform-require-context from https://www.npmjs.com/package/babel-plugin-transform-require-context (available with yarn too).
Now add the plugin to .babelrc:
{
  "env": {
    "test": {
      "plugins": ["transform-require-context"]
    }
  }
}
It simply transforms require.context calls in the test env into dummy function calls so that the code can run safely.
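Conceptually, the plugin replaces the call with a harmless stub, something in the spirit of the following (an illustration of the idea, not the plugin's literal output):
// Before (source):
const modulesReq = require.context('.', false, /\.js$/);

// After, roughly: a callable stub whose keys() returns nothing,
// so the forEach loop simply loads no modules under Jest.
const modulesReqStub = Object.assign(() => ({}), { keys: () => [] });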
If you are using Babel, look at babel-plugin-require-context-hook. Configuration instructions for Storybook are available at Storyshots | Configure Jest to work with Webpack's require.context(), but they are not Storyshots/Storybook specific.
To summarise:
Install the plugin.
yarn add babel-plugin-require-context-hook --dev
Create a file .jest/register-context.js with the following contents:
import registerRequireContextHook from 'babel-plugin-require-context-hook/register';
registerRequireContextHook();
Configure Jest (the file depends on where you are storing your Jest configuration, e.g. package.json):
setupFiles: ['<rootDir>/.jest/register-context.js']
Add the plugin to .babelrc
{
  "presets": ["..."],
  "plugins": ["..."],
  "env": {
    "test": {
      "plugins": ["require-context-hook"]
    }
  }
}
Alternatively, add it to babel.config.js:
module.exports = function(api) {
  api.cache(true)
  const presets = [...]
  const plugins = [...]
  if (process.env.NODE_ENV === "test") {
    plugins.push("require-context-hook")
  }
  return {
    presets,
    plugins
  }
}
It may be worth noting that using babel.config.js rather than .babelrc may cause issues. For example, I found that when I defined the require-context-hook plugin in babel.config.js:
Jest 22 didn't pick it up;
Jest 23 picked it up; but
jest --coverage didn't pick it up (perhaps Istanbul isn't up to speed with Babel 7?).
In all cases, a .babelrc configuration was fine.
Remarks on Edmundo Rodrigues's answer
The babel-plugin-require-context-hook plugin uses code that is similar to Edmundo Rodrigues's answer here. Props to Edmundo! Because it is implemented as a Babel plugin, it avoids static analysis issues. For example, with Edmundo's solution, webpack warns:
Critical dependency: require function is used in a way in which dependencies cannot be statically extracted
Despite the warnings, Edmundo's solution is the most robust because it doesn't depend on Babel.
Extract the call to a separate module:
// src/js/lib/bundle-loader.js
/* istanbul ignore next */
module.exports = require.context('bundle-loader?lazy!../components/', false, /.*\.vue$/)
Use the new module in the module where you extracted it from:
// src/js/lib/loader.js
const loadModule = require('lib/bundle-loader')
Create a mock for the newly created bundle-loader module:
// test/unit/specs/__mocks__/lib/bundle-loader.js
export default () => () => 'foobar'
Use the mock in your test:
// test/unit/specs/lib/loader.spec.js
jest.mock('lib/bundle-loader')
import Loader from 'lib/loader'

describe('lib/loader', () => {
  describe('Loader', () => {
    it('should load', () => {
      const loader = new Loader('[data-module]')
      expect(loader).toBeInstanceOf(Loader)
    })
  })
})
Alrighty! I had major issues with this and managed to come to a solution that worked for me by using a combination of other answers and the Docs. (Took me a good day though)
For anyone else who is struggling:
Create a file called bundle-loader.js and add something like:
module.exports = {
  importFiles: () => {
    const r = require.context(<your_path_to_your_files>)
    <your_processing>
    return <your_processed_files>
  }
}
In your code import like:
import bundleLoader from '<your_relative_Path>/bundle-loader'
Use like
let <your_var_name> = bundleLoader.importFiles()
In your test file right underneath other imports:
jest.mock('../../utils/bundle-loader', () => ({
  importFiles: () => {
    return <this_will_be_what_you_receive_in_the_test_from_import_files>
  }
}))
Installing the babel-plugin-transform-require-context package and adding the plugin to .babelrc resolved the issue for me.
Refer to the documentation here:
https://www.npmjs.com/package/babel-plugin-transform-require-context
The easiest and fastest way to solve this problem will be to install require-context.macro
npm install --save-dev require-context.macro
then just replace:
var modulesReq = require.context('.', false, /\.js$/);
with:
var modulesReq = requireContext('.', false, /\.js$/);
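For completeness, the macro itself still has to be imported at the top of the file (this assumes babel-plugin-macros is configured, which it is by default in Create React App):
import requireContext from 'require-context.macro';

var modulesReq = requireContext('.', false, /\.js$/);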
That's it, you should be good to go!
Cheers and good luck!
Implementation problems not mentioned:
1. Jest prevents out-of-scope variables in mocks, like __dirname.
2. Create React App limits Babel and Jest customization. You need to use src/setupTests.js, which is run before every test.
3. fs is not supported in the browser. You will need something like BrowserFS; now your app has file system support, just for dev.
4. Potential race condition: an export happens after this import, and one of your require.context imports includes that export. I'm sure require takes care of this, but now we are adding a lot of fs work on top of it.
5. Type checking.
Either #4 or #5 created undefined errors. Typing out the imports explicitly removed the errors, and there are no more concerns about what can or can't be imported and where.
Motivation for all this? Extensibility. Keeping future modifications limited to one new file. Publishing separate modules is a better approach.
If there were an easier way to import, Node would do it. Also, this smacks of premature optimization: you end up scrapping everything anyway because you're now using an industry-leading platform or utility.
If you're using Jest with test-utils in Vue.
Install these packages:
@vue/cli-plugin-babel
and
babel-plugin-transform-require-context
Then define babel.config.js at the root of the project with this configuration:
module.exports = function(api) {
  api.cache(true);
  const presets = [
    '@vue/cli-plugin-babel/preset'
  ];
  const plugins = [];
  if (process.env.NODE_ENV === 'test') {
    plugins.push('transform-require-context');
  }
  return {
    presets,
    plugins
  };
};
This will check if the current process is initiated by Jest and if so, it mocks all the require.context calls.
I faced the same issue with an ejected create-react-app project,
and none of the answers above helped me...
My solution was to add the following to config/babelTransform.js:
module.exports = babelJest.createTransformer({
  presets: [
    [
      require.resolve('babel-preset-react-app'),
      {
        runtime: hasJsxRuntime ? 'automatic' : 'classic',
      },
    ],
  ],
  plugins: ["transform-require-context"],
  babelrc: false,
  configFile: false,
});
Simplest solution for this:
Just do
var modulesReq = require.context && require.context('.', false, /\.js$/);
if (modulesReq) {
  modulesReq.keys().forEach(function(module) {
    modulesReq(module);
  });
}
Here I have added an extra check: the code only executes if require.context is defined. By doing this, Jest will no longer complain.
