Is there a way to get the version set in package.json in a nodejs app? I would want something like this
var port = process.env.PORT || 3000
app.listen port
console.log "Express server listening on port %d in %s mode %s", app.address().port, app.settings.env, app.VERSION
I found that the following code fragment worked best for me. Since it uses require to load the package.json, it works regardless of the current working directory.
var pjson = require('./package.json');
console.log(pjson.version);
A warning, courtesy of @Pathogen:
Doing this with Browserify has security implications.
Be careful not to expose your package.json to the client, as it means that all your dependency version numbers, build and test commands and more are sent to the client.
If you're building server and client in the same project, you expose your server-side version numbers too.
Such specific data can be used by an attacker to better fit the attack on your server.
If your application is launched with npm start, you can simply use:
process.env.npm_package_version
See package.json vars for more details.
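For example, a minimal sketch of the log line from the question, assuming the app is launched via npm start (otherwise the variable is undefined):
var port = process.env.PORT || 3000;
app.listen(port, function () {
  // npm injects npm_package_version into the environment when it starts the process
  console.log('Express server listening on port %d in %s mode, version %s',
    port, app.settings.env, process.env.npm_package_version);
});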
Using ES6 modules you can do the following:
import {version} from './package.json';
Or in plain old shell:
$ node -e "console.log(require('./package.json').version);"
This can be shortened to
$ node -p "require('./package.json').version"
There are two ways of retrieving the version:
Requiring package.json and getting the version:
const { version } = require('./package.json');
Using the environment variables:
const version = process.env.npm_package_version;
Please don't use JSON.parse, fs.readFile, or fs.readFileSync, and don't pull in another npm module; none of that is necessary for this question.
For those looking for a safe client-side solution that also works server-side, there is genversion. It is a command-line tool that reads the version from the nearest package.json and generates an importable CommonJS module file that exports the version. Disclaimer: I'm a maintainer.
$ genversion lib/version.js
I acknowledge the client-side safety was not OP's primary intention, but as discussed in answers by Mark Wallace and aug, it is highly relevant and also the reason I found this Q&A.
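For illustration, a hedged sketch of consuming the generated file (the exact export shape depends on genversion's options; by default it exports the version string as a CommonJS module):
// lib/version.js is the file generated by the command above
const version = require('./lib/version');
console.log(version);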
Here is how to read the version out of package.json:
const fs = require('fs');
const json = JSON.parse(fs.readFileSync('package.json', 'utf8'));
const version = json.version;
EDIT: Wow, this answer was originally from 2012! There are several better answers now. Probably the cleanest is:
const { version } = require('./package.json');
There is another way of fetching certain information from your package.json file namely using pkginfo module.
Usage of this module is very simple. You can get all package variables using:
require('pkginfo')(module);
Or only certain details (version in this case)
require('pkginfo')(module, 'version');
And your package variables will be set to module.exports (so version number will be accessible via module.exports.version).
You could use the following code snippet:
require('pkginfo')(module, 'version');
console.log "Express server listening on port %d in %s mode %s", app.address().port, app.settings.env, module.exports.version
This module has a very nice feature - it can be used in any file in your project (e.g. in subfolders) and it will automatically fetch the information from your package.json, so you do not have to worry about where your package.json is.
I hope that helps.
NPM one liner:
From npm v7.20.0:
npm pkg get version
Prior to npm v7.20.0:
npm -s run env echo '$npm_package_version'
Note the output is slightly different between these two methods: the former outputs the version number surrounded by quotes (i.e. "1.0.0"), the latter without (i.e. 1.0.0).
One way to remove the quotes on Unix is to use xargs:
npm pkg get version | xargs echo
Option 1
Best practice is to read the version from package.json using npm environment variables:
process.env.npm_package_version
more information on: https://docs.npmjs.com/using-npm/config.html
This will only work when you start your service with an npm command (e.g. npm start).
Quick info: you can read any value in package.json using process.env.npm_package_[keyname]
Option 2
Setting the version in an environment variable, e.g. with https://www.npmjs.com/package/dotenv and a .env file, and reading it as process.env.version.
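A minimal sketch of Option 2, assuming a .env file at the project root containing a line like version=1.2.3:
// index.js
require('dotenv').config(); // loads .env into process.env
console.log(process.env.version);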
Just adding an answer because I came to this question to see the best way to include the version from package.json in my web application.
I know this question targets Node.js. However, if you are using Webpack to bundle your app, a reminder that the recommended way is to use the DefinePlugin to declare a global version in the config and reference that. So you could do the following in your webpack.config.js:
const pkg = require('../../package.json');
...
plugins: [
  new webpack.DefinePlugin({
    AppVersion: JSON.stringify(pkg.version),
    ...
AppVersion is then a global that is available for you to use. Also make sure that in your .eslintrc you ignore it via the globals prop.
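For example (a sketch; AppVersion is the name declared in the DefinePlugin config above):
// anywhere in the bundled application code
console.log('Running version ' + AppVersion); // replaced with the literal version string at build time
And in .eslintrc, something along these lines:
{
  "globals": {
    "AppVersion": "readonly"
  }
}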
If you are looking for module support ("type": "module" in package.json, i.e. ES6 import), e.g. coming from refactoring CommonJS, you should (at the time of writing) do either:
import { readFile } from 'fs/promises';
const pkg = JSON.parse(await readFile(new URL('./package.json', import.meta.url)));
console.log(pkg.version)
or, run the node process with node --experimental-json-modules index.js to do:
import pkg from './package.json'
console.log(pkg.version)
You will, however, get a warning until JSON modules become generally available.
If you get syntax or (top-level) await errors, you are likely on an older Node version. Update to at least node#14.
You can use an ES6 import to retrieve the version number from package.json and output it to the console.
import {name as app_name, version as app_version} from './path/to/package.json';
console.log(`App ---- ${app_name}\nVersion ---- ${app_version}`);
A safe option is to add an npm script that generates a separate version file:
"scripts": {
"build": "yarn version:output && blitz build",
"version:output": "echo 'export const Version = { version: \"'$npm_package_version.$(date +%s)'\" }' > version.js"
}
This outputs version.js with the contents:
export const Version = { version: "1.0.1.1622225484" }
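The generated file can then be imported from application code, for example:
// somewhere in the app (path relative to wherever version.js is written)
import { Version } from './version';
console.log(Version.version); // e.g. "1.0.1.1622225484"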
To determine the package version in node code, you can use the following:
const version = require('./package.json').version; for < ES6 versions
import {version} from './package.json'; for ES6 version
const version = process.env.npm_package_version;
If the application has been started with npm start, all npm_* environment variables become available.
You can use following npm packages as well - root-require, pkginfo, project-version.
We can read the version or other keys from package.json in two ways:
1 - using require and picking the key we need, e.g.:
const { version } = require('./package.json')
2 - using the package vars, as mentioned in the docs:
process.env.npm_package_version
You can use the project-version package.
$ npm install --save project-version
Then
const version = require('project-version');
console.log(version);
//=> '1.0.0'
It uses process.env.npm_package_version but falls back on the version written in package.json in case the env var is missing for some reason.
Why not use require.resolve...
const path = require('path');

const packageJson = path.dirname(require.resolve('package-name')) + '/package.json';
const { version } = require(packageJson);

console.log('version', version);
This approach works for all sub paths :)
In case you want to get the version of a target package:
import { version } from 'TARGET_PACKAGE/package.json';
Example:
import { version } from 'react/package.json';
I know this isn't the intent of the OP, but I just had to do this, so hope it helps the next person.
If you're using docker-compose for your CI/CD process, you can get it this way!
version:
  image: node:7-alpine
  volumes:
    - .:/usr/src/service/
  working_dir: /usr/src/service/
  command: ash -c "node -p \"require('./package.json').version.replace('\n', '')\""
For the image, you can use any Node image. I use Alpine because it is the smallest.
The leanest way I found:
const fs = require('fs');
const { version } = JSON.parse(fs.readFileSync('./package.json'));
I've actually been through most of the solutions here and they either did not work on both Windows and Linux/OSX, or didn't work at all, or relied on Unix shell tools like grep/awk/sed.
The accepted answer works technically, but it sucks your whole package.json into your build and that's a Bad Thing that only the desperate should use temporarily to get unblocked, and in general should be avoided, at least for production code. The alternative is to use that method only to write the version to a single constant that can be used instead of the whole file.
So for anyone else looking for a cross-platform solution (not reliant on Unix shell commands) and local (without external dependencies):
Since it can be assumed that Node.js is installed, and it's already cross-platform for this, I just created a make_version.js file with:
const PACKAGE_VERSION = require("./package.json").version;
console.log(`export const PACKAGE_VERSION = "${PACKAGE_VERSION}";`);
console.error("package.json version:", PACKAGE_VERSION);
and added a version command to package.json:
"scripts": {
  "version": "node make_version.js > src/version.js",
and then added:
"prebuild": "npm run version",
"prestart": "npm run version",
and it creates a new src/version.js on every start or build. Of course, this can easily be tuned in the version script to write to a different location, or in make_version.js to output a different syntax and constant name, etc.
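Application code can then import the generated constant; a sketch, assuming the default src/version.js location from the script above:
import { PACKAGE_VERSION } from './version';
console.log(`Running version ${PACKAGE_VERSION}`);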
I do this with findup-sync:
var findup = require('findup-sync');
var packagejson = require(findup('package.json'));
console.log(packagejson.version); // => '0.0.1'
I am using GitLab CI and want to automatically use the version to tag my Docker images and push them. Their default Docker image does not include Node, so my shell-only version of this is:
scripts/getCurrentVersion.sh
BASEDIR=$(dirname $0)
cat $BASEDIR/../package.json | grep '"version"' | head -n 1 | awk '{print $2}' | sed 's/"//g; s/,//g'
Now what this does is:
Print your package.json
Search for the lines containing "version"
Take only the first result
Take the second column (the value)
Strip the " and , characters
Please note that I have my scripts in a subfolder with the corresponding name in my repository. So if you don't, change $BASEDIR/../package.json to $BASEDIR/package.json.
Or, if you want to be able to get the major, minor or patch version, I use this:
scripts/getCurrentVersion.sh
VERSION_TYPE=$1
BASEDIR=$(dirname $0)
VERSION=$(cat $BASEDIR/../package.json | grep '"version"' | head -n 1 | awk '{print $2}' | sed 's/"//g; s/,//g')
if [ $VERSION_TYPE = "major" ]; then
echo $(echo $VERSION | awk -F "." '{print $1}' )
elif [ $VERSION_TYPE = "minor" ]; then
echo $(echo $VERSION | awk -F "." '{print $1"."$2}' )
else
echo $VERSION
fi
This way, if your version was 1.2.3, your output would look like this:
$ > sh ./getCurrentVersion.sh major
1
$> sh ./getCurrentVersion.sh minor
1.2
$> sh ./getCurrentVersion.sh
1.2.3
Now the only thing you have to make sure of is that your package version is the first place in package.json where that key appears, otherwise you'll end up with the wrong version.
I'm using create-react-app and I don't have process.env.npm_package_version available when executing my React-app.
I did not want to reference package.json in my client code (because that exposes potentially dangerous info to the client, like package versions), nor did I want to install another dependency (genversion).
I found out that I can reference the version within my package.json scripts by using $npm_package_version:
"scripts": {
"my_build_script": "REACT_APP_VERSION=$npm_package_version react-scripts start"
}
Now the version is always following the one in package.json.
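Client code can then read the injected value (Create React App exposes REACT_APP_* variables on process.env at build time):
// e.g. in a React component or anywhere under src/
console.log(process.env.REACT_APP_VERSION);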
I made a useful piece of code to get the parent module's package.json:
const path = require('path')
const { existsSync } = require('fs')

function loadParentPackageJson() {
  if (!module.parent || !module.parent.filename) return null
  let dir = path.dirname(module.parent.filename)
  let maxDepth = 5
  let packageJson = null
  while (maxDepth > 0) {
    const packageJsonPath = `${dir}/package.json`
    const exists = existsSync(packageJsonPath)
    if (exists) {
      packageJson = require(packageJsonPath)
      break
    }
    dir = path.resolve(dir, '../')
    maxDepth--
  }
  return packageJson
}
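Example usage (a sketch; the calling module is whatever required this helper):
// in the module that requires the helper above
const parentPkg = loadParentPackageJson();
if (parentPkg) {
  console.log(parentPkg.name, parentPkg.version);
}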
If using rollup, the rollup-plugin-replace plugin can be used to add the version without exposing package.json to the client.
// rollup.config.js
import pkg from './package.json';
import { terser } from "rollup-plugin-terser";
import resolve from 'rollup-plugin-node-resolve';
import commonJS from 'rollup-plugin-commonjs'
import replace from 'rollup-plugin-replace';
export default {
  plugins: [
    replace({
      exclude: 'node_modules/**',
      'MY_PACKAGE_JSON_VERSION': pkg.version, // will replace 'MY_PACKAGE_JSON_VERSION' with the package.json version throughout the source code
    }),
  ]
};
Then, in the source code, anywhere where you want to have the package.json version, you would use the string 'MY_PACKAGE_JSON_VERSION'.
// src/index.js
export const packageVersion = 'MY_PACKAGE_JSON_VERSION' // replaced with actual version number in rollup.config.js
const { version } = require("./package.json");
console.log(version);
const v = require("./package.json").version;
console.log(v);
Import your package.json file into your server.js or app.js, and then access its properties from the server file.
var pkg = require('./package.json');
The pkg variable then contains all the data in package.json. (package itself is a reserved word in strict mode, so a different variable name is safer.)
Used to version web-components like this:
const { version } = require('../package.json')

class Widget extends HTMLElement {
  constructor() {
    super()
    this.attachShadow({ mode: 'open' })
  }

  public connectedCallback(): void {
    this.renderWidget()
  }

  public renderWidget = (): void => {
    this.shadowRoot?.appendChild(this.setPageTemplate())
    this.setAttribute('version', version)
  }
}
I'm trying to move some icons in my app directory based on a function I have inside my Gruntfile.js. Would it be possible to do something like this? I've tried the following (going into the dev or staging folder and copying all files to the parent directory), but couldn't get it to work. Thanks in advance.
grunt.registerTask('setAppIcon', 'Task that sets the app icon', function(environment) {
if (environment.toLowerCase() == "development") {
grunt.task.run(['exec:command:cd app/lib/extras/res/icon/ios/dev && cp -a . ../']);
} else if (environment.toLowerCase() == "staging") {
grunt.task.run(['exec:command:cd app/lib/extras/res/icon/ios/staging && cp -a . ../']);
}
});
Yes, it's possible to achieve your requirement. However, when you invoke the grunt.task.run command inside your function (i.e. custom task), you need to provide a reference to a task to run.
If you define a separate target (let's call them copy_dev and copy_staging, as shown in the example below) for each cd ... && cp ... command in the grunt-exec task, it should work successfully.
Gruntfile.js
The following Gruntfile.js gist shows how this can be achieved:
module.exports = function (grunt) {

  grunt.loadNpmTasks('grunt-exec');

  grunt.initConfig({
    exec: {
      copy_dev: {
        cmd: 'cd app/lib/extras/res/icon/ios/dev && cp -a . ../'
      },
      copy_staging: {
        cmd: 'cd app/lib/extras/res/icon/ios/staging && cp -a . ../'
      }
    }
  });

  grunt.registerTask('setAppIcon', 'Task that sets the app icon', function () {
    var environment = process.env.NODE_ENV;

    // Exit early if the NODE_ENV variable has not been set.
    if (!environment) {
      grunt.log.writeln(
        '\"setAppIcon\" task failed - NODE_ENV has not been set.'['yellow']
      )
      return
    }

    if (environment.toLowerCase() == "development") {
      grunt.task.run('exec:copy_dev');
      grunt.log.writeln('>> Copying icons from \"dev\"...')
    } else if (environment.toLowerCase() == "staging") {
      grunt.task.run('exec:copy_staging');
      grunt.log.writeln('>> Copying icons from \"staging\"...')
    }
  });

  grunt.registerTask('default', [ 'setAppIcon' ]);
};
Additional notes
Inside the custom task/function named setAppIcon we obtain the current Node environment using Node's built-in process.env.
When running $ grunt via your CLI (using the gist shown above), and assuming your process.env.NODE_ENV variable has not been set (or has been unset by running $ unset NODE_ENV), you will see the following message:
"setAppIcon" task failed - NODE_ENV has not been set.
However, if the process.env.NODE_ENV variable has been set to either development or staging the files will be copied as expected.
For example running either of the following via your CLI will work successfully:
$ export NODE_ENV=development && grunt
or
$ export NODE_ENV=staging && grunt
You will also see either of the following messages logged to the console:
>> Copying icons from "dev"...
or
>> Copying icons from "staging"...
After process.env.NODE_ENV has been set to either development or staging, just running $ grunt via your CLI will copy files according to whichever environment is set.
I want to configure tslint to run only on git modified files.
In package.json I have the lint task defined as:
{
  "scripts": {
    "lint": "tslint \"./src/**/*.{ts,tsx}\"",
    ...
  },
  "dependencies": {
    "tslint": "4.3.1",
    ...
  },
  ...
}
So the lint task works fine, but it processes all files in the project via a hardcoded mask. Suppose I have a JavaScript file, listModified.js, which produces the list of git-modified files. How do I pass the string produced by that script as an argument to tslint?
I can't use piping here, since tslint accepts arguments, not a pipe. I failed to use eval to form the argument.
How do I do it?
Try xargs:
printf "file1.ts\nfile2.ts" | xargs tslint
Obviously you can also use git status or git log to produce the list of files to check.
If xargs is not available on your system, then try using the regular JavaScript API:
https://www.npmjs.com/package/tslint#library-1
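If you'd rather stay in Node entirely, here is a rough sketch (not the tslint library API) that shells out to git and the tslint CLI, assuming both are on the PATH:
const { execSync } = require('child_process');

// collect modified .ts/.tsx files known to git
const files = execSync('git diff --name-only HEAD')
  .toString()
  .split('\n')
  .filter(function (f) { return /\.tsx?$/.test(f); });

if (files.length) {
  // quote each path and hand the list to the tslint CLI
  execSync('tslint ' + files.map(function (f) { return JSON.stringify(f); }).join(' '), { stdio: 'inherit' });
}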
I suggest you use a build system, because it should have plugins that can check whether a file was modified, by computing a hash or by checking the last modified time.
For example, using gulp with gulp-tslint and gulp-changed-in-place plugins, and jsonfile package for reading and writing JSON files:
var gulp = require('gulp'),
    changedInPlace = require('gulp-changed-in-place'),
    gulpTsLint = require('gulp-tslint'),
    jsonfile = require('jsonfile');

var tsHashFile = 'ts.hash.json';
var tsGlob = './src/**/*.{ts,tsx}'; // the same glob used by the "lint" script in the question

gulp.task('tslint', function (cb) {
  jsonfile.readFile(tsHashFile, function (err, obj) {
    var firstPass = false;

    if (err) {
      firstPass = true;
      obj = {};
    }

    return gulp.src(tsGlob)
      .pipe(changedInPlace({
        cache: obj,
        firstPass: firstPass
      }))
      .pipe(gulpTsLint())
      .pipe(gulpTsLint.report())
      .on('end', function (event) {
        jsonfile.writeFile(tsHashFile, obj, function () {
          cb(event);
        });
      });
  });
});
tsHashFile is used to persist the computed hashes, so when you run gulp tslint again, tslint will not be executed for files whose hash has not changed.
I've created a Node application which I can run locally and in the cloud.
Now I want this to work more smoothly and cleanly, so I put a property in a config.json file that indicates whether I want to deploy the app or use it locally. But I need to update this property manually before I change the purpose. Is there a better way to do it with Node?
let runnerServer = `http://localhost:8060/service/runner/${server.address().port}`;
if (cfg.isHosted) {
blogServer = `http://${serverName}/service/runner/${server.address().port}`;
}
and in the config.json I have the field isHosted, which I change manually (true/false) depending on whether I want to deploy or not...
update
maybe I can use process.env.PORT, but this is just one example of what I need in my code; currently I have several forked processes that need to know whether I'm deployed or running locally...
One option is to use Node's built-in process.env object (https://nodejs.org/api/process.html) and use two config files, one per environment. This approach is somewhat similar to what you are doing but may be cleaner:
config.localhost.json
config.production.json
Then, by setting a property on this object based on the environment, such as process.env.NODE_ENV = 'localhost' or process.env.NODE_ENV = 'production', you can read the corresponding file to import the configuration.
var config = require('./config.production.json');
if (process.env.NODE_ENV === 'localhost') {
    config = require('./config.localhost.json');
}
To set this environment variable when running locally on your dev box:
OSX - on the terminal: export NODE_ENV=localhost
Windows - on the command line: set NODE_ENV=localhost
An easy way to solve this, if every environment configuration can be in the repo:
config.json:
{
  "production": {
    // prod config
  },
  "staging": {
    // staging config
  },
  "devel": {
    // devel config
  }
}
config.js:
const environment = process.env['ENV'] || 'devel';
module.exports = require('./config.json')[environment];
then in your package.json you could add the following scripts:
package.json
{
  // npm stuff
  "scripts": {
    "prod": "ENV=production node index.js",
    "stage": "ENV=staging node index.js",
    "dev": "ENV=devel node index.js"
  }
}
and with this setup, you can run each configuration with the following commands:
production: npm run prod
staging: npm run stage
devel: npm run dev
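With this in place, any module can just require the selector; a minimal sketch (isHosted is the flag from the question, used here purely for illustration):
const cfg = require('./config');

if (cfg.isHosted) {
  console.log('running hosted configuration');
} else {
  console.log('running local configuration');
}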
Suppose I have the following module:
var modulesReq = require.context('.', false, /\.js$/);
modulesReq.keys().forEach(function(module) {
modulesReq(module);
});
Jest complains because it doesn't know about require.context:
FAIL /foo/bar.spec.js (0s)
● Runtime Error
- TypeError: require.context is not a function
How can I mock it? I tried using the setupTestFrameworkScriptFile Jest configuration option, but the tests can't see any changes that I've made to require.
I had the same problem, and then I made a 'solution'.
I'm pretty sure that this is not the best choice. I ended up no longer using it, based on the points raised here:
https://github.com/facebookincubator/create-react-app/issues/517
https://github.com/facebook/jest/issues/2298
But if you really need it, you should include the polyfill below in every file that calls it (not in the test file itself, because require will not be globally overridden in a Node environment).
// This condition should detect whether we are running in a Node environment
if (typeof require.context === 'undefined') {
  const fs = require('fs');
  const path = require('path');

  require.context = (base = '.', scanSubDirectories = false, regularExpression = /\.js$/) => {
    const files = {};

    function readDirectory(directory) {
      fs.readdirSync(directory).forEach((file) => {
        const fullPath = path.resolve(directory, file);

        if (fs.statSync(fullPath).isDirectory()) {
          if (scanSubDirectories) readDirectory(fullPath);
          return;
        }

        if (!regularExpression.test(fullPath)) return;

        files[fullPath] = true;
      });
    }

    readDirectory(path.resolve(__dirname, base));

    function Module(file) {
      return require(file);
    }

    Module.keys = () => Object.keys(files);

    return Module;
  };
}
With this function, you don't need to change any require.context call; it will execute with the same behavior as before (under Webpack it will use the original implementation, and inside a Jest run it will use the polyfill function).
After spending some hours trying each of the answers above, I would like to contribute.
Adding the babel-plugin-transform-require-context plugin to .babelrc for the test env fixed all the issues.
Install babel-plugin-transform-require-context from https://www.npmjs.com/package/babel-plugin-transform-require-context (it is available with yarn too).
Now add the plugin to .babelrc:
{
  "env": {
    "test": {
      "plugins": ["transform-require-context"]
    }
  }
}
It simply transforms require.context calls into dummy function calls for the test env so that the code can run safely.
If you are using Babel, look at babel-plugin-require-context-hook. Configuration instructions for Storybook are available at Storyshots | Configure Jest to work with Webpack's require.context(), but they are not Storyshots/Storybook specific.
To summarise:
Install the plugin.
yarn add babel-plugin-require-context-hook --dev
Create a file .jest/register-context.js with the following contents:
import registerRequireContextHook from 'babel-plugin-require-context-hook/register';
registerRequireContextHook();
Configure Jest (the file depends on where you are storing your Jest configuration, e.g. package.json):
setupFiles: ['<rootDir>/.jest/register-context.js']
Add the plugin to .babelrc
{
  "presets": ["..."],
  "plugins": ["..."],
  "env": {
    "test": {
      "plugins": ["require-context-hook"]
    }
  }
}
Alternatively, add it to babel.config.js:
module.exports = function(api) {
  api.cache(true)
  const presets = [...]
  const plugins = [...]
  if (process.env.NODE_ENV === "test") {
    plugins.push("require-context-hook")
  }
  return {
    presets,
    plugins
  }
}
It may be worth noting that using babel.config.js rather than .babelrc may cause issues. For example, I found that when I defined the require-context-hook plugin in babel.config.js:
Jest 22 didn't pick it up;
Jest 23 picked it up; but
jest --coverage didn't pick it up (perhaps Istanbul isn't up to speed with Babel 7?).
In all cases, a .babelrc configuration was fine.
Remarks on Edmundo Rodrigues's answer
This babel-plugin-require-context-hook plugin uses code that is similar to Edmundo Rodrigues's answer here. Props to Edmundo! Because the plugin is implemented as a Babel plugin, it avoids static analysis issues. e.g. With Edmundo's solution, Webpack warns:
Critical dependency: require function is used in a way in which dependencies cannot be statically extracted
Despite the warnings, Edmundo's solution is the most robust because it doesn't depend on Babel.
Extract the call to a separate module:
// src/js/lib/bundle-loader.js
/* istanbul ignore next */
module.exports = require.context('bundle-loader?lazy!../components/', false, /.*\.vue$/)
Use the new module in the module where you extracted it from:
// src/js/lib/loader.js
const loadModule = require('lib/bundle-loader')
Create a mock for the newly created bundle-loader module:
// test/unit/specs/__mocks__/lib/bundle-loader.js
export default () => () => 'foobar'
Use the mock in your test:
// test/unit/specs/lib/loader.spec.js
jest.mock('lib/bundle-loader')
import Loader from 'lib/loader'
describe('lib/loader', () => {
  describe('Loader', () => {
    it('should load', () => {
      const loader = new Loader('[data-module]')
      expect(loader).toBeInstanceOf(Loader)
    })
  })
})
Alrighty! I had major issues with this and managed to come to a solution that worked for me, using a combination of other answers and the docs. (It took me a good day, though.)
For anyone else who is struggling:
Create a file called bundle-loader.js and add something like:
module.exports = {
  importFiles: () => {
    const r = require.context(<your_path_to_your_files>)
    <your_processing>
    return <your_processed_files>
  }
}
In your code import like:
import bundleLoader from '<your_relative_Path>/bundle-loader'
Use like
let <your_var_name> = bundleLoader.importFiles()
In your test file right underneath other imports:
jest.mock('../../utils/bundle-loader', () => ({
  importFiles: () => {
    return <this_will_be_what_you_receive_in_the_test_from_importFiles>
  }
}))
Installing the babel-plugin-transform-require-context package and adding the plugin to .babelrc resolved the issue for me.
Refer to the documentation here:
https://www.npmjs.com/package/babel-plugin-transform-require-context
The easiest and fastest way to solve this problem is to install require-context.macro:
npm install --save-dev require-context.macro
then just replace:
var modulesReq = require.context('.', false, /\.js$/);
with:
import requireContext from 'require-context.macro';

var modulesReq = requireContext('.', false, /\.js$/);
That's it, you should be good to go!
Cheers and good luck!
Implementation problems not mentioned:
1. Jest prevents out-of-scope variables in a mock, like __dirname.
2. Create React App limits Babel and Jest customization. You need to use src/setupTests.js, which runs before every test.
3. fs is not supported in the browser. You will need something like BrowserFS. Now your app has file system support, just for dev.
4. Potential race condition: you export after this import, and one of your require.context imports includes that export. I'm sure require takes care of this, but now we are adding a lot of fs work on top of it.
5. Type checking.
Either #4 or #5 created undefined errors. Typing out the imports removed the errors, and with them the concerns about what can or can't be imported and where.
Motivation for all this? Extensibility. Keeping future modifications limited to one new file. Publishing separate modules is a better approach.
If there were an easier way to import, Node would do it. Also, this smacks of premature optimization: you end up scrapping everything anyway because you're now using an industry-leading platform or utility.
If you're using Jest with test-utils in Vue, install these packages:
@vue/cli-plugin-babel
and
babel-plugin-transform-require-context
Then define babel.config.js at the root of the project with this configuration:
module.exports = function(api) {
  api.cache(true);
  const presets = [
    '@vue/cli-plugin-babel/preset'
  ];
  const plugins = [];
  if (process.env.NODE_ENV === 'test') {
    plugins.push('transform-require-context');
  }
  return {
    presets,
    plugins
  };
};
This will check if the current process is initiated by Jest and if so, it mocks all the require.context calls.
I faced the same issue with an ejected create-react-app project, and none of the answers above helped me.
My solution was to change config/babelTransform.js to the following:
// babelJest and hasJsxRuntime are defined earlier in the ejected config/babelTransform.js
module.exports = babelJest.createTransformer({
  presets: [
    [
      require.resolve('babel-preset-react-app'),
      {
        runtime: hasJsxRuntime ? 'automatic' : 'classic',
      },
    ],
  ],
  plugins: ["transform-require-context"],
  babelrc: false,
  configFile: false,
});
Simplest solution for this: just do
var modulesReq = require.context && require.context('.', false, /\.js$/);
if (modulesReq) {
  modulesReq.keys().forEach(function(module) {
    modulesReq(module);
  });
}
Here I have added an extra check so the loop only executes if require.context is defined. By doing this, Jest will no longer complain.