In a vue cli 3 project I want to display a version number in the webpage. The version number lies in the package.json file.
The only reference to this that I found is this link in the vue forum.
However, I can't get the proposed solution to work.
Things I tried
Use webpack.DefinePlugin as in the linked resource:
vue.config.js
const webpack = require('webpack');

module.exports = {
  lintOnSave: true,
  configureWebpack: config => {
    return {
      plugins: [
        new webpack.DefinePlugin({
          'process.env': {
            VERSION: require('./package.json').version,
          }
        })
      ]
    }
  },
}
Then in main.ts I read process.env, but it does not contain VERSION. (I tried several variants of this, like generating a PACKAGE_JSON field as in the linked page, and defining plain values like 'foo' instead of reading from package.json.) It never worked; it is as if the code is being ignored. I guess process.env is being redefined later by the Vue webpack machinery.
The process log in main.ts contains, however, all the stuff that process usually contains in a vue-cli project, like the mode and the VUE_APP variables defined in .env files.
Try to write to process directly in the configureWebpack function,
like:
configureWebpack: config => {
process.VERSION = require('./package.json').version
},
(to be honest I did not have much hope with this, but had to try).
Tried the other solution proposed in the linked page,
like:
// vue.config.js
module.exports = {
  chainWebpack: config => {
    config.plugin('define').tap(([options = {}]) => {
      return [{
        ...options, // these are the env variables from your .env file, if any are defined
        VERSION: JSON.stringify(require('./package.json').version)
      }]
    })
  }
}
But this fails silently too.
Use the config.plugin('define') syntax suggested by @Oluwafemi,
like:
chainWebpack: (config) => {
  return config.plugin('define').tap(
    args => merge(args, [VERSION])
  )
},
Where VERSION is a local variable defined as:
const pkgVersion = require('./package.json').version;

const VERSION = {
  'process.env': {
    VUE_APP_VERSION: JSON.stringify(pkgVersion)
  }
}
But this is not working either.
I am re-starting the whole project every time, so that's not the reason why the process stuff does not show up.
My vue-cli version is 3.0.1.
I am adding my 2 cents, as I found a shorter way and apparently the right way (https://docs.npmjs.com/misc/scripts#packagejson-vars)
Add this in your vue.config.js file before the export, not inside:
process.env.VUE_APP_VERSION = process.env.npm_package_version
And voilà!
You can then use it from a component with process.env.VUE_APP_VERSION
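For reference, a minimal sketch of reading it from a component (the computed property name appVersion is just an assumption):

// SomeComponent.vue, script section – sketch only
export default {
  computed: {
    appVersion() {
      // VUE_APP_* variables are inlined by Vue CLI at build time
      return process.env.VUE_APP_VERSION
    }
  }
}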
TLDR
The following snippet in the vue.config.js file will do the trick, and will allow you to access the version of your app as APPLICATION_VERSION:
const webpack = require('webpack'); // don't forget this require at the top of vue.config.js

module.exports = {
  configureWebpack: config => {
    return {
      plugins: [
        new webpack.DefinePlugin({
          'APPLICATION_VERSION': JSON.stringify(require('./package.json').version),
        })
      ]
    }
  },
}
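For example, you can then read it anywhere in your app like this (a sketch; in a TypeScript project you may also need a declare statement so the compiler accepts the injected constant):

// main.ts / main.js – sketch of consuming the injected constant
// declare const APPLICATION_VERSION: string; // only needed for TypeScript
console.log('Running version', APPLICATION_VERSION);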
TIP:
Don't even try to add keys to process.env via webpack.DefinePlugin: it won't work as you probably expect.
Why my previous efforts did not work
In the end, I solved the issue via webpack.DefinePlugin. The main issue I had is that the original solution I found was using DefinePlugin to write to a process.env.PACKAGE_JSON variable.
This suggests that DefinePlugin somehow allows you to add variables to process or process.env, which is not the case. Whenever I logged process.env in the console, I didn't find the variables I was trying to push into process.env, so I thought the DefinePlugin technique was not working.
Actually, what webpack.DefinePlugin does is check for strings at compile time and replace them with their values right in your code. So, if you define an ACME_VERSION variable via:
module.exports = {
  lintOnSave: true,
  configureWebpack: config => {
    return {
      plugins: [
        new webpack.DefinePlugin({
          'ACME_VERSION': 111,
        })
      ]
    }
  },
}
and then, in main.js you print console.log(ACME_VERSION), you will get 111 properly logged.
Now, however, this is just a string change at compile time. If instead of ACME_VERSION you try to define process.env.VUE_APP_ACME_VERSION...
when you log process.env, the VUE_APP_ACME_VERSION key won't show up in the object. However, a raw console.log(process.env.VUE_APP_ACME_VERSION) will yield 111 as expected.
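To make that concrete, a small sketch of the behavior described above (plugin configuration abbreviated, variable name assumed):

// vue.config.js – define the constant as a fully qualified string
new webpack.DefinePlugin({
  'process.env.VUE_APP_ACME_VERSION': 111,
})

// main.js – what you observe at runtime
console.log(process.env);                      // no VUE_APP_ACME_VERSION key in the object
console.log(process.env.VUE_APP_ACME_VERSION); // 111, because the whole expression was replaced at compile time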
So, basically, the original link and the proposed solutions were correct to some degree. However, nothing was really being added to the process object. I was logging process.env during my initial tries, so I didn't see anything working.
Now, since the process object is not actually being modified, I strongly suggest that anyone trying to load variables into their Vue app at compile time not use it. It is misleading at best.
You can simply import your package.json file and use its variables.
import { version } from "../../package.json";
console.log(version)
If you are using Webpack, you can inject the variable in the following way:
// webpack.config.js
plugins: [
  new webpack.DefinePlugin({
    VERSION: JSON.stringify(require("./package.json").version)
  })
]

// any-module.js
console.log("Version: " + VERSION);
https://github.com/webpack/webpack/issues/237
When building the Vue app, environment variables that don't begin with the VUE_APP_ prefix are filtered out. NODE_ENV and BASE_URL environment variables are the exception.
The above information applies when the environment variables are set prior to building the Vue app, not in this situation.
In a situation where environment variables are set during the build, it's important to look at what Vue CLI is doing.
The Vue CLI uses webpack.DefinePlugin to set environment variables using the object returned from the call to resolveClientEnv.
resolveClientEnv returns
{
  'process.env': {}
}
This means when configuring your environment variables at build time, you need to find a way to merge with the existing one.
You need to perform a deep merge of both arrays, so that the value for the process.env key is an object containing the keys from the resolved client environment as well as your own keys.
The chainWebpack key in the default export of vue.config.js is just one of the ways to get this done.
The arguments passed to initialize the DefinePlugin can be merged with new environment variables that you like to configure using the underlying webpack-chain API. Here is an example:
// vue.config.js
const merge = require('deepmerge');
const pkgVersion = require('./package.json').version;

const VERSION = {
  'process.env': {
    VERSION: JSON.stringify(pkgVersion)
  }
}

module.exports = {
  chainWebpack: config =>
    config
      .plugin('define')
      .tap(
        args => merge(args, [VERSION])
      )
}
Your initial attempt was fine, you were just missing the JSON.stringify part:
const webpack = require('webpack');

module.exports = {
  configureWebpack: config => {
    return {
      plugins: [
        new webpack.DefinePlugin({
          'process.env': {
            VERSION: JSON.stringify(require('./package.json').version),
          }
        })
      ]
    }
  },
}
Edit: although the webpack docs recommend the 'process.env.VERSION' way (yellow panel):
new webpack.DefinePlugin({
  'process.env.VERSION': JSON.stringify(require('./package.json').version),
}),
Official solutions tend to be more reliable: see Modes and Environment Variables | Vue CLI.
TIP
You can have computed env vars in your vue.config.js file. They still need to be prefixed with VUE_APP_. This is useful for version info
process.env.VUE_APP_VERSION = require('./package.json').version
module.exports = {
// config
}
I attempted the accepted answer and had errors. However, in the Vue docs I was able to find an answer similar (but not identical) to @antoni's answer.
In short, just have the following in vue.config.js:
process.env.VUE_APP_VERSION = require('./package.json').version
module.exports = {
// config
}
Docs 2020-10-27:
You can access env variables in your application code:
console.log(process.env.VUE_APP_NOT_SECRET_CODE)
During build, process.env.VUE_APP_NOT_SECRET_CODE will be replaced by the corresponding value. In the case of VUE_APP_NOT_SECRET_CODE=some_value, it will be replaced by "some_value".
In addition to VUE_APP_* variables, there are also two special variables that will always be available in your app code:
NODE_ENV - this will be one of "development", "production" or "test" depending on the mode the app is running in.
BASE_URL - this corresponds to the publicPath option in vue.config.js and is the base path your app is deployed at.
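For instance, a minimal sketch of reading these in application code (assuming a VUE_APP_VERSION variable was defined as above):

// Inside any component or module compiled by Vue CLI – sketch only
console.log(process.env.NODE_ENV);         // "development", "production" or "test"
console.log(process.env.BASE_URL);         // the publicPath, e.g. "/"
console.log(process.env.VUE_APP_VERSION);  // your VUE_APP_* variable, inlined at build time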
The answer for this on the official VueJS forum is like so:
chainWebpack: config => {
  config
    .plugin('define')
    .tap(args => {
      let v = JSON.stringify(require('./package.json').version)
      args[0]['process.env']['VERSION'] = v
      return args
    })
}
Description
Add this line to your vue.config.js file
module.exports = {
  ...
  chainWebpack: config => {
    config
      .plugin('define')
      .tap(args => {
        let v = JSON.stringify(require('./package.json').version)
        args[0]['process.env']['VERSION'] = v
        return args
      })
  }
};
Then you can use this in your vue files like so:
version: function () {
return process.env.VERSION
}
A one liner alternative:
// script tag
let packageJsonInfo = require("../../package.json");
// Then you can, for example, get the version number:
packageJsonInfo.version
Related
I have a situation where I don't have Terser configured correctly and it's causing the compressed version of my page to break. My issue is that I have some functional declarations in my index.js. These declarations need to be accessible from Bootstrap modal windows that can be loaded in at a later time.
For example, in my index.js there is a function declared like this:
function doThis() { }
Then, say, a user opens an address form in a modal window, and a different javascript file called 'address-form.js' is loaded. In this form there is a button with an onclick handler that calls doThis(). The onclick handler lives in 'address-form.js' but is able to access doThis() in the parent index.js. The button works fine when index.js isn't compressed. But after it's compressed, I get an error saying doThis() doesn't exist.
I believe this to be a scoping issue, because I do see the doThis() declaration in index.js, but it appears to be wrapped in a bunch of parentheses. I'm not sure how to make the scope of doThis() the top-level window. The same scoping issue exists for literally hundreds of function declarations and variables in index.js, so I'm looking for a solution where I don't have to tinker too much with the gargantuan file.
Notice if I change the function declaration to an expression, it DOES seem to work:
window.doThis = function() {
}
However, because there are hundreds of vars and const and let variables in the file (in addition to dozens of function declarations), it's not really practical for me to change the scope of all of them just so the compression will work.
Here is my webpack.config.js:
const TerserPlugin = require("terser-webpack-plugin")
const glob = require('glob')
const path = require('path')
const webpack = require('webpack')

module.exports = {
  entry: glob.sync('./js/Pages/*.js').reduce((object, element) => {
    object[path.parse(element).name] = element
    return object
  }, {}),
  output: {
    filename: '[name].js',
    path: path.resolve(__dirname, './js/Pages/minified')
  },
  optimization: {
    minimize: true,
    minimizer: [
      new TerserPlugin({
        parallel: true,
        test: /\.js(\?.*)?$/i,
        terserOptions: {
          mangle: false,
          compress: true,
          keep_fnames: true,
          keep_classnames: true,
          ie8: false,
          safari10: false,
          toplevel: false
        }
      })
    ]
  },
  plugins: [
    new webpack.optimize.LimitChunkCountPlugin({
      maxChunks: 1
    })
  ]
}
The command I'm running is:
webpack --mode=production --config webpack.config.js
SOLUTION
The solution, as accepted below, was to use the terser CLI directly. I wrote a little script that runs terser on every file in a directory, in case anyone finds it helpful:
const fs = require('fs')
const path = require('path')
const exec = require('child_process').exec;
const Terser = require('terser')

const srcDir = '../js/Pages'
const destDir = '../js/Pages/minified'

function minifyPagesDir() {
  let srcFileNames = fs.readdirSync(srcDir).filter(path => path.match(/\.js$/)) || []
  let srcFilePaths = []
  let destFilePaths = srcFileNames.map((item, i) => {
    srcFilePaths[i] = `${srcDir}/${srcFileNames[i]}`
    return `${destDir}/${item}`
  })

  if (!fs.existsSync(destDir))
    fs.mkdirSync(destDir)

  srcFileNames.forEach((item, i) => {
    exec(
      `terser ${srcFilePaths[i]} -c -o ${destFilePaths[i]} --ecma 2021`,
      (error, stdout, stderr) => {
        if (error !== null)
          console.log(`RUHROH: ${error}`, ' stderr: ', stderr);
      }
    )
    console.log(`Minified '${srcFilePaths[i]}' to '${destFilePaths[i]}'!`)
  })

  console.log('Minification complete!')
}

minifyPagesDir()
I created a repro project with your configuration and verified that your problems are not caused by terser, but by webpack.
I've used the following test file:
function doThis() { }
With minimize disabled and no minimizer set, this is the result of building the test file with webpack:
/******/ (() => { // webpackBootstrap
var __webpack_exports__ = {};
function doThis() { }
/******/ })()
;
If we reformat it and remove the comments, it's clear that everything inside the file is dead code. Webpack is a bundler and wraps everything in an IIFE. This means that functions won't be assigned to global scope.
(() => {
  var __webpack_exports__ = {};
  function doThis() { }
})();
Since you're not exporting anything or causing any side-effects, like assigning to window, terser will remove the content. This is correct and even without removing it, you still would get the same errors, since the function in webpack's output is not assigned to the global scope and your handlers couldn't access it. Compressing webpack's export with the terser cli results in an empty file.
Running the terser cli directly on the test input file, results in the desired compressed output:
function doThis(){}
You said you ran terser from the command line and had the same problems as when bundling and compressing through webpack. Did you run the source files through terser, or the output of webpack? Running the source files through terser should result in the desired output. Please try running terser foo/bar.js -c -o foo/bar.min.js and test whether it solves your problem.
If you want to use webpack, you need to either assign to window or move away from using global scope altogether.
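For example, a minimal sketch of the first option, keeping the existing global-scope approach (doThis is the function from the question):

// index.js (webpack entry) – sketch only
function doThis() {
  // ...
}

// Assigning to window is a side effect, so neither webpack nor terser will drop
// the function, and onclick handlers in address-form.js can still reach it.
window.doThis = doThis;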
I don't know when it was added, but webpack has an IIFE option.
https://webpack.js.org/configuration/output/#outputiife
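For reference, a minimal sketch of that option (webpack 5+); note it only controls the wrapper, it does not by itself expose your functions globally:

// webpack.config.js – sketch of the output.iife option
module.exports = {
  // ...
  output: {
    iife: false, // do not wrap the generated bundle in an IIFE
  },
};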
But unfortunately, it always generates the unnecessary code var __webpack_exports__ = {}.
I don't know how to remove that code; I would like to know if there is an option for it.
When trying to fetch data from an API, the key that I was using was considered "undefined". I know my key works, because when I replaced the key=undefined in the network console with the actual string, I was able to get the data that I needed. Any thoughts? Also, I know you shouldn't put API keys in a React app, but I am just using this for testing purposes. If it helps to clarify things, I did use Create React App, and they had a major update in the last 3 months, so I wonder if that has anything to do with it.
const bartKey = process.env.REACT_API_BART_API_KEY;
console.log(`Api key: ${process.env.REACT_API_BART_API_KEY}`);

// inside class Component
async getAllStations() {
  try {
    const response = await fetch(`http://api.bart.gov/api/etd.aspx?cmd=etd&orig=${this.state.selectedStation}&key=${bartKey}&json=y`);
    // console.log(response.json());
    const data = await response.json();
    console.log('Initial data: ', data);
    // fetch(`http:api.bart.gov/api/etd.aspx?cmd=etd&orig=${this.state.selectedStation}&key=${bartKey}&json=y`)
    //   .then(response => response.json())
    //   .then(data => console.log(`here: ${data}`))
  } catch (e) {
    console.log(`Error: ${e}`)
  }
}
This works for me when using create-react-app:
Instead of REACT_API_BART_API_KEY, use REACT_APP_BART_API_KEY in your .env.
Then you can read it as process.env.REACT_APP_BART_API_KEY.
check this url from create-react-app docs https://facebook.github.io/create-react-app/docs/adding-custom-environment-variables
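In other words, something like this (a sketch; remember to restart the dev server after changing .env so Create React App picks it up):

// .env (project root):  REACT_APP_BART_API_KEY=your-key-here
// In your component file:
const bartKey = process.env.REACT_APP_BART_API_KEY;
console.log(`Api key: ${bartKey}`);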
I would say a really good solution, if your .env file is behaving strangely, is to make a config file. Keep the API key in there, put that file in .gitignore, and it will be hidden from version control the same way and is sure to work.
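A hypothetical sketch of that approach (the file and variable names are just examples):

// config.js – listed in .gitignore so the key stays out of version control
export const BART_API_KEY = 'your-key-here';

// elsewhere in the app
import { BART_API_KEY } from './config';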
These changes helped me solve my issue:
Please note that dotenv is a zero-dependency module that loads environment variables from a .env file into process.env, so you should first install this package.
Step 1: npm install dotenv --save-dev
Import webpack and dotenv in your webpack.config.js file and make these changes:
Make sure module.exports is a function which first generates the environment keys.
Once the keys are generated, use webpack.DefinePlugin(), which will help you define global constants for your app.
// other imports
const dotenv = require('dotenv');
const webpack = require('webpack');

module.exports = () => {
  const env = dotenv.config().parsed;
  const envKeys = Object.keys(env).reduce((prev, next) => {
    prev[`process.env.${next}`] = JSON.stringify(env[next]);
    return prev;
  }, {});

  return {
    /* ... here should be your previous module.exports object attributes */
    entry: ['babel-polyfill', './src/index.js'],
    output: {
      path: path.join(__dirname, 'build'),
      filename: 'bundle.js',
    },
    plugins: [
      new HtmlWebPackPlugin({
        template: "./public/index.html",
        filename: "./index.html"
      }),
      new webpack.DefinePlugin(envKeys)
    ]
  }
};
Also note that if you log process.env you will still get an empty object or nothing. But if you log process.env.YOUR_KEY you will get your key's value, stringified.
Hope it helps you!
As early as possible in your application, require and configure dotenv. (https://www.npmjs.com/package/dotenv)
require('dotenv').config()
You should write that line as early as possible in your program.
I am quite new to Webpack, so bear with me if that's a stupid question.
My goal is to transform my old, AMD based codebase to a ES6 Module based solution. What I am struggling with is handling dynamic import()s. So my app router works on a module basis, i.e. each route is mapped to a module path and then required. Since I know what modules will be included, I just add those dynamically imported modules to my r.js configuration and am able to build everything in a single file, with all require calls still working.
Now, I am trying to do the same with ES6 modules and Webpack. With my devmode this is no problem as I can just replace require() with import(). However I cannot get this to work with bundling. Either Webpack splits my code (and still fails to load the dynamic module anyways), or - if I use the Array format for the entry config, the dynamic module is included in the bundle but loading still fails: Error: Cannot find module '/src/app/DynClass.js'
This is how my Webpack config looks like:
const webpack = require('webpack');
const path = require('path');

module.exports = {
  mode: "development",
  entry: ['./main.js', './app/DynClass.js'],
  output: {
    filename: 'main.js',
    path: path.resolve(__dirname, "../client/")
  },
  resolve: {
    alias: {
      "/src": path.resolve(__dirname, '')
    }
  },
  module: {
    rules: [
      {
        test: /\.tpl$/i,
        use: 'raw-loader',
      },
    ]
  }
};
So basically I want to tell Webpack: "hey, there is another module (or more) that is to be loaded dynamically and I want it to be included in the bundle"
How can I do this?
So yeah, after much fiddling there seems to be a light at the end of the tunnel. Still, this is not a 100% solution and it is surely not for the faint of heart, as it is quite ugly and fragile. But I still want to share my approach with you:
1) manual parsing of my routes config
My router uses a config file looking like this:
import StaticClass from "/src/app/StaticClass.js";

export default {
  StaticClass: {
    match: /^\//,
    module: StaticClass
  },
  DynClass: {
    match: /^\//,
    module: "/src/app/DynClass.js"
  }
};
So as you can see, the export is an object, with keys acting as the route id and values being an object that contains the match (regex based) and the module which should be executed by the router if the route matches. I can feed my router either with a constructor function (or an object) for modules which are available immediately (i.e. contained in the main chunk), or, if the module value is a string, the router has to load that module dynamically by using the path specified in the string.
So as I know what modules could be potentially loaded (but not if and when) I can now parse this file within my build process and transform the route config to something webpack can understand:
const path = require("path");
const fs = require("fs");

let routesSource = fs.readFileSync(path.resolve(__dirname, "app/routes.js"), "utf8");
routesSource = routesSource.substr(routesSource.indexOf("export default"));
routesSource = routesSource.replace(/module:\s*((?!".*").)*$/gm, "module: undefined,");
routesSource = routesSource.replace(/\r?\n|\r/g, "").replace("export default", "var routes = ");
eval(routesSource);

let dummySource = Object.entries(routes).reduce((acc, [routeName, routeConfig]) => {
  if (typeof routeConfig.module === "string") {
    return acc + `import(/* webpackChunkName: "${routeName}" */"${routeConfig.module}");`;
  }
  return acc;
}, "") + "export default ''";
(Yeah I know this is quite ugly and also a bit brittle so this surely could be done better)
Essentially I create a new, virtual module where every route entry which demands a dynamic import is translated, so:
DynClass: {
  match: /^\//,
  module: "/src/app/DynClass.js"
}
becomes:
import(/* webpackChunkName: "DynClass" */"/src/app/DynClass.js");
So the route id simply becomes the name of the chunk!
2) including the virtual module in the build
For this I use the virtual-module-webpack-plugin:
plugins: [
  new VirtualModulePlugin({
    moduleName: "./app/dummy.js",
    contents: dummySource
  })
],
Where dummySource is just a string containing the source code of the virtual module I have just generated. Now this module is pulled in and the "virtual imports" can be processed by webpack. But wait, I still need to import the dummy module, and I do not have one in my development mode (where I use everything natively, so no loaders).
So in my main code I do the following:
let isDev = false;
/** @remove */
isDev = true;
/** @endremove */
if (isDev) { import('./app/dummy.js'); }
Where "dummy.js" is just an empty stub module while I am in development mode. The parts between that special comments are removed while building (using the webpack-loader-clean-pragma loader), so while webpack "sees" the import for dummy.js, this code will not be executed in the build itself since then isDev evaluates to false. And since we already defined a virtual module with the same path, the virtual module is included while building just like I want, and of course all dependencies are resolved as well.
3) Handling the actual loading
For development, this is quite easy:
import routes from './app/routes.js';

Object.entries(routes).forEach(async ([routeId, route]) => {
  if (typeof route.module === "function") {
    new route.module;
  } else {
    const result = await import(route.module);
    new result.default;
  }
});
(Note that this is not the actual router code, just enough to help me with my PoC)
Well, but for the build I need something else, so I added some code specific to the build environment:
/** @remove */
const result = await import(route.module);
new result.default;
/** @endremove */
if (!isDev) {
  if (typeof route.module === "string") { await __webpack_require__.e(routeId); }
  const result = __webpack_require__(route.module.replace("/src", "."));
  new result.default;
}
Now, the loading code for the dev environment is just stripped out, and there is other loading code that uses webpack internally. I also check whether the module value is a function or a string, and if it is the latter I invoke the internal require.ensure function to load the correct chunk: await __webpack_require__.e(routeId);. Remember that I named my chunks when generating the virtual module? That's why I can still find them now!
4) more needs to be done
Another thing I encountered is when several dynamically loaded modules have the same dependencies, webpack tries to generate more chunks with names like module1~module2.bundle.js, breaking my build. To counter this, I needed to make sure that all those shared modules go into a specific named bundle I called "shared":
optimization: {
  splitChunks: {
    chunks: "all",
    name: "shared"
  }
}
And when in production mode, I simply load this chunk manually before any dynamic modules depending on it are requested:
if (!isDev) {
  await __webpack_require__.e("shared");
}
Again, this code only runs in production mode!
Finally, I have to prevent webpack from renaming my modules (and chunks) to something like "1", "2" etc., and instead keep the names I have just defined:
optimization: {
  namedChunks: true,
  namedModules: true
}
So yeah, there you have it! As I said, this isn't pretty but it seems to work, at least with my simplified test setup. I really hope there aren't any blockers ahead of me when I do all the rest (like ESLint, SCSS etc.)!
Suppose I have the following module:
var modulesReq = require.context('.', false, /\.js$/);
modulesReq.keys().forEach(function(module) {
  modulesReq(module);
});
Jest complains because it doesn't know about require.context:
FAIL /foo/bar.spec.js (0s)
● Runtime Error
- TypeError: require.context is not a function
How can I mock it? I tried using setupTestFrameworkScriptFile Jest configuration but the tests can't see any changes that I've made in require.
I had the same problem, and then I came up with a 'solution'.
I'm pretty sure that this is not the best choice; I ended up no longer using it, because of the points raised here:
https://github.com/facebookincubator/create-react-app/issues/517
https://github.com/facebook/jest/issues/2298
But if you really need it, you should include the polyfill below in every file where you call it (not in the test file itself, because the overridden require will not be global in a Node environment).
// This condition should detect whether we are running in a Node environment
if (typeof require.context === 'undefined') {
  const fs = require('fs');
  const path = require('path');

  require.context = (base = '.', scanSubDirectories = false, regularExpression = /\.js$/) => {
    const files = {};

    function readDirectory(directory) {
      fs.readdirSync(directory).forEach((file) => {
        const fullPath = path.resolve(directory, file);

        if (fs.statSync(fullPath).isDirectory()) {
          if (scanSubDirectories) readDirectory(fullPath);
          return;
        }

        if (!regularExpression.test(fullPath)) return;

        files[fullPath] = true;
      });
    }

    readDirectory(path.resolve(__dirname, base));

    function Module(file) {
      return require(file);
    }

    Module.keys = () => Object.keys(files);

    return Module;
  };
}
With this function, you don't need to change any require.context call; it will execute with the same behavior as usual (under webpack it will just use the original implementation, and inside Jest execution it will use the polyfill function).
After spending some hours trying each of the answers above, I would like to contribute.
Adding the babel-plugin-transform-require-context plugin to .babelrc for the test env fixed all the issues.
Install babel-plugin-transform-require-context from https://www.npmjs.com/package/babel-plugin-transform-require-context (available with yarn too).
Now add the plugin to .babelrc:
{
  "env": {
    "test": {
      "plugins": ["transform-require-context"]
    }
  }
}
It will simply transform require.context calls for the test env into dummy function calls so that the code can run safely.
If you are using Babel, look at babel-plugin-require-context-hook. Configuration instructions for Storybook are available at Storyshots | Configure Jest to work with Webpack's require.context(), but they are not Storyshots/Storybook specific.
To summarise:
Install the plugin.
yarn add babel-plugin-require-context-hook --dev
Create a file .jest/register-context.js with the following contents:
import registerRequireContextHook from 'babel-plugin-require-context-hook/register';
registerRequireContextHook();
Configure Jest (the file depends on where you are storing your Jest configuration, e.g. package.json):
setupFiles: ['<rootDir>/.jest/register-context.js']
Add the plugin to .babelrc
{
  "presets": ["..."],
  "plugins": ["..."],
  "env": {
    "test": {
      "plugins": ["require-context-hook"]
    }
  }
}
Alternatively, add it to babel.config.js:
module.exports = function(api) {
  api.cache(true)

  const presets = [...]
  const plugins = [...]

  if (process.env.NODE_ENV === "test") {
    plugins.push("require-context-hook")
  }

  return {
    presets,
    plugins
  }
}
It may be worth noting that using babel.config.js rather than .babelrc may cause issues. For example, I found that when I defined the require-context-hook plugin in babel.config.js:
Jest 22 didn't pick it up;
Jest 23 picked it up; but
jest --coverage didn't pick it up (perhaps Istanbul isn't up to speed with Babel 7?).
In all cases, a .babelrc configuration was fine.
Remarks on Edmundo Rodrigues's answer
This babel-plugin-require-context-hook plugin uses code that is similar to Edmundo Rodrigues's answer here. Props to Edmundo! Because the plugin is implemented as a Babel plugin, it avoids static analysis issues. e.g. With Edmundo's solution, Webpack warns:
Critical dependency: require function is used in a way in which dependencies cannot be statically extracted
Despite the warnings, Edmundo's solution is the most robust because it doesn't depend on Babel.
Extract the call to a separate module:
// src/js/lib/bundle-loader.js
/* istanbul ignore next */
module.exports = require.context('bundle-loader?lazy!../components/', false, /.*\.vue$/)
Use the new module in the module where you extracted it from:
// src/js/lib/loader.js
const loadModule = require('lib/bundle-loader')
Create a mock for the newly created bundle-loader module:
// test/unit/specs/__mocks__/lib/bundle-loader.js
export default () => () => 'foobar'
Use the mock in your test:
// test/unit/specs/lib/loader.spec.js
jest.mock('lib/bundle-loader')
import Loader from 'lib/loader'

describe('lib/loader', () => {
  describe('Loader', () => {
    it('should load', () => {
      const loader = new Loader('[data-module]')
      expect(loader).toBeInstanceOf(Loader)
    })
  })
})
Alrighty! I had major issues with this and managed to come to a solution that worked for me by using a combination of other answers and the Docs. (Took me a good day though)
For anyone else who is struggling:
Create a file called bundle-loader.js and add something like:
module.exports = {
  importFiles: () => {
    const r = require.context(<your_path_to_your_files>)
    <your_processing>
    return <your_processed_files>
  }
}
In your code import like:
import bundleLoader from '<your_relative_Path>/bundle-loader'
Use like
let <your_var_name> = bundleLoader.importFiles()
In your test file right underneath other imports:
jest.mock('../../utils/bundle-loader', () => ({
  importFiles: () => {
    return <this_will_be_what_you_receive_in_the_test_from_import_files>
  }
}))
Installing the babel-plugin-transform-require-context package and adding the plugin to .babelrc resolved the issue for me.
Refer to the documentation here:
https://www.npmjs.com/package/babel-plugin-transform-require-context
The easiest and fastest way to solve this problem will be to install require-context.macro
npm install --save-dev require-context.macro
then just replace:
var modulesReq = require.context('.', false, /\.js$/);
with:
var modulesReq = requireContext('.', false, /\.js$/);
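Note that, assuming a babel-plugin-macros setup (Create React App includes one via babel-preset-react-app), you also need to import the macro at the top of the file:

import requireContext from 'require-context.macro';

var modulesReq = requireContext('.', false, /\.js$/);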
That's it, you should be good to go!
Cheers and good luck!
Implementation problems not mentioned:
Jest prevents out-of-scope variables in mock, like __dirname.
Create React App limits Babel and Jest customization. You need to use src/setupTests.js which is run before every test.
fs is not supported in the browser. You will need something like browserFS. Now your app has file system support, just for dev.
Potential race condition. Export after this import. One of your require.context imports includes that export. I'm sure require takes care of this, but now we are adding a lot of fs work on top of it.
Type checking.
Either #4 or #5 created undefined errors. Type out the imports, no more errors. No more concerns about what can or can't be imported and where.
Motivation for all this? Extensibility. Keeping future modifications limited to one new file. Publishing separate modules is a better approach.
If there's an easier way to import, node would do it. Also this smacks of premature optimization. You end up scrapping everything anyways because you're now using an industry leading platform or utility.
If you're using Jest with test-utils in Vue.
Install these packages:
@vue/cli-plugin-babel
and
babel-plugin-transform-require-context
Then define babel.config.js at the root of the project with this configuration:
module.exports = function(api) {
  api.cache(true);

  const presets = [
    '@vue/cli-plugin-babel/preset'
  ];
  const plugins = [];

  if (process.env.NODE_ENV === 'test') {
    plugins.push('transform-require-context');
  }

  return {
    presets,
    plugins
  };
};
This will check if the current process is initiated by Jest and if so, it mocks all the require.context calls.
I faced the same issue with an ejected create-react-app project,
and none of the answers above helped me...
My solution was to change config/babelTransform.js to the following:
// babelJest and hasJsxRuntime are already defined earlier in CRA's original config/babelTransform.js
module.exports = babelJest.createTransformer({
  presets: [
    [
      require.resolve('babel-preset-react-app'),
      {
        runtime: hasJsxRuntime ? 'automatic' : 'classic',
      },
    ],
  ],
  plugins: ["transform-require-context"],
  babelrc: false,
  configFile: false,
});
Simplest solution for this:
Just do
var modulesReq = require.context && require.context('.', false, /\.js$/);
if (modulesReq) {
  modulesReq.keys().forEach(function(module) {
    modulesReq(module);
  });
}
Here I have added an extra check: the calls only execute if require.context is defined. By doing this, Jest will no longer complain.
Is it possible to define a global variable with webpack to result something like this:
var myvar = {};
All of the examples I saw were using external file require("imports?$=jquery!./file.js")
There are several way to approach globals:
1. Put your variables in a module.
Webpack evaluates modules only once, so your instance remains global and carries changes through from module to module. So if you create something like a globals.js and export an object of all your globals then you can import './globals' and read/write to these globals. You can import into one module, make changes to the object from a function and import into another module and read those changes in a function. Also remember the order things happen. Webpack will first take all the imports and load them up in order starting in your entry.js. Then it will execute entry.js. So where you read/write to globals is important. Is it from the root scope of a module or in a function called later?
config.js
export default {
FOO: 'bar'
}
somefile.js
import CONFIG from './config.js'
console.log(`FOO: ${CONFIG.FOO}`)
Note: If you want the instance to be new each time, then use an ES6 class. Traditionally in JS you would capitalize classes (as opposed to the lowercase for objects) like
import FooBar from './foo-bar' // <-- Usage: myFooBar = new FooBar()
2. Use Webpack's ProvidePlugin.
Here's how you can do it using Webpack's ProvidePlugin (which makes a module available as a variable in every module and only those modules where you actually use it). This is useful when you don't want to keep typing import Bar from 'foo' again and again. Or you can bring in a package like jQuery or lodash as global here (although you might take a look at Webpack's Externals).
Step 1. Create any module. For example, a global set of utilities would be handy:
utils.js
export function sayHello () {
console.log('hello')
}
Step 2. Alias the module and add to ProvidePlugin:
webpack.config.js
var webpack = require("webpack");
var path = require("path");
// ...
module.exports = {
// ...
resolve: {
extensions: ['', '.js'],
alias: {
'utils': path.resolve(__dirname, './utils') // <-- When you build or restart dev-server, you'll get an error if the path to your utils.js file is incorrect.
}
},
plugins: [
// ...
new webpack.ProvidePlugin({
'utils': 'utils'
})
]
}
Now just call utils.sayHello() in any js file and it should work. Make sure you restart your dev-server if you are using that with Webpack.
Note: Don't forget to tell your linter about the global, so it won't complain. For example, see my answer for ESLint here.
3. Use Webpack's DefinePlugin.
If you just want to use const with string values for your globals, then you can add this plugin to your list of Webpack plugins:
new webpack.DefinePlugin({
  PRODUCTION: JSON.stringify(true),
  VERSION: JSON.stringify("5fa3b9"),
  BROWSER_SUPPORTS_HTML5: true,
  TWO: "1+1",
  "typeof window": JSON.stringify("object")
})
Use it like:
console.log("Running App version " + VERSION);
if(!BROWSER_SUPPORTS_HTML5) require("html5shiv");
4. Use the global window object (or Node's global).
window.foo = 'bar' // For SPA's, browser environment.
global.foo = 'bar' // Webpack will automatically convert this to window if your project is targeted for web (default), read more here: https://webpack.js.org/configuration/node/
You'll see this commonly used for polyfills, for example: window.Promise = Bluebird
5. Use a package like dotenv.
(For server side projects) The dotenv package will take a local configuration file (which you could add to your .gitignore if it contains any keys/credentials) and add your configuration variables to Node's process.env object.
// As early as possible in your application, require and configure dotenv.
require('dotenv').config()
Create a .env file in the root directory of your project. Add environment-specific variables on new lines in the form of NAME=VALUE. For example:
DB_HOST=localhost
DB_USER=root
DB_PASS=s1mpl3
That's it.
process.env now has the keys and values you defined in your .env file.
var db = require('db')
db.connect({
  host: process.env.DB_HOST,
  username: process.env.DB_USER,
  password: process.env.DB_PASS
})
Notes
Regarding Webpack's Externals, use it if you want to exclude some modules from being included in your built bundle. Webpack will make the module globally available but won't put it in your bundle. This is handy for big libraries like jQuery (because tree shaking external packages doesn't work in Webpack) where you have these loaded on your page already in separate script tags (perhaps from a CDN).
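For example, a minimal externals sketch, assuming jQuery is already loaded on the page via its own script tag:

// webpack.config.js – sketch only
module.exports = {
  // ...
  externals: {
    // import $ from 'jquery' will resolve to the pre-existing global window.jQuery
    jquery: 'jQuery',
  },
};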
I was about to ask the very same question. After searching a bit further and deciphering part of webpack's documentation, I think that what you want is output.library and output.libraryTarget in the webpack.config.js file.
For example:
js/index.js:
var foo = 3;
var bar = true;

// Export what you want to expose on the library object:
module.exports = { foo: foo, bar: bar };
webpack.config.js
module.exports = {
  ...
  entry: './js/index.js',
  output: {
    path: './www/js/',
    filename: 'index.js',
    library: 'myLibrary',
    libraryTarget: 'var'
  }
  ...
}
Now, if you link the generated www/js/index.js file in an HTML script tag, you can access myLibrary.foo from anywhere in your other scripts.
Use DefinePlugin.
The DefinePlugin allows you to create global constants which can be
configured at compile time.
new webpack.DefinePlugin(definitions)
Example:
plugins: [
  new webpack.DefinePlugin({
    PRODUCTION: JSON.stringify(true)
  })
  //...
]
Usage:
console.log(`Environment is in production: ${PRODUCTION}`);
You can define window.myvar = {}.
When you want to use it, you can simply write something like window.myvar = 1.
DefinePlugin doesn't actually define anything. What it does is replace variables that exist in your bundle code. If the variable doesn't exist in your code, it will do nothing. So it doesn't create global variables.
In order to create a global variable, write it in your code:
window.MyGlobal = MY_GLOBAL;
And use DefinePlugin to replace MY_GLOBAL with some code:
new webpack.DefinePlugin({
  'MY_GLOBAL': `'foo'`,
  // or
  'MY_GLOBAL': `Math.random()`,
}),
Then your output JS will be like this:
window.MyGlobal = 'foo';
// or
window.MyGlobal = Math.random();
But MY_GLOBAL will never actually exist at runtime, because it is never defined. So that's why DefinePlugin has a misleading name.
I solved this issue by setting the global variables as static properties on the classes to which they are most relevant. In ES5 it looks like this:
var Foo = function(){...};
Foo.globalVar = {};
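For comparison, a sketch of the same idea with ES6+ syntax (static class fields need a modern runtime or a Babel transform):

class Foo {
  // shared by all code that imports Foo
  static globalVar = {};
}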
You may hit this issue when trying to bundle <script>-tag JS files in some old project.
Do not use webpack for this; it may even be impossible if you are joining 50+ libraries like jQuery and then figuring out all the global variables, or if they used nested requires. I would advise simply using UglifyJS instead, which avoids all these problems in 2 commands:
npm install uglify-js -g
uglifyjs --compress --mangle --output bundle.js -- js/jquery.js js/silly.js