Multiple Webpack builds based on file extension

I'm wondering if it's possible to get a build configuration in Webpack similar to how React Native handles iOS/Android builds based on the extensions .ios.js and .android.js respectively.
To elaborate, given this directory structure:
app/
  index.js
  components/
    button.web.js
    button.mobile.js
I'd like to get
build/
  web/app.js
  mobile/app.js
Does anyone have an idea if this is even possible with webpack?

Yes, using resolve.extensions. For example this webpack configuration snippet:
resolve: {
  extensions: ['', '.js', '.mobile.js']
}
will cause Webpack to look for button, button.js and finally button.mobile.js when encountering require('./button'). You can make the extension a command-line parameter. For example, if you add yargs to your project and use it like so in the webpack config file:
const argv = require('yargs').argv;
// ...
{
  resolve: {
    extensions: ['', '.js', '.' + argv.target + '.js']
  }
}
then webpack --target=mobile will also include .mobile.js files in the bundle, likewise for webpack --target=web or any other target.
Note that if both button.js and button.mobile.js exist, the first one found will be required and the rest ignored. If you have both common and platform-specific parts of a component, you will have to put the two parts into files with different names and require both, for example button.js and button-platform.mobile.js; then var button = require('./button'); var buttonPlatform = require('./button-platform'); and merge the two, as sketched below.
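A minimal sketch of that split, assuming button.js holds the shared logic and button-platform.web.js / button-platform.mobile.js hold the per-platform parts (all names and the Object.assign merge are illustrative, not from the answer above):

// button.js - shared part (illustrative sketch)
var buttonPlatform = require('./button-platform'); // resolves to button-platform.web.js or .mobile.js via resolve.extensions

module.exports = Object.assign(
  {
    type: 'button',
    label: 'Click me' // common defaults shared by both platforms
  },
  buttonPlatform // platform-specific fields override the common ones
);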
EDIT: Answer to second part of the question
The sought output folder structure can be created, again, using yargs - just plug the --target argument from above into the output section of the webpack config:
output: {
  path: path.join(__dirname, argv.target),
  filename: 'app.js'
}
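For completeness, a minimal sketch of the whole config with both pieces wired together (the entry path and the extra 'build' segment in the output path are assumptions added to match the folder layout asked for above):

// webpack.config.js - sketch; run as: webpack --target=web  or  webpack --target=mobile
const path = require('path');
const argv = require('yargs').argv;

module.exports = {
  entry: './app/index.js',
  resolve: {
    // '' and '.js' first, then the platform-specific extension
    extensions: ['', '.js', '.' + argv.target + '.js']
  },
  output: {
    path: path.join(__dirname, 'build', argv.target), // e.g. build/web or build/mobile
    filename: 'app.js'
  }
};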

webpack babel how to copy transpiled file if i don't use import/require

I have a file in my project called test.js
I don't import/require it anywhere, which means webpack won't run babel-loader on that js file.
Question: what I want is to move test.js into the /dist folder, but compiled/transpiled. What's the best practice for this?
What I tried: I tried to use copy-webpack-plugin and its transform parameter to transpile before copying the file, but I can't seem to find the right babel package.
{
  from: 'test.js',
  to: '/dist/test.js',
  transform(content, path) {
    // what do I write here?
  },
}
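For what it's worth, one way to fill in that transform is to call @babel/core directly; a minimal sketch, assuming @babel/core and @babel/preset-env are installed (the preset choice is an assumption, not taken from the question):

const babel = require('@babel/core');

// inside the CopyWebpackPlugin patterns array
{
  from: 'test.js',
  to: 'test.js',
  transform(content) {
    // content arrives as a Buffer; transformSync returns { code, map, ast }
    return babel.transformSync(content.toString(), {
      presets: ['@babel/preset-env']
    }).code;
  },
}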
The simplest approach I can think of is to use several entry points, like this:
{
  entry: {
    a: "./your-main-stuff",
    b: "./your-single-file",
  },
  output: {
    path: path.join(__dirname, "dist"),
    filename: "[name].js"
  }
}
which will create your a.js main bundle and a b.js file in the __dirname/dist folder, both transpiled, provided you used the corresponding loader(s).
And from the copy-webpack-plugin docs:
copy-webpack-plugin is not designed to copy files generated from the build process; rather, it is to copy files that already exist in the source tree, as part of the build process.
so it seems difficult (if at all possible) to make it move transpiled files.
Update: if you want to output files into different folders without changing your src folder, additional tooling is needed. For your case (just one file) I would write a simple script, add it to the package.json scripts section and combine it with the webpack call, like:
"scripts": {
"dev": "webpack && babel path-to-script.js --out-file path-to-script-compiled.js"
}
Just like in the previous answer, initially I went with the "scripts" entry in package.json that runs babel. But for a number of reasons I wanted to use webpack 5 to do the job. So after failing with copy-webpack-plugin and a good amount of digging around I came to this solution:
// assumed to be defined elsewhere: srcDir, outDir, appDir
const glob = require('glob');
const RemoveEmptyScriptsPlugin = require('webpack-remove-empty-scripts');

let config = {
  entry: glob.sync(srcDir + '/**/*.js'), // get all .js files from the source dir
  output: {
    filename: '[name].rem.js', // webpack wants to bundle - it can bundle here ;)
    path: outDir
  },
  resolve: {
    alias: {
      'app': appDir
    }
  },
  plugins: [
    // for every .js source file that gets "bundled", remove the resulting .rem.js bundle file
    new RemoveEmptyScriptsPlugin({ extensions: ['js'], scriptExtensions: /\.rem\.js/ })
  ],
  module: {
    rules: [{
      test: /\.jsx?$/,
      type: 'asset/resource', // get webpack to emit the file instead of bundling it
      generator: {
        filename: ({ filename }) => filename // return the full file name so the directory structure is preserved
      },
      use: {
        loader: 'babel-loader',
        options: {
          targets: { node: 16 },
          presets: [
            ['@babel/preset-env', { modules: 'commonjs' /* transpile import/export */ }],
          ],
          plugins: [] // filled in below
        }
      }
    }]
  }
};
// Since the code does not go through the full pipeline and imports are not getting resolved, aliases will remain in the code.
// To resolve them, you have to hack the aliases object into the babel config:
config.module.rules[0].use.options.plugins.push(['babel-plugin-webpack-alias-7', { config: { resolve: { alias: config.resolve.alias } } }]);

module.exports = config;
And it does a good job, but beware that you need to use the patched versions of the two plugins (unless the patches have been merged already)!

Webpack configuration to concatenate JS

I'm using webpack 4 to bundle my dependencies (in this case AngularJS) like so:
index.js
require('angular');
require('angular-ui-router');
// ...
webpack.config.js
const path = require('path');

module.exports = {
  mode: "development",
  entry: "./build/index.js",
  output: {
    path: __dirname + '/public/js',
    filename: "bundle.js"
  }
}
This generates a bundle.js file containing all my dependencies.
I would additionally like to use it as a task runner to concatenate the AngularJS files from /build/js into a single file (let's call it app.js) that's placed into /public/js.
Ideally, I would like the representation of the concatenated Angular files (what would be in app.js) to be included within bundle.js along with my dependencies, though I'm not sure if this is possible or best practice.
As @Pete has pointed out in the comments, webpack's entry accepts an array of entry paths. Webpack itself doesn't take glob patterns, but you can use the glob package to do so (if you use webpack, it's likely that you've already installed it; otherwise get it here):
const glob = require('glob');

module.exports = {
  mode: 'development',
  entry: glob.sync('./src/**/*.js'), // ['./src/a.js', './src/dir/b.js']
  ...
}
Hope it helps!
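Applied to the question, a sketch with two named entries could produce both files in one run (the ./build/js glob and the output path follow the layout described above and are assumptions):

// webpack.config.js - sketch: one entry for the dependency bundle, one for the concatenated Angular files
const glob = require('glob');

module.exports = {
  mode: 'development',
  entry: {
    bundle: './build/index.js',           // dependencies, as before
    app: glob.sync('./build/js/**/*.js')  // all Angular files concatenated into a single entry
  },
  output: {
    path: __dirname + '/public/js',
    filename: '[name].js'                 // emits bundle.js and app.js
  }
};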

How to copy static files to build directory with Webpack?

I'm trying to move from Gulp to Webpack. In Gulp I have a task which copies all files and folders from the /static/ folder to the /build/ folder. How do I do the same with Webpack? Do I need some plugin?
Requiring assets using the file-loader module is the way webpack is intended to be used (source). However, if you need greater flexibility or want a cleaner interface, you can also copy static files directly using my copy-webpack-plugin (npm, Github). For your static to build example:
const path = require('path');
const CopyWebpackPlugin = require('copy-webpack-plugin');

module.exports = {
  context: path.join(__dirname, 'your-app'),
  plugins: [
    new CopyWebpackPlugin({
      patterns: [
        { from: 'static' }
      ]
    })
  ]
};
Compatibility note: If you're using an old version of webpack like webpack#4.x.x, use copy-webpack-plugin#6.x.x. Otherwise use latest.
You don't need to copy things around; webpack works differently than gulp. Webpack is a module bundler and everything you reference in your files will be included. You just need to specify a loader for that.
So if you write:
var myImage = require("./static/myImage.jpg");
Webpack will first try to parse the referenced file as JavaScript (because that's the default). Of course, that will fail. That's why you need to specify a loader for that file type. The file-loader or url-loader, for instance, takes the referenced file, puts it into webpack's output folder (which should be build in your case) and returns the hashed URL for that file.
var myImage = require("./static/myImage.jpg");
console.log(myImage); // '/build/12as7f9asfasgasg.jpg'
Usually loaders are applied via the webpack config:
// webpack.config.js
module.exports = {
  ...
  module: {
    loaders: [
      { test: /\.(jpe?g|gif|png|svg|woff|ttf|wav|mp3)$/, loader: "file" }
    ]
  }
};
Of course you need to install the file-loader first to make this work.
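For reference, that would typically be (assuming npm as the package manager):

npm install --save-dev file-loader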
If you want to copy your static files you can use the file-loader in this way:
For html files:
In webpack.config.js:
module.exports = {
  ...
  module: {
    loaders: [
      {
        test: /\.(html)$/,
        loader: "file?name=[path][name].[ext]&context=./app/static"
      }
    ]
  }
};
In your js file:
require.context("./static/", true, /^\.\/.*\.html/);
./static/ is relative to where your js file is.
You can do the same with images or whatever.
require.context is a powerful method to explore!
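As an illustration of the same trick for images, a sketch (the folder name and extension list are assumptions, and a matching file-loader rule for those extensions is assumed to exist in the config):

// pulls every .png/.jpg/.jpeg/.svg under ./static/images/ through the file-loader,
// keeping the [path][name].[ext] layout configured above
require.context("./static/images/", true, /\.(png|jpe?g|svg)$/);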
One advantage that the aforementioned copy-webpack-plugin brings that hasn't been explained before is that all the other methods mentioned here still bundle the resources into your bundle files (and require you to "require" or "import" them somewhere). If I just want to move some images around or some template partials, I don't want to clutter up my JavaScript bundle file with useless references to them; I just want the files emitted in the right place. I haven't found any other way to do this in webpack. Admittedly it's not what webpack was originally designed for, but it's definitely a current use case.
(@BreakDS I hope this answers your question; it's only a benefit if you want it)
Webpack 5 adds Asset Modules which are essentially replacements for common file loaders. I've copied a relevant portion of the documentation below:
asset/resource emits a separate file and exports the URL. Previously achievable by using file-loader.
asset/inline exports a data URI of the asset. Previously achievable by using url-loader.
asset/source exports the source code of the asset. Previously achievable by using raw-loader.
asset automatically chooses between exporting a data URI and emitting a separate file. Previously achievable by using url-loader with asset size limit.
To add one in you can make your config look like so:
// webpack.config.js
module.exports = {
  ...
  module: {
    rules: [
      {
        test: /\.(jpe?g|gif|png|svg|woff|ttf|wav|mp3)$/,
        type: "asset/resource"
      }
    ]
  }
};
To control how the files get output, you can use templated paths.
In the config you can set the global template here:
// webpack.config.js
module.exports = {
  ...
  output: {
    ...
    assetModuleFilename: '[path][name].[hash][ext][query]'
  }
}
To override for a specific set of assets, you can do this:
// webpack.config.js
module.exports = {
  ...
  module: {
    rules: [
      {
        test: /\.(jpe?g|gif|png|svg|woff|ttf|wav|mp3)$/,
        type: "asset/resource",
        generator: {
          filename: '[path][name].[hash][ext][query]'
        }
      }
    ]
  }
};
The provided templating will result in filenames that look like build/images/img.151cfcfa1bd74779aadb.png. The hash can be useful for cache busting, etc. You should adjust it to your needs.
The above suggestions are good. But to try to answer your question directly, I'd suggest using cpy-cli in a script defined in your package.json.
This example expects node to be somewhere on your path. Install cpy-cli as a development dependency:
npm install --save-dev cpy-cli
Then create a couple of nodejs files. One to do the copy and the other to display a checkmark and message.
copy.js
#!/usr/bin/env node
var shelljs = require('shelljs'); // shelljs (and chalk, used below) must also be installed
var addCheckMark = require('./helpers/checkmark');
var path = require('path');

var cpy = path.join(__dirname, '../node_modules/cpy-cli/cli.js');

shelljs.exec(cpy + ' /static/* /build/', addCheckMark.bind(null, callback));

function callback() {
  process.stdout.write(' Copied /static/* to the /build/ directory\n\n');
}
checkmark.js
var chalk = require('chalk');
/**
* Adds mark check symbol
*/
function addCheckMark(callback) {
process.stdout.write(chalk.green(' ✓'));
callback();
}
module.exports = addCheckMark;
Add the script to package.json. Assuming the scripts live in <project-root>/scripts/:
...
"scripts": {
  "copy": "node scripts/copy.js",
  ...
To run the script:
npm run copy
The way I load static images and fonts:
module: {
  rules: [
    ...
    {
      test: /\.(jpe?g|png|gif|svg)$/i,
      /* Exclude fonts while working with images, e.g. .svg can be both image or font. */
      exclude: path.resolve(__dirname, '../src/assets/fonts'),
      use: [{
        loader: 'file-loader',
        options: {
          name: '[name].[ext]',
          outputPath: 'images/'
        }
      }]
    },
    {
      test: /\.(woff(2)?|ttf|eot|svg|otf)(\?v=\d+\.\d+\.\d+)?$/,
      /* Exclude images while working with fonts, e.g. .svg can be both image or font. */
      exclude: path.resolve(__dirname, '../src/assets/images'),
      use: [{
        loader: 'file-loader',
        options: {
          name: '[name].[ext]',
          outputPath: 'fonts/'
        }
      }]
    }
  ]
}
Don't forget to install file-loader to have that working.
You can write bash in your package.json:
# package.json
{
  "name": ...,
  "version": ...,
  "scripts": {
    "build": "NODE_ENV=production npm run webpack && cp -v <this> <that> && echo ok",
    ...
  }
}
Most likely you should use CopyWebpackPlugin, which was mentioned in kevlened's answer. Alternatively, for some kinds of files like .html or .json you can also use raw-loader or json-loader. Install it via npm install -D raw-loader, and then all you need to do is add another loader to your webpack.config.js file.
Like:
{
  test: /\.html/,
  loader: 'raw'
}
Note: Restart the webpack-dev-server for any config changes to take effect.
And now you can require html files using relative paths; this makes it much easier to move folders around.
template: require('./nav.html')
I was stuck here too. copy-webpack-plugin worked for me.
However, copy-webpack-plugin was not necessary in my case (I learned later).
Webpack ignores root paths, for example:
<img src="/images/logo.png">
Hence, to make this work without using copy-webpack-plugin, use '~' in paths:
<img src="~images/logo.png">
'~' tells webpack to treat 'images' as a module.
Note: you might have to add the parent directory of the images directory to resolve.modules:
resolve: {
  modules: [
    'parent-directory of images',
    'node_modules'
  ]
}
Visit https://vuejs-templates.github.io/webpack/static.html
The webpack config file (in webpack 2) allows you to export a promise chain, so long as the last step returns a webpack config object. See promise configuration docs. From there:
webpack now supports returning a Promise from the configuration file. This allows you to do async processing in your configuration file.
You could create a simple recursive copy function that copies your file, and only after that triggers webpack. E.g.:
module.exports = function () {
  return copyTheFiles(inpath, outpath).then(result => {
    return { entry: "..." }; // etc.
  });
};
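A more concrete sketch of the same idea, assuming Node >= 16.7 (for fs.promises.cp) and the static-to-build copy from the question; the entry and output paths are placeholders:

// webpack.config.js - copy static/ into build/ first, then resolve to the actual config
const fs = require('fs/promises');
const path = require('path');

module.exports = () =>
  fs.cp(path.resolve(__dirname, 'static'), path.resolve(__dirname, 'build'), { recursive: true })
    .then(() => ({
      entry: './src/index.js',
      output: { path: path.resolve(__dirname, 'build'), filename: 'bundle.js' }
    }));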
Let's say all your static assets are in a folder "static" at the root level and you want to copy them to the build folder, maintaining the subfolder structure. Then in your entry file just put:
//index.js or index.jsx
require.context("!!file?name=[path][name].[ext]&context=./static!../static/", true, /^\.\/.*\.*/);
In my case I used webpack for a WordPress plugin to compress js files, where the plugin files are already compressed and need to be skipped from the process.
// glob and path are required at the top of the config file:
// const glob = require('glob'); const path = require('path');
optimization: {
  minimize: false,
},
externals: {
  "jquery": "jQuery",
},
entry: glob.sync('./js/plugin/**.js').reduce(function (obj, el) {
  obj[path.parse(el).name] = el;
  return obj;
}, {}),
output: {
  path: path.resolve(__dirname, './js/dist/plugin'),
  filename: "[name].js",
  clean: true,
},
That copies the js files as they are to the build folder. Other methods like file-loader and copy-webpack-plugin created issues in that setup.
Hope it will help someone.

Resolving require paths with webpack

I'm still confused about how to resolve module paths with webpack. Now I write:
myfile = require('../../mydir/myfile.js')
but I'd like to write
myfile = require('mydir/myfile.js')
I was thinking that resolve.alias may help since I saw a similar example using { xyz: "/some/dir" } as an alias so that one can require("xyz/file.js").
But if I set my alias to { mydir: '/absolute/path/mydir' }, require('mydir/myfile.js') won't work.
I feel dumb because I've read the doc many times and I feel I'm missing something.
What is the right way to avoid writing all the relative requires with ../../ etc?
Webpack >2.0
See wtk's answer.
Webpack 1.0
A more straightforward way to do this would be to use resolve.root.
http://webpack.github.io/docs/configuration.html#resolve-root
resolve.root
The directory (absolute path) that contains your modules. May also be an array of directories. This setting should be used to add individual directories to the search path.
In your case:
webpack config
var path = require('path');
// ...
resolve: {
  root: path.resolve('./mydir'),
  extensions: ['', '.js']
}
consuming module
require('myfile')
or
require('myfile.js')
see also: http://webpack.github.io/docs/configuration.html#resolve-modulesdirectories
For future reference, webpack 2 removed everything but modules as a way to resolve paths. This means root will not work.
https://gist.github.com/sokra/27b24881210b56bbaff7#resolving-options
The example configuration starts with:
{
  modules: [path.resolve(__dirname, "app"), "node_modules"]
  // (was split into `root`, `modulesDirectories` and `fallback` in the old options)
}
resolve.alias should work exactly the way you described, so I'm providing this as an answer to help mitigate any confusion that may result from the suggestion in the original question that it does not work.
A resolve configuration like the one below will give you the desired results:
// used to resolve the absolute path to the project's root directory (where webpack.config.js should be located)
var path = require('path');
...
{
  ...
  resolve: {
    // add alias for application code directory
    alias: {
      mydir: path.resolve(__dirname, 'path', 'to', 'mydir')
    },
    extensions: ['', '.js']
  }
}
require( 'mydir/myfile.js' ) will work as expected. If it does not, there must be some other issue.
If you have multiple modules that you want to add to the search path, resolve.root makes sense, but if you just want to be able to reference components within your application code without relative paths, alias seems to be the most straightforward and explicit.
An important advantage of alias is that it gives you the opportunity to namespace your requires which can add clarity to your code; just like it is easy to see from other requires what module is being referenced, alias allows you to write descriptive requires that make it obvious you're requiring internal modules, e.g. require( 'my-project/component' ). resolve.root just plops you into the desired directory without giving you the opportunity to namespace it further.
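A minimal sketch of that namespacing idea (the 'my-project' alias name and the src path are assumptions):

// webpack.config.js
var path = require('path');

module.exports = {
  resolve: {
    alias: {
      // everything under src/ becomes requireable as 'my-project/...'
      'my-project': path.resolve(__dirname, 'src')
    }
  }
};

// elsewhere in application code:
// var Component = require('my-project/component'); // -> src/component.js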
In case anyone else runs into this problem, I was able to get it working like this:
var path = require('path');
// ...
resolve: {
  root: [path.resolve(__dirname, 'src'), path.resolve(__dirname, 'node_modules')],
  extensions: ['', '.js']
};
where my directory structure is:
.
├── dist
├── node_modules
├── package.json
├── README.md
├── src
│   ├── components
│   ├── index.html
│   ├── main.js
│   └── styles
├── webpack.config.js
Then from anywhere in the src directory I can call:
import MyComponent from 'components/MyComponent';
I have resolved it with Webpack 2 like this:
module.exports = {
  resolve: {
    modules: ["mydir", "node_modules"]
  }
}
You can add more directories to the array...
My biggest headache was working without a namespaced path. Something like this:
./src/app.js
./src/ui/menu.js
./node_modules/lodash/
Before I used to set my environment to do this:
require('app.js')
require('ui/menu')
require('lodash')
I found it far more convenient to avoid an implicit src path, which hides important context information.
My aim is to require like this:
require('src/app.js')
require('src/ui/menu')
require('test/helpers/auth')
require('lodash')
As you can see, all my app code lives within a mandatory path namespace. This makes it quite clear which require call pulls in a library, app code or a test file.
For this I make sure that my resolve paths are just node_modules and the current app folder, unless you namespace your app inside your source folder, like src/my_app.
This is my default with webpack
resolve: {
  extensions: ['', '.jsx', '.js', '.json'],
  root: path.resolve(__dirname),
  modulesDirectories: ['node_modules']
}
It would be even better if you set the environment variable NODE_PATH to your current project folder. This is a more universal solution and it will help if you want to use other tools without webpack: testing, linting...
If you're using create-react-app, you can simply add a .env file containing
NODE_PATH=src/
Source: https://medium.com/@ktruong008/absolute-imports-with-create-react-app-4338fbca7e3d
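Outside of create-react-app, a minimal sketch of setting NODE_PATH for other tools via package.json scripts (the mocha test runner and the server.js entry are assumptions; on Windows you would need something like cross-env):

{
  "scripts": {
    "test": "NODE_PATH=src mocha --recursive",
    "start": "NODE_PATH=src node server.js"
  }
}

Plain Node require() calls honor NODE_PATH, so test or server code can then require 'components/MyComponent' straight from src/.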
Got this solved using Webpack 2 :
resolve: {
extensions: ['', '.js'],
modules: [__dirname , 'node_modules']
}
This thread is old but since no one posted about require.context I'm going to mention it:
You can use require.context to set the folder to look through like this:
var req = require.context('../../mydir/', true);
// the second argument (true) means "search subdirectories"; you can also specify a regex as the third param
return req('./myfile.js');
Simply use babel-plugin-module-resolver:
$ npm i babel-plugin-module-resolver --save-dev
Then create a .babelrc file under root if you don't have one already:
{
  "plugins": [
    [
      "module-resolver",
      {
        "root": ["./"]
      }
    ]
  ]
}
And everything under root will be treated as absolute import:
import { Layout } from 'components'
For VSCode/Eslint support, see here.
I don't get why nobody suggested including mydir's parent directory in modulesDirectories in webpack; that should do the trick easily:
resolve: {
  modulesDirectories: [
    'parentDir',
    'node_modules',
  ],
  extensions: ['', '.js', '.jsx']
},
