I'm trying to move from Gulp to Webpack. In Gulp I have a task that copies all files and folders from the /static/ folder to the /build/ folder. How do I do the same with Webpack? Do I need some plugin?
Requiring assets using the file-loader module is the way webpack is intended to be used (source). However, if you need greater flexibility or want a cleaner interface, you can also copy static files directly using my copy-webpack-plugin (npm, Github). For your static to build example:
const path = require('path');
const CopyWebpackPlugin = require('copy-webpack-plugin');

module.exports = {
  context: path.join(__dirname, 'your-app'),
  plugins: [
    new CopyWebpackPlugin({
      patterns: [
        { from: 'static' }
      ]
    })
  ]
};
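If you haven't already, install the plugin as a dev dependency:
npm install --save-dev copy-webpack-plugin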
Compatibility note: if you're using an old version of webpack such as webpack#4.x.x, use copy-webpack-plugin#6.x.x. Otherwise, use the latest version.
You don't need to copy things around; webpack works differently than gulp. Webpack is a module bundler, and everything you reference in your files will be included. You just need to specify a loader for that.
So if you write:
var myImage = require("./static/myImage.jpg");
Webpack will first try to parse the referenced file as JavaScript (because that's the default). Of course, that will fail. That's why you need to specify a loader for that file type. The file-loader or url-loader, for instance, takes the referenced file, puts it into webpack's output folder (which should be build in your case) and returns the hashed URL for that file.
var myImage = require("./static/myImage.jpg");
console.log(myImage); // '/build/12as7f9asfasgasg.jpg'
Usually loaders are applied via the webpack config:
// webpack.config.js
module.exports = {
  ...
  module: {
    loaders: [
      { test: /\.(jpe?g|gif|png|svg|woff|ttf|wav|mp3)$/, loader: "file" }
    ]
  }
};
Of course you need to install the file-loader first to make this work.
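For example, with npm:
npm install --save-dev file-loader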
If you want to copy your static files, you can use the file-loader in this way:
For html files:
In webpack.config.js:
module.exports = {
  ...
  module: {
    loaders: [
      {
        test: /\.(html)$/,
        loader: "file?name=[path][name].[ext]&context=./app/static"
      }
    ]
  }
};
In your js file:
require.context("./static/", true, /^\.\/.*\.html/);
./static/ is relative to where your js file is.
You can do the same with images or whatever.
require.context is a powerful method to explore!
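For example, assuming you add a matching file-loader rule for image extensions, a require.context call for images might look like this (the folder and extensions are just illustrative):
// Picks up every png/jpg/gif/svg under ./static/images/ (recursively),
// provided a corresponding file-loader rule exists in webpack.config.js.
require.context("./static/images/", true, /\.(png|jpe?g|gif|svg)$/);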
One advantage of the aforementioned copy-webpack-plugin that hasn't been explained here is that all the other methods mentioned still bundle the resources into your bundle files (and require you to "require" or "import" them somewhere). If I just want to move some images or template partials around, I don't want to clutter up my JavaScript bundle file with useless references to them; I just want the files emitted in the right place. I haven't found any other way to do this in webpack. Admittedly it's not what webpack was originally designed for, but it's definitely a current use case.
(@BreakDS I hope this answers your question - it's only a benefit if you want it)
Webpack 5 adds Asset Modules which are essentially replacements for common file loaders. I've copied a relevant portion of the documentation below:
asset/resource emits a separate file and exports the URL. Previously achievable by using file-loader.
asset/inline exports a data URI of the asset. Previously achievable by using url-loader.
asset/source exports the source code of the asset. Previously achievable by using raw-loader.
asset automatically chooses between exporting a data URI and emitting a separate file. Previously achievable by using url-loader with asset size limit.
To add one in you can make your config look like so:
// webpack.config.js
module.exports = {
  ...
  module: {
    rules: [
      {
        test: /\.(jpe?g|gif|png|svg|woff|ttf|wav|mp3)$/,
        type: "asset/resource"
      }
    ]
  }
};
To control how the files get output, you can use templated paths.
In the config you can set the global template here:
// webpack.config.js
module.exports = {
  ...
  output: {
    ...
    assetModuleFilename: '[path][name].[hash][ext][query]'
  }
};
To override for a specific set of assets, you can do this:
// webpack.config.js
module.exports = {
  ...
  module: {
    rules: [
      {
        test: /\.(jpe?g|gif|png|svg|woff|ttf|wav|mp3)$/,
        type: "asset/resource",
        generator: {
          filename: '[path][name].[hash][ext][query]'
        }
      }
    ]
  }
};
The provided templating will result in filenames that look like build/images/img.151cfcfa1bd74779aadb.png. The hash can be useful for cache busting etc. You should modify to your needs.
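Once such a rule is in place, importing an asset from your code gives you back the emitted file's URL; a small illustrative example (the file path is an assumption):
// some-component.js - with type "asset/resource", the import resolves to the emitted file's URL
import imgUrl from './images/img.png';

console.log(imgUrl); // e.g. 'images/img.151cfcfa1bd74779aadb.png', depending on your filename template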
The above suggestions are good. But to try to answer your question directly, I'd suggest using cpy-cli in a script defined in your package.json.
This example expects node to be somewhere on your path. Install cpy-cli as a development dependency:
npm install --save-dev cpy-cli
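Note that the helper scripts below also use shelljs and chalk, so install those as dev dependencies too if you don't already have them:
npm install --save-dev shelljs chalk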
Then create a couple of nodejs files. One to do the copy and the other to display a checkmark and message.
copy.js
#!/usr/bin/env node

var shelljs = require('shelljs');
var addCheckMark = require('./helpers/checkmark');
var path = require('path');

var cpy = path.join(__dirname, '../node_modules/cpy-cli/cli.js');

shelljs.exec(cpy + ' /static/* /build/', addCheckMark.bind(null, callback));

function callback() {
  process.stdout.write(' Copied /static/* to the /build/ directory\n\n');
}
checkmark.js
var chalk = require('chalk');

/**
 * Adds mark check symbol
 */
function addCheckMark(callback) {
  process.stdout.write(chalk.green(' ✓'));
  callback();
}

module.exports = addCheckMark;
Add the script in package.json. Assuming scripts are in <project-root>/scripts/
...
"scripts": {
"copy": "node scripts/copy.js",
...
To run the script:
npm run copy
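If you want the copy to happen as part of your build, you could chain it in another script (the build command here is just an illustration):
"scripts": {
  "copy": "node scripts/copy.js",
  "build": "npm run copy && webpack"
}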
The way I load static images and fonts:
module: {
  rules: [
    ....
    {
      test: /\.(jpe?g|png|gif|svg)$/i,
      /* Exclude fonts while working with images, e.g. .svg can be both image or font. */
      exclude: path.resolve(__dirname, '../src/assets/fonts'),
      use: [{
        loader: 'file-loader',
        options: {
          name: '[name].[ext]',
          outputPath: 'images/'
        }
      }]
    },
    {
      test: /\.(woff(2)?|ttf|eot|svg|otf)(\?v=\d+\.\d+\.\d+)?$/,
      /* Exclude images while working with fonts, e.g. .svg can be both image or font. */
      exclude: path.resolve(__dirname, '../src/assets/images'),
      use: [{
        loader: 'file-loader',
        options: {
          name: '[name].[ext]',
          outputPath: 'fonts/'
        }
      }]
    }
  ]
}
Don't forget to install file-loader to have that working.
You can write bash in your package.json:
# package.json
{
  "name": ...,
  "version": ...,
  "scripts": {
    "build": "NODE_ENV=production npm run webpack && cp -v <this> <that> && echo ok",
    ...
  }
}
Most likely you should use CopyWebpackPlugin, which was mentioned in kevlened's answer. Alternatively, for some kinds of files like .html or .json, you can also use raw-loader or json-loader. Install it via npm install -D raw-loader, and then all you need to do is add another loader to your webpack.config.js file.
Like:
{
  test: /\.html/,
  loader: 'raw'
}
Note: Restart the webpack-dev-server for any config changes to take effect.
And now you can require html files using relative paths; this makes it much easier to move folders around.
template: require('./nav.html')
I was stuck here too. copy-webpack-plugin worked for me.
However, 'copy-webpack-plugin' was not necessary in my case (I learned later).
webpack ignores root paths
Example:
<img src="/images/logo.png">
Hence, to make this work without using 'copy-webpack-plugin'
use '~' in paths
<img src="~images/logo.png'>
'~' tells webpack to consider 'images' as a module
Note: you might have to add the parent directory of the images directory in
resolve: {
  modules: [
    'parent-directory of images',
    'node_modules'
  ]
}
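As a minimal sketch, assuming your images live under src/assets/images (an assumed location), that resolve block might look like this:
// webpack.config.js (sketch; 'src/assets' is an assumed location)
const path = require('path');

module.exports = {
  // ...
  resolve: {
    modules: [
      path.resolve(__dirname, 'src/assets'), // parent directory of the images directory
      'node_modules'
    ]
  }
};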
Visit https://vuejs-templates.github.io/webpack/static.html
The webpack config file (in webpack 2) allows you to export a promise chain, so long as the last step returns a webpack config object. See promise configuration docs. From there:
webpack now supports returning a Promise from the configuration file. This allows you to do async processing in your configuration file.
You could create a simple recursive copy function that copies your file, and only after that triggers webpack. E.g.:
module.exports = function () {
  return copyTheFiles(inpath, outpath).then(result => {
    return { entry: "..." }; // Etc etc
  });
};
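copyTheFiles isn't a real API, so here is one possible sketch of it using Node's built-in fs.promises.cp (available in Node 16.7+); the paths and entry point are assumptions:
// webpack.config.js - copy static files first, then hand webpack its config
const fs = require('fs');
const path = require('path');

function copyTheFiles(inpath, outpath) {
  // Recursively copies inpath into outpath before the build starts
  return fs.promises.cp(inpath, outpath, { recursive: true });
}

module.exports = function () {
  return copyTheFiles(path.join(__dirname, 'static'), path.join(__dirname, 'build'))
    .then(() => ({
      entry: './src/index.js' // assumed entry point; add the rest of your config here
    }));
};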
Let's say all your static assets are in a folder "static" at the root level and you want to copy them to the build folder, maintaining the structure of subfolders. Then, in your entry file, just put:
//index.js or index.jsx
require.context("!!file?name=[path][name].[ext]&context=./static!../static/", true, /^\.\/.*\.*/);
In my case I used webpack for a WordPress plugin to compress js files, where the plugin files are already compressed and need to be skipped from the process.
optimization: {
  minimize: false,
},
externals: {
  "jquery": "jQuery",
},
entry: glob.sync('./js/plugin/**.js').reduce(function (obj, el) {
  obj[path.parse(el).name] = el;
  return obj;
}, {}),
output: {
  path: path.resolve(__dirname, './js/dist/plugin'),
  filename: "[name].js",
  clean: true,
},
That copies the js files as they are to the build folder. Using other methods like file-loader or copy-webpack-plugin created issues in my case.
Hope it will help someone.
As I understand, webpack-merge helps us break down our webpack.config file into more manageable chunks, adding configuration relevant to the environment.
While we will separate the production and development specific bits out, note that we'll still maintain a "common" configuration to keep things DRY. In order to merge these configurations together, we'll use a utility called webpack-merge. With the "common" configuration in place, we won't have to duplicate code within the environment-specific configurations.
- Webpack - production
My code in my webpack.prod.js is like so:
const merge = require('webpack-merge');
const common = require('./webpack.common.js');

module.exports = merge(common, {
  mode: 'production',
  devtool: 'source-map',
  module: {
    rules: [
      {
        test: /\.(png|svg|jpg|gif)$/,
        use: [
          'file-loader'
        ],
        exclude: [
          path.resolve(__dirname, "node_modules") // NEED TO ACCESS PATH VARIABLE HERE
        ]
      }
    ]
  }
});
My webpack.common.js defines the variable path, which I thought webpack.prod.js would have access to. I'm assuming it doesn't, as I am presented with an error:
ReferenceError: path is not defined
Question
How do I access the common configuration? Have I misunderstood the concept of webpack-merge?
webpack-merge will take two js objects and merge them using lodash mergeWith. So basically it returns an object which contains the properties of both objects.
It can't provide you with the path variable or any other variable. You will need to explicitly import it in your webpack.prod.js file.
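In other words, require path at the top of webpack.prod.js itself; a minimal sketch:
// webpack.prod.js
const path = require('path');
const merge = require('webpack-merge');
const common = require('./webpack.common.js');

module.exports = merge(common, {
  mode: 'production',
  // ...the rest of the config shown above, now with `path` in scope
});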
So right now I'm working with a prototype where we're using a combination of webpack (for building .tsx files and copying .html files) and webpack-dev-server for development serving. As you might assume, we're also using React and ReactDOM as library dependencies. Our current build output has the following structure:
dist
-favicon.ico
-index.html
-main.js
-main.js.map // for source-mapping between tsx / js files
This places ALL of the modules (including library dependencies) into one big bundled file. I want the end result to look like this:
dist
-favicon.ico
-index.html
-appName.js
-appName.min.js
-react.js
-react.min.js
-reactDOM.js
-reactDOM.min.js
I have references to each of the libraries in index.html and in import statements in the .tsx files. So my question is this...
How do I go from webpack producing this gigantic bundled .js file to individual .js files (libraries included, without having to specify each individually)? Bonus: I know how to do prod/dev environment flags, so how do I just minify those individual files (again without bundling them)?
current webpack.config:
var webpack = require("webpack"); // Assigning node package of webpack dependency to var for later utilization
var path = require("path"); // // Assigning node package of path dependency to var for later utilization
module.exports = {
entry: [
"./wwwroot/app/appName.tsx", // Starting point of linking/compiling Typescript and dependencies, will need to add separate entry points in case of not deving SPA
"./wwwroot/index.html", // Starting point of including HTML and dependencies, will need to add separate entry points in case of not deving SPA
"./wwwroot/favicon.ico" // Input location for favicon
],
output: {
path: "./dist/", // Where we want to host files in local file directory structure
publicPath: "/", // Where we want files to appear in hosting (eventual resolution to: https://localhost:4444/)
filename: "appName.js" // What we want end compiled app JS file to be called
},
// Enable sourcemaps for debugging webpack's output.
devtool: "source-map",
devServer: {
contentBase: './dist', // Copy and serve files from dist folder
port: 4444, // Host on localhost port 4444
// https: true, // Enable self-signed https/ssl cert debugging
colors: true // Enable color-coding for debugging (VS Code does not currently emit colors, so none will be present there)
},
resolve: {
// Add '.ts' and '.tsx' as resolvable extensions.
extensions: [
"",
".ico",
".js",
".ts",
".tsx",
".web.js",
".webpack.js"
]
},
module: {
loaders: [
// This loader copies the index.html file & favicon.ico to the output directory.
{
test: /\.(html|ico)$/,
loader: 'file?name=[name].[ext]'
},
// All files with a '.ts' or '.tsx' extension will be handled by 'ts-loader'.
{
test: /\.tsx?$/,
loaders: ["ts-loader"]
}
],
preLoaders: [
// All output '.js' files will have any sourcemaps re-processed by 'source-map-loader'.
{
test: /\.js$/,
loader: "source-map-loader"
}
]
},
// When importing a module whose path matches one of the following, just
// assume a corresponding global variable exists and use that instead.
// This is important because it allows us to avoid bundling all of our
// dependencies, which allows browsers to cache those libraries between builds.
// externals: {
// "react": "React",
// "react-dom": "ReactDOM",
// "redux": "Redux"
// }
};
Change the output setting to be name driven, e.g.:
entry: {
  dash: 'app/dash.ts',
  home: 'app/home.ts',
},
output: {
  path: './public',
  filename: 'build/[name].js',
  sourceMapFilename: 'build/[name].js.map'
},
To expand upon @basarat's answer, you can use the glob package from npm to build the "entry" config:
const glob = require("glob");

module.exports = [
  {
    target: "node",
    entry: glob.sync("./src/**/*.test.{ts,tsx}").reduce((acc, file) => {
      acc[file.replace(/^\.\/src\//, "")] = file;
      return acc;
    }, {}),
    output: {
      filename: "[name].js",
      chunkFilename: "[name]-[id].js",
      path: __dirname + "/dist"
    },
    //...
  }
];
This builds files with the same name as their source, replacing .ts and .tsx with .js.
OP's answer, copied out of the question:
Ended up finding a solution that fit my needs, although, again, in that webpack-y way, it requires some additional configuration. I'd still like to make it a little more dynamic, but will perfect this at a later point in time. The resolution I was looking for was the ability to "chunk" common modules, though I had originally stated it in terms of output filenames for the entry points provided to webpack. I didn't mind some files being combined where it made sense, but overall I wanted files to be at a component level, given the project wasn't a SPA (single page application).
The additional code ended up being:
plugins: [
  new webpack.optimize.CommonsChunkPlugin({ // This plugin is for extracting and creating "chunks" (extracted parts of the code that are common and aren't page specific)
    // One of these plugin instances needs to be specified for EACH chunk file output desired
    filename: "common.js", // Filename for this particular set of chunks to be stored
    name: "common", // Entry point name given for where to pull all of the chunks
    minChunks: 3 // Minimum number of chunks to be created
  })
]
I also had to parameterize the entry points (see below for an example) by variable name, so that I could assign the react, react-dom, and redux modules to the common.js file.
entry: {
  main: "./wwwroot/app/appName.tsx", // Starting point of linking/compiling Typescript and dependencies, will need to add separate entry points in case of not deving SPA
  index: "./wwwroot/index.html", // Starting point of including HTML and dependencies, will need to add separate entry points in case of not deving SPA
  favicon: "./wwwroot/favicon.ico", // Input location for favicon
  common: [ "react", "react-dom", "redux" ] // All of the "chunks" to extract and place in common file for faster loading of common libraries between pages
},
I'm using file-loader to automatically render a bunch of pug templates to static html files, but webpack is also outputting a pointless file based on the entry point.
E.g. this is in my webpack.config.js:
entry: {
  'js/build/bundle.js': './js/app.js',
  'this-shouldnt-emit': './pug/.pug.js' // pug entry point
},
output: {
  path: path.join(__dirname, '../'),
  filename: '[name]'
},
...
// pug loading module rule
{
  test: /\.pug$/,
  include: path.resolve(__dirname, "../pug"),
  use: [
    "file-loader?name=[path][name].html&context=./pug",
    "pug-html-loader?pretty&exports=false"
  ]
}
and I'm getting a this-shouldnt-emit bundle file in the root build directory, which I don't want.
How can I stop the file-loader from emitting an output bundle without interfering with its generation of all the static html files it currently produces? Is there a plugin or some kind of null loader I can put at the end of the loader chain to kill the bundle emitting?
I am not sure what you are really asking, but if you want to include multiple files in one bundle, use an array. The whole point of using [name] in filename: '[name].bundle.js' is for multiple entries. It is not necessary (but optional) for a single entry key.
This will create two bundle files.
entry: {
  BUNDLE_KEY_ONE: 'path/file.js',
  BUNDLE_KEY_TWO: 'path/otherFile.js',
}
This is how you have multiple files in one bundle.
entry: {
  SINGLE_BUNDLE_NAME: ['file/one.js', 'file/two.js']
}
I'm trying to do something that I believe should be possible, but I really can't understand how to do it just from the webpack documentation.
I am writing a JavaScript library with several modules that may or not depend on each other. On top of that, jQuery is used by all modules and some of them may need jQuery plugins. This library will then be used on several different websites which may require some or all modules.
Defining the dependencies between my modules was very easy, but defining their third-party dependencies seems to be harder then I expected.
What I would like to achieve: for each app I want to have two bundle files one with the necessary third-party dependencies and other with the necessary modules from my library.
Example:
Let's imagine that my library has the following modules:
a (requires: jquery, jquery.plugin1)
b (requires: jquery, a)
c (requires: jquery, jquery.ui, a, b)
d (requires: jquery, jquery.plugin2, a)
And I have an app (see it as a unique entry file) that requires modules a, b and c. Webpack for this case should generate the following files:
vendor bundle: with jquery, jquery.plugin1 and jquery.ui;
website bundle: with modules a, b and c;
In the end, I would prefer to have jQuery as a global so I don't need to require it on every single file (I could require it only on the main file, for example). And jQuery plugins would just extend the $ global in case they are required (it is not a problem if they are available to other modules that don't need them).
Assuming this is possible, what would an example webpack configuration file look like for this case? I tried several combinations of loaders, externals, and plugins in my configuration file, but I don't really get what they are doing and which ones I should use. Thank you!
In my webpack.config.js file (works with webpack versions 1, 2 and 3), I have:
function isExternal(module) {
  var context = module.context;

  if (typeof context !== 'string') {
    return false;
  }

  return context.indexOf('node_modules') !== -1;
}
In my plugins array:
plugins: [
  new CommonsChunkPlugin({
    name: 'vendors',
    minChunks: function(module) {
      return isExternal(module);
    }
  }),
  // Other plugins
]
Now I have a file that only adds 3rd party libs to one file as required.
If you want to get more granular and separate your vendor and entry point files:
plugins: [
  new CommonsChunkPlugin({
    name: 'common',
    minChunks: function(module, count) {
      return !isExternal(module) && count >= 2; // adjustable
    }
  }),
  new CommonsChunkPlugin({
    name: 'vendors',
    chunks: ['common'],
    // or if you have a key/value object for your entries
    // chunks: Object.keys(entry).concat('common')
    minChunks: function(module) {
      return isExternal(module);
    }
  })
]
Note that the order of the plugins matters a lot.
Also, this is going to change in version 4. When that's official, I'll update this answer.
Update: indexOf search change for windows users
I am not sure if I fully understand your problem, but since I had a similar issue recently I will try to help you out.
Vendor bundle.
You should use CommonsChunkPlugin for that. In the configuration you specify the name of the chunk (e.g. vendor), and the file name that will be generated (vendor.js).
new webpack.optimize.CommonsChunkPlugin("vendor", "vendor.js", Infinity),
Now the important part: you have to specify what counts as a vendor library, and you do that in the entry section. Add one more item to the entry list with the same name as the newly declared chunk ('vendor' in this case). The value of that entry should be the list of all the modules that you want to move to the vendor bundle.
In your case it should look something like:
entry: {
  app: 'entry.js',
  vendor: ['jquery', 'jquery.plugin1']
}
jQuery as a global
I had the same problem and solved it with the ProvidePlugin. Here you are not defining a global object, but a kind of shortcut to a module, i.e. you can configure it like this:
new webpack.ProvidePlugin({
  $: "jquery"
})
And now you can just use $ anywhere in your code - webpack will automatically convert that to
require('jquery')
I hope it helped. You can also look at my webpack configuration file here.
I love webpack, but I agree that the documentation is not the nicest one in the world... but hey, people were saying the same thing about the Angular documentation in the beginning :)
Edit:
To have entry-point-specific vendor chunks, just use CommonsChunkPlugin multiple times:
new webpack.optimize.CommonsChunkPlugin("vendor-page1", "vendor-page1.js", Infinity),
new webpack.optimize.CommonsChunkPlugin("vendor-page2", "vendor-page2.js", Infinity),
and then declare different external libraries for different files:
entry: {
  page1: ['entry.js'],
  page2: ['entry2.js'],
  "vendor-page1": [
    'lodash'
  ],
  "vendor-page2": [
    'jquery'
  ]
},
If some libraries overlap between entry points (as most of them will), you can extract them to a common file using the same plugin, just with a different configuration. See this example.
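As a rough sketch of that extra pass, using the options-object form of the plugin with purely illustrative chunk names:
// Pull modules shared by both page-specific vendor chunks into one common file
new webpack.optimize.CommonsChunkPlugin({
  name: 'vendor-common',                    // illustrative name for the shared chunk
  chunks: ['vendor-page1', 'vendor-page2'], // look for overlap in these chunks
  minChunks: 2                              // a module must appear in at least two of them
}),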
In case you're interested in automatically bundling your scripts separately from vendor ones:
var webpack = require('webpack'),
    pkg = require('./package.json'), // loads npm config file
    html = require('html-webpack-plugin');

module.exports = {
  context: __dirname + '/app',
  entry: {
    app: __dirname + '/app/index.js',
    vendor: Object.keys(pkg.dependencies) // get npm vendor deps from config
  },
  output: {
    path: __dirname + '/dist',
    filename: 'app.min-[hash:6].js'
  },
  plugins: [
    // Finally add this line to bundle the vendor code separately
    new webpack.optimize.CommonsChunkPlugin('vendor', 'vendor.min-[hash:6].js'),
    new html({ template: __dirname + '/app/index.html' })
  ]
};
You can read more about this feature in official documentation.
Also, I'm not sure if I fully understand your case, but here is a config snippet to create separate vendor chunks for each of your bundles:
entry: {
  bundle1: './build/bundles/bundle1.js',
  bundle2: './build/bundles/bundle2.js',
  'vendor-bundle1': [
    'react',
    'react-router'
  ],
  'vendor-bundle2': [
    'react',
    'react-router',
    'flummox',
    'immutable'
  ]
},
plugins: [
  new webpack.optimize.CommonsChunkPlugin({
    name: 'vendor-bundle1',
    chunks: ['bundle1'],
    filename: 'vendor-bundle1.js',
    minChunks: Infinity
  }),
  new webpack.optimize.CommonsChunkPlugin({
    name: 'vendor-bundle2',
    chunks: ['bundle2'],
    filename: 'vendor-bundle2-whatever.js',
    minChunks: Infinity
  }),
]
And a link to the CommonsChunkPlugin docs: http://webpack.github.io/docs/list-of-plugins.html#commonschunkplugin