How to shim tinymce in webpack?

I'm trying to get tinymce recognized by webpack. It sets a property named tinymce on window, so evidently one option is to require() it using syntax like this (described at the bottom of the EXPORTING section of the webpack docs):
require("imports?window=>{}!exports?window.XModule!./file.js
But in this example, how is ./file.js resolved? I installed tinymce via npm, and I can't figure out how to specify the right path to the tinymce.js file.
Regardless, I'd rather handle this in my configuration and be able to just require('tinymce') if possible, so I've installed exports-loader and added the following to my configuration (based on this discussion):
module: {
  loaders: [
    {
      test: /[\/]tinymce\.js$/,
      loader: 'exports?tinymce'
    }
  ]
}
Unfortunately this isn't working. What's wrong with my configuration?

The tinymce module on npm can't be required directly, but contains 4 different distributions of the library. Namely:
tinymce/tinymce.js
tinymce/tinymce.min.js
tinymce/tinymce.jquery.js
tinymce/tinymce.jquery.min.js
To be able to do require('tinymce') in your code, you can add an alias in your webpack config, as well as a custom loader for your distribution of choice.
resolve: {
  alias: {
    // require('tinymce') will do require('tinymce/tinymce')
    tinymce: 'tinymce/tinymce',
  },
},
module: {
  loaders: [
    {
      // Only apply on tinymce/tinymce
      include: require.resolve('tinymce/tinymce'),
      // Export window.tinymce
      loader: 'exports?window.tinymce',
    },
  ],
},
Where you can replace tinymce/tinymce with your distribution of choice.
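With that in place, a minimal usage sketch (the selector is an assumption; a theme and skin still need to be reachable, as the answers below show):
// anywhere in application code
var tinymce = require('tinymce'); // resolves to tinymce/tinymce through the alias

tinymce.init({ selector: 'textarea' });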

Just like @cchamberlain, I ended up using script-loader for tinymce, but to load the plugins and other resources that are not required by default I used CopyWebpackPlugin instead of ES6 imports, for a more configurable solution.
var copyWebpackPlugin = require('copy-webpack-plugin');

module.exports = {
  // ...
  plugins: [
    new copyWebpackPlugin([
      { from: './node_modules/tinymce/plugins', to: './plugins' },
      { from: './node_modules/tinymce/themes', to: './themes' },
      { from: './node_modules/tinymce/skins', to: './skins' }
    ])
  ]
};

I was able to integrate tinyMCE in my Angular 2/TypeScript based project by using imports-loader, exports-loader and copy-webpack-plugin.
First ensure that the necessary dependencies are available and part of the package.json file of your project:
npm i tinymce --save
npm i exports-loader --save-dev
npm i imports-loader --save-dev
npm i copy-webpack-plugin --save-dev
Then add the required loader to the loaders-section of your webpack configuration:
loaders: [
  {
    test: require.resolve('tinymce/tinymce'),
    loaders: [
      'imports?this=>window',
      'exports?window.tinymce'
    ]
  },
  {
    test: /tinymce\/(themes|plugins)\//,
    loaders: [
      'imports?this=>window'
    ]
  }
]
To make the copyWebpackPlugin available in your webpack configuration, import it in the header part of the webpack configuration file:
var copyWebpackPlugin = require('copy-webpack-plugin');
And, as Petri Ryhänen commented, add the following entry to the plugins-section of your webpack configuration:
plugins: [
  new copyWebpackPlugin([
    { from: './node_modules/tinymce/plugins', to: './plugins' },
    { from: './node_modules/tinymce/themes', to: './themes' },
    { from: './node_modules/tinymce/skins', to: './skins' }
  ])
]
This step ensures that the (required) addons of tinyMCE are also available in your webpack build.
Finally to import tinyMCE in your Angular 2 component file, add
require('tinymce')
declare var tinymce: any;
to the import section and tinyMCE is ready to use.
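For completeness, a minimal init sketch once those two lines are in place (the selector, skin path and plugin list are assumptions; call it after the target element exists, e.g. in ngAfterViewInit):
tinymce.init({
  selector: '#my-editor',      // assumed element id in your component's template
  skin_url: 'skins/lightgray', // adjust to where the copied skins end up relative to your public path
  plugins: 'link lists'        // assumes these plugins were among the copied files
});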

I got this to work similar to how I bundle React to ensure I don't get two separate instances in DOM. I had some issues with imports / exports / expose loaders so instead I used script-loader.
In my setup I have a commons chunk that I use strictly for vendors (React / tinymce).
entry: { 'loading': '../src/app/entry/loading'
       , 'app': '../src/app/entry/app'
       , 'timeout': '../src/app/entry/timeout'
       , 'commons': [ 'expose?React!react'
                    , 'expose?ReactDOM!react-dom'
                    , 'script!tinymce/tinymce.min.js'
                    ]
       }
This worked for me the same way that including the script from a CDN would; however, I then had errors because it could not find my themes / plugins / skins paths from my node_modules location. It was looking for them at /assets/plugins, /assets/themes and /assets/skins (I use the webpack public path /assets/).
I resolved the second issue by mapping express to serve these routes statically, like so (es6):
const NODE_MODULES_ROOT = path.resolve(__dirname, 'node_modules')
const TINYMCE_PLUGINS_ROOT = path.join(NODE_MODULES_ROOT, 'tinymce/plugins')
const TINYMCE_THEMES_ROOT = path.join(NODE_MODULES_ROOT, 'tinymce/themes')
const TINYMCE_SKINS_ROOT = path.join(NODE_MODULES_ROOT, 'tinymce/skins')
router.use('/assets/plugins', express.static(TINYMCE_PLUGINS_ROOT))
router.use('/assets/themes', express.static(TINYMCE_THEMES_ROOT))
router.use('/assets/skins', express.static(TINYMCE_SKINS_ROOT))
After doing this, window.tinymce / window.tinyMCE are both defined and function the same as with the CDN.
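With the assets served from /assets/, a minimal init sketch (selector, theme and plugin choices are assumptions):
// tinymce.min.js was loaded by the commons chunk, so the global is available
window.tinymce.init({
  selector: 'textarea',
  theme: 'modern',      // fetched from /assets/themes/modern/ at runtime
  plugins: 'link lists' // fetched from /assets/plugins/link/ and /assets/plugins/lists/
});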

As an addition to this answer (thanks to Petri Ryhänen), I want to add my copyWebpackPlugin and tinymce.init() configuration adjustments.
new copyWebpackPlugin([{
  context: './node_modules/tinymce/skins/lightgray',
  from: './**/*',
  to: './tinymce/skin',
}]),
With this configuration you will get all skin files in the {output}/tinymce/skin folder.
Then you can initialize tinymce like this:
import tinymce from 'tinymce/tinymce';

// A theme is also required
import 'tinymce/themes/modern/theme'; // you may change to the 'inlite' theme

// Any plugins you want to use have to be imported
import 'tinymce/plugins/advlist/plugin';
// ... possibly other plugins

// Then anywhere in this file you can:
tinymce.init({
  // ... possibly other options
  skin_url: '/tinymce/skin', // <-- !!! here we tell tinymce where to load skin files from
  // ... possibly other options
});
With this I have both development and production builds working normally.

We use TinyMCE jQuery 4.1.6 and the accepted answer did not work for us because window seems to be used in other locations by TinyMCE (e.g. window.setTimeout). Also, document not being shimmed seemed to cause problems.
This works for us:
alias: {
  'tinymce': 'tinymce/tinymce.jquery.js'
}

module: {
  loaders: [
    {
      test: /tinymce\/tinymce\.jquery\.js/,
      loader: 'imports?document=>window.document,this=>window!exports?window.tinymce'
    }
  ]
}
Load your plugins like this:
{
  test: /tinymce\/plugins/,
  loader: 'imports?tinymce,this=>{tinymce:tinymce}'
}

Related

How to Make Webpack Produce a JS file as-is to Use it Later for Configuration [duplicate]

Webpack resolve alias and compile file under that alias

I have a project which uses lerna (monorepo, multiple packages). A few of the packages are standalone apps.
What I want to achieve is having aliases on a few of the packages, to have something like dependency injection. So for example I have an alias @package1/backendProvider/useCheckout, and in webpack in my standalone app I resolve it as ../../API/REST/useCheckout. So when I change the backend provider to something else, I would only change it in webpack.
Problem
The problem appears when this alias is used by some other package (not the standalone app). For example:
The directory structure looks like this:
Project
  packageA
    ComponentA
  packageB
    API
      REST
        useCheckout
  standalone app
ComponentA is in packageA
useCheckout is in packageB under the /API/REST/useCheckout path
ComponentA uses useCheckout with an alias, like import useCheckout from '@packageA/backendProvider/useCheckout'
The standalone app uses ComponentA
The error I get is Module not found: Can't resolve '@packageA/backendProvider/useCheckout'
However, when the same alias is used in the standalone app (which has the webpack config provided below) it works. The problem occurs only for dependencies.
Potential solutions
I know that one solution would be to compile each package with webpack - but that doesn't really seem friendly. What I think is doable is to tell webpack to resolve those aliases to directory paths and then recompile them. The first part (resolving aliases) is done.
Current code
As I'm using NextJS my webpack config looks like this:
webpack: (config, { buildId, dev, isServer, defaultLoaders }) => {
  // Fixes npm packages that depend on `fs` module
  config.node = {
    fs: "empty"
  };

  const aliases = {
    ...
    "@package1/backendProvider": "../../API/REST/"
  };

  Object.keys(aliases).forEach(alias => {
    config.module.rules.push({
      test: /\.(js|jsx)$/,
      include: [path.resolve(__dirname, aliases[alias])],
      use: [defaultLoaders.babel]
    });
    config.resolve.alias[alias] = path.resolve(__dirname, aliases[alias]);
  });

  return config;
}
You don't need to use aliases. I have a similar setup; just switch to yarn (v1) workspaces, which does a pretty smart trick: it adds symlinks to all of your packages in the root node_modules.
This way, each package can import other packages without any issue.
In order to apply yarn workspaces with lerna:
// lerna.json
{
  "npmClient": "yarn",
  "useWorkspaces": true,
  "packages": [
    "packages/**"
  ]
}

// package.json
{
  ...
  "private": true,
  "workspaces": [
    "packages/*"
  ]
  ...
}
This will enable yarn workspaces with lerna.
The only thing that remains is to make the consuming package transpile the required packages (since the default configs of babel & webpack ignore node_modules transpilation).
In a Next.js project this is easy: use next-transpile-modules.
// next.config.js
const withTM = require('next-transpile-modules')(['somemodule', 'and-another']); // pass the modules you would like to see transpiled
module.exports = withTM();
In other packages that use webpack you will need to instruct webpack to transpile your consumed packages (let's assume they are under the npm scope @somescope/).
So, for example, in order to transpile typescript, you can add an additional module loader.
// webpack.config.js
{
  ...
  module: {
    rules: [
      {
        test: /\.ts$/,
        loader: 'ts-loader',
        include: /[\\/]node_modules[\\/]@somescope[\\/]/, // <-- instruct to transpile ts files from this path
        options: {
          allowTsInNodeModules: true, // <-- this is a specific option of ts-loader
          transpileOnly: isDevelopment,
          compilerOptions: {
            module: 'commonjs',
            noEmit: false,
          },
        },
      }
    ]
  }
  ...
  resolve: {
    symlinks: false, // <-- important
  }
}
If you have css, you will need to add a section for css as well.
Hope this helps.
Bonus advantage: yarn workspaces will reduce your node_modules size, since duplicate packages (with the same semver version) are installed only once!

How do we include only required modules from lodash in a Nuxt/VueJS project?

We have built a Nuxt/VueJS project.
Nuxt has its own config file called nuxt.config.js within which we configure webpack and other build setup.
In our package.json, we have included the lodash package.
In our code, we have been careful to import only what we require, for example:
import orderBy from 'lodash/orderBy'
In nuxt.config.js, lodash is added to the vendor list.
However when we create the build, webpack always includes the entire lodash library instead of including only what we have used in our code.
I have read numerous tutorials but haven't found the answer. Some of those answers would surely work if it were a webpack-only project, but in our case everything goes through the nuxt config file.
Looking forward to some help.
Below is the partial nuxt.config.js file. Only relevant/important parts are included:
const resolve = require('resolve')
const webpack = require('webpack')
module.exports = {
  /*
  ** Headers of the page
  */
  head: {
  },
  modules: [
    ['@nuxtjs/component-cache', { maxAge: 1000 * 60 * 10 }]
  ],
  plugins: [
    { src: '~/plugins/intersection', ssr: false },
  ],
  build: {
    vendor: ['moment', 'lodash'],
    analyze: {
      analyzerMode: 'static'
    },
    postcss: {
      plugins: {
        'postcss-custom-properties': false
      }
    },
    plugins: [
      new webpack.IgnorePlugin(/^\.\/locale$/, /moment$/)
    ],
    /*
    ** Run ESLINT on save
    */
    extend (config, ctx) {
      // config.resolve.alias['create-api'] = `./create-api-${ctx.isClient ? 'client' : 'server'}.js`
    }
  }
}
You can npm install only the required packages
Lodash can be split up into per-method packages; you can find a list of the already available ones here. You can use them like this: npm i -S lodash.orderby. I didn't check it, but you would probably also need to change import orderBy from 'lodash/orderBy' to import orderBy from 'lodash.orderby'.
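A minimal sketch of that change in a component, assuming lodash.orderby has been installed (the data and keys are hypothetical):
// before: import orderBy from 'lodash/orderBy'
import orderBy from 'lodash.orderby'

const sorted = orderBy(products, ['price'], ['asc'])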

creating javascript library with webpack

I don't understand why this is so complicated. I want my project to have 2 separate workspaces, where one is a library that will be distributed and the other will be used for testing... this is how I have the file structure:
project
  engine
    math
      vec2.js
    dist
      library.js
    main.js
  sandbox
    main.js
I want to build the "engine" project with webpack and es6 modules so I get a "library" file that can be used in "sandbox".
The "engine" main file would look something like this
import vec2 from './math/vec2';

export default class Library {
  constructor() {
    this.vec2 = vec2;
  }
}
And then the sandbox main file would look something like this:
import lib from '../engine/dist/library';
const game = new lib();
The problem is that when I build the library.js file with webpack and import it in the sandbox main file, I can't call any of the classes therein. I get this error:
Uncaught TypeError: o.default is not a constructor
at Object.<anonymous> (library.js:1)
at e (library.js:1)
at library.js:1
at library.js:1
My webpack.config.js file looks like this
var webpack = require('webpack');

module.exports = {
  context: __dirname,
  entry: __dirname + "/main.js",
  output: {
    path: __dirname + "/dist",
    filename: "library.js"
  },
  module: {
    loaders: [
      {
        test: /\.js$/,
        exclude: /(node_modules)/,
        loader: 'babel-loader',
        query: {
          presets: ['es2015']
        }
      }
    ]
  },
  plugins: [
    new webpack.optimize.UglifyJsPlugin()
  ]
};
I must be missing some configuration webpack needs or some plugin that will make this work. I simply want to build the library with webpack using es6 modules so it can be used in another project, but I have no idea how to configure it. I'm using babel for transpiling es6 to es5.
You need to configure output.libraryTarget. In this case the target commonjs-module is appropriate. So your output would be:
output: {
  path: __dirname + "/dist",
  filename: "library.js",
  libraryTarget: "commonjs-module"
},
The different targets are described in the docs. And you might also want to read Guides - Authoring Libraries.
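With that output setting, the sandbox code from the question should work as written; a quick consumption sketch (paths as in the question):
// sandbox/main.js
import Library from '../engine/dist/library';

const game = new Library();
console.log(game.vec2); // the vec2 module exposed by the Library constructor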

How to copy static files to build directory with Webpack?

I'm trying to move from Gulp to Webpack. In Gulp I have a task which copies all files and folders from the /static/ folder to the /build/ folder. How do I do the same with Webpack? Do I need some plugin?
Requiring assets using the file-loader module is the way webpack is intended to be used (source). However, if you need greater flexibility or want a cleaner interface, you can also copy static files directly using my copy-webpack-plugin (npm, Github). For your static to build example:
const path = require('path');
const CopyWebpackPlugin = require('copy-webpack-plugin');

module.exports = {
  context: path.join(__dirname, 'your-app'),
  plugins: [
    new CopyWebpackPlugin({
      patterns: [
        { from: 'static' }
      ]
    })
  ]
};
Compatibility note: If you're using an old version of webpack like webpack#4.x.x, use copy-webpack-plugin#6.x.x. Otherwise use latest.
You don't need to copy things around; webpack works differently than gulp. Webpack is a module bundler and everything you reference in your files will be included. You just need to specify a loader for that.
So if you write:
var myImage = require("./static/myImage.jpg");
Webpack will first try to parse the referenced file as JavaScript (because that's the default). Of course, that will fail. That's why you need to specify a loader for that file type. The file-loader or url-loader, for instance, takes the referenced file, puts it into webpack's output folder (which should be build in your case) and returns the hashed url for that file.
var myImage = require("./static/myImage.jpg");
console.log(myImage); // '/build/12as7f9asfasgasg.jpg'
Usually loaders are applied via the webpack config:
// webpack.config.js
module.exports = {
  ...
  module: {
    loaders: [
      { test: /\.(jpe?g|gif|png|svg|woff|ttf|wav|mp3)$/, loader: "file" }
    ]
  }
};
Of course you need to install the file-loader first to make this work.
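For reference, the usual install command (as a dev dependency):
npm install --save-dev file-loader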
If you want to copy your static files, you can use the file-loader in this way:
For html files, in webpack.config.js:
module.exports = {
  ...
  module: {
    loaders: [
      {
        test: /\.(html)$/,
        loader: "file?name=[path][name].[ext]&context=./app/static"
      }
    ]
  }
};
In your js file:
require.context("./static/", true, /^\.\/.*\.html/);
./static/ is relative to where your js file is.
You can do the same with images or whatever.
The context is a powerful method to explore!
One advantage that the aforementioned copy-webpack-plugin brings that hasn't been explained before is that all the other methods mentioned here still bundle the resources into your bundle files (and require you to "require" or "import" them somewhere). If I just want to move some images around or some template partials, I don't want to clutter up my javascript bundle file with useless references to them, I just want the files emitted in the right place. I haven't found any other way to do this in webpack. Admittedly it's not what webpack originally was designed for, but it's definitely a current use case.
(@BreakDS I hope this answers your question - it's only a benefit if you want it)
Webpack 5 adds Asset Modules which are essentially replacements for common file loaders. I've copied a relevant portion of the documentation below:
asset/resource emits a separate file and exports the URL. Previously achievable by using file-loader.
asset/inline exports a data URI of the asset. Previously achievable by using url-loader.
asset/source exports the source code of the asset. Previously achievable by using raw-loader.
asset automatically chooses between exporting a data URI and emitting a separate file. Previously achievable by using url-loader with asset size limit.
To add one in you can make your config look like so:
// webpack.config.js
module.exports = {
  ...
  module: {
    rules: [
      {
        test: /\.(jpe?g|gif|png|svg|woff|ttf|wav|mp3)$/,
        type: "asset/resource"
      }
    ]
  }
};
To control how the files get output, you can use templated paths.
In the config you can set the global template here:
// webpack.config.js
module.exports = {
  ...
  output: {
    ...
    assetModuleFilename: '[path][name].[hash][ext][query]'
  }
}
To override for a specific set of assets, you can do this:
// webpack.config.js
module.exports = {
  ...
  module: {
    rules: [
      {
        test: /\.(jpe?g|gif|png|svg|woff|ttf|wav|mp3)$/,
        type: "asset/resource",
        generator: {
          filename: '[path][name].[hash][ext][query]'
        }
      }
    ]
  }
};
The provided templating will result in filenames that look like build/images/img.151cfcfa1bd74779aadb.png. The hash can be useful for cache busting etc. You should modify to your needs.
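As a usage sketch under that rule (the asset path is an assumption), importing a matching file then gives you its emitted URL:
// some module in your app
import imgUrl from './images/img.png'; // hypothetical asset

console.log(imgUrl); // e.g. 'build/images/img.151cfcfa1bd74779aadb.png'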
The above suggestions are good, but to try to answer your question directly I'd suggest using cpy-cli in a script defined in your package.json.
This example expects node to be somewhere on your path. Install cpy-cli as a development dependency (the helper script below also uses shelljs and chalk):
npm install --save-dev cpy-cli
Then create a couple of nodejs files. One to do the copy and the other to display a checkmark and message.
copy.js
#!/usr/bin/env node
var shelljs = require('shelljs');
var addCheckMark = require('./helpers/checkmark');
var path = require('path');
var cpy = path.join(__dirname, '../node_modules/cpy-cli/cli.js');
shelljs.exec(cpy + ' /static/* /build/', addCheckMark.bind(null, callback));
function callback() {
  process.stdout.write(' Copied /static/* to the /build/ directory\n\n');
}
checkmark.js
var chalk = require('chalk');
/**
* Adds mark check symbol
*/
function addCheckMark(callback) {
  process.stdout.write(chalk.green(' ✓'));
  callback();
}
module.exports = addCheckMark;
Add the script in package.json. Assuming scripts are in <project-root>/scripts/
...
"scripts": {
"copy": "node scripts/copy.js",
...
To run the script:
npm run copy
The way I load static images and fonts:
module: {
  rules: [
    ....
    {
      test: /\.(jpe?g|png|gif|svg)$/i,
      /* Exclude fonts while working with images, e.g. .svg can be both image or font. */
      exclude: path.resolve(__dirname, '../src/assets/fonts'),
      use: [{
        loader: 'file-loader',
        options: {
          name: '[name].[ext]',
          outputPath: 'images/'
        }
      }]
    },
    {
      test: /\.(woff(2)?|ttf|eot|svg|otf)(\?v=\d+\.\d+\.\d+)?$/,
      /* Exclude images while working with fonts, e.g. .svg can be both image or font. */
      exclude: path.resolve(__dirname, '../src/assets/images'),
      use: [{
        loader: 'file-loader',
        options: {
          name: '[name].[ext]',
          outputPath: 'fonts/'
        }
      }]
    }
  ]
}
Don't forget to install file-loader to have that working.
You can write bash in your package.json:
# package.json
{
"name": ...,
"version": ...,
"scripts": {
"build": "NODE_ENV=production npm run webpack && cp -v <this> <that> && echo ok",
...
}
}
Most likely you should use CopyWebpackPlugin, which was mentioned in kevlened's answer. Alternatively, for some kinds of files like .html or .json, you can also use raw-loader or json-loader. Install it via npm install -D raw-loader, and then all you need to do is add another loader to your webpack.config.js file.
Like:
{
  test: /\.html/,
  loader: 'raw'
}
Note: Restart the webpack-dev-server for any config changes to take effect.
And now you can require html files using relative paths; this makes it much easier to move folders around.
template: require('./nav.html')
I was stuck here too. copy-webpack-plugin worked for me.
However, 'copy-webpack-plugin' was not necessary in my case (I learned later).
webpack ignores root paths, for example:
<img src="/images/logo.png">
Hence, to make this work without using 'copy-webpack-plugin', use '~' in paths:
<img src="~images/logo.png">
'~' tells webpack to consider 'images' as a module.
Note: you might have to add the parent directory of the images directory in
resolve: {
  modules: [
    'parent-directory of images',
    'node_modules'
  ]
}
Visit https://vuejs-templates.github.io/webpack/static.html
The webpack config file (in webpack 2) allows you to export a promise chain, so long as the last step returns a webpack config object. See the promise configuration docs. From there:
webpack now supports returning a Promise from the configuration file. This allows you to do async processing in your configuration file.
You could create a simple recursive copy function that copies your file, and only after that triggers webpack. E.g.:
module.exports = function() {
  return copyTheFiles(inpath, outpath).then(result => {
    return { entry: "..." } // Etc etc
  })
}
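The copyTheFiles helper is left undefined in the answer; here is a minimal sketch of such a recursive copy using Node's fs.promises (the function and path names are assumptions):
const fs = require('fs').promises;
const path = require('path');

// Recursively copy everything under src into dest before returning the webpack config
async function copyTheFiles(src, dest) {
  await fs.mkdir(dest, { recursive: true });
  const entries = await fs.readdir(src, { withFileTypes: true });
  for (const entry of entries) {
    const from = path.join(src, entry.name);
    const to = path.join(dest, entry.name);
    if (entry.isDirectory()) {
      await copyTheFiles(from, to);
    } else {
      await fs.copyFile(from, to);
    }
  }
}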
Let's say all your static assets are in a folder "static" at the root level and you want to copy them to the build folder, maintaining the subfolder structure. Then in your entry file just put:
//index.js or index.jsx
require.context("!!file?name=[path][name].[ext]&context=./static!../static/", true, /^\.\/.*\.*/);
In my case I used webpack for a wordpress plugin to compress js files, where some plugin files are already compressed and need to be skipped from the process.
// assumes const glob = require('glob') and const path = require('path') at the top of the config
optimization: {
  minimize: false,
},
externals: {
  "jquery": "jQuery",
},
entry: glob.sync('./js/plugin/**.js').reduce(function (obj, el) {
  obj[path.parse(el).name] = el;
  return obj
}, {}),
output: {
  path: path.resolve(__dirname, './js/dist/plugin'),
  filename: "[name].js",
  clean: true,
},
This copies the js files as they are to the build folder. Using other methods like file-loader or copy-webpack-plugin created issues with that.
Hope it will help someone.
