I'm using webpack to modernize a legacy multiple-page ASP.NET Web Forms application. I'd had pretty good success with it until I tried using the SplitChunksPlugin to de-dupe my bundles with its chunks: 'all' option. Unfortunately this produces a handful of extra JS bundles that all need to be included in script tags along with the original entry bundle. No surprise; the plugin's documentation states as much:
By default [the plugin] only affects on-demand chunks, because changing initial chunks would affect the script tags the HTML file should include to run the project.
But I would very much like to have those initial entry chunks split up, so I'm trying to find a way of getting all those extra chunks included in script tags. It seems the standard advice here is to use the HtmlWebpackPlugin to generate an HTML page with all the script tags included, but this doesn't work for me (at least in its default configuration) for at least two reasons:
This is a Web Forms project. One does not simply tamper with aspx files.
Even if I found a way to generate valid aspx files every time I ran webpack (I suppose that's doable), here is the main difficulty: it seems HtmlWebpackPlugin only generates script tags for all of the chunks, or for a manually selected subset of them (using the chunks: [] option).
To elaborate on that second point, and get to my question -- I could do some manual analysis of the split chunks to build a dependency graph and manually include each one in the aspx, but that's obviously not a maintainable approach. I was hoping HtmlWebpackPlugin could offer some way of at least indicating that this chunk is ultimately used by this entry, or this entry uses these chunks, etc., but I have not found any such relationships present in its output.
Is there any way without going through hack-hoops to automatically determine which of the split chunks are dependencies of a given entry chunk?
Install this plugin and add it, with the options below, to your webpack config:
npm install webpack-manifest-plugin --save-dev
// webpack.config.js
const ManifestPlugin = require('webpack-manifest-plugin')
// concatMerge here is whatever helper you already use to merge into your base config
// (e.g. webpack-merge); a plain module.exports = { ... } works just as well
concatMerge(configuration, {
// ...
plugins: [
new ManifestPlugin({
fileName: 'prod.manifest.json',
generate: (seed, files) => {
const entrypoints = new Set()
files.forEach(
(file) => ((file.chunk || {})._groups || []).forEach(
(group) => entrypoints.add(group)
)
)
const entries = [...entrypoints]
const entryArrayManifest = entries.reduce((acc, entry) => {
const name = (entry.options || {}).name
|| (entry.runtimeChunk || {}).name
const files = [].concat(
...(entry.chunks || []).map((chunk) => chunk.files)
).filter(Boolean)
return name ? {...acc, [name]: files} : acc
}, seed)
return entryArrayManifest
}
}),
],
})
It will generate a prod.manifest.json containing the chunks for every entry (or route):
{
"entryOne": [
"main.common.d7791ce7a1e7ba394.css",
"main.common.d7791ce7a1e7ba394.js",
"main.entryOne.eb614be915641d465.js"
],
"a-route": [
"main.common.d7791ce7a1e7ba394.css",
"main.a-routes.14b91be915641d465.js"
]
// ...
}
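To get from that manifest to script tags in your pages, one option is a small post-build step that turns each entry's file list into markup you can include from the matching .aspx or master page. A minimal sketch, assuming the bundles are served from /dist and that one include file per entry works for your setup (the file and folder names here are made up):
// generate-script-includes.js -- run after webpack; reads prod.manifest.json
const fs = require('fs')
const manifest = JSON.parse(fs.readFileSync('prod.manifest.json', 'utf8'))
for (const [entry, files] of Object.entries(manifest)) {
  const tags = files.map((file) =>
    file.endsWith('.css')
      ? `<link rel="stylesheet" href="/dist/${file}" />`
      : `<script src="/dist/${file}"></script>`
  )
  // e.g. entryOne.bundles.inc -- reference it from the page that uses that entry
  fs.writeFileSync(`${entry}.bundles.inc`, tags.join('\n'), 'utf8')
}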
I'm transitioning a legacy app to Webpack. I'm using Webpack 5.56 (latest at time of writing).
My app is localised and I have a folder with a handful of locale files,
locales
- locale.en.ts
- locale.de.ts
- etc
Each of these locale files is an ES module and they all export (different implementations of) the same functions — getText, printNumber, etc. I have a wrapper module which dynamically imports the correct locale for the current user:
// localization.ts
interface LocaleModule {
getText(text: string): string;
// etc
}
let module: LocaleModule;
import(`locales/locale.${currentUser.language}`).then(m => {
module = m;
});
export function getText(text: string): string {
return module.getText(text);
}
I know the current user's language when the page is being rendered. I want to include the correct locale.*.js script as an initial chunk, so that:
The browser doesn't have to wait for the main chunk to load before it can start downloading the locale file.
The functions in localization.ts can be synchronous.
This seemed like it'd be a good fit for webpackMode: "weak", since I'd like to get an error in the console if the locale file is missing for whatever reason (rather than silently degrade performance). The docs seem to explicitly call out my use case:
This is useful for universal rendering when required chunks are always manually served in initial requests (embedded within the page).
Here's my code:
let module: LocaleModule;
import(
/* webpackMode: "weak" */
/* webpackChunkName: "locales/[request]" */
`./locales/locale.${currentUser.language}`
).then(m => {
module = m;
});
However, it seems webpackMode: "weak" causes Webpack to emit no chunks for the referenced modules at all. There aren't any locale files in Webpack's output folder. I can’t very well include a chunk in the HTML if it was never emitted!
What's the reason for this behaviour? Is there a clean way to get Webpack to emit chunks for dynamically imported modules but not download them asynchronously? (I know that I could use webpackMode: "lazy" and just include the chunk upfront in a script tag, but I'd like to get an error if the locale file is missing.) Or do I have an XY problem, and there’s some better way to do this which I’m unaware of?
I had a similar issue and resolved it as follows.
My locale file looks like:
import dictionary from './locales/en.json'
const en = dictionary
window.__default_dictionary__ = en
module.exports = en
My locales structure (screenshot omitted) is a src/i18n folder with one such module per language (en, it, es, ...), each importing its JSON dictionary.
You must add a new cacheGroup to splitChunks.cacheGroups in your webpack config (see the placement sketch after the snippet):
locales: {
enforce: true,
reuseExistingChunk: true,
priority: 50,
chunks: 'all',
test(module) {
if (/[\\/]src\/i18n[\\/]/.test(module.resource)) return true
return false
},
name(module) {
const moduleFileName = module
.identifier()
.split('/')
.reduceRight((item) => item)
.replace('.json', '')
.replace('.js', '')
return `locales~${moduleFileName}`
},
},
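For placement, that cacheGroup goes under optimization.splitChunks.cacheGroups; a minimal sketch of the surrounding config (everything around the locales group is elided/assumed):
// webpack.config.js -- only the relevant part
module.exports = {
  // ...
  optimization: {
    splitChunks: {
      cacheGroups: {
        locales: {
          // the cacheGroup shown above (enforce, priority, test, name, ...)
        },
      },
    },
  },
}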
Now all of your locale files will be extracted into separate chunk files.
You can use any handler to load locales, for example:
loadLocaleHandler: async (locale: Locale) => {
let localeModule: { default: Dictionary } = await import(`i18n/${locale}`)
return localeModule.default
},
And for everything to work correctly you must:
Add the locale chunk to the resulting HTML
<script src="/assets/webpack/js/runtime.js" defer="defer"></script>
<script src="/assets/webpack/js/vendors.js" defer="defer"></script>
<!-- For example, the locale may come from a cookie or from the app context -->
<script src="/assets/webpack/js/locales~(en|it|es).chunk.js" defer="defer"></script>
<script src="/assets/webpack/js/entry.js" defer="defer"></script>
Add the webpack "magic" code to your entry point
const defaultLocale: Locale = cookies.getItem('locale') || process.env.DEFAULT_LOCALE
if (__webpack_modules__[`./src/i18n/${defaultLocale}.js`]) {
__webpack_require__(`./src/i18n/${defaultLocale}.js`)
}
In summary:
you don't need to wait for a runtime import of the locale on the first request
you can organize locales for a multi-locale, multi-domain app
all of your locales remain dynamic modules and can still be loaded at runtime
I can't post such a long comment, so it has to be an answer...
So it looks like there isn't a real link between the modules, and the bundler can't resolve them at compile time, so they aren't emitted. The only thing I changed in your code is how the modules are imported, and it worked out of the box:
const langCode = getLangCode();
let mod;
switch (langCode) {
case "en":
import(`./locales/locale.en.js`).then(m => {
mod = m;
console.log("loaded locale");
})
break;
case "de":
import(`./locales/locale.de.js`).then(m => {
mod = m;
console.log("loaded locale");
})
break;
default:
}
export function getText(text) {
return mod.getText(text);
}
function getLangCode() {
return "de";
}
I know the switch case is not ideal, but it makes each locale an explicit request the bundler can see, rather than relying on it to expand the pattern ./locales/locale.${langCode}.js and pull in every matching .js file in the directory.
The doc says the following:
'weak': Tries to load the module if the module function has already been loaded in some other way (e.g. another chunk imported it or a script containing the module was loaded). A Promise is still returned, but only successfully resolves if the chunks are already on the client. If the module is not available, the Promise is rejected. A network request will never be performed. This is useful for universal rendering when required chunks are always manually served in initial requests (embedded within the page), but not in cases where app navigation will trigger an import not initially served.
From what I understand this means the chunks are expected to be already on the page and generated through some other means.
I hope that helps you resolve your issue.
In order to use weak you have to have already manually served the chunks, as stated in the docs. This means that adding it as a comment in a dynamic import does not create any chunks (in contrast to lazy and lazy-once).
Is there a clean way to get Webpack to emit chunks for dynamically imported modules but not download them asynchronously?
For synchronous loading:
You can either:
Use webpackMode: "lazy" and include the chunk upfront in a script tag, as you stated (the returned Promise is rejected if the chunk is missing); see the sketch after this list.
Define the locale js files as additional entry points and load them manually yourself.
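A minimal sketch of the first option, adapting the import from your question (the emitted chunk name, and therefore the script src, depends on your output/chunk name settings, so treat those as assumptions):
// lazy mode emits one chunk per locale file matched by the dynamic expression
import(
  /* webpackMode: "lazy" */
  /* webpackChunkName: "locales/[request]" */
  `./locales/locale.${currentUser.language}`
).then(m => {
  module = m;
}).catch(err => {
  // rejected if the chunk could not be loaded, e.g. the locale file is missing
  console.error('Failed to load locale', err);
});
and in the page, served upfront (for e.g. a German user), something like:
<script defer src="locales/locale.de.js"></script>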
For the second option, creating an entry point for each locale in your example could look like:
const glob = require('glob')
module.exports = {
devtool: false,
entry: {
...glob.sync('./src/locales/*').reduce((acc, module) => {
const name = module.replace('./src/locales/', '').replace('.js', '')
acc[name] = module
return acc
}, {})
}
};
This would emit locale.de.js and locale.en.js bundles and then you should somehow manually load a <script defer src="locale.<locale>.js"></script>, but that depends on how you serve your app.
For asynchronous loading:
You can use webpackMode: "lazy" along with webpackPreload: true in order to decouple main and locale chunk requests.
As stated in the docs
A preloaded chunk starts loading in parallel to the parent chunk.
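Applied to the import from the question, a sketch of that could look like the following (whether the preload hint is actually emitted for a request made from the initial chunk depends on your webpack version and setup, so verify it in your build):
import(
  /* webpackMode: "lazy" */
  /* webpackPreload: true */
  /* webpackChunkName: "locales/[request]" */
  `./locales/locale.${currentUser.language}`
).then(m => {
  module = m;
});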
I have a hybrid AngularJS/Angular application that will take some time to fully migrate to Angular. While this process is underway, I'd like to move away from the previous build system and use the CLI and webpack to manage all of the old AngularJS scripts as well. This is possible, as I've done it before by adding all of my scripts to the scripts section in angular.json, like the following:
"scripts": [
"src/app/angularjs/app.js",
"src/app/angularjs/controllers/main.js",
"src/app/angularjs/services/someService.js",
"src/app/angularjs/controllers/someController.js"
],
This works well, and the CLI builds via ng serve and ng build continue to work for the hybrid bootstrapped app as needed. The problem I'm running into now is that manually listing each file for the current application I'm migrating is not ideal. I have hundreds of scripts that need to be added, and what I need is to be able to use a globbing pattern like the following:
"scripts": [
"src/app/angularjs/**/*.js"
],
The problem is that, from what I can tell, this syntax is not supported. Glob patterns are supported in the assets section of angular.json, as stated here, but not in the scripts section: https://angular.io/guide/workspace-config#assets-configuration
In the scripts section I can't find a similar solution. It does have an expanded object API, but nothing that, as far as I can tell, solves the problem of selecting all .js files from a particular directory, as documented here: https://angular.io/guide/workspace-config#styles-and-scripts-configuration
Is it possible by some means to use a glob pattern or similar approach to select all files of a directory for the scripts section in angular.json so I don't have to manually list out hundreds of individual .js files?
The Bad News
The scripts section does not support the same glob patterns that the assets section does.
The Good News(?)
Since you're transitioning away from AngularJS, you hopefully won't have any new files to import in the future, so you could just generate the list of all the files you need to import.
Make your way to the src/app/angularjs directory and run the following:
find . -iregex '.*\.\(js\)' -printf '"%p",\n'
That will give you your list, already quoted for your convenience. You may need to do a quick search/replace (changing "." to "src/app/angularjs"), and don't forget to remove the last comma, but once you've done that you should be all set.
The Extra News
You can further filter out unwanted files with -not, so (per your comment) you might do:
find . -iregex '^.*\.js$' -not -iregex '^.*_test\.js$' -printf '"%p",\n'
And that should give you all your .js files without your _test.js files.
KISS
Of course, this isn't a complex pattern, so as #atconway points out below, this will work just as well:
find . -iname "*.js" -not -iname "*_test.js" -printf '"%p",\n'
I'll keep the above, though, for use in situations where the full power of regex might come in handy.
I wanted to extend the answer from #JasonVerber; here is Node.js code, which is therefore (I believe) cross-platform.
First, install the find package and then save the contents of the snippet below in some file.js.
Afterwards, adjust the paths so that they resolve to where you want to collect files from and where to put the resulting file.
After that, run node file.js and it will save all the found file paths to resultPath (result.txt), ready for Ctrl+A, Ctrl+C, Ctrl+V.
const find = require('find');
const path = require('path');
const fs = require('fs');
// BEFORE USAGE INSTALL `find` package
// Path to the folder where to look for files
const sourcePath = path.resolve(path.join(__dirname, 'cordova-app', 'src'));
// Path that will be removed from absolute path to files
const pathToRemove = path.resolve(path.join(__dirname, 'cordova-app'));
// Path where to put result.txt
const resultPath = path.resolve(path.join(__dirname, './result.txt'));
// Collects the file paths
const res = [];
// Path with \ replaced by /
const pathToRemoveReplaced = pathToRemove.replace(/\\/g, '/');
// Get all files that match the regex
find.eachfile(/\.js$/, sourcePath, file => {
// First replace all \ with /, then strip the root path so that only the relative path is left
const fileReplaced = file.replace(/\\/g, '/').replace(`${pathToRemoveReplaced}/`, '');
// Surround with quotes
res.push(`"${fileReplaced}"`);
}).end(() => {
// Write file and concatenate results with newline and commas
fs.writeFileSync(resultPath, res.join(',\r\n'), 'utf8');
console.log('DONE!');
});
The result I got while testing (using /\.ts$/ as the regex):
"src/app/app.component.spec.ts",
"src/app/app.component.ts",
"src/app/app.module.ts",
"src/environments/environment.prod.ts",
"src/environments/environment.ts",
"src/main.ts",
"src/polyfills.ts",
"src/test.ts"
I am quite new to Webpack, so bear with me if that's a stupid question.
My goal is to transform my old, AMD-based codebase into an ES6-module-based solution. What I am struggling with is handling dynamic import()s. My app router works on a module basis, i.e. each route is mapped to a module path which is then required. Since I know which modules will be included, I just add those dynamically imported modules to my r.js configuration and am able to build everything into a single file, with all require calls still working.
Now I am trying to do the same with ES6 modules and Webpack. In dev mode this is no problem, as I can just replace require() with import(). However, I cannot get this to work with bundling. Either Webpack splits my code (and still fails to load the dynamic module anyway), or, if I use the array format for the entry config, the dynamic module is included in the bundle but loading still fails: Error: Cannot find module '/src/app/DynClass.js'
This is what my Webpack config looks like:
const webpack = require('webpack');
const path = require('path');
module.exports = {
mode: "development",
entry: ['./main.js', './app/DynClass.js'],
output: {
filename: 'main.js',
path: path.resolve(__dirname, "../client/")
},
resolve: {
alias: {
"/src": path.resolve(__dirname, '')
}
},
module: {
rules: [
{
test: /\.tpl$/i,
use: 'raw-loader',
},
]
}
};
So basically I want to tell Webpack: "hey, there is another module (or more) that is to be loaded dynamically and I want it to be included in the bundle"
How can I do this?
So yeah, after much fiddling there seems to be a light at the end of the tunnel. Still, this is not a 100% solution and it is surely not for the faint of heart, as it is quite ugly and fragile. But still I want to share my approach with you:
1) Manual parsing of my routes config
My router uses a config file looking like this:
import StaticClass from "/src/app/StaticClass.js";
export default {
StaticClass: {
match: /^\//,
module: StaticClass
},
DynClass: {
match: /^\//,
module: "/src/app/DynClass.js"
}
};
So as you can see, the export is an object whose keys act as the route id, with each value containing the match (regex based) and the module the router should execute if the route matches. I can feed my router either a constructor function (or an object) for modules which are available immediately (i.e. contained in the main chunk), or, if the module value is a string, the router has to load the module dynamically using the path specified in the string.
Since I know which modules could potentially be loaded (but not if and when), I can parse this file within my build process and transform the route config into something webpack can understand:
const path = require("path");
const fs = require("fs");
let routesSource = fs.readFileSync(path.resolve(__dirname, "app/routes.js"), "utf8");
routesSource = routesSource.substr(routesSource.indexOf("export default"));
routesSource = routesSource.replace(/module:\s*((?!".*").)*$/gm, "module: undefined,");
routesSource = routesSource.replace(/\r?\n|\r/g, "").replace("export default", "var routes = ");
eval(routesSource);
let dummySource = Object.entries(routes).reduce((acc, [routeName, routeConfig]) => {
if (typeof routeConfig.module === "string") {
return acc + `import(/* webpackChunkName: "${routeName}" */"${routeConfig.module}");`;
}
return acc;
}, "") + "export default ''";
(Yeah I know this is quite ugly and also a bit brittle so this surely could be done better)
Essentially I create a new, virtual module where every route entry which demands a dynamic import is translated, so:
DynClass: {
match: /^\//,
module: "/src/app/DynClass.js"
}
becomes:
import(/* webpackChunkName: "DynClass" */"/src/app/DynClass.js");
So the route id simply becomes the name of the chunk!
2) Including the virtual module in the build
For this I use the virtual-module-webpack-plugin:
plugins: [
new VirtualModulePlugin({
moduleName: "./app/dummy.js",
contents: dummySource
})
],
Where dummySource is just a string containing the source code of the virtual module I have just generated. Now this module is pulled in, and the "virtual imports" can be processed by webpack. But wait: I still need to import the dummy module, and I don't have one in development mode (where I use everything natively, so no loaders).
So in my main code I do the following:
let isDev = false;
/** #remove */
isDev = true;
/** #endremove */
if (isDev) { import('./app/dummy.js'); }
Where "dummy.js" is just an empty stub module while I am in development mode. The parts between those special comments are removed while building (using the webpack-loader-clean-pragma loader), so while webpack "sees" the import for dummy.js, this code will not be executed in the built bundle, since isDev then evaluates to false. And since we already defined a virtual module with the same path, the virtual module is included in the build just as I want, and of course all of its dependencies are resolved as well.
3) Handling the actual loading
For development, this is quite easy:
import routes from './app/routes.js';
Object.entries(routes).forEach(async ([routeId, route]) => {
if (typeof route.module === "function") {
new route.module;
} else {
const result = await import(route.module);
new result.default;
}
});
(Note that this is not the actual router code, just enough to help me with my PoC)
Well, but for the build I need something else, so I added some code specific to the build environment:
/** #remove */
const result = await import(route.module);
new result.default;
/** #endremove */
if (!isDev) {
if (typeof route.module === "string") { await __webpack_require__.e(routeId); }
const result = __webpack_require__(route.module.replace("/src", "."));
new result.default;
}
Now the loading code for the dev environment is simply stripped out, and in its place there is loading code that uses webpack internals. I also check whether the module value is a function or a string, and if it is the latter I invoke the internal require.ensure function to load the correct chunk: await __webpack_require__.e(routeId);. Remember that I named my chunks when generating the virtual module? That's why I can still find them now!
4) More needs to be done
Another thing I encountered is that when several dynamically loaded modules share the same dependencies, webpack tries to generate additional chunks with names like module1~module2.bundle.js, breaking my build. To counter this, I needed to make sure that all those shared modules go into a single named bundle I called "shared":
optimization: {
splitChunks: {
chunks: "all",
name: "shared"
}
}
And when in production mode, I simply load this chunk manually before any dynamic modules depending on it are requested:
if (!isDev) {
await __webpack_require__.e("shared");
}
Again, this code only runs in production mode!
Finally, I have to prevent webpack from renaming my modules (and chunks) to something like "1", "2", etc., and instead keep the names I have just defined:
optimization: {
namedChunks: true,
namedModules: true
}
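(Side note, in case you end up on webpack 5: namedChunks and namedModules were removed there; as far as I know the equivalent is chunkIds and moduleIds:)
optimization: {
  chunkIds: "named",
  moduleIds: "named"
}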
So yeah, there you have it! As I said, this isn't pretty, but it seems to work, at least with my simplified test setup. I really hope there aren't any blockers ahead of me when I do all the rest (ESLint, SCSS, etc.)!
I have to use Webpack for one of my projects to build front-end bundles for js, css and other static assets. It does the job well, but at this early stage of the project I only have some css and static images and no js files yet. Here is my full webpack.config.js:
const Webpack = require("webpack");
const Glob = require("glob");
const path = require("path");
const CopyWebpackPlugin = require("copy-webpack-plugin");
const configurator = {
entries: function(){
var entries = {
application: [
'./assets/dummy.js',
],
}
return entries
},
plugins() {
var plugins = [
new CopyWebpackPlugin([{from: "./assets",to: ""}], {copyUnmodified: true,ignore: ["css/**", "js/**", "**.js"] }),
];
return plugins
},
moduleOptions: function() {
return {
rules: [
]
}
},
buildConfig: function(){
const env = process.env.NODE_ENV || "development";
var config = {
mode: env,
entry: configurator.entries(),
output: {filename: "[name].[hash].js", path: `${__dirname}/public/assets`},
plugins: configurator.plugins(),
module: configurator.moduleOptions()
}
return config
}
}
module.exports = configurator.buildConfig()
Practically, what it does for me is copy assets to the public dir. I don't have any javascripts yet, but there will be some in the future. So I tried commenting out entry, or setting it to null or an empty string, with no luck. It seems Webpack really insists on processing js files. My current solution is creating an empty dummy.js file and feeding it to Webpack. Annoyingly, it generates a 3.3kb application.afff4a3748b8d5d33a3a.js file with some boilerplate js code, even though my source js file is totally empty.
I understand that this is an edge use case for Webpack, and that Webpack was primarily created for processing javascript, but I bet many people still use it for more than just bundling javascript. So, my question: is there a better, more elegant way to skip bundling js files in Webpack?
p.s.
I think I've found a related question without an answer here: How to make WebPack copy a library instead of bundling?
p.s.#2
The suggested duplicate question has an accepted answer with an invalid Webpack config (Invalid configuration object.), so I can't use it to solve my issue.
Moreover, the answer reads
webpack will create a dummy javascript file
and I'm specifically asking how to avoid creating unnecessary files.
My goal was to create a file that would
Require all of the JS files in a directory that didn't end in _test.js
Do a module.exports equal to an array of module names returned from those view files.
I thought I had it with this:
// Automagically crawls through this directory, finds every js file inside any
// subdirectory, removes test files, and requires the resulting list of files,
// registering the exported module names as dependencies to the myApp.demoApp.views module.
var context = require.context('.', true, /\/.*\/.*\.js$/);
var moduleNames = _.chain(context.keys())
.filter(function(key) {
console.log(key, key.indexOf('_test.js') == -1);
return key.indexOf('_test.js') == -1;
})
.map(function(key) {
console.log("KEY", key);
return context(key)
})
.value();
module.exports = angular.module('myApp.demoApp.views', moduleNames).name;
#2 is working as intended.
#1: Unfortunately I was naive. While the module names are filtered out, this still requires all of the _test files, so the test files end up in my built code.
I tried to fix this by updating the regex, but JS doesn't support regex negative look-behind and I'm not regex-savvy enough to do it without that.
OK, I was able to use the answer from Slava.K's comment to answer my question. Here's the final code below. I had to include (?!.*index) in the regex because this code was including itself (index.views.js).
var context = require.context('.', true, /^(?!.*index).*\/(?!.*test).*\.js$/);
var moduleNames = _.chain(context.keys())
.map(function(key) {
return context(key)
})
.value();
module.exports = angular.module('myApp.demoApp.views', moduleNames).name;