I'm writing a PWA. I was using the default service worker from the template I'm using (the Vue.js PWA template), but now I've decided to write my own from scratch. I have placed it (service-worker.js) in the static folder, because I want it to have a stable name: I don't want the name to change on each build.
In this particular service worker I want to use the package name and version, so that I can generate a nice cache ID.
So I want to achieve something like this:
./package.json:
{
  "name": "my.app",
  "version": "1.0.0",
  ...
}
./static/service-worker.js:
var CACHE_ID = 'PACKAGE_NAME-vPACKAGE_VERSION';
// ...
./build/service-worker.js:
var CACHE_ID = 'my.app-v1.0.0';
The ./build/service-worker.js shows what I want to achieve.
I have tried https://www.npmjs.com/package/string-replace-loader with the configuration below:
{
  test: /service-worker\.js$/,
  loader: 'string-replace-loader',
  options: {
    multiple: [
      {
        search: 'PACKAGE_NAME',
        replace: packageConfig.name
      },
      {
        search: 'PACKAGE_VERSION',
        replace: packageConfig.version
      }
    ]
  }
}
But as I understand it, files placed in static are not modules (am I right?), so they are not processed by module.rules.
I would be grateful for help and/or guidance on how to solve this problem.
OK, I finally got it. I used copy-webpack-plugin and its ability to transform the copied file:
// At the top of the webpack config file:
const CopyWebpackPlugin = require('copy-webpack-plugin');
const packageConfig = require('./package.json');

plugins: [
  new CopyWebpackPlugin([
    {
      from: 'static/service-worker.js',
      to: './service-worker.js',
      transform (content) {
        var parsed = content.toString();
        // Each placeholder occurs only once, so String.prototype.replace()
        // (which replaces just the first match of a string pattern) is enough.
        var transformation = [
          {
            search: 'PACKAGE_NAME',
            replace: packageConfig.name
          },
          {
            search: 'PACKAGE_VERSION',
            replace: packageConfig.version
          }
        ];
        for (var i = 0; i < transformation.length; i++) {
          parsed = parsed.replace(transformation[i].search, transformation[i].replace);
        }
        return Buffer.from(parsed, 'utf8');
      }
    }
  ])
]
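For context, a minimal sketch of how the replaced CACHE_ID might be used inside the service worker; the install handler and precache list here are illustrative, not from the original post:

var CACHE_ID = 'PACKAGE_NAME-vPACKAGE_VERSION'; // becomes 'my.app-v1.0.0' after the build

self.addEventListener('install', function (event) {
  event.waitUntil(
    caches.open(CACHE_ID).then(function (cache) {
      // Hypothetical precache list; the real app would list its own assets.
      return cache.addAll(['/index.html']);
    })
  );
});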
Modules
Modules are placed in node_modules. src is your source folder, where you should keep only those files that are not used directly in production mode.
Also remember that a module is nothing more than JavaScript code, like a library: a set of functions. If you move your *.js files from node_modules to src, they will still be modules.
I cannot really understand why you would want to use string-replace-loader, as it has nothing to do with your question.
The loader performs replacements the way String.prototype.replace() does (it uses it internally). This means that if you want to replace all occurrences, you should use a RegExp-like string in options.search together with the g flag in options.flags, etc.
And from String.prototype.replace() in MDN:
The replace() method returns a new string with some or all matches of a pattern replaced by a replacement. The pattern can be a string or a RegExp, and the replacement can be a string or a function to be called for each match.
Or did I misunderstand you?
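For reference, this is roughly how the replace-all case described above would look, assuming the loader's options.flags API (the values are illustrative):

{
  test: /\.js$/,
  loader: 'string-replace-loader',
  options: {
    // Treated like String.prototype.replace() with a global RegExp.
    search: 'PACKAGE_NAME',
    replace: 'my.app',
    flags: 'g'
  }
}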
Worker-loader
But if I understood you correctly, there's actually a loader for workers:
$ npm install worker-loader --save-dev
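A minimal sketch of typical worker-loader usage, in case that is what you are after (file names are illustrative):

// webpack.config.js: route *.worker.js files through worker-loader
{
  test: /\.worker\.js$/,
  use: { loader: 'worker-loader' }
}

// app.js: the import yields a Worker constructor
import MyWorker from './file.worker.js';

const worker = new MyWorker();
worker.postMessage({ hello: 'world' });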
Related
I was able to output an assets library, plus many other libraries, that work as remote federated modules and as deep-import libraries for the cases where I am not connecting to a remote or not using webpack on the consumer end.
The issue is that each of my assets now exports a module that holds either the raw data as a URI or the string that points to the right asset. E.g. bird.svg is output to dist with its hash, plus a module that resolves to the file bird-[hash].svg.
The above is great for JavaScript, but not so much for CSS. Now I can't rewrite the url() to point to the right remote path, which would be something like:
/* Since I don't know when the assets will change names, I can't refer to them
   directly. I would need to first read the string from the bird.js module,
   append the publicPath, and then rewrite the url. */
.someClass {
  background-image: url('/assets/bird.js')
}
/* The above won't work for obvious reasons. */
So, the question is: how can I solve this? Or is there a loader for this? I checked resolve-url-loader, but it does not seem to be what I need.
OK, I resolved this issue by passing additional data as a variable to sass-loader. That way I can evaluate the actual names of the files, expose them as a Sass map up front, and handle everything from Sass.
// I am using glob to return an object with all the assets.
// This can probably be automated better. That would be an easier way.
// But this way works for me in all 3 scenarios: node, browser and federated module.
// It also has caching ootb for the assets.
const glob = require('glob');
const path = require('path');

const assetsPaths = {
  ...glob.sync('../assets/dist/img/**.node.js').reduce(function (obj, el) {
    obj['img/' + path.parse(el).name] = '../../assets/dist/' + require(el).default;
    return obj;
  }, {}),
  ...glob.sync('../assets/dist/fonts/**.node.js').reduce(function (obj, el) {
    obj['fonts/' + path.parse(el).name] = '../../assets/dist/' + require(el).default;
    return obj;
  }, {})
};

// Serialize the object into a Sass map literal for the sass-loader config below.
const assetsMap = '(' +
  Object.entries(assetsPaths).map(([key, value]) => `'${key}':'${value}'`).join(',') +
')';
//...
{
  loader: 'sass-loader',
  options: {
    // Note: no quotes around the injected value, so Sass sees a real map, not a string.
    additionalData: '$assets: ' + assetsMap + ';',
    sourceMap: true,
    sassOptions: {
      outputStyle: "compressed",
    },
  }
},
//...
You also need to disable URL rewriting in css-loader:
{
  loader: 'css-loader',
  options: {
    url: false,
  }
},
Then you can use the assets map in your Sass files:
@font-face {
  font-family: 'Some Font';
  src: local('Some Font');
  src: url("#{map-get($assets, SomeFont)}");
}
You will probably need to have your project set up sort of like a monorepo, and you also need to build the assets library with two bundles: one for Node, so you can use the string paths to your actual assets when bundling your Sass (or whatever), and another for loading the assets normally from the browser.
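A rough, hypothetical sketch of that two-bundle setup (the entry, filenames and targets are illustrative; the .node.js suffix matches the glob used earlier):

// webpack.config.js of the assets library: a multi-compiler export,
// one build targeting Node for build-time path resolution, one for the browser.
const base = { entry: './src/index.js' };

module.exports = [
  {
    ...base,
    target: 'node',
    output: { filename: '[name].node.js', libraryTarget: 'commonjs2' }
  },
  {
    ...base,
    target: 'web',
    output: { filename: '[name].js' }
  }
];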
Update:
Instead of doing all this, I just used the manifest generated by webpack-manifest-plugin to build the $assets map used in Sass:
const fs = require('fs');

const assetsManifest = JSON.parse(fs.readFileSync('../assets/dist/manifest.json'));
const assetsMapFn = asset => `'${asset[0]}':'${asset[1]}'`;
const assetsMap = `(
  ${Object.entries(assetsManifest).map(assetsMapFn).join(',')}
)`;
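The manifest itself is emitted by the assets build; a minimal sketch, assuming webpack-manifest-plugin v4 with its named export:

// webpack config of the assets library
const { WebpackManifestPlugin } = require('webpack-manifest-plugin');

module.exports = {
  // ...
  plugins: [new WebpackManifestPlugin()]
};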
If anyone knows a better way to do this please reply or comment.
So I have resources that I hash with copy-webpack-plugin:
{
  from: "data/json/*.json",
  to: "data/json/[name].[hash:6].json",
},
Now at runtime I need access to the actual URLs of these JSON files.
What I would ideally like is to be able to fetch this URL at runtime, so that I can do something like:
let name = "tiles";
const tileDataUrl = requireUrl(`data/json/${name}.json`);
fetch(tileDataUrl) // tileDataUrl here would be data/json/tiles.abc34f.json
...
What I need is a method requireUrl (or whatever it might be called) which returns the actual URL of the static resources, with hashes, at runtime.
(For anyone wondering, the hashes are used for cache busting here.)
Please and thank you :)
Assuming you're on version 5, Webpack asset modules will provide what you want without the need for copy-webpack-plugin. Webpack can recognize a require statement with expressions and will automatically include all potentially matching files for you without additional configuration. In this case you may want to watch out for optimizations where Webpack detects that name is always set to "tiles". Here's the required addition to your config:
module.exports = {
  module: {
    rules: [
      {
        test: /data\/json\/.+\.json$/,
        type: 'asset/resource',
        generator: {
          // Look at https://webpack.js.org/configuration/output/#template-strings to see additional template strings.
          filename: '[path][name].[hash:6][ext]'
        }
      }
    ]
  }
}
Alternatively for Webpack 4 you can add file-loader as a dependency and use it with this equivalent config addition:
module.exports = {
  module: {
    rules: [
      {
        test: /data\/json\/.+\.json$/,
        // Webpack 4 parses .json itself, so opt out to let file-loader handle it.
        type: 'javascript/auto',
        loader: 'file-loader',
        options: {
          // file-loader's [ext] placeholder does not include the dot.
          name: '[path][name].[hash:6].[ext]'
        }
      }
    ]
  }
}
Either way, your code will now work simply as follows:
let name = "tiles";
const tileDataUrl = require(`data/json/${name}.json`); // tileDataUrl contains the interpolated, hashed filename.
fetch(tileDataUrl);
I am quite new to Webpack, so bear with me if that's a stupid question.
My goal is to transform my old, AMD-based codebase to an ES6-module-based solution. What I am struggling with is handling dynamic import()s. My app router works on a module basis, i.e. each route is mapped to a module path which is then required. Since I know which modules will be included, I just add those dynamically imported modules to my r.js configuration and am able to build everything into a single file, with all require calls still working.
Now I am trying to do the same with ES6 modules and Webpack. With my dev mode this is no problem, as I can just replace require() with import(). However, I cannot get this to work with bundling. Either Webpack splits my code (and still fails to load the dynamic module anyway), or, if I use the array format for the entry config, the dynamic module is included in the bundle but loading still fails: Error: Cannot find module '/src/app/DynClass.js'
This is what my Webpack config looks like:
const webpack = require('webpack');
const path = require('path');

module.exports = {
  mode: "development",
  entry: ['./main.js', './app/DynClass.js'],
  output: {
    filename: 'main.js',
    path: path.resolve(__dirname, "../client/")
  },
  resolve: {
    alias: {
      "/src": path.resolve(__dirname, '')
    }
  },
  module: {
    rules: [
      {
        test: /\.tpl$/i,
        use: 'raw-loader',
      },
    ]
  }
};
So basically I want to tell Webpack: "hey, there is another module (or more) that is to be loaded dynamically and I want it to be included in the bundle"
How can I do this?
So yeah, after much fiddling there seems to be a light at the end of the tunnel. Still, this is not a 100% solution, and it is surely not for the faint of heart, as it is quite ugly and fragile. But I still want to share my approach with you:
1) Manual parsing of my routes config
My router uses a config file looking like this:
import StaticClass from "/src/app/StaticClass.js";

export default {
  StaticClass: {
    match: /^\//,
    module: StaticClass
  },
  DynClass: {
    match: /^\//,
    module: "/src/app/DynClass.js"
  }
};
So as you can see, the export is an object whose keys act as route IDs, each mapping to an object that contains the match (regex based) and the module which should be executed by the router if the route matches. I can feed my router either a constructor function (or an object) for modules that are available immediately (i.e. contained in the main chunk), or, if the module value is a string, the router has to load that module dynamically, using the path specified in the string.
So since I know which modules could potentially be loaded (but not if and when), I can parse this file within my build process and transform the route config into something webpack can understand:
const path = require("path");
const fs = require("fs");

// Read the routes file and cut everything before the exported object.
let routesSource = fs.readFileSync(path.resolve(__dirname, "app/routes.js"), "utf8");
routesSource = routesSource.substr(routesSource.indexOf("export default"));
// Blank out non-string module values (direct imports) so the object can be eval'd.
routesSource = routesSource.replace(/module:\s*((?!".*").)*$/gm, "module: undefined,");
routesSource = routesSource.replace(/\r?\n|\r/g, "").replace("export default", "var routes = ");
eval(routesSource);

// Turn every string-valued route module into a named dynamic import.
let dummySource = Object.entries(routes).reduce((acc, [routeName, routeConfig]) => {
  if (typeof routeConfig.module === "string") {
    return acc + `import(/* webpackChunkName: "${routeName}" */"${routeConfig.module}");`;
  }
  return acc;
}, "") + "export default ''";
(Yeah, I know this is quite ugly and a bit brittle, so it could surely be done better.)
Essentially, I create a new virtual module in which every route entry that demands a dynamic import is translated, so:
DynClass: {
  match: /^\//,
  module: "/src/app/DynClass.js"
}
becomes:
import(/* webpackChunkName: "DynClass" */"/src/app/DynClass.js");
So the route id simply becomes the name of the chunk!
2) Including the virtual module in the build
For this I use the virtual-module-webpack-plugin:
plugins: [
  new VirtualModulePlugin({
    moduleName: "./app/dummy.js",
    contents: dummySource
  })
],
Where dummySource is just a string containing the source code of the virtual module I have just generated. Now this module is pulled in and the "virtual imports" can be processed by webpack. But wait: I still need to import the dummy module, yet I do not have one in my development mode (where I use everything natively, so no loaders).
So in my main code I do the following:
let isDev = false;
/** #remove */
isDev = true;
/** #endremove */
if (isDev) { import('./app/dummy.js'); }
Where "dummy.js" is just an empty stub module while I am in development mode. The parts between that special comments are removed while building (using the webpack-loader-clean-pragma loader), so while webpack "sees" the import for dummy.js, this code will not be executed in the build itself since then isDev evaluates to false. And since we already defined a virtual module with the same path, the virtual module is included while building just like I want, and of course all dependencies are resolved as well.
3) Handling the actual loading
For development, this is quite easy:
import routes from './app/routes.js';

Object.entries(routes).forEach(async ([routeId, route]) => {
  if (typeof route.module === "function") {
    new route.module;
  } else {
    const result = await import(route.module);
    new result.default;
  }
});
(Note that this is not the actual router code, just enough to help me with my PoC)
Well, but for the build I need something else, so I added some code specific to the build environment:
/** #remove */
const result = await import(route.module);
new result.default;
/** #endremove */
if (!isDev) {
  if (typeof route.module === "string") { await __webpack_require__.e(routeId); }
  const result = __webpack_require__(route.module.replace("/src", "."));
  new result.default;
}
Now the loading code for the dev environment is stripped out, and in its place there is loading code that uses webpack internals. I also check whether the module value is a function or a string; if it is the latter, I invoke the internal require.ensure-style function to load the correct chunk: await __webpack_require__.e(routeId). Remember that I named my chunks when generating the virtual module? That is why I can still find them now!
4) More needs to be done
Another thing I encountered is that when several dynamically loaded modules have the same dependencies, webpack tries to generate more chunks with names like module1~module2.bundle.js, breaking my build. To counter this, I needed to make sure that all those shared modules go into a specific named bundle I called "shared":
optimization: {
  splitChunks: {
    chunks: "all",
    name: "shared"
  }
}
And when in production mode, I simply load this chunk manually before any dynamic modules depending on it are requested:
if (!isDev) {
  await __webpack_require__.e("shared");
}
Again, this code only runs in production mode!
Finally, I have to prevent webpack from renaming my modules (and chunks) to something like "1", "2" etc., and instead keep the names I have just defined:
optimization: {
  namedChunks: true,
  namedModules: true
}
So yeah, there you have it! As I said, this isn't pretty, but it seems to work, at least with my simplified test setup. I really hope there aren't any blockers ahead of me when I do all the rest (like ESLint, SCSS etc.)!
I have an entry array in my webpack config:
entry: {
  'main': [
    'webpack-hot-middleware/client?path=some-query',
    'my-module/my-file',
  ]
}
Inside of my code (node_modules/my-module/my-file.js) I attempt to require that initial third party file.
var client = require('webpack-hot-middleware/client');
Because I don't require it with the same querystring, webpack treats it as a separate asset/module, and inlines webpack-hot-middleware/client twice in the output bundle. This means I'm working with a new instance of the code, while I want to access the original instance. I don't have access to the third party code so I need to do it in my own library.
Currently the only solution I have is to duplicate the query string:
entry: {
  'main': [
    'webpack-hot-middleware/client?path=some-query',
    'my-module/my-file?path=some-query',
  ]
}
And then require it using the __resourceQuery exposed to every Webpack file:
var client = require('webpack-hot-middleware/client' + __resourceQuery);
This requires me to duplicate the query string in my module, which is undesirable, especially because my module won't use the query-string params (and might want to use its own, which isn't allowed here).
You should be able to make this work with a webpack resolver alias: https://webpack.github.io/docs/configuration.html#resolve-alias
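A minimal, untested sketch of that idea; it assumes an exact-match alias can carry the query string (the $ suffix makes the alias an exact match):

// webpack.config.js
module.exports = {
  resolve: {
    alias: {
      // Every bare require of the client now resolves to the same
      // query-stringed resource as the entry, so only one instance is bundled.
      'webpack-hot-middleware/client$': 'webpack-hot-middleware/client?path=some-query'
    }
  }
};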
I'm configuring Grunt with grunt-contrib-concat to concatenate about 20 JavaScript files. They have to be in a specific order, and I'm wondering if there is a neat way to do this without messing up my Gruntfile.js.
What I did, and what worked well, was declaring a variable called 'libraries' holding an array of all the files in the right order.
// A plain array is enough here; no constructor wrapper is needed.
var libraries = [
  '/javascript/libs/jquery.min.js',
  '/javascript/libs/jquery.address.js',
  '/javascript/libs/jquery.console.js'
];
And then concat (simplified, just an example):
concat: {
  libs: {
    files: {
      'libs.js': [libraries],
    },
  },
  main: {
    files: {
      // 'main' is another array of file paths, defined like 'libraries'.
      'main.js': [main]
    }
  }
},
So when I use 'libraries' in my task configuration, everything works fine, but I would like to declare this list in a separate file.
Unfortunately I couldn't find anything, nor do I know if this is even possible. I hope someone can help me out! Thanks in advance :-)
I found a solution! Since Grunt is built on Node.js, it's possible to use module.exports. What I did was set up an external file called libraries.js, which lives in my Grunt directory:
var exports = module.exports = {};

exports.customLibrary = function () {
  return [
    // Path to a library
    // Path to another library
    // and so on...
  ];
};

exports.mainScripts = function () {
  return [
    // Path to a library
    // Path to another library
    // and so on...
  ];
};
Then I import this module by declaring a variable in Gruntfile.js
var libraries = require('../javascript/libraries.js');
To use the methods declared in libraries.js, I set two more variables which return arrays of all the necessary files in the desired order:
var customLibrary = libraries.customLibrary();
var mainScripts = libraries.mainScripts();
I use these variables to define the sources in the concat task, as shown below. Hope this is helpful!
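For completeness, this is roughly how those variables plug into the concat task, mirroring the simplified example from the question:

concat: {
  libs: {
    files: {
      'libs.js': customLibrary
    }
  },
  main: {
    files: {
      'main.js': mainScripts
    }
  }
}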