Can't resolve CSS url from webpack asset library module

I was able to output an assets library, plus several other libraries that work both as remote federated modules and as deep-import libraries for the cases where I am not connecting to a remote or not using webpack on the consumer end.
The issue now is that each of my assets exports a module that contains either the raw data as a data URI or the string that points to the right asset. E.g. bird.svg is output to dist with its hash, plus a module that resolves to the file bird-[hash].svg.
That is great from JavaScript but not so much for CSS: I can't rewrite the url() to point to the right remote path, which would be something like:
.someClass {
  background-image: url('/assets/bird.js');
}
/* The above won't work for obvious reasons: since I don't know when the assets
   will change names, I can't refer to them directly. I would need to first read
   the string from the bird.js module, append the publicPath, and then rewrite
   the url(). */
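By contrast, this is roughly what already works on the JavaScript side (the package path and hashed filename here are hypothetical):
// Importing the asset module yields the hashed filename (or a data URI).
import birdPath from 'my-assets/bird.js'; // hypothetical path; resolves to e.g. 'bird-3f9a1c.svg'

// Appending the public path gives a usable URL at runtime...
const birdUrl = __webpack_public_path__ + birdPath;

// ...but there is no equivalent step for a url() inside a stylesheet.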
So, the question is: how can I solve this? Or is there a loader for this? I checked resolve-url-loader, but it does not seem to be what I need.

OK, I resolved this issue by passing additionalData to sass-loader. That way I can evaluate the actual names of the files beforehand, put them in a Sass map, and handle the rest from Sass.
//I am using glob to return an object with all the assets.
//This can probably be automated better, but this way works for me in all
//3 scenarios: node, browser and federated module.
//It also has caching out of the box for the assets.
const glob = require('glob');
const path = require('path');

const assetsPaths = {
  ...glob.sync('../assets/dist/img/**.node.js').reduce(function (obj, el) {
    obj['img/' + path.parse(el).name] = '../../assets/dist/' + require(el).default;
    return obj;
  }, {}),
  ...glob.sync('../assets/dist/fonts/**.node.js').reduce(function (obj, el) {
    obj['fonts/' + path.parse(el).name] = '../../assets/dist/' + require(el).default;
    return obj;
  }, {})
};

//Serialize the object into a Sass map literal for sass-loader's additionalData.
const assetsMap = '(' + Object.entries(assetsPaths)
  .map(([key, value]) => "'" + key + "': '" + value + "'")
  .join(',') + ')';
//...
{
  loader: 'sass-loader',
  options: {
    additionalData: '$assets: ' + assetsMap + ';',
    sourceMap: true,
    sassOptions: {
      outputStyle: "compressed",
    },
  }
},
//...
You also need to disable url rewriting in css-loader:
{
  loader: 'css-loader',
  options: {
    url: false,
  }
},
Then you can use the assets map in your Sass files:
@font-face {
  font-family: 'Some Font';
  src: local('Some Font');
  src: url("#{map-get($assets, SomeFont)}");
}
You will probably need to have your project set up somewhat like a monorepo, and you also need to build the assets library with two bundles: one targeting node, so you can read the string paths to the actual assets when bundling your Sass (or whatever else), and another for loading the assets normally from the browser.
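Here is a minimal sketch of such a dual build. The entry, paths, and the .node.js suffix are assumptions chosen to match the glob patterns above; the real setup (federation, more entries, etc.) is omitted:
// webpack.config.js of the assets library: exports an array of two configs.
const path = require('path');

const common = {
  mode: 'production',
  entry: { bird: './src/img/bird.svg' }, // hypothetical asset entry
  module: {
    rules: [{ test: /\.(svg|png|woff2?)$/, type: 'asset/resource' }],
  },
};

module.exports = [
  // Browser bundle: consumed at runtime (e.g. as a federated remote).
  {
    ...common,
    target: 'web',
    output: { path: path.resolve(__dirname, 'dist'), filename: 'img/[name].js' },
  },
  // Node bundle: required at build time so require(el).default yields the hashed path.
  {
    ...common,
    target: 'node',
    output: {
      path: path.resolve(__dirname, 'dist'),
      filename: 'img/[name].node.js',
      library: { type: 'commonjs2' },
    },
  },
];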
Update:
Instead of doing all this, I just used the manifest generated by webpack-manifest-plugin to build the $assets map used in Sass.
const fs = require('fs');

const assetsManifest = JSON.parse(fs.readFileSync('../assets/dist/manifest.json'));
const assetsMapFn = asset => `'${asset[0]}':'${asset[1]}'`;
const assetsMap = `(${Object.entries(assetsManifest).map(assetsMapFn).join(',')})`;
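For completeness, the manifest itself comes out of the assets build; with webpack-manifest-plugin v3+ (which uses a named export; older versions export the plugin directly) that is roughly:
// In the assets library's webpack config:
const { WebpackManifestPlugin } = require('webpack-manifest-plugin');

module.exports = {
  // ...entries and asset rules as before...
  plugins: [
    // Emits dist/manifest.json mapping source names to hashed output paths.
    new WebpackManifestPlugin(),
  ],
};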
If anyone knows a better way to do this please reply or comment.

Related

How to access the actual name/path/url of a static resource hashed by copy-webpack-plugin

So I have resources that I hash with copy-webpack-plugin:
{
  from: "data/json/*.json",
  to: "data/json/[name].[hash:6].json",
},
Now at runtime I need to get access to the actual URL of these JSON files.
Ideally I would like to be able to fetch this URL at runtime so that I can do something like:
let name = "tiles";
const tileDataUrl = requireUrl(`data/json/${name}.json`);
fetch(tileDataUrl) // tileDataUrl here would be data/json/tiles.abc34f.json
...
What I need is a method requireUrl (or whatever it might be called) which returns the actual URL of a static resource, with its hash, at runtime.
( For anyone wondering, the hashes are used to do cache busting here)
Please and thank you :)
Assuming you're on version 5, webpack asset modules will provide what you want without the need for copy-webpack-plugin. Webpack can recognize a require statement containing expressions, and it will automatically include all possibly matching files for you without additional configuration. In this case you may want to watch out for optimizations where webpack can tell that name is always set to "tiles". Here's the required addition to your config:
module.exports = {
  module: {
    rules: [
      {
        test: /data\/json\/.+\.json$/,
        type: 'asset/resource',
        generator: {
          // Look at https://webpack.js.org/configuration/output/#template-strings to see additional template strings.
          filename: '[path][name].[hash:6][ext]'
        }
      }
    ]
  }
}
Alternatively, for Webpack 4 you can add file-loader as a dependency and use this equivalent config addition (note that unlike the asset-module placeholder, file-loader's [ext] does not include the leading dot):
module.exports = {
  module: {
    rules: [
      {
        test: /data\/json\/.+\.json$/,
        loader: 'file-loader',
        // Prevent webpack 4's built-in JSON handling from short-circuiting the loader.
        type: 'javascript/auto',
        options: {
          name: '[path][name].[hash:6].[ext]'
        }
      }
    ]
  }
}
Either way your code will now be able to work simply as follows:
let name = "tiles";
const tileDataUrl = require(`data/json/${name}.json`); // tileDataUrl will be the interpolated, hashed filename.
fetch(tileDataUrl);

How to include manual import() in Webpack Bundle

I am quite new to Webpack, so bear with me if that's a stupid question.
My goal is to transform my old, AMD-based codebase to an ES6-module-based solution. What I am struggling with is handling dynamic import()s. My app router works on a module basis, i.e. each route is mapped to a module path which is then required. Since I know which modules will be included, I just add those dynamically imported modules to my r.js configuration and am able to build everything into a single file, with all require calls still working.
Now I am trying to do the same with ES6 modules and Webpack. In dev mode this is no problem, as I can just replace require() with import(). However, I cannot get this to work with bundling: either Webpack splits my code (and still fails to load the dynamic module anyway), or, if I use the array format for the entry config, the dynamic module is included in the bundle but loading still fails: Error: Cannot find module '/src/app/DynClass.js'
This is what my Webpack config looks like:
const webpack = require('webpack');
const path = require('path');

module.exports = {
  mode: "development",
  entry: ['./main.js', './app/DynClass.js'],
  output: {
    filename: 'main.js',
    path: path.resolve(__dirname, "../client/")
  },
  resolve: {
    alias: {
      "/src": path.resolve(__dirname, '')
    }
  },
  module: {
    rules: [
      {
        test: /\.tpl$/i,
        use: 'raw-loader',
      },
    ]
  }
};
So basically I want to tell Webpack: "hey, there is another module (or more) that is to be loaded dynamically and I want it to be included in the bundle"
How can I do this?
So yeah, after much fiddling there seems to be a light at the end of the tunnel. Still, this is not a 100% solution and it is surely not for the faint of heart, as it is quite ugly and fragile. But I still want to share my approach with you:
1) Manual parsing of my routes config
My router uses a config file looking like this:
import StaticClass from "/src/app/StaticClass.js";

export default {
  StaticClass: {
    match: /^\//,
    module: StaticClass
  },
  DynClass: {
    match: /^\//,
    module: "/src/app/DynClass.js"
  }
};
So as you can see, the export is an object whose keys act as route IDs, each holding an object with the match (regex-based) and the module to be executed by the router when the route matches. I can feed my router either with a constructor function (or an object) for modules that are available immediately (i.e. contained in the main chunk), or, if the module value is a string, the router has to load that module dynamically using the path specified in the string.
Since I know which modules could potentially be loaded (but not if and when), I can now parse this file within my build process and transform the route config into something webpack can understand:
const path = require("path");
const fs = require("fs");

let routesSource = fs.readFileSync(path.resolve(__dirname, "app/routes.js"), "utf8");
// Keep only the export, blank out statically imported modules, then eval the
// remaining object literal into a local `routes` variable.
routesSource = routesSource.substr(routesSource.indexOf("export default"));
routesSource = routesSource.replace(/module:\s*((?!".*").)*$/gm, "module: undefined,");
routesSource = routesSource.replace(/\r?\n|\r/g, "").replace("export default", "var routes = ");
eval(routesSource);

// Emit one dynamic import per string-valued module, with the route id as chunk name.
let dummySource = Object.entries(routes).reduce((acc, [routeName, routeConfig]) => {
  if (typeof routeConfig.module === "string") {
    return acc + `import(/* webpackChunkName: "${routeName}" */"${routeConfig.module}");`;
  }
  return acc;
}, "") + "export default ''";
(Yeah, I know this is quite ugly and also a bit brittle, so it surely could be done better.)
Essentially I create a new, virtual module where every route entry that demands a dynamic import is translated, so:
DynClass: {
  match: /^\//,
  module: "/src/app/DynClass.js"
}
becomes:
import(/* webpackChunkName: "DynClass" */"/src/app/DynClass.js");
So the route id simply becomes the name of the chunk!
2) Including the virtual module in the build
For this I use the virtual-module-webpack-plugin:
plugins: [
  new VirtualModulePlugin({
    moduleName: "./app/dummy.js",
    contents: dummySource
  })
],
Where dummySource is just a string containing the source code of the virtual module I have just generated. Now this module is pulled in and the "virtual imports" can be processed by webpack. But wait: I still need to import the dummy module, yet I don't have one in my development mode (where I use everything natively, with no loaders).
So in my main code I do the following:
let isDev = false;
/** #remove */
isDev = true;
/** #endremove */
if (isDev) { import('./app/dummy.js'); }
Where "dummy.js" is just an empty stub module while I am in development mode. The parts between that special comments are removed while building (using the webpack-loader-clean-pragma loader), so while webpack "sees" the import for dummy.js, this code will not be executed in the build itself since then isDev evaluates to false. And since we already defined a virtual module with the same path, the virtual module is included while building just like I want, and of course all dependencies are resolved as well.
3) Handling the actual loading
For development, this is quite easy:
import routes from './app/routes.js';

Object.entries(routes).forEach(async ([routeId, route]) => {
  if (typeof route.module === "function") {
    new route.module;
  } else {
    const result = await import(route.module);
    new result.default;
  }
});
(Note that this is not the actual router code, just enough to help me with my PoC)
Well, for the build I need something else, so I added some code specific to the build environment:
/** #remove */
const result = await import(route.module);
new result.default;
/** #endremove */
if (!isDev) {
  if (typeof route.module === "string") { await __webpack_require__.e(routeId); }
  const result = __webpack_require__(route.module.replace("/src", "."));
  new result.default;
}
Now the loading code for the dev environment is stripped out, and in its place is loading code that uses webpack internals. I also check whether the module value is a function or a string, and if it is the latter I invoke webpack's internal chunk-loading function to load the correct chunk: await __webpack_require__.e(routeId);. Remember that I named my chunks when generating the virtual module? That's why I can still find them now!
4) More needs to be done
Another thing I encountered is that when several dynamically loaded modules share the same dependencies, webpack tries to generate more chunks with names like module1~module2.bundle.js, breaking my build. To counter this, I needed to make sure that all those shared modules go into a specific named bundle I called "shared":
optimization: {
  splitChunks: {
    chunks: "all",
    name: "shared"
  }
}
And when in production mode, I simply load this chunk manually before any dynamic modules depending on it are requested:
if (!isDev) {
  await __webpack_require__.e("shared");
}
Again, this code only runs in production mode!
Finally, I have to prevent webpack from renaming my modules (and chunks) to something like "1", "2" etc., and instead keep the names I have just defined:
optimization: {
  namedChunks: true,
  namedModules: true
}
So yeah, there you have it! As I said, this isn't pretty but it seems to work, at least with my simplified test setup. I really hope there aren't any blockers ahead of me when I do all the rest (like ESLint, SCSS etc.)!
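For what it's worth, on recent webpack versions there may be a much simpler route: a dynamic import() containing an expression makes webpack create a context module and bundle every file that could match it. The directory layout below is assumed, and the trade-off is that webpack includes potentially more modules than you will actually load:
// webpack statically analyzes the template string, includes every module
// under ./app that can match it, and emits one chunk per file;
// "[request]" names each chunk after the interpolated path.
async function loadRoute(moduleName) {
  const mod = await import(/* webpackChunkName: "[request]" */ `./app/${moduleName}.js`);
  return new mod.default();
}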

Webpack: Injecting variables into static service-worker.js

I'm writing a PWA. I was using the default service worker from the template I'm using (the Vue.js PWA template), but now I have decided to write my own from scratch. I have placed it (service-worker.js) in the static folder, because I want it to have a static name: I don't want the name to change on each build.
In this particular service worker I want to use the package name and version, so that I can nicely generate a cache ID.
So I want to achieve something like this:
./package.json:
{
  "name": "my.app",
  "version": "1.0.0",
  ...
}
./static/service-worker.js:
var CACHE_ID = 'PACKAGE_NAME-vPACKAGE_VERSION';
// ...
./build/service-worker.js:
var CACHE_ID = 'my.app-v1.0.0';
The ./build/service-worker.js shows what I want to achieve.
I have tried https://www.npmjs.com/package/string-replace-loader with the configuration below:
{
  test: /service-worker\.js$/,
  loader: 'string-replace-loader',
  options: {
    multiple: [
      {
        search: 'PACKAGE_NAME',
        replace: packageConfig.name
      },
      {
        search: 'PACKAGE_VERSION',
        replace: packageConfig.version
      }
    ]
  }
}
But as I understand it, files placed in static are not modules (am I right?), so they are not checked against module.rules.
I would be grateful for help and/or guidance on how to solve this problem.
OK, I finally got it. I used copy-webpack-plugin and its ability to transform the copied content:
plugins: [
  new CopyWebpackPlugin([
    {
      from: 'static/service-worker.js',
      to: './service-worker.js',
      transform (content) {
        var parsed = content.toString();
        var transformation = [
          {
            search: 'PACKAGE_NAME',
            replace: packageConfig.name
          },
          {
            search: 'PACKAGE_VERSION',
            replace: packageConfig.version
          }
        ];
        // Note: a plain-string search replaces only the first occurrence;
        // use a global RegExp if a placeholder can appear more than once.
        for (var i = 0; i < transformation.length; i++) {
          parsed = parsed.replace(transformation[i].search, transformation[i].replace);
        }
        return Buffer.from(parsed, 'utf8');
      }
    }
  ])
]
Modules
Modules are placed in node_modules. src is your source folder, where you should keep only those files you're not going to use in production mode.
Also remember that a module is nothing more than JavaScript code, like a library: a set of functions. If you move your *.js files from node_modules to src, they will still be modules.
I cannot really understand why you would like to use string-replace-loader, as it has nothing to do with your question.
The loader performs replacements the way String.prototype.replace() does (it uses it internally). This means that if you want to replace all occurrences, you should use a RegExp-like string in options.search with the g flag in options.flags, etc.
And from String.prototype.replace() on MDN:
The replace() method returns a new string with some or all matches of a pattern replaced by a replacement. The pattern can be a string or a RegExp, and the replacement can be a string or a function to be called for each match.
Or did I misunderstand you?
Worker-loader
But if I understood you correctly, there is actually a loader for workers:
$ npm install worker-loader --save-dev
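For reference, wiring worker-loader up looks roughly like this (the rule and the .worker.js naming convention follow the loader's documentation; note that worker-loader targets Web Workers, so it may not fit a service worker that must keep a stable file name):
// webpack.config.js
module.exports = {
  module: {
    rules: [
      {
        test: /\.worker\.js$/,
        use: { loader: 'worker-loader' }
      }
    ]
  }
};

// app.js: the import yields a constructor for the bundled worker.
import MyWorker from './file.worker.js';

const worker = new MyWorker();
worker.postMessage({ hello: 'world' });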

Grunt build script that reads xml file to determine which files are copied

copy: {
  build: {
    cwd: 'app',
    src: ['**', '!**/vendors/**', '!**src/js/*.js'],
    dest: 'dist',
    expand: true
  }
}
I am using Grunt build scripts to build a distribution folder for the completed product. However, it's not 100% automatic and dynamic. For example, I have a folder of XML content files, yet I don't use them all. Right now the whole folder is copied over to the build version. I manually have to go in and delete the XML files I don't want in the build version and then run it, or I could go into the Gruntfile and tell it to ignore those files.
The problem is that I don't want to do that every time. A theoretical idea I had would be to have an XML file where I define elements that represent certain other files:
<bootstrap>true</bootstrap>
<extraContent>false</extraContent>
This would say that the files correlated to bootstrap and extraContent should or shouldn't be ignored in the build. I am trying to figure out if you can do this in grunt.
Something like the following is how I see the logic playing out...
var bootstrap = $(xml).find("bootstrap").text();
if (bootstrap == "false") {
  var url = "src/bootstrap.css";
  //Here add the correlated filepath defined above to be ignored
}
The problem is not only writing this so grunt knows what it is, but also combining that logic with the actual copy: {} script I showed above.
If you want to include/exclude files based on their contents, you can use a filter function for this. Examples can be found in the official documentation: https://gruntjs.com/configuring-tasks#custom-filter-function.
The filter property can help you target files with a greater level of detail.
In your case this could be something like this:
copy: {
  build: {
    cwd: 'app',
    src: ['**', '!**/vendors/**', '!**src/js/*.js'],
    dest: 'dist',
    expand: true,
    // this filter function will copy xml files only when `bootstrap` is set to 'true'
    filter: filepath => {
      // path.extname returns the extension with its leading dot
      if (require('path').extname(filepath) !== '.xml')
        return true;
      const xml = require('fs').readFileSync(filepath, 'utf8');
      // xml2json's toJson returns a JSON string by default, so parse it
      const json = JSON.parse(require('xml2json').toJson(xml));
      return json.bootstrap === 'true';
    }
  }
}
You can then use the process function to copy only certain contents from specific files: https://github.com/gruntjs/grunt-contrib-copy#process
This option is passed to grunt.file.copy as an advanced way to control the file contents that are copied.
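For illustration, a minimal sketch of the process option; the <draft> element stripped here is made up, and the callback simply receives the file contents and returns what gets written:
copy: {
  build: {
    cwd: 'app',
    src: ['**/*.xml'],
    dest: 'dist',
    expand: true,
    options: {
      // process(content, srcpath) controls the copied file contents.
      process: (content, srcpath) => content.replace(/<draft>[\s\S]*?<\/draft>/g, '')
    }
  }
}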

Webpack Hashing *after* uglification

We're using the style-loader in webpack which, when compiled, seems to place information about the current directory in the source code that injects style tags when modules are loaded/unloaded. It looks roughly like this:
if(false) {
// When the styles change, update the <style> tags
if(!content.locals) {
module.hot.accept("!!./../../../../../node_modules/css-loader/index.js!./../../../../../node_modules/sass-loader/index.js?includePaths[]=/var/deploy/referrals/web_head/releases/20151118202441/node_modules/patternity/node_modules/node-neat/node_modules/node-bourbon/node_modules/bourbon/app/assets/stylesheets&includePaths[]=/var/deploy/referrals/web_head/releases/20151118202441/node_modules/patternity/node_modules/node-neat/node_modules/bourbon-neat/app/assets/stylesheets&includePaths[]=/var/deploy/referrals/web_head/releases/20151118202441/node_modules/patternity&includePaths[]=/var/deploy/referrals/web_head/releases/20151118202441/node_modules/infl-fonts!./campaigns.scss", function() {
var newContent = require("!!./../../../../../node_modules/css-loader/index.js!./../../../../../node_modules/sass-loader/index.js?includePaths[]=/var/deploy/referrals/web_head/releases/20151118202441/node_modules/patternity/node_modules/node-neat/node_modules/node-bourbon/node_modules/bourbon/app/assets/stylesheets&includePaths[]=/var/deploy/referrals/web_head/releases/20151118202441/node_modules/patternity/node_modules/node-neat/node_modules/bourbon-neat/app/assets/stylesheets&includePaths[]=/var/deploy/referrals/web_head/releases/20151118202441/node_modules/patternity&includePaths[]=/var/deploy/referrals/web_head/releases/20151118202441/node_modules/infl-fonts!./campaigns.scss");
if(typeof newContent === 'string') newContent = [[module.id, newContent, '']];
update(newContent);
});
}
// When the module is disposed, remove the <style> tags
module.hot.dispose(function() { update(); });
}
The thing to note is the directory listed in the accept string.
Now, this code ultimately gets removed once uglify is run (because of the if (false)), which is great. Where my problem lies, however, is that this compilation happens on two machines, and the chunk hashing appears to happen before uglification, because the hash generated on two different machines (or even on the same machine but in a different folder) is different. This obviously won't work if I'm deploying to, say, a production machine and need both of these machines to serve up an asset with the same digest.
Just to clarify: when not minified, the code is different and thus generates a different hash; minification does in fact make the files identical, but the chunk hashing appears to have happened before minification.
Does anyone know how I can get the chunkhash to be generated after the uglify plugin is run? I'm using a config like so:
...
output: {
filename: '[name]-[chunkhash].js'
...
with the command:
webpack -p
Edit
So after looking over this, I see now that this has to do with us adding includePaths to our style loader, which looks like this:
var includePaths = require('node-neat').includePaths;

module: {
  loaders: [
    { test: /\.scss$/, loader: "style!css!sass?includePaths[]=" + includePaths },
  ]
}
So I think we know why we're getting these absolute URLs, but I think the original question still stands: IMO webpack should be hashing chunks AFTER minification, not before.
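For reference, webpack 5 sidesteps this particular mismatch: [contenthash] is derived from the processed asset, and optimization.realContentHash (enabled by default in production mode) recomputes hashes from the final, minified file contents, so identical output yields identical digests. A minimal sketch:
// webpack.config.js (webpack 5)
module.exports = {
  mode: 'production',
  output: {
    filename: '[name]-[contenthash].js'
  },
  optimization: {
    // Default in production: recompute hashes from final asset content.
    realContentHash: true
  }
};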
