Using webpack to create & inject asset filesizes dictionary - javascript

I'm using Webpack to bundle source code and assets for a game. I also use CompressionPlugin() to make static gzip files available so that my web server can send the precompressed files when appropriate. Some of the game assets are large, so I have a loading experience up front that shows a progress bar while the assets are fetched.
Unfortunately, a problem arises in Chrome during loading: when a gzip response is received for an XMLHttpRequest, the onprogress event's total is always 0. There are some imperfect workarounds for this, such as this solution, but they're not entirely appropriate for my case.
Instead, I'd like to inject the compressed & decompressed file sizes of specific bundled assets into the HTML or JavaScript so that they're immediately accessible to the loading code. Injecting something like the following would be perfect:
<script>
const assetSizes = {
    "game.25317abd3eb6cf0fb0f1.wasm": {uncompressed: 8192, compressed: 1024},
    "game.25317abd3eb6cf0fb0f1.data": {uncompressed: 8192, compressed: 1024}
};
</script>
I'm somewhat new to webpack, so I'm not entirely sure how to approach this. I've considered using WebpackManifestPlugin and implementing a custom generate option function. This would allow me to control the output of the generated manifest.json, but it's still not clear to me whether this is the right thing to do, or how I'd go about subsequently injecting the file's contents ahead of my own loading JavaScript code.
Perhaps there is a better approach that would be more appropriate?
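For context, the loading code would consume such a dictionary roughly like this (a sketch only: fetchWithProgress and onProgress are placeholder names I've made up, and whether loaded should be compared against the compressed or the uncompressed size depends on which byte count the browser reports for gzip responses):
// Hypothetical loader-side helper that falls back to the injected assetSizes
// dictionary when the browser reports total === 0 for a gzip response.
function fetchWithProgress(url, onProgress) {
    return new Promise((resolve, reject) => {
        const xhr = new XMLHttpRequest();
        const name = url.split('/').pop();
        const sizes = typeof assetSizes !== 'undefined' ? assetSizes[name] : undefined;
        xhr.open('GET', url);
        xhr.responseType = 'arraybuffer';
        xhr.onprogress = (event) => {
            // Prefer the browser-reported total; otherwise fall back to the injected
            // size (uncompressed here, on the assumption that `loaded` counts decoded
            // bytes when the response was gzip-encoded).
            const total = event.total || (sizes ? sizes.uncompressed : 0);
            if (total > 0) onProgress(Math.min(event.loaded / total, 1));
        };
        xhr.onload = () => resolve(xhr.response);
        xhr.onerror = () => reject(new Error('Failed to fetch ' + url));
        xhr.send();
    });
}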
Update: I've been trying to push this further, and it feels like a custom webpack plugin might be the right direction to go. If I tap into the afterEmit compiler hook, it seems I have access to the file sizes I need and can construct an appropriate dictionary:
class InjectFileSizePlugin {
    apply(compiler) {
        compiler.hooks.afterEmit.tap(
            "InjectFileSizePlugin",
            (compilation) => {
                const fileSizes = {};
                for (const [assetPath, assetInfo] of compilation.assetsInfo) {
                    if (assetInfo.related) {
                        fileSizes[assetPath] = {
                            uncompressed: assetInfo.size,
                            compressed: -1
                        };
                        if (assetInfo.related.gzipped) {
                            const gzippedAssetInfo = compilation.assetsInfo.get(
                                assetInfo.related.gzipped
                            );
                            if (gzippedAssetInfo) {
                                fileSizes[assetPath].compressed =
                                    gzippedAssetInfo.size;
                            }
                        }
                    }
                }
                console.log(fileSizes); // <-- output is as I'd like, how to inject it now?
            }
        );
    }
}
What's not clear, though, is how I can now go about injecting this fileSizes data into the bundle, since afterEmit is called very late in the compilation, after the bundle JavaScript has already been emitted. There is an additionalPass compiler hook, but I currently can't figure out how it works.
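One direction that might work (a rough sketch only, assuming webpack 5's processAssets hook and that CompressionPlugin has already attached its related.gzipped info by the chosen stage) is to sidestep the bundle entirely and emit the dictionary as its own small script, loaded with a plain <script src="asset-sizes.js"> tag ahead of the loader code:
const { Compilation, sources } = require("webpack");

class EmitAssetSizesPlugin {
    apply(compiler) {
        compiler.hooks.thisCompilation.tap("EmitAssetSizesPlugin", (compilation) => {
            compilation.hooks.processAssets.tap(
                {
                    name: "EmitAssetSizesPlugin",
                    // Run as late as possible so the gzipped variants already exist;
                    // whether this stage is late enough for CompressionPlugin is an assumption.
                    stage: Compilation.PROCESS_ASSETS_STAGE_REPORT
                },
                () => {
                    const fileSizes = {};
                    for (const [assetPath, assetInfo] of compilation.assetsInfo) {
                        if (assetInfo.related && assetInfo.related.gzipped) {
                            const gzippedInfo = compilation.assetsInfo.get(assetInfo.related.gzipped);
                            fileSizes[assetPath] = {
                                uncompressed: assetInfo.size,
                                compressed: gzippedInfo ? gzippedInfo.size : -1
                            };
                        }
                    }
                    // Emit the dictionary as an extra asset instead of patching the bundle.
                    compilation.emitAsset(
                        "asset-sizes.js",
                        new sources.RawSource("window.assetSizes = " + JSON.stringify(fileSizes) + ";")
                    );
                }
            );
        });
    }
}

module.exports = EmitAssetSizesPlugin;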

Related

Webdriver.io - Unable to load spec files quite likely because they rely on `browser` object

I'm using Webdriver.io to run tests on a large number of pages. Because all the specs for the pages are in a JSON file, I have a special class that sets up the test. It looks like this:
module.exports = class PageTester {
    suiteName = '';
    browser = {};

    constructor (suiteName, browser) {
        this.suiteName = suiteName;
        this.browser = browser;
    }

    testModel(currentModel) {
        describe(this.suiteName + ' endpoint ' + currentModel.url, () => {
            this.browser.url(currentModel.url);
            /* it() statements for the test */
        });
    }
}
Then in my specs folder I have a file that loads the JSON and plugs it into the PageTester class, like this:
const PageTester = require('../modules/PageTester');
const models = require('/path/to/some/file.json');

const pageTester = new PageTester('Some Name', browser);
for (const modelName in models) {
    pageTester.testModel(models[modelName]);
}
When I run this code, WebdriverIO gives me the following warning:
WARN #wdio/mocha-framework: Unable to load spec files quite likely because they rely on `browser` object that is not fully initialised.
`browser` object has only `capabilities` and some flags like `isMobile`.
Helper files that use other `browser` commands have to be moved to `before` hook.
Spec file(s): /suite/test/specs/test.js
All the tests seem to run fine, so I don't actually understand what this warning is complaining about or what negative consequences ignoring it may have. I would like to a) understand why this is happening and b) know how to get rid of this warning, given the way my code is set up.
In my case, I resolved it by fixing the path of the required files. I noticed that my path was wrong, but the error that wdio throws is not really helpful. :/
You can only interact with the browser object inside it blocks, because it is not fully accessible before the browser session is started.
See https://webdriver.io/blog/2019/11/01/spec-filtering.html for details.
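In practice that means deferring any browser command until the suite actually runs. A sketch of the PageTester above with the call moved into a before hook (an it block would work the same way), per the hint in the warning message:
// Sketch: same PageTester, but browser commands are deferred into a before()
// hook so nothing touches the browser object while the spec file is loaded.
module.exports = class PageTester {
    constructor (suiteName, browser) {
        this.suiteName = suiteName;
        this.browser = browser;
    }

    testModel(currentModel) {
        describe(this.suiteName + ' endpoint ' + currentModel.url, () => {
            before(() => {
                this.browser.url(currentModel.url);
            });
            /* it() statements for the test */
        });
    }
}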
You should simply ensure your spec file and the respective page file are kept in a similar folder structure.

Require.ensure() non-blocking

If we have different bundles created by webpack and we require.ensure something to dynamically transfer and evaluate it at a later point in time, that happens via JSONP and some webpack JS magic. If we have
require.ensure([ ], ( require ) => {
    console.log('before...');
    var data = require( './myModule.js' );
    console.log('after...');
}, 'myModule')
"after..." will get encountered when that module was entirely transferred and evaluated. If it happens to be that this chunk / module is pretty big, contains images, css and whatnot, the loading will pretty much lock down a browser while the webpack javascript code unpacks the bundle with all its components.
Question: Is there any way to "hook" into that require magic? For instance, it would be a dream scenario to have callbacks for:
whole file / chunk was transferred
image[1] was evaluated
css[1] was evaluated / style tag was injected
javascript was evaluated
and so forth, assuming that our transferred bundle contains a lot of data. In general, it just bothers me that we have such a nice option to asynchronously transfer whole bundles dynamically, but still have to load that very bundle in a fully synchronous / blocking fashion.
Let me preface by saying I know this might be an 'annoying' answer, because it doesn't answer your question directly but offers an alternative, pragmatic solution to the browser-hanging problem. I used this pattern myself to manage asset loading within the context of a heavy 3D web game.
I'm writing this as an answer and not as a comment so it might serve others who come across the same problem. If this does answer your case, I'll be happy to provide actual code to implement and generify these sorts of modules.
If I understand correctly, essentially what you want is a way to break MyModule down into discrete components which can be atomically loaded and evaluated within the context of one require.ensure, but to handle evaluation so that not everything is evaluated in one go, which is what makes the browser hang.
A different way to look at this is to use the require and ensure methods themselves as the loading/evaluation mechanisms. Consider MyModule.js, which is a huge-loading module with the dependencies Css1, Css2, ... CssN as well as JS1, JS2, ... JSN and images.
My suggestion is to break it down into SuperMyModule.js which requires MyModuleLogic.js as well as all the CSS, images and JS.
Now, in SuperMyModule.js you could do:
let myModuleLogic = require("myModuleLogic");
console.log('JS was evaluated');

require.ensure(['./image1.png'], ( require ) => {
    let data = require( './image1.png' );
    console.log('image[1] was evaluated');
    // register that resource was evaluated/fire event
})

require.ensure(['./style1.css'], ( require ) => {
    let data = require( './style1.css' );
    console.log('css[1] was evaluated');
    // register that resource was evaluated/fire event
})

// after all resources evaluated/fire callback or event
Then in your original file, like you requested:
require.ensure([ ], ( require ) => {
    console.log('before...');
    let myModule = require( './superMyModule.js' );
    console.log('after...');
})
And if you set up your module instance as an event emitter, you can possibly hook into the loading of resources like so:
require.ensure([ ], ( require ) => {
    let myModule = require( './superMyModule.js' );
    myModule.on("loadResource", myCallback)
})
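For completeness, the emitter side of that idea might look roughly like this (a sketch: the loadResource event name and the use of Node's events module are assumptions on my part, not something from the original question):
// superMyModule.js (sketch): expose an emitter and report each evaluated resource.
const EventEmitter = require('events');
const emitter = new EventEmitter();

require('myModuleLogic');

require.ensure(['./image1.png'], ( require ) => {
    require('./image1.png');
    emitter.emit('loadResource', 'image1');
});

require.ensure(['./style1.css'], ( require ) => {
    require('./style1.css');
    emitter.emit('loadResource', 'css1');
});

module.exports = emitter;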
I guess I was confused about the topic myself, so my question was probably not precise enough to get properly answered. However, my misunderstanding in the whole "CommonJS dynamic module loading" context was that require.ensure() will just transfer the module code (respectively the chunk which webpack created) over the wire. After that, the transferred chunk, which is basically just one big ECMAScript file, just sits there in the browser, cached but not yet evaluated. Evaluation of the entire chunk happens only on the actual require() call.
That said, it is entirely in your hands how you decouple and evaluate the individual parts of a module / chunk. If, for example, like in my original question, a module require()s some CSS files, some images and some HTML, all of that gets asynchronously transferred on the require.ensure() call. The manner in which you require() (and therefore evaluate) those parts is entirely up to you, and you can decouple those calls yourself if necessary.
For instance, a Module looks like this:
Module1.js
"use strict";
import { io } from 'socket.io-client';
document.getElementById( 'foo' ).addEventListener('click', ( event ) => {
let partCSS = require( 'style/usable!./someCoolCSS.css' ),
moarCSS = require( 'style/usable!./moarCoolCSS.css' ),
tmpl = require( './myTemplate.html' ),
image1 = require( './foo.jpg' ),
image2 = require( './bar.png' );
}, false);
Of course, all these files are already contained by the Chunk which gets transferred to the client, when some other Module calls:
require.ensure([ 'Module1.js' ], ( require ) => {
}, 'Module1');
This is where my confusion was. So now we can just play with the require() calls within Module1.js ourselves. If we really require a lot of files that way, we could even use a setTimeout / setImmediate timer to decouple the synchronous evaluation between each require() call, if necessary or wanted (a sketch of that follows after the TL;DR below).
Actually a long answer for a pretty simple story.
TL;DR:
"require.ensure transfers a whole chunk over the wire. This chunk contains all files which are part of a require() call within the ensured Module. But those files do not get automatically evaluated. That happens only when the actual require() call is matched at runtime (which is represented by a webpackJSONP call at this point)"
You can offload the main thread and avoid blocking using the worker loader.
Downside: there's extra message passing to do between main window and the worker.
https://github.com/webpack/worker-loader
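Basic worker-loader usage looks roughly like this (a sketch: the file names and the message shape are made up):
// heavy.worker.js (sketch): do the expensive work off the main thread.
self.onmessage = (event) => {
    // ...unpack / process the large payload here...
    self.postMessage({ progress: 1 });
};

// main.js (sketch): worker-loader turns the import into a Worker constructor.
import Worker from 'worker-loader!./heavy.worker.js';

const worker = new Worker();
worker.onmessage = (event) => {
    console.log('progress from worker:', event.data.progress);
};
worker.postMessage({ cmd: 'start' });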
You can also try emitting load events in the large module to track more granular progress.
Further reference:
MDN docs https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API
Browser support http://caniuse.com/#feat=webworkers
If you would like to kick off loading an asynchronous JavaScript bundle through require.ensure alongside other Promises, here is how you can achieve that:
const requireEnsurePromise = new Promise((resolve) => {
    require.ensure(['./modulePath'], function (requireEnsure) {
        console.log('The module is fetched but not evaluated yet');
        resolve(requireEnsure.bind(null, require.resolve('./modulePath')));
    });
});
Promise.all([
    fetch('/api/relevant/stuff').then(response => response.json()),
    requireEnsurePromise,
]).then((values) => {
    if (values[0]) {
        // DO STUFF
    }
    console.log('right before module is evaluated');
    const evaluatedModule = values[1]();
});
Webpack statically determines what each module path corresponds to in its internal representation (it could be an integer or a string). Whenever Webpack recognizes the require.ensure([], fn) pattern, it looks at the function body of the fn callback to do so. In order to delay evaluation until after the JavaScript bundle has been fetched, in a Promise fashion, require('./modulePath') cannot be present inside the require.ensure success callback, as it would evaluate the module right there. Webpack translates require('./modulePath') to something like __webpack_require__(2343423), which is why you want to avoid using it directly in this scenario.

Is it possible to let webpacks System.import use ajax (for progress events)?

So I've just updated to webpack 2 and have my first working setup where webpack automatically creates chunks by looking at System.import calls. Pretty sweet!
However, I load the initial chunk with an ajax call so that I can show the progress while loading.
So my question is, can I overwrite or change the function of System.import somehow so that it will use an ajax request that I can listen to for events, instead of loading the chunk with a <script> tag?
No, unfortunately not. webpack 2 translates System.import() to ordinary require.ensure() calls, which just use the <script> tag. Even the official WHATWG Loader Spec does not provide an API for this kind of event. I've created an issue for this question.
Regarding webpack: There is a way to implement your own require.ensure(). However, since chunk loading is an integral part of webpack, this requires to dive a little deeper. I'm not sure how important this is for you, but you might be interested how things work inside webpack, so let's take a look:
In webpack, all internal features are implemented as plugins. This way, webpack is able to support a lot of different features and environments. So, if you're interested how things are implemented in webpack, it's always a good idea to a) take a look at WebpackOptionsApply or b) search for a specific string/code snippet.
Chunk loading depends heavily on the given target, because you need different implementations for each environment. Webpack allows you to define custom targets. When you pass in a function instead of a string, webpack invokes the function with a compiler instance. There you can apply all the required plugins. Since our custom target is almost like the web target, we just copy all the stuff from the web target:
// webpack.config.js
const path = require("path");
const JsonpTemplatePlugin = require("webpack/lib/JsonpTemplatePlugin");
const NodeSourcePlugin = require("webpack/lib/node/NodeSourcePlugin");
const FunctionModulePlugin = require("webpack/lib/FunctionModulePlugin");
const LoaderTargetPlugin = require("webpack/lib/LoaderTargetPlugin");
const JsonpChunkTemplatePlugin = require("webpack/lib/JsonpChunkTemplatePlugin");
const JsonpHotUpdateChunkTemplatePlugin = require("webpack/lib/JsonpHotUpdateChunkTemplatePlugin");

function customTarget(compiler) {
    compiler.apply(
        new JsonpTemplatePlugin(compiler.options.output),
        new FunctionModulePlugin(compiler.options.output),
        new NodeSourcePlugin(compiler.options.node),
        new LoaderTargetPlugin("web")
    );
}

module.exports = {
    entry: require.resolve("./app/main.js"),
    output: {
        path: path.resolve(__dirname, "dist"),
        filename: "bundle.js"
    },
    target: customTarget
};
If you take a look at each plugin, you will recognize that the JsonpTemplatePlugin is responsible for loading chunks. So let's replace that with our own implementation. We'll call it the XHRTemplatePlugin:
function customTarget(compiler) {
    compiler.apply(
        new XHRTemplatePlugin(compiler.options.output),
        new FunctionModulePlugin(compiler.options.output),
        new NodeSourcePlugin(compiler.options.node),
        new LoaderTargetPlugin("my-custom-target")
    );
}
Our XHRTemplatePlugin is responsible for providing the code in the main chunk, in each child chunk and for hot updates:
function XHRTemplatePlugin() {}

XHRTemplatePlugin.prototype.apply = function (compiler) {
    compiler.plugin("this-compilation", function(compilation) {
        compilation.mainTemplate.apply(new XHRMainTemplatePlugin());
        compilation.chunkTemplate.apply(new XHRChunkTemplatePlugin());
        compilation.hotUpdateChunkTemplate.apply(new XHRHotUpdateChunkTemplatePlugin());
    });
};
Maybe you can also re-use the JsonpChunkTemplatePlugin and JsonpHotUpdateChunkTemplatePlugin plugins, but this depends on your use case/implementation.
Your XHRMainTemplatePlugin now may look like this:
function XHRMainTemplatePlugin() {}

XHRMainTemplatePlugin.prototype.apply = function (mainTemplate) {
    mainTemplate.plugin("require-ensure", function(_, chunk, hash) {
        return this.asString([
            // Add your custom implementation here
            "fetch()"
        ]);
    });
};
I won't go any further here because I think this answer is already long enough, but I recommend creating a really small example project and checking the output created by webpack. The internal webpack plugins may look a little scary at first sight, but most of them are really short and do just one thing. You can also get some inspiration from them.

Any way to inject values into Less files in Meteor?

I am working on a project where we want the user to be able to define custom colors. We are running the latest version of Meteor with, among others, the less package.
Right now all colors are variables located in a single theme.lessimport file which is included early in processing. All colors throughout the site (and many subsequent less files) are generated from these few variables.
The idea was to just generate a new userTheme.lessimport file for each user that, if present, could be imported just after the theme.lessimport file to override the variables with custom values. It all works beautifully and flawlessly if you physically add the file to the directory, but I can't seem to even think of a way to do it dynamically/programmatically.
I'm starting to wonder if this can even be done with less.
One of the big hang-ups is that so much of the CSS is derived from these variables, including CSS included with our own app's plugins/modules.
It appears that you can't import a remote file for inclusion in Less pre-processing, so the file can't be generated on a remote server (which would be ideal for our situation, as user data will live on an API server).
There doesn't seem to be any programmatic way to generate or otherwise inject any values into Less, at least on Meteor, as I can't find any way to interact with the Less compilation through JS.
Aside from this inconvenience, less has been perfect for what we're doing, so I really want to make this work. Hoping someone out there has some wisdom or direction they can impart.
Take a look at how the bootstrap3-less package implements variables and mixins. Specifically the Advanced Usage section of their README.
"If you want to #import a file, give it the extension .import.less to prevent Meteor from processing it independently." So in your instance you'll name your theme file: theme.import.less
Of course you can do it. Just use the "fs" node module.
Here's a rather stupid example. There are lots of gotchas when you do it, but for a basic proof-of-concept, check this.
if (Meteor.isClient) {
    Template.hello.greeting = function () {
        return "Welcome to less_injector_meteor_test.";
    };

    Template.hello.events({
        'click #button': function () {
            var css = "body {background: " + $("#color").val() + ";}";
            Meteor.call("writeToUserThemeFile", css);
        }
    });
}

if (Meteor.isServer) {
    Meteor.methods({
        "writeToUserThemeFile": function(css) {
            var fs = Npm.require("fs");
            var path = "/Users/charnjitsingh/Desktop/less_injector_meteor_test";
            fs.writeFile(path + "/user_theme.less", css, function(err) {
                console.log("WRITING FILE");
                if (err) {
                    console.log("ERROR WHEN WRITING", err);
                }
            });
        }
    });
}
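One of those gotchas is the hardcoded absolute path. A possible tweak, as a sketch only (the assumption that process.env.PWD points at the project root holds in local development but not in a bundled production deploy):
// Sketch: derive the target path instead of hardcoding it.
// Assumption: in development, process.env.PWD points at the project root;
// in a bundled/production deploy this will NOT be true, so treat this as
// a local proof-of-concept tweak rather than a deployable solution.
if (Meteor.isServer) {
    Meteor.methods({
        "writeToUserThemeFile": function (css) {
            var fs = Npm.require("fs");
            var path = Npm.require("path");
            var projectDir = process.env.PWD || process.cwd();
            fs.writeFile(path.join(projectDir, "user_theme.less"), css, function (err) {
                if (err) {
                    console.log("ERROR WHEN WRITING", err);
                }
            });
        }
    });
}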

MVC4 Beta Minification and Bundling: Ordering files and debugging in browser

I've started using bundling and minification included with the MVC4 Beta. I'm running into a few issues with it:
For one thing, if I use the classic <script src="Folder/js" type="text/javascript"/> bundling, it seems like I have to rename my files to make sure they're bundled in the proper order.
Let's say I have three javascript files: "ants.js", "bugs.js", "insects.js"
ants.js depends on bugs.js
bugs.js depends on insects.js
Default bundling seems to bundle them in alphabetical order.
To get them to bundle properly, I have to rename them to: "0.insects.js", "1.bugs.js", "2.ants.js"
That's really hackish and there has to be a cleaner way.
The next problem I'm having is debugging. I like to step through the JavaScript in my testing browsers; is there a way to turn off just the minification while in DEBUG mode?
EDIT: To be clear, I know I can create bundles and register them from C#, it just seems really ugly to have to do it that way.
To temporarily get non-minified output use this
public class NonMinifyingJavascript : IBundleTransform
{
    public void Process(BundleContext context, BundleResponse bundle)
    {
        if(bundle == null)
        {
            throw new ArgumentNullException("bundle");
        }
        context.HttpContext.Response.Cache.SetLastModifiedFromFileDependencies();
        foreach(FileInfo file in bundle.Files)
        {
            HttpContext.Current.Response.AddFileDependency(file.FullName);
        }
        bundle.ContentType = "text/javascript";
        //base.Process(context, bundle);
    }
}
If you wanted it based totally on a config setting, I imagine you could create an IBundleTransform that delegates to this one or to JsMinify, depending on your config setting.
In order to control the ordering of the javascript files you need to use the BundleFileSetOrdering
var javascriptBundle = new Bundle("~/site/js", new NonMinifyingJavascript());
//controls ordering for javascript files, otherwise they are processed in order of AddFile calls
var bootstrapOrdering = new BundleFileSetOrdering("bootstrap");
//The popover plugin requires the tooltip plugin
bootstrapOrdering.Files.Add("bootstrap-tooltip.js");
bootstrapOrdering.Files.Add("bootstrap-popover.js");
BundleTable.Bundles.FileSetOrderList.Add(bootstrapOrdering);
javascriptBundle.AddDirectory("~/Scripts", "bootstrap-*.js");
I use the MVC default NoTransform instead of the NonMinifyingJavascript proposed by chrisortman. As far as I know it does the same.
But it's still not optimal. Ideally I want a script tag for each individual script file when I want to debug. This makes debugging a lot easier with VS11, which I like to use (one debugger, so I can debug JS and C# in one debug session).
So I created this little helper:
@helper RenderScriptTags(string virtualPath)
{
    if (Minify /* some appsetting */)
    {
        <script src="@System.Web.Optimization.BundleTable.Bundles.ResolveBundleUrl(virtualPath)"></script>
    }
    else
    {
        foreach (var file in System.Web.Optimization.BundleResolver.Current.GetBundleContents(virtualPath))
        {
            <script src="@Url.Content(file)"></script>
        }
    }
}

@RenderScriptTags("~/libraries")
I have a single page app, so I have this in my main cshtml file, but it can easily be generalized by moving this to an htmlhelper extension method.
Works nice!
This code also takes the BundleFileSetOrdering into account if you have set one!
Might also take a look at RequestReduce. It bundles your scripts and CSS without any coding or configuration by looking at how they are laid out on your page and bundling according to that. It also allows you to turn off bundling and minification via web.config or for individual requests via a querystring param: RRFilter=disabled.
I ran into this same problem yesterday and couldn't find a good solution with the new System.Web.Optimization namespace. There were some broken MSDN links, so the fact that everything is in beta means it may change, but I digress...
You could always load the scripts differently during development than in production. Easy to do with an AppSetting:
@if (System.Configuration.ConfigurationManager.AppSettings["BundleResources"] != null)
{
    @* load the css & js using bundles *@
}
else
{
    @* load the css & js files individually *@
}
You can then enable / disable the optimization stuff by commenting out an appsetting in web.config:
<appSettings>
...
<!--<add key="BundleResources" value="uhuh" />-->
...
</appSettings>
