I was surprised by an experience with relative paths in JavaScript today. I’ve boiled down the situation to the following:
Suppose you have a directory structure like:
app/
|
+--app.html
+--js/
|
+--app.js
+--data.json
All my app.html does is run js/app.js
<!DOCTYPE html>
<title>app.html</title>
<body>
<script src="js/app.js"></script>
</body>
app.js fetches the JSON file and stores the data:
// js/app.js
fetch('js/data.json') // <-- this path surprises me
.then(response => response.json())
.then(data => app.data = data)
The data is valid JSON, just a string:
"Hello World"
This is a pretty minimal usage of fetch, but I am surprised that the URL that I pass to fetch has to be relative to app.html instead of relative to app.js. I would expect this path to work, since data.json and app.js are in the same directory (js/):
fetch('data.json') // nope
Is there an explanation for why this is the case?
When you say fetch('data.json') you are effectively requesting http://yourdomain.com/data.json, since the path is resolved relative to the page you're making the request from. You can lead with a forward slash, which indicates that the path is relative to the domain root: fetch('/js/data.json'). Or fully qualify it with your domain: fetch('http://yourdomain.com/js/data.json').
An easy way to understand why it must be the case is to consider what should happen if we write a helper function in app/js/helper/logfetch.js:
// app/js/helper/logfetch.js
function logFetch(resource) {
  console.log('Fetching', resource);
  return fetch(resource);
}
Now, consider what happens if we use logFetch from app/js/app.js:
// app/js/app.js
fetch('data.json'); // if this is relative to js/, then ...
logFetch('data.json'); // should this be relative to js/ or js/helper/?
We might want these two calls to return the same thing - but if fetch were relative to the containing file, then logFetch would request js/helper/data.json instead of something consistent with fetch.
If fetch could sense where it is called from, then to implement helper libraries such as logFetch, JavaScript would need a whole range of new caller-location-aware functionality.
In contrast, performing the fetch relative to the HTML file provides more consistency.
CSS works differently because it doesn't have the complexity of function calls: you can't create "helper CSS modules" that transform other CSS modules, so resolving paths relative to the containing file is conceptually much cleaner there.
This is not exactly a recommendation, since it relies on a number of things that aren't guaranteed to work everywhere or to keep working in the future. However, it works for me where I need it to, and it might help you.
// Parse the running script's URL out of an Error stack trace.
// Note: stack trace formats are engine-specific, so this is fragile.
const getRunningScript = () => {
  return decodeURI(new Error().stack.match(/([^ \n\(#])*([a-z]*:\/\/\/?)*?[a-z0-9\/\\]*\.js/ig)[0])
}

fetch(getRunningScript() + "/../config.json")
  .then(req => req.json())
  .then(config => {
    // code
  })
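A sturdier variant of the same idea (a sketch, not part of the original answer): classic scripts can read their own URL from document.currentScript.src, and ES modules get it from import.meta.url; either way, the URL constructor does the resolution:

```javascript
// Resolve a path relative to the script's own URL instead of the page's.
// In a browser you would pass document.currentScript.src (classic script)
// or import.meta.url (ES module) as scriptUrl; the value below is a placeholder.
function resolveFromScript(relativePath, scriptUrl) {
  // new URL(rel, base) drops the file name from base before resolving
  return new URL(relativePath, scriptUrl).href;
}

// e.g. inside http://example.com/js/app.js:
resolveFromScript('data.json', 'http://example.com/js/app.js');
// → 'http://example.com/js/data.json'
```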
I have a scenario in which I want to dynamically tell a child component which image to show, based on a required image passed down by the parent, like so:
Parent:
function Social() {
  return (
    <SocialBar>
      {SocialMedias.map((media) => {
        return <SocialApp
          key={uuidv4()}
          link={media.link}
          social={require(media.icon)}
        />;
      })}
    </SocialBar>
  );
}
Child
function SocialApp(props) {
  return (
    <Social href={props.link} target="_blank">
      <CustomSizedImage src={props.social.default} />
    </Social>
  );
}
Which seems pretty straightforward, right? However, this causes the error below:
I've seen some posts say that this is caused because the path is not absolute. However, I tested with absolute paths and it still doesn't work. The funny thing is, when I changed the parent code to the below for testing, it worked.
New parent code:
function Social() {
  const img = require("../assets/icons/social-media/instagram.svg");
  return (
    <SocialBar>
      {SocialMedias.map((media) => {
        return <SocialApp
          key={uuidv4()}
          link={media.link}
          social={img}
        />;
      })}
    </SocialBar>
  );
}

export default Social;
The result:
Note: I tested with EVERY image that this code could possibly load. It is not an image/image-path related issue.
What I expect from all this is to dynamically load the social media icons with the correct URLs, etc. What I do not understand is: why does it work when I require the image outside the return? I know I could make this work by iterating over the media objects outside the return and using the resulting array with the map index, but that doesn't seem clean. Why does it behave this way? Is there a way to make it work inside the map?
Your last code snippet works because the path is known at compile time.
I assume you use a bundler like Webpack or similar. When it compiles your project, it checks all imports (including require calls), uses them to calculate the dependency tree, and finally bundles all required files together. For images, that includes "compiling them into a module" which just returns the path to the image.
Usually that also involves replacing paths with module identifiers. E.g. if you have ever looked at built Webpack code, you'll notice that it doesn't use require('./module') but something like __webpack_require__(5): the module './module' was given the unique ID 5 in the bundled code.
With your earlier code snippets, when you use a dynamic require with a non-constant string, Webpack just doesn't know that you'll request that image, nor would it know the unique ID to use for it.
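Purely as an illustration (the module ID and path below are invented), bundled output behaves roughly like this, which shows why a path string that only exists at runtime can't work:

```javascript
// A bundler rewrites every *static* require into a table lookup.
// An "image module" is just a module whose export is the image's final path.
const modules = {
  5: (exports) => { exports.default = '/static/media/instagram.abc123.svg'; },
};

function __webpack_require__(id) {
  const exports = {};
  modules[id](exports); // evaluate the module body, filling in exports
  return exports;
}

// require('./media/instagram.svg') was rewritten at build time to:
__webpack_require__(5).default; // → '/static/media/instagram.abc123.svg'
```

A require(media.icon) with a runtime string has no entry in that table, so there is nothing for the bundler to rewrite it to.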
The easiest solution is to import your images somewhere and use those references dynamically:
// images.ts
export { default as IconA } from './media/icon-a.svg';
export { default as IconB } from './media/icon-b.svg';
// somewhere else
import { IconA, IconB } from './images';
const iconPath = someCondition ? IconA : IconB;
Of course, you don't have to use the import/require mechanism. You can simply have your icon path be an actual path, relative to your public/ folder.
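For example (a sketch with made-up icon names and paths), the dynamic part then becomes a plain object lookup, while all the imports stay static and visible to the bundler:

```javascript
// IconA/IconB stand in for the URLs the bundler generates for static imports.
const IconA = '/static/media/icon-a.svg';
const IconB = '/static/media/icon-b.svg';

// Every import is static, so the bundler sees it; only the lookup is dynamic.
const icons = { 'icon-a': IconA, 'icon-b': IconB };

function iconFor(name) {
  return icons[name];
}

iconFor('icon-a'); // → '/static/media/icon-a.svg'
```

Inside the map you would then write social={iconFor(media.icon)} instead of social={require(media.icon)}.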
I'm using Webpack to bundle source code and assets for a game. I also use the CompressionPlugin() to make static gzip files available so that my web server can send the precompressed files when appropriate. Some of the game assets are large so I have a loading experience up front that shows a progress bar while the assets are fetched.
Unfortunately a problem arises in Chrome during loading: when receiving a gzip response for an XMLHttpRequest, the onprogress total is always 0. There are some imperfect workarounds for this, such as this solution, but they're not entirely appropriate for my case.
Instead I'd like to inject the compressed & decompressed file sizes of specific bundled assets into the html or javascript so that they're immediately accessible to the loading javascript code. Injecting something as follows would be perfect:
<script>
const assetSizes = {
"game.25317abd3eb6cf0fb0f1.wasm": {uncompressed: 8192, compressed: 1024},
"game.25317abd3eb6cf0fb0f1.data": {uncompressed: 8192, compressed: 1024}
};
</script>
I'm somewhat new to webpack, so I'm not entirely sure how to approach this. I've considered using the WebpackManifestPlugin and implementing a custom generate option function. This would allow me to control the output of the generated manifest.json, but it's still not clear to me whether this is the right thing to do, or how I'd go about subsequently injecting this file's contents ahead of my own loading JavaScript code.
Perhaps there is a better approach that would be more appropriate?
Update: I've been trying to progress this further and it feels like a custom Webpack plugin might be the right direction to go. If I tap into the afterEmit compiler hook it seems I have access to the filesizes I need and I can construct an appropriate dictionary:
class InjectFileSizePlugin {
  apply(compiler) {
    compiler.hooks.afterEmit.tap(
      "InjectFileSizePlugin",
      (compilation) => {
        const fileSizes = {};
        for (const [assetPath, assetInfo] of compilation.assetsInfo) {
          if (assetInfo.related) {
            fileSizes[assetPath] = {
              uncompressed: assetInfo.size,
              compressed: -1
            };
            if (assetInfo.related.gzipped) {
              const gzippedAssetInfo = compilation.assetsInfo.get(
                assetInfo.related.gzipped
              );
              if (gzippedAssetInfo) {
                fileSizes[assetPath].compressed = gzippedAssetInfo.size;
              }
            }
          }
        }
        console.log(fileSizes); // <-- output is as I'd like; how to inject it now?
      }
    );
  }
}
What's not clear, though, is how I can now go about injecting this fileSizes data into the bundle, as afterEmit is called very late in the compilation, after the bundle JavaScript has been emitted. There is an additionalPass compiler hook, but I currently can't figure out how it works.
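One possible direction (just a sketch, and it assumes pairing the plugin with something like html-webpack-plugin, whose hooks allow modifying the generated HTML): serialize the collected map into an inline script tag and prepend it ahead of the loading code.

```javascript
// Serialize the fileSizes map gathered by the plugin into an inline
// <script> snippet; an HTML-generating plugin hook could then prepend
// this to the page. The asset name below is illustrative.
function buildAssetSizeScript(fileSizes) {
  return `<script>const assetSizes = ${JSON.stringify(fileSizes)};</script>`;
}

buildAssetSizeScript({
  'game.25317abd3eb6cf0fb0f1.wasm': { uncompressed: 8192, compressed: 1024 },
});
```

The loading code can then read assetSizes synchronously, without waiting on a manifest fetch.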
I'm not sure I'm even asking the right question here, sorry, but I think the two general ones are:
How do you need to modify a Node.js package that uses require etc. so it can be used as a plain embedded script/library in HTML?
How do you call a class constructor (?) in JS as a function to validate a form field?
I'm trying to use this small JS library NoSwearingPlease (which is an npm package) in an environment with no node or build system – so I'm just trying to call it like you would jQuery or something with a script & src in the HTML, and then utilise it with a small inline script.
I can see a couple of things are required to get this working:
the JSON file needs to be called in a different way (not using require etc)
the checker variable needs to be rewritten, again without require
I attempted using jQuery getJSON but I just don't understand the class & scope bits of the library enough to use it I think:
var noswearlist = $.getJSON( "./noswearing-swears.json" );
function() {
console.log( "got swear list from inline script" );
})
.fail(function() {
console.log( "failed to get swear list" );
})
noswearlist.done(function() {
console.log( "done callback as child of noswearlist variable" );
var checker = new NoSwearing(noswearlist);
console.log(checker);
});
Please halp. Thanks!
No need to modify anything: outside of Node, the class is simply attached to window (the global object):
fetch("https://cdn.jsdelivr.net/gh/ThreeLetters/NoSwearingPlease@master/swears.json").then(response => {
  return response.json();
}).then(data => {
  var noSwearing = new NoSwearing(data);
  console.log(noSwearing.check("squarehead"));
});
<script src="https://cdn.jsdelivr.net/gh/ThreeLetters/NoSwearingPlease@master/index.js"></script>
In the future, you can answer this type of question on your own by looking through the source code and looking up things you don't understand. That being said, here's what I was able to gather doing that myself.
For your first question, if you have no build tools you can't use require, you have to hope your NPM package supports adding the class to the window or has a UMD export (which in this case, it does). If so, you can download the source code or use a CDN like JSDelivr and add a <script> tag to link it.
<script src="https://cdn.jsdelivr.net/gh/ThreeLetters/NoSwearingPlease@master/index.js"></script>
I'm having a hard time deciphering your script (it has a few syntax errors as far as I can tell), so here's what you do if you have a variable ns containing the JSON and the string str that you need to check:
var checker = new NoSwearing(ns);
checker.check(str);
As an aside, you should really use build tools to optimize your bundle size and make using packages a lot easier. And consider dropping jQuery for document.querySelector, fetch/XMLHttpRequest, and other modern JavaScript APIs.
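For reference, here is a minimal sketch of the UMD-style wrapper idea mentioned above (hypothetical code, not the package's actual source): it exports via module.exports under CommonJS and otherwise attaches the class to the global object for plain <script> users.

```javascript
// Hypothetical UMD-style wrapper, not the real NoSwearingPlease source:
// under CommonJS it exports via module.exports; in a plain <script> it
// attaches the class to the global object (window in browsers).
(function (root, factory) {
  if (typeof module === 'object' && module.exports) {
    module.exports = factory();   // Node / bundlers
  } else {
    root.NoSwearing = factory();  // plain <script> tag
  }
})(typeof globalThis !== 'undefined' ? globalThis : this, function () {
  // Stand-in implementation, just to show the shape of the export
  function NoSwearing(words) { this.words = words; }
  NoSwearing.prototype.check = function (str) {
    return this.words.some((w) => str.includes(w));
  };
  return NoSwearing;
});
```

When a package ships a wrapper like this, the same file works with require() and as a CDN <script>, which is exactly what makes the no-build-tools setup possible.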
If we have different bundles created by webpack and we require.ensure something to dynamically transfer and eval it at a later point in time, it happens via JSONP and some webpack JS magic. If we have
require.ensure([ ], ( require ) => {
console.log('before...');
var data = require( './myModule.js' );
console.log('after...');
}, 'myModule')
"after..." will get logged once that module has been entirely transferred and evaluated. If this chunk/module happens to be pretty big and contains images, CSS and whatnot, the loading will pretty much lock down the browser while the webpack JavaScript code unpacks the bundle with all its components.
Question: Is there any way to "hook" into that require magic? For instance, it would be a dream scenario to have callbacks for:
whole file / chunk was transferred
image[1] was evaluated
css[1] was evaluated / style tag was injected
javascript was evaluated
and so forth, assuming that our transferred bundle contains a lot of data. In general it just bothers me that we have such a nice option to transfer whole bundles asynchronously and dynamically, but still have to evaluate that very bundle in a fully synchronous/blocking fashion.
Let me preface by saying I know this might be an 'annoying' answer, because it doesn't answer your question directly but offers an alternative, pragmatic solution to the browser-hanging problem. I used this pattern myself to manage asset loading within the context of a heavy 3D web game.

I'm writing this as an answer and not as a comment so it might serve others who come across the same problem. If this does answer your case, I'll be happy to provide actual code to implement and generify these sorts of modules.
If I understand correctly, essentially what you want is a way to break down MyModule into discrete components which can be atomically loaded and evaluated within the context of one require.ensure, but with the evaluation handled so that not everything is evaluated in one go, resulting in a browser hang.
A different way to look at this is to use the require and ensure methods themselves as the loading/evaluation mechanisms. Consider MyModule.js, which is a huge-loading module with the dependencies Css1, Css2, ... CssN as well as JS1, JS2, ... JSN and images.
My suggestion is to break it down into SuperMyModule.js which requires MyModuleLogic.js as well as all the CSS, images and JS.
Now, in SuperMyModule.js you could do:
let myModuleLogic = require("myModuleLogic");
console.log('JS was evaluated');
require.ensure(['./image1.png'], ( require ) => {
  let data = require( './image1.png' );
  console.log('image[1] was evaluated');
  // register that resource was evaluated / fire event
})

require.ensure(['./style1.css'], ( require ) => {
  let data = require( './style1.css' );
  console.log('css[1] was evaluated');
  // register that resource was evaluated / fire event
})
//after all resources evaluated/fire callback or event
Then in your original file, like you requested:
require.ensure([ ], ( require ) => {
console.log('before...');
let myModule = require( './superMyModule.js' );
console.log('after...');
})
And if you set up your module instance as an event emitter, you could hook into the loading of resources like so:
require.ensure([ ], ( require ) => {
let myModule = require( './superMyModule.js' );
myModule.on("loadResource", myCallback)
})
I guess I was confused about the topic myself, so my question was probably not precise enough to be answered properly. However, my misunderstanding in the whole "CommonJS dynamic module loading" context was that require.ensure() just transfers the module code (respectively the chunk which webpack created) over the wire. After that, the transferred chunk, which is basically just one big ECMAScript file, sits there in the browser, cached but not yet evaluated. Evaluation of the entire chunk happens only on the actual require() call.
Having said that, it is entirely in your hands how you decouple and evaluate the individual parts of a module/chunk. If, for example, like in my original question, a module require()s some CSS files, some images and some HTML, all of that gets transferred asynchronously on the require.ensure() call. In which manner you require() (and therefore evaluate) those parts is entirely up to you, and you can decouple those calls yourself if necessary.
For instance, a Module looks like this:
Module1.js
"use strict";
import { io } from 'socket.io-client';
document.getElementById( 'foo' ).addEventListener('click', ( event ) => {
let partCSS = require( 'style/usable!./someCoolCSS.css' ),
moarCSS = require( 'style/usable!./moarCoolCSS.css' ),
tmpl = require( './myTemplate.html' ),
image1 = require( './foo.jpg' ),
image2 = require( './bar.png' );
}, false);
Of course, all these files are already contained by the Chunk which gets transferred to the client, when some other Module calls:
require.ensure([ 'Module1.js' ], ( require ) => {
}, 'Module1');
This is where my confusion was. So now we can just play with the require() calls within module1.js ourselves. If we really require a lot of files that way, we could even use a setTimeout/setImmediate run-away timer to decouple the synchronous evaluation between each require() call if necessary or wanted.
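That decoupling can be sketched roughly like this (the helper name and paths are invented; requireFn stands in for webpack's require):

```javascript
// Evaluate each part in its own macrotask so the browser can render
// in between, instead of evaluating the whole chunk in one blocking run.
function requireSequentially(paths, requireFn, done) {
  const results = [];
  (function next(i) {
    if (i >= paths.length) return done(results);
    results.push(requireFn(paths[i])); // synchronous evaluation of one part
    setTimeout(() => next(i + 1), 0);  // yield back to the event loop
  })(0);
}

// usage sketch, inside the click handler above:
// requireSequentially(['./someCoolCSS.css', './foo.jpg'], require, (mods) => { ... });
```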
Actually a long answer for a pretty simple story.
TL;DR:
"require.ensure transfers a whole chunk over the wire. This chunk contains all files which are part of a require() call within the ensured Module. But those files do not get automatically evaluated. That happens only when the actual require() call is matched at runtime (which is represented by a webpackJSONP call at this point)"
You can offload the main thread and avoid blocking using the worker loader.
Downside: there's extra message passing to do between main window and the worker.
https://github.com/webpack/worker-loader
You can also try emitting load events in the large module to track more granular progress.
Further reference:
MDN docs https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API
Browser support http://caniuse.com/#feat=webworkers
If you would like to kick off loading asynchronous JavaScript bundle through require.ensure as well as other Promises, here is how you can achieve that:
const requireEnsurePromise = new Promise((resolve) => {
  require.ensure(['./modulePath'], function (requireEnsure) {
    console.log('The module is fetched but not evaluated yet');
    resolve(requireEnsure.bind(null, require.resolve('./modulePath')));
  });
});

Promise.all([
  fetch('/api/relevant/stuff').then(response => response.json()),
  requireEnsurePromise,
]).then((values) => {
  if (values[0]) {
    // DO STUFF
  }
  console.log('right before module is evaluated');
  const evaluatedModule = values[1]();
});
Webpack statically determines what the module path corresponds to in its internal representation (it could be an integer or a string). Whenever Webpack recognizes the require.ensure([], fn) pattern, it looks at the function body of the fn callback to do so. To delay evaluation until after the JavaScript bundle has been fetched, in Promise fashion, require('./modulePath') cannot appear inside the require.ensure success callback, as that would evaluate the module. Webpack translates require('./modulePath') into something like __webpack_require__(2343423), which is why you want to avoid using it in this scenario.
So I've just updated to webpack 2 and have my first working setup where webpack automatically creates chunks by looking at System.import calls. Pretty sweet!
However, I load the initial chunk with an AJAX call so that I can show progress while loading.
So my question is, can I overwrite or change the function of System.import somehow so that it will use an ajax request that I can listen to for events, instead of loading the chunk with a <script> tag?
No, unfortunately not. webpack 2 translates System.import() into ordinary require.ensure() calls, which just use the <script> tag. Even the official WHATWG Loader Spec does not provide an API for this kind of event. I've created an issue for this question.
Regarding webpack: there is a way to implement your own require.ensure(). However, since chunk loading is an integral part of webpack, this requires diving a little deeper. I'm not sure how important this is for you, but you might be interested in how things work inside webpack, so let's take a look:
In webpack, all internal features are implemented as plugins. This way, webpack is able to support a lot of different features and environments. So, if you're interested how things are implemented in webpack, it's always a good idea to a) take a look at WebpackOptionsApply or b) search for a specific string/code snippet.
Chunk loading depends heavily on the given target, because you need different implementations for each environment. Webpack allows you to define custom targets. When you pass in a function instead of a string, webpack invokes the function with a compiler instance. There you can apply all the required plugins. Since our custom target is almost like the web target, we just copy all the stuff from the web target:
// webpack.config.js
const path = require("path");
const NodeSourcePlugin = require("webpack/lib/node/NodeSourcePlugin");
const FunctionModulePlugin = require("webpack/lib/FunctionModulePlugin");
const LoaderTargetPlugin = require("webpack/lib/LoaderTargetPlugin");
const JsonpTemplatePlugin = require("webpack/lib/JsonpTemplatePlugin");
function customTarget(compiler) {
compiler.apply(
new JsonpTemplatePlugin(compiler.options.output),
new FunctionModulePlugin(compiler.options.output),
new NodeSourcePlugin(compiler.options.node),
new LoaderTargetPlugin("web")
);
}
module.exports = {
entry: require.resolve("./app/main.js"),
output: {
path: path.resolve(__dirname, "dist"),
filename: "bundle.js"
},
target: customTarget
};
If you take a look at each plugin, you will recognize that the JsonpTemplatePlugin is responsible for loading chunks. So let's replace that with our own implementation. We'll call it the XHRTemplatePlugin:
function customTarget(compiler) {
compiler.apply(
new XHRTemplatePlugin(compiler.options.output),
new FunctionModulePlugin(compiler.options.output),
new NodeSourcePlugin(compiler.options.node),
new LoaderTargetPlugin("my-custom-target")
);
}
Our XHRTemplatePlugin is responsible for providing the code in the main chunk, in each child chunk and for hot updates:
function XHRTemplatePlugin() {}
XHRTemplatePlugin.prototype.apply = function (compiler) {
compiler.plugin("this-compilation", function(compilation) {
compilation.mainTemplate.apply(new XHRMainTemplatePlugin());
compilation.chunkTemplate.apply(new XHRChunkTemplatePlugin());
compilation.hotUpdateChunkTemplate.apply(new XHRHotUpdateChunkTemplatePlugin());
});
};
Maybe you can also re-use the JsonpChunkTemplatePlugin and JsonpHotUpdateChunkTemplatePlugin plugins, but this depends on your use case/implementation.
Your XHRMainTemplatePlugin now may look like this:
function XHRMainTemplatePlugin() {}
XHRMainTemplatePlugin.prototype.apply = function (mainTemplate) {
mainTemplate.plugin("require-ensure", function(_, chunk, hash) {
return this.asString([
// Add your custom implementation here
"fetch()"
]);
});
};
I won't go any further here because I think this answer is already long enough, but I recommend creating a really small example project and checking the output created by webpack. The internal webpack plugins may look a little scary at first sight, but most of them are really short and do just one thing. You can also get some inspiration from them.