I'd like to better understand the differences between the ways promises can be provided in a webpack build. Blissful ignorance was usually enough to get by while developing apps, but I am definitely a little confused about how to properly develop a plugin/tool/lib.
When creating apps, the two following approaches never caused any issues; I guess mostly because it didn't matter.
webpack.config.js - using a core-js polyfill in the entry point
module.exports = {
  entry: {
    foo: [
      'core-js/fn/promise', // <-- here
      './js/index.js'
    ]
  },
  module: {
    rules: [
      {
        test: /\.js$/,
        loader: 'babel-loader'
      }
    ]
  }
};
Q: In this approach, since it's a polyfill, does it modify the global Promise?
webpack.config.js - shimming using webpack's ProvidePlugin
const webpack = require('webpack'); // needed for webpack.ProvidePlugin below

module.exports = {
  entry: './js/index.js',
  module: {
    rules: [
      {
        test: /\.js$/,
        loader: 'babel-loader'
      }
    ]
  },
  plugins: [
    new webpack.ProvidePlugin({
      Promise: 'es6-promise' // <-- here
    })
  ]
};
Q: Does this mean that the Promise is a module only specific to the webpack bundling process? Does the transpiled ES5 code have a local copy of es6-promise? Does it patch the global Promise?
In regard to creating a jQuery plugin/tool/lib that uses Babel for transpilation...
webpack.config.js - using babel-plugin-transform-runtime
module.exports = {
  entry: {
    foo: [
      './js/start.js'
    ]
  },
  module: {
    rules: [
      {
        test: /\.js$/,
        loader: 'babel-loader'
      }
    ]
  }
};
.babelrc
{
  "presets": ["es2015"],
  "plugins": ["transform-runtime"] // <-- here
}
start.js
require('babel-runtime/core-js/promise').default = require('es6-promise'); // <-- here
require('plugin');
Q: Does this alias es6-promise to the babel-runtime promise, so that it is not global but only local to the tool?
Polyfill in webpack entry
entry: ['core-js/fn/promise', './index.js']
This has the same effect as if you imported it at the top of your entry point.
In this approach, since it's a polyfill, does it modify the global Promise?
Yes, this polyfill changes the global Promise. Calling something a polyfill usually means that it patches the global built-ins, although this convention is not strictly adhered to. Packages that don't change existing APIs but only export the functionality are sometimes called ponyfills.
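A minimal sketch of the distinction (pinkie-promise is one ponyfill package, mentioned here only as an illustration):
// Polyfill style: patches the global built-in as an import side effect
require('core-js/fn/promise');
Promise.resolve(1); // uses the (possibly patched) global Promise

// Ponyfill style: exports the functionality, leaves the global untouched
var PromiseImpl = require('pinkie-promise');
PromiseImpl.resolve(1);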
Webpack shimming with ProvidePlugin
new webpack.ProvidePlugin({
  Promise: 'es6-promise'
})
The ProvidePlugin will import the configured module at the beginning of the module that uses it when the corresponding free variable is found. A free variable is an identifier that has not been declared in the current scope. Global variables are free variables in all local scopes.
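To illustrate what counts as a free variable (a sketch):
function send(url) {
  var data = { id: 1 };         // 'data' is declared here, so it is bound
  return Promise.resolve(data); // 'Promise' is free: it is not declared in
                                // this scope or any enclosing one, so the
                                // ProvidePlugin rewrites it to the import
}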
Whenever a free Promise is encountered, webpack will add the following to the beginning of the module:
var Promise = require('es6-promise');
Does this mean that the Promise is a module only specific to the webpack bundling process?
That is correct: ProvidePlugin is webpack-specific, and it's very unlikely that any other tool will respect webpack settings.
Does the transpiled ES5 code have a local copy of es6-promise?
As with any other module, it is included once by webpack and all imports refer to that module.
Does it patch the global Promise?
It will only modify the global Promise if the imported module does so explicitly. The es6-promise package you're using does not patch the global by default, as shown in its Auto-polyfill section.
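If you do want es6-promise to patch the global, you have to opt in explicitly, per its README:
// Opt-in: explicitly patch the global Promise
require('es6-promise').polyfill();
// or use the auto entry point, which calls polyfill() for you
require('es6-promise/auto');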
Babel transform runtime
{
  "plugins": ["transform-runtime"]
}
The babel-plugin-transform-runtime uses core-js to provide missing functionality like Promise. As you will recall, I said that core-js modifies the global Promise. That is not the case here, because Babel uses the version that doesn't pollute the global namespace, which lives in core-js/library as mentioned in the core-js README. For example:
const Promise = require('core-js/library/fn/promise');
Babel will import the core-js Promise and replace Promise with the imported variable. See also the example in babel-plugin-transform-runtime - core-js aliasing. This is essentially the same thing as webpack's ProvidePlugin except that babel does not bundle up the modules, so it's just adding the import.
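Roughly, the rewrite looks like this (a sketch; exact helper names and interop code vary by Babel version):
// Input
const p = new Promise(function (resolve) { resolve(42); });

// Approximate output with transform-runtime
var _promise = require('babel-runtime/core-js/promise');
var p = new _promise.default(function (resolve) { resolve(42); });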
Does this alias es6-promise to the babel-runtime promise, so that it is not global but only local to the tool?
It is not global because it's just a module. Babel takes your JavaScript and outputs some other JavaScript where the configured features are transpiled to ES5. You will run or bundle the resulting JavaScript and it's practically the same as if you had written ES5 in the first place.
require('babel-runtime/core-js/promise').default = require('es6-promise');
With that you modify the export and therefore the modules will use es6-promise instead. But overwriting an export is not a good idea, especially since the imports of ES modules are immutable in the spec. Babel is currently not spec-compliant in that regard. For more details see Making transpiled ES modules more spec-compliant.
Which one should you use?
It depends on what you're doing. Apart from the difference of whether they change globals or not, you can choose whichever you prefer. For instance, using Babel's transform runtime allows you to use it with any tool that uses Babel, not just webpack.
For a library
None.
Leave the polyfill to the application developer. But you should mention that your library depends on a certain feature, and that when a user wants to use it in an environment that doesn't support the feature, they have to polyfill it themselves. It's also fairly reasonable to assume that Promises are widely supported, and if an application targets older environments, it will very likely have polyfilled them already. Keep in mind that this doesn't mean that you shouldn't transpile new syntax; this advice is specifically about new functionality like Promise or String.prototype.trimLeft.
For a tool
That also depends on your definition of a tool. Let's assume a tool is a piece of software that is used by developers (e.g. webpack, ESLint, etc.). In that case it is exactly the same as for any app; at the end of the day it's just another app that happens to target developers. Speaking specifically about command-line tools: decide what minimum Node version you want to support and include whatever is needed for that; you can declare the supported range in the engines field of your package.json.
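For example, a minimal engines entry (the version range here is illustrative):
{
  "engines": {
    "node": ">=4.3.0"
  }
}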
For a plugin
Plugin is a very broad term and can refer to anything between a library and an app. For example, a webpack plugin or loader should work as is, whereas a jQuery plugin will be part of a web app and you should treat it as a library (such plugins should probably be called libraries instead). Generally you want to match the guidelines of whatever you're extending: have a look at it and see what it targets. For example, webpack currently supports Node versions >=4.3.0, so your plugin should too.
Related
I have a very old JavaScript code base, and I do not want to (and cannot) compile all of the JavaScript files into one bundle the standard webpack way, because of how the website code is written.
But I want to write new scripts using modern JavaScript (e.g. Promises and Fetch) while still supporting old browsers like IE11.
I have configured webpack and Babel so that they take multiple entry JavaScript files and, for each of them, do the classic transpiling/polyfilling using @babel/preset-env and core-js.
This works and polyfills every script based on the Babel target config, but it creates one issue: it encapsulates global variables/functions in each script so they are not accessible from other scripts which reference them (yes, old JavaScript). Is there a way to disable these structural modifications?
Also, I know I could use Babel alone without webpack for this, but the problem is that when I try to polyfill e.g. Fetch, I have to use https://github.com/github/fetch, which cannot simply be used with Babel as far as I know.
Any help appreciated.
I think your refactorings are inevitably modernizing the code, and if you are not careful, one day you may end up bundling everything with webpack ;)
The setup you describe can be achieved with:
module.exports = {
  entry: {
    messages: "./src/messages",
    "hello-world": "./src/hello-world",
  },
  output: {
    library: {
      type: "global",
    },
    filename: "[name].js",
  },
};
Every export from each file is put directly on window; if you load the files in the right order, you can have invisible dependencies maintained across the codebase.
To this setup, you can add babel-loader with the presets you stated in your question. Besides, you can start doing explicit imports across different files: even if function X is available on the global scope, you can migrate some places to import/require it explicitly.
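A minimal sketch of what that looks like (file contents illustrative):
// src/messages.js - with output.library.type set to "global", this
// export ends up directly on window after the bundle loads
export function getMessage() {
  return "hello world";
}

// src/hello-world.js - works as long as messages.js was loaded first
// (an invisible dependency), or migrate to an explicit import:
// import { getMessage } from "./messages";
console.log(getMessage());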
If you want to play with my code yourself, you can find it here:
https://github.com/marcin-wosinek/webpack-legacy/
and here is the example in action:
https://marcin-wosinek.github.io/webpack-legacy/
I am trying to answer: when to use import/export and when to use require()/module.exports? But as I dig in, it seems to get complicated.
Here's my understanding
require()/module.exports: this is Node.js's implementation of the module system. It loads modules synchronously.
With ES6, we can use import/export. The docs say:
The import statement is used to import bindings which are exported by another module. Imported modules are in strict mode whether you declare them as such or not. The import statement cannot be used in embedded scripts unless such script has a type="module".
Ques1: How does this work with babel or webpack or browsers in general?
As I was exploring, I came across things like CommonJS, RequireJS, and the Asynchronous Module Definition (AMD).
Ques2: I am more interested in knowing the timeline of how these things evolved in JavaScript.
How does this work with babel or webpack or browsers in general?
Babel and webpack follow the ES spec when handling import/export. Since they also support the require syntax, they usually transpile import statements to require() calls and export statements to module.exports assignments, and then ship the result with a custom loader for modules. For example, if you have:
// A.js
export default function() { }
// B.js
import A from "./A";
A();
Then it gets transpiled to the following require syntax:
//A.js
exports.default = function() {};
//B.js
var A = require("./A").default;
A();
That could then get wrapped to something like:
(function() { // prevent leaking to the global scope
  // emulated loader:
  var modules = {};
  function require(name) { return modules[name].exports; }
  function define(name, fn) {
    var module = modules[name] = { exports: {} };
    fn(module, module.exports, require);
  }
  // The code:
  define("A", function(module, exports, require) {
    // A.js
    exports.default = function() { };
  });
  define("B", function(module, exports, require) {
    // B.js
    var A = require("A").default;
    A();
  });
})();
how these things evolved in JavaScript?
A few years ago, JS ran only in browsers, and the only way to load multiple JS sources was to use multiple <script> tags and exchange functionality through the global object. That was ugly.
Then Node.js was invented; it needed a better way to work with modules and introduced require().
The writers of the spec saw a need for a native syntax for that, so import / export were introduced.
Babel and others then wrote transpilers.
What webpack the bundler does is the following:
You specify an input file in the config
You specify an output file in the config
Webpack will look at all the files which the input file requires (CommonJS module system) or imports (ES6 module system). It then funnels the code, based on file name extension, through loaders. Loaders can transpile the individual files to code the browser can understand. An example of a loader is babel-loader or the sass/scss compiler.
After the individual files are transpiled with loaders, plugins can transform the bundle of generated code into something else. The bundle is just a bunch of code which together forms a piece of functionality.
I won't go too deeply into the internals of webpack, but the most important thing to understand is:
You use webpack so you can split up your code into multiple files, which makes it more maintainable and easier to work with. However, having the client request all these files separately would be horrible for performance (the overhead of many HTTP requests). Therefore, we bundle the files into one, or a few, so this overhead is reduced.
Generally, you should write all modern code with import/export syntax if you are using a bundler like webpack or transpiling with Babel. npm modules may favor the require/module.exports syntax, but you can still import them.
Also worth noting is the import() method, which returns a promise that resolves to the module being imported, asynchronously. Webpack may bundle these as asynchronous chunks if you have it configured to do so.
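A short sketch of dynamic import (the module path is illustrative):
// webpack can emit './heavy-module' as a separate chunk that is only
// fetched when this line runs
import('./heavy-module').then(function (mod) {
  mod.default(); // use the module's default export
});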
In practice, resolution through tooling like Babel and webpack falls back to Node-style behavior with the node_modules lookup, favoring relative paths. Additional support varies per environment.
You can experiment with ESM support in modern browsers and in current Node (behind a flag as of this answer). The behaviors are somewhat inconsistent, but well defined. When in doubt, experiment and try.
I'm migrating a project based on require.js to webpack v3. All my modules use the following syntax:
define([modules,..], function(mod1,..)
This declares which modules to use and assigns them to the parameters of the anonymous function. This seems to be deprecated since webpack v2; I can't find any information about it (except in the documentation for webpack v1).
Should I rewrite all my modules to CommonJS (including dependencies), or is there any smart way to keep using the AMD modules?
Help much appreciated :-)
Regards
AMD never found much use outside of RequireJS, so you will likely need to convert. There are tools that will help:
https://github.com/anodynos/uRequire can convert code from AMD -> UMD / CommonJS
There are caveats from (https://github.com/anodynos/uRequire/wiki/nodejs-Template):
Runtime translation of paths like models/PersonModel to ../../models/PersonModel, depending on where it was called from. You'll still get build-time translated bundleRelative paths, converted to their nodejs fileRelative equivalent.
For most projects this is not an issue.
Can't use the asynchronous version of require(['dep'], function(dep){...})
You should be able to use the synchronous version of require. If you are using webpack 2, you can use System.import or require.ensure (see the sketch after this list).
Can't run requirejs loader plugins, like text!... or json!...
You will find webpack versions of all of these plugins
There's no mapping of /, i.e. webRootMap etc., or using requirejs.config's {baseUrl: "..."} or {paths: {"lib": "../../lib"}}
This can be replicated with https://www.npmjs.com/package/babel-plugin-module-alias
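As referenced above, a sketch of converting the asynchronous AMD require to webpack (the module name 'dep' is illustrative):
// AMD style (RequireJS):
require(['dep'], function (dep) {
  dep.doSomething();
});

// webpack 2, callback style:
require.ensure(['dep'], function (require) {
  var dep = require('dep');
  dep.doSomething();
});

// webpack 2, promise style:
System.import('dep').then(function (dep) {
  dep.doSomething();
});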
CaptEmulation's answer is not valid for newer webpack versions. Webpack supports AMD natively (neither additional loaders nor plugins need to be installed). Thorough instructions are available here: https://webpack.js.org/api/module-methods.
This fact may easily go unnoticed when one tries to rewrite a RequireJS-based build to webpack, as RequireJS uses relative paths without the leading ./, e.g.
define(['app/dep1'], function(dep1) { ... });
which will not pass in Webpack without additional configuration (assuming that both require.config.js and webpack.config.js are in the same directory):
{
  resolve: {
    modules: [ './', ... ] // other entries possible here
  }
}
Currently I can do:
require('./frontend/src/components/SomeComponent');
But if I set the following in my webpack.config.js:
resolve: {
  root: path.resolve('frontend', 'src')
}
I can instead do:
require('components/SomeComponent');
The problem is, when I don't use Webpack (eg. in a test environment) all of my imports break. According to the Babel docs, the sourceRoot property sets the "root from which all sources are relative." This made me think I could add the following to my .babelrc to fix my imports:
"sourceRoot": "frontend/src"
... but no such luck. When I do require('components/SomeComponent'); in babel-node it fails. When I just use Babel to transpile the file, the require line is the same whether or not I set a sourceRoot.
So, my question is, is there any way (with or without sourceRoot) to simulate webpack's resolve.root in Babel?
P.S. I know there are several Babel plugins which address this problem, but all of the ones I've seen require you to add a ~ to the require path (which of course breaks imports in webpack).
Many projects use webpack + Babel, and in many of them you sometimes bypass webpack (as in your case, for tests).
In such cases, all the resolve aliases should live in Babel.
There are plugins out there that allow one tool to read the configuration of the other (and there are similar plugins for ESLint etc.).
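For example, babel-plugin-module-resolver is one plugin that can mirror webpack's resolve.root without a ~ prefix; an illustrative .babelrc:
{
  "plugins": [
    ["module-resolver", {
      "root": ["./frontend/src"]
    }]
  ]
}
With that, require('components/SomeComponent') resolves against frontend/src both in webpack builds and under babel-node.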
I have this in my webpack.config.js to create two different outputs from two sources:
module.exports = {
  entry: {
    'dist/index.js': ['babel-polyfill', './src/Component.jsx'],
    'example/bundle.js': ['babel-polyfill', './src/Page.jsx'],
  },
  output: {
    path: './',
    filename: '[name]',
  },
  ...
Compiling it with webpack works just fine, but if I load index.js in a browser I get this error:
Uncaught Error: only one instance of babel-polyfill is allowed
I need babel-polyfill for both outputs. What can I do?
When developing a library (as opposed to an application), the Babel team does not recommend including babel-polyfill in your library. We recommend either:
Assume the ES6 globals are present; you'd instruct your library users to load babel-polyfill in their own codebase.
Use babel-runtime by enabling babel-plugin-transform-runtime, which attempts to analyze your code to figure out what ES6 library functionality you are using, then rewrites the code to load the polyfilled logic from babel-runtime instead of from the global scope. One downside of this approach is that it has no way to polyfill new .prototype methods like Array.prototype.find, since it can't know whether foo in an expression like foo.find is an array.
So my recommendation would be to remove babel-polyfill from your dist/index.js bundle entirely.
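To illustrate the downside of the babel-runtime option, a small sketch:
// transform-runtime rewrites references to globals it can see statically:
var p = Promise.resolve(1); // becomes babel-runtime's Promise

// ...but it cannot polyfill instance methods, because it can't know the
// receiver's type at compile time:
[1, 2, 3].find(function (x) { return x > 1; }); // .find is left untouched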
Use idempotent-babel-polyfill
import 'idempotent-babel-polyfill';
https://github.com/codejamninja/idempotent-babel-polyfill