This question already has answers here:
Is using an ES6 import to load specific names faster than importing a namespace?
(3 answers)
Closed 4 years ago.
Suppose I have a module foo like this:
export const f = x => x + 1;
export const g = x => x * 2;
I can use this module like this:
import { f, g } from 'foo';
console.log(f(g(2)));
Or like this:
import * as foo from 'foo';
console.log(foo.f(foo.g(2)));
I prefer the second way because it prevents name collisions between modules.
However, is import * less efficient? Does it prevent bundlers (such as Rollup and Webpack) from spotting unused imports and removing them?
When you specify imports as import { f, g } from 'foo'; you will generally get better results in compilation speed and bundle size, because you pull in only the dependencies you actually need.
Note: as loganfsmyth pointed out, some recent compilers/bundlers are able to work out which exports are actually used even with a namespace import; in my opinion this is an additional analysis step that could cost some compilation time (although I have not benchmarked this assumption).
import * is less efficient in that you may pull in the entire library, and so use more memory, as opposed to just the specific functions you actually need.
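As an illustration, this is roughly what a tree-shaking bundler such as Rollup could emit for the named-import example if the consuming code only ever called f (a sketch, not the literal output of any particular tool):

// hypothetical bundled output
const f = x => x + 1;
console.log(f(2));

The unused g simply never makes it into the bundle, because the named import makes the usage statically analysable.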
Webpack at least (not sure about Rollup) is perfectly able to see that foo.f is a reference to the f exported name, so your two examples will behave the same.
In your example it would not matter for most bundlers anyway, since both exports are used and everything has to be included. An unused export can only be dropped if evaluating it has no side effects; for example, no bundler could remove g here:
export const g = (() => {
  console.log("Side effects!");
  return x => x * 2;
})();
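If an initialiser really has no side effects, you can also tell the bundler so explicitly. As a sketch (the /*#__PURE__*/ annotation is understood by Rollup, webpack and Terser, but whether it helps in your exact setup is something to verify yourself):

export const g = /*#__PURE__*/ (() => {
  return x => x * 2;
})();

With the annotation, an unused g can be dropped even though its value comes from a call expression.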
This question already has an answer here:
Relation between import and destructing es6 syntax?
(1 answer)
Closed 11 months ago.
In JavaScript I can rename an imported name this way:
import { original_name as alias_name } from 'my_module'
Is there a good explanation for needing the as keyword instead of just following the excellent destructuring syntax?
Why not something like this:
import { original_name: alias_name } from 'my_module'
Just a curiosity that always comes to my mind.
The answer is, of course, that they are different because they do different things and only happen to have similar syntax.
But maybe there is a better reason why they were designed that way? I experimented with Node.js modules in the following way:
// origin.js
export let specialNumber = 2;

setInterval(() => {
  specialNumber = Math.random();
}, 400);

// importer.js
import { specialNumber as aliasAttempt } from './origin.js';
import * as StarImport from './origin.js';

let { specialNumber: destructuringAttempt } = StarImport;

setInterval(() => {
  console.log(aliasAttempt, destructuringAttempt);
}, 400);
Here, destructuringAttempt will always give "2" (the value it had when it was destructured), whereas aliasAttempt keeps changing. For example:
0.3600619094195876 2
0.33268826082163194 2
0.20684912705131553 2
0.8665522020482055 2
0.9349778920742413 2
It looks like destructuringAttempt was copied by value during destructuring, whereas aliasAttempt keeps a reference to the let specialNumber binding.
(This behavior of destructuringAttempt is expected, as let { specialNumber: destructuringAttempt } = StarImport; is just the same as let destructuringAttempt = StarImport.specialNumber; i.e. it just copies the number once.)
So maybe the reason was that, when the export is a non-const value, 'aliasing' gives a different result ("keep a reference to the other variable") from typical destructuring behavior ("copy once"), and therefore it is better to distinguish the syntaxes.
They do different things:
Destructuring is a form of property access, usually combined with a variable declaration like const, and possibly even a default initialiser
Importing a variable does declare an alias for a binding in the module scope
The important difference is that destructuring does run code (it tests object coercibility and executes getters) while import declarations are fully declarative (setting up dependencies between modules, enabling cross-module hoisting even before the variables are initialised). This makes the latter statically analysable. Import aliases also have no concept of nested objects.
Both do allow "renaming", but they use different syntax for different things - it would've been too confusing otherwise. That their shorthand forms are similar to each other is mostly coincidental, caused by both using braces.
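A quick sketch of the difference (the config object below is made up for illustration):

const config = { settings: { theme: 'dark' } };                    // stand-in object
const { settings: { theme: themeName = 'light' } = {} } = config;  // destructuring: nesting, defaults, runs at runtime
import { original_name as alias_name } from 'my_module';           // import alias: flat names only, resolved statically

Nothing like the nested, defaulted destructuring line can be expressed in an import specifier; the import grammar only allows importedName as localName.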
This question already has answers here:
Use of 'prototype' vs. 'this' in JavaScript?
(15 answers)
Closed 5 years ago.
I have been doing some exercises on the interesting "level up your coding" site, Exercism.io and they use the CommonJS style modules for their code samples and testing with Jasmine. I've always thought modules were a hassle I didn't want to deal with, but in these bite size chunks, they look like they could be very useful to begin using in my Single Page Applications. So I've been Googling around and searching Github for some good samples for using CommonJS Modules - and still haven't found one that explains in detail what the main patterns are, and how they differ. For example, one answer I submitted looked like this:
var HelloWorld = function () {};

HelloWorld.prototype.hello = function () {
  return 'Hello, World!';
};

module.exports = HelloWorld;
But another one looked like this
var Bob = function () {
  this.hey = function (input) {
    input = input.split('');
    if (input.indexOf('!') >= 0) { return 'Whoa, chill out!'; }
    if (input.indexOf('?') >= 0) { return 'Sure.'; }
    return 'Whatever.';
  };
};

module.exports = Bob;
Specifically, I'm wondering what the difference is between nesting a function inside the parent definition, as done with Bob's hey() function, and attaching it to the prototype, as done with HelloWorld's hello().
To start with, the two functions you gave as examples are quite different from each other and serve different purposes.
Going by the code in your example, the way you would call them also differs, and it is easy to get wrong.
For your Bob example, all you are doing is assigning a constructor function to a variable. You can call it as Bob(), or as new Bob() to get an instance. If you do Bob.hey() you are going to get an error, because hey only exists on instances created by the constructor.
HelloWorld, on the other hand, is not just a function. Well, it is: since you declared it as an empty function, that is what runs if you call HelloWorld(). However, you defined hello on its prototype, so to invoke it directly you would have to do HelloWorld.prototype.hello() or, more commonly, new HelloWorld().hello(). I reckon prototypes are used mainly to alter or extend the behaviour of existing objects and constructors.
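To make the contrast concrete, here is a small sketch (not taken from the original exercises) showing how each style is used and what it implies:

var HelloWorld = function () {};
HelloWorld.prototype.hello = function () {
  return 'Hello, World!';
};

var Bob = function () {
  this.hey = function () {
    return 'Whatever.';
  };
};

new HelloWorld().hello(); // 'Hello, World!' - hello is defined once on the prototype and shared by all instances
new Bob().hey();          // 'Whatever.'     - hey is recreated as a fresh closure for every single instance

The practical difference: prototype methods cost one function object in total, while per-instance methods cost one per instance but can capture variables that are private to the constructor.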
You are asking what the most efficient ways of writing modules are, but in reality there is no right answer to that. All a module is is a piece of code that can be exported and reused by other files. It can be a function, an object, a simple variable, whatever you'd like!
So basically you can do all these:
// moduleThatExportsANumber.js
module.exports = 1

// moduleThatExportsAnObject.js
module.exports = {}

// moduleThatExportsAFunction.js
module.exports = function () { return 'say something!' }

// main.js - let's call all the modules!
// ("function" is a reserved word, so the third one needs a different variable name)
const number = require('./moduleThatExportsANumber')
const object = require('./moduleThatExportsAnObject')
const greet = require('./moduleThatExportsAFunction')

console.log(number)  // 1
console.log(object)  // {}
console.log(greet)   // function () { return 'say something!' }
console.log(greet()) // say something!
The whole idea of modules is simply writing stuff in a file, exporting that stuff (which in the case of CommonJS is done through module.exports = [whatever you are exporting]), and later importing it (which for CommonJS is done with require('./filename')).
Now... Going back to the original thing that was asked, your cheatsheet. I don't know any CommonJS ones unfortunately, however here is a decent blog post about the CommonJS module system and here is a JavaScript one that you may like as well.
In JavaScript, specifically in a Node.js setting, one can write module.exports = 13; in module.js, then x = require("./module.js"); elsewhere, and have 13 assigned to x directly.
This saves some code when a module exports a single function, and I notice a lot of widely used packages (such as through2) make use of it.
Is there a way to do the same in Python? With some black magic, maybe?
I have heard of a thing called a loader that is, I guess, supposed to do some manipulation of a module before making it available. In particular, I think SaltStack makes use of something like that in salt.loader, but the code is too hard for me to follow. I imagine we could write a function similar to this:
def loader(module):
    m = __import__(module)
    return m["__exports__"]
— Then define __exports__ somewhere in a module we want to import and enjoy functionality very similar to JavaScript's module.exports mechanics. But unfortunately TypeError: 'module' object has no attribute '__getitem__' prevents us from doing that.
Python has importing built into the language at a more basic level than JavaScript does; almost all use cases are covered by a simple import statement.
For your example, all it really boils down to is:
from module import exports as x
So there's no need to look for code savings by changing how modules work.
The other part of the question is how, as a module author, you would restrict people to seeing only a single symbol.
Generally this is not required, except to help users tell public functions from implementation details. Python has a few common idioms for this:
Any names that start with a leading underscore, such as _helper, are considered private. They can be accessed as normal, but the implication is you should not.
If a module-level variable __all__ = [...] exists, only the strings it contains are considered public. The names must separately be declared in the module.
As well as being documentation, both of these do affect one aspect of the module import:
from module import *
Using a star import is generally discouraged, but only public names will be brought in to the local namespace.
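As a small sketch of both idioms together (module and function names invented for illustration):

# mymodule.py
__all__ = ['public_helper']   # only this name is picked up by a star import

def public_helper():
    return 13

def _private_helper():        # leading underscore: treated as an implementation detail
    return 31

# elsewhere
from mymodule import *        # brings in public_helper, but not _private_helper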
After some thinking I understood that, while we can't write m["__exports__"], because the module object's class has no __getitem__ method, we can still access the module's elements with "dot" notation: m.__exports__ works.
Another way: screen all module-level names off with a leading underscore, assign the object to be exported to a variable named after the module, and then use from module import *.
loader.py:

def load(module):
    m = __import__(module)
    return m.__exports__

exports.py:

def _f():
    return 13

_a = 31

exports = {"p6": _f, "p8": _a}
__exports__ = exports
Python 2.7:
>>> import loader
>>> e = loader.load ("exports")
>>> e
{'p8': 31, 'p6': <function _f at 0x7fb79d494cf8>}
>>> from exports import *
>>> exports
{'p8': 31, 'p6': <function _f at 0x7fb79d494cf8>}
Python 3:
>>> import loader
>>> e = loader.load ("exports")
>>> e
{'p6': <function _f at 0x7f88ae229ae8>, 'p8': 31}
>>> from exports import *
>>> exports
{'p6': <function _f at 0x7f88ae229ae8>, 'p8': 31}
In the first way proposed, I unfortunately cannot use __all__ in loader.load to filter only the listed names from the module being loaded, since __getitem__ is not defined for the module object.
In the second way proposed I don't get as much control (in that a malicious module can export arbitrary names and manipulate my namespace) or flexibility (in that I cannot assign the module's exported object to an arbitrary name anywhere in my code).
So, there is still a bit left to be desired here.
This question already has an answer here:
Is there an efficiency difference between const vs var while 'requiring' a module in NodeJS [closed]
(1 answer)
Closed 7 years ago.
Using ES6 and Node.js, what is the recommended way to require packages: let or const?
let _ = require('underscore');
or
const _ = require('underscore');
Unless you plan on ever reassigning the package variable within the scope of your file (or wherever you're requiring it), it's probably best to use const: this will protect against accidental reassignment of the package variable.
For instance:
const _ = require('underscore');
// ...
let [a, _] = foo; // SyntaxError: Identifier '_' has already been declared
Since we're talking about ES6, const, and require, it makes sense to bring up import statements as well, which for the most part can be thought of as a more flexible version of require. [1]
import _ from 'underscore';
// ...
let [a, _] = foo; // SyntaxError: Identifier '_' has already been declared
ES6 import bindings are immutable, much like const, so they similarly prevent redeclaration and reassignment.
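To show the protection against reassignment itself, here is a minimal sketch (exact error messages vary by engine):

// in a CommonJS file
const _ = require('underscore');
_ = null; // TypeError: Assignment to constant variable.

// in an ES module
import _ from 'underscore';
_ = null; // the imported binding is likewise immutable, so this also throws at runtime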
So, when would you want to use let for a require? Let's say (no pun intended) you want to use a special version of a package in certain environments.
let secureLibrary = require('secureLibrary');

// override all security in dev mode
if (process.env['NODE_ENV'] === 'development') {
  secureLibrary = secureLibrary.fake;
}
In this contrived example, during development, your use of secureLibrary will be replaced by a fake one, presumably ignoring self-signed SSL certificates or some other convenience that would be unsuitable for production.
In summary: most of the time use const but occasionally let provides necessary flexibility, and consider using import if you're already using ES6!
[1] Please note: under the hood there are many more differences between ES6 import and CommonJS require; see Using Node.js require vs. ES6 import/export and http://www.2ality.com/2014/09/es6-modules-final.html for many more gory details.
I have variables in app.js:
var G = {};
module.exports = G;
var DATA = G.DATA = 'DATA';
var F1 = G.F1 = function(val)
{
  return val;
};
In this manner, I can export variables under the object G and, at the same time, access each variable directly by writing DATA without the G. prefix.
So far so good.
Now, I want to run a test for app.js in test.js:
var G = require('./app.js');
console.log(G.DATA); // -> DATA
This works, but I also want to access the variable directly by writing DATA without the G. prefix, like console.log(DATA); // -> DATA
Surely, I could write
var DATA = G.DATA; for every variable (property) of the exported and required module object G, but obviously it's a tedious process to add every variable to the test file manually so that it corresponds to the G object.
Is there any way to do this automatically?
So far, I'm pessimistic, since
a JS function encloses var declarations in its own scope, so in theory there's no way to have a helper function declare a var for every object property.
Thanks.
PS. I would like to avoid any eval or Node VM solution. I have tried them in the past, and they caused too many problems.
I could assign a local variable for every property of the exported and required module object G, but obviously it's a tedious process to add every variable to the test file manually.
No, that's how it is supposed to work. You - and only you - are in charge of what local variables exist in your module scope. No changes in the "export variables" of an included module should break your code.
Accessing properties on the imported module (with a self-chosen name) is the way to go. This is quite equivalent to Python's import app or import app as g.
If you want some particular properties as local variables, you will usually choose them manually, as in Python's from app import DATA, F1. In JS, you will need a multiple var statement like the one you've shown in your question. However, there is a syntax feature called destructuring assignment which will make this more fluid. You can use this in JavaScript 1.7+ (Gecko), CoffeeScript, or EcmaScript 6:
var {DATA, F1} = require("./app.js");
Is there any way to do this automatically?
Yes and No. You should not do this, but you can - just like Python's frowned-upon from app import *. To cite what they say, which is equally true for JavaScript:
[It] introduces an unknown set of names into the interpreter, possibly hiding some things you have already defined.
Note that in general the practice of importing * from a module or package is frowned upon, since it often causes poorly readable code. However, it is okay to use it to save typing in interactive sessions.
In JavaScript, you can[1] use the with-statement:
with (require("app.js")) {
…
}
[1]: Not in ES5.1 strict mode, though - which is recommended for optimisation