I know that npm libraries, when installed, can install multiple versions of the same library in a hierarchical tree, like this:
a#0.1.0
-> b#1.0
-> c#2.0
   -> b#2.0
In the above, package a at version 0.1.0 is pulled in, along with its dependencies b#1.0 and c#2.0. Similarly, c#2.0's own dependency, b#2.0, is pulled in.
I have heard from someone that, even though package b is installed at two different versions, only one of them is actually loaded into memory and used. I have also heard that this may or may not be the case for Node.js deployments of JavaScript versus browser deployments.
So my question is: is it true that only one copy of b is loaded into memory? If it is, is this true for both Node.js and the browser, or are there differences?
Node.js
Multiple versions of a Node module/library can be loaded, depending on where they are loaded from. The module loader is covered in depth in the Node.js documentation.
The require() call caches modules based on the resolved file path.
require('b') from module a will resolve to .../a/node_modules/b
require('b') from module c will resolve to .../a/node_modules/c/node_modules/b
So separate modules will be loaded for the same call. This can be demonstrated with a small example.
Module B - node_modules/b/index.js
module.exports = {
  VERSION: 'b-0.5.0'
}
Module C - node_modules/c/index.js
module.exports = {
  VERSION: 'c-1.0.0',
  BVERSION: require('b').VERSION,
}
Module C's copy of B - node_modules/c/node_modules/b/index.js
module.exports = {
  VERSION: 'b-9.8.7',
}
Create a program, index.js at the top level, to output the versions.
console.log('b', require('b').VERSION)
console.log('c', require('c').VERSION)
console.log('cb', require('c').BVERSION)
Then output the versions
→node index.js
b b-0.5.0
c c-1.0.0
cb b-9.8.7
So two modules with different paths using require('b').VERSION get a different value.
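You can also see this in Node's module cache, which is keyed by the resolved absolute file path. A quick sketch (the exact paths will differ on your machine):

require('b');
require('c');
// require.cache maps resolved file paths to loaded modules, so each
// copy of b appears under its own path:
console.log(Object.keys(require.cache).filter(p => p.includes('node_modules/b')));
// e.g. [ '/.../node_modules/b/index.js',
//        '/.../node_modules/c/node_modules/b/index.js' ]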
Traversal
It's worth noting that if a Node.js require('b') fails to find a locally installed ./node_modules/b then it will traverse up the directory tree to look for b in parent node_modules directories.
Using the above example case again, but removing a/node_modules/c/node_modules/b.
Module c does a require('b') but can't find .../a/node_modules/c/node_modules/b, so it moves up into the parent node_modules/ and looks for b there. In this case it loads the cached copy of .../a/node_modules/b.
→node index.js
b b-0.5.0
c c-1.0.0
cb b-0.5.0
Browser
Browsers have no concept of an npm module. You need a loader that supports CommonJS modules, like Browserify or Webpack. If the loader doesn't respect CommonJS/Node.js resolution rules, then it's unlikely to work with npm packages.
You should be able to package up the above example with your loader and see how it behaves.
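For example, with Browserify you could bundle the example's entry point and open the result in a page; since Browserify follows Node's resolution algorithm, the browser console should print the same three version strings:

browserify index.js -o bundle.js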
Related
I'm bundling a JS library using Rollup. This lib has a dependency on @tensorflow/tfjs-core.
In tfjs's code, there's a function that fetches a URL. If it's in the browser environment, it uses the global fetch function; if not, it tries to import node-fetch.
Something along these lines:
fetch(path: string, requestInits?: RequestInit): Promise<Response> {
  if (env().global.fetch != null) {
    return env().global.fetch(path, requestInits);
  }
  if (systemFetch == null) {
    systemFetch = require('node-fetch');
  }
  return systemFetch(path, requestInits);
}
My library is made to run in the browser, so it always uses the global fetch function. However, Rollup still bundles node-fetch's require in my lib's assets.
It should not be an issue, but some consumers are reporting errors when using the library in a React project that uses webpack:
Failed to compile.
./node_modules/[my lib]/index.js
Cannot find module: 'node-fetch'. Make sure this package is installed.
You can install this package by running: npm install node-fetch.
Question is: is there some way I can tell Rollup not to bundle this?
I thought about replacing the require('node-fetch') with undefined after the bundle is generated, but it feels like a dirty hack. Any other suggestions?
PS: I believe marking node-fetch as external in consumer projects would fix the issue, but since I do not use node-fetch in my lib, it would be nice to remove it from the final output.
Other package managers can include or exclude files based on the environment: test, development, production, etc.
There are any number of ways of implementing this, even going so far as:
# Makefile
ENVIRONMENT ?= test
ROLLUP = $(shell which rollup)
ENVSUBST = $(shell which envsubst)

rollup.config.js: src/$(ENVIRONMENT)
	$(ENVSUBST) < $< > $@
	$(ROLLUP) --config $@ -o $(ENVIRONMENT).js
If you created files named after your environments, you could compile them using
make ENVIRONMENT=browser
I don't expect my code to work, only to express ideas.
There is a line of code in tfjs's repository which is used to exclude node-fetch from the bundle. You could take a similar approach in your Rollup configuration; if you add that, node-fetch should no longer be part of your minified library.
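For illustration, marking node-fetch as external in your own config could look like this (a minimal sketch; the entry and output paths are placeholders):

// rollup.config.js
export default {
  input: 'src/index.js', // placeholder entry point
  output: {
    file: 'dist/index.js',
    format: 'cjs',
  },
  // Leave require('node-fetch') unresolved instead of bundling it:
  external: ['node-fetch'],
};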
Basically I need to know how an Ember.js parent app can share a package.json dependency (e.g. xyz:3.0.0) with its child engines and addons, without it being declared again in each child engine's and addon's package.json, so that I can reduce the size of the application.
As of now we install the common package dependency in all our child engines and addons even though it is already used in the parent app, which increases the application size.
Here is a clear example of my project scenario.
parentApp (xxx):
Has a package.json containing a few dependencies, e.g. vendor-package1:10.0.0, vendor-package2:4.0.0, Child Engine1 (yyy), Child Engine2 (zzz)
Child Engine1 (yyy):
Has a package.json containing a few dependencies, e.g. vendor-package1:10.0.0, vendor-package2:4.0.0
Child Engine2 (zzz):
Has a package.json containing a few dependencies, e.g. vendor-package1:10.0.0, vendor-package2:4.0.0
So, as you can see, the parent app and the child engines have the same dependencies (vendor-package1:10.0.0, vendor-package2:4.0.0), and I need to run npm install for all three. I add them to all my child engines because they need to be available to the engines.
Because of this, my dist folder contains vendor-package1:10.0.0 and vendor-package2:4.0.0 for the parent app and each engine, which increases the size.
If I add vendor-package1:10.0.0 and vendor-package2:4.0.0 only to my parentApp (xxx), then my child engines cannot access the components inside vendor-package1 and vendor-package2.
Please suggest a solution where I don't have to add the dependency to all my apps.
I've set up a demo Ember 3.12 app at https://github.com/bartocc/so-58343095.
This app depends on ember-concurrency and also has an in-repo addon core that depends on ember-concurrency.
I've also added ember-cli-bundlesize to help analyse the bundle size of the built app.
Here are the results of ember bundlesize:test before and after adding the in-repo addon.
Before
$ git checkout 6c5dfc7
$ ember bundlesize:test
ok 1 - app:javascript: 165.89KB <= 500KB (gzip)
ok 2 - app:css: 40B <= 50KB (gzip)
After
$ git checkout 9c9c9a9
$ ember bundlesize:test
ok 1 - app:javascript: 165.89KB <= 500KB (gzip)
ok 2 - app:css: 40B <= 50KB (gzip)
Bundlesize check was successful. Good job!
As you can see, the bundlesize does not change.
The same goes for an in-repo engine:
With an in-repo engine depending on ember-concurrency
$ git checkout 2662b63
$ ember bundlesize:test
ok 1 - app:javascript: 170.08KB <= 500KB (gzip)
ok 2 - app:css: 40B <= 50KB (gzip)
Bundlesize check was successful. Good job!
The small difference you see between 165.89KB and 170.08KB is made of:
the ember-engines modules:
;define("ember-engines/-private/engine-ext")
;define("ember-engines/-private/engine-instance-ext")
;define("ember-engines/-private/route-ext")
;define("ember-engines/-private/router-ext")
;define("ember-engines/components/link-to-component")
;define("ember-engines/components/link-to-external-component")
;define("ember-engines/engine")
;define("ember-engines/initializers/engines")
;define("ember-engines/routes")
the my-engine modules:
;define("my-engine/config/environment")
;define("my-engine/engine")
;define("my-engine/resolver")
;define("my-engine/routes")
;define("my-engine/templates/application")
And finally the ember-concurrency modules aliased to be available inside the my-engine resolver:
;define.alias("ember-concurrency/helpers/cancel-all", "my-engine/helpers/cancel-all");
;define.alias("ember-concurrency/helpers/perform", "my-engine/helpers/perform");
;define.alias("ember-concurrency/helpers/task", "my-engine/helpers/task");
;define.alias("ember-concurrency/initializers/ember-concurrency", "my-engine/initializers/ember-concurrency");
YMMV depending on which addons you use, but you can use this demo app as a starting point to check whether some code is really duplicated or not.
Hope this helps
Update in answer to updated question 11/12/19
The short answer is you cannot make each of your dependencies only part of your app's package.json. What you have, where you specify each dependency in each addon's and app's package.json, is correct.
npm will only install one version of each package. It does some complicated 'magic' to resolve which version gets used. If you think about it, this is the only thing that could happen, because JS in the browser can only have one version of each library available, since things are made available globally on the window. Your app would have no way of telling the difference between version x.x.x of a library and x.x.y of the same library, because that library is always exposed under the same global name, e.g., Ember.
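A contrived sketch of that last point (illustrative only, not how Ember actually loads):

// Two versions of the "same" library attach to the same global name:
window.Ember = { VERSION: '3.11.0' };
window.Ember = { VERSION: '3.12.0' }; // overwrites the first copy
// Every consumer sees only whichever copy loaded last:
console.log(window.Ember.VERSION); // '3.12.0'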
Original answer
We have had some success reducing installed packages by using lerna and a mono repo: https://github.com/lerna/lerna. Note, however, that we have not had any success using the lerna commands. Instead we simply run npm i in each addon/app. The order in which npm i is run is critical: you must start with the base of your tree, i.e., start with the addon that does not consume any of your other addons/apps, and move your way up.
Our mono repo contains three ember applications and two addons:
addon-1
addon-2
- consumes addon-1
app-1
- consumes addon-1
app-2
- consumes addon-1 and addon-2
app-3
- consumes addon-1 and addon-2
In the structure above, npm i must be run in this order: addon-1, addon-2 and app-1, app-2 and app-3.
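In shell terms that order looks something like this (a sketch; directory names assumed to match the layout above):

(cd addon-1 && npm i)
(cd addon-2 && npm i)
(cd app-1 && npm i)
(cd app-2 && npm i)
(cd app-3 && npm i)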
We have experimented with different ways of including the packages in the package.json of each addon/app. It's come down to this (example for app-2):
"devDependencies": {
"addon-1": "file:../addon-1",
"addon-2": "file:../addon-2",
},
file: allows you to reference a module in your mono repo using a relative path.
Atom exposes some global APIs that you can access from require('atom')
How does this functionally work? Atom packages don't explicitly have atom as a dependency, yet they can still do this. Moreover, how can I do this in my own Electron application with my own global package?
I've gone through and analyzed Atom's source myself to determine how this happens, and this is what I've come up with.
Atom packages are required using the normal node require. However, according to the apm readme:
The other major difference is that Atom packages are installed to
~/.atom/packages instead of a local node_modules folder...
So the require('atom') package isn't retrieved from a parent node_modules directory like normal node modules. Instead, Atom overrides the module loader to change the behavior a bit.
More specifically, they override Module._resolveFilename like so:
Module = require 'module'

Module._resolveFilename = (relativePath, parentModule) ->
  resolvedPath = resolveModulePath(relativePath, parentModule)
  resolvedPath ?= resolveFilePath(relativePath, parentModule)
  resolvedPath ? originalResolveFilename(relativePath, parentModule)
It attempts to resolve the path of a module with its own module cache logic before falling back to the normal behavior. As far as I can tell, this is done for a couple of reasons:
It lets them hardcode the path of builtin modules like 'atom', even though the normal behavior would never have found it.
It prevents loading package dependencies twice when packages have the same dependency with compatible versions. If packageA loads lodash#4.x.x and later packageB attempts to load lodash#>=3, then Atom steps in and gives packageB the lodash that packageA loaded.
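To replicate this in your own Electron app, you can patch Module._resolveFilename before any packages are loaded. A minimal sketch in plain Node.js JavaScript (the 'myapp' module name and lib/ path are hypothetical, not Atom's actual code):

const Module = require('module');
const path = require('path');

const originalResolveFilename = Module._resolveFilename;
Module._resolveFilename = function (request, ...args) {
  // Hardcode the path for a hypothetical built-in 'myapp' module so
  // packages can require('myapp') without declaring it as a dependency.
  if (request === 'myapp') {
    return path.join(__dirname, 'lib', 'myapp.js');
  }
  return originalResolveFilename.call(this, request, ...args);
};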
I'm writing a javascript library that contains a core module and several
optional submodules which extend the core module. My target is the browser
environment (using Browserify), where I expect a user of my module will only
want to use some of my optional submodules and not have to download the rest to
the client--much like custom builds work in lodash.
The way I imagine this working:
// Require the core library
var Tasks = require('mymodule');
// We need yaks
require('mymodule/yaks');
// We need razors
require('mymodule/razors');
var tasks = new Tasks(); // Core mymodule functionality
var yak = tasks.find_yak(); // Provided by mymodule/yaks
tasks.shave(yak); // Provided by mymodule/razors
Now, imagine that the mymodule/* namespace has tens of these submodules. The
user of the mymodule library only needs to incur the bandwidth cost of the
submodules that she uses, but there's no need for an offline build process like
lodash uses: a tool like Browserify solves the dependency graph for us and
only includes the required code.
Is it possible to package something this way using Node/npm? Am I delusional?
Update: An answer over here seems to suggest that this is possible, but I can't figure out from the npm documentation how to actually structure the files and package.json.
Say that I have these files:
./lib/mymodule.js
./lib/yaks.js
./lib/razors.js
./lib/sharks.js
./lib/jets.js
In my package.json, I'll have:
"main": "./lib/mymodule.js"
But how will node know about the other files under ./lib/?
It's simpler than it seems -- when you require a package by its name, you get the "main" file. So require('mymodule') returns "./lib/mymodule.js" (per your package.json "main" prop). To require optional submodules directly, simply require them via their file path.
So to get the yaks submodule: require('mymodule/lib/yaks'). If you wanted to do require('mymodule/yaks') you would need to either change your file structure to match that (move yaks.js to the root folder) or do something tricky where there's a yaks.js at the root and it just does something like: module.exports = require('./lib/yaks');.
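So the root-level stub would be just this, one such file per submodule you want to expose at the package root:

// yaks.js -- at the package root, next to package.json
module.exports = require('./lib/yaks');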
Good luck with this yak lib. Sounds hairy :)
So I've come across an interesting use case where I'm using Browserify to bundle all of my assets together in a project, but a large external (external to the project) module needs to be loaded in when a certain in-app window is accessed. (It's a video player module made up of three scripts that get pulled in asynchronously when required.)
At the moment I'm getting all kinds of errors, from uncalled object errors if the RequireJS module is loaded in before the Browserified app.js file, to cannot find module errors if it's loaded in after the Browserified code.
Is there any way I can get Browserify and RequireJS to play nicely on the same page? I'm losing my mind!
TL;DR - use window.require in your browserified script.
Maybe this will still help someone.
I happen to use an 'external' dojo-based library totally based on RequireJS-style AMD, which is absolutely un-"browserifyable" and un-convertible to CommonJS or anything sane. My own code is totally Browserified. It's working OK like this:
Load the AMD loader (which defines the global require function) before the browserified script:
<script src="dojo/dojo.js"></script> <!-- RequireJS/AMD loader, defining a global 'require'-->
<script src="app/main.js"></script> <!-- The Browserify bundle -->
In your own JS, call the require function on the window object ('cause you'll have a local browserify require shadowing the global one):
window.require(['dojo/dojo'], function (dojo) { ... });
The 'external' app or library, which calls require on its own to load submodules etc., works just fine 'cause that code is outside the browserify context and the global require is not shadowed there.
Maybe if you have some pure nice standard RequireJS modules, you could somehow convert them to be Browserifiable, but in my case that wasn't an option.
There is a tool called browserify-derequire that resolves this issue by renaming browserify's require statements to avoid the naming collision.
It can be installed with npm using:
npm install -g browserify-derequire
Use it as a browserify plugin by changing your build command to:
browserify src/*.js -p browserify-derequire > module.js
There is more discussion on this issue at: https://github.com/substack/node-browserify/issues/790
For a gulp-friendly solution (similar to what @Cride5 proposed) you can use the gulp-derequire plugin.
Basic example from docs:
var gulp = require('gulp');
var derequire = require('gulp-derequire');
var browserify = require('browserify');
var source = require('vinyl-source-stream');

gulp.task('build', function() {
  var bundleStream = browserify({entries: './index.js', standalone: 'yourModule'}).bundle();
  return bundleStream
    .pipe(source('yourModule.js'))
    .pipe(derequire())
    .pipe(gulp.dest('./build'));
});
The plugin is also based on the derequire module, so all its options are supported.