I am building a tool of my own to transcompile and pack related JS files (written in ES6) into a bundle. So far it works as expected with local files, but public modules, for example react and redux, are a different story, and I am wondering how to include these modules in the bundle. I found that most public modules have a dist folder containing distributed versions. So, is a dist folder always available in any module's directory?
Webpack uses the same module resolution as Node.js. Modules in node_modules have a package.json with a main field, which determines which file is imported when you import the module in your code. Additionally, webpack looks for the browser and module fields in package.json and prefers them over main if they are present. This makes it easy to publish a build that is different from the regular Node.js build (for instance, to use ES modules (import/export), which are not yet supported by Node.js but are supported by bundlers like webpack). This behaviour can be configured with the resolve.mainFields option. For an example, have a look at the package.json of Redux.
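As a rough sketch, a package.json using those fields might look like this (the file paths are illustrative, not Redux's actual ones):

```json
{
  "name": "some-library",
  "main": "lib/index.js",
  "module": "es/index.js",
  "browser": "dist/some-library.umd.js"
}
```

Under webpack's default mainFields for the web target, the lookup order is browser, then module, then main.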
None of these fields are mandatory, but at least main is supposed to be present, so you can simply import a module with:
import module from 'module';
Or with require:
const module = require('module');
Webpack automatically includes the modules you import into the bundle.
The dist directory is not special in any way, but it's very common to have a dist directory that contains a UMD build. This matters especially because Unpkg allows you to import a node module without having to publish it manually to a CDN; it uses the dist or umd directory by default (as described at the bottom of its homepage).
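For instance, with such a dist build published, the package can be loaded straight from Unpkg with a plain script tag (the exact path depends on the package):

```html
<script src="https://unpkg.com/redux/dist/redux.min.js"></script>
```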
Related
In the documentation for Node's native support of ECMAScript modules, they state
There are three types of specifiers:
...
Bare specifiers like 'some-package' or 'some-package/shuffle'. They can refer to the main entry point of a package by the package name, or a specific feature module within a package prefixed by the package name as per the examples respectively. Including the file extension is only necessary for packages without an "exports" field.
...
The definition of bare specifiers indicates that you can import ECMAScript modules from "packages".
What's considered a "package" in Node.js? Does node just search the entire node_modules folder for any folder with a package.json file and consider that a package? Or is it more complicated than that? Is it the same for CommonJS modules and ECMAScript modules?
What's considered a "package" in Node.js?
In the sense of "bare" named packages for ESM: any name that either matches a Node builtin module, or exists as node_modules/{{ bare name }} in the current directory or subsequently any parent directory, and has a package.json that loads and has the correct name field.
Does node just search the entire node_modules folder for any folder with a package.json file and consider that a package? Or is it more complicated than that?
Basically, node doesn't care about anything you haven't named in an import. But yes, it's more complicated than that; the algorithm is documented further down on the ESM modules page.
Is it the same for CommonJS modules and ECMAScript modules?
No, although the part that resolves a package's exports is shared once a "package" is located and vetted by the ESM or CommonJS rules. The big differences are:
Global node_modules directories are not considered in ESM (i.e. no traversing of $NODE_PATH, $HOME, or the node $PREFIX).
Outside of the process for loading this subset of "bare names" there are more differences:
ESM modules or mjs files will not load via require.
No native or JSON imports
Generally the ESM loader is a much stricter subset of CommonJS, as anything Node-specific doesn't apply.
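For illustration, the shared exports resolution both loaders use is driven by a package.json "exports" field along these lines (a sketch; paths and conditions are illustrative):

```json
{
  "name": "some-package",
  "exports": {
    ".": {
      "import": "./esm/index.mjs",
      "require": "./cjs/index.js"
    },
    "./shuffle": "./esm/shuffle.mjs"
  }
}
```

Here import resolves to the ESM file and require to the CommonJS one, and "./shuffle" is the kind of subpath that the bare-specifier example 'some-package/shuffle' refers to.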
The standard approach when packaging code for npm is to place transpiled or bundled code in a separate lib, dist or build directory and to set the package.main property (and module/browser) to point to the built files, which makes the existence of the build directory transparent when doing bare imports (without a path). However, if deep imports are needed (importing specific modules not referenced in package.json properties), it means that the build directory needs to be included in the path like so: require('package/lib/some-module'), which looks bad.
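Concretely, the standard layout described above corresponds to a package.json along these lines (file names illustrative):

```json
{
  "name": "package",
  "main": "lib/index.js",
  "module": "lib/index.esm.js"
}
```

With this, require('package') transparently resolves to lib/index.js, but a deep import still has to spell out the build directory: require('package/lib/some-module').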
Are there any examples of an approach that would at the same time:
Support deep imports without the build directory in the path
Support npm-link and tools like Lerna
Still place the built files in a separate directory for tidiness
Every approach I've tried or seen fails to fit one of the criteria; for example, copying package.json into the build directory (lib/dist) allows publishing or linking the package from the build directory, but tools like Lerna don't support this approach since they expect the package.json to be in the root of the package instead of the build directory, so only points 1 and 3 are met.
Placing sources in the root of the package with the transpiled or built files next to them allows deep imports and is also supported by Lerna, but it doesn't meet the 3rd criterion (tidiness).
Having import maps would solve this issue, but I'm not sure if any tool supports import maps yet.
The question is: is there some approach that I've missed that could meet all three listed points?
I'm trying to create a library with webpack. My main goal is mostly to publish the ES6 module files so that my client applications can import and tree-shake the library. (It is meant to be a library of icons, so that users can bundle only the ones they use in their app.)
I have the library building a UMD bundle into dist, and I've set the module field in package.json to the file in the src directory that exports all of my icons.
Each icon is a React component. I've imported React and PropTypes to create each one and set them to be externals in the webpack config, however I don't think that matters in this use case because...
I made a test application that npm-linked my library and imported one of my icon components from my module. It's definitely going to the source files, but the webpack build in my application cannot resolve react and prop-types in those modules. It complains about each one in every component.
e.g:
Module not found: Error: Can't resolve 'react' in '/code/svg-icons/src/components'
# ../svg-icons/src/components/AlertIcon.js
# ../svg-icons/src/index.js
# ./src/index.js
I suppose this makes sense, as the AlertIcon module from my library doesn't have a node_modules directory in which to find react.
How do I have other applications provide react, prop-types, and whatever else I want to import in my library's source modules?
For your svg-icons package, you only need to run it through Babel to transpile it to ES5; that's it.
Your application that depends on svg-icons is the one that will be built with webpack, and it will take care of bundling the imported icons into your app bundle.
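A minimal sketch of what that setup could look like in the svg-icons package.json (names and version ranges are illustrative; declaring react and prop-types as peerDependencies matches the externals intent, so the consuming app provides them):

```json
{
  "name": "svg-icons",
  "main": "lib/index.js",
  "module": "src/index.js",
  "scripts": {
    "build": "babel src --out-dir lib"
  },
  "peerDependencies": {
    "react": ">=16",
    "prop-types": "^15.0.0"
  },
  "devDependencies": {
    "@babel/cli": "^7.0.0",
    "@babel/core": "^7.0.0",
    "@babel/preset-env": "^7.0.0",
    "@babel/preset-react": "^7.0.0"
  }
}
```

with a .babelrc along the lines of "presets": ["@babel/preset-env", "@babel/preset-react"].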
There is a great FREE course on the subject by Kent C. Dodds on egghead.io. It will clear everything on the subject for you.
I'm struggling to get my head around how npm manages dependencies - in terms of how they are actually referenced in HTML.
Say I have a specific version of a plugin installed, which includes a version number in its path or file name - if npm is configured to update to a new minor release - the files referenced via script tags will no longer be present.
I've also read that exposing the node_modules path is incorrect and should be avoided.
How then should these files be referenced so that they are loaded and so version updates do not break a site?
The idea is that you use these modules in your code. Let's say you have a main.js file containing your application; then you import modules using import $ from 'jquery'; (this can depend on your configuration; you could also use require). Then use a tool like browserify, which is going to resolve all your dependencies for you and package them into a single file that can then be loaded in your browser.
This is only one setup out of many so this could vary, for example if you use webpack this will be different but the idea is the same, you import what you need into your main.js.
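For example (a sketch; assumes jquery and a bundler are installed locally):

```js
// main.js: import what you need; the bundler resolves it from node_modules
import $ from 'jquery';

$('#app').text('hello from a bundled dependency');

// build the browser file with something like:
//   browserify main.js -t babelify -o bundle.js
// and reference only the output in HTML:
//   <script src="bundle.js"></script>
```

The HTML never points into node_modules, so version updates inside node_modules can't break a script tag; only bundle.js is served.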
npm uses the package.json file as a reference to build the dependency map, and installs all dependencies in the node_modules folder. When you publish an update to your module, you also publish a new version of the package.json file, which will include any modifications to dependencies.
So the short answer is: the package.json file. I hope you can figure things out from this.
I am new to JavaScript development and could do with some advice regarding how best to work with multiple modules of my own creation.
Module1
--src
----a.js
----b.js
Module2
--src
----c.js
----d.js
Module3
--src
----e.js
----f.js
Module4
--src
----g.js
----h.js
Now each module is a collection of js files that use module.exports to export functions that they need to.
In this example, Module1 and Module2 are library modules. Module2 depends on Module1. Both of these modules are platform-agnostic and can be run in a browser or in Node.js.
Module3 is a project module which depends on both Module2 and Module1. It is designed for a browser and uses browserify to combine them all into a bundle.
Module4 is another project module which also depends on Module2 and Module1. This is designed to run under nodejs.
Each module has its own git repo.
Now the problem I am having has to do with relative paths inside require calls. For example, c.js currently does require("../../Module1/src/a.js"); similarly, h.js does require("../../Module2/src/c.js");
This makes the folder structure an absolute pain, and each project has to be cloned from git into exactly the right layout.
The following is what I am trying to achieve.
To do away with relative paths so I can simply do require("ModuleX/src/foo.js"). This needs to work with both browserify and nodejs.
To clone a project module from git and have it fetch all of its dependent modules (perhaps git submodules?), without caring about folder structure, as that should be solved by the point above.
Once a project and its dependent modules have been cloned, to be able to edit each of these modules, make changes, and push them to their individual git repos.
I imagine what I am trying to do is pretty standard: creating my own modules and reusing them in different projects. Am I going about solving it in the standard way? I've read that there are different ways to achieve this using NODE_PATH, which is supported by both node and browserify, but both discourage it. Browserify supports a paths option, but that doesn't work for node. I've also read about putting the modules inside node_modules, but I'm not sure how that would help with relative paths.
Any advice would be greatly appreciated
Thanks
What you probably want to do is commit your reusable code to git (here GitHub) and then npm install <git remote url>. An example would be npm install git+https://isaacs@github.com/npm/npm.git. Your repo needs a package.json and an index.js holding your code or pointing to it. Alternatively, you can define the location of your main file in package.json via the main field.
The dependencies between those projects are defined in the respective package.json files. Upon install, npm does its magic to avoid circular dependencies, handle caching, etc.
Once you've done all that, you will be able to just do var x = require("ModuleX") to get ModuleX/src/foo.js, if you so wish.
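As a sketch, a project module's package.json could then declare the library modules as git dependencies (names and URLs are illustrative; npm prefers lowercase package names):

```json
{
  "name": "module3",
  "main": "src/index.js",
  "dependencies": {
    "module1": "git+https://github.com/you/module1.git",
    "module2": "git+https://github.com/you/module2.git"
  }
}
```

After npm install, c.js can do require('module1') (or require('module1/src/a.js') for a deep path) instead of relative paths, and the same resolution works under browserify.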
Alternatively, use npm's private modules.