How to explicitly share files that are part of an npm package? [duplicate]

I would like to publish an npm package that contains my source as well as distribution files. My GitHub repository contains a src folder with the JavaScript source files. The build process generates a dist folder containing the distribution files. Of course, the dist folder is not checked into the GitHub repository.
How do I publish the package so that when someone runs npm install, they get the src as well as the dist folder? Currently, running npm publish from my Git repository results in only the src folder being published.
My package.json file looks like this:
{
  "name": "join-js",
  "version": "0.0.1",
  "homepage": "https://github.com/archfirst/joinjs",
  "repository": {
    "type": "git",
    "url": "https://github.com/archfirst/joinjs.git"
  },
  "main": "dist/index.js",
  "scripts": {
    "test": "gulp",
    "build": "gulp build",
    "prepublish": "npm run build"
  },
  "dependencies": {
    ...
  },
  "devDependencies": {
    ...
  }
}

When you run npm publish, if you don't have an .npmignore file, npm will fall back to your .gitignore file (and in your case that excludes the dist folder).
To solve your problem, create an .npmignore file based on your .gitignore file, but without ignoring the dist folder.
Source: Keeping files out of your Package
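For example, a hypothetical minimal pair might look like this; the only difference is that .npmignore omits the dist entry so the build output gets published (npm always excludes node_modules by itself):
# .gitignore
node_modules
dist
# .npmignore
node_modules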

Take a look at the "files" field of the package.json file:
package.json, files
From the documentation:
The "files" field is an array of files to include in your project. If you name a folder in the array, then it will also include the files inside that folder. (Unless they would be ignored by another rule.)

Minimal example of how to use data files from a script
Another common use case is to have data files that your scripts need to use.
This can be done easily by using the techniques mentioned at: How can I get the path of a module I have loaded via require that is *not* mine (i.e. in some node_module)
The full example can be found at:
Source: cirosantilli/linux-kernel-module-cheat/npm/data-files/
Published: cirosantilli-data-files
With this setup, the file mydata.txt gets put into node_modules/cirosantilli-data-files/mydata.txt after installation, because we added it to the files entry of package.json.
Our function myfunc can then locate that file and use its contents via require.resolve. It also just works from the executable ./cirosantilli-data-files, of course.
package.json
{
  "bin": {
    "cirosantilli-data-files": "cirosantilli-data-files"
  },
  "license": "MIT",
  "files": [
    "cirosantilli-data-files",
    "mydata.txt",
    "index.js"
  ],
  "name": "cirosantilli-data-files",
  "repository": "cirosantilli/linux-kernel-module-cheat",
  "version": "0.1.0"
}
mydata.txt
hello world
index.js
const fs = require('fs');
const path = require('path');

function myfunc() {
  // Resolve the installed location of this package, then read the data
  // file relative to it.
  const package_path = path.dirname(require.resolve(
    path.join('cirosantilli-data-files', 'package.json')));
  return fs.readFileSync(path.join(package_path, 'mydata.txt'), 'utf-8');
}

exports.myfunc = myfunc;
cirosantilli-data-files
#!/usr/bin/env node
const cirosantilli_data_files = require('cirosantilli-data-files');
console.log(cirosantilli_data_files.myfunc());
The is-installed-globally package is then useful if you want to generate paths to the distributed files that differ depending on whether the package is installed locally or globally: How to tell if an npm package was installed globally or locally

Just don't mention src and dist inside the .npmignore file, and you'll get both src and dist inside node_modules. That's it.
Another point: if there is a .gitignore file and .npmignore is missing, .gitignore's contents will be used instead, so adding an .npmignore stops the dist entry in .gitignore from applying.

Related

dotenv process.env variable undefined in globally installed custom CLI tool

I'm migrating one of my CLI tools to a global installation so that it can be installed globally and used anywhere on my system. Most of my src files include require('dotenv').config() at the top, but for some reason the env is undefined now that the tool is installed globally.
What am I missing?
My package.json looks like:
{
  "name": "scoop",
  "version": "1.9.0",
  "main": "bin/scoop.js",
  "dependencies": {
    "axios": "0.20.0",
    "cli-spinners": "2.4.0",
    "commander": "6.1.0",
    "dotenv": "8.2.0",
    "log-symbols": "4.0.0",
    "ora": "5.1.0",
    "readline": "1.3.0"
  },
  "bin": {
    "scoop": "./bin/scoop.js"
  }
}
bin/scoop.js then contains the following at the top:
#!/usr/bin/env node
require('dotenv').config();
const forms = require('../src/utils/LocateForms');
...
And I'm loading in additional JS src files that are exported. I've got an .env in my project, but my custom variables just come up as undefined now.
I think it's expected behaviour.
The default path value for dotenv is Default: path.resolve(process.cwd(), '.env') according to the GitHub readme.
Now, process.cwd() is the directory you launch the process from, not the directory containing the script. If you cd into /a/b and run node c.js, the cwd is /a/b; if you run the very same script from /tmp as node /a/b/c.js, the cwd is /tmp. A globally installed CLI can be invoked from anywhere, so dotenv looks for .env in whatever directory you happen to be in.
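You can see this with a one-line script (the file name cwd.js is just for illustration):
// cwd.js
console.log(process.cwd());
$ cd /a/b && node cwd.js        # prints /a/b
$ cd /tmp && node /a/b/cwd.js   # prints /tmp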
So, in order to pick up the .env file that you want, either store the .env file in a well-known location like ~/.yourenv, as most other executables do (think .bashrc),
or try to determine the installation folder and load the .env file using an absolute path.
For example you can try to import npm and get the prefix to find out the installation folder.
var npm = require("npm")
npm.load({}, function (er) {
  if (er) return handleError(er)
  console.log(npm.get('prefix'));
})
Or, you can use some package like https://www.npmjs.com/package/get-installed-path
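Alternatively, since you control the CLI entry point, a minimal sketch (assuming the .env file is published with the package, next to bin/) is to give dotenv an explicit path anchored on __dirname, which always points at the installed script's directory rather than the caller's cwd:
#!/usr/bin/env node
// bin/scoop.js -- load .env from the package root, not from process.cwd()
const path = require('path');
require('dotenv').config({ path: path.join(__dirname, '..', '.env') });
dotenv's config() accepts a path option, so this works no matter where the command is invoked from (as long as the .env file actually ships with the package).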

Using node pkg to create executable with npm config

I'm using pkg to create an executable for my node js application. This works great. However, I'm also using the config module to load yaml config files based on the environment. When packaging the app using pkg, I'm specifying that the config folder should be included.
"pkg": {
"assets": [
"config/*"
]
}
When I run pkg . --debug, I can see that the config files are being packaged up. However, if I then rename the config folder, delete it, or just move the newly packaged exe to a different folder, it says No configurations found in configuration directory:C:\Users\me\folderwithexe\config (this config directory doesn't exist because I moved the exe to a new folder).
From what I can tell, the config module appears to be looking for the config folder relative to where the exe is being executed. It's not reading it from the packaged exe file, even though it's in there. So if you were to run this exe on another computer (which is the plan), it would look for the config folder outside of the exe. None of the other modules appear to have this problem; it's just this config module.
Any idea how I can get the pkg module and the config module to work together?
Here is my full package.json
{
  "name": "my-app",
  "version": "1.0.0",
  "description": "",
  "main": "app.js",
  "author": "Me",
  "license": "UNLICENSED",
  "dependencies": {
    "config": "^3.3.1",
    "js-yaml": "^3.14.0"
  },
  "bin": "app.js",
  "pkg": {
    "assets": [
      "config/*"
    ]
  }
}
pkg will bundle every require'd dependency and every asset or script that it finds in the configuration lists (assets and scripts). So, first you need to keep your config files out of the bundle.
To keep pkg away from your config files, you can use a dynamic path that pkg can't resolve statically, for example:
const path = require('path');
const config = require(path.join(__dirname, 'config/config.json'));
At this point pkg doesn't bundle your config file, but if you run the build, you'll notice that the resolved path of config.json is something like /snapshot/config/config.json (https://www.npmjs.com/package/pkg#snapshot-filesystem)
The alternative is to get the config file from the binary's location using process.execPath:
const config = require(path.join(process.execPath, "../", "./config/config.json"));
After that, the executable will load the config file from a path relative to the directory the binary runs in.
I haven't tested this with the config module, but I think that if you remove assets: ["config/*"] from the pkg property of package.json and point the config module at the new path (as above), it will work:
process.env["NODE_CONFIG_DIR"] = path.join(process.execPath, "../", "./config/");
const config = require("config");

running npm run build when installing Babel does not add file to output

I was following along with a video series on Laracasts, and after the Babel setup I encountered this problem:
I created a folder and, inside it, installed Babel via npm on the command line.
I then created a person.js file for the demo to try it out.
After setting up Babel, the package.json file was edited and the following was added:
{
  "devDependencies": {
    "babel-cli": "^6.26.0",
    "babel-preset-env": "^1.6.0"
  },
  "scripts": {
    "build": "babel src -d output"
  }
}
I also added two folders: a src folder and an output folder.
Inside person.js I added:
class Person {
  constructor(name) {
    this.name = name;
  }
}
I then ran this command in the command line:
npm run build
but it doesn't add any files to the src folder or the output folder.
I looked on YouTube and at answers here, but couldn't find a solution.
Solution provided by @loganfsmyth:
Thank you very much, it worked. I had 3 issues: 1. I cleared the npm cache using npm cache clean --force, 2. I moved the file into the src folder as you said, 3. I reinstalled the dependencies using npm install, and it worked perfectly. Thank you.
babel src -d output
means
"Compile the src folder and put the result in the output folder."
Since your person.js file is not in the src folder, the command has nothing to compile.
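Note also that for babel-preset-env (Babel 6) to actually transform anything, the project needs a .babelrc selecting the preset; presumably the tutorial includes one. A minimal sketch:
{
  "presets": ["env"]
}
Without it, Babel still copies the files from src to output, just untransformed.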

How to require a file other than the npm main file in node?

package.json:
...
"name": "mypackage",
"main": "src/index.js"
...
Directory structure:
|- src/
|--- index.js
|--- other.js
I can require src/index.js with require('mypackage');, but how can I require src/other.js?
If the answer is require('mypackage/src/other');, is there a way to make it so I can require it with require('mypackage/other'); (i.e., is there a way to teach Node what the source directory of your module is)?
AFAIK You'll have to explicitly expose it in the root:
Directory structure:
|- src/
|--- index.js
|--- other.js
|- other.js
Then in /other.js
module.exports = require('./src/other.js');
Now you can do require('mypackage/other')
I'm currently looking into the exact same thing.
Package.json has a property called 'files':
http://blog.kewah.com/2014/npm-as-a-front-end-package-manager/
https://docs.npmjs.com/files/package.json
The "files" field is an array of files to include in your project. If you name a folder in the array, then it will also include the files inside that folder.
But I have yet to find out how to import/require such a file.
I don't really see another point in listing these files other than being able to import/require them.
I was able to import a file from a package when it was listed in this files array.
{
  "name": "local-ui-utilities",
  "version": "0.0.1",
  "description": "LOCAL UI Utilities",
  "main": "index.jsx",
  "author": "Norbert de Langen",
  "license": "none",
  "dependencies": {},
  "files": [
    "/colors/sets/variables.css"
  ]
}
I'm able to import the CSS file from the package using postcss-import:
@import "local-ui-utilities/colors/sets/variables.css";
This probably isn't your use case, but postcss-import just uses npm resolution under the hood, so this should work for your use case as well, I would think.
This question and accepted answer seem related:
Node/NPM: Can one npm package expose more than one file?
You'll have to explicitly expose the file in the root folder, but many projects (including older versions of lodash) do this as part of a pre-publish step. In fact, there's a package that does exactly what @Creynders suggests, adding module.exports = require('./path/to/file') files to your root folder. A while back I wrote up a guide on getting started, but the gist is pretty simple.
Install
npm install --save-dev generate-export-aliases
Configure
{
  "name": "my-package",
  "scripts": {
    "prepublish": "generate-export-aliases"
  },
  "config": {
    "exportAliases": {
      "other": "./src/other"
    }
  }
}
Use
const other = require('my-package/other')
DISCLAIMER: I'm the author of the package
Edit
Don't use prepublish anymore. Instead, use prepublishOnly.
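So the Configure step above becomes (only the script name changes):
"scripts": {
  "prepublishOnly": "generate-export-aliases"
}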
My own approach to the solution:
Since no one has a complete answer for the desired behaviour and we can't stand still, the best approach for now is:
Solution 1:
Building on Patrick's solution and his package generate-export-aliases, we can add some npm scripts to automate the whole publish process.
Whether you write your code directly in CommonJS inside the ./src/ subdirectory, or use some shiny new ES features transpiled into ./dist, you will have module files to expose, so add npm scripts:
"scripts": {
"expose": "generate-export-aliases",
"prepublishOnly": "npm run expose",
"postpublish": "git reset --hard HEAD"
}
Or, a safer set of scripts:
"scripts": {
"expose": "generate-export-aliases",
"prepublishOnly": "git ceckout -b prepublish-expose && npm run expose",
"postpublish": "git add . && git stash && git stash drop && git checkout master && git branch -d prepublish-expose"
}
Solution 2: Without generate-export-aliases
You can use the gulp task runner (transpile if needed and put the files directly in the root dir; no need to copy or move them again).
Indeed, this is the exposing step: you can keep the prepublishOnly and postpublish scripts unchanged and just change the expose script. It saves time to build into the root dir directly, as sketched below.
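A minimal sketch of such an expose task (assuming gulp and gulp-babel are installed as devDependencies; the glob and preset are illustrative):
// gulpfile.js
const gulp = require('gulp');
const babel = require('gulp-babel');

// Transpile ./src and write the results straight into the package root,
// so require('my-package/other') resolves without a separate aliasing step.
gulp.task('expose', () =>
  gulp.src('src/*.js')
    .pipe(babel({ presets: ['@babel/preset-env'] }))
    .pipe(gulp.dest('.'))
);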
From Node 12.7.0 there is the exports property of package.json that can help you:
{
  "main": "./main.js",
  "exports": {
    ".": "./main.js",
    "./other": "./src/submodule.js"
  }
}
If you have a lot of submodules and you want to export all the files, you can use a subpath pattern (https://nodejs.org/api/packages.html#subpath-patterns):
// package.json
{
  "exports": {
    "./*": "./src/*.js"
  }
}
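With the first map above, consumers can then do (package name taken from the question):
const other = require('mypackage/other'); // resolves to ./src/submodule.js
and with the subpath pattern, require('mypackage/anything') resolves to ./src/anything.js.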
Just require it as a plain file:
const otherfile = require('./node_modules/other_package/other_file.js');

Running scripts inside multiple node project directories with npm

The Context
I have one main project with multiple node projects inside it as subdirectories, each with its own node_modules directory and package.json file. I want an npm script defined in my main package.json that runs npm scripts from each of those projects concurrently.
The Code
My directory structure is like this:
main/
|- ...
|- package.json
|- sub-project-1/
|--- ...
|--- package.json
|- sub-project-2/
|--- ...
|--- package.json
Sub-project-1/package.json:
...
"scripts": {
  "start": "node foo.js"
}
Sub-project-2/package.json:
...
"scripts": {
  "start": "node bar.js"
}
Main package.json:
...
"scripts": {
  "start": "/* Here I want to do something like: npm sub-project-1/ start && npm sub-project-2/ start */"
}
Now, obviously I could copy and paste the commands in sub-project-1/package.json's start script and sub-project-2/package.json's start script into a bash script and run that instead. But I want to be able to change those npm start scripts without having to manually change the bash script every time. Is there a way to do this?
The following is not concurrent; however, it lets you change the sub-projects' npm start scripts without touching anything else.
I should also mention that it's not the world's most scalable approach. But it's simple, so I thought it might be helpful to share.
Here goes...
Main package.json:
...
"scripts": {
"start": "cd Sub-project-1 && npm run start && cd ../Sub-project-2 && npm run start && cd ../ && <parent project start script here>"
}
With this script, you're just meandering through the sub projects and then coming back up to run the parent project's script.
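If you do want the sub-projects to run concurrently, a sketch using npm's --prefix flag (which runs a script from another directory's package.json) together with the concurrently package (assuming it is installed in the main project with npm install --save-dev concurrently) could look like:
"scripts": {
  "start": "concurrently \"npm --prefix sub-project-1 start\" \"npm --prefix sub-project-2 start\""
}
Each sub-project's start script is still read from its own package.json, so you can change those scripts without touching the main one.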
1. You can also write a Python script that parses all the package.json files of the subdirectories and then generates the master package.json file.
2. Or, you can write a Python script that parses all the package.json files of the subdirectories and then generates a shell script which the master package.json calls in its "start" script.
