I'm using pkg to create an executable for my Node.js application. This works great. However, I'm also using the config module to load YAML config files based on the environment. When packaging the app with pkg, I specify that the config folder should be included:
"pkg": {
"assets": [
"config/*"
]
}
When I run pkg . --debug, I can see that the config files are being packaged. However, if I then rename the config folder, delete it, or just move the newly packaged exe to a different folder, it says No configurations found in configuration directory: C:\Users\me\folderwithexe\config (this config directory doesn't exist because I moved the exe to a new folder).
From what I can tell, the config module appears to be looking for the config folder relative to where the exe is executed, not reading it from inside the packaged exe even though it's in there. So if you were to run this exe on another computer (which is the plan), it would look for the config folder outside of the exe. None of the other modules appear to have this problem; it's just the config module.
Any idea how I can get the pkg module and the config module to work together?
Here is my full package.json
{
"name": "my-app",
"version": "1.0.0",
"description": "",
"main": "app.js",
"author": "Me",
"license": "UNLICENSED",
"dependencies": {
"config": "^3.3.1",
"js-yaml": "^3.14.0",
},
"bin": "app.js",
"pkg": {
"assets": [
"config/*"
]
}
}
pkg will bundle every require'd dependency and every asset or script it finds in the configuration lists (assets and scripts). So first you need to keep your config files out of the bundle.
To keep pkg away from your config files, you can use a variable path that pkg can't statically resolve, for example:
const config = require(path.join(__dirname, 'config/config.json'));
At this point pkg doesn't bundle your config file, but if you run the build, you'll notice that the path of your config.json is something like /snapshot/config/config.json (https://www.npmjs.com/package/pkg#snapshot-filesystem).
The alternative is to get the config file from the binary's path using process.execPath:
const config = require(path.join(process.execPath, "../","./config/config.json"));
After that, the executable will read the config file from a path relative to the directory the binary sits in.
I haven't tested this with the config module, but I think that if you remove assets: ["config/*"] from the pkg property in package.json and point the config module at the new path (as above), it will work:
process.env["NODE_CONFIG_DIR"] = path.join(process.execPath, "../","./config/");
const config = require("config");
Related
Question
Why does importing from a linked local npm package (built as an ES module) using the package name work when the package.json has the "main" property, but fail when it has the "module" property?
Setup
More specifically, if we have the following setup:
exporter: A Node.js package which is an ES module that exports something (e.g., using export default).
importer: A Node.js module that tries to import what exporter exports, using import something from 'exporter'.
Using npm link to locally link exporter to importer.
Then:
The setup runs successfully if exporter's package.json uses the main property.
The setup fails if exporter's package.json uses the module property.
This failure can be "fixed" by using import something from 'exporter/dist/bundle.js', but that's unacceptable.
Why is that? I guess I'm doing something wrong?
For some more info about this setup, see my other question
Code
exporter
|- webpack.config.js
|- package.json
|- /src
|- index.js
|- /dist
|- bundle.js
webpack.config.js:
import path from "path";
import { fileURLToPath } from "url";
const __dirname = path.dirname(fileURLToPath(import.meta.url));
export default {
mode: "development",
entry: "./src/index.js",
output: {
filename: "bundle.js",
path: path.resolve(__dirname, "dist"),
library: {
type: "module",
},
},
experiments: {
outputModule: true,
},
};
package.json:
{
"name": "exporter",
"version": "1.0.0",
"main": "dist/bundle.js", <-- *** NOTE THIS LINE ***
"scripts": {
"build": "webpack"
},
"devDependencies": {
"webpack": "^5.51.1",
"webpack-cli": "^4.8.0"
},
"type": "module"
}
index.js:
function util() {
return "I'm a util!";
}
export default util;
importer
|- package.json
|- /src
|- index.js
package.json
{
"name": "importer",
"version": "1.0.0",
"type": "module"
}
index.js
import util from 'exporter';
console.log(util());
Then:
Linking:
⚡ cd exporter
⚡ npm link
⚡ cd importer
⚡ npm link exporter
Executing:
⚡ node importer.js
I'm a util!
However, if exporter's package.json is changed to:
{
"name": "exporter",
"version": "1.0.0",
"module": "dist/bundle.js", <-- *** NOTE THIS LINE ***
"scripts": {
"build": "webpack"
},
"devDependencies": {
"webpack": "^5.51.1",
"webpack-cli": "^4.8.0"
},
"type": "module"
}
Then:
Executing:
⚡ node importer.js
Fails:
Error [ERR_MODULE_NOT_FOUND]: Cannot find package 'importer\node_modules\exporter\' imported from importer\src\index.js
But Why?
When resolving a node_modules package, Node.js first checks whether there's an "exports" field in the package.json; if there's none, it looks for a "main" field; and if that's also not there, it checks whether there's a file named index.js. That last fallback is deprecated and may be removed in a later version of Node.js, so I would advise against relying on it and always specifying "exports" and/or "main" fields. You can check out the Package entry points section of the Node.js docs for more info on that.
"module" is simply not a field Node.js uses, it's used by some other tools so it's certainly okay to have it defined in your package.json, but you should also have "main" and/or "exports" fields. Node.js will use the file extension to determine if the file is an ES module (dist/bundle.js is using .js as extension and you have "type": "module" in your package.json, so you're all set).
I'm migrating one of my CLI tools over to a global installation so that it can be installed globally and used anywhere on my system. Most of my src files include require('dotenv').config() at the top of them, but for some reason the env is undefined now that it's installed globally.
What am I missing?
My package JSON looks like:
{
"name": "scoop",
"version": "1.9.0",
"main": "bin/scoop.js",
"dependencies": {
"axios": "0.20.0",
"cli-spinners": "2.4.0",
"commander": "6.1.0",
"dotenv": "8.2.0",
"log-symbols": "4.0.0",
"ora": "5.1.0",
"readline": "1.3.0"
},
"bin": {
"scoop": "./bin/scoop.js"
}
}
bin/scoop.js then contains the following at the top:
#!/usr/bin/env node
require('dotenv').config();
const forms = require('../src/utils/LocateForms');
...
And I'm loading in additional JS src files that are exported. I've got an .env in my project, but my custom variables just come up as undefined now.
I think it's expected behaviour.
The default path value for dotenv is Default: path.resolve(process.cwd(), '.env') according to the GitHub readme.
Now process.cwd() returns the directory you launched the process from, not the directory the script lives in, so it changes depending on where you execute the file.
For example, if you cd into /a and run node /a/b/c.js, the cwd is /a; run the same command from /a/b/d and the cwd is /a/b/d.
So, in order to get the .env file that you want, either store the .env file in a common, well-known location like ~/.yourenv the way many other executables do (think .bashrc),
Or, you can try to get the installation folder and get the .env file using an absolute path.
For example you can try to import npm and get the prefix to find out the installation folder.
var npm = require("npm")
npm.load({}, function (er) {
if (er) return handleError(er)
console.log(npm.get('prefix'));
})
Or, you can use some package like https://www.npmjs.com/package/get-installed-path
I would like to publish a npm package that contains my source as well as distribution files. My GitHub repository contains src folder which contains JavaScript source files. The build process generates dist folder that contains the distribution files. Of course, the dist folder is not checked into the GitHub repository.
How do I publish a npm package in a way that when someone does npm install, they get src as well as dist folder? Currently when I run npm publish from my Git repository, it results in only the src folder being published.
My package.json file looks like this:
{
"name": "join-js",
"version": "0.0.1",
"homepage": "https://github.com/archfirst/joinjs",
"repository": {
"type": "git",
"url": "https://github.com/archfirst/joinjs.git"
},
"main": "dist/index.js",
"scripts": {
"test": "gulp",
"build": "gulp build",
"prepublish": "npm run build"
},
"dependencies": {
...
},
"devDependencies": {
...
}
}
When you npm publish, if you don't have an .npmignore file, npm will use your .gitignore file (in your case you excluded the dist folder).
To solve your problem, create a .npmignore file based on your .gitignore file, without ignoring the dist folder.
Source: Keeping files out of your Package
Take a look at the "files" field of package.json file:
package.json, files
From the documentation:
The "files" field is an array of files to include in your project. If you name a folder in the array, then it will also include the files inside that folder. (Unless they would be ignored by another rule.)
Minimal example of how to use data files from a script
Another common use case is to have data files that your scripts need to use.
This can be done easily by using the techniques mentioned at: How can I get the path of a module I have loaded via require that is *not* mine (i.e. in some node_module)
The full example can be found at:
Source: cirosantilli/linux-kernel-module-cheat/npm/data-files/
Published: cirosantilli-data-files
With this setup, the file mydata.txt gets put into node_modules/cirosantilli-data-files/mydata.txt after installation, because we added it to our files: entry of package.json.
Our function myfunc can then find that file and use its contents by using require.resolve. It also just works on the executable ./cirosantilli-data-files of course.
package.json
{
"bin": {
"cirosantilli-data-files": "cirosantilli-data-files"
},
"license": "MIT",
"files": [
"cirosantilli-data-files",
"mydata.txt",
"index.js"
],
"name": "cirosantilli-data-files",
"repository": "cirosantilli/linux-kernel-module-cheat",
"version": "0.1.0"
}
mydata.txt
hello world
index.js
const fs = require('fs');
const path = require('path');
function myfunc() {
const package_path = path.dirname(require.resolve(
path.join('cirosantilli-data-files', 'package.json')));
return fs.readFileSync(path.join(package_path, 'mydata.txt'), 'utf-8');
}
exports.myfunc = myfunc;
cirosantilli-data-files
#!/usr/bin/env node
const cirosantilli_data_files = require('cirosantilli-data-files');
console.log(cirosantilli_data_files.myfunc());
The is-installed-globally package is then useful if you want to generate relative paths to the distributed files depending if they are installed locally or globally: How to tell if an npm package was installed globally or locally
Just don't mention src and dist inside the .npmignore file, and both src and dist will end up inside node_modules. That's it.
Another point is if there is a .gitignore file, and .npmignore is missing, .gitignore's contents will be used instead.
I'm running my app with ng serve and am wondering if there is a way to fetch the package.json file inside my app.
My initial idea was to copy package.json to the folder ./dist and read it from there but this seems to not be an option when using ng serve, which works in-memory and doesn't use the dist folder.
Is there a way to get the file when using ng serve, or alternatively to make ng serve use the dist folder instead of running in in-memory mode?
I am using Angular 4 and angular CLI version 1.3.2 (together with npm 4.3.0).
Thanks!
What build mechanism are you using? If you're in something like webpack, with the appropriate loader you can do
import pkg from '../package.json'; // ES6 ("package" is a reserved word in strict mode, so use another name), or
var pkg = require('../package.json'); // CommonJS
console.log(pkg.version);
This will bundle up the package.json file as part of your build step. This can also be done using Browserify (but you'll probably need a transform).
I was able to solve it by setting the glob property in the project assets configuration inside .angular-cli.json
"apps": [
{
"root": "src",
"outDir": "dist",
"assets": [
"assets",
"favicon.ico",
{ "glob": "package.json", "input": "../", "output": "./assets/" }
],
}
The solution allows the file to be available from the outside therefore making GET requests possible.
package.json:
...
"name": "mypackage",
"main": "src/index.js"
...
Directory structure:
|- src/
|--- index.js
|--- other.js
I can require src/index.js with require('mypackage');, but how can I require src/other.js?
If the answer is require('mypackage/src/other');, is there a way to make it so I can require it with require('mypackage/other'); (i.e. teach Node what the source file directory of your module is)?
AFAIK You'll have to explicitly expose it in the root:
Directory structure:
|- src/
|--- index.js
|--- other.js
|- other.js
Then in /other.js
module.exports = require('./src/other.js');
Now you can do require('mypackage/other')
I'm currently looking into the exact same thing.
Package.json has a property called 'files':
http://blog.kewah.com/2014/npm-as-a-front-end-package-manager/
https://docs.npmjs.com/files/package.json
The "files" field is an array of files to include in your project. If you name a folder in the array, then it will also include the files inside that folder.
But I have yet to find how to do an import/require of such a file.
I don't really see another point in listing these files other than to be able to import/require them?
I was able to import a file from a package if it was listed in this files array.
{
"name": "local-ui-utilities",
"version": "0.0.1",
"description": "LOCAL UI Utilities",
"main": "index.jsx",
"author": "Norbert de Langen",
"license": "none",
"dependencies": {
},
"files": [
"/colors/sets/variables.css"
]
}
I'm able to import the css file from the package using postcss-import:
@import "local-ui-utilities/colors/sets/variables.css";
This probably isn't your use-case, but postcss-import just resolves through npm under the hood, so this should work for your use-case as well, I would think.
This question and accepted answer seem related:
Node/NPM: Can one npm package expose more than one file?
You'll have to explicitly expose the file in the root folder, but many projects (including older versions of lodash) do this as part of a pre-publish step. In fact, there's a package that does exactly what @Creynders suggests, adding module.exports = require('./path/to/file') files to your root folder. A while back I wrote up a guide on getting started, but the gist is pretty simple.
Install
npm install --save-dev generate-export-aliases
Configure
{
"name": "my-package",
"scripts": {
"prepublish": "generate-export-aliases"
},
"config": {
"exportAliases": {
"other": "./src/other"
}
}
}
Use
const other = require('my-package/other')
DISCLAIMER: I'm the author of the package
Edit
Don't use prepublish anymore. Instead, use prepublishOnly.
My own approach to the solution:
As no one has an idea of how to achieve the desired behaviour, and we can't stand still, the best answer for now is:
Solution 1:
From Patrick solution, and his package generate-export-aliases: we can add some npm scripts to automate the whole publish process.
Whether you write your code directly in CommonJS inside the ./src/ subdirectory or use some shiny new ES features transpiled into ./dist, you will have module files to expose, so add these npm scripts:
"scripts": {
"expose": "generate-export-aliases",
"prepublishOnly": "npm run expose",
"postpublish": "git reset --hard HEAD"
}
Or a safer set of scripts:
"scripts": {
"expose": "generate-export-aliases",
"prepublishOnly": "git ceckout -b prepublish-expose && npm run expose",
"postpublish": "git add . && git stash && git stash drop && git checkout master && git branch -d prepublish-expose"
}
Solution 2: Without generate-export-aliases
You can use gulp task runner (transpile if needed and put the files directly in the root dir, no need to copy or move again).
Indeed, this is the exposing step, you can keep prepublishOnly and postpublish scripts unchanged and just change the expose script. Save time and build in the root dir directly.
Since Node 12.7.0 there is the exports property of package.json that can help you:
{
"main": "./main.js",
"exports": {
".": "./main.js",
"./other": "./src/submodule.js"
}
}
If you have a lot of submodules and you want to export all files, you can use a subpath pattern:
// package.json
{
"exports": {
"./*": "./src/*.js"
}
}
See https://nodejs.org/api/packages.html#subpath-patterns
Just require it as a plain file:
const otherfile = require('./node_modules/other_package/other_file.js');